Letting you know about recent updates

As the loyal readers among you may have noticed, there has been a lack of posting in the last week or so. Don’t worry though, that trend shall soon be halted: there are many new and exciting posts written, planned and on the way very soon.

This article is just to keep you in the loop and let you know what is going on.

Usually, I will endeavour to keep people up to date with blog updates via our social channels, specifically Facebook and Twitter. I feel the blog is a place for content, whilst notifying you about the ‘frills’ (design and functionality updates) should be the job of social media.

Social Media Updates

Let me start by letting you know how our social side is currently evolving. A few months ago, Facebook stopped feed compatibility, meaning that if you liked us on Facebook, you no longer got updates regarding new posts. Now, however, I have linked Twitter to Facebook, so every tweet @tecbloggers sends is also posted on our Facebook page.

UPDATE: We now tweet under the username @TechBloggers.

This means that you can now receive updates of new posts via your Facebook feed.

Some tweeters like to spam you with content every five minutes, and likewise some people update their Facebook status practically all the time. I don’t believe in this, and only post or tweet an update if it is something you may want to know about. Small site improvements/issues and interesting content are the sort of thing we use our social media channels for.

Occasionally, if I find (or someone brings to my attention) something interesting that is worth sharing with the community, but doesn’t warrant a post, then it may get shared via social media. Don’t worry about getting spammed if you subscribe; we will only be posting stuff you probably want to know about.

I have also recently added a cover photo to our Facebook page, as it was looking a little bland. I didn’t have any great ideas, but I think it works for now 🙂 If you have any suggestions, by all means leave them in the comments below.

Facebook cover photo

Technology Bloggers Facebook cover photo.

In future, I don’t plan on writing as many of these sorts of posts, as I feel it is better to keep you informed of updates via social media, as and when they occur.

If you don’t want to miss out on future update news, subscribe now!

Twitter Icon

Facebook Icon

In other news…

Top Writers

For a long time now we have had a top commenters widget at the bottom of every page; however, the observant among you may have noticed the appearance of a new widget: a top writers list.

I came across the plugin whilst searching for something else, and thought it was a good idea. This is a community blog, so if we highlight the top commenters, why shouldn’t writers be recognised too? Well, now they are 🙂

A screenshot of the top commenters and top writers list

The top commenters list excludes administrators and resets every month; the top writers list doesn’t.

Design Updates

Technology Bloggers’ design is constantly being updated and tweaked. I believe that continuous improvement is important. Most of my time is spent writing and replying to comments, however I do dedicate some time to improving other areas of the site.

One recent update is the removal of the social icons from the sidebar, and the addition of a new set of social buttons to the header. I felt that this area needed a bit more colour, and the buttons bring just that!

More Speed!

A few weeks ago I posted on our social channels:

“Just moved servers in order to speed up the blog 🙂
Do you notice a difference?”

We encountered a few problems, however they were soon sorted out, leading me to later post:

“A few hiccups later, Technology Bloggers is fully functioning and faster than ever!”

The blog’s response time was sometimes really quite slow (usually higher than 2000 milliseconds!). I moved the blog to a different server, and the response time is now around a quarter of what it was, currently around the 550 ms mark.
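If you are curious how a figure like that can be measured, here is a rough Python sketch that times a full page fetch. The URL is a placeholder, not this blog’s actual address, and this only times the raw HTTP request, not the full render that tools like Page Speed analyse:

```python
import time
import urllib.request

def response_time(url: str) -> float:
    """Time a full fetch of `url` (connection plus body download), in seconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        resp.read()  # ensure the whole body is downloaded
    return time.perf_counter() - start

# Example usage (placeholder URL):
# response_time("https://example.com/")  # roughly 0.55 would mean ~550 ms
```

Averaging several runs gives a steadier number, as any single request can be skewed by caching or network noise.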

That is one reason you may have noticed the blog loading faster; another is the relentless effort I have been putting into slimming things down and reducing load times.

Google’s Page Speed tools have been very useful, enabling me to see where the site lags, and what can be done to improve it. I think there may be an article on the way soon with more detail on Page Speed, and how I have been (and still am) using the tools to speed up the blog. Watch this space.

Jonny

For a while now, Thursday here on the blog has been Jonny’s day, with him posting a regular feature every Thursday for more than ten weeks.

The day is not dedicated to the writings of Mr Hankins; however, at the moment I feel it is good that the regular feature is on a fixed day, as it gives consistency. His articles are very popular, and it is a delight every Thursday to see what new and innovative topic he has chosen to cover.

Jonny has been busy travelling of late, meaning that last week he was unable to post. Don’t worry though, he already has an article written and lined up for us for tomorrow 🙂

Competition

Just a quick note about a competition I plan on launching next Monday. Technology Bloggers has teamed up with two other blogs, and hopefully will soon be launching a competition in which anyone bar the three prize donors can enter for a chance to win one of three $50 (USD) prizes in a $150 competition!

UPDATE: This will now launch on Tuesday.

Until Next Time

That’s about it from me now, so remember, if you want to keep up to date, be sure to subscribe to our social profiles, and stay tuned to the blog to see our exciting future unfold…

Do you make use of Google Webmaster Tools?

Everybody knows that if you want to run a successful website, Google is one of the most important factors to consider.

How Google understands, interprets and indexes your site is crucial to where your site appears in the SERPs, and how well your site performs in the SERPs (specifically the Google SERPs) can be a big determinant of how much traffic your site receives, and ultimately how popular and successful it is.

Google Webmaster Tools is a very useful, and often underused, tool that site owners can use to improve the quality and quantity of traffic their site receives. In this article I will outline some of the key features I find useful, and some of the main reasons why I use Webmaster Tools.

Google Webmaster Tools

Why Google?

Google has a monopoly on the search market, with more than 90% of all searches being done through Google – according to StatCounter Global Statistics. Therefore the chances are that the majority of search traffic your site receives is from Google. It would be naive to ignore the search tools Bing and Yahoo make available to webmasters; however, if you plan on focusing on just one, Google is probably the wisest choice.

Google is renowned for its major updates, with Penguin and Panda just two recent examples. Webmaster Tools can be a great aid in helping you understand how your site has been affected by the changes and why, so you can either keep doing things the way you are, or change your strategy.

See how well you are doing

The most recent Google Webmaster Tools update has divided the dashboard into five easy to understand sections:

Configuration: how your site is set up (locality, URL preferences, sitelinks etc.)

Health: how Google crawls your site, any errors or malware it detects, and the URLs Google is denied from crawling

Traffic: how people find your site, which search queries you appear for, who links to your content with what keywords, and how Google+ influences your visitors

Optimization: tips and tweaks on how you could adjust your content and sitemap to improve your search position

Labs: the latest tools Google is trialling that may be of use
Webmaster Tools Options - Dashboard, Messages, Configuration, Health, Traffic, Optimization

Find crawler errors

One of the main reasons I use Google Webmaster Tools is because it lets me see how Google views and interprets the sites I administer. Therefore should there ever be an error, I am able to understand what Google is struggling to read/crawl, and therefore try to address the issue. Google lets you view and test specific URLs your robots.txt file is blocking Google from indexing and crawling – there is a difference. If you are denying it access to something by mistake you can then rectify this.

Google also lets you see any pages it cannot find. If you run a site based on a content management system (like one powered by WordPress), it is common to change things using the system, and unforeseen errors can lead to pages not being found where they either should be, or once were. Google lets you see when it can’t find pages, when it is denied access to pages, and when inadequate redirects are in place.
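You can run a similar check yourself before Google flags the problem. This is a minimal Python sketch, assuming a placeholder URL, that simply reports the HTTP status code of a page (404 meaning ‘not found’):

```python
import urllib.error
import urllib.request

def check_url(url: str) -> int:
    """Return the HTTP status code for `url`, without raising on 4xx/5xx."""
    req = urllib.request.Request(url, method="HEAD")  # headers only, no body
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # e.g. 404 for a page that cannot be found

# Example usage (placeholder URL):
# check_url("https://example.com/old-post/")  # 404 would mean a broken link
```

Running something like this over your old permalinks after a site change is a cheap way to catch missing pages and bad redirects early.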

If you don’t use Google Webmaster Tools and don’t reduce the problems Google encounters when crawling your site, the likelihood is that your site will suffer in the SERPs – there isn’t much debate about that.

Labs

Google say that:

“Webmaster Tools Labs is a testing ground for experimental features that aren’t quite ready for primetime. They may change, break or disappear at any time.”

However, this doesn’t mean that these tools should be ignored; in fact, I think they are probably one of the most overlooked resources that Google provides webmasters with.

One of the current ‘Labs’ tools that I think is very useful is the ‘Site performance’ tool. Google may not have generated any information about your site yet; however, if you are one of the lucky ones to be analysed, this can prove a very interesting tool. In Google’s own words:

“This page shows you performance statistics of your site. You can use this information to improve the speed of your site and create a faster experience for your users.”

As page load time becomes more and more important to users, and therefore to search engines, this page is of crucial importance for many people.

Improvements

You might not expect it, but in the ‘Optimization’ section, under the ‘HTML Improvements’ heading, Google will actually suggest areas where you could improve your code to ensure that your content is the best it can be. Common errors Google suggests correcting include missing or duplicate title tags (in most cases, an SEO no-no) and meta tag issues.

Traffic

The tools in the ‘Traffic’ section are probably the ones I use the most. ‘Search Queries’ gives you a fantastic insight into where your site appears in search results across all different locations in the world. If you pair Webmaster Tools with Analytics, this becomes a lot more useful.

‘Links to your site’ and ‘Internal links’ let you see your most linked-to content, and the keywords used in the links to it. Generally speaking, if you want to rank well for a keyword, you need to have some links (internal and/or external) using that keyword.

The great thing about Google Webmaster Tools is that it integrates with many other Google programs, in order to improve your total control and visibility of your site. AdSense, Analytics, YouTube and AdWords are just some of the other Google products that Webmaster Tools integrates with.

That is just a quick overview of what Webmaster Tools has to offer. If you own a website, I strongly recommend that you explore it further to help improve your site’s visibility in the search results, and to enable you to weather algorithm changes (like Penguin and Panda) that little bit better.

Do you use Webmaster Tools? What are your favourite features?

Problems with online anonymity

The internet probably knows what your favourite shoes look like. How, you may ask? Your data is being monitored through your PC without you hearing as much as a peep about it. Private firms can spy on users from the comfort of their own computers.

The FTC has recently handed in a report advising private firms to be more open about their data collection practices. New laws regarding user privacy are also currently being worked on.

Users who want to preserve any semblance of privacy left are looking into do-not-track tools. Some suggest adding a do-not-track option directly into browsers, while others are in favor of different software that can curb data collection altogether.

Alongside regular website cookies come tracking cookies that help the sites we’re visiting identify our usage patterns and collect our data. Current data collection practices aren’t transparent, so we have no idea what these sites are up to once they have what they need.

Failure to comply

The universal do-not-track button goes only as far as requesting that a website not track a user’s information as they browse. However, there’s no guarantee that the site will comply with the request.
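Under the hood, that request is nothing more than an extra HTTP header attached to every page fetch. A minimal Python sketch (the URL is a placeholder, and no request is actually sent here):

```python
import urllib.request

# "DNT: 1" is the voluntary do-not-track request header; sites may ignore it.
req = urllib.request.Request("https://example.com/", headers={"DNT": "1"})

# urllib normalises header names, so the header is stored as "Dnt".
print(req.get_header("Dnt"))  # prints "1"
```

Because the header is purely advisory, a site that ignores it suffers no technical consequence, which is exactly the compliance problem described above.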

This option does close to nothing in terms of blocking a website’s access, largely because it can’t. Google’s recent fine for lifting data from open Wi-Fi connections without user permission, and Facebook accessing people’s texts on app users’ cell phones, are proof that firms don’t always adhere to the norms of privacy, and those are two really big firms.

At best, a do-not-track tool will lull you into a false sense of security, when in reality you have more than one front to protect yourself on. Large private firms aren’t the only ones stealing data; there are numerous other threats which one needs to take into account.

Monsters beneath your bed

Fighting against tracking cookies alone is much like looking for the monster in the closet without realising what’s hiding under your bed.

Options such as AVG’s Do-Not-Track or DNT+ will only go as far as the do-not-track button is meant to. However, PC monitoring tools and other forms of spyware could already exist on your system (granted, the data would be going to a person rather than a company).

Most computer monitoring software is wired to record your browsing history. Whether or not you’re deleting your cookies becomes irrelevant here. The same is the case with spyware or malware that you mistakenly download by clicking on obscure links or opening spam emails.

No free lunches

Free Wi-Fi is a real treat until you realise that there’s a chance it has been set up with monitoring software which can record every move you make in your browser. Tools such as Firesheep and Wireshark can easily capture your traffic if you’re on a network where they are running. The Wi-Fi owner has no need to break into your system manually, or to be anywhere near you, to figure out what you’re using the Wi-Fi for.

The WiFi logo

That’s if you’re using someone else’s Wi-Fi. However, even if it is your own Wi-Fi that is open, you’re in danger of being attacked. In 2010, reports first surfaced that Google was lifting private data through open Wi-Fi connections, and regardless of how apologetic Google was, it never stopped the practice.

Even with new laws in place for the preservation of user data and more transparency as to what cookies are infiltrating user systems, there’s still a large potential for data collection against a user’s will.

The best idea would be to take a holistic approach to your browsing experience and stay safe on all sides; after all, do-not-track tools are only a small aspect of online safety, not the key.