Do you make use of Google Webmaster Tools?

Everybody knows that if you want to run a successful website, Google is one of the most important factors to consider.

How Google understands, interprets and indexes your site is crucial to where it appears in the SERPs, and how well your site performs in the SERPs (specifically the Google SERPs) is a big determinant of how much traffic your site receives, and ultimately how popular and successful it is.

Google Webmaster Tools is a very useful, and often underused, tool for improving the quality and quantity of traffic your site receives. In this article I will outline some of the key features I find useful, and the main reasons why I use Webmaster Tools.

Google Webmaster Tools

Why Google?

Google has a near-monopoly on the search market, with more than 90% of all searches going through it, according to StatCounter Global Statistics. Therefore the chances are that the majority of search traffic your site receives comes from Google. It would be naive to ignore the tools Bing and Yahoo make available to webmasters, however if you plan on focusing on just one, Google is probably the wisest choice.

Google is renowned for its major updates, with Penguin and Panda just two recent examples. Webmaster Tools can be a great aid in helping you understand how your site has been affected by the changes and why, so you can either keep doing things the way you are, or change your strategy.

See how well you are doing

The most recent Google Webmaster Tools update has divided the dashboard into five easy-to-understand sections:

- Configuration – how your site is set up (locality, URL preferences, sitelinks etc.);
- Health – how Google crawls your site, any errors or malware it detects, and the URLs Google is denied from crawling;
- Traffic – how people find your site, which search queries you appear for, who links to your content and with what keywords, and how Google+ influences your visitors;
- Optimization – tips and tweaks on how you could adjust your content and sitemap to improve your search position;
- Labs – the latest tools Google is trialling that may be of use.
Webmaster Tools Options - Dashboard, Messages, Configuration, Health, Traffic, Optimization

Find crawler errors

One of the main reasons I use Google Webmaster Tools is that it lets me see how Google views and interprets the sites I administer. Should there ever be an error, I can understand what Google is struggling to read or crawl, and try to address the issue. Google lets you view and test the specific URLs your robots.txt file blocks it from indexing and crawling – there is a difference. If you are denying it access to something by mistake, you can then rectify this.
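For illustration, a robots.txt fragment like the one below (the paths are purely made up) is the sort of thing the blocked-URLs report lets you test – and note that blocking crawling this way does not necessarily remove an already-indexed URL from the index, which is exactly the crawling/indexing distinction mentioned above:

```
User-agent: Googlebot
Disallow: /private/
Allow: /private/public-page.html
```

If `/private/` was disallowed by mistake, the report would show Googlebot being denied access, and you could loosen or remove the rule.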

Google also lets you see any pages it cannot find. If you run a content management system based site (like one powered by WordPress) it is common to change things through the system, and unforeseen errors can leave pages missing from where they either should be, or once were. Google lets you see when it can’t find pages, when it is denied access to pages, and when inadequate redirects are in place.

If you don’t use Google Webmaster Tools and don’t reduce the problems Google encounters when crawling your site, the likelihood is that your site will suffer in the SERPs – there isn’t much debate about that.

Labs

Google say that:

“Webmaster Tools Labs is a testing ground for experimental features that aren’t quite ready for primetime. They may change, break or disappear at any time.”

However, this doesn’t mean these tools should be ignored; in fact, I think they are among the most overlooked resources Google provides webmasters with.

One of the current ‘Labs’ tools that I think is very useful is the ‘Site performance’ tool. Google may not have generated any information about your site yet; however, if you are one of the lucky ones to be analysed, this can prove a very interesting tool. In Google’s own words:

“This page shows you performance statistics of your site. You can use this information to improve the speed of your site and create a faster experience for your users.”

As page load time becomes more and more important to users, and therefore to search engines, this page is of crucial importance for many people.

Improvements

You might not expect it, but in the ‘Optimization’ section, under ‘HTML Improvements’, Google will actually suggest areas where you could improve your code to make your content the best it can be. Common errors Google flags include missing or duplicate title tags (in most cases, an SEO no-no) and meta description issues.
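As a quick illustration of the sort of thing it is looking for, every page should carry its own unique, descriptive title and meta description – something along these lines (the values here are purely for example):

```html
<head>
  <title>Blue Widgets – Acme Store</title>
  <meta name="description" content="Hand-made blue widgets with free UK delivery.">
</head>
```

If two pages share the same title, or a page has none at all, ‘HTML Improvements’ will list the affected URLs so you can fix them.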

Traffic

The tools in the ‘Traffic’ section are probably the ones I use the most. ‘Search Queries’ gives you a fantastic insight into where your site appears in search results across different locations around the world. If you pair Webmaster Tools with Analytics, this becomes a lot more useful.

‘Links to Your Site’ and ‘Internal Links’ let you see your most linked-to content, and the keywords used to link to it. Generally speaking, if you want to rank well for a keyword, you need to have some links (internal and/or external) using that keyword.

The great thing about Google Webmaster Tools is that it integrates with many other Google programs to improve your overall control and visibility of your site. AdSense, Analytics, YouTube and AdWords are just some of the other Google products that Webmaster Tools integrates with.

That is just a quick overview of what Webmaster Tools has to offer. If you own a website, I strongly recommend you explore it further to help improve your site’s visibility in the search results, and to weather algorithm changes (like Penguin and Panda) that little bit better.

Do you use Webmaster Tools? What are your favourite features?

Good blogging practice – publishing reliable information

The web is a massive bank of data, far too big to be regulated. Because the web can’t be regulated, it is very easy for false information to spread – fast.

If you are a blogger, it is really important that you publish information which is reliable and trustworthy. Don’t copy what the crowd says unless you know they are right: this is not only misleading to your readers, but can also earn you penalties from search engines. If you get a reputation for publishing unreliable content, the likelihood is that your readership will fall.

When you publish something you have found elsewhere, you need to make sure it is accurate and reliable before you publish it.

How to Mythbust Rumours

When you find information on the web, it is always a good idea to check that it appears elsewhere before treating it as reliable. A general rule of thumb is to check that what you are reading appears the same on 3 other sites, one of which is a highly reputable site.

So what is a reputable website?

Government Websites

There are a few ways to identify whether a site is reputable or not. One way is to see if it is a government website. Any site which is government run is likely to be very reputable. Government websites usually end in their own unique domain name extension. If you live in the USA, government sites end in .gov or .fed.us, in the UK .gov.uk, in France .gouv.fr, .gc.ca for Canada, India’s extension is .gov.in and the list goes on.

Major News Corporations

Government sites won’t always report the things you want to verify though, so there are other ways to identify a reputable site. Big news websites like BBC.co.uk/News and Guardian.co.uk will usually only publish information that is factual and accurate, so you can usually trust them.

The information they publish is likely to be accurate; however, it may not be impartial, so that is something to watch out for. News firms often take a political side, and therefore report news in a certain way – and may only publish part of a story.

High PageRank Sites

Google PageRank is calculated largely from the number of backlinks a page or site has. If a website has a very high PageRank (6+) then it likely has a lot of other sites linking to it, most probably because it publishes a lot of high quality content which people find useful and therefore link back to. High PageRank sites aren’t always trustworthy, but the higher up the PageRank spectrum you go, the less likely it is that a site is providing false information.

If a website is a PageRank 8, 9 or 10, then unless it has manipulated Google’s algorithm (through black hat SEO, which only works for a short while before Google catches up) the site is likely to be extremely reliable and reputable, and you should be able to trust the information, data and facts it produces.

1,000,000 to 1

If 1 highly reputable site is saying one thing, but 1 million other (not reputable) sites are saying another, then the chances are that the 1,000,000 sites are just recycling the same false information, creating a massive bank of it. This is one reason why you should be really careful who you trust on the web, and should always verify information with at least one reputable site.

Academic Research

Verifying information with at least 3 sources, one of which is reputable, is also advised in academic research. If you apply the same standards on your blog, you can’t go far wrong! Search engines and readers alike will respect you for providing good quality, reputable content.

Technology Bloggers Policy

Every time I write an article and quote information, statistics etc., I always try to follow the 3 and 1 rule: check that the information appears on 3 other sites, at least one of which is ‘reputable’. This way, everything I write should be reliable.

The post guidelines ask all writers to ensure they use the 3 and 1 rule, however we cannot guarantee that all writers do. In our Privacy Policy, we state how we try to ensure all content is true and factual, however it is always advisable to independently verify information for yourself.

Do You Verify Your Content?

Do you always try to ensure you use the 3 and 1 rule when publishing information? That applies not only to blog posts, but also to comments. If not, what measures do you use – or don’t you think it really matters?

How important is the quality of hosting to online retailers?

This is a sponsored post. To find out more about sponsored content on Technology Bloggers, please visit our Privacy Policy.

With many brick-and-mortar businesses adding an online version of their high-street store to their portfolio, it’s important that firms choose the right web hosting service. With myriad services offering cheap deals, firms ought to be wary regarding offers that appear to be too good to be true – because they usually are.

On the surface, purchasing web hosting that costs £20 a month seems like a steal. In fact, it is a steal – however, the only thing being nicked is precious uptime for online retailers, as the vast majority of cheap hosts go hand-in-hand with downtime.

Downtime – a retailer’s worst nightmare

For online retailers, downtime is especially costly: every second of downtime is potentially a lost sale. Would you rather pay a premium for quality web hosting that is reliable and constantly up, or pay a third of the price for hosting that keeps going down? In the long term, cheap but less reliable hosting may cost firms more.
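To put a rough number on that trade-off, here is a minimal sketch (the revenue and uptime figures are entirely hypothetical) of how downtime translates into lost sales, assuming sales are spread evenly across the month:

```python
def lost_revenue(uptime_pct: float, monthly_revenue: float) -> float:
    """Estimate monthly revenue lost to downtime, assuming sales
    are distributed evenly over time."""
    downtime_fraction = 1 - uptime_pct / 100
    return monthly_revenue * downtime_fraction

# Hypothetical store turning over £50,000 a month:
cheap = lost_revenue(99.0, 50_000)    # 1% downtime  -> £500 lost
premium = lost_revenue(99.9, 50_000)  # 0.1% downtime -> £50 lost
print(round(cheap), round(premium))
```

On these made-up figures, the cheap host’s extra downtime costs £450 a month – which can easily exceed the saving on the hosting bill itself.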

In addition, using a web hosting service in your own time zone could be important, especially for smaller firms. Imagine if your store goes down but your hosting service is half-way across the world. For example, say your UK-based store goes down at lunchtime: no amount of calls at 12pm is going to wake a firm on the other side of the world, tucked up in bed at 12am. It’s a nightmare scenario for any store looking to make sales.

SEO

The importance of SEO over the last 10 years has changed the face of the internet. An increasing number of online retailers are producing fresh content in a bid to become a publishing authority in the eyes of search engines.

However, when a cheap hosting company leaves a retailer with dead links, 404 errors and other harmful downtime, what are these search engines going to think? Bounced traffic isn’t going to look good in the eyes of Google or Bing.

Technology Bloggers’ 404 page

Eventually, a retailer could slip down the rankings and be overtaken by its competition. It takes a lot of dedication to work up the search rankings, so don’t let a poor hosting firm ruin your company and its prospects.

Security

In addition, security should be a top priority for online retailers. A vast number of hackers roam cyberspace, and make no mistake – they’re ready to capitalise on unprotected websites. By opting for secure web hosting that features SFTP and SSL, a business and its clients can feel assured that all sensitive data is kept in safe hands.

As you can see, the quality of web hosting is an absolutely integral part of the foundations of success for online retailers. In an era of cost cutting and tight purse strings, it might be tempting to plump for a cheap web hosting service from the other side of the world, but in the long term you may end up opening your wallet more often than you think.