Simple ways to speed up your website

Having a fast website is very important. As I mentioned in my Black Friday post, nobody likes a slow website, and if your site takes more than a few seconds to load, the chances are you are losing visitors because of that lag.

This article contains a few easy-to-implement tips to help you reduce the load time of your website.

Keep Your Code Tidy

Unless something goes wrong, or someone chooses to view your source code, most of the people who visit your website will never see any of the code that is tucked away behind the scenes. That doesn’t mean it isn’t important, however. After all, the code at the back end is what creates the website at the front end.

Minify Your Code

Minifying your HTML, CSS and JavaScript is a very easy way to reduce the size of your website: if there is less to load, your website will load faster. If you use a CMS like WordPress, there are many plugins which can minify your code for you. If you self-code, there are websites which will shrink your code automatically, or you can go through it yourself, removing unnecessary spaces, comments and redundant tags.
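As a rough illustration of what a minifier does, here is a very naive Python sketch that strips comments and collapses whitespace in an HTML string. A real minifier or plugin is far more thorough and careful, so treat this only as a picture of the idea:

```python
import re

def naive_minify(html: str) -> str:
    """Very naive HTML minifier: strips comments and collapses whitespace."""
    html = re.sub(r"<!--.*?-->", "", html, flags=re.DOTALL)  # remove HTML comments
    html = re.sub(r">\s+<", "><", html)                      # drop whitespace between tags
    return re.sub(r"\s+", " ", html).strip()                 # collapse remaining runs of whitespace

page = """
<!-- main navigation -->
<ul>
    <li><a href="/">Home</a></li>
    <li><a href="/about">About</a></li>
</ul>
"""
print(naive_minify(page))
# <ul><li><a href="/">Home</a></li><li><a href="/about">About</a></li></ul>
```

Even on this tiny snippet the output is noticeably shorter than the input; across a whole site those savings add up.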

Reduce Files Fetched

It is good practice to fetch as few files as possible when loading your website. Many sites use separate style sheets for different parts of the site – one for text, one for images and another for general layout, for instance. Every file that your page calls upon increases its overall load time, so fetching one big CSS document will usually be faster than fetching three smaller ones.
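If you do keep separate stylesheets while developing, you could combine them into a single file as part of your upload or build step. A minimal Python sketch, where the filenames are just examples:

```python
from pathlib import Path

# Hypothetical example stylesheets, combined into one file so the
# browser only has to make a single request for all of the CSS.
stylesheets = ["text.css", "images.css", "layout.css"]

combined = "\n".join(Path(name).read_text() for name in stylesheets)
Path("all.css").write_text(combined)
```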

Also consider how many external resources you load – adding a Facebook like button, for example, requires the user’s browser to fetch code from Facebook’s servers while loading your page. Using a plain link, or delaying the load of things like social sharing buttons, can give you a big speed boost.

Optimise Your Images

Images make your content more exciting; however, if you don’t optimise them they can really slow your page load time down. There are various ways you can reduce the file size of your images without compromising on quality.

Resize Pictures

When you take a picture, it can often be much bigger than you really need it to be. By resizing photos before you upload them, you can massively reduce the file size of your images. If you leave the file big but resize it using HTML or CSS – by setting a smaller height and width – then the end user still has to load the big image, and their browser then has to squash it down to fit your new dimensions.
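If you have Python installed, the Pillow library gives you a quick way to resize photos before uploading them. A minimal sketch, with example filenames and an assumed maximum size of 1200 pixels:

```python
from PIL import Image  # pip install Pillow

# Shrink a photo so its longest side is at most 1200 pixels,
# preserving the aspect ratio, before uploading it.
with Image.open("elephants_original.jpg") as img:
    img.thumbnail((1200, 1200))
    img.save("elephants_web.jpg", quality=85)
```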

Choose The Right File Type

The most commonly used image formats are .jpg, .gif and .png. Different images lend themselves to different formats: reducing the number of colours available to a GIF or a PNG-8 image will reduce the file size, whilst reducing the image quality will lower the size of a JPEG file.
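One easy way to choose is to save the same image in a few candidate formats and compare the resulting file sizes. Here is a Pillow sketch of that idea (the filenames are just examples):

```python
import os
from PIL import Image  # pip install Pillow

with Image.open("photo.jpg") as img:
    img.save("test.png")                           # PNG-24, lossless
    img.save("test.jpg", quality=80)               # JPEG, slightly lossy
    img.convert("P", colors=256).save("test.gif")  # 256-colour GIF

for name in ("test.png", "test.jpg", "test.gif"):
    print(name, os.path.getsize(name) // 1024, "KB")
```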

Use An Image Compressor

Image compressors are another way to shrink images. Technology Bloggers currently uses a WordPress plugin called WP Smush.it, which uses the Yahoo! Smush.it tool to reduce image file sizes.

Example

Here is a picture that I took several years ago whilst in South Africa.

Elephants in South Africa
The full-sized image was 3.44 megabytes. Resizing it in Photoshop helped me reduce that to 1.61 megabytes. Because there are lots of colours and the image was quite big, GIF and PNG-8 made it look too pixelated, so it was between PNG-24 and JPEG. PNG-24 squashed the image down to 831 kilobytes, whilst JPEG compressed it to a tidy 450 kilobytes. Although that is a lot smaller than the original file, it would still take a long time to load on a slow connection, so by taking a very small hit on image quality I managed to get the file size down to 164 kilobytes. Finally, running the image through Smush.it took it down to 157 kilobytes. Some images see a big reduction; most (like this one) see a smaller reduction of just a few percent.

Use A Content Delivery Network

Content delivery networks, or CDNs, can help to improve a website’s speed and make it more reliable. Put very simply, when someone tries to access your site without a CDN, they are directed to your hosting provider, who then serves them your website and all its files from their server. This means that if your host goes down because of a fault or a sudden surge in traffic, you lose your site; and if your host’s server is not close to a user, it can take a long time for requests and responses to travel back and forth.

With a CDN, users can fetch your site faster, because it is served from multiple locations around the world. Additionally, many CDNs can cache a copy of your site, so if your host goes offline they can serve a static version of your site to users until it comes back up.

For example, Technology Bloggers is currently hosted in Gloucester in the UK. If you access us from Australia, CloudFlare (the CDN we use) will route you to its closest data centre – which could well be in Australia – and that data centre will deliver the files you need to see our site. It is faster because your requests don’t have to travel all the way to the UK, and neither does the data being sent back to you.

Control Your Cache

Server Side

If you use a CMS, then the chances are your content is dynamically delivered upon request. Basically, when a user requests a page, your site builds it and then sends it back. By using some form of caching you can create a static copy of your site, so it doesn’t have to rebuild the content each time a user visits. There are various plugins you can use to help with this; Technology Bloggers uses CloudFlare’s caching system, as I have found it works better than the other WordPress plugins I have tried. Using too many plugins also slows your site down, which is another reason I let the CDN manage it.
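The principle behind page caching is simple enough to sketch in a few lines of Python: build the page once, keep the result, and serve the stored copy until it expires. This is only a toy illustration of the idea, not how CloudFlare or any particular WordPress plugin actually works:

```python
import time

CACHE_SECONDS = 300          # keep each rendered page for five minutes
_cache = {}                  # url -> (rendered_html, time_rendered)

def render_page(url):
    """Stand-in for the expensive work a CMS does to build a page."""
    return f"<html><body>Content for {url}</body></html>"

def get_page(url):
    html, rendered_at = _cache.get(url, (None, 0))
    if html is None or time.time() - rendered_at > CACHE_SECONDS:
        html = render_page(url)              # only rebuild when the copy is stale
        _cache[url] = (html, time.time())
    return html

print(get_page("/about"))    # built on the first request
print(get_page("/about"))    # served straight from the cache
```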

User Side

A user’s browser also saves files for later, in case they visit your site again. You can control which files are saved and for how long they are kept by adding caching headers, for example in your .htaccess file.
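The exact directives depend on your server setup. Purely as an illustration of the same idea in Python, here is how a tiny static file server could add a Cache-Control header telling browsers they may keep downloaded files for up to a week:

```python
from http.server import HTTPServer, SimpleHTTPRequestHandler

class CachingHandler(SimpleHTTPRequestHandler):
    def end_headers(self):
        # Ask browsers to keep whatever they download for up to a week (604800 seconds)
        self.send_header("Cache-Control", "public, max-age=604800")
        super().end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8000), CachingHandler).serve_forever()
```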

How To Test If Your Site Is Faster

Refreshing your page and timing it with a stopwatch is one way to gauge how quickly your site loads. This probably isn’t the best way to do it though!
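If you do want a scriptable version of the stopwatch approach, a few lines of Python will fetch a page and time it. This only measures the raw HTML – not the images, scripts and stylesheets – so treat it as a rough guide (the URL is just an example; swap in your own site):

```python
import time
from urllib.request import urlopen

url = "https://www.technologybloggers.org/"   # example URL, use your own site
start = time.perf_counter()
with urlopen(url) as response:
    body = response.read()                    # download the page HTML
elapsed = time.perf_counter() - start
print(f"Fetched {len(body)} bytes in {elapsed:.2f} seconds")
```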

For a fuller picture, there are various websites which rate your site’s speed and performance. I tend to measure Technology Bloggers using four main speed analysis sites.

Google PageSpeed

Google are keen for the web to be faster and offer a very useful tool which gives your site a score for mobile load time and desktop load time. It also suggests what it believes is slowing your site down. Google’s tool also gives an image of your fully loaded site – all the content above the fold. Unfortunately, their test doesn’t actually state how fast your site loads, just how well optimised it is.

WebPageTest

Probably the most thorough site I use is WebPageTest, which presents loads of different information, including first view load time, repeat view load time (which should be quicker if you have user-side caching), a waterfall view of all the files loading, a visual representation of how your site loads, suggestions as to where performance issues lie, and much more.

An analysis of TechnologyBloggers.org using the WebPageTest tool

Pingdom

Pingdom is another useful tool: it gives a handy speed score and also tells you how fast your site is compared to other sites it has tested. It also saves your results, so you can view historic test speeds on a graph and see how your site’s speed has changed over time.

GTmetrix

GTmetrix is another useful site. It also gives lots of detail and helps you to see what is slowing your site down. GTmetrix additionally lets you compare one site to another, which I’m not really sure is that useful, but it is interesting to see how your competitors’ sites compare to your own.

An analysis of TechnologyBloggers.org using the GTmetrix tool

Happy Browsing

Remember to enjoy your new, faster site! Hopefully your visitors will too. 🙂

Black Friday – Getting caught in the rush

There are various theories as to where the term Black Friday originated. One I recently heard was that it was the date by which retailers expected to break even – so move from the red (making a loss) into the black. Black Friday is no longer just an American phenomenon: Australia, Mexico, the UK and many European countries (to name just a few) now run similar events.

In recent times you could argue it is so called because of the havoc it wreaks. People die in the rushes that Black Friday creates. Some sources state that the day has claimed as many as 100+ lives in the last decade, and many hundreds of people are injured every year in the stampedes and commotion that have become associated with it. Today the BBC are reporting on a Tesco store which had to close its doors this morning due to what it calls ‘scuffles’ amongst customers.

Whilst physical stores are experiencing high volumes of customers, I suspect their online stores are experiencing many, many more. Retailers have known all year that Black Friday was coming, so they are all super prepared, aren’t they? It appears not.

Many retailers’ websites have been experiencing higher than normal volumes of traffic today. They expected that, though. So why, if they were expecting it, have so many high-profile sites gone down?

Argos – a previously struggling UK retailer – appears to be turning itself around, yet on what will almost certainly be its website’s busiest day of the year, customers have had to wait to access its site.

I didn’t think they would let this happen, so I tried to click through to Argos.co.uk, but was greeted with this page.

Argos website down on Black Friday

I followed their advice and refreshed the page a few minutes later, and amazingly the site worked! I tried again in a different browser a little later on and got the same message; however, again, after a few minutes it cleared.

On any other day of the year, customers would find this completely unacceptable, so why, on its busiest day, did Argos let its site get swamped?

Argos wasn’t alone though. Currys’ site was also offline for many visitors. Unlike Argos, Currys gave an estimated time for when it would let me in.

They predicted around 20 minutes when I first joined the queue (yes, I joined a queue to enter a website), but by the time I finally got in it had been well over an hour.

Currys crush – Black Friday

Currys Black Friday queue to get into their website

What I can’t understand is why its sister site, PC World, had no waiting time, yet it took me over an hour to get onto the Currys homepage. Once I was in, everything seemed to load pretty sharpish, so I don’t know why there was such a long wait. Surely letting more people in, even if the site runs a little slower, is better than serving just a handful quickly?

Tesco Direct was by far the worst though. Despite Tesco being the second largest retailer in the world (as measured by revenue), the wait to get into the Tesco Direct shop was ridiculous. After at least an hour and a half, Tesco’s 30-second refresh countdown timer was still going: every 30 seconds it checked whether there was space to let me in, and there never was.

I don’t know what Tesco were offering – and I suspect many people will never know – because honestly, who is going to wait by their computer for more than 90 minutes to access a website?

Tesco offline during Black Friday

Tesco’s Black Friday calamity: customers couldn’t access Tesco Direct for hours.

So what lesson does this teach us? Well, if you are a big retailer taking part in Black Friday, make sure you invest in the appropriate infrastructure before the day, or you could miss out on a huge number of potential sales.

Speed in space

One of the many problems with space travel is how we measure speed.

Speed is relative – as this very good TED video shows.

Speeding Up

One of the problems facing human space travel isn’t travelling fast, it’s getting to that speed. The g-force exerted on the body whilst accelerating poses major health issues. So even though we may be able to invent ways of travelling faster, unless we can control the g-force it’s pointless going faster, because if we reach a high speed too quickly (accelerate too hard), the people travelling at that speed will die.

If you are driving a fast car and you very quickly put it into a lower gear and push the accelerator to the floor, you feel yourself fly into the back of your seat. If you are travelling at 60mph your body feels fine, as it does at 0mph; however, in the few seconds it takes to get from one to the other, you are subjected to much greater g-forces.

Travelling from 0-60mph in 30 seconds puts the body under a lot less stress than doing it in 3 seconds. It’s the same with space travel: the body can cope with moving very quickly, but it cannot cope with getting up to that speed too quickly.
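To put some rough numbers on that, here is a quick back-of-the-envelope calculation (60mph is roughly 26.8 m/s, and 1g is about 9.81 m/s²) showing how much gentler the slower acceleration is, assuming the acceleration is constant:

```python
MPS_PER_MPH = 0.44704   # metres per second in one mile per hour
G = 9.81                # acceleration due to gravity, m/s^2

target_speed = 60 * MPS_PER_MPH              # ~26.8 m/s

for seconds in (3, 30):
    acceleration = target_speed / seconds    # assuming constant acceleration
    print(f"0-60 mph in {seconds}s is about {acceleration / G:.2f} g")

# 0-60 mph in 3s is about 0.91 g
# 0-60 mph in 30s is about 0.09 g
```

Roughly a tenfold difference in the force your body has to put up with, for exactly the same final speed.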

F1 Example

Raikkonen F1 Crash British GP

Kimi Raikkonen’s 47G crash at Silverstone 2014

Those who enjoy F1 may remember Kimi Raikkonen’s horrific 150mph crash at Silverstone this year. For a matter of seconds the Finn had 47Gs of force exerted upon him. For an F1 driver, 150mph is not an unusual speed; however, spinning at that speed and coming to a sudden stop caused the dramatic force that Raikkonen endured. Had Raikkonen been subjected to 47Gs of force for over a minute, the likelihood is he would have died, but because it lasted only a short period of time, he was able to race again two weeks later, having sustained no lasting injuries.

Unlike us, robots can be built to sustain such forces, which is one of the reasons why missions like Rosetta and Voyager can send probes huge distances in (relatively) short periods of time.

Let’s hope that in the near future someone discovers a way to keep g-forces at bay, to enable us to travel further into space, faster!