● add a new level of security
Here are some recommended CDNs:
● BunnyCDN
● CloudFlare
● KeyCDN
● StackPath
[20] Gzip is a method of compressing (i.e. shrinking) your files, which enables faster network transfers. Thanks to the smaller size, your content loads faster when
someone accesses your site. Even though Gzip compression is standard practice these days, there are still some hosts which do not offer this feature by
default. So make sure to check whether it's enabled.
You can easily test for Gzip compression by checking out the following app:
https://www.giftofspeed.com/gzip-test/
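If you'd rather check programmatically, here is a minimal sketch using Python and the requests library (example.com is a placeholder for your own domain): request a page while advertising gzip support, then inspect the Content-Encoding response header.

import requests

resp = requests.get(
    "https://example.com",                      # placeholder: your site
    headers={"Accept-Encoding": "gzip"},        # advertise gzip support
)
encoding = resp.headers.get("Content-Encoding", "none")
print(f"Content-Encoding: {encoding}")          # "gzip" (or "br") means compression is on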
If the result shows that Gzip compression is not enabled, simply reach out to your web host and ask them to activate it. If that's not possible, it might be time to
switch to a new web host.
[21] What is HTTP? HTTP is the network protocol that governs the exchange of all information between the user and
your site/server. It's basically the channel that enables users to load websites.
There are two different versions of HTTP: the later version, HTTP/2, is basically a revision of HTTP/1 with major performance improvements. Your goal
should be to have your server run on HTTP/2: the update is a fundamental makeover, with 18 years between the two releases. The most
advanced feature of HTTP/2 is that it can send several data requests over a single TCP connection at the same time, which allows web files
to be downloaded from a server asynchronously.
Here are two tools that let you test if your website supports HTTP/2:
● https://http2.pro/
● https://tools.keycdn.com/http2-test
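You can also run the same check programmatically. Here is a minimal sketch assuming Python and the httpx library installed with HTTP/2 support (pip install "httpx[http2]"); example.com is a placeholder:

import httpx

with httpx.Client(http2=True) as client:
    resp = client.get("https://example.com")    # placeholder: your site
    print(resp.http_version)                    # "HTTP/2" if supported, else "HTTP/1.1"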
If the result shows that your website doesn't support HTTP/2, reach out to your web hosting provider. If your provider can't enable this feature for you, it's time to
switch hosts.
The next version, HTTP/3, is scheduled to launch later this year, so keep an eye out!
[22] Caching is the process by which the server stores website data in a static form so it can be served faster. Since it increases page
loading speed, it also improves the user experience.
There are different types of caching out there:
● server-side cache
● database cache
● object cache
● page cache
● CDN cache
Each type stores the data at a different level.
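To make the idea concrete, here is a minimal sketch of a page cache in Python (the render function is a hypothetical placeholder for whatever builds your page); caching plugins apply the same hit-or-miss logic at the server, database, object, or CDN level:

import time

CACHE = {}          # url -> (rendered_html, timestamp)
TTL_SECONDS = 3600  # serve a cached copy for up to one hour

def get_page(url, render):
    # Return cached HTML for url, re-rendering only when stale.
    cached = CACHE.get(url)
    if cached and time.time() - cached[1] < TTL_SECONDS:
        return cached[0]                  # cache hit: skip the expensive work
    html = render(url)                    # cache miss: build the page
    CACHE[url] = (html, time.time())
    return html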
Having a caching solution in place should be standard for every website. Luckily, there are many solutions available that enable caching, especially for
WordPress.
Many web hosting providers offer their own caching solutions built around their existing infrastructure – examples are SG Optimizer by
Siteground or Breeze by Cloudways. These solutions are free to use when you host your websites with the respective providers.
Among the premium plugins available, we recommend the following ones:
● WpRocket
● Swift Performance
Some of these plugins do more than just caching and are real website optimization powerhouses, with functions such as image optimization, CSS
& JS file optimization, database optimization, and more.
[23] By minifying JavaScript and CSS files you make your website smaller in size, which again reduces loading time.
A good way to do this is with one of the caching plugins mentioned above.
In this context it is also worth mentioning the Autoptimize WordPress plugin. Even though the plugin doesn't help with caching per se, it does deal with
image and file optimization.
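To illustrate what minification does, here is a tiny hand-made example (the rule itself is arbitrary): the original stylesheet

body {
    margin: 0;
    padding: 0;
}

minifies to the single line

body{margin:0;padding:0}

The whitespace, line breaks, and (in real minifiers) comments and redundant syntax are stripped, so the exact same rules arrive in fewer bytes.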
[24] The factor with the biggest impact on a website's size is media-rich content such as images and videos.
Websites often contain very large images – frequently in "straight-from-the-camera" sizes. Many people don't even consider this a problem, but think about
it: when a page holds several 7 MB pictures and a user accesses it over a 3G connection, it becomes clear that size is critical!
The easiest way to make sure all pictures are of adequate size is to have them resized automatically, for example by using plugins. For WordPress, there
are several plugins that solve this (a sketch of the underlying idea follows the list):
● Shortpixel Image Optimizer
● Resize Image After Upload by Shortpixel
● Imsanity
● Swift Performance Pro
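As promised above, here is a minimal sketch of the automatic-resizing idea, assuming Python and the Pillow imaging library (the file names and the 1920-pixel limit are illustrative assumptions):

from PIL import Image

MAX_SIZE = (1920, 1920)               # assumed maximum display size

img = Image.open("upload.jpg")        # hypothetical camera-sized upload
img.thumbnail(MAX_SIZE)               # shrinks in place, keeps aspect ratio
img.save("upload-resized.jpg")        # store the web-friendly copy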
[25] This point sounds similar to the previous one but we can’t stress the importance of this enough: Images have a huge impact on the size of your site
(and performance!). Converting your pictures into the right format is probably around 80% of your website optimization.
Again, plug-ins are your friend! For WordPress, there are many different plugins for image optimization available, some are part of general optimization
plugins, others serve as stand-alone solutions. Here are some examples:
● Shortpixel Image Optimizer
● Imagify, part of the Wp Rocket family
● SmushIt
● Flying Images
● Imsanity
However, while striving for small pictures, always keep the image quality in mind. Smaller pictures mean lower quality. So what is the
golden middle? Try to get the best of both worlds by going with the "glossy" approach: not too aggressive (lossy – best performance, at the cost of image
quality) and not too soft (lossless – best image quality, low performance).
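Here is a minimal sketch of a format conversion in the same spirit, again assuming Python and the Pillow library (the file names and the quality value of 80 are illustrative assumptions, roughly matching that middle ground):

from PIL import Image

img = Image.open("photo.jpg")                 # hypothetical source image
img.save("photo.webp", "WEBP", quality=80)    # re-encode as WebP with moderate compression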
[26] “Lazy loading images” (and iframes) means “loading images on websites asynchronously — that is, after the above-the-fold content is fully loaded, or
even conditionally, only when they appear in the browser’s viewport. This means that if users don’t scroll all the way down, images placed at the bottom of
the page won’t even be loaded.” – Sitepoint
This is a great technique to speed up a website tremendously.
And once again – there are plugins for this.
There are various options out there – ranging from stand-alone solutions to plugins that are part of general performance enhancement efforts (a minimal markup example follows the list):
● Autoptimize (images & iframes)
● SmushIt (images only)
● Flying Images (images only)
● WpRocket (images & iframes)
● Swift Performance Pro (images & iframes)
[27] This is a golden rule when it comes to IT: Make sure your technology is compliant with all the latest features, updates and plugins AT ALL TIMES. This
is essential for both security and performance reasons: These days there are constantly new updates released to keep up with latest developments and
security gaps. If your IT systems are not compatible with these new features you quickly fall behind.
And a hacked website is the last thing we want – it means you lose control over your OWN website, and the rank you worked so hard for will take an
immediate dip.
If you work with WordPress, make sure to regularly update WordPress core, plugins and themes.
Also, when it comes to web hosts, don't forget that the server your site runs on must be modern and in line with the latest technology
standards as well! It should be running recent versions of PHP, MySQL, Apache or Nginx, etc. Usually this is a given if you use a cloud web
hosting platform like Cloudways or a more sophisticated one such as Kinsta.
[28] This point is specifically targeted at WordPress users.
In WordPress, there is a central database that stores all your content and data, such as posts, pages and comments. However, over time, this database
also piles up unnecessary “junk information” which slows it down. To ensure your database is operating at its maximum efficiency level, you will need to
optimize it from time to time. Here are some plugins for this purpose:
● Wp Optimize
● Optimize Database after Cleaning Revisions
● Breeze
● Wp Rocket
● Swift Performance
[29] As we have heard multiple times now, search engines want to make sure their users have the best possible experience during their search. Hence, user
experience and security are two of the main drivers when it comes to ranking high.
You might remember the major buzz when Chrome started flagging non-HTTPS websites and marking them as not secure. So if your site
doesn't run on HTTPS, your users receive a notification that it might not be secure. A nightmare for the site owner!
Basically, the algorithm works as follows: if two websites are (close to) identical in terms of other parameters, Google will rank the one running on HTTPS higher.
But be careful: a common mistake is to move the site to HTTPS but forget to set up a 301 redirect from the HTTP to the HTTPS version. That way you
basically end up with two websites: one running on HTTP and one running on HTTPS. This is a serious problem, as Google now sees two separate
websites instead of one!
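Here is a minimal sketch of such a redirect, assuming an Apache server where you can edit the .htaccess file (other servers have equivalent directives):

RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

Every HTTP request is answered with a permanent (301) redirect to the HTTPS version of the same URL, so search engines see a single site.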
[30] Similar to the previous paragraph, always make it very clear which URL version your website runs on! There are two versions, one including the "www"
and one skipping it.
https://domain.com – without "www" (recommended)
https://www.domain.com (also works)
Always make sure to be consistent with it when it comes to link building and social networks. If the "www" is not needed, don't mention it. For example, if
your website loads without the "www", it would be weird and unnecessary to use the format https://www.domain.com across your social
networks.
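As with HTTPS, you can enforce one host version with a 301 redirect. Here is a minimal sketch for dropping the "www", again assuming an Apache server and .htaccess access:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^ https://%1%{REQUEST_URI} [L,R=301]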
[31] What is mixed content?
Mixed content refers to a mix of secure and non-secure resources on a webpage. It happens whenever a secure webpage attempts to load
resources, such as images or CSS, that are not served securely. Sounds abstract – why is this a problem?
Well, Google announced that future versions of Chrome will block mixed content.
This in turn means that Chrome will simply not load these parts of your website, potentially doing severe damage to the visual side of
your page.
An example: let's say your site runs on HTTPS, but for some reason loads a CSS file over HTTP (seen as non-secure). That's when you have created a mixed
content error.
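In markup, the difference is a single character (example.com stands in for your domain):

<link rel="stylesheet" href="http://example.com/style.css">   – mixed content on an HTTPS page
<link rel="stylesheet" href="https://example.com/style.css">  – fixed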
This issue commonly happens when you use an old theme or outdated plugins. Therefore, always make sure that your themes and plugins are up-to-date
and come from trusted sources only.
[32] According to Google, "a sitemap is a file where you provide information about the pages, videos, and other files on your site, and the relationships
between them. Search engines like Google basically read this file to more intelligently crawl your site."
A sitemap essentially provides the search engine with a more in-depth idea of the structure of your site. This, in turn, helps the search engine
understand which pages it should crawl, and to do so faster and better.
In case you're using SEO plugins like Yoast or RankMath, your sitemap is generated automatically. To give you a better idea, this is what a sitemap looks
like:
https://seobuddy.com/sitemap.xml
http://seobuddy.com/blog/sitemap_index.xml
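At its core, a sitemap is a short XML file following the sitemaps.org protocol. A minimal sketch (the URL and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>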
[33] Robots.txt, also known as the "robots exclusion protocol", is a file which contains instructions for crawling spiders. It essentially tells the search
engine which pages to crawl and which NOT to crawl.
Every Googlebot has a maximum "crawl budget" (meaning how many URLs the Googlebot can crawl). Hence, it's crucial to guide the spider to make
sure it crawls the most important pages and neglects the ones that are not as important anyway (attachment pages, sometimes tags, etc.) – and not vice
versa!
Here’s an example for a robots.txt file: https://seobuddy.com/robots.txt
Now, let's do a little experiment:
Try to open your own robots.txt file by accessing https://yourwebsite.com/robots.txt
Does the file open? Yes? Fantastic – this means you have a robots.txt in place for your website. If not, simply create one and drop it into the root of
your website via FTP.
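A minimal example of what such a file can look like for a WordPress site (the rules and the sitemap URL are illustrative assumptions – adapt them to your own site):

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://example.com/sitemap.xml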
Here’s what you can do with the robots.txt file:
● Sculpt the crawling budget by eliminating unnecessary pages from the crawl
● Prevent spiders from accessing "private" pages and showing them in search results – e.g. "membership" or "thank you" pages that are of
secondary importance
[34] Maybe you have noticed it before when browsing the internet: the same URL is mentioned twice, once with and once without the trailing
slash.
Here’s an example:
● https://site.com/the-seo-checklist/
● https://site.com/the-seo-checklist – without the trailing slash
As mentioned before, this basically means that Google sees two different pages – for the same URL. When it comes to ranking your page, you will want to
avoid this at all costs.
How to fix it?
First, check that the canonical URL is set to the version with the trailing slash – if it already is, you're fine.
To fix this issue once and for all, simply set up a 301 redirect from the URL without the slash to the one with the slash.
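Here is a minimal sketch of that redirect, once more assuming an Apache server and .htaccess access:

RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [L,R=301]

Any URL that is not a real file and doesn't end in a slash is permanently redirected to its slash-terminated twin.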
[35] Especially after blogging or running a website for years, it is quite likely that you have a couple of broken links flying around. A broken link is a
reference to another page which no longer exists. If you or one of your users tries to access a page like this, it will return a 404 error.
Depending on how many links you have, you might now ask: "How do I quickly identify them without clicking on every single link on my website?" No worries – there are crawling tools to help you out with this. Two recommended tools are Sitebulb and Screaming Frog SEO Spider (free up to 500 URLs). Go check
them out!
While there is – as always – a WordPress plugin for the WordPress users, it is not recommended, as it will create performance issues due to its
search method: the search requires many resources, which in turn might cause the server to go down, especially if you don't have a
high-performance web host.
For more details, check Kinsta's showcase of 5 ways on how to best check for broken links.
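If you just want to spot-check a handful of known links, a few lines of Python with the requests library will do (the URLs are placeholders):

import requests

urls = [
    "https://example.com/page-one/",     # placeholders: links you want to verify
    "https://example.com/page-two/",
]
for url in urls:
    try:
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = "unreachable"
    if status == 404 or status == "unreachable":
        print(f"Broken: {url} ({status})")

For a full crawl of every link on the site, the dedicated tools above remain the better choice.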
[36] Whenever two or more redirects happen in a row, it is called a redirect chain. Here is an example:
Page A -> Page B -> Page C -> Page D.
Why should we try to break redirect chains?
● it increases crawling efficiency
● it reduces page loading times
● it improves link equity, which means that Page D gets more "link power", which eventually results in a higher ranking
The easiest way to fix this and break the chain is by simply redirecting Page A to Page D via a 301 redirect.
As mentioned in the previous paragraph, it is recommended to use a crawling tool – again we suggest Sitebulb or Screaming Frog SEO Spider.
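You can also detect a chain for a single URL with a few lines of Python, since the requests library records every intermediate hop (the URL is a placeholder):

import requests

resp = requests.get("https://example.com/old-page/", allow_redirects=True)
for hop in resp.history:                      # one entry per redirect followed
    print(hop.status_code, hop.url)
print("Final:", resp.status_code, resp.url)

More than one entry in resp.history means there is a chain worth collapsing into a single 301.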
[37] As mentioned before, the meta title and description are the information that is shown in search results. For many users, this little piece of text will be the
first touchpoint with your website – eventually deciding whether someone clicks on your page or not.
Sometimes it happens – either by accident or because of the CMS your website is built on – that this meta data goes missing or you end up with duplicates.
Of course this is not ideal, and it should hence be checked regularly.
Similar to the two previous paragraphs, you can use a crawling tool such as Sitebulb or Screaming Frog to find and fix this. On top, here's a good tutorial on how to
avoid these issues altogether.
[38] There are multiple different redirect types that can be used in various scenarios. The most important ones related to SEO are…
● …the 301 redirect – a permanent redirect
● …the 302 redirect – a temporary redirect
● …the 410 status code – this page doesn't exist anymore (strictly speaking a response code rather than a redirect)
To give you a better idea about when to use which redirect type, you will find some use cases below, together with the matching redirect type:
Scenario 1: You want to consolidate page A with the content of page B
-> Use a 301 redirect from B to A
Scenario 2: You want to create a new page A with updated information, but you already have a page B treating the same subject
-> Use a 301 redirect from B to A
Scenario 3: You want to transfer the link power from page A to B
-> Use a 301 redirect from A to B
Scenario 4: You want to delete a page and mark it as "gone" from the internet
-> Use a 410 status code
Now that we have talked about the different redirect types and use cases, how do we actually create these redirects? While you could do it manually, it's
easier and faster to do so via the .htaccess file (a short sketch follows the plugin list below). Or, when you work with WordPress, simply use a plugin.
Here are a couple for you to consider:
● Redirection
● Simple 301 Redirects
● RankMath
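As referenced above, here is a minimal sketch of the .htaccess route, assuming an Apache server (the paths and the target URL are placeholders):

Redirect 301 /old-page/ https://example.com/new-page/
Redirect 410 /gone-page/

The first line permanently sends visitors and link power from /old-page/ to the new URL; the second tells clients and search engines that /gone-page/ is permanently gone.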
[39] To check your website for index coverage issues, simply go to your Google Search Console property and analyze the Coverage report under the Index
section.
Now what should you look for?
The tool reports issues such as:
● a drop in total indexed pages without corresponding errors (this could mean you might be blocking access to some pages, e.g. via robots.txt or by
noindex-ing them)
● error spikes
● missing pages
● server errors
● 404 errors
All this information helps you to get a good overview and make educated SEO decisions.