Proper execution of an SEO strategy is a must to get the results you expect, which is why we are providing you with a detailed Technical SEO Checklist so you can make sure you have all your SEO bases covered.
We have divided the entire checklist into six sections, to make it simple to read and easy to execute.
- Basic SEO checklist
- Keyword research checklist
- On-page SEO checklist
- Content checklist
- Technical SEO checklist
- Link building checklist
So, let’s begin.
Basic SEO Checklist
Basic SEO Checklist broadly includes the practices that you must execute before thoroughly dissecting and analyzing your website.
1. Install Yoast SEO
Yoast SEO is a basic necessity for every website. Yoast is a free SEO plugin available for WordPress and a few other CMSs. It gives you ample features for the initial setup of your website for search engines.
With each blog/page you create, the Yoast SEO plugin gives you the benefit of adding an SEO Title, Slug, and Meta Description.
Moreover, you also get multiple other essentials, such as:
- The option to block pages from search results
- Custom message for Facebook Opengraph
- Custom message for Twitter cards
2. Create a Sitemap
The simplest definition of a sitemap is a website’s blueprint. By creating a sitemap, you give search engines a roadmap of your website and a list of the web pages they can crawl and index.
If you have installed Yoast SEO (Step 1), you already have a built-in sitemap at the URL:
www.yourwebsite.com/sitemap_index.xml
The Yoast SEO plugin generates a nested sitemap, which makes it easier for search engines to understand your different post types.
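To see what a sitemap index actually contains, you can parse one with Python’s standard library. The XML below is a hypothetical, minimal sitemap index modeled on Yoast’s output; the URLs and the `list_child_sitemaps` helper are our own illustration.

```python
import xml.etree.ElementTree as ET

# A minimal sitemap index, similar to what Yoast serves at
# /sitemap_index.xml (the URLs below are placeholders).
SITEMAP_INDEX = """<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://www.yourwebsite.com/post-sitemap.xml</loc></sitemap>
  <sitemap><loc>https://www.yourwebsite.com/page-sitemap.xml</loc></sitemap>
</sitemapindex>"""

def list_child_sitemaps(xml_text):
    """Return the <loc> URL of every child sitemap in a sitemap index."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:sitemap/sm:loc", ns)]

print(list_child_sitemaps(SITEMAP_INDEX))
```

Each child sitemap then lists the actual page URLs for one post type, which is exactly the nesting described above.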
If you aren’t using Yoast SEO Plugin, search for an alternative that works best with your CMS.
3. Create a Robots.txt File
Robots.txt is yet another essential file; it keeps search engines from crawling specific pages. You can restrict search bots from visiting specific pages, folders, or directories.
The standard robots.txt file for WordPress is:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
For Yoast SEO plugin users, you can create a robots.txt file with a single click.
Other WordPress users can create a robots.txt file manually and upload it to the root folder.
A robots.txt file includes three segments:
- User-Agent: * [applies to all bots]
- Disallow: relative URLs [the list of URLs that you don’t want search engines to crawl]
- Allow: relative URLs [any specific URLs that you do want crawled]
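As a quick sketch of how these three segments interact, Python’s standard library ships a robots.txt parser you can use to verify your rules. One caveat: Python’s parser applies rules in order (first match wins), so the Allow line is listed before the Disallow line here; Googlebot itself uses longest-path matching, so the order in your real file doesn’t matter for Google.

```python
from urllib.robotparser import RobotFileParser

# The standard WordPress rules from above, with Allow listed first
# because Python's stdlib parser is first-match rather than longest-match.
rules = [
    "User-agent: *",
    "Allow: /wp-admin/admin-ajax.php",
    "Disallow: /wp-admin/",
]

rp = RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("*", "https://www.yourwebsite.com/wp-admin/"))                # blocked
print(rp.can_fetch("*", "https://www.yourwebsite.com/wp-admin/admin-ajax.php"))  # allowed
print(rp.can_fetch("*", "https://www.yourwebsite.com/blog/"))                    # allowed
```

Running a check like this before deploying a new robots.txt is a cheap way to avoid accidentally blocking your whole site.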
4. Install Google Tag Manager
Google Tag Manager (GTM) is an all-in-one platform for adding scripts to your website without any programming skills.
Integrating GTM is highly recommended; it will help you set up Google Analytics (UA and GA 4.0), Facebook Pixel, Google Ads tracking, Bing Ads tracking, event tracking, and much more.
Google Tag Manager provides you two pieces of code.
The first snippet needs to be placed in the HEAD section, and the second needs to be added at the beginning of the BODY section.
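As a rough sketch, the placement looks like this. The snippet bodies themselves must be copied exactly from your Tag Manager account (each container has its own unique ID, e.g. GTM-XXXXXXX), so they are shown here only as comments:

```html
<!DOCTYPE html>
<html>
<head>
  <!-- First GTM snippet: paste it as high in the <head> as possible.
       Copy the exact code from your Tag Manager account. -->
</head>
<body>
  <!-- Second GTM (noscript) snippet: paste it immediately after the
       opening <body> tag. -->
  <p>Page content...</p>
</body>
</html>
```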
With this, you get the complete power of adding third-party scripts without opening the website’s backend.
NOTE: Google Tag Manager is a free to use Tool.
5. Setup Google Analytics
Google Analytics is yet another powerful tool that gives you user and traffic insights for your website.
You can easily setup your Google Analytics code via GTM.
Click Here and complete the process of Google Analytics setup via Google Tag Manager.
6. Setup Google Search Console
Google Search Console is another free tool; it helps you track your website’s health with regard to search engines.
Google Search Console brings to your attention crawling issues, indexing issues, AMP issues, backlinks, and much more.
Click Here to set up your Google Search Console via GTM.
Many of you should also consider Bing Webmaster Tools, which has similar functionality to Google Search Console.
You can follow the same procedure as for Google Search Console to set up Bing Webmaster Tools.
On-Page SEO Checklist
Keyword research is often not counted as part of technical SEO. Still, it plays a significant role in building authority and attracting quality traffic to your website.
Below is the checklist that will help you execute complete On-Page SEO.
1. Write Short & Descriptive URL
Within a given niche, short URLs tend to outrank long ones.
Follow the complete hierarchy when creating the URL.
E.g. The shortest acceptable URL for your Blog could be:
https://www.yourwebsite.com/blog/your-blog-title/
Here, keep in mind that words are always separated by a hyphen (-), which helps Google distinguish the individual words and relate them better to your content.
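The hyphen-separation rule is easy to automate when generating slugs from titles. Below is a minimal Python sketch; the `slugify` helper is our own illustration, not a standard library function.

```python
import re

def slugify(title):
    """Turn a page title into a short, hyphen-separated URL slug."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # any run of non-alphanumerics -> one hyphen
    return slug.strip("-")                   # drop leading/trailing hyphens

print(slugify("Hello, World!"))  # -> hello-world
```

WordPress does this for you automatically, but it is worth reviewing auto-generated slugs and shortening them by hand where they get long.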
To create the best and most effective URLs for your webpages, follow the ‘Anatomy of a URL’.
2. Write Compelling Title (Meta Title)
The title of a web-page is displayed on the Search Engine Result Pages (SERP) and is highly effective in improving Click-Through-Rate (CTR).
You can check the Title tag in the website’s source code.
<head><title>Example Title</title></head>
Always keep your webpage title within 50-60 characters. Strictly speaking, there isn’t a fixed character count for titles; Google truncates them by pixel width, so staying within roughly 600 pixels (about 50-60 characters) is a safe guideline.
Also, place the most important keywords early in your title tag, and do not duplicate title tags; they must be unique for every page on your website.
To check how your webpage title will appear in the search results, take help of Snippet Optimizer from SEOMofo.
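As a rough pre-publish sanity check, a few lines of Python can flag titles that risk truncation. This is a sketch: the 60-character cutoff is only a proxy for Google’s ~600-pixel limit, and `title_status` is our own helper.

```python
def title_status(title, max_chars=60):
    """Flag titles that risk truncation on the SERP.

    60 characters is a rough proxy for Google's ~600 px limit;
    actual truncation depends on pixel width, not character count.
    """
    n = len(title)
    if n == 0:
        return "missing"
    if n > max_chars:
        return "too long ({} chars)".format(n)
    return "ok ({} chars)".format(n)

print(title_status("Technical SEO Checklist: 17 Checks for Higher Rankings"))
```

Running this over a crawl export of all your titles quickly surfaces missing and oversized ones.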
3. Write Compelling Meta Description
Meta Description is yet another essential on-page aspect that must be considered to increase the CTR.
Basic code for Meta description is:
<meta name="description" content="This is an example of a meta description. This will often show up in search results.">
Like the Title, the Meta Description doesn’t have a fixed character limit. However, it is recommended to keep it within roughly 155–160 characters.
NOTE: Sometimes, it’s OK not to write a meta description. Google will then pull the best-fit content from your webpage and display it on the SERP.
4. Check and Create ALT Tags for All Images
Alt tags, better known as alternative text, are displayed in case an image does not load on screen. The code for an alt tag is:
<img src="digital-marketing-services.png" alt="Digital Marketing Services">
ALT Tags are highly recommended to give Google information about your image. They are an important part of Image SEO which you shouldn’t miss at any cost.
Screaming Frog is a great tool to help you compile a list of Images with missing ALT Tags.
Tips for a Good ALT Tag for images:
- Describe the image as specifically as possible
- Keep it short
- Use your keywords
- Avoid keyword stuffing
- Don’t include “image of,” “picture of,” etc. in your alt text
5. Check for Duplicate Content & Clean It Up
Duplicate content on your website is a big mistake that can ruin your overall SEO efforts. With enough duplicate content, your website could be penalized and removed from the SERP. Find excessive duplicate content and remove it.
Most common causes of duplicate content within a website include:
- Dynamic URLs with similar content
- Running the website on both the http and https versions
- Using scraped content (from other websites)
You can fix duplicate content issues by:
- 301 redirects: redirect outdated URLs to the new, authoritative URLs.
- Using rel="canonical" to mark the preferred webpage.
- Using meta robots tags to deindex duplicate pages.
- Properly redirecting the http version to https.
- Writing fresh content and updating your webpages.
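For the rel="canonical" fix, the tag goes in the `<head>` of the duplicate page and points at the preferred URL. A minimal sketch (the URL is a placeholder):

```html
<head>
  <!-- Tell search engines which URL is the preferred version of this content -->
  <link rel="canonical" href="https://www.yourwebsite.com/preferred-page/">
</head>
```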
6. Check & Edit Robots.txt
The robots.txt file is essential for telling search engine crawlers which web pages are available for crawling. Moreover, it restricts search bots from crawling specific pages.
You can use multiple User-agent directives to give different bots different crawling instructions.
A quick To-Do with Robots.txt file:
- Place the robots.txt file, with exactly that name, in the website’s root directory
- The robots.txt file name is case sensitive
- The robots.txt file is publicly accessible to everyone
- Each sub-domain has its own robots.txt file
- It’s good practice to include the sitemap URL in your robots.txt file
To check whether you have a robots.txt file, add “robots.txt” after your root domain.
E.g. www.yourwebsite.com/robots.txt
7. Robots Meta Directives
It’s not practical to add a long list of URLs to the robots.txt. This is where robots meta directives become important.
A robots meta directive is a one-line piece of code that tells search engine bots whether to index and/or follow a webpage.
Most common meta directives are:
- Noindex
- Index
- Follow
- Nofollow
- Noimageindex
- None
- Noarchive
- Nocache
- Nosnippet
- Noodyp/noydir [OBSOLETE]
- Unavailable_after
You can learn more about the robots Meta directives here.
The code for writing a robots meta directive is as below:
<meta name="robots" content="noimageindex, nofollow, nosnippet">
8. Check Schema Markup
Schema markup is additional code/tags added to your HTML that improves your visibility in the SERP.
Schema.org is a collaborative project by Google, Bing, Yahoo, and Yandex that helps you showcase additional information about your webpage.
Google provides a free online tool, the Structured Data Testing Tool, to test the schema available on your website. It also reports any schema errors.
If you are using WordPress for blog/article publishing, it already comes with the Schema Markups. However, you can still modify it to fit best for your business usage.
With Yoast SEO installed in WordPress, you can provide your business information that is automatically added to your website’s HTML code.
There are numerous Schema item types that you must know when optimizing your website for Schema Markup. Most common Schema item types are:
- Creative work
- Event
- Organization
- Person
- Place
- Product
You can get complete information about the Schema here. Simply search for the required schema and get entire details about its usage, code, etc.
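As an illustration, a minimal Organization markup in JSON-LD (the format Google recommends) might look like the sketch below; every value is a placeholder for your own business details:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Your Business Name",
  "url": "https://www.yourwebsite.com/",
  "logo": "https://www.yourwebsite.com/logo.png"
}
</script>
```

Paste a block like this into the page’s `<head>` (or let Yoast generate it), then validate it with the testing tool mentioned above.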
9. Check HTTP Response Code
Your website should run entirely on a single protocol.
This means you should run the entire website on either HTTP or HTTPS. Running one website on both creates two versions of it, resulting in duplicate content.
Know the common response code classes and audit your website against them.
The broad classes of HTTP Status Codes are:
- HTTP 1xx status codes: This code class indicates that the request was received but hasn’t been processed yet.
- HTTP 2xx status codes: It indicates that everything is correct and is as per plan (HTTP status code 200 OK)
- HTTP 3xx status codes: Indicates that your URL is redirected to a new location (301 and 302 are highly used in Digital Marketing)
- HTTP 4xx status codes: Client error; the requested page could not be served (e.g., 404 Not Found)
- HTTP 5xx status codes: Server error; the server is unable to complete the request
Screaming Frog can help you fetch all the URLs of your website with their response code.
You should review every webpage that returns an HTTP response code other than 200.
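The five classes are easy to capture in a few lines of Python, which is handy when bulk-checking a crawl export. This is a sketch; `status_class` is our own helper.

```python
def status_class(code):
    """Map an HTTP status code to its broad class."""
    classes = {
        1: "informational - request received, still processing",
        2: "success - e.g. 200 OK",
        3: "redirection - e.g. 301 permanent, 302 temporary",
        4: "client error - e.g. 404 Not Found",
        5: "server error - request could not be completed",
    }
    return classes.get(code // 100, "unknown")

# Anything outside the 2xx class deserves a closer look.
for code in (200, 301, 404, 503):
    print(code, status_class(code))
```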
10. Use One H1 Tag on Every Page
John Mueller once replied to a tweet regarding multiple H1 tags on a single page, saying clearly that there isn’t any issue with using multiple H1 tags on one webpage.
However, use of a single H1 Tag is highly recommended.
With the use of one H1 Tag in a webpage, you tell the user and search engines about the theme of the webpage.
To learn in depth about the use of H1 tags on any website, read this blog by Neil Patel.
Commonly, every page has a title, which is often used as the H1 tag for that page. The best way to find the H1 tag on any website is to analyze its source code.
You can simply search for ‘<h1>’ in the source code and make your decisions accordingly.
11. Check Proper Usage of Heading Tags (H1 to H6)
H1 to H6 are the HTML tags that allow you to create a hierarchy within your content.
Where H1 signifies the top-level heading, H6 marks the lowest level.
In SEO, the H1 tag is considered the most important and H6 the least important.
Basic syntax for Heading Tag is:
H1 Tag: <h1> This is Heading 1 </h1>
H2 Tag: <h2> This is Heading 2 </h2>
H3 Tag: <h3> This is Heading 3 </h3>
H4 Tag: <h4> This is Heading 4 </h4>
H5 Tag: <h5> This is Heading 5 </h5>
H6 Tag: <h6> This is Heading 6 </h6>
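To audit headings without a paid crawler, Python’s built-in `html.parser` can list every heading on a page. A minimal sketch (`HeadingAudit` is our own illustration):

```python
from html.parser import HTMLParser

class HeadingAudit(HTMLParser):
    """Collect every h1-h6 tag so the heading hierarchy can be reviewed."""
    def __init__(self):
        super().__init__()
        self.headings = []
        self._open = None  # heading tag currently open, if any

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self._open = tag

    def handle_data(self, data):
        if self._open:
            self.headings.append((self._open, data.strip()))

    def handle_endtag(self, tag):
        if tag == self._open:
            self._open = None

page = "<h1>Main Topic</h1><p>intro</p><h2>Subtopic</h2><h3>Detail</h3>"
audit = HeadingAudit()
audit.feed(page)
print(audit.headings)
# -> [('h1', 'Main Topic'), ('h2', 'Subtopic'), ('h3', 'Detail')]
```

From the collected list you can check both points at once: exactly one H1, and no skipped levels in the hierarchy.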
Check whether the Heading Tags are properly added to your website.
Before considering this, make sure Point 10 is checked and verified.
Below is a basic example of heading tags used in a perfectly optimized Blog/Article
For detailed knowledge about how to use Headings in your Website consider this blog by Yoast.
12. Check Page Speed
Google will soon introduce the Core Web Vitals, which will bring page speed into the ranking factors.
You can get complete knowledge about Core Web Vitals here.
Hence, it’s essential to consider your website loading speed and optimize it ASAP.
FACT: As of 2018, more than 50% of search engine users search on mobile devices.
So, it’s essential to focus more on mobile devices and serve a quick-loading website for the mobile users.
The quicker your website loads, the more users will stay and turn into customers.
Common ways to reduce your website’s loading time are:
- Enable compression
- Minify CSS, JavaScript, and HTML
- Reduce redirects
- Remove render-blocking JavaScript
- Leverage browser caching
- Improve server response time
- Use a Content Distribution Network (CDN)
- Optimize images
PageSpeed Insights is the best online tool you can use to check your website speed. It also gives you a checklist that can help you minimize loading time.
For a more detailed report on your website, consider Lighthouse. Google Lighthouse runs in the Chrome browser, audits your website, and gives you detailed information in a downloadable format.
To provide a good user experience:
- Sites should strive to have a First Input Delay (FID) of 100 milliseconds or less.
- Sites should strive to have a Cumulative Layout Shift (CLS) score of 0.1 or less.
- Sites should strive to have a Largest Contentful Paint (LCP) of 2.5 seconds or less.
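Google’s published “good” thresholds (LCP ≤ 2.5 s, FID ≤ 100 ms, CLS ≤ 0.1) are easy to encode as a quick pass/fail check, for example against numbers pulled from a Lighthouse report. The `core_web_vitals_pass` helper is our own sketch:

```python
def core_web_vitals_pass(lcp_seconds, fid_ms, cls):
    """Check each metric against Google's published 'good' thresholds."""
    return {
        "LCP": lcp_seconds <= 2.5,  # Largest Contentful Paint, seconds
        "FID": fid_ms <= 100,       # First Input Delay, milliseconds
        "CLS": cls <= 0.1,          # Cumulative Layout Shift, unitless
    }

print(core_web_vitals_pass(lcp_seconds=2.1, fid_ms=80, cls=0.05))
# -> {'LCP': True, 'FID': True, 'CLS': True}
print(core_web_vitals_pass(lcp_seconds=4.0, fid_ms=80, cls=0.3))
```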
13. Check Mobile Friendly Trait of Your Website
Yes, mobile plays a significant role in building your brand and serving your business to customers.
Google clearly stated that beginning July 1st, 2019, it would predominantly use the mobile version of your website for indexing.
It’s likely that most of the users landing on your site are using mobile devices; hence you should ensure your website runs smoothly with responsive design on the mobile devices.
You can test the mobile responsiveness of your website here.
If your website has any responsiveness issues, it’s highly recommended to fix them for a better user experience.
Otherwise, you will see the result as “Page is Mobile Responsive.”
14. Check for Crawling & Indexing Issues
Google Search Console is the best free tool for detailed insight into your website; it provides the in-depth status of crawling and indexing issues.
You have two broad options in Search Console to check crawling and indexing issues.
Primarily, you are provided with the detailed status of your website.
However, if required, you can check the status of a single URL as well.
URL Inspection is a great way to find crawling and indexing issues associated with a particular URL.
Make sure your website doesn’t have crawling and indexing issues. If found, rectify them immediately.
Once rectified, you can request indexing for the specific URL and wait for its status.
15. Check Open Graph & Twitter Cards
Open Graph is a protocol that allows website owners to show custom text when a specific URL is shared on social platforms.
If you are using the Yoast SEO plugin with WordPress, you get the option to customize the Open Graph content as below.
You can check the source code of your website and check whether Open graph tags are present or not.
The common syntax for Open Graph tags is the same as for other meta tags; the only difference is that they start with “og:”.
Similar to Facebook’s Open Graph, Twitter has its own alternative: Twitter Cards.
Check and ensure that your entire website has the Open Graph setup properly.
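A typical set of Open Graph and Twitter Card tags looks like the sketch below; all values are placeholders, and on WordPress Yoast generates these automatically:

```html
<!-- Open Graph tags, read by Facebook and most other platforms -->
<meta property="og:title" content="Your Page Title">
<meta property="og:description" content="Short description shown on social shares.">
<meta property="og:image" content="https://www.yourwebsite.com/share-image.png">

<!-- Twitter Card tags -->
<meta name="twitter:card" content="summary_large_image">
<meta name="twitter:title" content="Your Page Title">
```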
16. Check Internal & External Links
The use of internal and external links is a strong way to build authority of your website and improve its visibility in the Search Results.
- Internal links: all links that take the user to another page of the same website.
- External links: all links that take users to some other website.
Building a strong internal link web is highly recommended; it allows Google to better understand your website and crawl the maximum possible number of pages for crawling and indexing.
With internal links, you build a strong user experience and encourage longer session durations.
When creating Internal/External Links you should consider:
- Link keywords to relevant webpages
- Don’t spam links just to build a strong link web
- Prefer adding external links to provide more information or citations
- Prefer adding rel="nofollow" to external links that have lower authority than your website
- Don’t provide too many external links
- Replace any dead or broken internal/external links
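If you prefer a free route, a few lines of Python can split a page’s links into internal and external buckets. This is a sketch: `classify_links` is our own helper, and www.yourwebsite.com is a placeholder hostname.

```python
from urllib.parse import urlparse

def classify_links(links, site_host="www.yourwebsite.com"):
    """Split a list of hrefs into (internal, external) buckets."""
    internal, external = [], []
    for href in links:
        host = urlparse(href).netloc
        # Relative URLs and same-host URLs are internal.
        if host == "" or host == site_host:
            internal.append(href)
        else:
            external.append(href)
    return internal, external

links = ["/blog/seo-tips/", "https://www.yourwebsite.com/contact/",
         "https://moz.com/learn/seo"]
internal, external = classify_links(links)
print(internal)   # -> ['/blog/seo-tips/', 'https://www.yourwebsite.com/contact/']
print(external)   # -> ['https://moz.com/learn/seo']
```

Feed it the hrefs from a crawl export and you get a quick count of internal versus external links per page.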
Ahrefs, SEMrush, cognitiveSEO, etc. are some great (but PAID) tools that can provide you with detailed information about the internal and external links on your website.
For detailed information about Internal and External linking best practices consider reading this awesome PDF provided by Neil Patel.
17. Ensure SSL is Working on Your Entire Website
Today, every website is primarily focused on delivering security to the user, and hence HTTPS has become a basic necessity for every webmaster.
Make sure your website has an SSL layer installed.
Add an SSL Certificate to load your website with https:// and redirect the entire website to the https version.
Make sure no non-https version of the site remains accessible on the internet.
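On an Apache host, a common way to force the https version site-wide is an .htaccess rewrite like the sketch below; your host or CMS may offer its own setting instead, and you should keep a backup before editing .htaccess:

```apache
# Redirect every http request to its https equivalent with a 301
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```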
Once you have worked through this checklist, you are all set to proceed with the content optimization checklist.
However, some of the technical SEO checklist relies heavily on keyword research.
What is a Keyword?
Keywords are the words and phrases that people type into search engines to find the answers to what they’re looking for.
Keyword research is the foundation of SEO. So, you must do a thorough research to find potential keywords for your website/web pages.
Keywords are used everywhere that broadly includes Page title, description, Content Heading, Content, Image Alt Tag, and many more.
Why are Keywords Important in SEO Strategy?
The keywords give a clear clue about what users are searching for and how your webpage/website is presented for those searches.
Further, keywords give you the potential to build strong surrounding content and tell Google about your user-driven content.
Here Are Some Don’ts Regarding Keywords:
- Don’t use keywords to trick or mislead.
- Don’t target keywords that are irrelevant to your content.
- Don’t use keywords awkwardly in your content.
So, you must have detailed knowledge of keyword research to support your technical SEO and open up great opportunities in the search results.
Below are some best practices to do Keyword Research.
Find Long-Tail Keywords: keywords that are longer than usual keywords. They often have lower search volume than the root keywords.
The biggest advantage of using long-tail keywords is a higher conversion rate: because they match a precise search intent, they attract visitors who are closer to converting.
Google Search autocomplete is an easy way to find long-tail keywords and use them in your marketing strategy.
There are numerous FREE tools that you can use to find potential long tail keywords. Some of them are
- Google Keyword Planner
- keywords.io
- Ubersuggest
Devote quality time in running detailed keyword research and find the right chunk of keywords for your website/webpage.
SEO is an ever-changing, ongoing process, and it is not possible to include everything important in one checklist, although we’re trying! A digital marketing specialist always uses a detailed Technical SEO Checklist before altering content or code. This detailed checklist is a must-have for completing the technical checks on your website.