Getting ready to audit your site’s SEO? Congratulations! You’re about to embark on improving your overall SEO footprint while uncovering every potential wart your website may have.
The bad news is every site will have things wrong with it – yours included. The good news is that each new issue you find represents an opportunity to improve. As an SEO, the point of auditing your site is to find areas where you can squeeze out more juice.
In my previous article, SEO for Beginners, I outlined how things occurring both on the page and off it can impact your site’s SEO performance. Technical and content optimization are foundational to a good SEO strategy, to the point of being almost a prerequisite.
Below is our comprehensive on-page SEO checklist, which contains practical advice on both technical and content optimization to help you get more out of your next website audit.
Table of Contents:
- Set Up Technical Site Monitoring and Gather Auditing Toolset
- Find and Eliminate Search Engine Crawl Barriers
- Ensure Proper HTTP Status Codes Are Served
- Optimize Your On Page Content’s Setup
- Eliminate and Consolidate Duplicate Content
- Optimize All Digital Media Including Images and Video
- Optimize for Alternative Language/Geo-Targeting
- Have a Mobile Page Setup Strategy
- Improve Site Speed to 3 Seconds or Less
- Ensure Site Security and Protect User Information
1. Set Up Technical Site Monitoring and Gather Auditing Toolset
First things first: before you begin auditing, you’ll want to ensure that your site is wired up with a few critical tracking tools that will assist you with monitoring and diagnosing the technical components of your site long term.
These include:
These tools will require some setup and verification, but they are all fairly easy. I recommend that you set up and verify both desktop and mobile (if you’re using a mobile-separate site) versions of your site, as well as all URL-based variations (http, https, www, non-www). I also recommend that you sync Google Search Console with Google Analytics.
Additionally, you may want to set up and/or familiarize yourself with the following sites/tools which help with the auditing process and can assist with automation – especially if you have a large website.
These tools will get you most of the way, though there will likely be other tools that come into play or that you might use to accomplish some of the auditing tasks that are part of this checklist.
2. Find and Eliminate Search Engine Crawl Barriers
Crawl barriers are things occurring on your site that may be prohibiting Google and other search engines from crawling and indexing your site’s content.
Elimination of crawl barriers on your site is the most basic thing you can do for SEO, and an area where many sites have issues (whether they know it or not).
No matter how great your site’s content is, if you’re directly or indirectly doing something that prohibits search engines from crawling and indexing it, then the quality of your content doesn’t matter until crawl barriers are fixed.
Most common/impactful issues:
Robots.txt file disallow
- Don’t block directories or pages via your site’s /robots.txt file unless you mean to. Disallow directives tell Google which areas of the site it is not allowed to crawl.
- If you’re blocking something important, you may have to remove the disallow from your file in order for the content to be crawled again.
- A great way to test this is with Google’s Robots.txt tester within Google Search Console.
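To make this concrete, here is a hypothetical robots.txt (the domain and paths are placeholders) showing how a single Disallow line can wall off an entire section:

```text
# Hypothetical robots.txt for somesite.com
User-agent: *
# Reasonable: keep a private area out of the crawl
Disallow: /admin/
# Dangerous: this would block an entire revenue-driving section
Disallow: /products/
# Point crawlers at your XML sitemap
Sitemap: https://somesite.com/sitemap.xml
```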
Meta Noindex, Nofollow tag
- Be careful when using a Meta Noindex, Nofollow tag within a page’s HTML head section, especially on pages that are supposed to be indexed. The noindex, nofollow tag tells Google not to index the URL or follow any of its links.
- If you’re using this anywhere, you may need to remove this tag in order for your site’s content to be indexed.
- You can check this with Screaming Frog and/or Deep Crawl.
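For reference, the tag in question looks like this – a minimal sketch of what you’d find in the head of a page that is (intentionally or not) excluded from the index:

```html
<!-- In the <head>: tells search engines not to index this page or follow its links.
     Remove this from any page that is supposed to rank. -->
<meta name="robots" content="noindex, nofollow">
```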
Internal linking / breadcrumbs
- Make sure you are linking to all pages that you want indexed either via navigation, contextual internal links, or breadcrumbs. Search engines crawl links on your site to find information, so if you’re not linking to something, they can’t find it.
- If key pages aren’t linked, you may need to reconsider your site’s architecture and/or look for opportunities to link to them contextually. The most important pages on your site should be no more than a few clicks away from the homepage.
- You can check this with Screaming Frog and/or Deep Crawl.
Rel=Nofollow tags
- Make sure your site’s important links do not contain a rel=nofollow attribute. This attribute tells search engines not to follow the link – essentially not to crawl and/or attribute any value to or from the link.
- If you’re using rel=nofollow tags on links, you may be limiting crawling or the passing of link value through your site and may want to consider removing the attribute from the site’s links.
- You can check this with Screaming Frog and/or Deep Crawl.
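As an illustration (the URL and anchor text are made up), this is what a nofollowed internal link looks like versus a normal one:

```html
<!-- Nofollowed: search engines are asked not to follow this link or pass value through it -->
<a href="/our-products/" rel="nofollow">Our Products</a>

<!-- Normal: crawlable and passes link value -->
<a href="/our-products/">Our Products</a>
```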
XML Sitemaps
- Every site should have an XML sitemap
- Every site’s XML sitemap should be submitted to search engines (can be done in Google Search Console and Bing Webmaster Tools)
- All XML sitemaps should be linked to from a site’s robots.txt file
- XML sitemaps should contain all valid URLs on the site (e.g. those that don’t return an error)
- Each XML sitemap should contain fewer than 50,000 URLs, and should be smaller than 50 MB in file size
- Especially for larger sites, XML sitemaps should be split into multiple, logically organized sitemaps via a Sitemap Index file to get around the URL and file-size constraints
- Here is a good guide from Google on XML sitemap best practice specifications
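For illustration, here is a minimal XML sitemap sketch (the somesite.com URLs are placeholders) following the format from Google’s specification:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://somesite.com/</loc>
    <lastmod>2017-09-08</lastmod>
  </url>
  <url>
    <loc>https://somesite.com/on-page-seo</loc>
  </url>
</urlset>
```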
JavaScript, AJAX, iFrames, or Other Accessibility Issues
- If you’re using JavaScript or other types of code configurations, be sure that they aren’t obscuring navigation and links from being crawled
- Here is a really good article on Javascript and SEO.
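To illustrate the kind of thing to look for (a contrived example), a “link” that only exists in JavaScript gives crawlers nothing to follow, while a real anchor does:

```html
<!-- Hard for crawlers: the destination URL only exists inside a script handler -->
<span onclick="window.location='/category/widgets/'">Widgets</span>

<!-- Crawlable: a real anchor tag with an href -->
<a href="/category/widgets/">Widgets</a>
```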
Here are some other potential issues:
- Excessively long URLs: While Google says they can crawl URLs over 1,000 characters, they don’t recommend it as a good practice.
- Malformed URLs: Google says to avoid using URLs with special characters such as commas, semicolons, colons, spaces, quotes, etc.
- Non-ASCII URLs: Google also has trouble indexing URLs that contain unescaped non-ASCII characters.
- Blocked resources: Are you blocking your site’s CSS and JS files from being crawled? If so, here’s why you shouldn’t. You can see this in Google Search Console.
- Broken links or web crawl errors at key access points: Does your site have any broken links within key access points of navigation such as main or secondary navigation? You can check these in Screaming Frog, Deep Crawl, or in Google Search Console’s Web Crawl Errors report.
- Pages restricted by cookies, logins, or forms: Is any key content hidden behind cookies, logins, or forms?
- URL parameter handling in Google Search Console: Did you accidentally set the wrong setting for your parameters in Google Search Console?
- Page rendering: Is Google having trouble rendering your site’s content? You can check this via their Fetch and Render tool within Google Search Console.
Any one of these issues can have an impact on your site’s ability to get crawled and indexed, so it’s very important to keep tabs on these types of things on an ongoing basis. Getting one of these things wrong – whether you mean to or not – can cost your company money. I’ve seen it happen more than I’d like to admit.
I recommend that you monitor these items closely, and suggest using a tool such as SEO Radar to automate the monitoring of changes made to pages on your site.
Here is how you check your site’s indexation in real time:
- Google Search Console’s Index Status report
- A site: operator search (e.g. site:somesite.com)
Note: There have been many discussions about why there are differences between the number of indexed pages showing in Google Search Console versus a simple site: search. Per Google, “Sometimes the data we show in Index Status is not fully reflected in Google Search results. In some cases, Google may apply filters while building search results, and these filters can affect which results are shown in search.” It’s best to consider the information in Google Search Console as the most accurate source of data, and the data available via site: search as anecdotal.
3. Ensure Proper HTTP Status Codes Are Served
Outside of eliminating any crawl barriers, ensuring that you are serving the correct status codes to search engines on a page-by-page, resource-by-resource basis is very important for crawling and indexing.
200 “Okay” Status
All live assets including HTML pages, CSS & JS files, images, etc. should serve a 200 “Okay” status to search engines. This is essentially telling search engines that the URL/file in question is live and valid.
301 and 302 Redirects
There are two primary types of redirects in play – the 301 and 302 redirect.
A 301 “Permanent” redirect tells a search engine that the original URL has been permanently moved to a new location and will not be coming back.
A 302 “Temporary” redirect tells a search engine that the original URL has been temporarily moved to a new location, but that the original URL will be back at some point in the future.
Note: It has been theorized for a long time that there is a difference in how authority is passed from old pages to new pages using a 301 versus a 302, and it is still a debate in the SEO industry. Google now says it does not matter which redirect type you use.
Some redirect guidelines to abide by as it pertains to SEO:
- Always use a 301-redirect when a page is going away permanently
- Use a 302-redirect only when the original page is coming back
- Avoid redirect chains (e.g. page A > page B > Page C), which are a series of redirects with several “hops” before reaching the final destination
- Avoid redirect loops (e.g. page A > page B > page A), which are a series of redirects that loop infinitely
- Avoid using a Meta Refresh, and instead use proper 301 redirects
- Where possible, to increase crawl efficiency, update old URLs to point to the new location when the original URL has been redirected
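As a sketch of how this might look on an Apache server via .htaccess (the paths and domain are placeholders – your server or CMS may handle redirects differently):

```apacheconf
# Permanent, single-hop redirect from an old URL to its new home (mod_alias)
Redirect 301 /old-page /new-page

# Force one preferred hostname so all variations resolve in a single 301 (mod_rewrite)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^somesite\.com$ [NC]
RewriteRule ^(.*)$ https://www.somesite.com/$1 [R=301,L]
```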
404 “Not Found” Status
All pages or files that are no longer live should serve a 404 “Not Found” status code. 404 errors, as they are often called, occur when a page that was once live is deleted or otherwise goes away without being redirected to a new location.
You can see your site’s 404 errors either by running a site crawl in Screaming Frog or Deep Crawl, or by looking at your Crawl Errors report in Google Search Console.
Google has long said that 404 errors don’t hurt your site, and while this is true, they could be indicative of technical problems, process problems in terms of page retirement, and may lead to a poor user experience if not properly configured. Not only that, but if a high-value page gets deleted, failure to redirect it to an appropriate counterpart means that any historical value will not be passed along – thus affecting the new page’s performance.
Here are some things to watch out for with 404s:
- Always ensure that a missing page serves a proper 404 response code rather than another status such as 200 (a so-called Soft 404).
- Always serve errors using a Custom 404 Error Page that lets users know what has happened to the page they were looking for and keeps them within the site experience (see the sketch after this list).
- When a page is going to be deleted, if there is a close or exact-match counterpart, implement a 301-redirect from the old page to the new.
- To increase crawl efficiency and improve user experience, find any broken links within the site as a result of 404 errors and update them to go to an appropriate live URL.
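On Apache, pointing the server at a custom error page is a one-liner (the file path is a placeholder); just make sure the custom page itself still returns a 404 status and not a 200:

```apacheconf
# Serve a custom, branded error page for missing URLs
ErrorDocument 404 /custom-404.html
```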
500-Level Status Codes
500-level response codes indicate that there are issues occurring at the server level. If your site is returning these response codes, it could be an indication that something is malfunctioning with your server.
These types of errors often occur when a site goes down temporarily or when certain components become temporarily unavailable. You can see if any of these errors have occurred in Google’s Crawl Errors report.
While often this type of situation is unplanned, Google does recommend serving a specific 500-level response code when your site is undergoing planned downtime or maintenance.
They recommend using the 503 “Service Unavailable” response to avoid any negative issues caused if a crawler were to attempt to reach a page while it’s down for maintenance. This status code tells search engines that the service disruption is temporary and that the requested page should be available when they return.
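The response during planned maintenance might look like the sketch below; the optional Retry-After header (in seconds, or a date) hints at when crawlers should come back:

```text
HTTP/1.1 503 Service Unavailable
Retry-After: 3600
```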
While there are certainly other important HTTP status codes to familiarize yourself with, these are the ones that tend to stand at the forefront for most SEOs and developers.
4. Optimize Your On Page Content’s Setup
Thus far, we’ve talked about what you can do to ensure that your site’s content is monitored, crawled appropriately, and sending the right signals to Google.
Now comes the setup of your actual pages themselves, and how to make them as SEO-friendly as possible from a pure structure standpoint.
Here are some guidelines to follow and evaluate on a page-by-page basis.
URL Structure
- Your URLs should be all lowercase; while uppercase is allowed, URLs are case sensitive, and mixed case can cause issues with direct traffic
- Use the page’s target keyword in the URL (e.g. somesite.com/on-page-seo)
- Separate keywords with dashes rather than underscores or spaces
- Keep your URLs short and sweet; excessively long URLs may look spammy to users and search engines (e.g. somesite.com/seo-is-literally-the-best-thing-in-the-history-of-ever-in-2017-you-get-the-picture)
- Limit use of parameters or otherwise ugly URLs (e.g. somesite.com/page=1234&color=blue)
- Limit use of unnecessary directories (e.g. somesite.com/articles/english/2017/09/08/seo-is-the-best)
- Don’t use unescaped non-ASCII characters
Page Title Tags
Your page’s title tag is the most important on-page SEO factor. Why? Because each page’s title – visible in your browser or in the code – is typically what shows up in the Search Engine Results page as the clickable result snippet.
- Avoid missing title tags; ensure that all pages on your site have valid titles
- Make sure all pages have a title tag value that is 100% unique
- Title tags should be descriptive and concise
- Always include each page’s individual target keyword(s) in the page title, and look for opportunities to add keyword modifiers, but do not keyword stuff
- Put your target keyword(s) close to – if not at – the beginning of the title tag
- Use your brand name in your title tag, typically at the end
- Avoid title tag values that are too long — 60 characters is the approximate point of truncation. Moz has a great title tag length preview tool.
- Avoid title tag values that are too short (under 30-40 characters), as the space is valuable and you want to make use of it
- Avoid using values in the title tag that are non-informative
- Avoid repeated or boilerplate titles
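Putting those guidelines together, a title tag might look something like this (the keyword and brand are invented for illustration):

```html
<!-- Target keyword up front, brand at the end, under ~60 characters -->
<title>On-Page SEO Checklist for Your Next Audit | SomeSite</title>
```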
Meta Description Tags
Your page’s Meta Description tag, while not a ranking factor, is still very important to optimize. Why? Because search engines sometimes use it as the description in search results, meaning that a well-written Meta Description may actually have an impact on user click-through rates from SERPs.
- Avoid missing Meta Descriptions; ensure that all pages on your site have valid Meta Description tags
- Make sure all pages have Meta Description content that is 100% unique
- Meta Descriptions should be descriptive, and can be 1-3 sentences.
- Always include each page’s individual target keyword(s), and look for opportunities to add keyword modifiers, but do not keyword stuff
- Avoid Meta Description values that are too long — 160 characters is the approximate point of truncation for most pages – though you can go longer.
- Avoid Meta Descriptions that are too short (under 100 characters), as the space is valuable and you want to make use of it
- Avoid repeated or boilerplate Meta Descriptions
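A sketch of what that looks like in practice (the copy is invented for illustration):

```html
<!-- Unique, descriptive, roughly 100-160 characters -->
<meta name="description" content="Use this on-page SEO checklist to audit crawl barriers, status codes, content setup, duplicate content, and more on your site.">
```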
Heading Tags
There are up to six heading tags that can be used on webpages (H1, H2, H3, H4, H5, H6). The most important are H1s for SEO purposes.
Heading tags typically denote the most important pieces of content or areas on the page, or breaks in content where there are topic or conversational transitions.
- All pages should have a single H1
- Avoid having multiple H1s within any single page
- Typically, the page’s primary headline (ex: article title) should be an H1
- The page’s H1 should include the page’s target keyword(s), including any modifiers (where possible)
- H2s through H6s can be used multiple times per page, and should be used to establish breaks within content and page structure or to denote key structural areas of the page
- It is acceptable to use target keyword(s) within H2-H6 tags so long as it’s not spammy or forced
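Here is a simplified sketch of a well-structured heading outline (the headings are invented; indentation is only to show hierarchy):

```html
<h1>On-Page SEO Checklist</h1>
  <h2>Find and Eliminate Crawl Barriers</h2>
    <h3>Robots.txt Disallows</h3>
    <h3>Meta Noindex Tags</h3>
  <h2>Ensure Proper HTTP Status Codes</h2>
```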
On-Page Content
Aside from the other areas of optimization, the major factor most often overlooked – even by the analysis capabilities of most major tools – is a site’s actual on-page content. This includes everything from headings to images to links, but most specifically refers to good old-fashioned body copy.
- Longer pages (1,900-2,000 words) tend to correlate with better rankings, according to a recent rank correlation study by Brian Dean of Backlinko
- However, you shouldn’t create long pages just for the sake of creating long pages. For any keyword that you want to rank for, always do your homework: look at the average length of content in the top 10 or 20 results and shoot for that. Whether it’s 500 words or 3,000, aim to be at least as good/long as what is already performing well, and then sprinkle your brand of uniqueness on it
- Always create engaging and interesting content that is written for users first and search engines second
- If a page’s target keyword(s) and modifiers are not already mentioned after the page’s first draft is written, take some time to work them into the copy naturally
- Also, consider using Latent Semantic keywords within your page’s copy
- Don’t be afraid to use images, videos, and/or other graphics within the content to enhance the experience
Internal & External Linking
As mentioned in our SEO for Beginners guide, linking is absolutely critical for search engines in terms of how they crawl the web. Search engines use links to find content, and also use links as a signal to the relative importance and popularity of pages.
It is very important to ensure that your site and its pages are well-linked.
- All pages should have a minimum of one link pointing to them for discoverability purposes – you don’t want any orphaned pages (e.g. zero links pointing to them)
- The most important pages on your site should be no more than a few clicks away from the homepage
- Your primary and secondary navigation should be logical and well-organized, and should link users to key pages directly – especially if you have a large site with lots of categories, sections, topics, etc.
- You should also include plenty of contextual links within your page’s body content or within individual articles
- Your most important content should be linked to frequently
- Don’t be afraid to link internally as many as 2-3 times per article
- Links should include contextual, descriptive, and keyword-rich anchor text as opposed to “click here” (see the sketch after this list)
- Don’t be afraid to link to external resources as they can lend credibility or context to your content as well as present great link building opportunities
- Be careful not to violate Google’s webmaster quality guidelines on link schemes
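To illustrate the anchor text point from above (the URLs and copy are made up):

```html
<!-- Good: contextual, descriptive, keyword-rich anchor text -->
Read our <a href="/on-page-seo">on-page SEO checklist</a> before your next audit.

<!-- Weak: the anchor tells search engines nothing about the destination -->
For our checklist, <a href="/on-page-seo">click here</a>.
```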
Structured Data
What is structured data? Structured data is a standardized format for providing information about a page and classifying the page content. Google and other search engines work hard to understand the content of a page, but you can provide explicit clues about the meaning of a page by including structured data on the page.
Search engines use structured data that they find on the web to understand the content of the page, as well as to gather information about the web and the world in general. Additionally, search engines use structured data to enable special search result features and enhancements. For example, a recipe page with valid structured data is eligible to appear in a graphical search result.
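As a minimal sketch, structured data is most commonly added as JSON-LD in the page’s head; the values below (headline, author name, date) are placeholders for illustration:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "On-Page SEO Checklist",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2017-09-08"
}
</script>
```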
Social Media Buttons & Tagging
While social signals aren’t directly part of Google’s ranking algorithm, making it easy for visitors to share your content can lead to more overall visibility to your site’s content which can indirectly lead to things that do impact rank – like links.
According to a study by BrightEdge, using social sharing buttons leads to 7x more mentions. Unfortunately, both social sharing buttons and social markup are often overlooked when it comes to their impact on SEO.
Unsurprisingly, another key data point from the BrightEdge study (via Hubspot) showed that the homepages of almost half of the top 10,000 websites studied (46.4%) had no social links or plugins installed.
- Make sure all pages have social sharing buttons that are prominently placed and easy to use – especially blog or article pages
- To control how information is displayed when being shared on social media channels, make sure all pages are using valid Open Graph markup, and ensure that all required elements are configured properly. At minimum, include og:title, og:type, og:image, and og:url, but look to use non-required features as well
- Implement Twitter Cards, and if possible I’d recommend using the ‘Summary Card with Large Image’ as it has the richest display
- Include easy to find links to all social profiles within the site
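Pulling those tagging recommendations together, a minimal sketch of the head markup might look like this (the URLs and titles are placeholders):

```html
<!-- Open Graph basics -->
<meta property="og:title" content="On-Page SEO Checklist" />
<meta property="og:type" content="article" />
<meta property="og:image" content="https://somesite.com/images/checklist.jpg" />
<meta property="og:url" content="https://somesite.com/on-page-seo" />

<!-- Twitter Card with the richest display -->
<meta name="twitter:card" content="summary_large_image" />
```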
HTML Sitemaps
While not absolutely critical for SEO success, I’ve found that good HTML sitemaps can be useful for enhancing crawlability, distributing equity across key site pages, and getting a piece of content linked and crawled when you’re having trouble getting a page linked up quickly (or getting pushback on link location – not uncommon).
- HTML sitemaps should be accessible from all pages – typically via a site-wide footer link
- Should be logically organized by site section
- Should include links to key top-level sections
- Should NOT include links to every page on site (save that for the XML sitemap), unless the site is very small
- Do not include broken links or links to pages that redirect
- Can be used – if necessary – to get a link crawled quickly in the event that there are backups in development or push-back on primary recommendations
5. Eliminate and Consolidate Duplicate Content
Duplicate content refers to URLs with substantial blocks of content that either completely match other content or are appreciably similar. Duplicate content can exist both within your own site and across external sites.
While there is no duplicate content penalty according to Google, having duplicate content across your own site or other sites can impact your SEO performance in a few ways and should be fixed if it can be.
Duplicate content can:
- Potentially water down the performance of a given URL by splitting the page’s value up across multiple URL’s rather than a single page. Hypothetically, if page A has 50 links, and a duplicate page B has 50 links, they may not perform as well individually as a single page with 100 inbound links.
- Duplicate content can also impact Google’s crawl budget – the number of URLs Googlebot can and wants to crawl – for your site. This is primarily an issue for larger sites as opposed to small ones. Per Google, “Wasting server resources on pages like these will drain crawl activity from pages that do actually have value, which may cause a significant delay in discovering great content on a site.”
Here’s how to fix duplicate content across your site:
- Ensure that all versions of any given URL on your site resolve via 301-redirect to a single location as opposed to multiple (e.g. http, https, www, and non-www versions of a URL should all redirect to a single preferred location). Typically this occurs at the server level.
- Set a preferred version of your domain in Google Search Console
- Make sure that all pages have canonical tags, and that canonical tags point to the preferred absolute URL of a given page (see the sketch after this list). Most canonical tags will be self-referencing. However, if another page on your site is duplicative, either change it to ensure it’s 100% unique or canonicalize it to the original version of the page.
- Avoid duplicate Page Titles across the site
- Avoid duplicate Meta Descriptions across the site
- Avoid duplicate H1 tags across the site (where possible)
- Avoid duplicate or boilerplate body copy across the site (where possible)
- Avoid using query strings, parameters, or session IDs in URLs as much as possible. If you use them, be sure to manage them appropriately in Google Search Console as well as canonicalize correctly.
- Indicate paginated content appropriately by using both rel=next and rel=prev tags as well as canonicals.
- Try to avoid linking internally to a non-preferred version of a URL
- Limit indexation of search results pages on your site as they can be very similar to other pages (ex: category pages), and are often low-quality and infinite in nature. In fact, Google even tries to remove search results pages algorithmically.
- If you have a mobile-separate site, be sure to use mobile rel=alternate tags to indicate desktop pages that have a mobile counterpart.
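Here is the sketch referenced above, showing a self-referencing canonical plus the pagination tags as they would appear on page 2 of a series (the URLs are placeholders):

```html
<!-- Self-referencing canonical pointing to the preferred absolute URL -->
<link rel="canonical" href="https://somesite.com/on-page-seo" />

<!-- Pagination hints on page 2 of a paginated series -->
<link rel="prev" href="https://somesite.com/articles?page=1" />
<link rel="next" href="https://somesite.com/articles?page=3" />
```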
See also: Google’s Guidelines on Duplicate Content
Note: There are some situations where duplicate content is unavoidable, and the above fixes are critical in those situations to help search engines better understand and associate duplicate content together.
6. Optimize All Digital Media Including Images and Video
Aside from optimizing your site from a crawlability and page structure standpoint, it’s also important to optimize your digital media including images, videos, etc.
The reason optimizing your digital media is important is three-fold:
- Google and other search engines are not yet able to look at an image or video and understand the context without some sort of contextual description from a webmaster
- Images and videos – when optimized – can add valuable context to the surrounding content
- Images and video can enhance user experience and engagement
Optimizing your site’s images and videos is simple:
- Make sure images and videos use descriptive and accurate file names (e.g. /red-flower.jpg is better than /009.jpg)
- Ensure that file-naming follows the same conventions as URL-based best practices, as the file-name will be part of the URL when published
- Ensure that all images have alt attributes that are descriptive, accurate, and generally helpful from an accessibility perspective
- Don’t embed important text within images
- Try to add captions for images and videos within your webpage
- Crawl your site for broken images (e.g. 404 errors) and fix them
- Add structured data for your site’s images and videos. This is even more important now that Google is adding badges to image search results powered by structured data
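Putting a few of those items together, an optimized image embed might look like this sketch (the file name and copy are invented):

```html
<figure>
  <!-- Descriptive file name plus accurate alt text -->
  <img src="/images/red-flower.jpg" alt="A single red tulip in bloom">
  <figcaption>Red tulips bloom in early spring.</figcaption>
</figure>
```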
7. Optimize for Alternative Language/Geo-Targeting
If your site offers content to users across multiple languages or geographical regions, it is important to specify which version of your content Google should serve.
How to optimize for language or geo-region:
- In addition to your primary site, consider housing a version of your site on a Country Code Top-Level Domain (also called a ‘ccTLD’). ccTLDs represent specific geographic locations. For example: .mx represents Mexico and .eu represents the European Union.
- Use rel=alternate hreflang=x tags in order to indicate when a page on your site has an alternative-language or alternative-region counterpart (see the sketch after this list). This will allow Google to serve the correct version of a page on a URL-by-URL basis.
- Use Google Search Console’s International Targeting settings to help determine which countries a URL should be targeting
- If necessary, host country-specific versions of your site within the regions they service
- Link to all website variants from each language or geo-version of your site, typically in footer or header
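For illustration, hreflang annotations for a hypothetical page with US-English and Mexico-Spanish variants might look like this (each variant should carry the full set of tags):

```html
<link rel="alternate" hreflang="en-us" href="https://somesite.com/page" />
<link rel="alternate" hreflang="es-mx" href="https://somesite.com/mx/page" />
<!-- Fallback for users whose language/region isn't explicitly targeted -->
<link rel="alternate" hreflang="x-default" href="https://somesite.com/page" />
```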
Using these methods will help Google and other search engines serve the right version of your site/brand based on individual user language and/or geo-location.
8. Have a Mobile Page Setup Strategy
In this day and age, having a mobile strategy is not a nice-to-have, it’s table stakes.
According to independent web analytics company StatCounter, internet usage by mobile and tablet devices exceeded desktop worldwide for the first time in October 2016.
The flip side is that many companies still don’t have a well-defined mobile strategy. From an SEO’s perspective, mobile SEO is even more important going forward, as Google will be launching a mobile-first version of its index in 2018.
Here’s how to optimize for mobile at the most basic-level:
- Important! Ensure that your site has one of three mobile design configurations: Responsive Design, Dynamic Serving, or Mobile-Separate
- Use Google Search Console’s mobile usability reporting to determine if your site’s pages have any mobile usability issues such as content wider than the screen, clickable elements too close together, text that is too small to read, and more.
- Take page speed very seriously, especially as it relates to mobile
- Use Google’s mobile-friendly testing tool to determine if your webpages are mobile-friendly and/or if they have room to improve
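As a sketch of what two of those configurations involve (the URLs are placeholders): responsive design hinges on the viewport meta tag, while a mobile-separate setup pairs a rel=alternate tag on the desktop page with a canonical on the mobile page:

```html
<!-- Responsive design: the viewport meta tag in the <head> -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<!-- Mobile-separate: the desktop page points to its mobile counterpart... -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.somesite.com/page" />
<!-- ...and the mobile page canonicalizes back to the desktop version -->
<link rel="canonical" href="https://www.somesite.com/page" />
```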
“Nice-to-Have” mobile strategies
If you’re looking for more advanced or “nice to have” strategies, consider the following:
9. Improve Site Speed to 3 Seconds or Less
In my experience, site speed is one of the most overlooked aspects of both SEO and general strategy. Not only that, but almost every site I’ve ever worked with gets it so, so wrong.
Google has repeatedly stated that most users expect webpages to load in 3 seconds or less, has produced many studies showing the performance implications of poor page speed, and has even created a helpful tool to estimate how much traffic your site is losing due to poor page speed. Additionally, site speed has been a confirmed minor Google ranking signal since 2010 and will be a ranking signal in their mobile-first index in 2018.
I’ve personally argued many times that site speed represents one of the biggest growth opportunities for most websites, and not just for the SEO channel. However, it often doesn’t get prioritized because the perception of value is not communicated correctly, tangible ROI can’t be easily forecasted ahead of time, and efforts to improve speed can be complex and time-consuming.
Best site speed tools
Here are the best tools for measuring your site speed on a page-by-page basis:
Most common site speed issues
While there can be many complexities, the most common issues affecting how quickly an individual page loads across desktop and mobile devices are:
While there are certainly many other components that impact page speed, focusing on and fixing these areas should significantly improve your site’s load times.
10. Ensure Site Security and Protect User Information
Google has been pushing for increased site security for several years, with their initiative really coming to the forefront in their 2014 “HTTPS Everywhere” presentation.
Ensuring good site security is primarily achieved by encrypting information via HTTPS (Hypertext Transfer Protocol Secure), an internet communication protocol that protects the integrity and confidentiality of data between the user’s computer and the site. Users expect a secure and private online experience when using a website.
From a purely SEO standpoint, HTTPS has been a minor ranking signal since 2014, and I’d recommend transitioning your site to HTTPS as soon as you reasonably can (although I wouldn’t do it for SEO gains alone). It’s worth noting that there was a lively debate in the SEO community recently over the merits of switching to HTTPS versus not – an interesting read for sure, with lots of heated opinions.
Best practices for implementing HTTPS on your site:
- Buy a security certificate from your hosting provider (minimum of 2048-bit key)
- Ensure that you enforce an HTTPS URL preference via 301 server-side redirects across all secure pages of your site
- Make sure you update all internal links on your site to point to HTTPS rather than HTTP
- Make sure you’re not blocking the HTTPS version of your site from being indexed either via the robots.txt file or noindex tags
- Consider enabling HTTP Strict Transport Security (HSTS), which tells the browser to request HTTPS pages automatically, even if the user enters HTTP in the browser location bar (see the sketch after this list)
- Watch out for insecure content on your webpages (e.g. any resource served over an HTTP URL on an HTTPS page), as Google is now being more explicit in warning users when there is mixed content on the pages they’re browsing – which could lead to trust issues with your site
- Set up the HTTPS version of your site in Google Search Console
- Keep an eye out for security issues being reported via Google Search Console including malware, deceptive pages, harmful downloads, and uncommon downloads
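The HSTS response header mentioned above looks like the sketch below (max-age is in seconds; one year shown – consider starting with a shorter value while testing):

```text
Strict-Transport-Security: max-age=31536000; includeSubDomains
```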
See also: Google’s guide on securing your site with HTTPS
Conclusion
With so many things going into good on-page SEO, it can be tempting to deprioritize or forget key areas that may have major implications for your SEO performance.
The easy part is finding things that are wrong; the hard part is actually getting fixes implemented. Once you’ve passed the first step and found the issues, this on-page SEO checklist tells you what to fix and how – now you just need to translate it into actionable recommendations that your team can implement on a page-by-page basis.
But if you consider that every item in this on-page SEO checklist is like a piece of a puzzle, with some pieces being bigger than others, the hope is that as you start putting the pieces together during and after your website audit, you’ll begin to experience substantial increases in rankings, visits, and conversions over time.
Now go forth and optimize!