Our fixation with the ideal is a reflection of our passion for perfection. When it comes to a site audit, one can easily get carried away in pursuit of a long list of green ticks and high scores.
There are many possible warnings and error messages that an SEO crawler tool, or web bot, can throw at you. A professional SEO agency is expert at telling the real problems from the insignificant ones, but for anyone conducting their own analysis, knowing what to ignore is not always simple.

Knowing whether an error message is relevant to your website is not an exact science, though there are some common warnings that aren’t always worth spending time on. Here are our top 5 SEO crawler warnings that you shouldn’t fix.

1 Noindex Site Pages

Web crawler tools love to bring up noindex issues, displayed as insights, warnings or full-on errors. If a page URL carries a noindex tag, search engines like Google won't list it in their results. That may sound very bad, but some pages don't actually need indexing. Functional elements of a site, such as a log-in/password reset page or internal search results, would wreak havoc if they all appeared within search results.
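For reference, a noindex directive typically lives in the page's HTML head as a robots meta tag, along these lines:

<meta name="robots" content="noindex" />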

A local SEO company will be able to tell whether certain pages require noindex tags. All of your main pages and high-content pages will absolutely need indexing, so be sure to check which pages the SEO crawler is flagging as noindex.

Note: noindex is not the same as nofollow. The former keeps a page out of search results, whereas the latter tells search engines not to pass ranking credit through a link. Links to untrusted sources are typically set as nofollow.
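The nofollow hint is normally added as a rel attribute on the link itself, for example:

<a href="https://www.example.com" rel="nofollow">untrusted source</a>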

2 Missing Meta Keywords

The meta keywords tag is a section of the HTML head that summarizes the content of a page or site, letting a search engine know how to categorize it.

<meta name="keywords" content="SEO crawler warnings, SEO crawler errors, SEO audit report, meta keyword error, missing meta keyword" />

Quite a number of years ago, meta keywords were commonplace, used to inform the search engine when and where that page/site should appear. Unfortunately, many people abused this element in order to maximize site traffic, exaggerating or otherwise embellishing the relevance of a page.

Meta keyword tags are practically obsolete, so don't bat an eye if one turns up on a site report; chances are, it won't. A few SEO crawler tools may still include it, or provide an option to enable this kind of warning message.

3 Meta Description Too Short or Empty

Unlike meta keyword tags, meta descriptions are still considered highly influential for a website. In fact, Google advises that every page on your site has a meta description, and that they are all unique.
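A meta description sits in the page's head and typically looks something like this:

<meta name="description" content="A short, unique summary of what this page is about." />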

If a page does not have a description, or its character length is too short, a web crawler will flag it. However, this meta element is not always as impactful as it's made out to be. Google currently seems to prioritize actual page content over the meta description when creating a search snippet.

Not all pages are designed to educate or convert visitors, especially those that have a very specific role or function. Consider the significance or weight of the page to determine whether a meta description is needed.

4 Low Word Count or Poor HTML-Text Ratio

In much the same way meta descriptions aren't crucial for less important pages of your site, low-content page warnings aren't always worth worrying over. Pages with fewer than 100 or so words can be marked as sparse, though that threshold will depend on the specific SEO crawler tool used.

A page's value is not solely based on word count, largely because not all pages are designed to impart information. A thin-content page may very well be functional, such as a log-in or contact page.

At the same time, you may get an HTML-text ratio warning, indicating that there is too little visible text relative to the underlying code. As there is no optimal ratio (advice ranges from 25% all the way up to 70%), these error messages don't always warrant attention.
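As a rough illustration, a page whose HTML source weighs in at 60 KB but contains only 6 KB of visible text has a text-to-HTML ratio of about 10%, a figure some crawler tools will flag even though the page itself may perform perfectly well.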

There are plenty of factors at play with an HTML-text ratio, so if you have trouble getting to the root of the problem, a local SEO agency can help out.

5 Sitemap Not Included in Robots.txt File

The robots.txt file tells search engine bots which parts of your site they may and may not crawl, and it's the first thing a bot will look for when entering your site. The sitemap is an XML file listing every webpage of the site, along with its respective metadata.
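A bare-bones sitemap, trimmed to a single illustrative entry, looks roughly like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.domain.com/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>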

Search engine bots don't usually have difficulty exploring smaller sites, so the sitemap-robots.txt relationship is less significant there. In any case, an XML sitemap doesn't have to appear in the robots.txt file if it has been correctly submitted through Google Search Console, or if it sits in the standard location, https://www.domain.com/sitemap.xml.
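If you would rather make the warning go away, referencing the sitemap from robots.txt is a one-line job, something along these lines (with domain.com standing in for your own domain):

User-agent: *
Disallow:
Sitemap: https://www.domain.com/sitemap.xml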

Final Thought

There are lots of SEO issues that massively impact the search ranking performance of a site, such as page speed and link structure. It's important to carefully consider which error messages to rectify, schedule for later, or downright ignore. For the self-taught web designer, however, SEO audit tools can seem to speak a different language.

If you don't know how the elements of a webpage relate to one another, it's not obvious which issues need to be fixed. This is where professional SEO companies like Shtudio can help you out. Offering an extensive range of web development and graphic design services, they can work out the SEO strategy that best suits your business requirements.