Search Engine Optimization

Monitoring the four most critical SEO errors

Website Monitoring Magazine

First of all, yes, OnPage optimization is incredibly important. The correct setting of headings and meta tags has a strong influence on the ranking in Google and Co. But in this article we want to take one step back and look at errors that prevent Google from indexing a website in the first place.

Errors in the robots.txt

Almost every website has a robots.txt. There, among other things, you can store meta information for search engines that helps them crawl the page cleanly. As helpful as this is, it can also be dangerous: mistakes here can affect the entire website. A small directive like:

User-agent: *
Disallow: /

would, for example, throw all pages out of Google. Can't happen to you? In the many years that we have been providing monitoring, we have heard that many times. Unfortunately, we have also often heard that it happened anyway. Most often, this has been the case when a staging environment was made publicly available. Of course, staging should never end up in the Google index, so the robots.txt was quickly adjusted and, to be on the safe side, committed. You already know where this is going: the staging robots.txt somehow made it to live and no one noticed.
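Such a mistake is also easy to catch automatically. Here is a minimal sketch of what such a check could look like in Python, using only the standard library; the site URL and the list of important paths are placeholders you would replace with your own:

# Minimal sketch: catch a blanket "Disallow: /" (or any rule that blocks
# important pages) in the live robots.txt. Site and paths are placeholders.
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"                  # hypothetical site
IMPORTANT_PATHS = ["/", "/products/", "/blog/"]   # pages that must stay crawlable

parser = RobotFileParser(SITE + "/robots.txt")
parser.read()  # fetches and parses the live robots.txt

blocked = [path for path in IMPORTANT_PATHS
           if not parser.can_fetch("Googlebot", SITE + path)]

if blocked:
    # In a real monitoring setup this would trigger an alert instead of a print.
    print("robots.txt blocks Googlebot from:", blocked)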

koality.io continuously checks the robots.txt for such anomalies, and thanks to the Google Lighthouse SEO checks, headers and co. are covered here as well.

Broken sitemap.xml

In the past, search engines lived up to their name: they searched the website, and they did it by crawling. If a subpage could not be reached via some path through the project, it could not be indexed. Today, search engines are supported by a sitemap.xml. In this file (or these files) you will find a list of all the URLs you have on offer, combined with the date of the last change.

For Google and Co. this file simplifies the examination of the web page immensely. Only those pages that have actually been changed have to be reindexed. This is fast, and you can assume that new content will thus find its way into the index much faster.
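For illustration, a minimal sitemap.xml with two entries could look like the following (URLs and dates are placeholders); each <url> element carries the address in <loc> and the date of the last change in <lastmod>:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2021-06-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/seo-errors</loc>
    <lastmod>2021-05-28</lastmod>
  </url>
</urlset>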

It's nice that you are reading our magazine. But it would be even nicer if you tried our service. koality.io offers extensive website monitoring especially for web projects: uptime, performance, SEO, security, content and tech.

I would like to try koality.io for free.

If the sitemap.xml, which has to follow a strict structure, is broken, the search engine has to go back to the old way. New documents, and changes to existing ones, then end up in the index much more slowly.

In koality.io you store the current XML sitemap; from then on it is checked once an hour whether it complies with the Google standard and whether it contains any content at all. This way, you are informed very promptly when Google might run into problems.
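A very rough version of such a check can be sketched in a few lines of Python; it only verifies that the file is well-formed XML, uses the sitemap namespace and actually contains URLs (the sitemap address is a placeholder), so it is far from a full validation of the standard:

# Rough sketch of a sitemap sanity check: well-formed XML, correct
# namespace, at least one <loc> entry. Not a full standard validation.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL) as response:
    tree = ET.parse(response)  # raises ParseError if the XML is broken

locs = [loc.text for loc in tree.getroot().findall(".//sm:loc", NS)]

if not locs:
    print("Sitemap is empty or uses an unexpected structure")
else:
    print(f"Sitemap contains {len(locs)} URLs")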

Not optimized for mobile

Most internet traffic is generated by mobile devices. Google knows this too and indexes the mobile version of a website, not the one you see on your PC or laptop. That makes sense. Unfortunately, we agencies and developers build pages on our MacBooks, Linux and Windows computers, so it can happen that a release is not fully optimized for mobile in every detail. But of course that should not happen.

Fortunately, Google offers a service with which you can check your website for "mobile friendliness". You simply enter the desired URL and see the page through Google's eyes, and you get the first hints if necessary: fonts that are too small, or buttons that are barely clickable with a normal finger on a smartphone.
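Google also exposes this test as an API (the URL Testing Tools API of the Search Console). The sketch below shows how one could call it from Python, assuming you have a Google API key; the endpoint and response fields are as documented at the time of writing and may change, and the key and tested URL are placeholders:

# Sketch (not production code): calling Google's Mobile-Friendly Test API.
# API_KEY and the tested URL are placeholders.
import json
import urllib.request

API_KEY = "YOUR_GOOGLE_API_KEY"  # placeholder
ENDPOINT = ("https://searchconsole.googleapis.com/v1/"
            "urlTestingTools/mobileFriendlyTest:run?key=" + API_KEY)

payload = json.dumps({"url": "https://www.example.com/"}).encode("utf-8")
request = urllib.request.Request(
    ENDPOINT, data=payload, headers={"Content-Type": "application/json"})

with urllib.request.urlopen(request) as response:
    result = json.load(response)

# e.g. "MOBILE_FRIENDLY" or "NOT_MOBILE_FRIENDLY" plus a list of concrete issues
print(result.get("mobileFriendliness"))
print(result.get("mobileFriendlyIssues", []))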

Exactly this service is also used by koality.io to check the stored website automatically and continuously. So if a release ever violates the mobile guidelines, the website monitoring tool will let you know.

Too much JavaScript

Yep, Google can handle JavaScript by now. At least sometimes. So in the near future we will be able to build our marketing websites with our beloved JavaScript frameworks like Vue, React or Angular. Unfortunately, it's not quite there yet. The reason is obvious: when we request a web page with curl, we get our response after 100-500 ms. That's fast. If the page is first run through a browser, where all JavaScript (internal and external) is executed and the images are loaded, it takes more like 5-7 seconds. That's slow. That is roughly a factor of 50 between the two durations. So rendering everything once in a while is feasible, but switching completely to JavaScript might not be such a good idea.

If you still want to build the page with JavaScript (like we did with this one), you should make sure you turn on SSR (server-side rendering). Then you should be on the safe side.
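Whether SSR is actually doing its job can be verified with a very simple check: request the page without a browser and make sure the content you care about is already in the raw HTML. A hypothetical sketch, where the URL and the expected text snippet are placeholders:

# Sketch: verify that server-side rendering delivers the important content
# without JavaScript. The URL and the expected text snippet are placeholders.
import urllib.request

URL = "https://www.example.com/"
EXPECTED_SNIPPET = "Our product"   # text that must be visible to crawlers

request = urllib.request.Request(URL, headers={"User-Agent": "seo-check/1.0"})
with urllib.request.urlopen(request) as response:
    raw_html = response.read().decode("utf-8", errors="replace")

if EXPECTED_SNIPPET in raw_html:
    print("Content is present in the server-rendered HTML")
else:
    print("Warning: content only appears after JavaScript rendering")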

If you still rely heavily on JavaScript, koality.io can help. We scan all important page types once an hour for JavaScript errors. This involves loading the page in a Chrome browser, waiting until all JS has been executed, and then checking for errors.
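The principle behind such a check can be sketched with a headless Chrome driven by Playwright; this is purely an illustration, not the tooling koality.io uses internally, and the URL is a placeholder:

# Illustration: collect JavaScript errors with a headless Chrome via Playwright.
# Requires `pip install playwright` and `playwright install chromium`.
from playwright.sync_api import sync_playwright

URL = "https://www.example.com/"  # placeholder
errors = []

with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    # Uncaught exceptions thrown by the page's JavaScript
    page.on("pageerror", lambda exc: errors.append(str(exc)))
    # Everything the page logs via console.error
    page.on("console",
            lambda msg: errors.append(msg.text) if msg.type == "error" else None)
    page.goto(URL, wait_until="networkidle")  # wait until the JS has settled
    browser.close()

print(f"{len(errors)} JavaScript error(s) found on {URL}")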

If you can think of any critical SEO errors that koality.io should definitely detect, feel free to send them to us.