Dealing with flickering checks
Website Monitoring Magazine
The Internet is complex. In the few (milli)seconds it takes to load a website, a lot of things happen, and every one of them can fail. So it is not unusual that a page is occasionally unreachable, delivered slowly, or missing individual images. Usually everything works again once the browser tab is reloaded.
For monitoring software this is a challenge, because every error found has to be examined closely and, if necessary, reacted to differently. This is why koality.io uses several methods to minimize "flickering": errors that appear briefly and quickly disappear again.
koality.io comes with a set of default settings that minimize the flickering of checks. If, in a special case, these are not enough, users can adjust the setup themselves so that no annoying alerts are sent.
Error classification and alerting
Our website monitoring service classifies found errors into two classes: critical and moderate. This helps you avoid being distracted from the important errors by non-critical notifications. In the alert configuration of koality.io, you can therefore set moderate (non-critical) errors to be sent only once a day in a bundle, so they do not distract.
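As a minimal sketch of this alerting policy: critical errors trigger an immediate notification, while moderate ones are collected and sent once a day as a digest. All names here (`handle_error`, `flush_daily_digest`) are invented for illustration, not koality.io's actual API.

```python
# Illustrative sketch: critical errors alert immediately, moderate
# (non-critical) errors are bundled into a daily digest.

pending_digest = []  # moderate errors waiting for the daily bundle

def handle_error(error, severity, send_alert):
    """Route an error: alert now if critical, else collect it."""
    if severity == "critical":
        send_alert([error])           # immediate notification
    else:
        pending_digest.append(error)  # held back for the digest

def flush_daily_digest(send_alert):
    """Send all collected moderate errors as one bundled alert."""
    if pending_digest:
        send_alert(list(pending_digest))
        pending_digest.clear()
```

A scheduler would call `flush_daily_digest` once per day; everything else is driven by incoming check results.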
Only alert after several errors in a row
Some systems flicker more than others. In such a case, you may need to intervene actively in the error handling. koality.io offers the option to treat an error as real only once it has occurred more than once in a row. Since such anomalies usually last only a very short time, this setting can work wonders against false alarms.
This setting can be adjusted per check and can be found directly under 'Settings' on the respective results page.
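The rule described above can be sketched in a few lines: only report an error once the last N results in a row are failures. The threshold and function names are made up for illustration; the real setting lives in the koality.io UI.

```python
# Minimal sketch of an "alert only after N consecutive failures" rule.
# FAILURE_THRESHOLD and should_alert are illustrative names, not
# koality.io's API.

FAILURE_THRESHOLD = 3  # alert only after this many failures in a row

def should_alert(results, threshold=FAILURE_THRESHOLD):
    """Return True if the last `threshold` results are all failures.

    `results` is a list of booleans: True = check passed, False = failed.
    """
    if len(results) < threshold:
        return False
    return all(r is False for r in results[-threshold:])

history = [True, True, False]   # one isolated failure: no alert yet
assert should_alert(history) is False
history += [False, False]       # three failures in a row: alert
assert should_alert(history) is True
```

A single flickering failure never reaches the threshold, so it never produces a notification.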
koality.io already provides a lot of techniques to prevent flickering results in the first place. These are continuously being refined, but already provide a very high level of reliability.
Smooth measured values
Especially with measured values such as loading times, there are always outliers, often caused by busy network connections or servers. For this reason, we decided to smooth the measured values by working with measurement series and taking the average over the last 5 measurements. A single outlier therefore does not immediately pull the data into the critical range.
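The smoothing described above can be sketched as a simple moving average over the last 5 measurements. The window size matches the text; the sample load times are invented for illustration.

```python
# Sketch of smoothing load-time measurements with a moving average
# over the last 5 values, as described above. Sample data is made up.

from collections import deque

def smoothed(values, window=5):
    """Return the series of running averages over the last `window` values."""
    recent = deque(maxlen=window)  # automatically drops the oldest value
    out = []
    for v in values:
        recent.append(v)
        out.append(sum(recent) / len(recent))
    return out

load_times = [1.1, 1.2, 1.0, 6.5, 1.1]  # one 6.5 s outlier
print(smoothed(load_times)[-1])         # ~2.18 s instead of a 6.5 s spike
```

The outlier still influences the average, but it no longer pushes a single data point into the critical range on its own.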
Restart on anomalies
We always use real Chrome browsers when performing website analyses. But browsers themselves can be a source of errors, because no software is perfect. For this reason, we restart an analysis completely from scratch whenever the algorithm has the "feeling" that something did not go through correctly: missing content, an unusual HTTP status code, or other failed requests, for example.
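A hedged sketch of this restart logic: if a run looks suspicious, discard the result and run the whole analysis again from scratch. `analyse` and `looks_anomalous` are hypothetical stand-ins for the real browser-based check, and the anomaly criteria here are just the examples named above.

```python
# Sketch of "restart the analysis on anomalies": a suspicious result
# (unexpected status, missing content) is thrown away and the check is
# re-run from scratch, up to a small limit.

def looks_anomalous(result):
    """Heuristic anomaly check on one analysis result."""
    return result["status"] != 200 or not result["content"]

def run_with_restart(analyse, max_restarts=2):
    """Run `analyse`, restarting from scratch if the result looks off."""
    result = analyse()
    for _ in range(max_restarts):
        if not looks_anomalous(result):
            break
        result = analyse()  # start completely over with a fresh browser
    return result
```

After `max_restarts` attempts the last result is reported as-is, so a genuinely broken site still produces an error.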
Change data center
When checking reachability, we go one step further. As soon as we can't reach a customer's website, we retry the same request from a completely different data center. As a result, false alarms here can be reduced to virtually zero.
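The idea can be sketched as follows: only report a site as down if probes from several independent locations all fail. The region names and the `check_from` callback are invented for illustration.

```python
# Sketch of the "try again from another data center" idea for
# reachability checks: a site counts as reachable if any probe
# location can reach it.

REGIONS = ["eu-west", "us-east"]  # hypothetical probe locations

def is_reachable(url, check_from):
    """Report the site as down only if every region fails to reach it.

    `check_from(region, url)` returns True if the probe in `region`
    could reach the site.
    """
    return any(check_from(region, url) for region in REGIONS)
```

A transient network problem between one data center and the site no longer triggers an alert, because the second location still gets through.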
It's nice that you are reading our magazine. But what would be even nicer is if you tried our service. koality.io offers extensive website monitoring especially for web projects: uptime, performance, SEO, security, content and tech.
I would like to try koality.io for free