
You might already know that a large part of your website traffic comes from non-human visitors (bots). These could be search engines crawling your site, bots with darker intentions, or perhaps a malfunctioning script on your own site.

It's good to monitor all of this traffic: your pages should be sufficiently crawled by search engines, and from a security point of view it's important to know whether your site is being compromised.

JavaScript-based tracking relies on the client: when the client does not support JavaScript, this method fails to report, and that traffic is invisible. Web server log files, by contrast, record information about both human and non-human traffic.

JavaScript tracking is not as accurate as we would like it to be because of:

  • Visitors who use browsers with JavaScript disabled
  • Visitors who block or delete cookies
  • Cookies that time out
  • Impatient visitors who click away before the tagged page loads completely
  • The bots mentioned above, the majority of which do not support JavaScript
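
Because the server log records every request, you can estimate the bot share of your traffic directly from it. The following is a minimal sketch in TypeScript for Node, assuming a combined-format access log at ./access.log; the keyword list used to spot crawlers is an illustrative heuristic, not an exhaustive one:

    import { createReadStream } from "node:fs";
    import { createInterface } from "node:readline";

    // Substrings that commonly appear in crawler user agents (illustrative only).
    const BOT_HINTS = ["bot", "crawler", "spider", "slurp"];

    async function countTraffic(path: string): Promise<void> {
      const counts = { human: 0, bot: 0 };
      const lines = createInterface({ input: createReadStream(path) });
      for await (const line of lines) {
        // In the combined log format the user agent is the last quoted field.
        const match = line.match(/"([^"]*)"\s*$/);
        const ua = (match?.[1] ?? "").toLowerCase();
        const isBot = BOT_HINTS.some((hint) => ua.includes(hint));
        counts[isBot ? "bot" : "human"] += 1;
      }
      console.log(counts); // e.g. { human: 8412, bot: 3120 }
    }

    countTraffic("./access.log");

A real analysis would rely on a maintained user-agent list, but even this rough split makes the otherwise invisible traffic visible.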

An important reason for using web server logs is site errors. Nothing is more frustrating than discovering that visitors are lost because of broken links, missing pages, or other errors. Although it is possible to track customized 404 pages, JavaScript-based tracking is not well suited for error reporting.
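
Since every failed request ends up in the log together with its status code, broken links are easy to find there. This sketch, under the same assumptions as above (a combined-format log at ./access.log), lists the URLs that returned 404, most frequent first:

    import { createReadStream } from "node:fs";
    import { createInterface } from "node:readline";

    async function list404s(path: string): Promise<void> {
      const hits = new Map<string, number>();
      const lines = createInterface({ input: createReadStream(path) });
      for await (const line of lines) {
        // Combined log format: ... "GET /some/path HTTP/1.1" 404 1234 ...
        const m = line.match(/"[A-Z]+ (\S+) [^"]*" (\d{3})/);
        if (m && m[2] === "404") {
          hits.set(m[1], (hits.get(m[1]) ?? 0) + 1);
        }
      }
      // Print the most frequently missed URLs first.
      const sorted = [...hits.entries()].sort((a, b) => b[1] - a[1]);
      for (const [url, count] of sorted) {
        console.log(`${count}\t${url}`);
      }
    }

    list404s("./access.log");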

On the other hand, there is also some data that can't be recorded in the log file, e.g. the screen resolution and color depth of your visitor's browser. The JavaScript tracker has an advantage here.
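
For completeness, this is roughly how a JavaScript tracker collects such data in the browser; the /track endpoint and the payload shape here are made-up examples, not any specific tracker's API:

    // Runs in the browser as part of a tracking snippet.
    const payload = {
      width: screen.width,           // screen resolution
      height: screen.height,
      colorDepth: screen.colorDepth, // bits per pixel
      page: location.pathname,
    };
    // sendBeacon posts the data without delaying navigation.
    navigator.sendBeacon("/track", JSON.stringify(payload));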
