What is the standard version of the robots.txt file?
In a robots.txt file you can specify which files on your website crawlers may access. The robots.txt file is located in the document root of the website - the folder that contains all of the website's files. If we take www.yourdomain.ch as an example, the path of the robots.txt file would be www.yourdomain.ch/robots.txt.
The robots.txt file is a plain text file that follows the "Robots Exclusion Standard" (for more on this: https://en.wikipedia.org/wiki/Robots_exclusion_standard). In this file you define rules that allow or block specific crawlers from accessing files and folders within the document root of the domain or subdomain. If nothing is specified in the robots.txt file, crawlers can access all files.
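As a simple illustration (the crawler name and folder are only placeholders), the following rules would block one specific crawler from a private folder while leaving everything accessible to all other crawlers:

User-agent: ExampleBot
Disallow: /private/

User-agent: *
Disallow: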
What comes with the standard version of Hostpoint's robots.txt file?
If you haven't uploaded your own robots.txt file, Hostpoint's standard version is used instead. It allows access to all files and defines a Crawl-delay (the interval between two requests, in seconds), so that a bot has to wait a few seconds between consecutive requests instead of sending them back to back.
User-agent: *
Crawl-delay: 3
(Please note that Crawl-delay is not part of the standard, so not all bots respect it.)
How can I overwrite the standard version of the robots.txt file?
If you want to create your own robots.txt file, place it in the document root of your website and add as many rules as you wish. You can find more information on robots.txt files at the following link: https://moz.com/learn/seo/robotstxt
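A custom robots.txt file could look like this, for example (the folder names are only placeholders; the Sitemap line is optional and, like Crawl-delay, not part of the original standard but widely supported):

User-agent: *
Crawl-delay: 3
Disallow: /tmp/
Disallow: /internal/

Sitemap: https://www.yourdomain.ch/sitemap.xml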