The robots.txt file is important for the search engine optimization of your website because it tells search engine robots (crawlers) which parts of the site they may crawl. Within the robots.txt file, you can specify whether or not you’d like bots to crawl your site, and you can also disallow specific bots by name. Keep in mind that robots.txt is advisory: the web server does not enforce it, and only well-behaved crawlers honor its rules.
Most of the time, hosting companies leave the robots.txt file in its default state, in which no bots are blocked. However, you may want to check the file or ask your web host to confirm that it allows search engine bots. Note that robots.txt cannot block specific IP addresses or visitors; that requires server-level configuration, such as .htaccess rules or a firewall.
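To see how crawlers interpret these rules, here is a short sketch using Python's standard-library `urllib.robotparser`. The rules below are a made-up example: they disallow a hypothetical crawler named "BadBot" entirely while allowing all other bots everywhere except a `/private/` directory.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: block "BadBot" completely,
# and keep all other bots out of /private/ only.
rules = """\
User-agent: BadBot
Disallow: /

User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A well-behaved crawler checks can_fetch() before requesting a URL.
print(parser.can_fetch("Googlebot", "https://example.com/index.html"))  # True
print(parser.can_fetch("Googlebot", "https://example.com/private/a"))   # False
print(parser.can_fetch("BadBot", "https://example.com/index.html"))     # False
```

This also illustrates why robots.txt is only advisory: nothing stops a crawler from skipping the `can_fetch()` check and requesting the URL anyway.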