To be more specific, a robots.txt file. If you have a web developer, just ask them and they'll let you know if you have one.
If you’re signed up with our website management plans, we should have one in place for you.
If you’re not sure and want to check yourself, you will need to FTP or log into your hosting account.
Next, go to your public html or www directory, and you should see a file called robots.txt in that directory. If you don’t see the file, you’re going to have to create one.
So, here's what this file is for and why you need it…
To start, the robots.txt file isn't actually a file full of robots, contrary to what some people may believe. It's actually a set of instructions for search engine crawlers, which are also known as robots, bots, or user agents. It tells those crawlers which parts of your site they're allowed to access and which parts are off limits. It can send instructions to all user agents at once, or to specific ones.
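As a quick sketch of that idea, a robots.txt file can mix general rules with crawler-specific ones. The paths below are just placeholders for illustration:

```
# Applies to every crawler
User-agent: *
Disallow: /private/

# Applies only to Google's crawler
User-agent: Googlebot
Allow: /
```

A crawler follows the most specific group of rules that matches its user agent name.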
Here’s an article from Google on how to create a robots file.
To get started, just create a file in a plaintext editor, save it as robots.txt, and paste in the following:

User-agent: *
Then save the file. Next you will add what is disallowed and allowed. Refer to the Google article above for further syntax on allowing and blocking resources.
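For example, a simple robots.txt that blocks an admin area while allowing everything else might look like this. The /admin/ path and sitemap URL are placeholders; substitute your own directories and domain:

```
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

Listing your sitemap here is optional, but it helps crawlers find the pages you do want indexed.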
After it’s created, just upload it to your www directory and you should be all set. You’ll want to make sure it’s free from errors. You can use Google’s error checker here.
Creating a robots file is simple and essential to ensure search engines crawl your website the way you want them to. So make sure yours is up to date, keeping your private pages private and letting search engines index the pages you want to be seen.