Robots.txt Generator tool is brought to you by SEOTOOLSPORTAL.
This tool quickly generates a robots.txt file for your site, letting you specify custom user-agents and allow or disallow parts of your website more easily than writing the file from scratch.
If you are not sure what a robots.txt file is, see Google's robots.txt documentation.
In a nutshell, it tells search engines which parts of your website their crawlers may visit. It can also be used to block specific sections from specific user agents, for example while testing.
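For reference, a minimal robots.txt might look like the sketch below (the paths and the choice of Googlebot are placeholders for illustration, not output of the tool):

```text
# Allow Google's crawler everywhere except a private folder
User-agent: Googlebot
Disallow: /private/

# Block all other crawlers from the admin area only
User-agent: *
Disallow: /admin/
```

Each User-agent block applies to the named crawler, and each Disallow rule is matched against URL paths relative to the root of your site.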
The tool is easy to use. You simply enter your site's URL and choose, for each of the listed user-agents, whether to allow or disallow it from accessing your website.
Once you have made your choices, click "Generate" and the robots.txt file will be displayed in a popup box which you can then download to your computer.
The robots.txt file is saved as a plain .txt file, which can be opened in any text editor (for example, Notepad or WordPad).
You can then upload the file to the root of your website using your favorite FTP client (commonly FileZilla).
Finally, make sure the file is world-readable (on most Unix hosts, permissions of 644) so that search engines can fetch it while only you can edit it.
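If your host gives you shell or SSH access, one way to set suitable permissions is with chmod; this is a sketch, assuming robots.txt sits in your web root and that your host follows standard Unix permission conventions:

```shell
# Make robots.txt readable by everyone, writable only by the owner (644).
# Run this from your site's web root directory.
chmod 644 robots.txt

# Optional: confirm the result (GNU coreutils `stat`).
stat -c %a robots.txt
```

Many FTP clients, including FileZilla, also let you set the same permissions through a right-click "File permissions" dialog if you have no shell access.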
That's all you need to do! You can now enjoy a more simplified way of allowing and disallowing parts of your website for crawlers.
This tool is the property of SEOTOOLSPORTAL and is free to use.