Specify the user agent, crawl delay, and the paths to allow or disallow.
The tool then generates the correctly formatted robots.txt directives for you.
Download the generated robots.txt and place it in the root folder of your website.
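A generated file might look like the following sketch; the user agent, delay, and paths shown here are placeholder values, not output from any specific generator:

```
# Applies to all crawlers
User-agent: *
# Ask crawlers to wait 10 seconds between requests (not honored by all bots)
Crawl-delay: 10
# Block the /admin/ section, but allow its public subfolder
Disallow: /admin/
Allow: /admin/public/
```

Crawlers look for this file only at the root of the host (e.g. `/robots.txt`), which is why it must sit in your site's root folder rather than a subdirectory.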