What is Robots.txt?

A plain-text file placed at the root of a website (at /robots.txt) that tells search engine crawlers which pages or directories they should not crawl. Site owners use it to keep crawlers away from low-value or duplicate pages. Note that it controls crawling, not indexing: a page blocked in robots.txt can still appear in search results if other sites link to it, and the file is purely advisory, so it should not be relied on to protect sensitive content.
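To make the rule syntax concrete, here is a minimal sketch using Python's standard-library `urllib.robotparser` to check URLs against a hypothetical robots.txt (the `example.com` domain and paths are illustrative, not from any real site):

```python
import urllib.robotparser

# Hypothetical robots.txt rules for an example site
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /drafts/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Paths under a Disallow rule are blocked for the matching user agent;
# anything else is allowed by default.
print(rp.can_fetch("*", "https://example.com/admin/login"))  # False
print(rp.can_fetch("*", "https://example.com/blog/post"))    # True
```

Well-behaved crawlers perform an equivalent check before fetching each URL; misbehaving bots simply ignore the file.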