Robots.txt is one of the simplest files on a website: a plain text file containing a set of rules the site's owner gives to Google and other search engines. Simple as it is, robots.txt is a powerful tool in technical SEO, and it should be handled with care, because even minor errors can hurt SEO rankings and prevent search engine bots from crawling the important pages and content on your website. Think of robots.txt as the mediator between your website and search engine bots (or spiders): it tells them which pages should be crawled and which should not. A well-optimized robots.txt lets you spend your website's crawl budget efficiently.
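As a minimal sketch, a robots.txt file lives at the root of the domain (for example, example.com/robots.txt) and might look like the following; the paths and sitemap URL here are hypothetical, not taken from any real site:

```txt
# Apply these rules to all crawlers
User-agent: *
# Keep bots out of sections that should not appear in search results
Disallow: /admin/
Disallow: /cart/
# Everything else may be crawled
Allow: /
# Point crawlers at the sitemap (hypothetical URL)
Sitemap: https://www.example.com/sitemap.xml
```

Each `User-agent` line starts a group of rules for a named crawler (`*` means all of them), and `Disallow`/`Allow` lines list URL path prefixes that bots should skip or may visit.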
Crawl budget is the number of pages that Google and other search engine bots will crawl and index into their database within a given timeframe. Each search engine sets this budget per website: if bots were allowed to visit a site at any time and crawl every page every day, they would disrupt the user experience and might even crash the whole website. The main purpose of a crawl budget is therefore to preserve an uninterrupted experience for the site's visitors.
This is why a well-optimized robots.txt matters: by directing search engine bots to crawl only the most important pages and content on a website, it spends the crawl budget efficiently, which in turn benefits SEO rankings. Robots.txt is a purely technical SEO concern.
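To sanity-check how a crawler would interpret your rules before deploying them, you can use Python's standard-library `urllib.robotparser`, which implements the same matching logic bots rely on. The rules and URLs below are hypothetical examples, not a recommendation for any particular site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules: block the /admin/ section for all crawlers
rules = """\
User-agent: *
Disallow: /admin/
Allow: /
"""

parser = RobotFileParser()
# parse() accepts the file's contents as a list of lines
parser.parse(rules.splitlines())

# A public page is crawlable; the blocked section is not
print(parser.can_fetch("Googlebot", "https://www.example.com/blog/post"))    # True
print(parser.can_fetch("Googlebot", "https://www.example.com/admin/login"))  # False
```

Running a check like this against every important URL on the site is a cheap way to catch a misplaced `Disallow` before it blocks pages you actually want indexed.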