What are the Default Robots?
The Default Robots setting controls the robots meta tag, which, much like a robots.txt file, is a standard way for websites to communicate with web robots. It lets web crawlers know which pages of your website you do and do not want to be scanned and processed.
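For comparison, crawler rules can also be declared site-wide in a robots.txt file at the domain root. A minimal illustrative example (the paths shown are hypothetical, not part of any specific setup):

```text
# robots.txt — applies to all crawlers
User-agent: *
# Allow everything except a hypothetical private section
Disallow: /private/
```

The Default Robots setting described here works per page instead, via a meta tag in the page's HTML.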
Why do you need the Default Robots?
Privacy and SEO. Just as you can "hide" some pages of your website from web robots, you can also tell search engines exactly which pages you want to be analyzed and shown in Google search results.
In Magento 2 Blog you can set the Default Robots for Tag, Search, and Author pages; the corresponding robots meta tag is included before the closing head tag in the page HTML.
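The tag rendered in the page head might look like the following sketch; the exact directive values (e.g. NOINDEX,FOLLOW) depend on what you choose in the setting:

```html
<head>
  <!-- ...other head elements... -->
  <!-- Example: ask crawlers not to index this page but still follow its links -->
  <meta name="robots" content="NOINDEX,FOLLOW"/>
</head>
```

Common values are combinations of INDEX/NOINDEX (whether the page appears in search results) and FOLLOW/NOFOLLOW (whether crawlers follow the page's links).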