Nowadays, a lot depends on SEO and your website's visibility in search engines. Website owners do their best to make their pages rank. However, there might be some pages they don't want to appear in search results.

To do that, you configure default robots. And since a blog is a powerful traffic generation tool, you might want to configure default robots for blog pages too.

So, in this article, you will learn more about default robots for blog pages.

What are the default robots?

Default robots, like the robots.txt file, follow the standard websites use to communicate with web robots. They let web crawlers know which pages of your website you want and don't want to be scanned and processed.
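
In HTML, these directives appear as a robots meta tag. The two most common combinations are shown below; INDEX/FOLLOW and NOINDEX/NOFOLLOW are the standard values crawlers recognize:

    <meta name="robots" content="INDEX,FOLLOW">      <!-- index the page and follow its links -->
    <meta name="robots" content="NOINDEX,NOFOLLOW">  <!-- keep the page out of search results -->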

Why do you need the default robots?

Privacy and SEO. Just as you can "hide" some pages of your website from web robots, you can tell search engines exactly which pages you want to be analyzed and shown in Google search results.

In Magento 2 Blog you can set the Default Robots for blog Author pages.

Default Blog Author Page Robots

Besides Author pages, you can configure default robots for Tag and Search pages.

Magento 2 default robots

Once you set the default robots for blog tag, author, and search pages, the corresponding meta tag will be included before the closing </head> tag in the page HTML.
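
For example, if you set blog search pages to NOINDEX,NOFOLLOW, the rendered page source would contain something like this (a simplified sketch; the surrounding markup depends on your theme):

    <head>
        ...
        <meta name="robots" content="NOINDEX,NOFOLLOW">
    </head>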

However, we also recommend duplicating these rules in your website's robots.txt file.
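
For instance, if you don't want blog search and tag pages crawled, and your blog runs under the /blog/ URL prefix (an assumption; the actual path depends on your blog configuration), the robots.txt rules could look like this:

    User-agent: *
    Disallow: /blog/search/   # block crawling of blog search pages
    Disallow: /blog/tag/      # block crawling of blog tag pages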

Check out this guide on how to configure the robots.txt file in Magento 2 to learn more about each default robots option and find examples of a Magento 2 robots.txt file.