Google used to unofficially support a Noindex directive in robots.txt. However, in 2019 they announced that the directive would no longer work.
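For context, the unofficial directive was written inside robots.txt much like a Disallow rule. Here is a minimal sketch (the path is a hypothetical example, and Google no longer honors this syntax):

```
# Retired, unofficial syntax that Google no longer honors
User-agent: *
Noindex: /private/
```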
Here is what Google's John Mueller said about the Noindex directive in robots.txt:
> We used to support the no-index directive in robots.txt as an experimental feature. But it's something that I wouldn't rely on. And I don't think other search engines are using that at all.
Before Google announced the discontinuation, deepcrawl.com ran tests on the directive and found that:
- Before 2019, it still worked with Google
- It prevented URLs from appearing in the search index
- URLs that had been noindexed in robots.txt were marked as such in Google Search Console
Given that Google has discontinued the feature, it shouldn't be used anymore.
Instead, use the robots meta tag, which is well supported and documented, to prevent indexing:
```html
<meta name="robots" content="noindex" />
```
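For non-HTML resources such as PDFs or images, where a meta tag can't be added, Google also documents the X-Robots-Tag HTTP response header as an alternative. A minimal nginx sketch, assuming a hypothetical /private/ path:

```nginx
# Send a noindex signal via the X-Robots-Tag response header.
# The /private/ location is an assumption for illustration.
location /private/ {
    add_header X-Robots-Tag "noindex";
}
```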