If you don’t want sensitive information on your website to appear when people search on Google, you need to prevent Googlebot (the automated software that fetches pages from websites and indexes them) from crawling and indexing those pages.
Don’t worry: a “robots.txt” file tells search engines which parts of your site they may access and therefore crawl, and Google Search Console has a friendly robots.txt generator that helps you create this file.
The robots.txt file must be placed in the root directory of your website.
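As a rough sketch, a robots.txt that asks every crawler to stay out of an example /private/ directory (the path is just a placeholder) could look like this:

    # Apply the rule to all crawlers
    User-agent: *
    # Request that they skip everything under /private/ (placeholder path)
    Disallow: /private/

Googlebot would then skip URLs under /private/ when crawling, although the rule is only a request, not an enforcement mechanism.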
How To Completely Block Access to a Webpage
Using robots.txt is not a full guarantee that sensitive information on your site stays hidden, because some non-compliant search engines will ignore the instructions in the robots.txt file.
In these cases, use the noindex tag if you just want the page not to appear in Google search results but don’t mind if any user with a link can reach it. For real security, though, you should use proper authorization methods, such as requiring a password or taking the page off your site entirely.
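For illustration, the noindex rule goes in the HTML head of the page you want kept out of Google’s results (the surrounding markup here is just a minimal example):

    <head>
      <!-- Tell compliant search engines not to index this page -->
      <meta name="robots" content="noindex">
      <title>Private page</title>
    </head>

For non-HTML files such as PDFs, the same effect can be achieved by having your server send an X-Robots-Tag: noindex HTTP response header. Note that a page must remain crawlable (not blocked in robots.txt) for Google to see the noindex rule at all.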