People today use DA (Domain Authority) to discover which websites are trustworthy in their niche and to figure out where to get good links. The robots.txt file is then parsed and instructs the robot about which pages are not to be crawled. As a search engine crawler https://www.youtube.com/watch?v=Oe36fe1qPtQ
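To illustrate how a crawler honors robots.txt, here is a minimal sketch using Python's standard `urllib.robotparser`. The rules below are a hypothetical robots.txt used only for illustration, not taken from any real site:

```python
from urllib import robotparser

# Hypothetical robots.txt content (illustration only)
rules = """
User-agent: *
Disallow: /private/
Allow: /
""".splitlines()

# Parse the rules as a crawler would after fetching /robots.txt
rp = robotparser.RobotFileParser()
rp.parse(rules)

# A well-behaved crawler checks each URL before fetching it
print(rp.can_fetch("*", "https://example.com/private/page"))  # blocked
print(rp.can_fetch("*", "https://example.com/public/page"))   # allowed
```

In practice a crawler would call `rp.set_url("https://example.com/robots.txt")` followed by `rp.read()` to fetch the live file, then skip any URL for which `can_fetch` returns `False`.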