10. Gzip Compression: Enable Gzip compression on your server to reduce the amount of data transferred between the server and the user's browser; a minimal sketch of how this works is shown after this section.

For a search engine crawler, the robots.txt file is parsed and tells the robot which pages are not to be crawled. https://www.youtube.com/watch?v=0USI3vTZ0aw
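As an illustration of the robots.txt point, here is a minimal sketch using Python's standard urllib.robotparser module; the example rules, user agent name ("MyBot"), and URLs are placeholders, not taken from the original article.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration only.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
# Parse the rules; a real crawler would fetch https://example.com/robots.txt first.
parser.parse(ROBOTS_TXT.splitlines())

# The parsed rules instruct the crawler which pages it may not fetch.
print(parser.can_fetch("MyBot", "https://example.com/index.html"))        # True
print(parser.can_fetch("MyBot", "https://example.com/private/page.html")) # False
```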
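Returning to item 10, the sketch below shows the basic idea behind Gzip compression on the server side, assuming a plain Python http.server handler; the page body, port, and handler name are placeholders rather than part of any specific setup described in the article. In practice this is usually enabled in the web server configuration (for example in Apache or Nginx) rather than written by hand.

```python
import gzip
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical page body used only to demonstrate the size reduction.
PAGE = b"<html><body>" + b"Hello, world! " * 200 + b"</body></html>"

class GzipHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = PAGE
        encoding = None
        # Only compress when the browser advertises gzip support.
        if "gzip" in self.headers.get("Accept-Encoding", ""):
            body = gzip.compress(body)
            encoding = "gzip"

        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        if encoding:
            # Tells the browser the body is gzip-compressed.
            self.send_header("Content-Encoding", encoding)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), GzipHandler).serve_forever()
```

The compressed response is typically a fraction of the original size for text content such as HTML, CSS, and JavaScript, which is why enabling compression reduces the data transferred between server and browser.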