Include a robots.txt File
By far the easiest of these top 10 SEO tips is to include a robots.txt file at the root of your website. Open a text editor such as Notepad and type “User-agent: *”. Then save the file as robots.txt and upload it to the root directory of your domain. This one line tells any spider that hits your website, “please feel free to crawl every page of my site”.
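As a minimal sketch, the complete allow-everything robots.txt looks like this (the empty Disallow line is the conventional way of saying “block nothing”, though most crawlers treat the User-agent line on its own the same way):

    User-agent: *
    Disallow: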
Here’s another of my best top 10 SEO tips: because the search engine analyzes everything it indexes to determine what your website is about, it is a good idea to block any folders and files that have nothing to do with the content you want analyzed. You can keep unrelated files from being read by adding a line such as “Disallow: /folder_name/” or “Disallow: /filename.html”, as shown in the example below.
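Putting the two tips together, a robots.txt that invites crawling but keeps spiders out of unrelated areas might look like the following; “/folder_name/” and “/filename.html” are placeholders for whatever folder or file you want to hide on your own site:

    User-agent: *
    Disallow: /folder_name/
    Disallow: /filename.html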