Beginning Web Developer Course: White Hat Search Engine Optimization (SEO)
Once you've created your site and applied the basic optimization we covered in the last article to all important pages, your first step is to submit your site to the Big Three search engines via their not-always-obvious submission pages: Google, Yahoo, and Bing. It's true that they all operate "crawler bots" that constantly scan the web for new sites and pages, but if your site is new, it's unlikely they'll find a link to it without a bit of a head start.
Let Me Google That For You

So now, after all your hard work creating content and tweaking META tags, you've submitted your site to Google. After that, most webmasters patiently wait a total of 30 or even 45 whole seconds before checking to see whether their site has shown up yet. It does, eventually, and then you're consumed with the desire to see it ever higher on the charts.
This is a very dangerous time. It is when you will be most tempted to try Black Hat techniques, to plant secret link farms, to stuff keywords (any keywords!) into your META tags, or even the most shameful thing of all...to send $99 to one of those services that guarantee to get you listed in every search engine and directory known to humankind. In short, you're letting your search engine ranking affect your self-confidence.
But how, exactly, are you searching for your site? If you type the URL into Bing, Yahoo, or Google and it's in their database, your site will be the first listing. Take a screenshot of that and show it to all your friends, by all means, but that's not a fair test: you already know your site's URL. The question is, what keywords will people use to find your site if they don't know your site's name? If you've named it well, between the domain name itself and the Title tag, that's worth a lot right there, but keywords and key phrases are the keys to success. If you could think of every possible way people will search for your site, you could use those keywords and phrases in your content and META tags, and you'd be a lot closer to search engine nirvana.
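To make this concrete, here is a minimal sketch of what a well-named page head might look like. The site name, phrases, and file contents are invented for illustration; the point is simply where the Title tag and META tags go and what kind of text belongs in each:

```html
<head>
  <!-- The Title tag carries real weight with search engines; lead with the phrases people actually search for -->
  <title>Springfield Widget Works - Handmade Wooden Widgets and Widget Repair</title>
  <!-- The description often becomes the snippet shown under your listing in results -->
  <meta name="description" content="Handmade wooden widgets, custom widget design, and widget repair in Springfield.">
  <!-- Keywords should honestly reflect the page content; stuffing them is a Black Hat move -->
  <meta name="keywords" content="wooden widgets, handmade widgets, widget repair, Springfield">
</head>
```

Notice that every phrase in the keywords list also appears naturally in the title or description, which is exactly the kind of consistency crawlers reward.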
Controlling The Robots

Even though most beginning webmasters want every single page of their site in the search engines, your site can actually be penalized in the rankings if the crawler bots find pages with no useful content or pages they can't access. For instance, if your site has password-protected pages, or pages that could be considered spam (such as a page with nothing but keywords on it, or just links back to other pages on your site), it's best to create a small file called robots.txt in your site's root directory that instructs crawlers to ignore those pages. Here's an example of what that file could look like, assuming a members-only directory and the other pages mentioned:
User-agent: *
Disallow: /members/
Disallow: /keywords.html
Disallow: /internal-links.html
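If you'd like to verify what your robots.txt actually blocks before uploading it, Python's standard urllib.robotparser module applies the same exclusion rules well-behaved crawlers follow. A minimal sketch, using the example file above and a hypothetical example.com domain:

```python
from urllib.robotparser import RobotFileParser

# The robots.txt contents from the example above
robots_txt = """\
User-agent: *
Disallow: /members/
Disallow: /keywords.html
Disallow: /internal-links.html
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Any crawler ("*") is told to skip the listed paths...
print(parser.can_fetch("*", "http://example.com/members/profile.html"))  # False
print(parser.can_fetch("*", "http://example.com/keywords.html"))         # False

# ...while the rest of the site stays crawlable.
print(parser.can_fetch("*", "http://example.com/index.html"))            # True
```

Remember that robots.txt is advisory, not a security measure: polite crawlers honor it, but it won't keep a determined visitor out of your members-only directory.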