Wednesday, June 25, 2008

Answers to the Questions on SEO

1. What are the most important elements of SEO?
The main elements are to ensure that the key content on the site is readable by a search engine spider (i.e. in HTML text format) and that all pages on the site are linked in some way via HTML text links. If this is done, your site has a chance to compete in search results. Once this is achieved, it is generally just a case of tweaking the wording, using headings for emphasis and organizing content around specific themes.

It is also crucial to pay attention to page titles and the descriptive snippet (meta description) that appears in search results. Having a bland or confusing listing in the results is as bad as not being there at all. The search results are the first point of sale for the business and should be treated that way.
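As a rough illustration (the page name and wording here are invented for the example), a page-specific title and meta description sit in the head of the page like this:

  <head>
    <!-- Title: appears as the clickable headline in the search results -->
    <title>Handmade Leather Wallets | Example Store</title>
    <!-- Meta description: the snippet shown under the title; write it as a pitch, not filler -->
    <meta name="description" content="Browse our range of handmade leather wallets, crafted locally and shipped worldwide.">
  </head>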

2. How does the quality of a website impact its search ranking?
Firstly, quality encourages links. People search for quality content, people bookmark quality content and people link to quality content. Ultimately it will be the relative volume (and quality) of links to your site that search engines record (as compared to a competitor) that determines the ranking.

Secondly, Google does not treat all links equally. When calculating the all-important PageRank it compares the content of the site providing the link with that of the destination and weights accordingly. The more relevant the association between the content on the two sites, the greater the value the link will be given. Therefore the higher the “quality” of the referring site, the more value is passed on. Google also allocates PageRank scores, and these too are passed on via links, so a site with a “quality” PageRank passes on more search engine listing value than one with a low score.

3. Should an SEO strategy take into account the differences between search engines and web directories?
In theory you could optimize for all the different flavors of search engine and directory. At times the distinction between a search engine and a directory can also be a little hazy. Yahoo, for example, is strictly speaking a directory rather than a search engine, yet in operation it is much more like a search engine than a pay-to-play web directory such as Yellow Pages. So while you could tailor your strategy down to a really granular level for every different type of search engine and directory, in practice the most cost-effective approach is to target Google, as it has the most stringent requirements; generally speaking, if you can crack Google you will also do well in the others.

With regard to the true directories, I generally encourage clients to run some paid search ads to judge which keywords perform best and what wording works best in terms of delivering customers to the site, and then to use this as the basis for directory listings and for refining descriptions for the free results. So in this regard the strategy should be inclusive across all listing mediums.

4. What are some of the most common website flaws in terms of SEO?
· Key content in formats a spider cannot read, i.e. images, Flash movies, JavaScript etc.
· Menu navigation that cannot be followed by search engine spiders, i.e. navigation using Flash, images or JavaScript without an alternative means of access via text links or a central site map (see the sketch after this list).
· Bland and undersold page titles and descriptions. Every page on a website is an opportunity to target increasingly granular themes and keywords. Many sites take a universal approach to page titles and descriptions and try to make them suit everyone and everything, but the result is that they are attractive to no one.
· Lack of thought in content wording. Web copywriting is different from that required for offline purposes, as it must target not only humans but search engine cataloguing systems. Wording therefore needs to be tight and to reinforce content themes and the trigger words that someone is likely to search for.
· Attempts to spam search engines. The best results come from tightly organized and worded pages built around relevant themes and linked via consistent means to other similar pages. Excessive repetition and the use of link farms etc. are often counterproductive in the long term.
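To illustrate the navigation point above, a simple fallback for a Flash or JavaScript menu is a set of plain HTML text links that a spider can always follow (the page names here are hypothetical):

  <!-- Flash or JavaScript menu sits here for human visitors -->
  <!-- Plain HTML text links give spiders a path they can always follow -->
  <ul>
    <li><a href="/products.html">Products</a></li>
    <li><a href="/services.html">Services</a></li>
    <li><a href="/contact.html">Contact us</a></li>
  </ul>
  <a href="/sitemap.html">Site map</a>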

5. Is it better to submit a site to a search engine or let the search engine find it?
As long as you are not doing it repeatedly, there is really no harm in submitting your site to a search engine, although this will not necessarily speed up the indexing process. The best way to get indexed is to work on getting links to your site from other sites that have already been picked up by Google and have reasonable PageRank, which provides relative assurance that the search engines will spider them again.

One of the things that I generally get my clients to do (if appropriate) when a new site is launched is to run a few Google AdSense contextual ads on the site (at least for a little while). This will force Google to index the site so that it can serve appropriate contextual ads.

6. What sort of tactics constitute ‘SEO spam’ and what are the consequences?
The difference between good SEO and bad SEO is often simply a question of degree. A few highly optimized landing pages can provide great results in delivering searchers deep into the site and streamline this process for all concerned. Taken to the extreme this can turn into link farms with pages of links bouncing users and spiders around annoyingly.
Generally speaking the following are things to be careful of:
· Link farming. Pages of links linking to each other and set up for the purposes of passing on search engine points rather than providing value to customers.
· Over-optimizing through repetition. Just repeating the same keyword over and over provides little value. The key is to provide context, not rubbish. Read it aloud; if it sounds clumsy, rewrite it.
· Invisible text, e.g. white text on a white background, intended to be picked up by search engines while remaining invisible to the naked eye. Search engines are smart enough to detect this now.
· Using redirects or other forms of cloaking to serve different content to search engine spiders than to browsers.

7. If a website is properly search engine optimized, is it really necessary to pay for search results?
The organic search indexing system is often slow; it can take up to three months for site changes to be picked up, so paying to play is often the only way to legitimately get quick results. Also, with search marketing it is really a case of page one or oblivion. If the keyword you are targeting is very popular and competitive, it will take longer to get to the first page of the search results, and paid search is often the only way to get there in the short term.

SEO and SEM (paid search engine marketing) should really be considered complementary strategies because they can feed off each other. Running PPC ads gives you great feedback on what customers are looking for and on the most successful sales pitches that deliver them to your site, and these findings can then be fed into your on-site SEO to refine it. The idea is that you run SEM until you can get into the page-one listings for free, and then divert the spend elsewhere.

8. What four search engines comprise 90%+ of all general (non site-specific) web search traffic?
Google, Yahoo!, MSN/Live and Ask, though AOL would also be acceptable (AOL, however, pulls its search results from Google and is thus more of a portal that includes Google’s search engine than a true search engine itself).

9. Explain the concept - “the long tail of search.”
The long tail is an economic theory of demand. It posits that in the modern American economy every sector and segment of demand contains both popular and unpopular products, and that in any given sector the demand curve has a few popular products with high demand and a great number of unpopular products with much smaller demand per product. Long tail theory says that in any given demand curve, the “tail” of unpopular products, when combined, will have a greater amount of demand than the popular products at the “head.”

10. Name the three most important elements in the head section of an HTML document that are employed by search engines.
Title, Meta Description and Meta Robots are the big 3. Although Meta Robots isn’t essential to have, it’s certainly able to control spider and search activity. Meta keywords is another common answer, but it would rank as a distant 4th, as our experiments show that none of the major engines will rank a page for a keyword that is listed only in the Meta keywords tag.
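As a quick sketch of how the third element is used (the page and the directive values are hypothetical and depend on what you want for that page), the meta robots tag sits in the head alongside the title and description:

  <head>
    <title>Members Area | Example Site</title>
    <meta name="description" content="Private members area for Example Site.">
    <!-- Meta robots: tells spiders not to index this page but still to follow its links -->
    <meta name="robots" content="noindex, follow">
  </head>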

11. How do search engines treat content inside an IFrame?
The engines all interpret content in an embedded IFrame as belonging to a separate document from the page displaying the IFrame content. Thus links and content inside IFrames refer to the page they come from, rather than the page they are placed on. For SEO, one of the biggest implications of this is that links inside an IFrame are interpreted as internal links (coming from the site the IFrame content is on) rather than external links (coming from the site embedding the IFrame).
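For example (a hypothetical embed with a made-up URL), any links inside the document loaded below are credited to example-source.com rather than to the page embedding it:

  <!-- Only the iframe element itself belongs to the embedding page; -->
  <!-- the content and links inside are attributed to example-source.com -->
  <iframe src="http://www.example-source.com/frame-content.html" width="600" height="400"></iframe>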

13. What action does Google threaten against websites that sell links without the use of “nofollow”?
Google’s Matt Cutts has noted that pages and sites caught selling links for manipulative purposes may have their ability to pass PageRank (or other link juice weighting factors) removed.
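A sold or otherwise untrusted link can be flagged so that it passes no PageRank by adding the nofollow attribute (the URL here is just a placeholder):

  <!-- rel="nofollow" tells Google not to pass PageRank through this link -->
  <a href="http://www.example-advertiser.com/" rel="nofollow">Sponsored link</a>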

14. What is the difference between local link popularity and global link popularity?
Local link popularity refers to links from sites in a specific topical neighborhood, while global link popularity doesn’t discriminate and counts all links from any site on the web.

15. Name four types of queries for which Google provides “instant answers” or “onebox results” ahead of the standard web results.
Flight searches, such as Seattle to Chicago; recipe searches, such as chicken recipes; image searches, such as those for Hopper paintings; stock quotes, like GE stock quote; and many more. Google lists them all on its search features page.

16. Describe why flat site architecture is typically more advantageous for search engine rankings than deep site architecture.
Flat site architectures allow spiders to crawl a large number of pages without having to follow many “clicks” or intermediate pages to reach those links. A deep site architecture forces bots to crawl through many pages before they can reach all of the content on a site. Flat site architecture provides three primary bonuses: first, search spiders are more likely to visit all of the content; second, spiders are more likely to discover and index new content quickly (as they don’t have to visit as many pages to be exposed to it); and third, PageRank and link juice are passed more effectively through fewer pages with more links than through more pages with fewer links, helping to keep content ranking and out of the (now defunct) supplemental results.
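As a rough sketch with invented page names, a flat structure links every major section directly from the home page so that nothing sits more than a click or two deep:

  <!-- Home page navigation in a flat structure: every section is one click away -->
  <a href="/widgets/">Widgets</a>
  <a href="/gadgets/">Gadgets</a>
  <a href="/gizmos/">Gizmos</a>
  <!-- Deep alternative to avoid: burying the same pages several levels down, -->
  <!-- e.g. /products/range/category/widgets.html, reachable only via intermediate pages -->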

17. What are the most famous directories supported by Google? Name some directories. What is the Open Directory Project?
DMOZ and the Yahoo! Directory are the best-known directories; DMOZ is the one supported by Google. The Open Directory Project (DMOZ) is the largest, most comprehensive human-edited directory of the Web, constructed and maintained by a vast, global community of volunteer editors.

18. What are the types of link exchange?
One-way, two-way (reciprocal) and three-way.

19. Name the top search engines.
Google, Yahoo!, MSN/Live, Ask, AltaVista, Search.com, Lycos and HotBot.

20. Where can you get traffic information?
Alexa.com.