If you have a public website, you want people to visit it. Whether its function is to disseminate information, promote shopping or other commercial transactions, or generate advertising revenue, your site won't be effective if no one sees it. Since most Internet users rely on search engines to find websites, good search listings can dramatically increase site traffic. Everyone wants those good listings. Unfortunately, many websites appear far down in search engine rankings or may not be listed at all because their designers don't consider how people search and how search engines work.

Consider a typical query. Fire up your favorite web search engine, type in the keywords you're looking for, and hit Return. In a few seconds, you'll be looking at the first few of a series of websites that fit your criteria. But how are these results generated?

In preparing this QuickStudy, I searched for "search engine optimisation" on Google. It told me it had found at least 22,800,000 sites and listed the first 10. Users generally assume that the most relevant entries will be presented first, but in fact each search engine uses different algorithms and selection criteria to rank the pages it presents. Thus, different engines will rank and present the same set of pages differently.

Users who are very determined in their research might explore beyond the first few entries or pages of entries. But if your site is buried even a couple of hundred entries down in that 22-million-long list, they'll likely never see it. This is where search engine optimisation, or SEO, can make a big difference by improving the ranking a page gets.

How searches work

Most search engines, such as Google, are crawler-based and create their listings automatically. They "crawl" the web, looking at both the form and content of web pages. Page titles, body copy, and coded instructions and keywords all play a part in this process.
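The kind of information a crawler pulls from a page can be sketched in a few lines of Python. This is a toy illustration, not a real crawler: it uses the standard library's html.parser to pick out the title, meta keywords and visible body text from a hypothetical page, the same elements the paragraph above says play a part in indexing. The page content and class name here are made up for the example.

```python
from html.parser import HTMLParser

class PageIndexer(HTMLParser):
    """Toy extractor for the parts of a page a crawler looks at:
    the title, the meta keywords and the visible body text."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.keywords = []
        self.body_text = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        # <meta name="keywords" content="..."> is one of the
        # "coded instructions" search engines can read
        elif tag == "meta" and attrs.get("name") == "keywords":
            self.keywords = [k.strip() for k in attrs.get("content", "").split(",")]

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data
        elif data.strip():
            self.body_text.append(data.strip())

# A hypothetical page the crawler has fetched
page = """<html><head><title>Acme Widgets</title>
<meta name="keywords" content="widgets, discount widgets">
</head><body><h1>Widgets for sale</h1><p>Best widgets online.</p></body></html>"""

indexer = PageIndexer()
indexer.feed(page)
print(indexer.title)     # Acme Widgets
print(indexer.keywords)  # ['widgets', 'discount widgets']
```

A page whose important text lives only inside images would give this extractor, like a real crawler, nothing to index.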
Automated search engines don't just rely on how often they find query terms on a page. An important technique, pioneered by Google, is link analysis, which looks at how pages connect to one another. The general assumption is that a page that many others link to is probably more important than one that stands alone, so its ranking gets a boost.

Another factor is click-through measurement. Here, a search engine watches which results a user selects from a particular search, with a view toward demoting high-ranking pages that don't attract clicks and promoting lower-ranking pages that do collect hits.

Some search engines (but not many) depend on living, breathing people to create their listings. You either submit a short description of your entire site, or special editors write descriptions for sites they have found and reviewed. In this type of search engine, queries look for matches only in the written descriptions, not in the actual pages themselves. Finally, some hybrid engines combine both automated and human indexing.

What SEO does

Optimizing a site for search engines can take many different forms. SEO works by understanding how search algorithms function and what human visitors might search for. It may involve changing the coding, presentation and/or structure of a site to avoid or fix problems that could prevent search engines from fully exploring it. Here are some factors that must be taken into account:

Keywords. The choice, location and frequency of keywords on a page can make a big difference. Most SEO specialists advise using phrases of two or more words. Also, many search engines give a keyword more weight when it appears in the title or early in the page. Targeting the most effective keywords is critical.
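The link analysis described above can be illustrated with a simplified PageRank-style computation. This sketch assumes a tiny hypothetical four-page web; the page names, damping factor and iteration count are illustrative choices, not Google's actual parameters.

```python
# Simplified link analysis over a hypothetical four-page web.
# Each page's score is spread evenly across its outgoing links;
# the damping factor (0.85) models a visitor who occasionally
# jumps to a random page instead of following a link.
links = {
    "home":     ["products", "about"],
    "products": ["home"],
    "about":    ["home", "products"],
    "orphan":   ["home"],   # links out, but nothing links to it
}

def pagerank(links, damping=0.85, iterations=50):
    n = len(links)
    ranks = {page: 1.0 / n for page in links}
    for _ in range(iterations):
        new_ranks = {page: (1 - damping) / n for page in links}
        for page, outgoing in links.items():
            share = ranks[page] / len(outgoing)
            for target in outgoing:
                new_ranks[target] += damping * share
        ranks = new_ranks
    return ranks

ranks = pagerank(links)
# The heavily linked-to "home" page outranks the orphan page,
# which nothing links to.
for page, score in sorted(ranks.items(), key=lambda kv: -kv[1]):
    print(f"{page:8s} {score:.3f}")
```

The point of the example is the assumption stated in the text: "home", which three other pages link to, ends up with the highest score, while "orphan", which stands alone, ends up with the lowest.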
Besides choosing the relevant terms that occur most often, it's helpful to devise processes to measure predicted traffic (how many users will search for the term each month), conversion rate (how many users searching with the term will click on an ad, buy a product or finish a transaction) and value per customer (how much revenue, on average, is generated per customer who uses the search term). Look at the competitive environment for your keywords: Who else is using them, and how well do they work? When starting a new site, it's often better to use only one or two unique phrases per page, keeping other terms on different pages where you can provide individualized information for each.

Relevant content. Keywords should reflect the page's actual content, so you need to include HTML text on your page. Graphics are nice, but search engines can't read them and may miss text that could make your site more relevant.

Stumbling blocks. Some search engines may not read image maps or frames the way you expect. Unless you anticipate and work around these problems, some engines may not index your web pages. If pages on your site are generated with the Common Gateway Interface or from databases, some search engines can't index them. Consider creating at least some static pages, possibly having the database update the pages rather than generate them on the fly.
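The three keyword metrics described above multiply together into a rough estimate of monthly revenue per keyword. A minimal sketch, using made-up figures for three hypothetical keyword phrases (the numbers are not real data):

```python
# Hypothetical keyword candidates with the three metrics from the text:
# predicted monthly traffic, conversion rate, and value per customer.
keywords = [
    # (phrase, monthly searches, conversion rate, revenue per customer)
    ("discount widgets", 12000, 0.010, 40.00),
    ("widgets",          90000, 0.002, 25.00),
    ("buy blue widget",    800, 0.050, 60.00),
]

def monthly_value(traffic, conversion_rate, value_per_customer):
    """Rough expected monthly revenue a keyword could generate."""
    return traffic * conversion_rate * value_per_customer

for phrase, traffic, rate, value in keywords:
    print(f"{phrase:18s} ${monthly_value(traffic, rate, value):,.2f}")
```

With these illustrative numbers, the broad single word "widgets" draws far more searches yet earns less than the more specific "discount widgets", which echoes the advice above to prefer phrases of two or more words.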