
SEO SERP SEM the PR Adaptive Way


SEO SERP the PR Adaptive Way is a PageRanked (PR) domain.

Check PageRank

Increase your domain's and website's SEO SERP value. Backlink to our PR Adaptive Way site at our special introductory price of only $4.95/month!

SEO SERP SEM the PR Adaptive Way. Place a white-hat dofollow backlink to Adaptive Way TODAY! Backlink your domain name's website to this SEO SERP SEM experts page for SEO, SEM, and SERP enhancements. Your dofollow domain name backlink may look like this:

Domain Names to Buy and Sell

or like this:

See MORE of our SEO and SERP results Below!

*What is SEO and why is it important in a Domain Name Fee Appraisal to a Fee Appraiser? Search engine optimization (SEO) is the process of improving the visibility of a website or a web page in search engines via the "natural" or un-paid ("organic" or "algorithmic") search results. Other forms of search engine marketing (SEM) target paid listings. In general, the earlier (or higher on the page) and more frequently a site appears in the search results list, the more visitors it will receive from the search engine. SEO may target different kinds of search, including image search, local search, video search, and industry-specific vertical search engines. This gives a website web presence. Source: Wikipedia.

*What is SERP and why is it important in a Domain Name Fee Appraisal to a Fee Appraiser? A search engine results page (SERP) is the listing of web pages returned by a search engine in response to a keyword query. The results normally include a list of web pages with titles, a link to each page, and a short description showing where the keywords have matched content within the page. A SERP may refer to a single page of links returned, or to the set of all links returned for a search query. Source: Wikipedia.

*What is SEM and why is it important in a Domain Name Fee Appraisal to a Fee Appraiser? Search engine marketing (SEM) is a form of Internet marketing that involves the promotion of websites by increasing their visibility in search engine results pages (SERPs) through optimization (both on-page and off-page) as well as through advertising (paid placements, contextual advertising, and paid inclusions). Depending on the context, SEM can be an umbrella term for various means of marketing a website, including search engine optimization (SEO), which adjusts or rewrites website content to achieve a higher ranking in search engine results pages, or it may contrast with SEO, focusing on only paid components. Source: Wikipedia.

Buy this domain

This domain may be for sale by its owner! A 12+ year well-aged, valuable domain, with a verified fee appraisal valuation by professional fee appraisers of $25,790.00.


"When you need TRUE Legal Services, visit Jurisdictionary® to get step-by-step tips and tactics for winning ... with or without a lawyer!"

AutoSurf Traffic Exchange: Powerful Results with SEO tips | 10KHits - 10KHits provides quality traffic hits to your personal or business websites.

Free ONE Million Hits For Your Web Site! - Promote all your websites from a single traffic exchange and do it FREE of charge! Just join in and you will be given 1 million hits FREE!
Newest Featured Arrivals.  Pricing, Statistics and Availability subject to change without notice.

Bennetts Tree Stump Removal Service: tree stump removal in Wokingham and Reading. We can remove the smallest to the largest of tree stumps, even in the most difficult locations. With small narrow-access grinders to large track-mounted stump grinders, we can remove a single stump in a domestic back garden or multiple stumps on commercial sites in preparation for building or civil engineering projects. Tree stump removal, stump grinding, tree root removal. Special Promotion from DomainSells.

Best pediatric dentist in the lowcountry.  Special Promotion from DomainSells.

Browse data on the recent real estate transactions in Gentrytown Antioch Neighborhood. Great for discovering comps, Find homes for sale in Gentrytown neighborhood, houses for sale in Antioch Neighborhood Calif. Sales history, photos, and more.  Special Promotion from DomainSells.

For the very best in Hillsboro Carpet Cleaning use Mountain View Carpet Care, we are open 7 days a week 10am to 10pm  Special Promotion from DomainSells.

Buy a new hospitality RFID lock or have us repair your hotel access control. We service VingCard, Onity, Tesa, Kaba, Unican, Miwa, Saflok, Timelok, Bline, and Acculock. Special Promotion from DomainSells.

South London Removals   Special Promotion from DomainSells.  

Fencing contractors in Wokingham, Berkshire: home and garden fencing, fencing repairs, entrance gates. Free quotations. Special Promotion from DomainSells.

Tree surgeons in Wokingham, Berkshire, carrying out tree felling, pruning, crown lifting, thinning, and stump grinding. Special Promotion from DomainSells.

Etobicoke condos, etobicoke condos for sale, etobicoke condos for rent.  Special Promotion from DomainSells.  

Legal steroid supplements by Muscle Labs are without question the #1 muscle supplements to gain muscle, burn fat, and improve your physique in the shortest amount of time possible. Special Promotion from DomainSells.

Southlake Locksmiths for commercial, residential, and automotive  Special Promotion from DomainSells. 

Home inspector Michael Del Greco from Accurate Inspections, Inc in New Jersey provides information about ice damming.  Special Promotion from DomainSells. 

Dallas crime scene clean up Special Promotion from DomainSells. 

Broadcasts from the Golden Age of Radio for your listening pleasure. Special Promotion from DomainSells. 

Enjoy thousands of hours of classic old-time radio programming… for free! Special Promotion from DomainSells.

·    4yr Aged SEO SERP Domain Name with Website Valuated to $5,600.00
·    4yr Aged Google PR1 Page Rank 1 SEO SERP Domain Name Valuated to $7,700.00
·    3yr Aged Google PR1 Page Rank 1 Domain Name for Sale Appraised at over $6,800.00. Awesome 853,839 Backlinks
·    7yr Aged Google PR3 Page Rank 3 Domain Name for Sale Appraised at over $16,500.00. Over 353,252 Backlinks
·    12yr Aged Google PR2 Page Rank 2 Domain Name for Sale Appraised at over $25,700.00
·    2yr Aged Google PR1 Page Rank 1 Domain Name for Sale Appraised at over $1,500.00. SPY site.
·    8yr Aged Domain Name for Sale Appraised at over $13,000.00
·    11yr Aged Google PR2 Page Rank 2 Domain Name for Sale Appraised at over $24,000.00. Awesome 1,300 Yahoo Links
·    9yr Aged Google PR1 Page Rank 1 Domain Name for Sale Appraised at over $20,000.00. Alexa Ranking: 11,671,724
·    6yr Aged Google PR1 Page Rank 1 Domain Name for Sale Appraised at over $14,000.00
·    2yr Aged Google PR1 Page Rank 1 Domain Name for Sale.


Backlinks (EDU / GOV): 19973 (13 / 41)

Domains (EDU / GOV): 2511 (5 / 1)

IP addresses (Class C subnets): 342 (291)

URL's External Backlink Statistics

ACRank: 6

Backlinks (EDU / GOV): 1500 (13 / 41)

Domains (EDU / GOV): 212 (5 / 1)

14yr Aged Google PR2 Page Rank 2 Domain Name for Sale Appraised at over £44m. Alexa Ranking under 1,000. "It has 221,437 Backlinks. It's good for an SEO website and has a 55% SEO score." Our Flagship Domain Name.


Backlinks (EDU / GOV): 34766 (1 / 0)

Domains (EDU / GOV): 3412 (1 / 0)

IP addresses (Class C subnets): 292 (218)

URL's External Backlink Statistics            

ACRank: 9

Backlinks (EDU / GOV): 31286 (1 / 0)

Domains (EDU / GOV): 3316 (1 / 0) 

SEO Experts. SERP Experts. SEM Experts. Good for SEO. Domain Name Fee Appraiser Services. Domain Name Fee Appraisal Services. Top Domain Names for Sale. Top Domain Names to Buy. Domain Names for Investment Portfolios. For an article on backlinks, see below.



Recommended Providers of SEO SERP Domain Name Related and other Legal Services:



do it yourself software from Standard Legal



Backlink

From Wikipedia, the free encyclopedia


For the backlink functionality in Wikipedia, see Help:What links here.

Backlinks, also known as incoming links, inbound links, inlinks, and inward links, are incoming links to a website or web page. In basic link terminology, a backlink is any link received by a web node (web page, directory, website, or top level domain) from another web node.[1]

Inbound links were originally important (prior to the emergence of search engines) as a primary means of web navigation; today, their significance lies in search engine optimization (SEO). The number of backlinks is one indication of the popularity or importance of that website or page (for example, this is used by Google to determine the PageRank of a webpage). Outside of SEO, the backlinks of a webpage may be of significant personal, cultural or semantic interest: they indicate who is paying attention to that page.


Search engine rankings

Search engines often use the number of backlinks that a website has as one of the most important factors for determining that website's search engine ranking, popularity and importance. Google's description of their PageRank system, for instance, notes that Google interprets a link from page A to page B as a vote, by page A, for page B.[2] Knowledge of this form of search engine rankings has fueled a portion of the SEO industry commonly termed linkspam, where a company attempts to place as many inbound links as possible to their site regardless of the context of the originating site.
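The "link from page A to page B as a vote" idea can be made concrete with a small power-iteration sketch in Python. This is the textbook PageRank formulation, not Google's production algorithm; the three-page graph is invented for illustration, and 0.85 is the damping factor from the original PageRank description:

```python
# Minimal PageRank power iteration over a toy link graph.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page starts each round with the "random jump" share.
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            # A page splits its current rank evenly among its outlinks:
            # each outbound link is a weighted "vote" for its target.
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

graph = {
    "A": ["B", "C"],   # A "votes" for B and C
    "B": ["C"],
    "C": ["A"],
}
ranks = pagerank(graph)
# C receives votes from both A and B, so it ends up with the highest rank.
```

The scores always sum to 1, so each page's rank can be read as the probability that a "random surfer" is on that page at any moment.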


Websites often employ various search engine optimization techniques to increase the number of backlinks pointing to them. Some methods are free for anyone to use, whereas others, such as linkbaiting, require quite a bit of planning and marketing to work. Some websites stumble upon "linkbaiting" naturally; sites that are the first with a tidbit of 'breaking news' about a celebrity are good examples. When "linkbait" happens, many websites will link to the 'baiting' website because it holds information of extreme interest to a large number of people.


There are several factors that determine the value of a backlink. Backlinks from authoritative sites on a given topic are highly valuable.[3] If both sites have content geared toward the keyword topic, the backlink is considered relevant and believed to have strong influence on the search engine rankings of the webpage granted the backlink. A backlink represents a favorable 'editorial vote' for the receiving webpage from another granting webpage. Another important factor is the anchor text of the backlink. Anchor text is the descriptive labeling of the hyperlink as it appears on a webpage. Search engine bots (i.e., spiders, crawlers, etc.) examine the anchor text to evaluate how relevant it is to the content on a webpage. Anchor text and webpage content congruency are highly weighted in search engine results page (SERP) rankings of a webpage with respect to any given keyword query by a search engine user.
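To see what a crawler actually reads as anchor text, here is a minimal sketch using Python's standard-library html.parser. The sample HTML, URL, and class name are invented for illustration; real crawlers are far more robust:

```python
# Extract (href, anchor text) pairs from HTML -- the raw material a
# search engine bot examines when judging a backlink's relevance.
from html.parser import HTMLParser

class AnchorTextParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []      # collected (href, anchor_text) pairs
        self._href = None    # href of the <a> tag we are inside, if any
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:   # only collect text inside an <a>
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

parser = AnchorTextParser()
parser.feed('<p>Read our <a href="/seo-guide">complete SEO guide</a>.</p>')
# parser.links → [('/seo-guide', 'complete SEO guide')]
```

Here the anchor text "complete SEO guide" is what gets compared against the target page's content; a bare "click here" link would carry far less topical signal.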

Increasingly, inbound links are being weighed against link popularity and originating context. This transition is reducing the notion of one link, one vote in SEO, a trend proponents[who?] hope will help curb linkspam as a whole.


When HTML (Hyper Text Markup Language) was designed, there was no explicit mechanism in the design to keep track of backlinks in software, as this carried additional logistical and network overhead.


Most content management systems include features to track backlinks, provided the external site linking in sends a notification to the target site. Most wiki systems can determine which pages link internally to any given page, but do not track external links to any given page.
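A wiki-style "what links here" feature amounts to inverting the site's internal link graph. A minimal sketch, assuming pages are represented as a dict mapping each page title to the titles it links to (the titles here are invented):

```python
# Build a reverse index: for each page, which other pages link to it?
from collections import defaultdict

def what_links_here(pages):
    """pages maps each page title to a list of titles it links to."""
    inbound = defaultdict(set)
    for page, outlinks in pages.items():
        for target in outlinks:
            inbound[target].add(page)   # record the incoming link
    return inbound

site = {
    "Home": ["About", "Pricing"],
    "About": ["Home"],
    "Pricing": ["Home", "About"],
}
backlinks = what_links_here(site)
# backlinks["About"] → {"Home", "Pricing"}
```

Note this only covers internal links, matching the limitation described above: tracking external backlinks requires either notifications from the linking site or a crawl of the wider web.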


Most commercial search engines provide a mechanism to determine the number of backlinks they have recorded to a particular web page. For example, Google can be searched with the link: operator to find pages on the Web pointing to a given URL, though Google only shows a small fraction of the number of links pointing to a site; it credits many more backlinks than it shows for each website.[citation needed]


Other mechanisms have been developed to track backlinks between disparate webpages controlled by organizations that aren't associated with each other. The most notable example of this is TrackBacks between blogs.

References


1. ^ Lennart Björneborn and Peter Ingwersen (2004). "Toward a Basic Framework for Webometrics". Journal of the American Society for Information Science and Technology 55 (14): 1216–1227. doi:10.1002/asi.20077.

2. ^ Google's overview of PageRank

3. ^ "Does Backlink Quality Matter?". AdGooroo. 2010-04-21. Retrieved 2010-04-21.





"Domains have and will continue to go up in value faster than any other commodity EVER known to man." says Bill Gates, Microsoft Corp.

"People truly just don't understand what they're passing up," says millionaire domainer Rick Schwartz. He feels the mainstream still hasn't figured out the power of the domain. Schwartz states he has spent nearly $1 million adding to his portfolio of over 3,000 domain names. He is convinced that one day domain names, like electronic real estate, "will be some of the most valued property ever known to man." "It'll be worth hundreds of millions in about 10 years," he says. Many more great people have much to say about domain names.

A domain name is the address of a website. Domain names are valuable and tradable commodities.


FREE for your Domain Name: SEO SERP, compliments of a Domain.



See MORE of our OUTSTANDING SEO and SERP Results!





Search engine marketing

From Wikipedia, the free encyclopedia

Search engine marketing (SEM) is a form of Internet marketing that involves the promotion of websites by increasing their visibility in search engine results pages (SERPs) through optimization (both on-page and off-page) as well as through advertising (paid placements, contextual advertising, and paid inclusions).[1] Depending on the context, SEM can be an umbrella term for various means of marketing a website including search engine optimization (SEO), which adjusts or rewrites website content to achieve a higher ranking in search engine results pages, or it may contrast with SEO, focusing on only paid components.[2]



In 2008, North American advertisers spent US$13.5 billion on search engine marketing. The largest SEM vendors were Google AdWords, Yahoo! Search Marketing and Microsoft adCenter.[1] As of 2006, SEM was growing much faster than traditional advertising and even other channels of online marketing.[3] Because of the complex technology, a secondary "search marketing agency" market has evolved. Some marketers have difficulty understanding the intricacies of search engine marketing and choose to rely on third party agencies to manage their search marketing.


As the number of sites on the Web increased in the mid-to-late 1990s, search engines started appearing to help people find information quickly. Search engines developed business models to finance their services, such as pay per click programs offered by Open Text[4] in 1996 and then Goto.com[5] in 1998. Goto.com later changed its name[6] to Overture in 2001, was purchased by Yahoo! in 2003, and now offers paid search opportunities for advertisers through Yahoo! Search Marketing. Google also began to offer advertisements on search results pages in 2000 through the Google AdWords program. By 2007, pay-per-click programs proved to be primary money-makers[7] for search engines. In a market dominated by Google, in 2009 Yahoo! and Microsoft announced the intention to forge an alliance. The Yahoo! & Microsoft Search Alliance eventually received approval from regulators in the US and Europe in February 2010.[8]

Search engine optimization consultants expanded their offerings to help businesses learn about and use the advertising opportunities offered by search engines, and new agencies focusing primarily upon marketing and advertising through search engines emerged. The term "Search Engine Marketing" was proposed by Danny Sullivan in 2001[9] to cover the spectrum of activities involved in performing SEO, managing paid listings at the search engines, submitting sites to directories, and developing online marketing strategies for businesses, organizations, and individuals.

SEM methods and metrics

There are four categories of methods and metrics used to optimize websites through search engine marketing.[10][11][12][13]

  1. Keyword research and analysis involves three "steps:" ensuring the site can be indexed in the search engines, finding the most relevant and popular keywords for the site and its products, and using those keywords on the site in a way that will generate and convert traffic.
  2. Website saturation and popularity, how much presence a website has on search engines, can be analyzed through the number of the site's pages that are indexed by search engines (saturation) and how many backlinks the site has (popularity). This requires that your pages contain the keywords people are searching for, and that those pages rank high enough in search engine results. Most search engines include some form of link popularity in their ranking algorithms. The following are major tools measuring various aspects of saturation and link popularity: Link Popularity, Top 10 Google Analysis, and Marketleap's Link Popularity and Search Engine Saturation.[11]
  3. Back end tools, including Web analytic tools and HTML validators, provide data on a website and its visitors and allow the success of a website to be measured. They range from simple traffic counters to tools that work with log files[10] to more sophisticated tools based on page tagging (putting JavaScript or an image on a page to track actions). These tools can deliver conversion-related information. Three major tools are used by EBSCO: (a) a log file analyzing tool, WebTrends by NetiQ; (b) a tag-based analytic program, WebSideStory's Hitbox; and (c) a transaction-based tool, TeaLeaf RealiTea. Validators check the invisible parts of websites, highlighting potential problems and usability issues, and help ensure your website meets W3C code standards. Try to use more than one HTML validator or spider simulator, because each tests, highlights, and reports on slightly different aspects of your website.
  4. Whois tools reveal the owners of various websites, and can provide valuable information relating to copyright and trademark issues.[12]
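Step 1 above (keyword research) typically begins with counting how often candidate keywords actually appear on a page. A deliberately simple sketch in Python; real tools also weigh titles, headings, anchor text, and multi-word phrases, and the sample text and keyword list here are invented:

```python
# Count occurrences of single-word keywords in a page's visible text.
import re
from collections import Counter

def keyword_frequencies(text, keywords):
    """Return how many times each candidate keyword occurs in text."""
    words = re.findall(r"[a-z0-9']+", text.lower())   # crude tokenizer
    counts = Counter(words)
    return {kw: counts.get(kw.lower(), 0) for kw in keywords}

page_text = "Search engine marketing promotes websites. Search results matter."
print(keyword_frequencies(page_text, ["search", "marketing", "backlink"]))
# → {'search': 2, 'marketing': 1, 'backlink': 0}
```

A zero count for a keyword you want to rank for (here, "backlink") is the signal that the page's content needs reworking before any ranking effort can pay off.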

Paid inclusion involves a search engine company charging fees for the inclusion of a website in their results pages. Also known as sponsored listings, paid inclusion products are provided by most search engine companies, the most notable being Google.

The fee structure is both a filter against superfluous submissions and a revenue generator. Typically, the fee covers an annual subscription for one webpage, which will automatically be catalogued on a regular basis. However, some companies are experimenting with non-subscription based fee structures where purchased listings are displayed permanently.[14] A per-click fee may also apply. Each search engine is different. Some sites allow only paid inclusion, although these have had little success. More frequently, many search engines, like Yahoo!,[15] mix paid inclusion (per-page and per-click fee) with results from web crawling. Others, like Google (and as of 2006, Ask.com[16][17]), do not let webmasters pay to be in their search engine listing (advertisements are shown separately and labeled as such).

Some detractors of paid inclusion allege that it causes searches to return results based more on the economic standing of the interests of a web site, and less on the relevancy of that site to end-users.

Often the line between pay per click advertising and paid inclusion is debatable. Some have lobbied for any paid listings to be labeled as an advertisement, while defenders insist they are not actually ads since the webmasters do not control the content of the listing, its ranking, or even whether it is shown to any users. Another advantage of paid inclusion is that it allows site owners to specify particular schedules for crawling pages. In the general case, one has no control as to when their page will be crawled or added to a search engine index. Paid inclusion proves to be particularly useful for cases where pages are dynamically generated and frequently modified.

Paid inclusion is a search engine marketing method in itself, but also a tool of search engine optimization, since experts and firms can test out different approaches to improving ranking, and see the results often within a couple of days, instead of waiting weeks or months. Knowledge gained this way can be used to optimize other web pages, without paying the search engine company.

Comparison with SEO

SEM is the wider discipline that incorporates SEO. SEM includes both paid search results (using tools like Google AdWords or Microsoft adCenter) and organic search results (SEO). SEM uses paid advertising with AdWords or Microsoft adCenter, pay per click (particularly beneficial for local providers, as it enables potential consumers to contact a company directly with one click), article submissions, advertising, and making sure SEO has been done. A keyword analysis is performed for both SEO and SEM, but not necessarily at the same time. SEM and SEO both need to be monitored and updated frequently to reflect evolving best practices.

In some contexts, the term SEM is used exclusively to mean pay per click advertising,[2] particularly in the commercial advertising and marketing communities which have a vested interest in this narrow definition. Such usage excludes the wider search marketing community that is engaged in other forms of SEM such as search engine optimization and search retargeting.

Another part of SEM is social media marketing (SMM). SMM is a type of marketing that involves exploiting social media to persuade consumers that one company's products and/or services are valuable.[18] Some of the latest theoretical advances include search engine marketing management (SEMM). SEMM relates to activities including SEO but focuses on return on investment (ROI) management instead of relevant traffic building (as is the case with mainstream SEO). SEMM also integrates organic SEO, trying to achieve top ranking without paid means, and pay per click SEO. For example, some attention is placed on the web page layout design and how content and information is displayed to the website visitor.

Ethical questions

Paid search advertising has not been without controversy, and the issue of how search engines present advertising on their search result pages has been the target of a series of studies and reports[19][20][21] by Consumer Reports WebWatch. The Federal Trade Commission (FTC) also issued a letter[22] in 2002 about the importance of disclosure of paid advertising on search engines, in response to a complaint from Commercial Alert, a consumer advocacy group with ties to Ralph Nader.

Another ethical controversy associated with search marketing has been the issue of trademark infringement. The debate as to whether third parties should have the right to bid on their competitors' brand names has been underway for years. In 2009 Google changed their policy, which formerly prohibited these tactics, allowing 3rd parties to bid on branded terms as long as their landing page in fact provides information on the trademarked term.[23] Though the policy has been changed this continues to be a source of heated debate.[24]

At the end of February 2011, many observed that Google had started to penalize companies that buy links for the purpose of passing PageRank. SEM, however, has nothing to do with link buying; it focuses on organic SEO and PPC management.


AdWords is recognised as a web-based advertising tool, since it uses keywords that can deliver adverts explicitly to web users looking for information about a certain product or service. It is highly practical for advertisers because it hinges on cost per click (CPC) pricing: payment applies only if the advert is clicked on. SEM companies have embarked on AdWords projects as a way to publicize their SEM and SEO services. This promotion has helped their businesses grow, offering added value to consumers who want to use AdWords to promote their products and services. One of the most successful approaches was to focus on making sure that PPC advertising funds were prudently invested. Moreover, SEM companies have described AdWords as a practical tool for increasing a consumer's return on Internet advertising. The use of conversion tracking and Google Analytics was deemed practical for showing clients the performance of their campaigns from click to conversion. AdWords projects have enabled SEM companies to train their clients on the tool and deliver better campaign performance. AdWords campaigns have contributed to huge growth in web traffic for a number of consumer websites, by as much as 250% in only nine months.[25]

Another way search engine marketing is managed is by contextual advertising. Here marketers place ads on other sites or portals that carry information relevant to their products, so that the ads appear in the field of vision of browsers seeking information from those sites. A successful SEM plan captures the relationships amongst information searchers, businesses, and search engines. Search engines were not important to some industries in the past, but in recent years the use of search engines for accessing information has become vital to increasing business opportunities.[26] The use of SEM strategic tools for businesses such as tourism can attract potential consumers to view their products, but it can also pose various challenges.[26] These challenges include the competition companies face within their industry and other sources of information that could draw the attention of online consumers.[26] To help meet these challenges, the main objective for businesses applying SEM is to improve and maintain their ranking as high as possible on SERPs so that they can gain visibility. Therefore, search engines are adjusting and developing algorithms and shifting the criteria by which web pages are ranked, to combat search engine misuse and spamming and to supply the most relevant information to searchers.[26] This can enhance the relationship amongst information searchers, businesses, and search engines by understanding the marketing strategies used to attract business.

References


  1. ^ a b "The State of Search Engine Marketing 2006". Search Engine Land. February 8, 2007. Retrieved 2007-06-07. 
  2. ^ a b "Does SEM = SEO + CPC Still Add Up?". Retrieved 2010-03-05. 
  3. ^ Elliott, Stuart (March 14, 2006). "More Agencies Investing in Marketing With a Click". New York Times. Retrieved 2007-06-07. 
  4. ^ "Engine sells results, draws fire". June 21, 1996. Retrieved 2007-06-09. 
  5. ^ "GoTo Sells Positions". March 3, 1998. Retrieved 2007-06-09. 
  6. ^ "GoTo gambles with new name". September 10, 2001. Retrieved 2007-06-09. 
  7. ^ Jansen, B. J. (May 2007). "The Comparative Effectiveness of Sponsored and Nonsponsored Links for Web E-commerce Queries" (PDF). ACM Transactions on the Web. Retrieved 2007-06-09. 
  8. ^ "Microsoft-Yahoo Deal Gets Green Light". February 18, 2010. Retrieved 2010-07-15. 
  9. ^ "Congratulations! You're A Search Engine Marketer!". November 5, 2001. Retrieved 2007-06-09. 
  10. ^ a b Chadwick, Terry Brainerd (July 2005). "How search engine marketing tools can work for you: or, searching is really all about finding, first of three articles". Information Outlook. Retrieved 2011-03-21. 
  11. ^ a b Chadwick, Terry Brainerd (October 2005). "How search engine marketing tools can work for you: or searching is really all about finding". Information Outlook. Retrieved 2011-03-21. 
  12. ^ a b Chadwick, Terry Brainerd (November 2005). "How search engine marketing tools can work for you; or, searching is really all about finding, third of three articles". Information Outlook. Retrieved 2011-03-21. 
  13. ^ "Search Engine Web Marketing Tips". Retrieved 22 February 2012. 
  14. ^ "FAQ #1: What are PermAds?". Retrieved 2010-09-12. 
  15. ^ Zawodny, Jeremy (2004-03-01). "Defending Paid Inclusions". 
  16. ^ Ulbrich, Chris (2004-07-06). "Paid Inclusion Losing Charm?". Wired News. 
  17. ^ "FAQ #18: How do I register my site/URL with Ask so that it will be indexed?". Retrieved 2008-12-19. 
  18. ^ Susan Ward (2011). "Social Media Marketing". Retrieved 2011-04-22. 
  19. ^ "False Oracles: Consumer Reaction to Learning the Truth About How Search Engines Work (Abstract)". June 30, 2003. Retrieved 2007-06-09. 
  20. ^ "Searching for Disclosure: How Search Engines Alert Consumers to the Presence of Advertising in Search Results". November 8, 2004. Retrieved 2007-06-09. 
  21. ^ "Still in Search of Disclosure: Re-evaluating How Search Engines Explain the Presence of Advertising in Search Results". June 9, 2005. Retrieved 2007-06-09. 
  22. ^ "Re: Complaint Requesting Investigation of Various Internet Search Engine Companies for Paid Placement or (Pay per click)". June 22, 2002. Retrieved 2007-06-09. 
  23. ^ "Update to U.S. ad text trademark policy". May 14, 2009. Retrieved 2010-07-15. 
  24. ^ Rosso, Mark; Jansen, Bernard (Jim) (August 2010), "Brand Names as Keywords in Sponsored Search Advertising", Communications of the Association for Information Systems 27 (1): 81–98. 
  25. ^ "Google Adwords Case Study". AccuraCast. 2007. Retrieved 2011-03-30. 
  26. ^ a b c d Zheng Xiang, Bing Pan, Rob Law, and Daniel R. Fesenmaier (June 7, 2010). "Assessing the Visibility of Destination Marketing Organizations in Google: A Case Study of Convention and Visitor Bureau Websites in the United States". Journal of Travel and Tourism Marketing. Retrieved 2011-04-22.

Search engine optimization

From Wikipedia, the free encyclopedia

Search engine optimization (SEO) is the process of improving the visibility of a website or a web page in search engines' "natural," or un-paid ("organic" or "algorithmic"), search results. In general, the earlier (or higher ranked on the search results page), and more frequently a site appears in the search results list, the more visitors it will receive from the search engine's users. SEO may target different kinds of search, including image search, local search, video search, academic search,[1] news search and industry-specific vertical search engines.

As an Internet marketing strategy, SEO considers how search engines work, what people search for, the actual search terms or keywords typed into search engines and which search engines are preferred by their targeted audience. Optimizing a website may involve editing its content and HTML and associated coding to both increase its relevance to specific keywords and to remove barriers to the indexing activities of search engines. Promoting a site to increase the number of backlinks, or inbound links, is another SEO tactic.

The acronym "SEOs" can refer to "search engine optimizers," a term adopted by an industry of consultants who carry out optimization projects on behalf of clients, and by employees who perform SEO services in-house. Search engine optimizers may offer SEO as a stand-alone service or as a part of a broader marketing campaign. Because effective SEO may require changes to the HTML source code of a site and site content, SEO tactics may be incorporated into website development and design. The term "search engine friendly" may be used to describe website designs, menus, content management systems, images, videos, shopping carts, and other elements that have been optimized for the purpose of search engine exposure.

History

Webmasters and content providers began optimizing sites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all that webmasters needed to do was submit the address of a page, or URL, to the various engines, which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[2] The process involves a search engine spider downloading a page and storing it on the search engine's own server. There, a second program, known as an indexer, extracts information about the page, such as the words it contains, where those words are located, any weight given to specific words, and all the links the page contains, which are then placed into a scheduler for crawling at a later date.

Site owners started to recognize the value of having their sites highly ranked and visible in search engine results, creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997.[3] The first documented use of the term was by John Audette and his company Multimedia Marketing Group, as evidenced by a web page from the MMG site from August 1997.[4]

Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag, or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using meta data to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[5][unreliable source?] Web content providers also manipulated a number of attributes within the HTML source of a page in an attempt to rank well in search engines.[6]

By relying so heavily on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. Since the success and popularity of a search engine are determined by its ability to produce the most relevant results for any given search, allowing those results to be manipulated would drive users to other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate. Larry Page and Sergey Brin, graduate students at Stanford University, developed "Backrub," a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[7] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web and follows links from one page to another. In effect, this means that some links are stronger than others, as a page with a higher PageRank is more likely to be reached by the random surfer.

Page and Brin founded Google in 1998. Google attracted a loyal following among the growing number of Internet users, who liked its simple design.[8] Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.[9]

By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation. Google says it ranks sites using more than 200 different signals.[10] The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. SEO service providers, such as Rand Fishkin, Barry Schwartz, Aaron Wall and Jill Whalen, have studied different approaches to search engine optimization, and have published their opinions in online forums and blogs.[11][12] SEO practitioners may also study patents held by various search engines to gain insight into the algorithms.[13]

In 2005, Google began personalizing search results for each user. Depending on their history of previous searches, Google crafted results for logged-in users.[14] In 2008, Bruce Clay said that "ranking is dead" because of personalized search. It would become meaningless to discuss how a website ranked, because its rank would potentially be different for each user and each search.[15]

In 2007, Google announced a campaign against paid links that transfer PageRank.[16] On June 15, 2009, Google disclosed that it had taken measures to mitigate the effects of PageRank sculpting through use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollowed links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting.[17] As a result of this change, the use of nofollow leads to evaporation of PageRank. To avoid this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Additional suggested solutions include the use of iframes, Flash, and JavaScript.[18]

In December 2009, Google announced it would be using the web search history of all its users in order to populate search results.[19]

Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically, site administrators have spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.[20]

In February 2011, Google announced the "Panda" update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites have copied content from one another and benefited in search engine rankings by engaging in this practice; Google, however, implemented a new system that punishes sites whose content is not unique.[21]

Relationship with search engines


By 1997, search engines recognized that webmasters were making efforts to rank well in their search engines, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms in an effort to prevent webmasters from manipulating rankings.[22]

Due to the high marketing value of targeted search results, there is potential for an adversarial relationship between search engines and SEO service providers. In 2005, an annual conference, AIRWeb, Adversarial Information Retrieval on the Web,[23] was created to discuss and minimize the damaging effects of aggressive web content providers.

Companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients.[24] Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban.[25] Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.[26]

Some search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences, chats, and seminars. The major search engines provide information and guidelines to help with site optimization.[27][28] Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website, and it also provides data on Google traffic to the website.[29] Bing Toolbox provides a way for webmasters to submit a sitemap and web feeds, allowing them to determine the crawl rate and how many pages have been indexed by the search engine.


Suppose each circle is a website, and an arrow is a link from one website to another, such that a user can click on a link within, say, website F to go to website B, but not vice versa. Search engines begin by assuming that each website has an equal chance of being chosen by a user. Next, crawlers examine which websites link to which other websites and infer that websites with more incoming links contain valuable information that users want.
Search engines use complex mathematical algorithms to guess which websites a user seeks, based in part on examination of how websites link to each other. Since website B is the recipient of numerous inbound links, B ranks highly in a web search and will come up early in the results. Further, since B is popular and has an outbound link to C, C ranks highly too.
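The "random surfer" idea described above can be sketched as a short power-iteration loop. This is a minimal illustration, not Google's actual implementation; the three-page graph and the 0.85 damping factor (the value used in the original PageRank paper) are assumptions chosen for the example.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Rate pages by inbound links. `links` maps each page to its outbound links."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # every page starts with equal probability
    for _ in range(iterations):
        # the (1 - damping) term models the surfer jumping to a random page
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            share = rank[page] / len(outlinks)  # rank is split among outbound links
            for target in outlinks:
                new[target] += damping * share
        rank = new
    return rank

# Hypothetical web: A and C link to B; B links to C; nothing links to A.
ranks = pagerank({"A": ["B"], "B": ["C"], "C": ["B"]})
# B, with the most inbound links, ends up with the highest rank.
```

Real ranking systems must also handle pages with no outbound links and scale to billions of pages, which this sketch ignores.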

Getting indexed

The leading search engines, such as Google, Bing, and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search-engine-indexed pages do not need to be submitted because they are found automatically. Some search engines, notably Yahoo!, operate a paid submission service that guarantees crawling for either a set fee or cost per click.[30] Such programs usually guarantee inclusion in the database, but do not guarantee specific ranking within the search results.[31] Two major directories, the Yahoo! Directory and the Open Directory Project, both require manual submission and human editorial review.[32] Google offers Google Webmaster Tools, through which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links.[33]
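The XML Sitemap feed mentioned above is a small file in the sitemaps.org format; a minimal example (the URL and date are placeholders) might look like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2011-02-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Only the `loc` element is required; `lastmod`, `changefreq`, and `priority` are optional hints to the crawler.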

Search engine crawlers may look at a number of different factors when crawling a site. Not every page is indexed by the search engines. Distance of pages from the root directory of a site may also be a factor in whether or not pages get crawled.[34]

Preventing crawling

To keep undesirable content out of the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots. When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and it instructs the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts, and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[35]
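A minimal robots.txt illustrating the exclusions described above (the directory names are hypothetical examples):

```
User-agent: *
Disallow: /cart/
Disallow: /search-results/
```

A single page can likewise opt out individually by placing `<meta name="robots" content="noindex">` in its HTML head, which tells compliant engines to exclude that page from their databases.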

Increasing prominence

A variety of methods can increase the prominence of a webpage within the search results. Cross linking between pages of the same website to provide more links to the most important pages may improve its visibility.[36] Writing content that includes frequently searched keyword phrases, so as to be relevant to a wide variety of search queries, will tend to increase traffic.[36] Updating content so as to keep search engines crawling back frequently can give additional weight to a site. Adding relevant keywords to a web page's metadata, including the title tag and meta description, will tend to improve the relevancy of a site's search listings, thus increasing traffic. URL normalization of web pages accessible via multiple URLs, using the "canonical" meta tag[37] or via 301 redirects, can help make sure links to the different versions of the URL all count towards the page's link popularity score.
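For example, if the same page answers at several addresses, the canonical tag mentioned above designates one authoritative URL (example.com and the path are placeholders):

```html
<link rel="canonical" href="http://www.example.com/widgets">
```

Placed in the head of each duplicate version, this tells engines to consolidate link popularity onto the canonical address. Alternatively, the duplicate URLs can issue an HTTP 301 (permanent) redirect to the canonical address, which achieves the same consolidation.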

White hat versus black hat

SEO techniques can be classified into two broad categories: techniques that search engines recommend as part of good design, and techniques of which search engines do not approve. The search engines attempt to minimize the effect of the latter, which include spamdexing. Industry commentators have classified these methods, and the practitioners who employ them, as either white hat SEO or black hat SEO.[38] White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned, either temporarily or permanently, once the search engines discover what they are doing.[39]

An SEO technique is considered white hat if it conforms to the search engines' guidelines and involves no deception. Because the search engine guidelines[27][28][40] are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines, but about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see. White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the spiders, rather than attempting to trick the algorithm away from its intended purpose. White hat SEO is in many ways similar to web development that promotes accessibility,[41] although the two are not identical.

Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines, or involve deception. One black hat technique uses text that is hidden, either as text colored similar to the background, in an invisible div, or positioned off screen. Another method gives a different page depending on whether the page is being requested by a human visitor or a search engine, a technique known as cloaking.

Search engines may penalize sites they discover using black hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms, or by a manual site review. One example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for use of deceptive practices.[42] Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's list.[43]

As a marketing strategy

SEO is not an appropriate strategy for every website, and other Internet marketing strategies can be more effective, depending on the site operator's goals.[44] A successful Internet marketing campaign may also depend upon building high quality web pages to engage and persuade, setting up analytics programs to enable site owners to measure results, and improving a site's conversion rate.[45]

SEO may generate an adequate return on investment. However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals. Due to this lack of guarantees and certainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors.[46] Search engines can change their algorithms, impacting a website's placement and possibly resulting in a serious loss of traffic. According to Google's CEO, Eric Schmidt, in 2010 Google made over 500 algorithm changes, almost 1.5 per day.[47] It is considered wise business practice for website operators to liberate themselves from dependence on search engine traffic.[48] It has been suggested that "search marketers, in a twist of irony, receive a very small share of their traffic from search engines." Instead, their main sources of traffic are links from other websites.[49]

International markets

Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[50] In markets outside the United States, Google's share is often larger, and Google remains the dominant search engine worldwide as of 2007.[51] As of 2006, Google had an 85-90% market share in Germany.[52] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[52] As of June 2008, Google's market share in the UK was close to 90%, according to Hitwise.[53] That market share is achieved in a number of countries.

As of 2009, there are only a few large markets where Google is not the leading search engine. In most cases, when Google is not leading in a given market, it is lagging behind a local player. The most notable markets where this is the case are China, Japan, South Korea, Russia and the Czech Republic where respectively Baidu, Yahoo! Japan, Naver, Yandex and Seznam are market leaders.

Successful search optimization for international markets may require professional translation of web pages, registration of a domain name with a top level domain in the target market, and web hosting that provides a local IP address. Otherwise, the fundamental elements of search optimization are essentially the same, regardless of language.[52]

Legal precedents

On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[54][55]

In March 2006, KinderStart filed a lawsuit against Google over search engine rankings. Kinderstart's website was removed from Google's index prior to the lawsuit and the amount of traffic to the site dropped by 70%. On March 16, 2007 the United States District Court for the Northern District of California (San Jose Division) dismissed KinderStart's complaint without leave to amend, and partially granted Google's motion for Rule 11 sanctions against KinderStart's attorney, requiring him to pay part of Google's legal expenses.[56][57]

References


  1. ^ Beel, Jöran and Gipp, Bela and Wilde, Erik (2010). "Academic Search Engine Optimization (ASEO): Optimizing Scholarly Literature for Google Scholar and Co.". Journal of Scholarly Publishing. pp. 176–190. Retrieved 2010-04-18. 
  2. ^ Brian Pinkerton. "Finding What People Want: Experiences with the WebCrawler" (PDF). The Second International WWW Conference Chicago, USA, October 17–20, 1994. Retrieved 2007-05-07. 
  3. ^ Danny Sullivan (June 14, 2004). "Who Invented the Term "Search Engine Optimization"?". Search Engine Watch. Retrieved 2007-05-14.  See Google groups thread.
  4. ^ "Documentation of Who Invented SEO at the Internet Way Back Machine" (Document Number 19970801004204). Internet Way Back Machine. Archived from the original on 1997-08-01. 
  5. ^ Cory Doctorow (August 26, 2001). "Metacrap: Putting the torch to seven straw-men of the meta-utopia". e-LearningGuru. Archived from the original on 2007-04-09. Retrieved 2007-05-08. 
  6. ^ Pringle, G., Allison, L., and Dowe, D. (April 1998). "What is a tall poppy among web pages?". Proc. 7th Int. World Wide Web Conference. Retrieved 2007-05-08. 
  7. ^ Brin, Sergey and Page, Larry (1998). "The Anatomy of a Large-Scale Hypertextual Web Search Engine". Proceedings of the seventh international conference on World Wide Web. pp. 107–117. Retrieved 2007-05-08. 
  8. ^ Thompson, Bill (December 19, 2003). "Is Google good for you?". BBC News. Retrieved 2007-05-16. 
  9. ^ Zoltan Gyongyi and Hector Garcia-Molina (2005). "Link Spam Alliances" (PDF). Proceedings of the 31st VLDB Conference, Trondheim, Norway. Retrieved 2007-05-09. 
  10. ^ Hansell, Saul (June 3, 2007). "Google Keeps Tweaking Its Search Engine". New York Times. Retrieved 2007-06-06. 
  11. ^ Danny Sullivan (September 29, 2005). "Rundown On Search Ranking Factors". Search Engine Watch. Retrieved 2007-05-08. 
  12. ^ "Search Engine Ranking Factors V2". April 2, 2007. Retrieved 2007-05-14. 
  13. ^ Christine Churchill (November 23, 2005). "Understanding Search Engine Patents". Search Engine Watch. Retrieved 2007-05-08. 
  14. ^ "Google Personalized Search Leaves Google Labs - Search Engine Watch (SEW)". Retrieved 2009-09-05. 
  15. ^ "Will Personal Search Turn SEO On Its Ear?". Retrieved 2009-09-05. 
  16. ^ "8 Things We Learned About Google PageRank". Retrieved 2009-08-17. 
  17. ^ "PageRank sculpting". Matt Cutts. Retrieved 2010-01-12. 
  18. ^ "Google Loses "Backwards Compatibility" On Paid Link Blocking & PageRank Sculpting". Retrieved 2009-08-17. 
  19. ^ "Personalized Search for everyone". Google. Retrieved 2009-12-14. 
  20. ^ "Relevance Meets Real Time Web". Google Blog. 
  21. ^ "Google Search Quality Updates". Google Blog. 
  22. ^ Laurie J. Flynn (November 11, 1996). "Desperately Seeking Surfers". New York Times. Retrieved 2007-05-09. 
  23. ^ "AIRWeb". Adversarial Information Retrieval on the Web, annual conference. Retrieved 2007-05-09. 
  24. ^ David Kesmodel (September 22, 2005). "Sites Get Dropped by Search Engines After Trying to 'Optimize' Rankings". Wall Street Journal. Retrieved 2008-07-30. 
  25. ^ Adam L. Penenberg (September 8, 2005). "Legal Showdown in Search Fracas". Wired Magazine. Retrieved 2007-05-09. 
  26. ^ Matt Cutts (February 2, 2006). "Confirming a penalty". Retrieved 2007-05-09. 
  27. ^ a b "Google's Guidelines on Site Design". Retrieved 2007-04-18. 
  28. ^ a b "Guidelines for Successful Indexing". Retrieved 2011-09-07. 
  29. ^ "Sitemaps". Retrieved 2012-05-04. 
  30. ^ "Submitting To Search Crawlers: Google, Yahoo, Ask & Microsoft's Live Search". Search Engine Watch. 2007-03-12. Retrieved 2007-05-15. 
  31. ^ "Search Submit". Retrieved 2007-05-09. [dead link]
  32. ^ "Submitting To Directories: Yahoo & The Open Directory". Search Engine Watch. 2007-03-12. Retrieved 2007-05-15. 
  33. ^ "What is a Sitemap file and why should I have one?". Retrieved 2007-03-19. 
  34. ^ Cho, J., Garcia-Molina, H. (1998). "Efficient crawling through URL ordering". Proceedings of the seventh conference on World Wide Web, Brisbane, Australia. Retrieved 2007-05-09. 
  35. ^ "Newspapers Amok! New York Times Spamming Google? LA Times Hijacking". Search Engine Land. May 8, 2007. Retrieved 2007-05-09. 
  36. ^ a b "The Most Important SEO Strategy - ClickZ". Retrieved 2010-04-18. 
  37. ^ "Bing - Partnering to help solve duplicate content issues - Webmaster Blog - Bing Community". Retrieved 2009-10-30. 
  38. ^ Andrew Goodman. "Search Engine Showdown: Black hats vs. White hats at SES". SearchEngineWatch. Retrieved 2007-05-09. 
  39. ^ Jill Whalen (November 16, 2004). "Black Hat/White Hat Search Engine Optimization". Retrieved 2007-05-09. 
  40. ^ "What's an SEO? Does Google recommend working with companies that offer to make my site Google-friendly?". Retrieved 2007-04-18. 
  41. ^ Andy Hagans (November 8, 2005). "High Accessibility Is Effective Search Engine Optimization". A List Apart. Retrieved 2007-05-09. 
  42. ^ Matt Cutts (February 4, 2006). "Ramping up on international webspam". Retrieved 2007-05-09. 
  43. ^ Matt Cutts (February 7, 2006). "Recent reinclusions". Retrieved 2007-05-09. 
  44. ^ "What SEO Isn't". June 24, 2006. Retrieved 2007-05-16. 
  45. ^ Melissa Burdon (March 13, 2007). "The Battle Between Search Engine Optimization and Conversion: Who Wins?". Retrieved 2007-05-09. 
  46. ^ Andy Greenberg (April 30, 2007). "Condemned To Google Hell". Forbes. Archived from the original on 2007-05-02. Retrieved 2007-05-09. 
  47. ^ Matt McGee (September 21, 2011). "Schmidt's testimony reveals how Google tests algorithm changes". 
  48. ^ Jakob Nielsen (January 9, 2006). "Search Engines as Leeches on the Web". Retrieved 2007-05-14. 
  49. ^ "A survey of 25 blogs in the search space comparing external metrics to visitor tracking data". Retrieved 2007-05-31. 
  50. ^ Graham, Jefferson (2003-08-26). "The search engine that could". USA Today. Retrieved 2007-05-15. 
  51. ^ Greg Jarboe (2007-02-22). "Stats Show Google Dominates the International Search Landscape". Search Engine Watch. Retrieved 2007-05-15. 
  52. ^ a b c Mike Grehan (April 3, 2006). "Search Engine Optimizing for Europe". ClickZ. Retrieved 2007-05-14. 
  53. ^ Jack Schofield (2008-06-10). "Google UK closes in on 90% market share". London: Guardian. Retrieved 2008-06-10. 
  54. ^ "Search King, Inc. v. Google Technology, Inc., CIV-02-1457-M" (PDF). May 27, 2003. Retrieved 2008-05-23. 
  55. ^ Stefanie Olsen (May 30, 2003). "Judge dismisses suit against Google". CNET. Retrieved 2007-05-10. 
  56. ^ "Technology & Marketing Law Blog: KinderStart v. Google Dismissed—With Sanctions Against KinderStart's Counsel". Retrieved 2008-06-23. 
  57. ^ "Technology & Marketing Law Blog: Google Sued Over Rankings—KinderStart.com v. Google". Retrieved 2008-06-23.


Search engine results page

From Wikipedia, the free encyclopedia

A search engine results page (SERP) is the listing of web pages returned by a search engine in response to a keyword query. The results normally include a list of web pages with titles, a link to the page, and a short description showing where the keywords have matched content within the page. A SERP may refer to a single page of links returned, or to the set of all links returned for a search query.



Query caching

Some search engines cache pages for frequent searches and display the cached pages instead of a live page to increase the performance of the search engine. The search engine updates the search results periodically to account for new pages, and possibly to modify the rankings of pages in the search results.

Search result refreshing can take several days or weeks, which can occasionally cause results to be inaccurate or out of date.

Different types of results

SERPs of major search engines like Google, Yahoo!, and Bing may include different types of listings: contextual, algorithmic or organic search listings, as well as sponsored listings, images, maps, definitions, videos, and suggested search refinements.

The major search engines visually differentiate specific content types, such as images, news, and blogs. Many content types have specialized SERP templates and visual enhancements on the main search result page.

Generation of SERPs

Major search engines like Google, Yahoo! and Bing primarily use content contained within the page, falling back to the metadata tags of a web page, to generate the content that makes up a search snippet.[citation needed] The HTML title tag will be used as the title of the snippet, while the most relevant or useful contents of the web page (the description tag or page copy) will be used for the description. If the web page is not available, information about the page from DMOZ may be used instead.[1]
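For a hypothetical page, the two elements below are what would typically supply the snippet's title and description (the site name and wording are invented for illustration):

```html
<head>
  <title>Acme Widgets - Handmade Widgets Since 1999</title>
  <meta name="description"
        content="Acme Widgets sells handmade widgets with free shipping and a lifetime guarantee.">
</head>
```

Engines may still substitute page copy for the meta description when they judge it a better match for the query.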

SERP tracking

Webmasters[who?] use search engine optimization (SEO) to increase their website's ranking on a specific keyword's SERP. As a result, webmasters often check SERPs to track their search engine optimization progress. To speed up the tracking process, programmers created automated software to track multiple keywords for multiple websites.[citation needed]

References


  1. ^ Anatomy of a search snippet