Quality Link
Search engines count links as votes of trust. Quality links count more than low quality links.
There are a variety of ways to define what a quality link is, but the following are characteristics of a high quality link:

  • Trusted Source: If a link is from a page or website which appears trustworthy, it is likely to count more than a link from an obscure, rarely used, and rarely cited website. See TrustRank for one example of a way to find highly trusted websites.
  • Hard to Get: The harder a link is to acquire the more likely a search engine will be to want to trust it and the more work a competitor will need to do to try to gain that link.
  • Aged: Some search engines may trust links from older resources or links that have existed for a length of time more than they trust brand new links or links from newer resources.
  • Co-citation: Pages that link at competing sites which also link to your site make it easy for search engines to understand what community your website belongs to. See Hilltop for an example of an algorithm which looks for co-citation from expert sources.
  • Related: Links from related pages or related websites may count more than links from unrelated sites.
  • In Content: Links which are in the content area of a page are typically going to be more likely to be editorial links than links that are not included within the editorial portion of a page.

While appropriate anchor text may help you rank better than a link which lacks it, it is worth noting that for competitive queries Google is more likely to place weight on a high quality link whose anchor text does not match than on low quality links whose anchor text does match.
Query
The actual “search string” a searcher enters into a search engine.
Query Refinement
Some searchers refine their search query if they deem the results irrelevant. Some search engines may also promote certain verticals or suggest other search queries if they deem those verticals or queries relevant to the goals of the searcher.
Query refinement is both a manual and an automated process. If searchers do not find their search results relevant they may search again. Search engines may also automatically refine queries using the following techniques:

  • Google OneBox: promotes a vertical search database near the top of the search result. For example, if image search is relevant to your search query images may be placed near the top of the search results.
  • Spell Correction: offers a "did you mean" link with the correct spelling near the top of the results.
  • Inline Suggest: offers related search results in the search results. Some engines also suggest a variety of related search queries.

Some search toolbars also aim to help searchers auto complete their search queries by offering a list of most popular queries which match the starting letters that a searcher enters into the search box.

R

Recall
The portion of relevant documents that were retrieved when compared to all relevant documents.
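For illustration, recall can be computed in a few lines of Python (the document ids below are made up):

```python
def recall(retrieved, relevant):
    """Recall = |relevant documents retrieved| / |all relevant documents|."""
    retrieved, relevant = set(retrieved), set(relevant)
    if not relevant:
        return 0.0
    return len(retrieved & relevant) / len(relevant)

# Two of the four relevant documents were retrieved, so recall is 0.5.
print(recall(["d1", "d2", "d5"], ["d1", "d2", "d3", "d4"]))  # → 0.5
```

Its counterpart, precision, divides instead by the number of documents retrieved.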
Reciprocal Links
Nepotistic link exchanges where websites try to build false authority by trading links, using three way link trades, or other low quality link schemes.
When sites link naturally there is going to be some amount of cross linking within a community, but if most or all of your links are reciprocal in nature it may be a sign of ranking manipulation. Also sites that trade links off topic or on links pages that are stashed away deep within their sites probably do not pass much link authority, and may add more risk than reward.
Quality reciprocal link exchanges in and of themselves are not a bad thing, but most reciprocal link offers are of low quality. If too many of your links are of low quality it may make it harder for your site to rank for relevant queries, and some search engines may look at inlink and outlink ratios as well as link quality when determining how natural a site’s link profile is.
Redirect
A method of alerting browsers and search engines that a page location moved. 301 redirects are for permanent change of location and 302 redirects are used for a temporary change of location.
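For example, on an Apache server both kinds of redirect can be issued from an .htaccess file using the Redirect directive (the paths and domain below are hypothetical):

```
# 301: the page has permanently moved
Redirect 301 /old-page.html http://www.example.com/new-page.html

# 302: the page has temporarily moved
Redirect 302 /sale.html http://www.example.com/current-sale.html
```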
Registrar
A company which allows you to register domain names.
Reinclusion
If a site has been penalized for spamming they may fix the infraction and ask for reinclusion. Depending on the severity of the infraction and the brand strength of the site they may or may not be added to the search index.
Referrer
The source from which a website visitor came.
Relative Link
A link which shows the relation of the current URL to the URL of the page being linked at. Some links only show relative link paths instead of having the entire reference URL within the a href tag. Due to canonicalization and hijacking related issues it is typically preferred to use absolute links over relative links.
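A quick illustration of the difference (example.com is a placeholder):

```
<!-- relative link: resolved against the URL of the current page -->
<a href="../articles/seo-basics.html">SEO Basics</a>

<!-- absolute link: the entire reference URL, generally preferred -->
<a href="http://www.example.com/articles/seo-basics.html">SEO Basics</a>
```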
Remarketing (see retargeting)
Repeat Visits
Visitors to a website which have visited it in the recent past.
Search engines may look at signals like a strong stream of regular repeat visits to a site & many brand-related searches as signs of strong user engagement, which can lead them to rank the site higher in algorithms like Panda. Sites which get few repeat visits and few brand-related searches compared against their overall search traffic footprint may be viewed as lower quality sites & have their rankings suppressed.
Reputation Management
Ensuring your brand related keywords display results which reinforce your brand. Many hate sites tend to rank highly for brand related queries.
Resubmission
Much like search engine submission, resubmission is generally a worthless service offered by businesses bilking naive consumers out of their money.
Rewrite (see URL Rewrite)
Retargeting
Advertising programs targeted at people who have previously visited a given website or channel, viewed a particular product, or added a particular product to their shopping cart.
Due to activity bias, many retargeted ads overstate their effective contribution to conversions.
Reverse Index
An index of keywords which stores records of matching documents that contain those keywords.
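A minimal sketch of such an index in Python (toy documents, not how any production engine stores its index):

```python
from collections import defaultdict

def build_reverse_index(docs):
    """Map each keyword to the set of document ids that contain it."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for word in text.lower().split():
            index[word].add(doc_id)
    return index

docs = {1: "quality links count more", 2: "low quality links"}
index = build_reverse_index(docs)
print(sorted(index["quality"]))  # → [1, 2]
```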
Robots.txt
A file which sits in the root of a site and tells search engines which files not to crawl. Some search engines will still list your URLs as URL only listings even if you block them using a robots.txt file.
Do not put files on a public server if you do not want search engines to index them!
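A small example robots.txt (the directory names and the choice to block TurnItInBot are hypothetical):

```
# block all crawlers from these directories
User-agent: *
Disallow: /cgi-bin/
Disallow: /private/

# block one specific crawler from the whole site
User-agent: TurnItInBot
Disallow: /
```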
ROI
Return on Investment is a measure of how much return you receive from each marketing dollar.
While ROI is a somewhat sophisticated measurement, some search marketers prefer to account for their marketing using more sophisticated profit elasticity calculations.
RSS
Rich Site Summary or Real Simple Syndication is a method of syndicating information to a feed reader or other software which allows people to subscribe to a channel they are interested in.
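A minimal RSS 2.0 feed looks roughly like this (titles and URLs are placeholders):

```
<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Channel</title>
    <link>http://www.example.com/</link>
    <description>Posts from a hypothetical site</description>
    <item>
      <title>First post</title>
      <link>http://www.example.com/first-post</link>
    </item>
  </channel>
</rss>
```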

S

Safari
A popular Apple browser.
Salton, Gerard
Scientist who pioneered the information retrieval field.
Scumware
Intrusive software and programs which usually target ads, violate privacy, and are often installed without the computer owner knowing what the software does.
Search History
Many search engines store user search history information. This data can be used for better ad targeting or to make old information more findable.
Search engines may also determine what a document is about and how much they trust a domain based on aggregate usage data. Many brand related search queries is a strong signal of quality.
Search Engine
A tool or device used to find relevant information. Search engines consist of a spider, index, relevancy algorithms and search results.
Search pogo (see pogo rate)
SEM
Search engine marketing.
SEO
Search engine optimization is the art and science of publishing information and marketing it in a manner that helps search engines understand your information is relevant to relevant search queries.
SEO consists largely of keyword research, SEO copywriting, information architecture, link building, brand building, building mindshare, reputation management, and viral marketing.
SEO Copywriting
Writing and formatting copy in a way that will help make the documents appear relevant to a wide array of relevant search queries.
SERP
Search Engine Results Page is the page on which the search engines show the results for a search query.
Search Marketing
Marketing a website in search engines. Typically via SEO, buying pay per click ads, and paid inclusion.
Server
Computer used to host files and serve them to the WWW.
Dedicated servers usually run from $100 to $500 a month. Virtual servers typically run from $5 to $50 per month.
Server Logs
Files hosted on servers which display website traffic trends and sources.
Server logs typically do not show as much data and are not as user friendly as analytics software. Not all hosts provide server logs.
Singular Value Decomposition
The process of breaking down a large database to find the document vector (relevance) for various items by comparing them to other items and documents.
Important steps:

  • Stemming: taking into account various forms of a word on a page
  • Local Weighting: increasing the relevance of a given document based on the frequency with which a term appears in the document
  • Global Weighting: increasing the relevance of terms which appear in a small number of pages, as they are more likely to be on topic than words that appear in almost all documents.
  • Normalization: penalizing long copy and rewarding short copy to allow them fair distribution in results. A good way of looking at this is standardizing things to a scale of 100.

Multi dimensional scaling is more efficient than singular value decomposition because it requires far less computation. When combined with other ranking factors only a rough approximation of relevance is necessary.
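The local weighting, global weighting, and normalization steps above are close cousins of the familiar TF-IDF calculation; a rough Python sketch (purely illustrative, not any engine's actual algorithm):

```python
import math

def tf_idf(term, doc, corpus):
    """Local weight (term frequency) times global weight (inverse document frequency)."""
    words = doc.lower().split()
    local = words.count(term) / len(words)  # normalized by document length
    containing = sum(1 for d in corpus if term in d.lower().split())
    global_w = math.log(len(corpus) / (1 + containing))  # rarer terms weigh more
    return local * global_w

corpus = ["the swim meet", "the quick fox", "the lazy dog"]
# "swim" appears in one document, "the" in all three, so "swim" carries more weight
print(tf_idf("swim", "the swim meet", corpus) > tf_idf("the", "the swim meet", corpus))  # → True
```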
Siphoning
Techniques used to steal another web site's traffic, including the use of spyware or cybersquatting.
Site Map
Page which can be used to help give search engines a secondary route to navigate through your site.
Tips:

  • On large websites the on page navigation should help search engines find all applicable web pages.
  • On large websites it does not make sense to list every page on the site map, just the most important pages.
  • Site maps can be used to help redistribute internal link authority toward important pages or sections, or sections of your site that are seasonally important.
  • Site maps can use slightly different or more descriptive anchor text than other portions of your site to help search engines understand what your pages are about.
  • Site maps should be created such that they are useful to humans, not just search engines.

Slashdot
Central editorially driven community news site focusing on technology and nerd related topics created by Rob Malda.
Snippet (see Description)
Social Media
Websites which allow users to create valuable content. A few examples of social media sites are social bookmarking sites and social news sites.
Spam
Unsolicited email messages.
Search engines also like to outsource their relevancy issues by calling low quality search results spam. They have vague ever changing guidelines which determine what marketing techniques are acceptable at any given time. Typically search engines try hard not to flag false positives as spam, so most algorithms are quite lenient, as long as you do not build lots of low quality links, host large quantities of duplicate content, or perform other actions that are considered widely outside of relevancy guidelines. If your site is banned from a search engine you may request reinclusion after fixing the problem.
Spamming
The act of creating and distributing spam.
Spider
Search engine crawlers which search or “spider” the web for pages to include in the index.
Many non-traditional search companies have different spiders which perform other applications. For example, TurnItInBot searches for plagiarism. Spiders should obey the robots.txt protocol.
Splash Page
Feature rich or elegantly designed beautiful web page which typically offers poor usability and does not offer search engines much content to index.
Make sure your home page has relevant content on it if possible.
Splog
Spam blog, typically consisting of stolen or automated low quality content.
Spyware
Software programs which spy on web users, often used to collect consumer research and to behaviorally target ads.
Squidoo
Topical lens site created by Seth Godin.
SSI
Server Side Includes are a way to call portions of a page in from another page. SSI makes it easier to update websites.
To use a server side include you have to follow one of the conditions:

  • end file names in a .shtml or .shtm extension
  • use PHP or some other language which makes it easy to include files via that programming language
  • change your .htaccess file to make .html or .htm files be processed as though they were .shtml files.

The code to create a server side include looks like this:
<!--#include virtual="/includes/filename.html" -->
Static Content
Content which does not change frequently. May also refer to content that does not have any social elements to it and does not use dynamic programming languages.
Many static sites do well, but the reasons fresh content works great for SEO are:

  • If you keep building content every day you eventually build a huge archive of content
  • By frequently updating your content you keep building mindshare, brand equity, and give people fresh content worth linking at

Stemming
Using the stem of a word to help satisfy search relevancy requirements. EX: searching for swimming can return results which contain swim. This usually improves the quality of search results due to the wide variety of word forms used in the English language.
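A toy suffix-stripping stemmer in Python shows the idea (real engines use more careful algorithms such as Porter's; the suffix list here is made up):

```python
def naive_stem(word):
    """Strip a few common English suffixes to recover a word's stem."""
    for suffix in ("ing", "ed", "es", "s"):
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            stem = word[: -len(suffix)]
            # collapse a doubled final consonant: "swimm" -> "swim"
            if len(stem) >= 2 and stem[-1] == stem[-2] and stem[-1] not in "aeiou":
                stem = stem[:-1]
            return stem
    return word

print(naive_stem("swimming"))  # → swim
```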
Stop Words
Common words (ex: a, to, and, is …) which add little relevancy to a search query, and are thus removed from the search query prior to finding relevant search results.
It is both fine and natural to use stop words in your page content. The reason stop words are ignored when people search is that the words are so common that they offer little to no discrimination value.
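A sketch of the filtering step (the stop word list is a tiny made-up sample; real engines use much longer lists):

```python
STOP_WORDS = {"a", "an", "and", "in", "is", "of", "the", "to"}

def strip_stop_words(query):
    """Drop words too common to help discriminate between documents."""
    return [w for w in query.lower().split() if w not in STOP_WORDS]

print(strip_stop_words("how to swim in the ocean"))  # → ['how', 'swim', 'ocean']
```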
Sullivan, Danny
Founder and lead editor of SearchEngineWatch.com, who later started SearchEngineLand.com.
Submission
The act of making information systems and related websites aware of your website. In most cases you no longer need to submit your website to large scale search engines, they follow links and index content. The best way to submit your site is to get others to link to it.
Some topical or vertical search systems will require submission, but you should not need to submit your site to large scale search engines.
Supplemental Results
Documents which generally are trusted less and rank lower than documents in the main search index.
Some search engines, such as Google, have multiple indices. Documents may be placed in a supplemental index if they are not well trusted due to any of the following conditions:

  • limited link authority relative to the number of pages on the site
  • duplicate content or near duplication
  • exceptionally complex URLs

Documents in the supplemental results are crawled less frequently than documents in the main index. Since documents in the supplemental results are typically considered to be trusted less than documents in the regular results, those pages probably carry less weight when they vote for other pages by linking at them.
You can find documents on this site that are in Google’s supplemental results by searching for site:seobook.com *** -view:randomstring

T

Tagging, tags (see Bookmarks)
Taxonomy
Classification system of controlled vocabulary used to organize topical subjects, usually hierarchical in nature.
Technorati
Blog search engine which tracks popular stories and link relationships.
Teoma
Topical community based search engine largely reliant upon Kleinberg's concept of hubs and authorities. Teoma powers Ask.com.
Telnet
Internet service allowing a remote computer to log into a local one for projects such as script initialization or manipulation.
Term Frequency
A measure of how frequently a keyword appears amongst a collection of documents.
Term Vector Database
A weighted index of documents which aims to understand the topic of documents based on how similar they are to other documents, and then match the most relevant documents to a search query based on vector length and angle.
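Matching documents by vector length and angle usually means cosine similarity over term-frequency vectors; a small Python sketch (toy documents, no term weighting):

```python
import math
from collections import Counter

def cosine_similarity(doc_a, doc_b):
    """Angle-based similarity of two term-frequency vectors (1.0 = same direction)."""
    a, b = Counter(doc_a.lower().split()), Counter(doc_b.lower().split())
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# The documents share two of three terms, so similarity is 2/3
print(round(cosine_similarity("quality link building", "link building tips"), 3))  # → 0.667
```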
Text Link Ads
Advertisements which are formatted as text links.
Since the web was originally based on text and links people are typically more inclined to pay attention to text links than some other ad formats which are typically less relevant and more annoying. However, search engines primarily want to count editorial links as votes, so links that are grouped together with other paid links (especially if those links are to off topic commercial sites) may be less likely to carry weight in search engines.
Thesaurus
Synonym directory search engines use to help increase return relevancy.
Thesaurus tools can also be used as a keyword research tool to help search marketers find related keywords to target.
Title
The title element is used to describe the contents of a document.
The title is one of the most important aspects to doing SEO on a web page. Each page title should be:

  • Unique to that page: Not the same for every page of a site!
  • Descriptive: What important ideas does that page cover?
  • Not excessively long: Typically page titles should be kept to 8 to 10 words or less, with some of the most important words occurring near the beginning of the page title.

Page titles appear in search results as the links searchers click on. In addition many people link to documents using the official document title as the link anchor text. Thus, by using a descriptive page title you are likely to gain descriptive anchor text and are more likely to have your listing clicked on.
On some occasions it also makes sense to use a title which is not literally descriptive, but is easily associated with human emotions or a controversy such that your idea will spread further and many more people will point quality editorial links at your document.
There are two main ways to write titles and be SEO friendly:

  1. Write literal titles that are well aligned with things people search for. This works well if you need backfill content for your site or already have an amazingly authoritative site.
  2. Write page titles that are exceptionally compelling to link at. If enough people link at them then your pages and site will rank for many relevant queries even if the keywords are not in the page titles.
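A hypothetical title element following the guidelines above:

```
<head>
  <!-- unique, descriptive, and reasonably short -->
  <title>Quality Link Building Tips for Small Businesses</title>
</head>
```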

Toolbar
Many major search companies aim to gain marketshare by distributing search toolbars. Some of these toolbars have useful features such as pop-up blockers, spell checkers, and form autofill. These toolbars also help search engines track usage data.
Top Heavy
Google algorithm which penalizes websites which have a high ad density above the fold & sites which make it hard to find the content a user searched for before landing on the page.
Topic-Sensitive PageRank
Method of computing PageRank which instead of producing a single global score creates topic related PageRank scores.
Trackback
Automated notification that another website mentioned your site which is baked into most popular blogging software programs.
Due to the automated nature of trackbacks they are typically quite easy to spam. Many publishers turn trackbacks off due to a low signal to noise ratio.
The Tragedy of the Commons
Parable about how, in order to protect the commons, some people must give up some rights or care more for the commons. In marketing, attention is the commons, and Google largely won distribution because they found ways to make marketing less annoying.
TrustRank
Search relevancy algorithm which places additional weighting on links from trusted seed websites that are controlled by major corporations, educational institutions, or governmental institutions.
Typepad
Hosted blogging platform provided by SixApart, who also makes Movable Type.
It allows you to publish sites on a subdomain off of Typepad.com, or to publish content which appears as though it is on its own domain. If you are serious about building a brand or making money online you should publish your content to your own domain because it can be hard to reclaim a website’s link equity and age related trust if you have built years of link equity into a subdomain on someone else’s website.

U

Unethical SEO
Some search engine marketers lacking in creativity try to market their services as being ethical, whereas services rendered by other providers are somehow unethical. SEO services are generally neither ethical nor unethical. They are either effective or ineffective.
SEO is an inherently risky business, but any quality SEO service provider should make clients aware of potential risks and rewards of different recommended techniques.
Update
Search engines frequently update their algorithms and data sets to help keep their search results fresh and make their relevancy algorithms hard to manipulate. Most major search engines are continuously updating both their relevancy algorithms and search index.
URL
Uniform Resource Locator is the unique address of any web document.
URL Rewrite
A technique used to help make URLs more unique and descriptive to help facilitate better sitewide indexing by major search engines.
Usability
How easy it is for customers to perform the desired actions.
The structure and formatting of text and hyperlink based calls to action can drastically increase your website usability, and thus conversion rates.
Usage Data
Things like a large stream of traffic, a high percent of visitors as repeat visitors, long dwell time, multiple page views per visitor, a high clickthrough rate, or a high level of brand related search queries may be seen by some search engines as a sign of quality. Some search engines may leverage these signals to improve the rankings of high quality documents and high quality websites via algorithms like Panda.
Usenet
A distributed Internet discussion system organized into topical newsgroups.
User Engagement (see usage data)

V

Vector Space Model (see Term Vector Database)
Vertical Search
A search service which is focused on a particular field, a particular type of information, or a particular information format.
For example, Business.com would be a B2B vertical search engine, and YouTube would be a video based vertical search engine.
Viral Marketing
Self propagating marketing techniques. Common modes of transmission are email, blogging, and word of mouth marketing channels.
Many social news sites and social bookmarking sites also lead to secondary citations.
Virtual Domain
Website hosted on a virtual server.
Virtual Server
A server which allows multiple top level domains to be hosted from a single computer.
Using a virtual server can save money for smaller applications, but dedicated hosting should be used for large commercial platforms. Most domains are hosted on virtual servers, but using a dedicated server on your most important domains should add server reliability, and could be seen as a sign of quality. Dedicated servers usually run from $100 to $500 a month. Virtual servers typically run from $5 to $50 per month.

W

Wales, Jimmy
Co-founder of the popular Wikipedia.
Weblog (see Blog)
Webmaster Tools (see Google Webmaster Tools)
Whois
Each domain has an owner of record. Ownership data is stored in the Whois record for that domain.
Some domain registrars also allow you to hide the ownership data of your sites. Many large scale spammers use fake Whois data.
White Hat SEO
Search engines set up guidelines that help them extract billions of dollars of ad revenue from the work of publishers and the attention of searchers. Within that highly profitable framework search engines consider certain marketing techniques deceptive in nature, and label them as black hat SEO. Those which are considered within their guidelines are called white hat SEO techniques. The search guidelines are not a static set of rules, and things that may be considered legitimate one day may be considered deceptive the next.
Search engines are not without flaws in their business models, but there is nothing immoral or illegal about testing search algorithms to understand how search engines work.
People who have extensively tested search algorithms are probably more competent and more knowledgeable search marketers than those who give themselves the arbitrary label of white hat SEOs while calling others black hat SEOs.
When making large investments in processes that are not entirely clear trust is important. Rather than looking for reasons to not work with an SEO it is best to look for signs of trust in a person you would like to work with.
Wiki
Software which allows information to be published using collaborative editing.
Wikipedia
Free online collaborative encyclopedia using wiki software.
Wordnet
A lexical database of English words which can be used to help search engines understand word relationships.
WordPress
Popular open source blogging software platform, offering both a downloadable blogging program and a hosted solution.
If you are serious about building a brand or making money online you should publish your content to your own domain because it can be hard to reclaim a website’s link equity and age related trust if you have built years of link equity into a subdomain on someone else’s website.
Wordtracker
Feature rich paid keyword research tool which collects data from a couple popular meta search engines, like Dogpile.

X

Xenu Link Sleuth
Popular free software for checking a site for broken internal or external links and creating a sitemap.
XHTML
Extensible HyperText Markup Language is a class of specifications designed to move HTML to conform to XML formatting.
XML
Extensible Markup Language is a simple, very flexible text format derived from SGML, used to make it easy to syndicate or format information using technologies such as RSS.

Y

Yahoo!
Internet portal company which was started with the popular Yahoo! Directory.
Yahoo! Answers
Free question asking and answering service which allows Yahoo! to leverage social structures to create a bottoms up network of free content.
Yahoo! Directory
One of the original, most popular, and most authoritative web directories, started by David Filo and Jerry Yang in 1994.
The Yahoo! Directory is one of a few places where most any legitimate site can pick up a trusted link. While the cost of $299 per year may seem expensive to some small businesses, a Yahoo! Directory link will likely help boost your rankings in major search engines.
Yahoo! Search Marketing
Yahoo!’s paid search platform, formerly known as Overture.
Yahoo! Site Explorer
Research tool which webmasters can use to see what pages Yahoo! has indexed from a website, and what pages link at those pages.
YouTube
Feature rich amateur video upload and syndication website owned by Google.

Z

Zeal
Non-commercial directory which was bought by Looksmart for $20 million, then abruptly shut down with little to no warning.
