Catalog (see Index)
Catch All Listing
A listing used by pay per click search engines to monetize long tail terms that are not yet targeted by marketers. This technique may be valuable if you have very competitive keywords, but it is not ideal since most major search engines have editorial guidelines that prevent bulk untargeted advertising, and most of the places that allow catch all listings have low traffic quality. Catch all listings may be an attractive idea on theme-specific search engines and directories though, as those clicks are already pre-qualified.
CGI
Common Gateway Interface – interface software between a web server and other machines or software running on that server. Many CGI programs are used to add interactivity to a web site.
Chrome
Primarily known as Google’s web browser; there is also an OS by the same name.
Google helped ensure Microsoft’s Internet Explorer was fined in Europe, then bundled Chrome with Adobe Flash security updates to install Chrome bundleware on hundreds of millions of computers.
Client
A program, computer, or process which makes information requests to another computer, process, or program.
Cloaking
Displaying different content to search engines and searchers. Depending on the intent of the display discrepancy and the strength of the brand of the person or company cloaking, it may be considered reasonable or it may get a site banned from a search engine.
Cloaking has many legitimate uses which are within search guidelines. For example, changing user experience based on location is common on many popular websites.
Cluetrain Manifesto, The
Book about how the web is a marketplace, and how it is different from traditional offline business.
Clustering
In search results the listings from any individual site are typically limited to a certain number and grouped together to make the search results appear neat and organized and to ensure diversity amongst the top ranked results. Clustering can also refer to a technique which allows search engines to group hubs and authorities on a specific topic together to further enhance their value by showing their relationships.
CMS
Content Management System. Tool used to help make it easy to update and add information to a website.
Blog software programs are some of the most popular content management systems currently used on the web. Many content management systems have errors associated with them which make it hard for search engines to index content due to issues such as duplicate content.
Co-citation
In topical authority based search algorithms, links which appear near one another on a page may be deemed to be related to one another. In algorithms like latent semantic indexing, words which frequently appear near one another are deemed to be related.
Comments
Many blogs and other content management systems allow readers to leave user feedback.
Leaving enlightening and thoughtful comments on someone else’s related website is one way to help get them to notice you.
Comments Tag
Some web developers also place comments in the source code of their work to help make it easy for people to understand the code.
HTML comments in the source code of a document appear as <!-- your comment here -->. They can be viewed if someone views the source code of a document, but do not appear in the rendered HTML version of a document.
In the past some SEOs would stuff keywords in comment tags to help increase the page keyword density, but search has evolved beyond that stage, and at this point using comments to stuff keywords into a page adds to your risk profile and presents little ranking upside potential.
Compacted Information
Information which is generally and widely associated with a product. For example, most published books have an ISBN.
As the number of product databases online increases and duplicate content filters are forced to get more aggressive, the keys to getting your information indexed are to have a site with enough authority to be considered the most important document on that topic, or to have enough non-compacted information (for example, user reviews) on your product level pages so they are seen as unique documents.
Conceptual Links
Links which search engines attempt to understand beyond just the words in them. Some rather advanced search engines attempt to understand the concept a link conveys rather than just matching the words of the link text. Some search algorithms may even look at co-citation and words near the link instead of just focusing on anchor text.
Concept Search
A search which attempts to conceptually match results with the query, matching the concept behind the query rather than necessarily the literal words.
For example, if a search engine understands a phrase to be related to another word or phrase it may return results relevant to that other word or phrase even if the words you searched for are not directly associated with a result. In addition, some search engines will place various types of vertical search results at the top of the search results based on implied query related intent or prior search patterns by you or other searchers.
Contextual Advertising
Advertising programs which generate relevant advertisements based on the content of a webpage.
Conversion
Many forms of online advertising are easy to track. A conversion is reached when a desired goal is completed.
Most offline ads have generally been much harder to track than online ads. Some marketers use custom phone numbers or coupon codes to tie offline activity to online marketing.
Here are a few common examples of desired goals:

  • a product sale
  • completing a lead form
  • a phone call
  • capturing an email
  • filling out a survey
  • getting a person to pay attention to you
  • getting feedback
  • having a site visitor share your website with a friend
  • having a site visitor link at your site

Bid management, affiliate tracking, and analytics programs make it easy to track conversion sources.
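For example, if an ad campaign sends 1,000 visitors to a site and 30 of them complete a lead form, that campaign produced 30 conversions, a conversion rate of 30 ÷ 1,000 = 3%.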
Copyright
The legal rights to publish and reproduce a particular piece of work.
Cookie
Small data file written to a user’s local machine to track them. Cookies are used to help websites customize your user experience and help affiliate program managers track conversions.
CPA
Cost per action. Many forms of online advertising have their effectiveness measured on a cost per action basis. Many affiliate marketing programs and contextual ads are structured on a cost per action basis. An action may be anything from an ad click, to filling out a lead form, to buying a product.
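For example, if you spend $500 on ads which generate 25 leads, your cost per action is $500 ÷ 25 = $20 per lead.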
CPC
Cost per click. Many search ads and contextually targeted ads are sold in auctions where the advertiser is charged a certain price per click.
CPM
Cost per thousand ad impressions.
Many people use CPM as a measure of how profitable a website is or has the potential to become.
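For example, at a $5 CPM, 200,000 ad impressions would cost (or earn) 200 × $5 = $1,000.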
Crawl Depth
How deeply a website is crawled and indexed.
Since longer search queries tend to be more targeted, it is important to get most or all of a site indexed so that the deeper pages have the ability to rank for relevant long tail keywords. A large site needs adequate link equity to get deeply indexed. Another thing which may prevent a site from being fully indexed is duplicate content issues.
Crawl Frequency
How frequently a website is crawled.
Sites which are well trusted or frequently updated may be crawled more frequently than sites with low trust scores and limited link authority. Sites with highly artificial link authority scores (ie: mostly low quality spammy links) or sites which are heavy in duplicate content or near duplicate content (such as affiliate feed sites) may be crawled less frequently than sites with unique content which are well integrated into the web.
CSS
Cascading Style Sheets is a method for adding styles to web documents.
Note: Using external CSS files makes it easy to change the design of many pages by editing a single file. You can link to an external CSS file using code similar to the following in the head of your HTML documents:
<link rel="stylesheet" href="https://www.unleashyourgeek.com/style.css" type="text/css" />
CTR
Clickthrough rate – the percentage of people who click on an advertisement they viewed, which is a way to measure how relevant a traffic source or keyword is. Search ads typically have a higher clickthrough rate than traditional banner ads due to being highly relevant to implied searcher demand & a history of questionable ad labeling disclosure by search engines.
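For example, an ad which was displayed 1,000 times and received 25 clicks has a clickthrough rate of 25 ÷ 1,000 = 2.5%.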

A search engine can determine if a particular search query is navigational (branded) versus informational or transactional by analyzing the relative CTR of different listings on the search result page & the CTR of people who have repeatedly searched for a particular keyword term. A navigational search tends to have many clicks on the top organic listing, while the CTR curve is often far flatter on informational or transactional searches.
Cutts, Matt
Google’s head of search quality.
Cybersquatting
Registering domains related to other trademarks or brands in an attempt to cash in on the value created by said trademark or brand.

D

Dayparting
Turning ad campaigns on or off, or adjusting ad bid prices or budget constraints, based on the time of day: bidding more when your target audience is likely to be available and less when they are not.
Dead Link
A link which is no longer functional.
Most large high quality websites have at least a few dead links in them, but the ratio of good links to dead links can be seen as a sign of information quality.
Deep Link
A link which points to an internal page within a website.
When links grow naturally typically most high quality websites have many links pointing at interior pages. When you request links from other websites it makes sense to request a link from their most targeted relevant page to your most targeted relevant page. Some webmasters even create content based on easy linking opportunities they think up.
Dedicated Server
Server which is limited to serving one website or a small collection of websites owned by a single person.
Dedicated servers tend to be more reliable than shared (or virtual) servers. Dedicated servers usually run from $100 to $500 a month.
Virtual servers typically run from $5 to $50 per month.
Deep Link Ratio
The ratio of links pointing to internal pages to overall links pointing at a website.
A high deep link ratio is typically a sign of a legitimate natural link profile.
De-Listing
Temporarily or permanently becoming de-indexed from a directory or search engine.
De-indexing may be due to any of the following:

  • Pages on new websites (or sites with limited link authority relative to their size) may be temporarily de-indexed until the search engine does a deep spidering and re-cache of the web.
  • During some updates search engines readjust crawl priorities.
    • You need a significant number of high quality links to get a large website well indexed and keep it well indexed.
    • Duplicate content filters, inbound and outbound link quality, or other information quality related issues may also relate to re-adjusted crawl priorities.
  • Pages which have changed location and are not properly redirected, or pages which are down when a search engine tries to crawl them may be temporarily de-indexed.
  • Search Spam:
    • If a website tripped an automatic spam filter it may return to the search index anywhere from a few days to a few months after the problem has been fixed.
    • If a website is editorially removed by a human you may need to contact the search engine directly to request reinclusion.

Del.icio.us
Popular social bookmarking website.
Demographics
Statistical data or characteristics which define segments of a population.
Some internet marketing platforms, such as AdCenter and AdWords, allow you to target ads at websites or searchers who fit a specific demographic. Some common demographic data points are gender, age, income, education, location, etc.
Denton, Nick
Publisher of Gawker, a popular ring of topical weblogs, which are typically focused on controversy.
Description
Directories and search engines provide a short description near each listing which aims to add context to the title.
High quality directories typically prefer that the description describes what the site is about rather than something that is overtly promotional in nature. Search engines typically

  • use a description from a trusted directory (such as DMOZ or the Yahoo! Directory) for homepages of sites listed in those directories
  • use the page meta description (especially if it is relevant to the search query and has the words from the search query in it; see the example markup after this list)
  • attempt to extract a description from the page content which is relevant for the particular search query and ranking page (this is called a snippet)
  • or some combination of the above

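A minimal example of the meta description element mentioned above, placed in the head of an HTML document (the wording is a placeholder):
<meta name="description" content="A short, query-relevant summary of what this page is about." />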
Digg
Social news site where users vote on which stories get the most exposure and become the most popular.
Directory
A categorized catalog of websites, typically manually organized by topical editorial experts.
Some directories cater to specific niche topics, while others are more comprehensive in nature. Major search engines likely place significant weight on links from DMOZ and the Yahoo! Directory. Smaller and less established general directories likely pull less weight. If a directory does not exercise editorial control over listings search engines will not be likely to trust their links at all.
Disavow
The link disavow tool is a way for a webmaster to state they do not vouch for a collection of inbound links to their website.
Recovering from manual link penalties will often require removing some of the lower quality inbound links & disavowing other low quality links. For automated link penalties, like Penguin, using the disavow tool should be sufficient for penalty recovery; however, Google still has to crawl the pages to apply the disavow to the links. It still may make sense to remove some lower quality links to diminish any future risks of manual penalties. With the rise of negative SEO, publishers in spammy industries may be forced to proactively use the disavow tool.
Document Freshness (see fresh content)
DMOZ
The Open Directory Project is the largest human edited directory of websites. DMOZ is owned by AOL, and is primarily run by volunteer editors.
DNS
Domain Name Server or Domain Name System. A naming scheme used to resolve a domain name / host name to a specific TCP/IP address.
Domain
Scheme used for logical or location organization of the web. Many people also use the word domain to refer to a specific website.
Doorway Pages
Pages designed to rank for highly targeted search queries, typically designed to redirect searchers to a page with other advertisements.
Some webmasters cloak thousands of doorway pages on trusted domains, and rake in a boatload of cash until they are caught and de-listed. If the page would have a unique purpose outside of search then search engines are generally fine with it, but if the page only exists because search engines exist then search engines are more likely to frown on the behavior.
Dreamweaver
Popular web development and editing software offering a what you see is what you get interface.
Duplicate Content
Content which is duplicate or near duplicate in nature.
Search engines do not want to index multiple versions of similar content. For example, printer friendly pages may be search engine unfriendly duplicates. Also, many automated content generation techniques rely on recycling content, so some search engines are somewhat strict in filtering out content they deem to be similar or nearly duplicate in nature.
Dwell Time
The amount of time a searcher spends on a destination website before clicking back to the search results.
Some search queries might require significant time for a user to complete their information goals while other queries might be things which are quickly answered by a landing page, thus dwell time in isolation may not be a great relevancy signal. However search engines can also look at other engagement metrics like repeat visits, branded searches, relative CTR & whether users clicking on a particular listing have a high pogo rate (by subsequently clicking on yet another different search result) to get an idea of a user’s satisfaction with a particular website & fold these metrics into an algorithm like Panda.
Dynamic Content
Content which changes over time or uses a dynamic language such as PHP to help render the page.
In the past search engines were less aggressive at indexing dynamic content than they currently are. While they have greatly improved their ability to index dynamic content it is still preferable to use URL rewriting to help make dynamic content look static in nature.
Dynamic Languages
Programming languages such as PHP or ASP which build web pages on the fly upon request.

E

Earnings Per Click
Many contextual advertising publishers estimate their potential earnings based on how much they make from each click.
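For example, a publisher who earned $150 from 500 ad clicks has earnings per click of $150 ÷ 500 = $0.30.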
Editorial Link
Search engines count links as votes of quality. They primarily want to count editorial links that were earned over links that were bought or bartered.
Many paid links, such as those from quality directories, still count as signs of votes as long as they are also associated with editorial quality standards. If they are from sites without editorial control, like link farms, they are not likely to help you rank well. Using an algorithm similar to TrustRank, some search engines may place more trust on well known sites with strong editorial guidelines.
Emphasis
An HTML tag used to emphasize text.
Please note that it is more important that copy reads well to humans than any boost you may think you will get by tweaking it for bots. If every occurrence of a keyword on a page is in emphasis that will make the page hard to read, convert poorly, and may look weird to search engines and users alike.
<em>emphasis</em> would appear as italicized emphasis.
Engagement Metrics
The measurement of how engaging users find a particular piece of content within a site, or a particular site in general.
Search engines may analyze end user behavior to help refine and improve their rankings. Sites which get a high CTR and have a high proportion of repeat visits from brand related searches may get a ranking boost in algorithms like Panda.
Entities
People, places or things which search engines aim to know & present background information about.
Brands are a popular form of entities, but many other forms of information like songs or movies are also known as entities. Information about entities may be shown in knowledge graph results.
Entry Page
The page through which a user enters your site.
If you are buying pay per click ads it is important to send visitors to the most appropriate and targeted page associated with the keyword they searched for. If you are doing link building it is important to point links at your most appropriate page when possible such that

  • if anyone clicks the link they are sent to the most appropriate and relevant page
  • you help search engines understand what the pages on your site are associated with

Ethical SEO
Search engines like to paint SEO services which manipulate their relevancy algorithms as being unethical. Some search marketers lacking in creativity likewise describe services sold by others as unethical while calling their own services ethical. Any particular technique is generally not associated with ethics; it is either effective or ineffective.
The only ethics issues associated with SEO are generally business ethics related issues. Two of the bigger frauds are:

  • Not disclosing risks: Some SEOs may use high risk techniques when they are not needed. Some may make that situation even worse by not disclosing potential risks to clients.
  • Taking money & doing nothing: Since selling SEO services has almost no start-up costs, many of the people selling services may not actually know how to competently provide them. Some shady people claim to be SEOs and bilk money out of unsuspecting small businesses.

As long as the client is aware of potential risks there is nothing unethical about being aggressive.
Everflux
Major search indexes are constantly updating. Google refers to this continuous refresh as everflux.
In the past Google updated their index roughly once a month. Those updates were named Google Dances, but since Google shifted to a constantly updating index Google no longer does what was traditionally called a Google Dance.
Expert Document
Quality page which links to many non-affiliated topical resources.
External Link
Link which references another domain.
Some people believe in link hoarding, but linking out to other related resources is a good way to help search engines understand what your site is about. If you link out to lots of low quality sites or primarily rely on low quality reciprocal links some search engines may not rank your site very well. Search engines are more likely to trust high quality editorial links (both to and from your site).

F

Fair Use
The stated exceptions of allowed usage of work under copyright without requiring permission of the original copyright holder. In the United States, fair use is covered in Section 107 of the Copyright Act.
Favicon
Favorites Icon is a small icon which appears next to URLs in a web browser.
Upload an image named favicon.ico in the root of your site to have your site associated with a favicon.
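Browsers will look for that file automatically, though you can also reference the icon explicitly with a link element in the head of your HTML documents; a minimal example, assuming the file sits in the site root:
<link rel="icon" href="/favicon.ico" type="image/x-icon" />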
Favorites (see bookmarks)
Feed
Many content management systems, such as blogs, allow readers to subscribe to content update notifications via RSS or XML feeds. Feeds can also refer to pay per click syndicated feeds, or merchant product feeds. Merchant product feeds have become less effective as a means of content generation due to improving duplicate content filters.
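Feeds are commonly advertised via an autodiscovery link element in the head of an HTML document so browsers and feed readers can find them; a minimal example (the /feed.xml path and title are placeholders):
<link rel="alternate" type="application/rss+xml" title="RSS" href="/feed.xml" />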
Feed Reader
Software or website used to subscribe to feed update notifications.
FeedDemon – desktop based feed reader
FFA
Free for all pages are pages which allow anyone to add a link to them. Generally these links do not pull much weight in search relevancy algorithms because many automated programs fill these pages with links pointing at low quality websites.
Filter
Certain activities or signatures which make a page or site appear unnatural might make search engines inclined to filter / remove it from the search results.
For example, if a site publishes significant duplicate content it may get a reduced crawl priority and get filtered out of the search results. Some search engines also have filters based on link quality, link growth rate, and anchor text. Some pages are also penalized for spamming.
Firefox
Popular extensible open source web browser.
Flash
Vector graphics-based animation software which makes it easier to make websites look rich and interactive in nature.
Search engines tend to struggle indexing and ranking flash websites because flash typically contains so little relevant text content. If you use flash, ensure:

  • you embed flash files within HTML pages
  • you use a noembed element to describe what is in the flash (see the example after this list)
  • you publish your flash content in multiple separate files such that you can embed appropriate flash files in relevant pages

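A minimal sketch of embedding a Flash file with a noembed fallback (the file name, dimensions, and fallback text are placeholders):
<embed src="example.swf" type="application/x-shockwave-flash" width="400" height="300">
<noembed><p>A text description of what the Flash movie shows.</p></noembed>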
Forward Links (see Outbound Links)
Frames
A technique created by Netscape used to display multiple smaller pages on a single display. This web design technique allows for consistent site navigation, but makes it hard to deep link at relevant content.
Given the popularity of server side includes, content management systems, and dynamic languages, there really is no legitimate reason to use frames to build a content site today.
Fresh Content
Content which is dynamic in nature and gives people a reason to keep paying attention to your website, or content which was recently published.
Many SEOs talk up fresh content, but fresh content does not generally mean re-editing old content. It more often refers to creating new content. The primary advantages to fresh content are:

  • Maintain and grow mindshare: If you keep giving people a reason to pay attention to you more and more people will pay attention to you, and link to your site.
  • Faster idea spreading: If many people pay attention to your site, when you come out with good ideas they will spread quickly.
  • Growing archives: If you are a content producer then owning more content means you have more chances to rank. If you keep building additional fresh content eventually that gives you a large catalog of relevant content.
  • Frequent crawling: Frequently updated websites are more likely to be crawled frequently.
  • QDF: Google’s query deserves freshness algorithm may boost the rankings of recently published documents for search queries where they believe users are looking for recent information.

The big risk of creating lots of “fresh” content for the sake of it is that many low cost content sources will have poor engagement metrics, which in turn will lead to a risk of the site being penalized by Panda. A good litmus test on this front is: if you didn’t own your website, would you still regularly visit it & read the new content published to it?
Freshness (see fresh content)
FTP
File Transfer Protocol is a protocol for transferring data between computers.
Many content management systems (such as blogging platforms) include FTP capabilities. Web development software such as Dreamweaver also comes with FTP capabilities. There are also a number of free or cheap FTP programs such as Cute FTP, Core FTP, and Leech FTP.
Fuzzy Search
Search which will find matching terms when terms are misspelled (or fuzzy).
Fuzzy search technology is similar to stemming technology, with the exception that fuzzy search corrects the misspellings at the user’s end, while stemming searches for other versions of the same core word within the index.

G

GAP
Google Advertising Professional is a program which qualifies marketers as being proficient AdWords marketers.
Gladwell, Malcolm
Popular author who wrote the book titled The Tipping Point.
Godin, Seth
Popular blogger, author, viral marketer and business consultant.
Google
The world’s leading search engine in terms of reach. Google pioneered search by analyzing linkage data via PageRank. Google was created by Stanford students Larry Page and Sergey Brin.
GoogleBot
Google’s search engine spider.
Google has a shared crawl cache between their various spiders, including vertical search spiders and spiders associated with ad targeting.
Google AdSense (see AdSense)
Google AdWords (see AdWords)
Google Base
Free database of semantically structured information created by Google.
Google Base may also help Google better understand what types of information are commercial in nature, and how they should structure different vertical search products.
Google Bombing
Making a page rank well for a specific search query by pointing hundreds or thousands of links at it with the keywords in the anchor text.
Google Bowling
Knocking a competitor out of the search results by pointing hundreds or thousands of low trust low quality links at their website.
Typically it is easier to bowl new sites out of the results. Older established sites are much harder to knock out of the search results.
Google Checkout
Payment service provided by Google which helps Google better understand merchant conversion rates and the value of different keywords and markets.
Google Dance
In the past Google updated their index roughly once a month. Those updates were named Google Dances, but since Google shifted to a constantly updating index, Google no longer does what was traditionally called a Google Dance.
Major search indexes are constantly updating. Google refers to this continuous refresh as everflux.
The second meaning of Google Dance is a yearly party at Google’s corporate headquarters which Google holds for search engine marketers. This party coincides with the San Jose Search Engine Strategies conference.
Google Keyword Tool
Keyword research tool provided by Google which estimates the competition for a keyword, recommends related keywords, and will tell you what keywords Google thinks are relevant to your site or a page on your site.
Google OneBox
Portion of the search results page above the organic search results which Google sometimes uses to display vertical search results from Google News, Google Base, and other Google owned vertical search services.
Google Sitemaps
Program which webmasters can use to help Google index their contents.
Please note that the best way to submit your site to search engines and to keep it in their search indexes is to build high quality editorial links.
Google Sitelinks
On some search results where Google thinks one result is far more relevant than other results (like navigational or brand related searches) they may list numerous deep links to that site at the top of the search results.
Google Supplemental Index
Index where pages with lower trust scores are stored. Pages may be placed in Google’s Supplemental Index if they consist largely of duplicate content, if the URLs are excessively complex in nature, or the site which hosts them lacks significant trust.
Google Traffic Estimator
Tool which estimates bid prices and how many Google searchers will click on an ad for a particular keyword.
If you do not submit a bid price the tool will return an estimated bid price necessary to rank #1 for 85% of Google’s queries for a particular keyword.
Google Trends
Tool which allows you to see how Google search volumes for a particular keyword change over time.
Google Webmaster Guidelines
An arbitrary & ever-shifting collection of specifications which can be used to justify penalizing any website.
While some aspects of the guidelines might be rather clear, other aspects are blurry, and based on inferences which may be incorrect like: “Don’t deceive your users.” Many would (and indeed have) argued Google’s ad labeling within their own search results is deceptive, Google has run ads for illegal steroids & other shady offers, etc. The ultimate goal of the Webmaster Guidelines is to minimize the ROI of SEO & discourage active investment into SEO.
Google Webmaster Tools
Tools offered by Google which show recent search traffic trends, let webmasters set a target geographic market, enable them to request select pages be recrawled, show manual penalty notifications and allow webmasters to both disavow links and request a manual review from Google’s editorial team.
While some of the Google Webmaster Tools may seem useful, it is worth noting Google uses webmaster registration data to profile & penalize other websites owned by the same webmaster. It is worth proceeding with caution when registering with Google, especially if your website is tied to a business model Google both hates & has cloned in their search results – like hotel affiliates.
Google Website Optimizer
Free multivariate testing platform used to help AdWords advertisers improve their conversion rates.
Guestbook Spam
A type of low quality automated link which search engines do not want to place much trust on.

H

Headings
The heading element briefly describes the subject of the section it introduces.
Heading elements go from H1 to H6 with the lower numbered headings being most important. You should only use a single H1 element on each page, and may want to use multiple other heading elements to structure a document. An H1 element source would look like:
<h1>Your Topic</h1>
Heading elements may be styled using CSS. Many content management systems place the same content in the main page heading and the page title, although in many cases it may be preferential to mix them up if possible.
Headline
The title of an article or story.
Hidden Text
SEO technique used to show search engine spiders text that human visitors do not see.
While some sites may get away with it for a while, generally the risk to reward ratio is inadequate for most legitimate sites to consider using hidden text.
Hilltop
Algorithm which ranks results largely based on unaffiliated expert citations.
HITS
Link based algorithm which ranks relevancy scores based on citations from topical authorities.
Hijacking
Making a search engine believe that another website exists at your URL. Typically done using techniques such as a 302 redirect or meta refresh.
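A meta refresh is placed in the head of an HTML document and sends the browser to another URL after a set delay; example markup (the URL is a placeholder):
<meta http-equiv="refresh" content="0; url=http://example.com/" />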
Home Page
The main page on your website, which is largely responsible for helping develop your brand and setting up the navigational schemes that will be used to help users and search engines navigate your website.
As far as SEO goes, a home page is typically going to be one of the easier pages to rank for some of your more competitive terms, largely because it is easy to build links at a home page. You should ensure your homepage stays focused and reinforces your brand though, and do not assume that most of your visitors will come to your site via the home page. If your site is well structured many pages on your site will likely be far more popular and rank better than your home page for relevant queries.
Host (see Server)
.htaccess
Apache directory-level configuration file which can be used to password protect or redirect files.
As a note of caution, make sure you copy your current .htaccess file before editing it, and do not edit it on a site that you can’t afford to have go down unless you know what you are doing.
HTML
HyperText Markup Language is the language in which pages on the World Wide Web are created.
Some newer web pages are also formatted in XHTML.
HTTP
HyperText Transfer Protocol is the primary protocol used to communicate between servers and web browsers. It is the means by which data is transferred from where it resides on a server to an active browser.
Hubs
Topical hubs are sites which link to well trusted sites within their topical community. A topical authority is a page which is referenced by many topical hub sites. A topical hub is a page which references many authorities.
Hummingbird
A Google search algorithm update which better enabled conversational search.
