Finding the Best Long Tail Keywords

Long-tail Keyword Research Strategy to Improve Your SEO

Anchor text – Anchor text is the SEO industry's term for hyperlinked text, since it is the anchor of the link. Anchor text matters because SEOs found that keyword-rich anchor text could help enhance rankings. While it is okay to optimize anchor text internally (as long as it's relevant and isn't excessive), optimizing anchor text for articles and press releases that will be distributed at larger scales could be considered a link scheme. Best practice is to keep anchor text natural in any articles or press releases.
Title tag – A title tag, or page title, is a tag in the HTML denoted by <title>. The title tag represents the page's title and can be seen on the tabs in your browser or in the headline of a search result. Title tags help both search engines and internet users identify what your pages are about. Best practice for title tags is to create a unique, relevant title for each one of your pages.
Meta description – A meta description is another .html tag, but its purpose is to describe the page. While meta descriptions do not have any effect on rankings, they can help increase click-through rates since they do show up in the search results. Like title tags, it’s important to have unique, relevant descriptions for each page and make sure to include a call-to-action!
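Both tags live in the page's HTML head. A minimal sketch, with a hypothetical store page (the title and description text are illustrative):

```html
<head>
  <!-- Unique, relevant title: shown in browser tabs and search result headlines -->
  <title>Handmade Leather Wallets | Example Store</title>
  <!-- Meta description: no direct ranking effect, but can lift click-through rates -->
  <meta name="description" content="Browse our handmade leather wallets. Free shipping on all orders. Shop now!">
</head>
```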
301 redirect – When you delete pages or change URLs, they don’t just dissipate into the digital universe. So when users try to search for a URL that no longer exists, they receive a 404 error, or ‘Not Found’ page. To avoid confusing users and to pass on any ranking authority from retired pages, it is best practice to use a 301 redirect. 301 redirects will redirect users and search engines from the old URL to another active page that you specify.
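Assuming an Apache server, a 301 can be declared in an .htaccess file at the site root; the paths here are hypothetical:

```apache
# Permanently redirect a retired page to its replacement
Redirect 301 /old-page.html https://www.example.com/new-page/
```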
SERP – SERP stands for Search Engine Results Page. A SERP is what is returned to you after typing in a search query. In essence, it’s a page of results after you search. SERP is a coined SEO term that you’ll hear frequently.
Keyword – Keyword is another term that is used a lot in SEO. A keyword or keyword phrase is a word or set of words that exemplify the brand, its services, or products. Keywords are important because they help users and the search engines better identify what your webpages are about. Using relevant keywords in your title tags, headlines and throughout your content can help give the search engines a better idea of what your page is about. Just be careful not to overuse keywords, or it can actually hurt your rankings.
Indexing – Indexing is the search engines' process for collecting and storing data across the web. The search engines are constantly scouring the web for updated and new pages to add to their massive databases of information. When the search engines find new pages, they 'index' them, meaning they add copies to their databases so that they can retrieve them during searches.
Links – There are two types of links that you will hear SEOs talk about: internal and external. Internal links are links that occur between pages inside of your website. For example, all links on the navigation bar of your website are internal links. External links are links coming to or going from your website; either someone has linked to your website, or you have linked to theirs.
Both link structures are important, although links coming to your website are seen as more of an authority signal. Best practice for links is to have an organized and convenient internal link structure, so that both users and search engines can easily find your pages. A good rule of thumb to follow is that every page should be at most two clicks away from the homepage. For external links, it’s important to create great content that users would want to link to. Any unnatural links or link schemes could end up in a penalty.
Rel=”author” – Google’s Authorship Markup, also known as rel=”author” is a tag that is used to associate authors with their Google+ profiles. This helps to put a face behind your brand, can help increase click-through rates in the search engines, promote thought leadership and can be seen as a ranking signal to Google. Rel=”author” is a great tool for any author that creates content online and has many benefits.
Canonical URL – Canonical URLs are used in cases where there is duplicate content. Say you sell a product that comes in several different colors, and you have a page for each of those colors. The search engines wouldn’t be sure which one to index since they’re all the same, so by using a canonical link you are able to specify which page should take precedence in the search engines. While it is not a guarantee, it is best practice when you have multiple pages with very similar or identical content.
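For example, each color variant page could point search engines at one preferred version via a canonical link tag in its head (the product URL is illustrative):

```html
<!-- On every color variant of the product page -->
<link rel="canonical" href="https://www.example.com/products/widget/">
```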
1. SEM: Stands for Search Engine Marketing, and as the name implies it involves marketing services or products via search engines. SEM is divided into two main pillars: SEO and PPC. SEO stands for Search Engine Optimization, and it is the practice of optimizing websites to make their pages appear in the organic search results. PPC stands for Pay-Per-Click, and it is the practice of purchasing clicks from search engines. The clicks come from sponsored listings in the search results.
2. Backlink: Also called inlink or simply link, it is a hyperlink on another website pointing back to your own website. Backlinks are important for SEO because they directly affect the PageRank of any web page, influencing its search rankings.
3. PageRank: PageRank is an algorithm that Google uses to estimate the relative importance of pages around the web. The basic idea behind the algorithm is that a link from page A to page B can be seen as a vote of trust from page A to page B. The higher the number of links to a page (weighted by their value), therefore, the higher the probability that the page is important.
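The "weighted vote" idea can be sketched with a simple power iteration. This is a toy model of the published PageRank formula, not Google's production algorithm; the three-page graph and damping factor are illustrative:

```python
# Simplified PageRank via power iteration over a tiny link graph.
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Each page keeps a small baseline, then receives shares of the
        # rank of every page linking to it.
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# Page A links to B and C; B links to C; C links back to A.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
# C ends up with the highest rank, since both A and B "vote" for it.
```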
4. Linkbait: A linkbait is a piece of web content published on a website or blog with the goal of attracting as many backlinks as possible (in order to improve one's search rankings). Usually it's a written piece, but it can also be a video, a picture, a quiz or anything else. Classic examples of linkbait are the "Top 10" lists that tend to become popular on social bookmarking sites.
5. Link farm: A link farm is a group of websites where every website links to every other website, with the purpose of artificially increasing the PageRank of all the sites in the farm. This practice was effective in the early days of search engines, but today it is seen as a spamming technique (and thus can get you penalized).
6. Anchor text: The anchor text of a backlink is the clickable text on the web page. Keyword rich anchor text helps with SEO because Google will associate those keywords with the content of your website. If you have a weight loss blog, for instance, it would help your search rankings if some of your backlinks had "weight loss" as their anchor text.
7. NoFollow: The nofollow is a link attribute used by website owners to signal to Google that they don’t endorse the website they are linking to. This can happen either when the link is created by the users themselves (e.g., blog comments), or when the link was paid for (e.g., sponsors and advertisers). When Google sees the nofollow attribute it will basically not count that link for the PageRank and search algorithms.
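In HTML the attribute is added to the link tag itself, for example on a paid placement (the URL and sponsor name are illustrative):

```html
<!-- A paid or user-submitted link the site owner does not editorially endorse -->
<a href="https://example.com/" rel="nofollow">Example Sponsor</a>
```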
8. Link Sculpting: By using the nofollow attribute strategically, webmasters were able to channel the flow of PageRank within their websites, thus increasing the search rankings of desired pages. This practice is no longer effective since Google changed how it handles the nofollow attribute.
9. Title Tag: The title tag is literally the title of a web page, and it’s one of the most important factors inside Google’s search algorithm. Ideally your title tag should be unique and contain the main keywords of your page. You can see the title tag of any web page on top of the browser while navigating it.
10. Meta Tags: Like the title tag, meta tags are used to give search engines more information regarding the content of your pages. The meta tags are placed inside the HEAD section of your HTML code, and thus are not visible to human visitors.
11. Search Algorithm: Google’s search algorithm is used to find the most relevant web pages for any search query. The algorithm considers over 200 factors (according to Google itself), including the PageRank value, the title tag, the meta tags, the content of the website, the age of the domain and so on.
12. SERP: Stands for Search Engine Results Page. It’s basically the page you’ll get when you search for a specific keyword on Google or on other search engines. The amount of search traffic your website will receive depends on the rankings it will have inside the SERPs.
13. Sandbox: Google basically has a separate index, the sandbox, where it places all newly discovered websites. When websites are on the sandbox, they won’t appear in the search results for normal search queries. Once Google verifies that the website is legitimate, it will move it out of the sandbox and into the main index.
14. Keyword Density: To find the keyword density of any particular page you just need to divide the number of times that keyword is used by the total number of words in the page. Keyword density used to be an important SEO factor, as the early algorithms placed a heavy emphasis on it. This is not the case anymore.
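The division described above can be sketched in a few lines; this toy version counts exact single-word matches only:

```python
import re

def keyword_density(text, keyword):
    """Occurrences of `keyword` divided by total word count (a historical SEO metric)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    count = sum(1 for w in words if w == keyword.lower())
    return count / len(words)

sample = "SEO tips: keyword research and keyword placement for SEO"
density = keyword_density(sample, "keyword")  # 2 matches out of 9 words
```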
15. Keyword Stuffing: Since keyword density was an important factor on the early search algorithms, webmasters started to game the system by artificially inflating the keyword density inside their websites. This is called keyword stuffing. These days this practice won’t help you, and it can also get you penalized.
16. Cloaking: This technique involves making the same web page show different content to search engines and to human visitors. The purpose is to get the page ranked for specific keywords, and then use the incoming traffic to promote unrelated products or services. This practice is considered spamming and can get you penalized (if not banned) on most search engines.
17. Web Crawler: Also called a search bot or spider, it's a computer program that browses the web on behalf of search engines, trying to discover new links and new pages. This is the first step in the indexing process.
18. Duplicate Content: Duplicate content generally refers to substantive blocks of content within or across domains that either completely match other content or are appreciably similar. You should avoid having duplicate content on your website because it can get you penalized.
19. Canonical URL: Canonicalization is a process for converting data that has more than one possible representation into a "standard" canonical representation. A canonical URL, therefore, is the standard URL for accessing a specific page within your website. For instance, the canonical version of your domain might be the www version of your homepage URL rather than the bare domain.
20. Robots.txt: This is nothing more than a file, placed in the root of the domain, that is used to inform search bots about the structure of the website. For instance, via the robots.txt file it's possible to block specific search robots and to restrict access to specific folders or sections inside the website.
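A minimal robots.txt sketch; the bot name and folder paths are illustrative:

```
# Placed at the root of the domain, e.g. /robots.txt
User-agent: *
Disallow: /private/
Disallow: /search/

# Block one specific crawler entirely
User-agent: BadBot
Disallow: /
```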


0 – 9

200 Status OK – The file request was successful. For example, a page or image was found and loaded properly in a browser.
Some poorly developed content management systems return 200 status codes even when a file does not exist. The proper response for file not found is a 404.
301 Moved Permanently – The file has been moved permanently to a new location.
This is the preferred method of redirecting for most pages or websites. If you are going to move an entire site to a new location you may want to test moving a file or folder first, and then if that ranks well you may want to proceed with moving the entire site. Depending on your site authority and crawl frequency it may take anywhere from a few days to a month or so for the 301 redirect to be picked up.
302 Found – The file has been found, but is temporarily located at another URI.
As it relates to SEO, it is typically best to avoid using 302 redirects. Some search engines struggle with redirect handling, and due to poor processing of 302 redirects some search engines have allowed competing businesses to hijack the listings of competitors.
404 Not Found – The server was unable to locate the URL.
Some content management systems send 404 status codes when documents do exist. Ensure files that exist do give a 200 status code and requests for files that do not exist give a 404 status code. You may also want to check with your host to see if you can set up a custom 404 error page which makes it easy for site visitors to

  • view your most popular and / or most relevant navigational options
  • report navigational problems within your site

Search engines request a robots.txt file to see what portions of your site they are allowed to crawl. Many browsers request a favicon.ico file when loading your site. While neither of these files are necessary, creating them will help keep your log files clean so you can focus on whatever other errors your site might have.
Above the Fold
A term traditionally used to describe the top portion of a newspaper. In email or web marketing it means the area of content viewable prior to scrolling. Some people also define above the fold as an ad location at the very top of the screen, but due to banner blindness typical ad locations do not perform as well as ads that are well integrated into content. If ads look like content they typically perform much better. Google launched a top heavy algorithm to penalize sites with excessive ad placement above the fold.
Absolute Link
A link which shows the full URL of the page being linked to. Some links only show relative link paths instead of having the entire reference URL within the <a href> tag. Due to canonicalization and hijacking related issues it is typically preferred to use absolute links over relative links.
Example absolute link
<a href="http://www.example.com/seo/pagename.html">SEO Solutions</a>
Example relative link
<a href="../seo/pagename.html">SEO Solutions</a>
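The difference matters because a relative href is resolved against the URL of the page it appears on. Python's standard urllib shows the resolution (the URLs are illustrative):

```python
from urllib.parse import urljoin

# The page the links appear on
base = "http://www.example.com/seo/pagename.html"

# A relative href is resolved against the current page's URL...
absolute = urljoin(base, "../seo/pagename.html")

# ...while an absolute href is used as-is
same = urljoin(base, "http://www.example.com/seo/pagename.html")
```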
Activity Bias
A sampling bias in ad measurement: any form of ad targeting may simply reach people who were already more likely to engage in a particular activity, especially with ad retargeting. Correlation does not mean causation.
Ad Retargeting (see retargeting)
adCenter
The old name for Microsoft's cost per click ad network, later rebranded as Bing Ads.
While it has a few cool features (including dayparting and demographic based bidding) it is still quite nascent in nature compared to Google AdWords. Due to Microsoft’s limited marketshare and program newness many terms are vastly underpriced and present a great arbitrage opportunity.
AdSense
Google's contextual advertising network. Publishers large and small may automatically publish relevant advertisements near their content and share the profits from those ad clicks with Google.
AdSense offers a highly scalable automated ad revenue stream which will help some publishers establish a baseline for the value of their ad inventory. In many cases AdSense will be underpriced, but that is the trade off for automating ad sales.
AdWords
AdWords is an increasingly complex marketplace. One could write a 300 page book just covering AdWords. Rather than doing that here I thought it would be useful to link to many relevant resources.
Ad Heavy Page Layout (see Top Heavy)
Affiliate Marketing
Affiliate marketing programs allow merchants to expand their market reach and mindshare by paying independent agents on a cost per action (CPA) basis. Affiliates only get paid if visitors complete an action.
Most affiliates make next to nothing because they are not aggressive marketers, have no real focus, fall for wasting money on instant wealth programs that lead them to buying a bunch of unneeded garbage via other’s affiliate links, and do not attempt to create any real value.
Some power affiliates make hundreds of thousands or millions of dollars per year because they are heavily focused on automation and/or tap large traffic streams. Typically niche affiliate sites make more per unit effort than overtly broad ones because they are easier to focus (and thus have a higher conversion rate).
Selling a conversion is typically harder than selling a click (like AdSense does, for instance). Search engines are increasingly looking to remove the noise that low quality thin affiliate sites add to the search results.

Age
Some social networks or search systems may take site age, page age, user account age, and related historical data into account when determining how much to trust that person, website, or document. Some specialty search engines, like blog search engines, may also boost the relevancy of new documents.
Fresh content which is also cited on many other channels (like related blogs) will temporarily rank better than you might expect because many of the other channels which cite the content will cite it off their home page or a well trusted high PageRank page. After those sites publish more content and the reference page falls into their archives those links are typically from pages which do not have as much link authority as their home pages.
Some search engines may also try to classify sites to understand what type of sites they are, as in news sites or reference sites that do not need to be updated that often. They may also look at individual pages and try to classify them based on how frequently they change.

Alexa
An Amazon.com owned search service which measures website traffic.
Alexa is heavily biased toward sites that focus on marketing and webmaster communities. While not being highly accurate it is free.
Anchor Text
The text that a user would click on to follow a link. In the case the link is an image the image alt attribute may act in the place of anchor text.
Search engines assume that your page is authoritative for the words that people include in links pointing at your site. When links occur naturally they typically have a wide array of anchor text combinations. Too much similar anchor text may be considered a sign of manipulation, and thus discounted or filtered. When building links that you control, make sure you mix up your anchor text.
Android
While Google pitches Android as being an "open" operating system, they are only open in markets they are losing & once they establish dominance they shift from open to closed by adding many new restrictions on "partners."
AOL
Popular web portal which merged with Time Warner & then was spun back out.

API
Application Program Interface – a series of conventions or routines used to access software functions. Most major search products have an API program.

Arbitrage
Exploiting market inefficiencies by buying and reselling a commodity for a profit. As it relates to the search market, many thin content sites laced with an Overture feed or AdSense ads buy traffic from the major search engines and hope to send some percent of that traffic clicking out on a higher priced ad. Shopping search engines generally draw most of their traffic through arbitrage.
ASP
Active Server Pages – a Microsoft technology for dynamically generated web pages.
Ask
Ask is a search engine owned by InterActive Corp. They were originally named Ask Jeeves, but they dumped Jeeves in early 2006. Their search engine is powered by the Teoma search technology, which is largely reliant upon Kleinberg's concept of hubs and authorities.
Authority
The ability of a page or domain to rank well in search engines. Five large factors associated with site and page authority are link equity, site age, traffic trends, site history, and publishing unique original quality content.
Search engines constantly tweak their algorithms to try to balance relevancy algorithms based on topical authority and overall authority across the entire web. Sites may be considered topical authorities or general authorities. For example, Wikipedia and DMOZ are considered broad general authority sites. This site is a topical authority on SEO, but not a broad general authority.
Authorities
Topical authorities are sites which are well trusted and well cited by experts within their topical community. A topical authority is a page which is referenced by many topical experts and hub sites. A topical hub is a page which references many authorities.
Example potential topical authorities:

  • the largest brands in your field
  • the top blogger talking about your subject
  • the Wikipedia or DMOZ page about your topic

Automated Bid Management Software
Pay per click search engines are growing increasingly complex in their offerings. To help large advertisers cope with the increasing sophistication and complexity of these offerings, some search engines and third party software developers have created software which makes it easier to control your ad spend. Some of the more advanced tools can integrate with your analytics programs and help you focus on conversion, ROI, and earnings elasticity instead of just looking at cost per click.
Backlink (see Inbound Link)
Bait and Switch
Marketing technique where you make something look overtly pure or as though it has another purpose to get people to believe in it or vote for it (by linking at it or sharing it with friends), then switch the intent or purpose of the website after you gain authority.
It is generally easier to get links to informational websites than commercial sites. Some new sites might gain authority much quicker if they tried looking noncommercial and gaining influence before trying to monetize their market position.
Banner Blindness
During the first web boom many businesses were based on eyeballs more than actually building real value. Many ads were typically quite irrelevant and web users learned to ignore the most common ad types.
In many ways text ads are successful because they are more relevant and look more like content, but with the recent surge in the popularity of text ads some have speculated that in time people may eventually become text ad blind as well.
Behavioral Targeting
Ad targeting based on past recent experience and/or implied intent. For example, if I recently searched for mortgages then am later reading a book review the page may still show me mortgage ads.
A prejudice based on experiences or a particular worldview.
Any media channel, publishing format, organization, or person is biased by

  • how and why they were created and their own experiences
  • the current set of social standards in which they exist
  • other markets they operate in
  • the need for self preservation
  • how they interface with the world around them
  • their capital, knowledge, status, or technological advantages and limitations

Search engines aim to be relevant to users, but they also need to be profitable. Since search engines sell commercial ads some of the largest search engines may bias their organic search results toward informational (ie: non-commercial) websites. Some search engines are also biased toward information which has been published online for a great deal of time and is heavily cited.
Search personalization biases our search results based on our own media consumption and searching habits.
Large news organizations tend to aim for widely acceptable neutrality rather than objectivity. Some of the most popular individual web authors / publishers tend to be quite biased in nature. Rather than bias hurting one's exposure:

  • The known / learned bias of a specific author may make their news more appealing than news from an organization that aimed to seem arbitrarily neutral.
  • I believe biased channels most likely have a larger readership than unbiased channels.
  • Most people prefer to subscribe to media which matches their own biased worldview.
  • If more people read what you write and passionately agree with it then they are more likely to link at it.
  • Things which are biased in nature are typically easier to be cited than things which are unbiased.

Bid Management Software (see Automated Bid Management Software)
Bing
Microsoft's search engine, which also powers the organic search results on Yahoo! Search.

Bing Ads
Microsoft's paid search program, which rivals Google AdWords and powers paid search results on Yahoo! Search.
Black Hat SEO
Search engines set up guidelines that help them extract billions of dollars of ad revenue from the work of publishers and the attention of searchers. Within that highly profitable framework search engines consider certain marketing techniques deceptive in nature, and label them as black hat SEO. Those which are considered within their guidelines are called white hat SEO techniques. The search guidelines are not a static set of rules, and things that may be considered legitimate one day may be considered deceptive the next.
Search engines are not without flaws in their business models, but there is nothing immoral or illegal about testing search algorithms to understand how search engines work.
People who have extensively tested search algorithms are probably more competent and more knowledgeable search marketers than those who give themselves the arbitrary label of white hat SEOs while calling others black hat SEOs.
When making large investments in processes that are not entirely clear trust is important. Rather than looking for reasons to not work with an SEO it is best to look for signs of trust in a person you would like to work with.
Block Level Analysis
A method used to break a page down into multiple points on the web graph by dividing the page into smaller blocks.
Block level link analysis can be used to help determine if content is page specific or part of a navigational system. It also can help determine if a link is a natural editorial link, what other links that link should be associated with, and/or if it is an advertisement. Search engines generally do not want to count advertisements as votes.
Blog
A periodically updated journal, typically formatted in reverse chronological order. Many blogs not only archive and categorize information, but also provide a feed and allow simple user interaction like leaving comments on the posts.
Most blogs tend to be personal in nature. Blogs are generally quite authoritative with heavy link equity because they give people a reason to frequently come back to their site, read their content, and link to whatever they think is interesting.
The most popular blogging platforms are WordPress, Blogger, Movable Type, and Typepad.
Blog Comment Spam
Either manually or automatically (via a software program) adding low value or no value comments to other sites.
Automated blog spam:

Nice post!
Discreat Overnight Viagra Online Canadian Pharmacy Free Shipping

Manual blog spam:

I just wrote about this on my site. I don’t know you, but I thought I would add no value to your site other than linking through to mine. Check it out!!!!!
cluebag manual spammer (usually with keywords as my name)

As time passes both manual and automated blog comment spam systems are evolving to look more like legitimate comments. I have seen some automated blog comment spam systems that have multiple fake personas that converse with one another.
Blogger
Blogger is a free blog platform owned by Google.
It allows you to publish sites on a subdomain of blogspot.com, or to FTP content to your own domain. If you are serious about building a brand or making money online you should publish your content to your own domain, because it can be hard to reclaim a website's link equity and age related trust if you have built years of link equity into a subdomain on someone else's website.
Blogger is probably the easiest blogging software tool to use, but it lacks some features present in other blog platforms.
Blogroll
Link list on a blog, usually linking to other blogs owned by the same company or friends of that blogger.

Bold
A way to make words appear in a bolder font. Words that appear in a bolder font are more likely to be read by humans who are scanning a page. A search engine may also place slightly greater weighting on these words than on regular text, but if you write natural page copy and a word or phrase appears on a page many times, it probably does not make sense or look natural if you bold every occurrence.
Example use:

  • <b>words</b>
  • <strong>words</strong>

Either tag renders the text in bold.
Bookmarks
Most browsers come with the ability to bookmark your favorite pages. Many web based services have also been created to allow you to bookmark and share your favorite resources. The popularity of a document (as measured in terms of link equity, number of bookmarks, or usage data) is a signal for the quality of the information. Some search engines may eventually use bookmarks to help aid their search relevancy.
Social bookmarking sites are often called tagging sites. Del.icio.us is the most popular social bookmarking site. Yahoo! MyWeb also allows you to tag results. Google allows you to share feeds and / or tag pages. They also have a program called Google Notebook which allows you to write mini guides of related links and information.
There are also a couple of meta news sites that allow you to tag interesting pages. If enough people vote for your story then your story gets featured on the homepage. Slashdot is a tech news site primarily driven by central editors. Digg covers the same type of news, but is a bottom-up news site which allows readers to vote for what they think is interesting. Netscape cloned the Digg business and content model. Sites like Digg and Netscape are easy sources of links if you can create content that would appeal to those audiences.
Many forms of vertical search, like Google Video or YouTube, allow you to tag content.
See also:

  • del.icio.us – Yahoo! owned social bookmarking site
  • Yahoo! MyWeb – similar to del.icio.us, but more integrated into Yahoo!
  • Google Notebook – allows you to note documents
  • Slashdot – tech news site where stories are approved by central editors
  • Digg – decentralized news site
  • Netscape – Digg clone
  • Google Video – Google’s video hosting, tagging, and search site
  • YouTube – popular decentralized video site

Boolean Search
Many search engines allow you to perform searches using Boolean operators such as AND, OR, and NOT. By default most search engines include AND with your query, requiring results to be relevant for all the words in your query.

  • A Google search for SEO Book will return results for SEO AND Book.
  • A Google search for “SEO Book” will return results for the phrase SEO Book.
  • A Google search for SEO Book -Jorge will return results containing SEO AND Book but NOT Jorge.
  • A Google search for ~SEO -SEO will find results with words related to SEO that do not contain SEO.

Some search engines also allow you to search for other unique patterns or filtering ideas.

Brand
The emotional response associated with a company and/or product.
A brand is built through controlling customer expectations and the social interactions between customers. Building a brand is what allows businesses to move away from commodity based pricing and move toward higher margin value based pricing. Search engines may look at signals like repeat website visits & visits to a site based on keywords associated with a known entity and use them for relevancy signals in algorithms like Panda.
Branded Keywords
Keywords or keyword phrases associated with a brand or entity. Typically branded keywords occur late in the buying cycle, and are some of the highest value and highest converting keywords. These searches may be used as relevancy signals in algorithms like Panda.
Some affiliate marketing programs prevent affiliates from bidding on the core brand related keywords, while others actively encourage it. Either way can work depending on the business model, but it is important to ensure there is synergy between internal marketing and affiliate marketing programs.
Breadcrumb Navigation
Navigational technique used to help search engines and website users understand the relationship between pages.
Example breadcrumb navigation:
Home > SEO Tools > SEO for Firefox
Whatever page the user is on is unlinked, but the pages above it within the site structure are linked to, and organized starting with the home page, right on down through the site structure.
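The linking pattern above (every ancestor linked, current page unlinked) can be sketched as a small rendering helper. The page titles and URLs here are made up for illustration:

```python
# Hypothetical breadcrumb renderer: each (title, url) pair from the
# home page down to the current page becomes a link, except the
# last entry, which is left as plain text.

def breadcrumb(trail: list[tuple[str, str]]) -> str:
    """Render a breadcrumb trail as an HTML fragment."""
    parts = [f'<a href="{url}">{title}</a>' for title, url in trail[:-1]]
    parts.append(trail[-1][0])  # current page: unlinked
    return " &gt; ".join(parts)

print(breadcrumb([
    ("Home", "/"),
    ("SEO Tools", "/tools/"),
    ("SEO for Firefox", "/tools/firefox/"),
]))
# → <a href="/">Home</a> &gt; <a href="/tools/">SEO Tools</a> &gt; SEO for Firefox
```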
Broken Link
A hyperlink which is not functioning. A link which does not lead to the desired location.
Links may be broken for a number of reasons; four of the most common are:

  • a website going offline
  • linking to content which is temporary in nature (due to licensing structures or other reasons)
  • moving a page’s location
  • changing a domain’s content management system

Most large websites have some broken links, but if too many of a site’s links are broken it may indicate outdated content and give website users a poor experience, both of which may cause search engines to rank a page as less relevant.
Xenu Link Sleuth is a free software program which crawls websites to find broken links.
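A link checker in the spirit of Xenu can be sketched with the standard library: request each URL and flag error responses. The URLs below are purely illustrative, and treating any 4xx/5xx status as "broken" is a simplifying assumption:

```python
# Minimal broken-link checker sketch: issue a HEAD request per URL
# and report any that return an HTTP error status.
import urllib.error
import urllib.request

def check_link(url: str, timeout: float = 10.0) -> int:
    """Return the HTTP status code for url (e.g. 200, 404)."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # 404, 410, 500 ... are all reachable here

def is_broken(status: int) -> bool:
    """Treat any 4xx/5xx response as a broken link."""
    return status >= 400

if __name__ == "__main__":
    for url in ["https://example.com/", "https://example.com/no-such-page"]:
        status = check_link(url)
        print(("BROKEN " if is_broken(status) else "OK     ") + url)
```

A production crawler would also follow redirects, retry transient failures, and respect robots.txt, which this sketch omits.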
Browser
Client used to view the world wide web.
The most popular browsers are Microsoft’s Internet Explorer, Mozilla’s Firefox, Safari, and Opera.
Bush, Vannevar
WWII scientist who wrote a seminal research paper on the concepts of hypertext and a memory extension device, titled As We May Think.
Business.com
A well-trusted directory of business websites and information. Business.com is also a large pay-per-click arbitrage player.
Buying Cycle
Before making large purchases consumers typically research what brands and products fit their needs and wants. Keyword-based search marketing allows you to reach consumers at any point in the buying cycle. In many markets branded keywords tend to have high search volumes and high conversion rates.
The buying cycle may consist of the following stages:

  • Problem Discovery: prospect discovers a need or want.
  • Search: after discovering a problem, the prospect looks for ways to solve the need or want. These searches may contain words which revolve around the core problem the prospect is trying to solve or words associated with their identity.
  • Evaluate: the prospect may do comparison searches to compare different models, and may also search for negative information such as product sucks.
  • Decide: the prospect looks for information which reinforces their view of the product or service they decided upon.
  • Purchase: the prospect may search for shipping-related or other price-related information. Purchases may also occur offline.
  • Reevaluate: some people leave feedback on their purchases. If a person is enthusiastic about your brand they may cut your marketing costs by providing free, highly trusted word-of-mouth marketing.

See also:
Waiting for Your Cat to Bark? Persuading Customers When They Ignore Marketing – book by Bryan & Jeffrey Eisenberg about the buying cycle.


Cache
Copy of a web page stored by a search engine. When you search the web you are not actively searching the whole web, but are searching files in the search engine index.
Some search engines provide links to cached versions of pages in their search results, and allow you to strip some of the formatting from cached copies of pages.
Calacanis, Jason
Founder of Weblogs, Inc. Also pushed AOL to turn Netscape into a Digg clone & then launched long-tail SEO play Mahalo.
Canonical URL

Many content management systems are configured with errors which cause duplicate or exceptionally similar content to get indexed under multiple URLs. Many webmasters use inconsistent link structures throughout their site that cause the exact same content to get indexed under multiple URLs. The canonical version of any URL is the single most authoritative version indexed by major search engines. Search engines typically use PageRank or a similar measure to determine which version of a URL is the canonical URL.
Webmasters should use consistent linking structures throughout their sites to ensure that they funnel the maximum amount of PageRank at the URLs they want indexed. When linking to the root level of a site or a folder index it is best to end the link location at a / instead of placing the index.html or default.asp filename in the URL.
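The trailing-slash advice above amounts to normalizing URL variants to one canonical form. A minimal sketch, assuming a few common normalization rules (strip default index filenames, lowercase the host, drop "www.") that are illustrative choices rather than any search engine's actual logic:

```python
# Hypothetical URL canonicalization: collapse common duplicate
# variants of the same page to one consistent address.
from urllib.parse import urlsplit, urlunsplit

# Directory-index filenames to strip, ending the path at "/".
DEFAULT_FILES = {"index.html", "index.htm", "default.asp", "index.php"}

def canonicalize(url: str) -> str:
    scheme, netloc, path, query, _frag = urlsplit(url)
    netloc = netloc.lower()
    if netloc.startswith("www."):
        netloc = netloc[4:]
    parts = path.split("/")
    if parts and parts[-1].lower() in DEFAULT_FILES:
        parts[-1] = ""  # /tools/index.html -> /tools/
    path = "/".join(parts) or "/"
    return urlunsplit((scheme, netloc, path, query, ""))

print(canonicalize("http://www.Example.com/index.html"))
# → http://example.com/
```

Whether "www." or the bare host is canonical is a per-site decision; what matters is picking one form and linking to it consistently.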
Examples of URLs which may contain the same information in spite of being at different web addresses: