Search engine optimization (SEO) is the process of affecting the
visibility of a website or a web page in a search engine's unpaid results, often referred to as "natural," "organic," or "earned" results
...
SEO may target different kinds
of search, including image search, local search, video search, academic
search,[1] news search and industry-specific vertical search engines
...
Optimizing a website may involve editing its content, HTML, and associated coding both to increase its relevance to specific keywords and to remove barriers to the indexing activities of search engines
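As a concrete sketch of the kind of on-page editing meant here (the page, keyword, and file names below are invented purely for illustration):

    <head>
      <!-- Relevance: the title names the keyword the page targets. -->
      <title>Hand-made Oak Furniture | Example Workshop</title>
    </head>
    <body>
      <!-- Indexing barrier removed: the heading is real text rather
           than an image of text, so crawlers can read it. -->
      <h1>Hand-made Oak Furniture</h1>
      <!-- The image that remains carries descriptive alt text. -->
      <img src="workbench.jpg" alt="Oak workbench at the Example Workshop">
    </body>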
...
Initially, all webmasters needed to do was submit the address of a page, or URL, to the various engines, which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed
...
Site owners started to recognize the value of having their sites highly
ranked and visible in search engine results, creating an opportunity for
both white hat and black hat SEO practitioners
...
Sullivan credits Bruce Clay as being
one of the first people to popularize the term
...
"
Early versions of search algorithms relied on webmaster-provided
information such as the keyword meta tag, or index files in engines
like ALIWEB
...
Using metadata to index pages was found to be less than reliable, however,
because the webmaster's choice of keywords in the meta tag could
potentially be an inaccurate representation of the site's actual content
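The keyword meta tag mentioned above looked roughly as follows (a hypothetical page; nothing prevented the listed keywords from misrepresenting the actual content, which is exactly the weakness described):

    <head>
      <title>Acme Widgets</title>
      <!-- Webmaster-supplied keywords of the kind early engines indexed. -->
      <meta name="keywords" content="widgets, cheap widgets, buy widgets online">
      <meta name="description" content="Acme's catalogue of hand-made widgets.">
    </head>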
...
[6] Web content
providers also manipulated a number of attributes within the HTML
source of a page in an attempt to rank well in search engines
...
To provide better results to their
users, search engines had to adapt to ensure their results pages showed
the most relevant search results, rather than unrelated pages stuffed
with numerous keywords by unscrupulous webmasters
...
Search engines responded by developing more complex ranking
algorithms, taking into account additional factors that were more difficult
for webmasters to manipulate
...
Early search engines, such as AltaVista and Infoseek, adjusted their algorithms in an effort to prevent webmasters from manipulating rankings
...
[9]
Companies that employ overly aggressive techniques can get their client
websites banned from the search results
...
[10] Wired magazine reported that the same company sued
blogger and SEO Aaron Wall for writing about the ban
...
[12]
Some search engines have also reached out to the SEO industry, and
are frequent sponsors and guests at SEO conferences, chats, and
seminars
...
[13][14] Google has a Sitemaps program to help
webmasters learn if Google is having any problems indexing their
website and also provides data on Google traffic to the website
...
Relationship with Google
In 1998, graduate students at Stanford University, Larry
Page and Sergey Brin, developed "Backrub," a search engine that relied
on a mathematical algorithm to rate the prominence of web pages
...
[16] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web and follows links from one page to another
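The random-surfer idea is conventionally written as the following recurrence (a standard formulation, not taken from these notes; d is the "damping factor", typically set to about 0.85, the probability that the surfer keeps following links rather than jumping to a random page):

    PR(p_i) = \frac{1 - d}{N} + d \sum_{p_j \in M(p_i)} \frac{PR(p_j)}{L(p_j)}

where N is the total number of pages, M(p_i) is the set of pages linking to p_i, and L(p_j) is the number of outbound links on page p_j.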
...
Page and Brin founded Google in 1998
...
[18] Off-page factors (such as PageRank and hyperlink
analysis) were considered as well as on-page factors (such as keyword
frequency, meta tags, headings, links and site structure) to enable
Google to avoid the kind of manipulation seen in search engines that
only considered on-page factors for their rankings
...
Many
sites focused on exchanging, buying, and selling links, often on a
massive scale
...
[19]
By 2004, search engines had incorporated a wide range of undisclosed
factors in their ranking algorithms to reduce the impact of link
manipulation
...
[20] The leading
search engines, Google, Bing, and Yahoo, do not disclose the
algorithms they use to rank pages
...
[21] Patents related to search engines can provide information to better understand how search engines work
...
Depending on a user's history of previous searches, Google crafted results for logged-in users
...
He opined that it would become
meaningless to discuss how a website ranked, because its rank would
potentially be different for each user and each search
...
[25] On June 15, 2009, Google disclosed that they had taken
measures to mitigate the effects of PageRank sculpting by use of
the nofollow attribute on links
...
[26] As a result of this change, the use of nofollow leads to the evaporation of PageRank
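In markup, the sculpting technique relied on the rel="nofollow" attribute (the URLs below are hypothetical); after the 2009 change, the share of PageRank such a link would have passed is simply discarded rather than redistributed to the page's remaining links:

    <!-- An ordinary link: passes PageRank to its target. -->
    <a href="https://www.example.com/partner">Partner site</a>

    <!-- A nofollow link: asks engines not to count the link as an
         endorsement; since 2009 its share of PageRank "evaporates"
         instead of flowing to the page's other links. -->
    <a href="https://www.example.com/forum-comment" rel="nofollow">User comment</a>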
...
Additionally, several solutions have been suggested that include the use of iframes, Flash, and JavaScript
...
[28]
On June 8, 2010, a new web indexing system called Google Caffeine was announced
...
According
to Carrie Grimes, the software engineer who announced Caffeine for
Google, "Caffeine provides 50 percent fresher results for web searches
than our last index
...
Historically, site administrators have spent months or even years optimizing a website to increase search rankings
...
[30]
In February 2011, Google announced the Panda update, which
penalizes websites containing content duplicated from other websites
and sources
...
[31] The 2012 Google Penguin update attempted to
penalize websites that used manipulative techniques to improve their
rankings on the search engine,[32] and the 2013 Google
Hummingbird update featured an algorithm change designed to improve
Google's natural language processing and semantic understanding of
web pages
...
In this diagram, where each bubble represents a website, programs sometimes called spiders examine which sites link to which other sites, with arrows representing these links
...
In
this example, since website B is the recipient of numerous inbound links, it ranks
more highly in a web search
...
(Note: the percentages shown in the diagram are rounded.)
...
Pages that are linked from other pages already in a search engine's index do not need to be submitted, because they are found automatically
...
[33] Google offers Google
Webmaster Tools, for which an XML Sitemap feed can be created and
submitted for free to ensure that all pages are found, especially pages
that are not discoverable by automatically following links
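A minimal Sitemap feed of the kind described, following the public sitemaps.org protocol (the URLs are invented for illustration):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2015-01-01</lastmod>
      </url>
      <!-- A page no other page links to: crawlers following links would
           never find it, so the sitemap is what surfaces it. -->
      <url>
        <loc>https://www.example.com/orphan-page.html</loc>
      </url>
    </urlset>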
...
[36]
Search engine crawlers may look at a number of different factors
when crawling a site
...
Distance of pages from the root directory of a site may also be a factor in
whether or not pages get crawled
...
robots.txt file in the root directory of the domain
...
When a search engine visits a site, the robots.txt
...
The robots.txt
...
As a search engine crawler may keep a cached
copy of this file, it may on occasion crawl pages a webmaster does not
wish crawled
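A small robots.txt of the kind described, placed at the root of the domain (the paths are invented for illustration):

    # Applies to every crawler that honours the protocol.
    User-agent: *
    # Keep internal search result pages out of the index
    # (the sort of page Google's 2007 warning below concerns).
    Disallow: /search
    # Keep a private directory from being crawled.
    Disallow: /private/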
...
In March 2007, Google
warned webmasters that they should prevent indexing of internal search
results because those pages are considered search spam
...
Cross-linking between pages of the same website to
provide more links to important pages may improve its
visibility
...
[39] Updating content so as to keep search engines
crawling back frequently can give additional weight to a site
...
URL normalization of web pages accessible via multiple URLs, using the canonical link element[40] or via 301 redirects, can help make sure links to different versions of the URL all count towards the page's link popularity score
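For example (invented URLs), a page reachable both at its plain address and via query-string variants can declare a single preferred version:

    <!-- Served on https://www.example.com/widgets?sort=price and every
         other variant of the page: the canonical link element points
         engines at one preferred URL, so links to all the variants
         consolidate into a single link popularity score. -->
    <head>
      <link rel="canonical" href="https://www.example.com/widgets">
    </head>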
...
The search engines
attempt to minimize the effect of the latter, among them spamdexing
...
[41] White hats tend to produce results that last a long time,
whereas black hats anticipate that their sites may eventually be banned
either temporarily or permanently once the search engines discover
what they are doing
...
As the search engine
guidelines[13][14][43] are not written as a series of rules or commandments,
this is an important distinction to note
...
White hat advice is generally summed up as creating content for
users, not for search engines, and then making that content easily
accessible to the spiders, rather than attempting to trick the algorithm
from its intended purpose
...
Black hat SEO attempts to improve rankings in ways that are
disapproved of by the search engines, or involve deception
...
Another
method gives a different page depending on whether the page is being
requested by a human visitor or a search engine, a technique known
as cloaking
...
This is in between the black hat and white hat approaches: the methods employed avoid the site being penalised, but they do not act in producing the best content for users, being instead entirely focused on improving search engine rankings
...
Such penalties can be applied either
automatically by the search engines' algorithms, or by a manual site
review
...
[45] Both companies, however, quickly apologized, fixed the
offending pages, and were restored to Google's list
...
[47] A successful Internet marketing campaign may also depend
upon building high quality web pages to engage and persuade, setting
up analytics programs to enable site owners to measure results, and
improving a site's conversion rate
...
However, search
engines are not paid for organic search traffic, their algorithms change,
and there are no guarantees of continued referrals
...
[49] Search engines can change their algorithms, impacting a
website's placement, possibly resulting in a serious loss of traffic
...
In 2010, Google made over 500 algorithm changes – almost 1.5 per day
...
[51]
International markets
Optimization techniques are highly tuned to the dominant search
engines in the target market
...
In 2003, Danny
Sullivan stated that Google represented about 75% of all searches
...
[53] As
of 2006, Google had an 85–90% market share in Germany
...
[54] As of June 2008, the market share of Google in the UK was close to 90% according to Hitwise
...
As of 2009, there are only a few large markets where Google is not the
leading search engine
...
The most notable example markets are China, Japan, South Korea, Russia, and the Czech Republic, where, respectively, Baidu, Yahoo! Japan, Naver, Yandex, and Seznam are the market leaders
...
Otherwise, the fundamental elements of
search optimization are essentially the same, regardless of language
...
SearchKing's claim was that Google's tactics to
prevent spamdexing constituted a tortious interference with contractual
relations
...
"[56][57]
In March 2006, KinderStart filed a lawsuit against Google over search
engine rankings
...
On March 16, 2007, the United States District Court for the Northern
District of California (San Jose Division) dismissed KinderStart's
complaint without leave to amend, and partially granted Google's motion
for Rule 11 sanctions against KinderStart's attorney, requiring him to pay
part of Google's legal expenses