SEO Unleashed… - Infosys

SEO Unleashed…
WHITE PAPER
Abstract
This white paper mainly deals with the concepts and strategies for Search Engine
Optimization (SEO). It starts by discussing the importance of SEO, how it has
evolved over the years and what lies in the future. Based on various SEO project
experiences, a detailed SEO approach has also been recommended for the
websites. The white paper also covers various best practices that a site should
employ or follow for the web site optimization, keyword optimization, promotion
and Link building strategies in detail. It discusses how Social media marketing and
SEO are inseparable and some of the latest trends in the SEO space. This white
paper can be used as a guide or best practice document for any website that wants
to achieve a higher ranking in the web search engine results pages.
Introduction - What is Search Engine Optimization?
Organizations spend millions of dollars and work hard to build great websites, but fail to attract visitors to them. There are millions of pages of web content on the World Wide Web, and a website can get totally lost in the shuffle, just like the proverbial needle in the haystack. A site can become non-existent in the cyber arena if the search engines ignore it, thereby failing to harness the most cost-effective and powerful Internet marketing strategy there is: Search Engine Optimization (SEO). SEO is the process whereby a website, or more specifically a web page or document, is constructed or amended to improve its placement among the search engine's organic results.
Good search engine optimization will ensure that a web page appears higher in the search engine results for a range of relevant, specific and valuable search terms or queries. SEO can also target different kinds of searches, including image search, local search, and industry-specific vertical search engines. SEO efforts may involve a site's coding, presentation, and structure, as well as fixing problems that could prevent search engine indexing programs from fully spidering a site. The higher your website ranks in the results of Google, Yahoo, Bing etc., the greater the chance that your site will be visited by many, which in turn boosts sales and brings a higher return on investment than comparable forms of marketing. Observing sensible search engine optimization procedures can also make a site easier for the search engines to index and more accessible to users. For example, if you have built a website for buying and selling property, you need to ensure that your site gets listed in the top search results for priority keywords like "buy property" or "sell property"; otherwise you will fail to attract internet visitors to your site, in turn losing out on conversions and sales.
As you can see from the snapshot above, Google retrieved around 668,000,000 websites having the keywords "property buy or sell" in some form or other in their content. You will be at a great loss if the site you have built is nowhere in the top search results.
SEM vis-à-vis SEO
As seen from the search result snapshot above, there are two types of search results. The ones highlighted in the black box are known as organic search results and are driven by the SEO strategies implemented by the website. The results shown as advertisements, located at the top of the search results or on their right-hand side, are paid search results/advertisements and fall in the Search Engine Marketing (SEM) space. Paid inclusion, pay per click and pay per view are different modes in which SEM operates.
• Paid inclusion – The site's pages are cataloged and rendered by the search site when they closely match an organic search.
• Paid placement – Websites pay to have their pages shown in response to a particular search word entered, regardless of how closely the page matches the entered term; this is usually executed as part of a bidding process run by the search engines. Google AdWords, Yahoo! Advertising and Microsoft adCenter are examples of paid-placement search engine marketing. Payment models range from pay per click to pay per view to pay per conversion.
Organic SEO is better for long-term and lower-margin campaigns, but it is difficult to target the local market with organic SEO. SEM, on the other hand, is costly per visitor or conversion; it is better for short-term and high-margin campaigns and definitely has the ability to target local markets. SEO requires ongoing learning and optimization and can take 2-4 months to show tangible results. SEM, in contrast, is instantaneous and can produce immediate results, as well as improvements on the organic results. As a rule of thumb for any new website, it is recommended to start with SEM and in parallel focus on SEO strategies; once you reach the limits of SEO, you can stop the paid results or advertisements. This white paper covers only the concepts and strategies related to SEO and does not focus on SEM strategies.
Positioning is important… we click where our eyes dwell.
Many studies done in the past (iProspect, Forrester Research, Eyetools etc.) have found that:
• 81% of internet users find products & services by searching in popular search engines like Google, Yahoo and Bing
• A top ranking results in 40% more traffic
• Nearly 30% of searchers click on one of the first 4-5 organic search results
• Over 50% click on only the first two or three paid advertisements
• 40% of people don't click through to the second page; 55% won't look past the first 2 pages of the search results
• If ranked below #10, your click-through rate falls dramatically and you are nowhere in the game
• Natural search results are 80% less costly and 65% more effective than paid search
Considering all the above, it is quite obvious how much importance a website should give to SEO. In a nutshell, SEO brings:
• A high return on investment, as it is the most cost-effective form of internet marketing
• Increased online brand value through greater visibility and popularity in the cyber world
• More online customers and relevant, targeted traffic to the site
• Increased leads, sales, conversions and higher click-through rates (CTRs)
• Increased accessibility
How has SEO evolved?
The driving factors for higher search rankings have evolved from "quantity of web content" to "quality and quantity of web content" to "inbound links from trusted websites" and now to the level of "personalization and social graph of the searcher".
The table below provides a view of how SEO has evolved over the years and the factors an organization should address to achieve higher rankings in the top search engines.
Timeline — Factors impacting search ranking

Before 2000
• Lots of web page content
• Search-spider friendly
• Keyword-rich content

From 2000 to 2010
• Quality, keyword-rich web page content
• Search-spider friendly
• Lots of inbound links from trusted websites
• Promote content to get back links

2010 and beyond
• Quality, keyword-rich web page content
• Search-spider friendly
• Lots of inbound links from trusted websites
• Promote content to get links
• Browsing and searching history of the user
• Searcher's demographics and location
• Social following for the content/website
• Engage social followers
• Latest trends and happenings
What do Search spiders like and dislike?
Below is a list of the SEO likes and dislikes of major search engines like Google, Yahoo and Bing.

Likes:
• Correct and validated HTML
• Static pages with keyword-rich content
• Importance given to meta tags like name, description, comments, robots
• Keyword density and proximity in body text, title tags, meta tags and inbound links
• Quality inbound links with appropriate link text
• A well-designed site map or quick-links section for dynamic pages
• Usage of a robots.txt file
• SEO-compliant URLs
• Page size < 100-120 KB
• Fewer than 100 links on one page

Dislikes:
• Frames
• Dynamic pages
• Excessive usage of Flash, DHTML, JavaScript etc.
• Duplicate content
• Keyword spamming
• Cloaking and redirects
• Hidden text and links
• Link farms and linking to untrusted websites
Approach to SEO
Any SEO engagement will have different phases based on whether the site is new or existing, and on the intent of the website:
• Research the site, customers, competitors, domain etc. and identify the set of priority targeted keywords
• Make the website's content SEO compliant by targeting the set of priority keywords in tags such as title, meta, image, URLs, anchor texts, body and header tags, with proper keyword density and proximity
• Design and implement the website to make it search-spider friendly: SEO-compliant URLs, site maps, navigation links, redirects, robots.txt etc.
• Earn higher rankings by promoting the website, running inbound-linking programs and getting involved in social media websites to increase the social ranking of the website
• Continuously monitor the site for search rankings and plan for continuous optimizations based on the competition and the requirements of the search engines
SEO work can be divided into 4 buckets:
1. Keyword and Content Optimization, which mostly deals with identifying the right set of priority keywords for the website and planning and writing the HTML content accordingly. Some of the SEO activities in this phase include:
a. Understanding the customer's online business strategy
b. Researching the market category, customers and competitors
c. Keyword research and selection to ensure the customer's website achieves a greater hit rate
2. Website Optimization, which deals with having search-friendly HTML pages, utilizing the HTML tags to make them keyword rich, generating SEO-friendly URLs and following a website design that ensures the maximum textual content gets indexed by the search engine. Some of the important activities in this bucket include:
a. Writing page title, description and keyword meta tags to ensure a high relevancy ranking
b. Incorporating the website's relevant keywords in the site URL and cross-linking URLs (URL optimization)
c. Writing keyword-rich body text, titles, image tags and outbound links
d. Building internal links to help the search engines navigate the site
e. Working with mod_rewrite - rewriting dynamic URLs
f. Bot-friendly navigation to ensure search engines can crawl the entire site
g. Generating sitemaps to enable indexing of dynamic pages or content pages that are not well linked
3. Promotion strategy and Indexation, which deals with submission of sitemaps and URLs to the search engines, local search providers, directories etc. It also covers viral marketing, where organizations focus on building more inbound links from trusted websites, which in turn increases the search ranking of the website. Activities to be done as a part of this phase are:
a. Building inbound and outbound links with proper link density and link age
b. Building inbound links and using keywords in the text of page links that refer to other pages on the customer site and to any external Internet resources
c. Writing anchor text to provide additional relevance to the quality of a link
d. Using keywords in site navigation menu links to give additional significance to the pages to which the links refer
4. Monitoring, Maintenance and Further Optimization, which involves continuous monitoring of the website for rankings specific to the keywords, and making sure keyword, content and website optimizations are performed on a regular basis to keep the site up in the rankings. Promotion strategies also play an important role in this bucket, as the quality and quantity of inbound links is vital for search ranking. Some activities related to this bucket include:
a. Analysing and responding to site traffic and user feedback once the website has been optimized
b. Issuing press releases on news sites with an exact description of the site, containing relevant keywords and the site URL; these releases are crawled by the search engine spiders along with the main URL
c. Using social networking tools, participating in blogs and leaving inbound links
d. Fine-tuning the website - effective SEO is a continuous activity
e. Submitting the website for free to different search engines and directories (e.g. Google, Yahoo, Gigablast, DMOZ and other open directories)
[Figure: SEO Approach – typical activities across the four phases]

Keywords & Content optimization:
• Research the market, category, customers and competitors online
• Keyword research using online tools, search logs, analytics, the tag-cloud features of search tools, thesauruses and domain SMEs
• Assess the competition for the keyword(s)
• Keyword placement, proximity and density; keywords in URLs
• Write relevant content with proper keyword density

Website Optimization:
• Utilize tags (meta keywords, title, meta description, ALT for image files, comment, header)
• Effective usage of HTML/CSS code; limit usage of technology blocking SEO (Flash, frames, JavaScript, multimedia files); maximize robot coverage; internal linking approaches
• SEO-friendly URLs; generate sitemaps
• SEO page-selection strategies, 301 redirects, directory structure and naming conventions, link count, page size
• SEO geo-targeting guidelines – local hosting, targeted domain names, geo-compliant URLs, local languages, local inbound and outbound links

Promotion strategies and Site indexation:
• Submit sitemaps, URLs and details to search engines, local search engines, map services, yellow pages and directories
• Build inbound and outbound links using press releases, affiliate feeds, RSS feeds, blogs, podcasts, wikis, widgets, social networking sites, publishing tools, classified advertising, forums, broadcasting etc.
• Paid-inclusion and pay-per-click programs: target/choose specific keywords and bid for them in online SEM tools; specify URLs for the SEM or build creative copy for the selected keywords; decide the strategy for the SEM duration (paid inclusion, PPC, PPV, PPA etc.), which search engines to target and when to quit the SEM strategies

Monitoring, Maintenance and further optimizations:
• Monitor the site every week post go-live and suggest further fine-tuning or modifications required to improve rankings and visibility
• SEO monitoring, tracking and reporting for obstacle analysis, PageRank/competitive-position analysis, link analysis (inbound and outbound links), site indexation, keyword and content statistics, log analysis, traffic from the search engines etc.
• Continuous monitoring for SEO and SEM; suggest further optimizations to the content writers, development team, marketing team and the SEM agency
• Evaluation and recommendation of OOTB SEO and web-analytics monitoring/measurement tools if required
SEO best practices for keywords, content and website optimization
The website architecture, the website code, the keywords, the content, the technology – they all need to be search-engine friendly and optimized in order to achieve higher rankings in the search results. Below are the areas of SEO optimization for any website:
• Page title
• Meta description
• Meta keywords
• Anchor text
• Navigation links
• Inbound and outbound links
• Body content
• Keyword density and proximity
• Internal links
• Heading (H1, H2) tags
• Images (ALT tag)
• Comment tag
• Site map
• Quick links
• Domain names
• URLs
• File names
• Directory structure
• Maximum robot coverage
• Link count
• Page size
• Validated HTML
• Faster loading of the web page
Some of the SEO best practices around these areas are:
• Title tag: Each web page should have a different and relevant title. Title tags influence both relevancy and ranking. The title tag must contain the target keywords identified for the website, as well as the intent of that specific web page. (Note: this assumes significance since most search engines use the text of the title tag as the title of the page in the search results.) Title tags should follow a consistent format across the whole site or, if that is not possible, across sub-sections of the site. Using a different title-tag format for different pages will only confuse the user on SERPs, whereas consistency in titles helps build strong brand recognition.
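As a sketch, a page targeting the keyword "buy property" might carry a title like the following (the site name and keyword placement are hypothetical examples, not taken from any real site):

```html
<head>
  <!-- Hypothetical example: target keyword first, page intent next,
       and a consistent site-name suffix used across the whole site -->
  <title>Buy Property in London - Listings and Prices | AcmeRealty</title>
</head>
```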
• Meta tags: There are different types of meta tags in HTML – name, description, comments, robots etc. – which need to be SEO compliant.
  • keywords: This may influence the keyword index of some search engines. Example: <META NAME="keywords" CONTENT="Search engine marketing, SEO, rankings, search engine placement, SERP" />
  • description: The description tag allows you to influence the description of your page in the crawlers that support the tag. Most search engines list the description of the page as fetched from the meta-description tag. The meta description must contain the targeted keywords; also ensure that each web page has a unique and relevant meta description. For example: <META NAME="description" CONTENT="This is a best practice article on Search Engine Optimization - the process of improving the web site elements to ensure higher search engine rankings">
  • comments: The comments tag is meant for notes about the page, specifically about the intent of the page and what it contains. Some spiders crawl it for keywords.
  • robots: Use appropriate "index" and "follow" instructions in the robots meta tag.
  • <meta name="robots" content="index, follow"> - Use this if the crawler should index this page and continue to crawl the links on the page. Example: the home page.
  • <meta name="robots" content="noindex, follow"> - Use this if the crawler should ignore this page but continue to crawl the links on it. Example: a site map.
  • <meta name="robots" content="index, nofollow"> - Use this if the crawler should index this page but stop crawling the links on it. Example: a customer listing with links to individual customer contact details.
  • <meta name="robots" content="noindex, nofollow"> - Use this if the crawler should ignore this page and stop crawling the links on it. Example: a page with sensitive information linking to classified information.
For dynamically generated pages, populate the META tags dynamically as well, and ensure they are relevant and unique to each generated page.
• Header (H1, H2, H3) tags - The H(x) tags in HTML (H1, H2, H3, etc.) are designed to indicate a headline hierarchy in a document. Thus, an H1 tag might be considered the headline of the page as a whole, whereas H2 tags would serve as subheadings, H3s as tertiary-level headlines, and so forth. The search engines have a slight preference for keywords appearing in heading tags, notably the H1 tag (the most important of these to employ). The title tag of the page, containing the important keywords, can be used as the H1 tag. However, if the title tag is long, a more focused, shorter heading using the most important keywords from the title tag can be used instead. This helps reinforce the title tag and the target keywords.
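For instance, a page whose title tag targets "buy property" might reinforce it in its heading hierarchy as follows (a hypothetical sketch):

```html
<!-- H1 echoes the most important keywords from the title tag -->
<h1>Buy Property in London</h1>

<!-- H2 subheadings carry secondary keywords for the page sections -->
<h2>Apartments for Sale</h2>
<h2>Houses for Sale</h2>
```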
• Image file name and ALT tags - The filename of an image should describe the content of the image. If the image is of a laptop, then instead of naming it random123.jpg it should be named laptop.jpg. Image alt text should be used to provide information about the image. Ideally, if the image is related to the content, it also gives a chance to use targeted keywords. The alt text and filename help reinforce the keywords and also help drive traffic through image search.
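A descriptive filename and alt text together might look like this (a hypothetical sketch; the path and wording are illustrative):

```html
<!-- Filename and alt text both describe the image and reuse a target keyword -->
<img src="/images/london-apartment-exterior.jpg"
     alt="Exterior of a two-bedroom apartment for sale in London" />
```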
• Body tag optimization - Keyword density is the percentage of times a keyword or phrase appears on a web page compared to the total number of words on the page. In the context of search engine optimization, keyword density can be used as a factor in determining whether a web page is relevant to a specified keyword or keyword phrase. For example, a 250-word page that uses a keyword 5 times has a keyword density of 2%.
  • It is impossible to pinpoint the exact, optimal number of times to employ a keyword term/phrase on a page.
  • As a rule of thumb, keywords can be used 2 to 3 times on short pages, 4 to 6 times on longer ones, and never more than makes sense in the context of the copy.
  • It is recommended that the target keywords appear early in the document.
  • Keywords should not be stuffed into the document text just to increase the keyword density. Keyword stuffing is considered spamming, and search engines tend to penalize such tactics.
  • Keywords in document text have very little or no effect on page rankings on their own, so the document text should be written with the convenience of the user in mind, not the search engine.
  • On the other hand, totally ignoring the keywords in the document text is also a mistake; ensure the keywords are included at least once or twice.
• Navigation and Linking - Ensure all the web pages are internally linked within the website. Provide links directly to every important page from the home page or landing page; this increases the probability of the spiders following all the links from the home page and ensures 100% content indexing. Submit the home page to search engines and, alternatively, create a sitemap and submit it to the search engines. (Most search engines provide screens where a user can submit a website address to be crawled, as well as a place where XML sitemaps can be submitted.)
• Only textual content gets indexed by the search engine - Minimize the usage of frames, splash pages, DHTML, Java applets, Flash, rich media, XML and other plug-ins, since search spiders will not be able to decipher these. When a site is heavy in such content, ensure the usage of appropriate parallel text pages or meta tags to describe the graphics or the spider-inaccessible sections of the website.
• Usage of robots.txt - Use the robots.txt file on the web server to define the crawlable areas of the site. This file is generally placed at the root level of the web server. Specify the parts of the server that are off limits to the search engine spider, and place pages/directories in the crawlable and non-crawlable folders accordingly.
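A minimal robots.txt along these lines might look as follows (the directory names are hypothetical):

```
# Rules apply to all crawlers
User-agent: *
# Keep non-public areas out of the index
Disallow: /admin/
Disallow: /checkout/
# Everything not disallowed remains crawlable by default
```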
• Registration/login agnostic for the search spider - Be aware that a spider cannot crawl past a registration wall, so place registration only where it is required. Conversely, you can also use a registration wall to prevent spiders from accessing sensitive or confidential pages.
• Domain Name - If the domain name contains the main target keywords for which the site is to be optimized, it helps a lot in SEO. If a site is being built from scratch, the domain name should contain the targeted keyword or keywords. If the domain name contains an exact (or partial) match to a search query, chances are it will show up on the first page of the SERPs, at least on Google, which gives keyword-matching domains preferential treatment.
• Page Load Speed - Site speed is now an official ranking factor for Google, and other search engines are likely to follow suit. This means that a page that loads slowly will be given a lower ranking than a page that loads fast, all other things being equal. The site should be optimized for speed, and loading times should be kept as low as possible.
• SEO-friendly Flash implementation guidelines - Until recently, search engine bots did not have the capability to read Flash, but new developments in search-bot technology allow search engines to read text and hyperlinks inside Flash content.
  • Flash should be used in moderation. Even though search bots have acquired some capability to index Flash content, the basics of SEO are missing from Flash. The Adobe Search Engine SDK allows a developer to see how the Flash content appears to a search engine; it should be used to test the crawlability of the Flash content.
  • New versions of Flash support meta tags. These meta tags should be used generously to describe the Flash content.
  • Search engines currently do not read traced text (using the trace() function) or text that has been transformed into a shape in Flash (as opposed to actual characters). Only character-based text that is active on the Flash stage will be read.
  • Search engines have the ability to follow links in Flash, but it is not guaranteed that they will. They will not, however, follow links to other Flash .swf files. (This is different from loading child .swf files into a parent .swf file.) Therefore, links in Flash should always point to HTML pages, not to other .swf files.
  • Use of SWFObject: SWFObject is Flash-detection code written in JavaScript that checks whether a browser has the Flash plug-in. If the browser has the plug-in, the .swf file is displayed following that detection. If the browser does not have the Flash plug-in, or lacks the JavaScript to detect it, the primary alternative content contained within the <div> tags is displayed instead. The key here is that search engine spiders do not render the JavaScript; they read the primary content in the <div> tags. SWFObject should be used to mirror the content of your Flash .swf file exactly. It should not be used to add content, keywords, graphics, or links that are not contained in the file, as doing so can result in a ban from the search engine's index. The SWFObject JavaScript can be downloaded from http://code.google.com/p/swfobject/
  • Use of NoScript: The noscript tag can be used in place of SWFObject. The content of the noscript tag must echo that of the Flash .swf movie exactly. It should not be used to add content, keywords, graphics, or links that are not in the movie.
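A typical SWFObject setup can be sketched as follows. The file names, element id and the embedSWF parameters shown (movie URL, target div id, width, height, required Flash version) are illustrative; check the SWFObject documentation for the exact API of the version you use:

```html
<head>
  <script src="swfobject.js"></script>
  <script>
    // Replace the div below with the movie only when Flash 9+ is detected;
    // spiders and Flash-less browsers see the alternative content instead
    swfobject.embedSWF("home.swf", "flash-content", "550", "400", "9.0.0");
  </script>
</head>
<body>
  <div id="flash-content">
    <!-- Alternative content: must mirror the .swf text and links exactly -->
    <h1>Buy Property in London</h1>
    <p>Browse <a href="/listings.html">our property listings</a>.</p>
  </div>
</body>
```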
• SEO-friendly JavaScript implementation guidelines - JavaScript can be read by search engines to a certain extent; experiments have shown that Googlebot can read dynamically generated links from JavaScript. Still, JavaScript should be used in moderation.
  • JavaScript code should be placed in an external file, such as external.js. Whenever the required code is called, it is taken from the .js file and executed. The robots.txt should be modified to instruct search engines not to index the .js file.
  • Place the same content from the JavaScript in a noscript tag. Ensure the contents are exactly the same as what is contained in the JavaScript, and that this content is shown to visitors who do not have JavaScript enabled in their browser.
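The noscript pattern above can be sketched as follows; the file name and links are hypothetical, and the content inside <noscript> duplicates exactly what the script writes into the page:

```html
<!-- Navigation links generated by the external script -->
<script src="external.js"></script>
<noscript>
  <!-- The same links the JavaScript generates, visible to spiders
       and to visitors with JavaScript disabled -->
  <a href="/listings.html">Property listings</a>
  <a href="/contact.html">Contact us</a>
</noscript>
```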
• Sitemap guidelines - The Sitemaps protocol allows a webmaster to inform search engines about URLs on a website that are available for crawling. A sitemap is an XML file that lists the URLs for a site. It allows webmasters to include additional information about each URL: when it was last updated, how often it changes, and how important it is in relation to other URLs on the site. This allows search engines to crawl the site more intelligently. Sitemaps are a URL inclusion protocol and complement robots.txt, a URL exclusion protocol.
  • A sitemap should contain only page URLs, or a list of other sitemaps. URLs for individual images should not be submitted; even if they are, they will not be indexed.
  • A single sitemap file cannot contain more than 50,000 URLs and must be no larger than 10 MB uncompressed.
  • If the sitemap is too big, it should be split into multiple sitemaps referenced from a single sitemap index file, itself in XML format.
  • A sitemap index file should not contain more than 1,000 sitemaps.
  • Use only a single format for all URLs. For example, if the site is specified as www.example.org, ensure that the sitemap contains URLs starting with www and not simply example.org, and vice versa.
  • A sitemap must contain only ASCII characters; upper-ASCII characters and special characters such as *, $ and # are not allowed.
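A minimal sitemap following the protocol might look like this (the URLs, dates and priority values are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.org/</loc>
    <lastmod>2013-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.org/buy-property.html</loc>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```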
Site Indexation and Site
Submission strategies
If your website or a part of the website is
not getting indexed or crawled by search
engines, it will not be available in the
search results.
Site Indexation with the help of Webmaster tools
Site indexation issues should be resolved for better placement in search results, and webmaster tools can help. If some part of the website is not being indexed by a search engine, or if you want to prevent some pages from being indexed, webmaster tools can help identify and resolve these issues. Major search engines, including Google, provide free tools for webmasters. Google's Webmaster Tools help webmasters better control how Google interacts with their websites and get useful information from Google about their site.
Google Webmaster Tools can:
• Identify crawl issues in the website
• Change the crawl frequency of your website
• Inform Google which pages should not be indexed
• Help you eliminate unwanted dynamic URLs from getting indexed
• Submit a sitemap to the search engine
• Specify the preferred domain
• Identify HTML issues
• Identify any other SEO violations
Apart from this, submitting the sitemap to local search engines, map services, yellow pages, directories etc. is helpful. Yahoo! (Yahoo! Site Explorer) and Microsoft (Live Search Webmaster Tools) also offer various free tools to webmasters for SEO-related topics and activities.
Submit to Directories:
A web directory is similar to a huge reference library. Directories bring targeted traffic: though the traffic generated through directories is very limited, it comes from people who, by their own choice, arrive at the website by browsing through categories in these directories. Another important benefit is that these directories are considered "expert" sites by search engines, so submitting a site to directories increases its chance of getting a better rank in search results.
Directories can broadly be classified into "Free Directories" and "Paid Directories".
Paid Directories - Paid directories have a standard submission process and optimization strategies, and they are given importance by the search engines. Submitting a website link to a paid directory is a fast and effective way of link building. Some of the popular paid directories are Yahoo! Directory, Business.com, Clora Business directory, Best of the Web etc. Yahoo! is the most popular among paid directories.
Free/Open Directories - Free directory submission is an effective way to get a large number of inbound links. A free directory will use a keyword when categorizing your site, providing a natural, customized link-building profile from directories in your industry. There are many free directories available on the web, such as DMOZ, Boingboing.net, Lii.org etc. The Open Directory and the Yahoo! Directory stand ahead of the others in distribution and SEO importance.
• Open Directory Project – DMOZ is one of the most well-known free directories of websites, created by volunteer editors. The Open Directory Project is the largest directory and a huge reference library on the web.
• Yahoo! Directory – The most popular and largest paid directory, owned by Yahoo! The Yahoo! Directory is an effective way of link building since it has around 330 million unique visitors, making it a high-traffic website. It costs around $299/year for commercial sites.
Generic Guidelines for directory submission:
• Resolve all broken links, images and missing pages on the website.
• Make sure the description of the website given during submission is readable and related to the content of the website. Do not stuff it with keywords.
• Select a suitable category for your website. If you have difficulty finding a suitable category, check your competitors' site category and submit your site there.
• Check regularly whether your site is listed in the directory. Free directories take a bit longer to list a site.
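The first guideline above, resolving broken links before submission, can be partly automated. A minimal sketch in Python is shown below; the HEAD-request approach and the timeout value are illustrative choices, and a real audit would crawl beyond a single page:

```python
# Sketch: collect the links/images on one page and report any that do
# not respond, so broken links can be fixed before directory submission.
# The timeout and HEAD-request approach are illustrative choices.
import urllib.error
import urllib.parse
import urllib.request
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Gathers href/src values from <a> and <img> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag in ("a", "img"):
            for name, value in attrs:
                if name in ("href", "src") and value:
                    self.links.append(value)

def broken_links(page_url):
    """Return the absolute URLs on page_url that fail a HEAD request."""
    html = urllib.request.urlopen(page_url, timeout=10).read().decode("utf-8", "replace")
    collector = LinkCollector()
    collector.feed(html)
    broken = []
    for link in collector.links:
        target = urllib.parse.urljoin(page_url, link)
        try:
            urllib.request.urlopen(
                urllib.request.Request(target, method="HEAD"), timeout=10)
        except (urllib.error.URLError, ValueError):
            broken.append(target)
    return broken
```

Calling `broken_links("http://example.com/")` (a placeholder URL) returns the dead references found on that page.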
Submit to Local Search Engines, Map Services, Yellow Pages Services and Information Providers:
The Local Search Guide provided by the Yellow Pages Association is the best guide to Internet Yellow Pages, vertical directories, and local search engines. Make sure to check your listings in each of the sites listed in the guide, and update where necessary.
Local Search Guide: http://www.localsearchguide.org
Yellow Pages Association: http://ypassociation.org
Apart from the search engines listed above, focus primarily on the local search or map search sections, where you can add, update or edit your listings. For example, you can update listings in the Google, Yahoo! and Bing search engines at the following URLs.
Register with popular local search providers (free):
• Google Local Business Center: http://www.google.com/local/add
• Yahoo! Local: http://listings.local.yahoo.com/
• Bing Local Listing Center: https://ssl.bing.com/listings/ListingCenter.aspx
Take care when providing information:
• Consistency counts.
• Categories/services are important.
• As applicable, provide supplemental information.
In addition to the online directories listed in the Local Search Guide, update your information in any local directory sites that are independent of the Local Search Guide lists. Other Yellow Pages guides may be dominant in your local area without being listed here. Also check the printed phone directories delivered in the area where your business is located, and see whether they have URLs printed on their covers where you can audit or update your information.
Social Media Marketing and SEO
cannot be separated
Social media marketing is a form of internet marketing that seeks to achieve branding and marketing communication goals through participation in various social media networks. Social media networks are primarily internet- and mobile-based tools for sharing and discussing information among people. The term most often refers to activities that integrate technology, telecommunications and social interaction with the construction of words, pictures, videos and audio. Social media sites have good visibility and PageRanks in the search engines, and hence can be an effective link building channel for sites that would like to get higher rankings and increase their visibility and popularity on the web. SEO and social media are inseparable: SEO needs social media marketing because it is a trusted and reliable way to get more inbound links, and, more importantly, SEO strategies should be followed while doing social media marketing so that the website or the brand actually benefits. Enterprises should target the below components in the social world -
• Blogs
• Forums
• Communities
• Press Releases
• Wikis/Widgets
• Media uploads
• Virtual worlds
• Broadcast/Podcast etc.
• Followers/Influencers/Fans on social sites like Facebook, LinkedIn etc.
• Likes/Dislikes on social sites like Facebook
Some of the important strategies in this space are -
Get quality inbound links from social media marketing campaigns
• The majority (if not all) of the links that come from a social media marketing campaign are natural links; they are not reciprocated, bought, or solicited.
Leverage social media websites for brand and reputation building
• Ask people to talk positively about your site on social media sites like Digg, MySpace, YouTube, Facebook etc. Beware of the negative effects, as someone can tarnish your brand as well.
Ranking pages on social media websites
• Social media sites rank extremely well on the search engines. Their domains are very powerful, and with a few links to an internal page it has a great shot at ranking, even for competitive keywords. You might want to consider uploading some organization videos to YouTube or creating a MySpace profile and building a few good links to it.
Social media needs SEO
• Your success in social media will eventually depend on search rankings, whether in a blog search tool or in the general search results. Maybe it will be your bookmark page, maybe a Twitter post, or the page itself, but the search rankings will matter.
Create profiles on social media sites
• Establish profiles on various social media sites - Digg, Propeller, Flickr, LinkedIn, Bloggingzoom, Technorati - and link from your profile on those sites back to your website.
SEO Geo Targeting Guidelines - Is your site SEO compliant for a global rollout?
Websites rank differently in the search engine result pages of different countries. Geo targeting refers to the process of optimizing a website so that it reaches the top positions of a country-specific search engine. A country-specific search engine is a local version of a search engine, like google.co.in for India or google.hk for Hong Kong. The guidelines for targeting a specific country are given below:
• Top-level Domain: A top-level domain (TLD) indicates the location of the site to the search engines. A .co.uk domain name should be used if the majority of the customers are from the UK; a .cn domain should be used if the target customers are mainly from China.
• Local Hosting: The search engines can determine the website's location through the web hosting. A server hosted in the targeted country is beneficial from an SEO point of view. Note that the IP address of the servers should belong to the target country.
• Business Address: Placing the local business address in plain text in the footer of the site can certainly assist Google or other search engines in determining the location of the website.
• Language: It helps to have the site's content in the language of the country being targeted. It helps further when the copy is optimized so that, for instance, UK English can be distinguished from US English.
• Inbound Links: Efforts should be made to get lots of inbound links from other websites specific to the country being targeted. Inbound links from the same region play an important role in geo targeting.
• Search Engine Webmaster Tools: Under the "Set geographic target" option in your Google Webmaster Tools account, choose "associate a geographic location with this site" and pick the preferred country for your website to geo-target.
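Two of the signals above, the country-code TLD and the content language, can be checked mechanically. Below is a minimal Python sketch; the `CCTLD_HINTS` table is a small illustrative sample (not a complete mapping), the `example.co.uk` URL is a placeholder, and the regex-based check only reads the `lang` attribute on the `<html>` tag:

```python
# Sketch: report two basic geo-targeting signals for a page - the
# country suggested by its ccTLD and the language declared on the
# <html> tag. CCTLD_HINTS is a small illustrative sample, not complete.
import re
import urllib.parse

CCTLD_HINTS = {
    ".co.uk": "United Kingdom",
    ".co.in": "India",
    ".cn": "China",
    ".hk": "Hong Kong",
}

def geo_signals(url, html):
    host = urllib.parse.urlparse(url).netloc.lower()
    tld_country = next(
        (country for tld, country in CCTLD_HINTS.items() if host.endswith(tld)),
        None)
    match = re.search(r'<html[^>]*\blang=["\']?([A-Za-z-]+)', html)
    return {
        "ccTLD_country": tld_country,
        "declared_lang": match.group(1) if match else None,
    }

signals = geo_signals("http://example.co.uk/",
                      '<html lang="en-GB"><body>...</body></html>')
print(signals)  # → {'ccTLD_country': 'United Kingdom', 'declared_lang': 'en-GB'}
```

A page that declares `lang="en-GB"` and lives on a .co.uk domain sends consistent UK signals; a mismatch between the two is worth investigating.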
Avoid Black Hat SEO techniques
• Do not use keywords that are not related to the site, and avoid keyword stuffing. Search engines can penalize your site by blocking it from their search results or reducing its search rank. On a parallel note, avoid duplicating content with the intention of repeating keywords.
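A quick way to self-audit copy for stuffing is to measure keyword density. The Python sketch below does this; note the 5% threshold is an illustrative rule of thumb for self-review, not a documented search-engine cutoff:

```python
# Sketch: measure how often a keyword appears relative to total words.
# The 5% threshold is an illustrative rule of thumb for self-auditing,
# not a documented search-engine cutoff.
import re
from collections import Counter

def keyword_density(text, keyword):
    """Fraction of the words in `text` that equal `keyword`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return Counter(words)[keyword.lower()] / len(words)

def looks_stuffed(text, keyword, threshold=0.05):
    return keyword_density(text, keyword) > threshold

copy = "buy property buy property buy property in london today"
print(round(keyword_density(copy, "property"), 2))  # → 0.33
print(looks_stuffed(copy, "property"))              # → True
```

Copy where one keyword makes up a third of the words, as above, reads unnaturally to visitors and risks a penalty.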
• Avoid the usage of invisible or hidden text. Some websites try to mislead the search spiders by repeating words of interest in smaller fonts, merged with the background, or as text overlaid on top of a Flash object.
• Avoid cloaking or masking the page content. Some web designers program the page content to be altered to boost rankings when a visit from a spider or crawler is detected.
• Do not focus your efforts on link exchange programs; if you get an inbound or outbound link from a bad or untrusted neighborhood, your rankings can suffer. Reciprocal linking is common and generally accepted when it is not excessive and an attempt is made to provide value to the site visitor by leaving genuine inbound and outbound links.
[Figure: Black Hat SEO techniques - Keyword Stuffing, Invisible Text, Content Scraping, Doorway Pages, Link Farming, Cloaking, Google Bowling/Washing, Google Bombing]
Other very common SEO mistakes
• Ignoring HTML tags like the title tag
• Identifying and doing SEO for irrelevant keywords
• Repeating keywords to a large extent in the website
• Invalid HTML
• URLs that do not have relevant, SEO-targeted keywords
• Inbound links from bad neighborhoods
• Heavy focus on images and Flash, with very little textual content
• No inbound linking and social media strategy
• Do not employ any off-site tactics to bring down your competitors' rankings. Examples include links from bad neighborhoods, un-stealthy redirects, and mass automated querying. The idea behind such tactics is to make it look to the search engines as if your competitor is using black hat techniques, eventually reducing their rankings or getting the competitor's website blocked.
How to plan the SEO work?
• For existing sites, fix the high-priority SEO blockers in the initial phase of the implementation: robot coverage, broken links, duplicate content, redirects, canonical tags, any black hat techniques used, etc. Analyze the webmaster tools to see if there are any blockers or issues with the crawlability of the site and fix them. Check if there are any rogue inbound links that are actually lowering the rankings and affecting the SEO.
• Fix the tags optimization, site map and URL optimization, directory structure, and domains and sub-domains based on the targeted keywords, either in parallel or after the SEO blockers are fixed.
• Do the URL and site map submissions to the search engines once the SEO implementation goes live on the web.
• Website promotion strategies should start immediately after the SEO implementation goes live. In a few cases, where there is not much change in the overall content and the keywords, they can start as soon as the keywords and the inbound links have been identified. The inbound link strategy will lead to gradual improvements in the rankings.
• SEO improvements can take from 2 weeks to 3 months, and hence continuous monitoring on a daily to weekly basis is required to track the improvements and suggest any further optimizations. Competitor sites should also be analyzed every week to gauge the competition and make any further changes, if required, depending on the competitors' strategies.
• Social media marketing is a continuous activity and should be leveraged to get good inbound linking and attract more consumers to the site via social media. Different strategies, like press releases and new campaigns, should be designed intermittently to gain web visibility, brand recognition and stability in the search rankings.
• Focus on Paid SEO if the site is fairly new, for a shorter or longer duration depending on the SEM strategy and the marketing budgets.
• SEO will bring the traffic, but it is up to the site to ensure customer stickiness through relevant internal searches, browsing behavior, internal site structures, personalization, cross-sell/up-sell, promotional strategies and proper merchandizing to achieve the desired conversion.
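The sitemap submission step above can be scripted. At the time of writing, Google and Bing both accepted a simple HTTP "ping" carrying the sitemap URL; the Python sketch below builds and sends those requests. The sitemap URL is a placeholder, and the endpoints should be verified against current search engine documentation before use:

```python
# Sketch: notify search engines that an XML sitemap has been updated,
# using the HTTP "ping" endpoints they exposed at the time of writing.
# The sitemap URL passed in by the caller is a placeholder.
import urllib.parse
import urllib.request

PING_TEMPLATES = [
    "http://www.google.com/ping?sitemap={sitemap}",
    "http://www.bing.com/ping?sitemap={sitemap}",
]

def build_ping_url(template, sitemap_url):
    """URL-encode the sitemap address into a ping endpoint."""
    return template.format(sitemap=urllib.parse.quote(sitemap_url, safe=""))

def ping_sitemap(sitemap_url):
    """Send the pings; return each endpoint's HTTP status (or error text)."""
    results = {}
    for template in PING_TEMPLATES:
        endpoint = build_ping_url(template, sitemap_url)
        try:
            results[endpoint] = urllib.request.urlopen(endpoint, timeout=10).status
        except Exception as exc:  # network errors, HTTP errors
            results[endpoint] = str(exc)
    return results
```

Calling `ping_sitemap("http://example.com/sitemap.xml")` after each go-live keeps the search engines informed without waiting for a scheduled crawl.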
New Trends in SEO
• The new SEO strategy: Quality matters, not quantity
  - Now it is time to focus less on SEO tricks and put more emphasis on good online marketing strategies.
• More personalized and language-specific search results from Google
  - Searchers can now see relevant results both in English and in the language they used for the search query. This gives a better user experience, breaks down the language barrier and improves the quality of search. This is one of the most significant changes Google has made recently.
• SEO for mobile search and voice search
  - Usage of smartphones for searching is increasing day by day. Mobile and voice search are definitely among the fastest growing areas in the search arena. Even search patterns are changing - people are opting for voice search instead of typed browsing.
• Longer and more descriptive posts will have an added advantage in rankings
  - Google recently made some changes to the algorithm that displays the text snippet about a website on the search results page. It may now consider the actual page text, instead of the page meta description, title or header tag, to display the text snippet on the SERP.
• Promotion strategies are changing; for example, these days web apps play an important role in online promotions, YouTube is gaining importance as a new search tool, and more and more guest blogging is proving advantageous.
• Reciprocal links are dead
  - Reciprocal links are of little use now, and search engines do not give much importance to these types of links; in terms of link juice, they may even devalue your links. Earlier, it was very easy to obtain a reciprocal link, and SEO experts started building these links at a rapid rate; however, these links were not adding enough SEO value to the business. Google also considered this an act of link farming and penalized the sites participating in it.
• Social signals like Facebook "Likes", Twitter "retweets" etc. are given more significance by search bots.
Conclusion: The SEO Process Flow
[Figure: The SEO Process. On Site Optimization (you have to do this before off-site optimization): Quality/Relevant Content - content and HTML optimization; Targeting Keywords - keyword research and keyword placement; Website Optimization - URL optimization, bot accessibility, directory structure, internal links etc. Off Site Optimization: Link Building - link building strategy as per your domain, directory submission, paid inclusion, press releases, affiliate feeds, RSS feeds, company blogs etc.; Social Media Marketing - social media, viral marketing, brand building.]
The above figure depicts how enterprises should apply SEO strategies to their website. On-site SEO optimization comprises content optimization, keyword and market research, and website optimization from an SEO perspective, whereas off-site optimization focuses on website promotion and link building activities. It is very important to make your website SEO compliant before stepping into link building or social media marketing, because a well-optimized website is more relevant to the user's search query, while its popularity is determined by factors like inbound links, social media spread etc. Hence, the combination of both makes your site relevant and, in turn, opens up the possibility of higher rankings on search engine results pages.
References
1. EBook: "The Art of SEO" by Eric Enge, Stephan Spencer, Rand Fishkin, and Jessie C. Stricchiola
2. Website: www.seomoz.org
3. Website: www.searchenginewatch.com
4. Website: www.rankquest.com
About the Author
Ketan Chinchalkar is a Senior Project Manager with the Infosys Manufacturing Digital Transformation practice, with experience in the online transformation space. Within the Digital Transformation practice, he provides leadership for the Center of Excellence (COE) and the Search and Analytics group. Ketan has close to 12 years of IT industry experience, including 6 years in the Digital Transformation domain and technologies.
© 2013 Infosys Limited, Bangalore, India. All Rights Reserved. Infosys believes the information in this document is accurate as of its publication date; such information is subject to change without notice.
Infosys acknowledges the proprietary rights of other companies to the trademarks, product names and such other intellectual property rights mentioned in this document. Except as expressly permitted, neither this documentation nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, printing, photocopying, recording or otherwise, without the prior permission of Infosys Limited and/or any named intellectual property rights holders under this document.
About Infosys
Infosys is a global leader in consulting, technology and outsourcing solutions. As a proven partner
focused on building tomorrow's enterprise, Infosys enables clients in more than 30 countries to
outperform the competition and stay ahead of the innovation curve. With $7.4B in annual revenues
and 155,000+ employees, Infosys provides enterprises with strategic insights on what lies ahead. We
help enterprises transform and thrive in a changing world through strategic consulting, operational
leadership and the co-creation of breakthrough solutions, including those in mobility, sustainability,
big data and cloud computing.
Visit www.infosys.com to see how Infosys (NYSE: INFY) is Building Tomorrow's Enterprise® today.
For more information, contact askus@infosys.com
www.infosys.com