SEO (Search Engine Optimization)

deliriousattack — Internet and Web Applications

4 Dec 2013



Search engine optimization (SEO)

SEO is the acronym for "search engine optimization" (or "search engine optimizer"), which gives us a series of clues and techniques for interpreting how search engines operate.

Among the many techniques used for SEO, the following two will be useful to apply on LKO.


Using the robots.txt file


Using meta tags

We have already used the meta tags technique and applied it on LKO. There are certain aspects to consider while applying this technique, such as the appropriate use of keywords, the robots meta tag, and the meta description. Also, alt text containing keywords can be used on images, and images should be put in a separate directory, since indexing of the page is necessary if it is to be available to search engines.

We can also apply the robots.txt file technique to optimize LKO for search engines, and optimize specifically for the Google search engine to enhance optimization further. Detailed techniques on how to use a robots.txt file, and its format, are given below.

Using the robots.txt file:

Reference URL:

What on Earth is a robots.txt File?

A robots.txt is a file placed on your server to tell the various search engine spiders not to crawl or index certain sections or pages of your site. You can use it to prevent indexing entirely, to prevent certain areas of your site from being indexed, or to issue individual indexing instructions to specific search engines.

The file itself is a simple text file, which can be created in Notepad. It needs to be saved to the root directory of your site, that is, the directory where your home page or index page is.

Why Do I Need One?

All search engines, or at least all the important ones, now look for a robots.txt file as soon as their spiders or bots arrive on your site. So, even if you currently do not need to exclude the spiders from any part of your site, having a robots.txt file is still a good idea; it can act as a sort of invitation into your site.

There are a number of situations where you may wish to exclude spiders from some or all of your site:


You are still building the site, or certain pages, and do not want the unfinished work
to appear in search engines


You have information that, while not sensitive enough to bother password protecting, is of no interest to anyone but those it is intended for, and you would prefer it did not appear in search engines.


Most people will have some directories they would prefer were not crawled. For example, do you really need to have your cgi-bin indexed? Or a directory that simply contains thank-you or error pages?


If you are using doorway pages (similar pages, each optimized for an individual search engine) you may wish to ensure that individual robots do not have access to all of them. This is important in order to avoid being penalized for spamming a search engine with a series of overly similar pages.


You would like to exclude some bots or spiders altogether, for example those from
search engines you do not want to appear in or those whose chief purpose is
collecting email addresses.

The very fact that search engines are looking for them is reason enough to put one on your site. Have you looked at your site statistics recently? If your stats include a section on 'files not found', you are sure to see many entries where search engine spiders looked for, and failed to find, a robots.txt file on your site.

Creating the robots.txt file

There is nothing difficult about creating a basic robots.txt file. It can be created using Notepad or whatever is your favorite text editor. Each entry has just two lines:

User-Agent: [Spider or Bot name]

Disallow: [Directory or File Name]

This entry can be repeated for each directory or file you want to exclude, or for each spider or bot you want to exclude.

A few examples will make it clearer.

1. Exclude a file from an individual search engine

You have a file, privatefile.htm, in a directory called 'private' that you do not wish to be
indexed by Google. You know that the spider that Google sends out is called 'Googlebot'.
You would add these lines to your robots.txt file:

User-Agent: Googlebot

Disallow: /private/privatefile.htm

2. Exclude a section of your site from all spiders and bots

You are building a new section to your site in a directory called 'newsection' and do not wish it to be indexed before you are finished. In this case you do not need to specify each robot that you wish to exclude; you can simply use a wildcard character, '*', to exclude them all.

User-Agent: *

Disallow: /newsection/

Note that there is a forward slash at the beginning and end of the directory name, indicating that you do not want any files in that directory indexed.

3. Allow all spiders to index everything

Once again you can use the wildcard, '*', to let all spiders know they are welcome. The second, disallow, line you just leave empty; that is, you disallow from nowhere.

User-Agent: *

Disallow:


4. Allow no spiders to index any part of your site

This requires just a tiny change from the command above, so be careful!

User-Agent: *

Disallow: /

If you use this command while building your site, don't forget to remove it once your site is live.

Getting More Complicated

If you have a more complex set of requirements you are going to need a robots.txt file with a number of different commands. You need to be quite careful creating such a file; you do not want to accidentally disallow access to spiders or to areas you really want indexed.

Let's take quite a complex scenario. You want most spiders to index most of your site, with
the following exceptions:


You want none of the files in your cgi-bin indexed at all, nor do you want any of the FrontPage-specific folders indexed, e.g. _private, _themes, _vti_cnf and so on.


You want to exclude your entire site from a single search engine, let's say Alta Vista.


You do not want any of your images to appear in the Google Image Search index.


You want to present a different version of a particular page to Lycos and Google. Caution here: there are a lot of question marks over the use of 'doorway pages' in this fashion. This is not the place for a discussion of them, but if you are using this technique you should do some research on it first.

Let's take this one in stages!


First you would ban all search engines from the directories you do not want indexed at all:

User-Agent: *

Disallow: /cgi-bin/

Disallow: /_borders/


Disallow: /_fpclass/

Disallow: /_overlay/

Disallow: /_private/

Disallow: /_themes/

Disallow: /_vti_bin/

Disallow: /_vti_cnf/

Disallow: /_vti_log/

Disallow: /_vti_map/

Disallow: /_vti_pvt/

Disallow: /_vti_txt/

It is not necessary to create a new command for each directory; it is quite acceptable to just list them as above.

The next thing we want to do is to prevent Alta Vista from getting in there at all. The
Altavista bot is called Scooter.

User-Agent: Scooter

Disallow: /

This entry can be thought of as an amendment to the first entry, which allowed all bots in everywhere except the defined files. We are now saying: all bots can index the whole site apart from the directories specified in 1 above, except Scooter, which may not index anything at all.

Now you want to keep Google away from those images. Google grabs these images with a separate bot from the one that indexes pages generally, called Googlebot-Image. You have a couple of choices here:

User-Agent: Googlebot-Image

Disallow: /images/

That will work if you are very organized and keep all your images strictly in the images directory. Alternatively:

User-Agent: Googlebot-Image

Disallow: /

This one will prevent the Google image bot from indexing any of your images, no matter
where they are in your site.


Finally, you have two pages called content1.html and content2.html, which are optimized for Google and Lycos respectively. So, you want to hide content1.html from Lycos (the Lycos spider is called T-Rex):

User-Agent: T-Rex

Disallow: /content1.html

and content2.html from Google:

User-Agent: Googlebot

Disallow: /content2.html
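Putting the four stages together, the complete robots.txt for this scenario might look like the following sketch. It assumes the directory names used above, that all images live under /images/, and that the truncated Lycos spider name is T-Rex; adjust these to your own site.

```text
# 1. Everyone: keep out of cgi-bin and the FrontPage folders
User-Agent: *
Disallow: /cgi-bin/
Disallow: /_borders/
Disallow: /_fpclass/
Disallow: /_overlay/
Disallow: /_private/
Disallow: /_themes/
Disallow: /_vti_bin/
Disallow: /_vti_cnf/
Disallow: /_vti_log/
Disallow: /_vti_map/
Disallow: /_vti_pvt/
Disallow: /_vti_txt/

# 2. Alta Vista's Scooter: exclude from the whole site
User-Agent: Scooter
Disallow: /

# 3. Google's image bot: no images
User-Agent: Googlebot-Image
Disallow: /images/

# 4. Doorway pages: hide each page from the other engine's bot
User-Agent: T-Rex
Disallow: /content1.html

User-Agent: Googlebot
Disallow: /content2.html
```

Note that the file is read top to bottom, and each User-Agent block stands on its own: a bot named in a specific block follows that block rather than the wildcard one.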

Summary and Links

Writing a robots.txt file is, as you have seen, a relatively simple matter. However, it is important to bear in mind that it is not a security method. It may stop your specified pages from appearing in search engines, but it will not make them unavailable. There are many hundreds of bots and spiders crawling the Internet now, and while most will respect your robots.txt file, some will not, and there are even some designed specifically to visit the very pages you are specifying as being out of bounds.

For those who would like to know more here are some resources you may find useful.

robots.txt File Generators

I think it may be easier to write your own file than use these, but for those who would like to have their robots file generated automatically there are a couple of free online tools that will do the trick for you.

Web Tools Ce

Submit Corner

If you are going to write quite complex robots.txt files it may be worth your while having a
look at RoboGen, a program that will create the files for you


More about Robots and Spiders

The Web Robots Page

Has more information than you will probably ever need about robots and spiders of all kinds, including a comprehensive database of information about all known ones.

Robots & Spiders & Crawlers

An interesting and comprehensive guide from Inktomi about how search engines collect information. Also available as a PDF.

Spider Spotting

How to identify which spiders have visited your site and what they have done there.

Spider Hunter

Site devoted to spider and bot information, a treasure trove for the real enthusiast! Includes
several good lists of spiders in various categories

Using Meta Tags

Reference URL:

In general, users do not make much use of the bookmarks or favorites lists their browsers provide, and rarely type directly the URL of the page they wish to visit, because they may not know it.

Given this dynamic, web positioning strategies are critical. After a search, less than 40% of Internet users reach the second page of results a search engine provides, and only 10% check the third. This is where the need to achieve visibility becomes clear, and to achieve it we need to think like search engines to better communicate with them.

Google, Yahoo, MSN and company order their results through mathematical algorithms that analyze hundreds of factors. Each company uses a different formula, and the idea is not to decipher it.

The answer lies in SEO. SEO is the acronym for "search engine optimization" (or "search engine optimizer"), which gives us a series of clues and techniques to interpret how search engines operate.

1. Content

What users see vs. what search engines see.

Users can watch and/or read the website, but search engines can only read the programming code with which it is developed and, even today, still cannot read all of it equally. For example, Flash has recently joined Google results but cannot be optimized as well.

Graphics, sounds and videos are parts of a page that search engines cannot decipher; it is for this reason that positioning work is based on HTML tags and text.
2. Indexability

This concept refers to the ability of the page to be properly indexed by search engine robots, allowing access to all its corners, facilitating navigation and showing all its contents clearly.

"The meta tags

The Meta Tags are code in the header is not visible on screen when displaying the page. There are several
types o
f Meta
Tags and they do a description of the content and features we want that search engines
take into account.

Meta Robots:

This tag is only necessary if you want a page not to be indexed by search engines.

Meta Description:

This tag is important because it allows us to describe the page content more fully than the title does.

Meta Keywords:

Perhaps the most important: here we put the keywords through which we want this page to be found on search engines.

Any other meta tag is not considered by most search engines, only by some very specialized ones. It is noteworthy that meta tags are not a magic solution for positioning a site in search engines; you must use all the techniques together to be successful.
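As a sketch, the three meta tags just described might look like this in a page's head. The description and keyword values are invented placeholders, and the robots tag is shown only for completeness; include it only on pages you do not want indexed.

```html
<head>
  <title>1970 Pontiac GTO: Photos and Specs</title>
  <!-- Meta Robots: only needed when you do NOT want the page indexed -->
  <meta name="robots" content="noindex, nofollow">
  <!-- Meta Description: a fuller description than the title allows -->
  <meta name="description" content="Photos and specifications of the 1970 Pontiac GTO.">
  <!-- Meta Keywords: the terms through which the page should be found -->
  <meta name="keywords" content="Pontiac GTO, 1970, muscle car, photos">
</head>
```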

Optimization techniques for Google:

10 tips to consider for optimizing your site before submitting it to Google. Following these tips can get Google to index your website better.

1. If your website has a welcome screen, make sure you have a text link that allows visitors to access the site. It is common to see many sites with a very striking welcome screen, full of Flash effects, but no other way into the site. It is advisable to have a text link that gives access to the site "traditionally", because Google cannot read Flash pages and therefore cannot access the rest of the site.

2. Make sure not to have broken links.

Sounds pretty obvious, but it's impressive how many errors the Google engine experiences daily due to broken links. So we would have to check all the internal links on our site.

3. Check the TITLE tags

The title of the pages is very important to Google, so you should check that the TITLE tag is relevant to the content of the page in question. This does not mean you have to write a title of more than 20 words, but rather one in keeping with the content and easy to read for search engines and surfers.

4. Check the META tags

Rumors that Google is not interested in the META tags are not entirely accurate. Google uses these tags to describe a website when there is too much code to read. So enter some valid META tags, such as KEYWORDS and DESCRIPTION, for keywords and site description respectively.

5. Check ALT tags

ALT tags are probably the least used by webmasters. We add these tags to describe the images they belong to. They are not a decisive factor, but a plus for Google.

6. Check your frames

A frame is a separate box in which we can load a web page. If you use frames, Google cannot index you 100%. I personally recommend not using frames, but if you decide to use them, read this.

7. Have you got dynamic pages?

It is known that the web has evolved greatly in recent years, and that more and more pages are based on dynamic scripting languages (PHP, ASP, etc.). But it seems that Google limits the amount of dynamic pages it indexes, so we could include some static pages where dynamism is not necessary.

8. Update regularly

This is a very important aspect you should consider, as Google more quickly indexes pages that are updated with some regularity. You may notice that the number of pages indexed by the search engine increases if you update daily, but may stagnate or decrease if you do not provide new content. I recommend you put a META option in the header to tell Google how often it should return to reindex.

9. Robots.txt

This file can be very helpful if we use it correctly. With robots.txt you can filter which search engines index our website and restrict access to certain URLs that we do not want indexed (login pages, file folders, etc.).
10. "Cahe Cache or not?

Google maintains a cache of some pages to have a faster access to them. Some webmasters prefer not to
be cached, or Google cachee our pages all we have to do is place the following META tag between the


* With that prevent robots from caching and archiving our pages.

Images optimization:

There are ways to optimize your image to come out in the first results for a given search. Optimizing these
images is not as complicate
d as many think, the factors that influence the positioning of images are:

Filename: You have to give the image a name matching how you want it to be found, a name containing the keywords. For example, if you have a picture of a 1970 Pontiac GTO, do not upload the photo with the default camera name like img1968.jpg; upload it with a name like pontiac-gto.jpg.

The ALT attribute: Make sure you provide an adequate description of the photo with the ALT attribute. This attribute was originally intended for display in text browsers that do not support images (many years ago). Try not to use too much text; focus on a text containing keywords that accurately describe the picture.
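Putting the filename and ALT advice together, a minimal sketch (the filename and description are illustrative, reusing the Pontiac GTO example):

```html
<!-- Keyword-bearing filename plus a short, accurate ALT description -->
<img src="/images/pontiac-gto-1970.jpg"
     alt="Red 1970 Pontiac GTO parked at a car show">
```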

Title and text: The title gives Google a better idea of what the image is about, as does the text surrounding the image.

Anchor Text (Link Text): When you link to an image, use appropriate link text. It is like the anchor text you use for linking pages.

Google Image Labeler: Using Google Image Labeler will help your images climb toward the top of the search engines by allowing others to categorize your pictures. It's a pretty fun tool. Simply enable Google Image Labeler from the Google Webmaster console.

These tips will get your pictures a better position and therefore bring more traffic to your web site.

Optimize Your Graphics for a Fast Loading Site

Heavy images cost you money and traffic.

They cost you money because they require both significant storage space and bandwidth. Since your web host will usually give you a limited amount of storage space and a maximum data transfer allowance, heavy graphics can cause you to exceed those limits, in which case you'll have to pay extra.

Then, heavy images cost you traffic: put up a web page that takes more than 10 seconds to load, and your visitors will run away faster than you can say "back button".

If you happen to be running an e-commerce website, you already know that traffic equals money, so heavy graphics will make you lose both.


Fortunately, there is a solution: you can optimize your images for the web. Your images should be in either .gif or .jpg format (.gif works best for logos and navigation buttons, while .jpg works best for photographs). The idea is to reduce the size of your graphics so that they take as few bytes as possible while retaining acceptable quality.

Another useful tip is to use thumbnails. Thumbnails are miniature versions of a picture that are hyperlinked to its actual-size version. The thumbnail will load fast, and by clicking on it your visitors will be able to see the actual-size version.

Also, it is very important to specify the width and the height of your images in your HTML code. Since the text of your page usually loads faster, if you don't specify the width and height of your images the browser will have to reposition the text once the pictures load, consuming more time. If you take the time to specify the width and height of your images, the browser will lay out the text where it should go from the beginning, even before it loads the images, saving time.
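A minimal sketch of the width/height advice (the dimensions and filename are illustrative):

```html
<!-- Explicit dimensions let the browser reserve space before the image loads -->
<img src="/images/pontiac-gto-1970.jpg"
     alt="1970 Pontiac GTO"
     width="640" height="480">
```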

Use all these techniques and you will have a faster loading website, while saving more of your storage space and data transfer allowance for the useful content your visitors are always looking for.

Keyword Density in SEO

When you insert your keywords throughout the content of your pages, it is vital to understand the density at which the keywords are used. "Keyword density" refers specifically to how often a keyword is used in the copy of a page. Search engines compute the percentage of keyword density, and the more often they find a search term in the content, the more they will consider your page relevant and rank it higher. However, there is a fine line between having an optimum keyword density and over-stuffing your content with keywords and terms.

Please note: just because you have the right amount of keyword density in your SEO campaign, it does not mean you will automatically rank higher. The new Google algorithm favors terms really related to the search, not only keyword density. Focus also on the quality of your content, not only on your keyword density. If you exceed the optimal keyword density, you can face an over-optimization penalty.

There is no "perfect" percentage for keyword density in SEO, but at Volacci we strive to keep density at or below 5 percent. The percentage depends on the length of the page, so it is recommended that you know the formula for calculating it. The formula for keyword density is quite simple.

1. Count how many words you have on your page.

2. Count how many times you have used your keyword.

3. Apply this formula:

Keyword density = (keyword count * 100) / (total word count)

For example, let's say you have a 500-word article about PEZ dispensers, and you are optimizing for "FISH":

Total word count: 500

Occurrences of the keyword FISH: 12

Density of "FISH" = (12 * 100) / 500

The keyword density for "FISH" on the page is 2.4%
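The same calculation can be sketched in a few lines of Python. The helper name is ours, not from any SEO tool, and the word split is deliberately naive (whitespace only):

```python
def keyword_density(text: str, keyword: str) -> float:
    """Return keyword density as a percentage: (keyword count * 100) / total words."""
    words = text.lower().split()
    total = len(words)
    if total == 0:
        return 0.0
    count = words.count(keyword.lower())
    return count * 100 / total

# The article example: a 500-word text, 12 of which are "fish"
article = " ".join(["fish"] * 12 + ["pez"] * 488)
print(keyword_density(article, "FISH"))  # -> 2.4
```

Running it on the worked example above reproduces the 2.4% figure from the formula.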

In SEO campaigns, keyword optimization of on-page content is very important for being relevant to the search engines. However, if you focus too much on keyword density, you can lose focus on other important elements of the overall approach to optimizing your site. Use the keyword density formula to ensure that you are making relevant use of your keywords in quality content and links, so that you get the most out of your research. If you abuse your keywords, Google will catch you, so keyword stuffers are not spared.

Volacci is the leading Drupal SEO company and is very passionate about your online success. By the end of your contract you will have at least as much additional business from your website as you spend on our services... or we work for free until you do.

Tips for optimizing your website:

No matter who you are and what your field is, you can start an online business and increase your business by making a website. Do not worry if you cannot invest a penny; you can start your online business even without money, though you may have no idea where to start. How can you start an online business without any investment? It's easy, and the only thing required is consistency and a strong will to achieve your goal.

How to start your online business?

Like thousands of other people, you can also earn money by making a website. The world's most popular online earning source is the Google AdSense service. There is no prerequisite to starting your online website design.

What are a domain name and hosting?

The first thing required is a website; we have to purchase a domain name to put our website online. You can find a lot of cheap domain name registration companies offering cheap web hosting packages on their dedicated servers. They also offer web design for business promotion and SEO services. In fact, the domain name is a website's name. You can register a domain name with your own name or your company name, according to domain name availability. Different domains such as .com, .org, .edu and .net etc. are available. After registering your domain name, you have to get hosting for your website, and it also requires some charges.

But do not worry; you can also get your own free web site. You can make your free blog on, or register your free domain name with, some free hosting provider sites.

After you have registered your domain name and web hosting, you need to learn how to make your own website. You can make your own website easily. It can be your official web site, personal web site, business website or SEO website.

If you do not know website-building techniques, you can hire a web developer, make a website via some free website builder, or purchase a cheap website. There are many websites that show how to build a web site. Really, it's easy, and you can build a web site yourself. You can also download some free website tools to build a website. They offer easy website creation techniques, and even a person who does not know anything about web development can easily make a web site.

After you have registered your domain name and built your website, you need to register for a Google AdSense account to start earning money. To register for Google AdSense, create your email address in Gmail. To register a Google AdSense account, visit this URL

Web Site Design Tips:

It is very necessary for a webmaster to design a Google-friendly website. If you're doing SEO and you build a website for search engines, it is important to make sure that your website is SEO friendly.

You need to do a lot of work to get a top ranking on search engines and get the best placement. You have to ensure that you are giving your site the best possible chance to get the high ranking it needs. Follow the rules below and you're on the right track:


Let's start with a simple one: do not use frames on your website. They really belong to the past, but some old-fashioned programmers may still use them. It is important to consider that frames cannot be properly indexed by search engines and should really be avoided at all costs.

Images, Flash and video: add text

Although Flash is becoming more accessible to search engines, they still have problems with it. The best way is to think of it like this: search engines can only read text, and so they will not be able to index images, Flash or video files.

So when you add image files, it is important to give each one an Alt tag containing descriptive text. The search engine is then able to read and index your files appropriately.

Similarly, for pages with video files or Flash files, you should include additional text describing what your video or Flash content shows or does.

Just a quick word on all-Flash websites: it is a good idea to provide an HTML copy of your site for search engines to work with. Otherwise you risk not all of your site being indexed.

Use Search Engine Friendly URLs

Many systems generate URLs full of numbers and symbols that give no indication of what the page is about. This can create serious problems: first, search engines will likely not index such pages, since dynamic URLs often change; and second, the URL tells the search engine nothing about what your page contains.

The best approach is to make URLs descriptive, built from text and forward slashes. This is really important, as it allows the search engine to learn what is on your page before it even reads it. In addition, because dynamic URLs carry extra characters, it is better to rewrite them so they look like normal website directories even when they really are dynamic URLs.

Use direct HTML navigation links on your website

It is important to remember that although JavaScript and Flash menus look good, they can cause a lot of problems: many search engines cannot follow these links. So a search engine will find and index your home page, but will not be able to go on to the other pages of your site.

Model your menu with text-based links that include keyword-targeted landing page text. Within your site, remember that almost every page has a very real chance of attaining a high ranking, so you need to ensure that the design, content and site structure of all of them are optimized.

Finally, if you are at all uncertain whether your menu is search engine friendly, add a text menu in your page footer, using the keyword-targeting principles described above. Alternatively, add a site map for your site, linked from its home page. Either of these will let a search engine navigate and index all the pages on your website.

Create meaningful title tags

It is important to give all of your pages relevant titles. The search engine will read them, and they help it determine what the content of each page is about. To help your keywords, remember to make those words appear near the top of the title.
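A sketch of a keyword-leading title tag, reusing the earlier example topic (the wording is illustrative):

```html
<!-- Keywords appear at the front of the title, inside a readable phrase -->
<title>1970 Pontiac GTO: Photos, Specs and Restoration Notes</title>
```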

If you follow the steps above, you will have a search engine friendly website. Besides that, it should also be user friendly, so that you win both ways. You and your users benefit, and your site, used effectively, will stand a good chance in the rankings.