SEO (search engine optimisation) for beginners – a guide

SEO Basics

Introduction

SEO (search engine optimisation) is now one of the mandatory forms of website promotion. I will show, however, that you do not have to hire an SEO expert: thanks to a few simple rules you can position your site in search engines yourself. And if you do decide to hire a company to position your site, thanks to this guide you will know exactly what you are paying for, and you will also know the elementary safety rules of positioning. Below are a few of the most important and most interesting positioning techniques.

Domains and positioning

Keywords in the domain name
There is no single rule for selecting a domain name; the choice requires an individual case study. When choosing a domain, keep in mind that it should be easy to remember and interesting. Note, however, that choosing a domain that contains keywords brings many benefits. First, if someone links to our site without any anchor text, the keywords in the domain will act as the anchor text and support the positioning of our site, e.g. this link:

    <a href="http://www.webdesign.com">http://www.webdesign.com</a>

In fact, the anchor text contains the keywords! Such links may also be better recognised by users in their search results, because the keywords in the domain will appear in bold.

What domain name to choose?

It is well known that Google slightly prefers primary domains such as .com or .co.uk. Therefore, if your website is directed only at English users, it is best to choose a name ending in .co.uk. From my observation, .net is also a good extension, but remember that most users remember domain names ending in .co.uk.

Taken domain

Unfortunately, if we want to base a domain name on keywords and choose a popular extension, we will probably find that the name is already taken. If we are very keen on a specific name, we can try writing to the owner of the site and asking to buy the domain; we can do the same if the domain is parked. When a domain expires, it enters a transition (grace) period, typically lasting approx. 40 days, and services associated with the domain are often suspended during this time. If the domain is not renewed within those 40 days, it passes into the so-called redemption period. If during this period the owner still does not renew it, the domain is deleted from the domain register within approximately a week.

Summing up these periods: after expiry, a domain is unavailable for more than 70 days. Once the official period has passed, we can place a conditional order (a backorder) with a domain operator – this usually involves paying a fee for the domain, and if the domain is restored by its owner, the fee is returned to us. Not all providers offer this possibility; take a look at these services:
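As a rough illustration, the timeline above can be sketched in a few lines of Python. The 40- and 30-day period lengths are assumptions based on the figures in the text; real registries vary.

```python
from datetime import date, timedelta

# Hypothetical sketch of the expiry timeline described above. The exact
# period lengths vary by registry; 40 and 30 days are assumptions here.
GRACE_PERIOD = timedelta(days=40)       # renewal grace period after expiry
REDEMPTION_PERIOD = timedelta(days=30)  # redemption period that follows

def earliest_drop_date(expiry: date) -> date:
    """Earliest date an expired domain could return to the open pool."""
    return expiry + GRACE_PERIOD + REDEMPTION_PERIOD

print(earliest_drop_date(date(2024, 1, 1)))  # 2024-03-11, ~70 days later
```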

http://www.godaddy.com
https://www.123-reg.co.uk
http://www.enom.com

If you want to see which domains are currently being deleted or are in the expiration process, see this page:

http://www.freshdrop.net

Sometimes finding the right domain name is a difficult task. You can then make use of name generators and keyword generators:

http://www.domaintools.com
http://www.panabee.com
http://www.domainsbot.com
http://www.namemesh.com

Domain age

Domain age has been shown to influence position in the organic results: the older the domain, the better. However, in a domain's initial period you may notice that it ranks slightly higher in the organic results, and then its position falls. In this way Google promotes new domains, sometimes giving them a higher position than their ranking factors would justify; after some time, the results return to normal. It is good practice to publish text rich in keywords as soon as possible after registering a domain, and then start working on the site.

Factors influencing ranking

Google says that there are over 200 factors that influence the position of your website in the organic results. Of course, they are covered by strict secrecy, but by trial and error as well as careful observation, a few key factors influencing a website's position in the SERP have been identified. These factors are divided into internal ones, which live on our own website, such as adequate saturation of keywords, and external ones, which include promoting the website on other sites and, for example, receiving valuable links to our site. Some of the factors are listed and described in more detail below:

  1. External inbound links
  2. Title – the title of the page
  3. Age of domain / website
  4. Keywords in the code of the page
  5. Popularity factor
  6. The presence of the keyword in the domain / friendly URLs
  7. Internal linking
  8. Regular updates and site development

Internal factors

Among the internal factors, the most important are:

  1. TITLE tag
  2. META tags
  3. Headers
  4. Highlighted text
  5. Keyword prominence
  6. Keyword density
  7. Image alt attributes
  8. Structure and folder naming

Title tag

This is one of the most important factors we have control over. The TITLE tag should be different for each page and describe its content with keywords. It is also displayed in the title bar of the browser, and therefore should be intuitive. The more keywords we use, the less significance each of them carries for positioning. Make sure the title is intuitive: instead of calling the contact page “Contact Us”, use the title “Contact CompanyName”; this way you ensure better performance in the search engines for phrases containing your company name. Accordingly, instead of the title “About us” you can use “CompanyName – Service x, y, z”. Titles should also be coherent and consistent. For example, format all titles with a prefix containing the name of the company, followed by a hyphen or vertical bar, and keywords varied for each .html document.

The title should correspond to the content of the page, so be sure to place the keywords used in the title directly in the content of the document. Otherwise, crawlers may consider you a spammer.
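The title convention described above can be sketched as a small helper. The company name, keyword list and the 60-character cap are illustrative assumptions, not rules from the text.

```python
# A sketch of the "CompanyName - keywords" title convention described
# above. The 60-character cap is an assumption, not a stated rule.
def page_title(company: str, keywords: list[str], max_len: int = 60) -> str:
    """Build a consistent TITLE tag value, truncated to a sane length."""
    title = f"{company} - {', '.join(keywords)}"
    return title if len(title) <= max_len else title[:max_len].rstrip()

print(page_title("WebDesign Ltd", ["web design", "London"]))
# WebDesign Ltd - web design, London
```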

Keywords in content

Although this is probably the most obvious positioning factor, it is worth remembering a few rules for preparing the texts on our website. Generally, when writing content for a website you should think mainly about users' comfort when reading it; only after that should you consider crawler friendliness.

Be careful not to use too high a density of keywords, because it can look suspicious to crawlers. It is assumed that the overall saturation should be between 4-6%, so in a text of 1,000 words, up to 60 of them may belong to a key phrase. And literally “belong to”: the phrase “web design” can also be read out of the sentence “We will design for you a website.” These are not very strict rules, however, and our website may be properly indexed even with a greater keyword saturation.
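The saturation check above can be sketched roughly in Python. Note this is a simplification: it counts only exact adjacent occurrences of the phrase, not the looser “We will design for you a website” style of match the text mentions.

```python
import re

# A rough sketch of the 4-6% keyword-saturation check described above.
# Only exact, adjacent occurrences of the phrase are counted.
def keyword_density(text: str, phrase: str) -> float:
    """Fraction of words (0-1) that belong to an occurrence of `phrase`."""
    words = re.findall(r"\w+", text.lower())
    target = phrase.lower().split()
    hits = sum(
        1
        for i in range(len(words) - len(target) + 1)
        if words[i:i + len(target)] == target
    )
    return hits * len(target) / len(words) if words else 0.0

print(keyword_density("web design is fun and web design pays", "web design"))
# 0.5 - far above the suggested 4-6%, but fine for an 8-word toy example
```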

The structure of keywords

It has been shown that not only the density of keywords but also their position matters for positioning. We should try to place the most important keywords as high as possible on our website. This does not apply only to content: title and meta tags should also be composed so that the most important keywords come first.

Keywords in the URL

In the context of search engine optimisation, it is also important to use appropriate names for our files – .html documents or graphics – because then the path to them contains a keyword. Consider the example address: http://www.keyword.co.uk/another-keyword.html.
This structure has many benefits. Firstly, if an external link to us has no anchor text, we still gain the presence of keywords in the plain anchor. In addition, keywords entered by the user will appear in bold in the search results.
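A keyword-friendly slug like the one in the example address can be generated from a page title with a small helper. This is one possible sketch, not a prescribed tool.

```python
import re
import unicodedata

def slugify(title: str) -> str:
    """Turn a page title into a keyword-friendly URL slug."""
    # Strip accents, lowercase, collapse runs of other characters to hyphens.
    ascii_title = (
        unicodedata.normalize("NFKD", title)
        .encode("ascii", "ignore")
        .decode("ascii")
    )
    return re.sub(r"[^a-z0-9]+", "-", ascii_title.lower()).strip("-")

print(slugify("Another Keyword!"))  # another-keyword
```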

Keywords in META tag

Although there is much controversy about whether the meta description tag matters for positioning, I believe that we should prepare a meta description for each page. It should be an intuitive, neat and brief description of the page that also encourages the visitor to click your link. If you use keywords in the tag, make sure they also appear in the content. Vary these tags for each .html document. Sometimes you may encounter a situation in which your meta tags are ignored and the description of your site in the SERP is taken from another text, for example from directories that list your site, such as DMOZ or the Yahoo! Directory. In this case it is often worth instructing the robot to ignore those descriptions so that the contents of your tag are taken into account.

Other guidelines

  • Important for users
  • 200-250 characters
  • Avoid repeating the content of the tag title
  • Key words must be included in the page content
  • It should include a call to action
  • Do not repeat the keywords; remember phrase variations
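The checklist above can be turned into a rough validator. The function name and warning texts below are hypothetical, and only the mechanically checkable rules are covered.

```python
# A hypothetical validator for the meta-description checklist above.
def check_meta_description(description: str, title: str, page_text: str,
                           keywords: list[str]) -> list[str]:
    """Return a list of warnings for rules the description breaks."""
    warnings = []
    if not 200 <= len(description) <= 250:
        warnings.append("length should be 200-250 characters")
    if description.strip().lower() == title.strip().lower():
        warnings.append("description repeats the title")
    missing = [k for k in keywords if k.lower() not in page_text.lower()]
    if missing:
        warnings.append(f"keywords not found in page content: {missing}")
    return warnings

print(check_meta_description("Too short.", "Contact Us", "Call us today.",
                             ["web design"]))
```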

Keywords in headlines

One of the major factors taken into consideration when evaluating a page's keywords is the use of headers. The text in the h1 header is extremely important and should contain keywords associated with the document. The rules on keyword order and quantity apply here as well. In addition, the rule of thumb is to use no more than one h1 heading per page, plus a few h2 and h3 headlines. There is no evidence that the lower-level headers affect positioning. In the context of the changes in HTML5, one can venture to say that search engines' tolerance for more than one h1 tag will increase, as the new semantic elements such as <header>, <footer> or <article> are to some extent separate elements and may each include their own structure of headers from h1 to h6.

Highlighted text

It is also important to highlight the most important phrases, which of course should contain keywords. Two possibilities should be considered: we can highlight text semantically, through tags such as <strong> and <em> in HTML, or we can do it with the help of CSS, e.g.:

HTML

    <span class="thick"> the highlighted text </span>

CSS

    .thick { font-weight: bold; }

There is an important difference here. In the first case, the structure tells robots that the text is highlighted, and this affects the indexing of keywords; in the second case – using CSS – the text is bold, but for the robot it makes no difference. Therefore, the optimal strategy is to highlight keywords structurally, e.g. using <strong>, while other phrases that are important to the reader but do not contain keywords can be highlighted with the CSS syntax above. One more note: do not use underlining to highlight text on web pages, because it is reserved for hyperlinks and can mislead users!

Website size

The more text on our website, the better, and this text should be rich in keywords. You should also think about splitting unrelated content into two separate pages, giving each document a good name and linking it within the structure of the site with an appropriate anchor; this will certainly have a positive impact on optimising your site for search engines. It is widely accepted that a page cannot be too poor in content, and 200 characters is an absolute minimum. Our site should also load in less than approx. 9 seconds; otherwise, not only may the robot give up indexing it, but users will also give up browsing it.

7 Golden Positioning Rules

  1. Interesting and rich content FOR USERS, not search engines
  2. Searchable content – in line with the standards and art of design
  3. Transparency and information architecture
  4. Alignment with other forms of SEM
  5. Do not use black SEO
  6. Quick loading pages
  7. Trust your own experience

What do search engines want?

  • Many valuable links leading to the site
  • Rich content with keywords
  • Pages popular among users
  • Fast page load
  • Structure compliant with standards
  • Visible content (beware of Flash)
  • URLs with keywords
  • White positioning practices
  • Domains that have existed for a long time

Tips for directories

  • Adding listings manually is more effective
  • Diversify descriptions, keywords and titles
  • You want to be in popular catalogues
  • You want to be in industry directories
  • It is not worth exchanging links with directories

Tips for robots.txt

This file is placed in the root directory of our site and contains commands that tell search engine robots how to crawl our website:

Example commands:

  • User-agent: * – applies the rules to all robots
  • Disallow: /directory/ – blocks access to the directory
  • Sitemap: /sitemap.xml – provides a link to the sitemap

Excluding the entire site from indexing by all search engines:

    User-agent: *
    Disallow: /

Excluding part of the site from indexing by all search engines:

    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /private/

Excluding the whole site from indexing by a single search engine:

    User-agent: Googlebot
    Disallow: /
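Python's standard library ships a robots.txt parser, which can be used to sanity-check rules like those above. The example.com URLs below are placeholders.

```python
from urllib import robotparser

# Check the "exclude part of the site" rules with the stdlib parser.
rules = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /private/
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("*", "http://example.com/private/page.html"))  # False
print(parser.can_fetch("*", "http://example.com/index.html"))         # True
```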

Meta ROBOTS:

INDEX – the page should be indexed
FOLLOW – the robot should follow the links on the page
NOINDEX – the page should not be indexed
NOFOLLOW – the robot should not follow the links on the page
ALL – same as INDEX and FOLLOW
NONE – same as NOINDEX and NOFOLLOW
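These directives go into a meta tag in the page's head; as a quick illustration, the helper below renders it. The function is a hypothetical convenience, not a standard API.

```python
# Hypothetical helper rendering the META ROBOTS tag from the directives above.
def robots_meta(*directives: str) -> str:
    """Render a <meta name="robots"> tag for the given directives."""
    return '<meta name="robots" content="{}">'.format(
        ", ".join(d.lower() for d in directives)
    )

print(robots_meta("NOINDEX", "NOFOLLOW"))
# <meta name="robots" content="noindex, nofollow">
```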

What are the stages of an SEO expert's work?

  • Keyword Research
  • Competition Research
  • Selecting the domain name
  • Website framework
  • Work on the preparation of the site
  • Optimise site code
  • Preparation of content
  • Working with the finished service
  • Submitting to directories
  • Acquiring links
  • Site Updates
  • Additional SEM

Useful techniques for content acquisition:

  • Blog
  • FAQ rich in keyword phrases
  • Product reviews, articles and guides
  • Content generated by users
  • Discussion forum
  • Comments
  • Social networking (Facebook, Twitter)
  • Content from external sources via RSS
  • Constructing original and interesting texts on a page, inspired by users' views

External positioning factors

There are many external positioning factors; the most important are undoubtedly links to our site.

Links can be obtained from the following sources:

  • Catalogues
  • Exchange links with partner sites
  • Writing articles, content sharing
  • Forums and comments (some scripts automatically add rel="nofollow" to links in forum entries or blog comments, so that crawlers do not take them into account)
  • Viral Marketing – Create valuable content

The quality of links is important! Take care of:

  • A keyword in the anchor text
  • The number of other outbound links on the page – the more there are, the less each link is worth
  • A high PageRank of the linking page indicates a good-quality link
  • Try to exchange links with thematically related sites
  • Keywords that occur around the link can improve its effectiveness in positioning

Beware of spam; avoid the following:

  • Keywords unrelated to the site
  • Too many keywords
  • Hidden text
  • Hidden links
  • Content swapping, cloaking, deceptive redirects
  • Cybersquatting

Types of penalties:

  • Filter – a lowered position for a while
  • Ban – removal from the search engine index
  • Sandbox – also involves a lowered position and, if necessary, temporary removal from the index