Create a search engine friendly website in 10 simple steps

Search engines find web information for users. They use crawlers to retrieve sites, but crawlers don't see websites the way humans do. A crawler is a software program that evaluates sites using mathematical calculations known as the search engine algorithm. SEOs can achieve high rankings by optimizing sites to guide crawlers: the process is to adjust the internal structure and external environment of a site so that it appears at the top of organic search results.

SEO has two stages. The first stage covers the factors that make sites or web pages crawlable and indexable; non-technically speaking, the pages become visible to crawlers. In the second stage, the relevance of the pages is improved by tweaking their internal structure and external environment, so that search engines find the pages relevant to the queries users search.

First stage in SEO – making the website crawlable

These practices are followed to make sites visible to crawlers:

  1. Indexable content

Content should be written in HTML, as crawlers can't reliably read image files, Flash, Java applets or Ajax.

  2. Internal link structure

All pages should be linked from the home page, directly or through intermediate pages, so crawlers can reach and crawl every page.

  3. Website navigational structure

A quick navigational structure makes sites more user-friendly. Technically, inner pages should be no more than two or three clicks away from the home page.
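To illustrate (the page names here are hypothetical), a flat HTML navigation menu keeps every main section one click from the home page:

```html
<!-- Illustrative only: a flat navigation menu linked from every page,
     so no section is more than a click or two from the home page -->
<nav>
  <ul>
    <li><a href="/">Home</a></li>
    <li><a href="/services/">Services</a></li>
    <li><a href="/services/seo/">SEO</a></li>
    <li><a href="/contact/">Contact</a></li>
  </ul>
</nav>
```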

  4. Sitemap

A sitemap is an XML file that gives crawlers information about web pages, such as when each page was last updated, how often it changes and its relative priority. This information helps search engines crawl the website. The file is written in XML in the following format:

<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/mypage</loc>
    <lastmod>2013-10-10</lastmod>
    <changefreq>monthly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>

  5. Leveraging the robots.txt file – controlling the crawling process

This file guides crawlers on which content to crawl. It lives in the root of the domain (www.yourdomain.com/robots.txt) and is used for:

(a) Hiding non-public information

(b) Denying indexing of duplicate content

(c) Preventing crawling of specific scripts, code and utilities

(d) Pointing crawlers to the XML sitemap (via the Sitemap: directive)

A sample robots.txt file (note that this particular example blocks all crawlers from the entire site):

User-agent: *

Disallow: /
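A more typical file (the directory names here are hypothetical) allows crawling in general, shields a few private areas and announces the sitemap:

```
# Allow all crawlers, but keep the admin area and
# internal scripts out of the crawl
User-agent: *
Disallow: /admin/
Disallow: /cgi-bin/

# Point crawlers to the XML sitemap
Sitemap: http://www.yourdomain.com/sitemap.xml
```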

  6. Adherence to a clean IP policy

Search engines want a clean IP because they suspect certain domains of being spammers, and domains found spamming face penalties. It is better to check servers, hosts and domains beforehand.

Second stage of SEO- enhancing the search visibility through internal improvement and link authority

Once the site is crawlable, it's time to work on the parameters of the web pages that determine their visibility. The process can be divided into two parts: on-page SEO, the manipulation of internal structure, and off-page SEO, the collection of authoritative links.

On-page SEO- Tweaking the internal environment

It includes:

  • Domain-level keyword and keyword-agnostic features
  • Targeting relevant keywords
  • Fresh and informative content

  7. Domain-level keyword and keyword-agnostic features

  • The domain name should include keywords
  • It must be unique
  • A short name is easy to remember
  • Hyphens should be used to separate multiple words in a URL
  • Older domains are preferred over younger ones
  8. Keyword targeting

Keyword targeting starts with researching keywords relevant to the business the site represents. The selected keywords are then incorporated into HTML meta tags, header tags and content. Search engines give weight to meta and header tags when determining how relevant the keywords are to the website.

(a) <title> tag

  • Each page should get a unique title
  • Use the main keyword at the beginning
  • The title should reflect the business offered
  • Long-tail keywords are preferred
  • Keep the title to about 50–60 characters
  • It is better to use a static URL

Standard <title> tag format:

<!DOCTYPE html>
<!--[if lt IE 9]> <html class="oldie" lang="en-US"> <![endif]-->
<!--[if !IE]><!--> <html lang="en-US"> <!--<![endif]-->
<head>
<title>………………..</title>
</head>

(b) Meta description tag

The meta description has lost some of its direct value for SEO due to over-optimization, but a carefully written description can still increase visibility and click-through rate (CTR), improving page rank indirectly. It is displayed beneath the title and the URL in search results.

  • Be clear
  • Keep the description to about 150 characters
  • Write it like ad copy
  • Give each page its own description

Standard Meta Description Format:

<meta name="description" content="……………." />

Tip: search engines discount duplicate title and meta tags, so keep them unique on every page.

(c) Header tag – Optimize <h1> tag with main keyword

(d) Image alt attribute – include the main keyword in the alt text

(e) Content – web content should include the main keyword, preferably in the first paragraph, but without keyword stuffing
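Pulling points (c), (d) and (e) together, a hypothetical page targeting the keyword "organic coffee beans" (the keyword and file name are invented for illustration) might mark up its header, image and opening paragraph like this:

```html
<!-- Main keyword in the <h1>, the image alt text
     and the first paragraph of the body copy -->
<h1>Organic Coffee Beans</h1>
<img src="beans.jpg" alt="organic coffee beans roasted in small batches" />
<p>Our organic coffee beans are sourced directly from small farms…</p>
```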

  9. Unique and value-added content

Relevance and importance are the key factors that determine a website's ranking in organic listings. Unique and informative content is rewarded because it is useful to both human visitors and crawlers. Poor content must be avoided.

Forms of poor content

  • Thin content – pages with very little content
  • Thin slicing – creating a separate, near-identical page for every product/service variation
  • Duplicate content – the same content existing on two different pages of the website or on different domains (plagiarism)

Reasons for duplicate content:

  • The same content existing at two different URLs
  • A page's content being reproduced on other sites
  • Both the http:// and https:// versions serving the same content

Duplicate content can be addressed with the rel="canonical" tag and the robots meta tag.

Canonical tag – guides search engines to the page that should be shown in results.

Code sample: <link rel="canonical" href="http://example.com/blog" />

Robots meta tag – HTML code used to discourage crawlers from performing specific operations:

"noindex" – don't index the page

"nofollow" – don't follow the links on the page or pass link value

"noarchive" – don't show the cached copy of the page in search results
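For example, two of these directives can be combined in a single tag placed in the page's <head>:

```html
<!-- Tells crawlers not to index this page or follow its links -->
<meta name="robots" content="noindex, nofollow" />
```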

Schema.org:

Schema.org is a shared markup vocabulary introduced by Google, Bing, Yahoo and Yandex. It marks up important pieces of content so search engines can display them as rich results on the organic search page, which improves click-through and thus page rank indirectly.
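As a sketch (the product name and price are invented), a product marked up with schema.org microdata might look like this:

```html
<!-- Microdata markup using the schema.org Product and Offer types -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Widget</span>
  <span itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="price">19.99</span>
    <meta itemprop="priceCurrency" content="USD" />
  </span>
</div>
```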

  10. Off-page SEO – managing the external environment to garner quality backlinks

In addition to relevance and importance, a website's online reputation also determines ranking. Search engines rate the reputation of websites through link analysis.

Off-page-SEO includes:

  • Popularizing content on social media
  • Publishing content with backlinks on reputed article publishing sites
  • Editorial linking – quality content gets linked from other sites
  • Link exchange – exchanging links with relevant websites
  • Link baiting – creating content compelling enough that other sites link to it
  • Content syndication – republishing content on other sites for reciprocal links
  • Directory submission – submitting the website to DMOZ, Yahoo and other directories

Note: Google's Penguin update has rendered the last four options largely ineffective.

Technical SEO factors

  • Download speed (aim for 2–2.5 seconds)
  • Enable page caching
  • Compress images for quick page loads
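As an illustrative sketch (assuming an Apache server with mod_expires enabled; the asset types and durations are only examples), browser caching for static assets can be switched on like this:

```
# Hypothetical Apache configuration: cache static assets
# in the visitor's browser for a month
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType text/css "access plus 1 month"
</IfModule>
```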

Conclusion

Websites can be optimized by including keywords in the internal structure and improving site reputation through link building. SEOs need high-traffic keywords and quality content to make sites search engine friendly.
