Essential SEO Techniques

Search Engine Optimization Basics

Every website aims to be at the top of the search engine results pages. The driving force behind this goal is that search engines are the number one source of website traffic – not just any traffic, but targeted site traffic. The websites currently at the top of the search engine result pages (SERPs) are those with high rankings and quality content.

Why do these websites rank so highly? They hold the positions they do because they have used strategies that allow their original content to be discovered and consumed organically. If you want the same success, start paying attention to the factors that contribute to these search rankings. Here are some steps you can take to increase your site's rank:

File names

Be conscious of file and folder names. Use filenames that actually make sense: for example, contact.html for the contact page, or product.html for the products page. For a little SEO boost, if you are selling socks, name the product page socks-products.html. The same goes for images. If you have an image of a boy riding a bike, boy-riding-bike.jpg is much more descriptive than image32.jpg.

URL Schemes

Many search engines (including Google) treat the underscore as a word character, not as a delimiter (which is how the hyphen is treated). So the URL www.somewebsite.com/category/buy-fishing-supplies/ is understandable to bots: they can gather that the page is probably related to purchasing, supplies, and fishing. The URL www.somewebsite.com/category/buy_fishing_supplies/, on the other hand, is far less meaningful, since buy_fishing_supplies is treated as a single word that has no meaning.

According to some estimates, Google may place as much as 10% of its ranking weight on domain names, folder names, and filenames. That's a very healthy figure, especially when many search engines don't place even that much importance on keyword lists! Search engines also look at alt text, and they scan the actual content of your Web site (placing extra importance on heading text). The last two things to consider are page titles and domain names.

Having your primary keyphrase within your domain name can help your ranking; just try not to overuse your primary keyword. Google continues to heavily penalize Web sites that overuse keywords within naming conventions.

Utilize ALT Attributes

ALT text was originally meant for text browsers: since images didn't display there, the ALT text told the visitor what the image was about. You should put your main keyword(s) in the ALT attributes where it makes sense, but don't overdo it, because you could get dropped in the results or even banned if you are deemed to be exploiting this technique! Every single image should have alt text, which is not necessarily the same as the title for the image. They are two separate attributes for a reason, you know.
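Here's a minimal sketch of the difference (the filename and text are invented for illustration):

<code>
<!-- alt describes the image for crawlers and assistive technology;
     title is supplementary hover text: two separate attributes -->
<img src="boy-riding-bike.jpg" alt="A boy riding a bike down a hill" title="Summer fun on two wheels">
</code>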

Do not try to trick a search engine

Search engines can identify when you are playing tricks on them, so be legitimate in the strategies you use. Search engines are far more likely to realize you are trying to pull one over on them, and penalize you for it, than they are to buy in and place you at the top of the rankings. What do you need to avoid? There are a number of "tricks" to steer clear of, including keyword stuffing, hidden text, search engine cloaking, duplicate sites, and link farms.

Add microdata to your markup

This is still a relatively new concept, but an important one. Adding microdata to your pages goes the extra mile to ensure your markup accurately describes the content contained within it. I'm not going to go into detail on how to do this; instead, I'll point you to a good blog post that covers the how-to: http://vanseodesign.com/web-design/html5-microdata/
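As a taste of what this looks like, here's a minimal sketch using the schema.org Product vocabulary (the product name and price are invented for illustration):

<code>
<!-- itemscope/itemtype declare the vocabulary; itemprop labels each piece of data -->
<div itemscope itemtype="https://schema.org/Product">
  <h2 itemprop="name">Merino Wool Socks</h2>
  <p itemprop="description">Warm, durable socks for winter hiking.</p>
  <span itemprop="offers" itemscope itemtype="https://schema.org/Offer">
    $<span itemprop="price">12.99</span>
  </span>
</div>
</code>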

Page Title Optimization

The page title should be relevant to the page itself, and every single web page should have its own title tag. Unique page titles increase the chance that each page will be indexed properly and separately. In some cases, search engines refrain from indexing the pages of a particular site because the pages share the same title and are assumed to be duplicates. This is why you should give each page a unique page title. It's also a good idea to start your page title with a keyword.
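A sketch of what this might look like for the hypothetical sock shop used in earlier examples:

<code>
<!-- Keyword first, then the site name; every page gets its own title -->
<title>Wool Socks for Hiking | Example Sock Shop</title>
</code>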

Meta Tags Optimization

To assist in ranking well on the search engines, optimize the meta tags, specifically the keywords and description tags. Meta tags provide information to the search engines: the information is visible to the search engines but invisible to ordinary users. Just as you need unique page titles, you also need unique meta tags for each of your site's pages. The meta tags are among the first things a search engine finds when crawling a page, so the trick lies in using them well while not expecting too much from them. Search engines place much less emphasis on meta tags than they used to. In fact, most search engines place no importance at all on the Keywords meta tag, and only limited importance on the Description meta tag.

That said, it's still important to include Keywords and Description meta tags within each page you'd like to optimize. SEO is a cumulative effort. By that, I mean that no single SEO technique, by itself, will land you a top search engine position. It's the cumulative effect of all SEO techniques together that leads to top search engine positions.
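A minimal sketch of what these tags look like in a page's head section (the content values are invented):

<code>
<head>
  <title>Wool Socks for Hiking | Example Sock Shop</title>
  <meta name="description" content="Durable merino wool socks for hikers, with free shipping on orders over $25.">
  <meta name="keywords" content="wool socks, hiking socks, merino socks">
</head>
</code>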

Meta tag for search engine robots

You can use this tag to tell search engines whether a page should be indexed. For example, you may have pages with wholesale prices for specific bulk-buying customers that you don't want discoverable by the general public.

<code>
<meta name="robots" content="noindex, nofollow">
</code>

This meta tag requests that search engines not index the page, and not follow links to other pages that originate from it. Only include this tag within pages you don't want indexed! It doesn't prevent bots from indexing or following these pages; it serves as an explicit suggestion not to, and you are therefore unlikely to be penalized if content on these pages is less SEO-friendly than on other pages. If certain pages absolutely cannot be accessible to a search engine, consider password protecting them with Apache, or require users to be logged in before allowing the page to load.
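For completeness, here's a minimal sketch of Apache password protection using HTTP basic authentication (the realm name and password file path are hypothetical, and this assumes mod_auth_basic is enabled):

<code>
# .htaccess: require a username/password before serving anything in this directory
AuthType Basic
AuthName "Wholesale customers only"
AuthUserFile /home/example/.htpasswd
Require valid-user
</code>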

Using robots.txt

Some search engines aren't capable of understanding the Robots meta tag. These search engines require you to upload a special file to the root directory of your web site: a simple text file named exactly "robots.txt". It is good practice to include this file either way. If you want all pages within your site to be indexed, type the following within the text file:

<code>
User-agent: *
Disallow:
</code>

Notice that nothing is listed as being disallowed.

If you'd like no pages within your site to be indexed, you'll type the following within the text file:

<code>
User-agent: *
Disallow: /
</code>

Notice the forward slash (representing everything within the root directory).

If you'd like to specify certain files or directories that shouldn't be indexed, you must list them on separate lines within the file. Here's an example in which two directories, wp-admin and wp-includes, are disallowed:

<code>
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
</code>

Create a website sitemap

Create a sitemap that contains links to all the main pages of your website, so the search engine robots can easily find all the different pages. They'll likely find them either way, but let's be real here: you want to make this experience a breeze for bots. The last thing you want to do is make them hunt for your precious content. If you scratch their backs, they'll scratch yours. Once you have a sitemap, be sure to submit it to each search engine you want to be found in, and also provide a path to it in your robots.txt file:

<code>
User-agent: *
Disallow: /cgi-bin/
Disallow: /test/
Sitemap: http://www.yourdomainname.com/sitemap.xml.gz
</code>
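And here's a minimal sketch of the sitemap file itself, following the sitemaps.org XML format (the URLs are placeholders):

<code>
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yourdomainname.com/</loc>
  </url>
  <url>
    <loc>http://www.yourdomainname.com/socks-products.html</loc>
  </url>
</urlset>
</code>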

Search Engine Friendly Internal Links

It is important that you establish internal links that are search engine friendly. In doing so, remember that search engines can follow plain HTML links far more easily than links generated by any other means. On-site cross-linking will also help you get all of your web pages indexed by the search engines. Every web page should be no more than three clicks away from the home page. Link to topic-related quality content across your site; this will also help you build a consistent theme throughout your web site. On every page, link back to your home page and your main service(s).
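Here's a minimal sketch of crawler-friendly internal links (the URLs are hypothetical):

<code>
<!-- Plain HTML anchors are the easiest links for crawlers to follow -->
<nav>
  <a href="/">Home</a>
  <a href="/socks-products.html">Wool Socks</a>
  <a href="/contact.html">Contact</a>
</nav>
</code>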

Link Popularity

Link popularity is the most powerful SEO tool of them all. Most search engines won't even consider a web site if it doesn't have at least one or two links pointing to it. Having other sites link to your web site is important when it comes to getting your site a good ranking. Even a crappy website will bubble to the top of the search rankings if it's popular enough!

Read More Links

Google tends to weight hyperlinked words more heavily than words that are not linked. So by writing a description of what a Read More link leads to, you increase the keyword density of your Web pages, and you also help people with disabilities understand which Read More link they are clicking to get more information. Links with text like "Click Here" or "Read More" are not very descriptive.
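A minimal before-and-after sketch (the URL and anchor text are invented):

<code>
<!-- Vague: says nothing about the destination -->
<a href="/articles/sock-care">Read More</a>

<!-- Descriptive: carries keywords and helps screen reader users -->
<a href="/articles/sock-care">Read more about caring for wool socks</a>
</code>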

Web Site and Markup Structure

This is also important if you want to get indexed! Text content should outweigh the HTML markup. The page's markup (and styling) should validate and be usable in all of today's leading browsers. Stay away from Flash and inline CSS styling; search engines dislike both. Search engines used to dislike heavy JavaScript, even in external files, but recent studies have found that more and more of the relevant search engines actually execute and read JavaScript. This is largely due to the explosion of JavaScript frameworks in recent years. Why does it matter? Because the GENERATED markup is often very different from the markup you see when you "view source" on a page. There's a good article that talks about Google bots crawling JavaScript.

Be cautious about placing long JavaScript code (or inline CSS styles) in the head section of your Web pages. Some search engines only read a set number of characters before moving on to index another Web page; if your pages include long JavaScript above the body section of your HTML documents, that code can cut into the amount of actual content that gets indexed. If you need copious amounts of JavaScript within your pages, I recommend moving the bulk of it into external files that can be called with a single line of code.
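For example, a single external include like this (the filename is hypothetical) keeps the head section lean:

<code>
<!-- One line in the document instead of hundreds of lines of inline script -->
<script src="/js/app.min.js" defer></script>
</code>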

Heading Tags

H1 tags are the next big thing that search engines look for. Generally speaking, the largest visible text on a Web page should be the most important text about the page's content. So, if the purpose of your Web site is to sell socks, you'll want the word "socks" to appear in large text on the page. Designers often like to use images for headers, but this is certainly not optimal for SEO success. We call this type of text heading text, and it should use the <h1> and <h2> tags.
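A minimal sketch for the sock example:

<code>
<h1>Wool Socks</h1>
<h2>Hand-knit socks for every season</h2>
</code>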

Performance Performance Performance

Your web page's speed matters to your visitors and to the search engines. Why? Because the robots can spider a fast page more quickly and easily, and in recent years the major search engines have begun to actually penalize slow websites. Consider the following when optimizing your site:

  • Try your best to keep your web page under 1,024 KB (1 MB) in size, including assets!
  • Keep the number of network requests to a minimum (use image sprites; concatenate CSS and JS)
  • Minify everything possible (HTML, SVGs, CSS, and JS)
  • Activate GZIP compression (see the sketch after this list)
  • Utilize browser caching
  • Use a CDN where possible and where it makes sense
  • Remove unused assets (JS libraries and fonts are two big ones)
  • Resize/optimize images for the web
  • Compress images further
  • Consider lazy-loading/on-demand content for large articles
  • Replace imagery with CSS effects where possible
  • Replace JavaScript animations with CSS effects where possible
  • Use data URIs for imagery where possible
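Here's a minimal sketch of enabling GZIP compression and browser caching on Apache (this assumes mod_deflate and mod_expires are enabled; the cache lifetimes are illustrative):

<code>
# .htaccess: compress text-based responses
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

# .htaccess: tell browsers to cache static assets
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
</IfModule>
</code>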

When in doubt, run your website through tools like YSlow, Pingdom Tools, and Google PageSpeed Insights to help you diagnose issues.

Keyword Density (KD)

Definition: A percentage derived by dividing the number of times your keyword appears in your Web page text by the total number of words in your Web page text: KD = N / T, where N is the number of times the keyword appears in the HTML document and T is the total number of words in the HTML document.
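For example, a 500-word page that uses its keyword 15 times has KD = 15 / 500 = 3%.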

This is also vital and should be applied with research. You should use the keyword(s) once in the title tag, once in the heading tag, and once in bold text, and get the density between 5% and 20% (don't overdo it!). Also use your keyword(s) both high and low on the web page: keyword(s) should appear in the first sentence and in the last one.

More specific KD goals

Page Title: 5%-10%
Meta Description: 15%-25%
Meta Keyword List: 15%-25%
Visible Text: 2%-5%

SEO anti-spam rules:

  • Don't overuse your keywords.
  • Only use keywords that relate to the content of your Web site.
  • Repeat keywords a maximum of two times within meta tags, and don't repeat keywords within the page title.
  • Don't include comment tags within your HTML documents unless they're absolutely necessary.
  • If you do include comment tags, absolutely do not include keywords within your comment tags.
  • Do not include HTML tags within your comment tags.
  • Never place text within your Web page that's the same color as the page background.
  • Never place text within your Web page that's similar in color to the page background color.
  • When placing text within visible tables, be careful not to include text that may be similar to the background color of the table.

Publish articles and newsletter/press releases

Writing and publishing is one of the best ways of marketing your website on the internet. It helps generate substantial traffic to your website.

You can submit articles to e-zines, article directories, web sites, and magazines that accept article submissions. Don't forget to include your business information and contact address at the bottom of the article. Article directories: shvoong.com, goarticles.com, uniterra.com, allfreelancework.com, ezinearticles.com, etc. Use the keyword "article submission" in your search.

Press release sites: free-press-release.com, prweb.com, prleap.com, pressbox.co.uk, etc. Use the keyword "free press release distribution" in your search.

Utilize the hfeed class in your markup

If you have a website with syndicated content, such as a blog, make sure you add the class "hfeed" to your content container. The "hfeed" class is part of the hAtom 0.1 microformat specification, and it applies metadata to your markup: it indicates to machines that the enclosed content is syndicated content (such as a blog feed). What is syndicated content? It's content that is continuously updated, and search engines have been trained to love continuously updated content, as it usually means the content is current.
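A minimal sketch following the hAtom microformat (the class names hfeed, hentry, entry-title, and entry-content come from the spec; the content is invented):

<code>
<div class="hfeed">
  <article class="hentry">
    <h2 class="entry-title">Choosing the Right Hiking Socks</h2>
    <div class="entry-content">
      <p>...</p>
    </div>
  </article>
</div>
</code>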

It takes time for your site to be indexed in the SERPs

After making changes to your site, you must wait until the changes have been indexed by the search engines before you can see the results. Only then will you be able to evaluate whether the changes you made had a positive or negative effect.