Search Engine Optimisation (SEO)

What is SEO?

93% of internet users worldwide use search engines to find websites (Forrester Research), and as a result Search Engine Optimisation (SEO) is becoming more and more important. SEO is the process of optimising your website for specific keywords, in order to improve your website’s ranking in search engines.

Search engines produce two types of search results: organic results and paid results. Organic results are the results that appear naturally, based on their relevance to the searcher’s query. SEO is focused on improving a website’s ranking for organic search results.

The six steps of Search Engine Optimisation

There is an overload of information on SEO, and if you're new to the subject it can be hard to know where to start. align.me has identified six distinct steps that we consider important in SEO.

  1. Keyword identification
  2. Competitive analysis
  3. Onsite optimisation
  4. Offsite optimisation
  5. Analytics and reporting
  6. Conversion optimisation


1. Keyword identification

The first step of Search Engine Optimisation is aimed at identifying the keywords that you want to optimise your website for. In other words, for which keywords do you want to rank in the search engines? Construct a list of keywords based on logic, considering your product or service and the industry you're active in. More importantly, think about what the searcher might enter into the search engine. Someone looking for a vacuum cleaner might type in 'vacuum cleaner', but also 'cleaning carpet' or 'vacuuming floors'. At this stage there is no limit to the length of the list.

Provide your SEO consultant with the list of keywords, and ask them to perform stemming. Stemming is used to derive other relevant keywords based on the stem of a keyword. After stemming, the keyword ‘vacuum cleaner’ will also result in ‘vacuum cleaners’. Review the stemmed list of keywords, and consider whether a keyword will actually be used by a searcher.
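
To see what stemming produces in practice, here is a minimal sketch using NLTK's SnowballStemmer to group candidate keywords by their stem; the candidate list is illustrative only, and your SEO consultant will likely use a more specialised tool.

    # A minimal stemming sketch, assuming the nltk package is installed (pip install nltk).
    from collections import defaultdict
    from nltk.stem.snowball import SnowballStemmer

    stemmer = SnowballStemmer("english")

    # Illustrative candidate keywords; replace with your own list.
    candidates = ["vacuum cleaner", "vacuum cleaners", "vacuuming floors", "carpet cleaning"]

    groups = defaultdict(list)
    for phrase in candidates:
        # Stem each word so that plural and inflected forms fall into the same group.
        stem_key = " ".join(stemmer.stem(word) for word in phrase.split())
        groups[stem_key].append(phrase)

    for stem_key, phrases in groups.items():
        print(stem_key, "->", phrases)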

Ask the SEO consultant to analyse your selected keywords for local search traffic (the number of searches for each keyword) and SEO competition (the number of web pages that also rank for it). Based on these results, the SEO consultant should recommend approximately ten keywords that are worth optimising for and warrant further research.

2. Competitive analysis

Now that you've identified the keywords you want to rank and optimise your website for, it's time to research the strength of the competition. In this second phase you need to analyse how many other web pages rank for your selected keywords, and how hard it will be for your website to compete.

In analysing the strength of competition, we recommend including two sets of factors: on-page and off-page. On-page factors are elements on your website that indicate its importance to search engines; off-page factors are signals that sit outside your website. For each selected keyword, ask the SEO consultant to analyse the top ten search results against the following on-page and off-page factors (a small scripted check of the on-page factors follows the lists below):

Off-page factors:

  • Domain Age (DA)
  • PageRank (PR)
  • Google Index Count (IC)
  • Referring Domains - Domains (RDD)
  • Referring Domains - Page (RDP)
  • Page Backlinks (BLP)
  • Domain Backlinks (BLD)
  • Page .edu/.gov Backlinks (BLEG)
  • Yahoo Directory (YAH)

On-page factors:

  • Keyword in Title? (Title)
  • Keyword in URL? (URL)
  • Keyword in Description? (Desc)
  • Keyword in Header Tag? (Head)
  • Google Cache Age (CA)
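
To make the on-page side of this analysis concrete, the sketch below fetches one competing page and checks whether a keyword appears in its title, URL, meta description and headers. It assumes the requests and beautifulsoup4 packages; the URL and keyword are placeholders for a real competitor and one of your selected keywords.

    # A minimal on-page factor check, assuming requests and beautifulsoup4 are installed.
    import requests
    from bs4 import BeautifulSoup

    def onpage_factors(url: str, keyword: str) -> dict:
        """Report which basic on-page factors contain the keyword."""
        html = requests.get(url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        keyword = keyword.lower()

        title = (soup.title.string or "") if soup.title else ""
        description_tag = soup.find("meta", attrs={"name": "description"})
        description = description_tag.get("content", "") if description_tag else ""
        headers = " ".join(h.get_text(" ") for h in soup.find_all(["h1", "h2"]))

        return {
            "keyword in title": keyword in title.lower(),
            "keyword in URL": keyword.replace(" ", "-") in url.lower(),
            "keyword in description": keyword in description.lower(),
            "keyword in header": keyword in headers.lower(),
        }

    # Placeholder URL and keyword; repeat for each of the top ten results per keyword.
    print(onpage_factors("https://www.example.com/vacuum-cleaners", "vacuum cleaner"))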

Based on the analysis, ask the SEO consultant to recommend a final set of keywords that are worth optimising your website for.

3. Onsite optimisation

In this phase the actual optimisation of your website starts. For each keyword, select one page on your website that you will optimise; these are the so-called hero pages. Next you need to identify which elements of your website need optimisation. To do so, ask the SEO consultant and web developer to review the code on your pages, the site structure, the link structure and the Google index, and make adjustments accordingly.

As a marketer, you will be mainly responsible for optimising the website copy. In short, you need to include keywords in the copy, including titles, URLs and headings. The key to optimising website copy is to always write for the reader first and then optimise for search engines; in the end your website is for visitors, not for search engines. There's loads of great content on onsite optimisation throughout the web, but here are two resources to start with:

Firstly, Jim Stewart's 6 steps deal with the infrastructure as well as the words:

  1. Get Google Authorship set up.
  2. Do a site:website.com search on your site to make sure you have the right number of pages indexed.
  3. Use your keywords in the title, H1, early in the article, in alt tags, filenames and captions.
  4. Use a plugin like All in One SEO or Yoast SEO to make the above a little easier.
  5. Write a series of at least three posts for a difficult phrase.
  6. Get a link off your home page to the post you want to rank for.

Also, here is a list of 8 actionable techniques.

4. Offsite optimisation

The ranking of your website is also influenced by factors outside of your website, and SEO experts consider quality backlinks from credentialled sites to be one of the most important factors in improving your website's ranking. A backlink is an external link pointing to your site (e.g. a blog article linking back to your website). Backlinks indicate that other people find your content important, so Google assumes the same. The key to generating backlinks is having great content on your website, which can be achieved through blogging.

WordPress is a free tool for publishing blogs, and a great starting point for hosting your own blog. Be sure to include keywords in your blog content but, as mentioned previously, never at the cost of readability. You can also create backlinks by commenting on other relevant blogs or online communities in your industry (e.g. LinkedIn groups) and linking back to your own site. Note that the links need to point to your hero pages, not just any page on your website.

Also ensure that you include keywords in the links you've created; this is known as keyword-rich anchor text. However, make sure that not all backlinks to the site use keyword-rich anchor text: since the introduction of Google Penguin, Google has begun penalising sites whose backlinks only have keywords in the anchor text and are consequently seen as 'over-optimised' (see here for more information on how to better balance the anchor text in your backlinks).

When commenting on other blogs it's important to respect the content and ensure the post is relevant in context, to prevent it from coming across as spam. For SEO tips specifically for WordPress, read this article here. SEOprofiler is a great tool for tracking the number of backlinks to your website.

5. Analytics and reporting

It's important to track the progress of your optimisation on a monthly basis. Google Analytics and IBP Pro are great free tools for analysing your website traffic and ranking. Note that only changes in ranking of five spots or more are likely to be a result of SEO. SEO doesn't have an immediate effect on your ranking: in general, it takes months for a new site to appear in the search engine results pages (SERPs), and months for any site to see an increase in ranking. Bear in mind that SEO is an ongoing process and needs maintenance in order to achieve results.
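
As a simple way to apply the five-spot guideline above, the sketch below compares two monthly ranking snapshots and flags only the keywords that moved five or more positions; the keywords and positions are made up for the example.

    # A minimal monthly rank-tracking sketch; the keywords and positions are illustrative.
    last_month = {"vacuum cleaner": 24, "carpet cleaning": 15, "vacuuming floors": 41}
    this_month = {"vacuum cleaner": 17, "carpet cleaning": 14, "vacuuming floors": 48}

    for keyword, old_rank in last_month.items():
        new_rank = this_month.get(keyword)
        if new_rank is None:
            continue
        change = old_rank - new_rank  # positive means the page moved up the results
        # Per the guideline above, treat moves of fewer than five spots as noise.
        if abs(change) >= 5:
            direction = "up" if change > 0 else "down"
            print(f"{keyword}: moved {direction} {abs(change)} spots ({old_rank} -> {new_rank})")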

6. Conversion optimisation

Conversion optimisation is the final step in optimising your website, and is aimed at maximising the percentage of website visitors who complete a desired action (e.g. download a paper or buy a product). There are no hard rules for conversion optimisation, other than making your website user-friendly and ensuring that it's easy for visitors to complete your desired action. If, for example, you want visitors to download a whitepaper, make sure the link to the whitepaper is prominent at the top of the page.

If your website requires more in-depth conversion optimisation, it's worthwhile installing Google Website Optimizer (a landing page conversion improvement tool). A great read on optimisation is Glenn Murray's eBook SEO Secrets.

SEO tools

SEMrush is an online marketing software tool that collects data on over 95 million keywords and 45 million domains. On top of researching how products are currently advertised and finding the best way to word advertising text, SEMrush also lets you research search trends, competition and the dollars spent by your competitors.

SEMrush has four key features:

  • Organic research: see your competitors' organic positions;
  • Advertising research: study your competitors' ads text;
  • Keyword research: find good long-tail keywords; and
  • Charts: compare various SEO metrics.

Recently, SEMrush has undergone a series of developments, including a new software structure to improve usability and an expansion of its database. SEMrush has also added a number of new features, including an API unit check call, the ability to enter up to five domains at a time and see what keywords they have in common, and the ability to see what keywords two of your competitors rank for in comparison to your own website.

HitTail is a great tool for finding long tail keywords that:

  • reflect your traffic and content, but for which you are not already ranking
  • have decent search volumes
  • face low competition

Marketers love long tail phrases, and HitTail makes it easy to choose the best ones to optimise for. See also the blog post on long tail phrases at inbound marketing b2b.

Alexa will help you determine the page ranking of your competitors, and the phrases they are enjoying search traffic for.

Source(s): SEO solutions

Recent Google Updates

Panda

Google Panda is a change to Google's search results ranking algorithm that was first released in February 2011. The change aimed to lower the rank of 'low-quality' or 'thin' sites, and return higher-quality sites nearer the top of the search results. This resulted in a surge in the rankings of news websites and social networking sites, and a drop in rankings for sites containing large amounts of advertising. The change reportedly affected the rankings of almost 12 percent of all search results. To help affected publishers, Google has posted a blog that outlines how to create a 'high-quality site' according to Google.

Source(s): SEO solutions

Page layout

In January 2012, Google released an updated algorithm that penalises sites with little content 'above the fold' (in web developers' terms, content that is visible without needing to scroll or click on a link).

Source(s): SEO solutions

Penguin

Google Penguin is a further Google algorithm update, released in April 2012, that is aimed at decreasing search engine rankings of websites that violate Google’s Webmaster Guidelines by using black-hat SEO techniques, such as keyword stuffing, cloaking, participating in link schemes, deliberate creation of duplicate content, and others. Google Penguin also penalises sites that it deems are 'over-optimised', and have too many backlinks that use keywords in the anchor text. To avoid this, ensure that backlinks to your site use a variety of anchor-text types, including:

  • Branded anchors (e.g. align.me)
  • Naked URLs (e.g. www.align.me)
  • Universal anchors (e.g. click here)

More information is available here.
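
As an illustration of checking that balance, the sketch below classifies a list of backlink anchor texts into the types above plus keyword-rich anchors and reports the share of each; the brand name, keyword list and anchors are placeholders for your own backlink export.

    # A minimal anchor-text balance check; brand, keywords and anchors are placeholders.
    from collections import Counter

    BRAND = "align.me"
    KEYWORDS = {"seo", "search engine optimisation"}

    anchors = ["align.me", "www.align.me", "click here", "SEO consultants", "read more", "align.me"]

    def classify(anchor: str) -> str:
        text = anchor.lower()
        if text.startswith(("http://", "https://", "www.")):
            return "naked URL"
        if BRAND in text:
            return "branded"
        if any(keyword in text for keyword in KEYWORDS):
            return "keyword-rich"
        return "universal"

    counts = Counter(classify(anchor) for anchor in anchors)
    total = sum(counts.values())
    for label, count in counts.most_common():
        print(f"{label}: {count}/{total} ({count / total:.0%})")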


8 SEO Steps to Maintain Your Rankings when Redesigning your Website

1. Block Google from your Staging Site

When developers place your website on a staging site, ensure that a robots.txt disallow and a noindex, nofollow robots meta tag are in place so that there is no duplicate site serving the same content to the engines. Password protecting the staging site will also render it visible only to the developer. It's important to apply both measures to assure yourself that the staging site is not publicly accessible.
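
One way to implement this, sketched below under the assumption of a Flask-based site and a hypothetical STAGING environment variable, is to serve a disallow-all robots.txt and send a noindex, nofollow header on the staging environment only; remember that password protection is still needed on top of this.

    # A minimal sketch of blocking crawlers on staging, assuming a Flask site
    # and a hypothetical STAGING environment variable set on the staging server.
    import os
    from flask import Flask, Response

    app = Flask(__name__)
    IS_STAGING = os.environ.get("STAGING") == "1"

    @app.route("/robots.txt")
    def robots():
        # On staging, ask all well-behaved crawlers to stay out entirely.
        body = "User-agent: *\nDisallow: /\n" if IS_STAGING else "User-agent: *\nDisallow:\n"
        return Response(body, mimetype="text/plain")

    @app.after_request
    def add_robots_header(response):
        # Belt and braces: X-Robots-Tag asks engines not to index or follow staging pages.
        if IS_STAGING:
            response.headers["X-Robots-Tag"] = "noindex, nofollow"
        return response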

2. Remove Step 1 When Going Live

Once you go live with the redeveloped site, confirm that the noindex, nofollow directives have been removed from the robots meta tag (and that robots.txt no longer blocks the site).

3. Account for No Empty Pages or a Panda May Swipe You

Google periodically runs its Panda update, which checks for and demotes low-quality content across the Google index. Pages with little or no content, or with headlines such as “coming soon”, may be swept from the Google index. Do not leave empty pages live: they signal low quality and leave the site vulnerable to Panda updates.

4. Spider your existing site and back it up

Xenu for PC and Integrity for Mac are programs that will gather and organise your URLs. Pages that rank well for important phrases need to be matched up with their new URLs and given priority. Backing up your old site is compulsory, so nothing is lost if something goes wrong.
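
If you prefer to script the crawl yourself, here is a minimal same-site spider (assuming the requests and beautifulsoup4 packages) that collects internal URLs into a CSV so they can be matched to their new equivalents in step 7; the start URL is a placeholder for your own site.

    # A minimal same-site crawl sketch; the start URL is a placeholder.
    import csv
    from urllib.parse import urljoin, urlparse

    import requests
    from bs4 import BeautifulSoup

    START_URL = "https://www.example.com/"
    DOMAIN = urlparse(START_URL).netloc

    seen, queue = set(), [START_URL]
    while queue:
        url = queue.pop()
        if url in seen:
            continue
        seen.add(url)
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue
        for link in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            absolute = urljoin(url, link["href"]).split("#")[0]
            if urlparse(absolute).netloc == DOMAIN and absolute not in seen:
                queue.append(absolute)

    # Write the URL inventory out so old pages can be mapped to new ones in step 7.
    with open("old_urls.csv", "w", newline="") as f:
        csv.writer(f).writerows([[u] for u in sorted(seen)])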

5. Changing your content? Why?

Avoid changing the content of well-ranking pages. If you must, use tools such as tagcrowd.com to replicate the relative keyword density. Page titles and menu structures also need to be examined before being changed: Google determines the importance of pages through these variables, and well-ranked pages may lose their standing. If the content stays the same, so should the page title; if the content does change, ensure that the page title reflects the new content.

6. Check the new site for errors

Is the new site free from internal errors? Internal errors include 404s, 501s, 500s and 302s. A 302 isn't technically an error, but it should be avoided. Xenu and Integrity can pick up most errors, while the httpd error log will locate any hidden ones. Also, compare the load times and speed of the new site to your existing site.

7. 301 Everything

301 is your friend, internally and externally. A 301 tells Google where the new version of a page is and permanently redirects everything. Use the spreadsheet created in step 4 to redirect all your old pages to the equivalent new pages. Without the redirects, Google will only see that the content from the old pages has been removed, not that it has been republished under a new URL. Redirecting helps ensure your rankings won't drop, and it also means external links that still point to old URLs are not wasted.
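
As one way to do the bulk of this work, the sketch below turns the old-to-new URL mapping from step 4 into Apache "Redirect 301" rules; the file names and the assumption of an Apache host are placeholders for your own setup.

    # A minimal sketch that turns an old-to-new URL spreadsheet into Apache 301 rules.
    # Assumes a two-column CSV (old path, new URL); file names are placeholders.
    import csv

    with open("redirect_map.csv") as source, open("redirects.conf", "w") as target:
        for old_path, new_url in csv.reader(source):
            # e.g. Redirect 301 /old-page https://www.example.com/new-page
            target.write(f"Redirect 301 {old_path} {new_url}\n")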

8. Watch Google

Always account for the possibility that something has been missed. Google Webmaster Tools will highlight areas for improvement.

  • Crawl Errors – Google may identify pages you didn't find with your own crawl. Redirect them.
  • HTML Suggestions – Duplicate content and page titles will negatively affect your rankings.
  • Crawl Stats – Monitor “Time Downloading a Page”; if a significant increase occurs, examine your hosting options, check the plugins and find out if JavaScript on the site is slowing things down.
  • Index Status – Look for oddities. If Google does not have the right number of pages in the index there’s a problem.


More information is available in Jim's blog post, 8 SEO steps to maintain your rankings when redesigning.

Source(s): SEO solutions


A short guide to SEO

How do Search Engines work?

1) Crawling and building an index – Search engines first build an index of the unique documents on the internet, and the links between them, using 'crawlers'. This information is stored until someone searches for it.

2) Delivering relevant results, ranked by perceived importance – When you search for a term, the engine hunts through its stored data to retrieve the results that are most relevant, and then sorts them by its perception of their importance and usefulness to the searcher.

How do Search Engines determine Relevancy and Importance?

They are determined mathematically, using algorithms made up of hundreds of components known as “ranking factors”. Relevance used to be determined simply by whether a page directly matched the words in the search term, but there are now many factors influencing relevance, which we will discuss in detail later on. Importance is currently determined by popularity: if a page is more popular, the engine assumes it is more likely to be useful and valuable to you.

Tips from Google and Bing on Search Engine Optimisation

Google

Avoid 'cloaking' – that is, presenting your content to search engines differently from the way you present it to users. Make your pages primarily for your users.

Human-friendly URLs – use descriptive and clear keywords for your URLs. Use one version of a URL for a particular document, and use 301 redirects or the rel="canonical" element for any duplicate content.

Clear and accurate content – Make sure your content is clearly written and accurately described by the page title and name.

Clear hierarchy and text links - Every page should be reachable from at least one static text link.

Bing

A clear URL with many relevant keywords

Avoid hiding content and links that are buried in rich media (Adobe Flash Player, JavaScript, Ajax)

Make sure your content uses keywords that would match what your users would search for.

Create new content regularly.

Don't put text that you want indexed by search engines inside images; for example, company names and addresses that are present only in the company logo.

How do people use Search Engines?

Keep in mind, while you do have to adjust your content to perform better for search engines, the ultimate objective is to engage and inform your consumer. So write for your readers, not the search engines.

A great way to do this is to observe what people search for. There are three main categories:

Do or Transactional queries – if a person wants to download a song or buy a concert ticket, they will type this into the search engine with a purpose of achieving that outcome.

Know or Informational queries – these include the more specific ‘How-tos’ as well as vague search terms, with the goal of getting information or understanding of that subject.

Go or Navigation queries – This is when the person has a specific need to find a particular webpage or site.

Why do we need Search Engine Optimisation?

You could have built the best website with the highest quality content, but unless you promote it, it could easily get lost in the sea of other search results. Search engines have no way of determining the actual value of your webpage's content; it is up to you to make sure it is as widely visible as you can make it.

Search engines are not perfect. There are many problems that can lead to imperfect results. Some of these include:

1) Problems with 'spidering' and indexing content – some content can be mistakenly or intentionally blocked from being indexed by a search engine. Duplicate webpages, content behind log-in pages and incomplete linking can lead to imperfect information and representation on search engine results pages.

2) Content that doesn't match the query – this can include content that isn't written with search terms in mind. An example would be writing 'food cooling unit' instead of 'refrigerator'. Both are correct, but one will be used far more often in searches. Targeting locations incorrectly and not writing content in the correct context can also mislead search engines and produce results that confuse the searcher.

How to design your webpage for SEO

In this section, we will discuss ways you can optimise the following design aspects of your webpage specifically for SEO.

Indexable content - Any content that you want indexed by the search engines should be in a form the search engine bots can understand, so make sure you use HTML text format. Use a feature like Google Cache to view your website and ensure that all the content is visible to the search engine; otherwise, your website is not likely to appear in the search results. If you have video or audio content, include a text transcript. Images in gif, png or jpg format can be assigned 'alt attributes' in HTML, which gives search engines a text description of the content.

Crawlable link structures - Ensure that your website is structured in a way that allows search engine crawlers to easily navigate their way through it. Make sure you don’t have broken links or orphan pages that won’t be able to be picked up by the crawlers. If you specifically want to hide links or pages from search engine crawlers, the metatags Nofollow and Noindex will allow you to do just that.

Keywords - Use your keywords intelligently. Keywords are one of the most important factors in the search process: the entire retrieval method is based on matching page content to the keywords in the search terms. Other data, such as the order of the words, spelling, punctuation and capitalisation, provides further information that helps the engines retrieve the right pages and rank them correctly. Make sure that keywords are used prominently in titles, text and meta data. Tips for on-page keyword optimisation:

  • Use the keyword in the title tag at least once, close to the beginning.
  • Use it at least 2-3 times in the body of the page; variations of the keyword also count.
  • Use it at least once in the alt attribute of an image, which will also help with image search.
  • Use it once in the URL.

The more specific your keywords, the better your chances of ranking, as you will have less competition from other pages. Make sure your keywords are part of your indexable content.

Title Tag - A title tag is an accurate, concise description of a page's content. Some tips for optimising your title tag for SEO:

  • Keep the length in mind - search engines display only the first 65-75 characters of a title tag in the search results, and this is also the limit allowed by most social media sites. However, if you're targeting multiple keywords and having them in the title tag is essential to ranking, a longer title may be the better option.
  • Place important keywords close to the front - this helps in the ranking process and may also influence a user's decision to click on your page in the results.
  • Consider readability and emotional impact - think about the user's experience of reading your title tag. It is a new visitor's first interaction with your brand, so strive for a positive, memorable impression.

URL structure - URL structure can help provide better rankings and also improves the user experience of your website. Try to make sure that your URL includes at least enough information to tell a user the basic premise of the content your page offers. Make it easy to read, with few numbers and symbols, and use hyphens to separate words. Shorter URLs make it easier to share your link and ensure the whole link is visible in search results. Try to use your keywords in your URL, but overuse may trip spam filters.
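
As a small illustration of those rules, here is a sketch that turns a page title into a readable, hyphen-separated URL slug; the example title is made up.

    # A minimal slug sketch: lower-case, symbols stripped, words separated by hyphens.
    import re

    def slugify(title: str) -> str:
        slug = title.lower()
        slug = re.sub(r"[^a-z0-9\s-]", "", slug)        # drop symbols and punctuation
        slug = re.sub(r"[\s-]+", "-", slug).strip("-")  # collapse whitespace into single hyphens
        return slug

    print(slugify("The 6 Steps of Search Engine Optimisation!"))
    # -> the-6-steps-of-search-engine-optimisation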

Rich snippet - Rich snippets are an increasingly popular type of structured data that allow webmasters to mark up content in a way that lets search engines easily identify what type of content it is and display it accordingly in the search results. Examples of content that can benefit from this include pages on people, products, reviews, businesses, recipes and events.

Avoid canonical and duplicate content issues - Duplicate content is when the same content is available at multiple URLs or pages. Search engines will rarely show two or more copies of the same content in results, so they are forced to choose one, and you run the risk of all those pages ranking lower. You can return a 404 Not Found if the duplicate content is not used or accessed at all, or use a 301 redirect. Canonicalisation problems arise when very similar pieces of content sit under more than one URL; you can remedy this with a 301 redirect or a canonical URL tag, consolidating the content onto a single page that has a better chance of ranking.

Keyword Research

This section deals with tips for choosing the right keywords for your webpage.

Assessing the value of the keyword - Settle on a keyword that matches the content of your page as well as the expectations of users who search for that keyword. Try searching for that keyword to see what kinds of websites already rank for it and how difficult it would be to rank highly for it yourself. Check whether the keyword attracts many advertisements alongside the results (this can indicate a keyword with a high conversion rate). You should also try to determine how much work would be required to reach your target ranking for the keyword. Google AdWords' Keyword Planner is a good starting resource for this research: it suggests keywords, provides estimated search volumes and predicts the cost of running a paid campaign for the chosen keyword.

The 'long tail' of Keyword Demand – popular search terms make up less than 30% of all the searches performed on the internet. The other 70% is made up of more specific, more detailed and harder-to-predict terms that are referred to as the 'long tail' of search. These searches often convert better, as they are typically performed by users who have a clearer idea of what they are looking for rather than just browsing a general term. It is therefore worth catering to the 'long tail', even though it is harder to predict, and not just to the generic search terms.

How User Experience and Usability can affect search rankings

Some of the ways search engines infer the quality of your content from how users engage with it are described below.

Engagement Metrics - This measures the success of your page by observing how users engage with the results. For example, if a user clicks on the first link and then immediately hits the "back" button to try the second link, this indicates that they were not satisfied with the first result. The "long click", where users click a result without immediately returning to the search page to try again, is what you should be striving for.

Machine Learning - In 2011 Google introduced the Panda update to its search results ranking algorithm. Google started by using human evaluators to manually rate thousands of sites for "low quality" content, and then incorporated machine learning to mimic those evaluators. Once its computers could accurately predict what the humans would judge to be a low-quality site, the algorithm was rolled out across millions of sites spanning the internet.

Linking Patterns - Sites that are more popular and have higher quality content are usually linked to more often than lower quality sites. Therefore, search engines conclude that the link structure of the web could serve as a proxy for votes and popularity.

Growing Popularity and Links

By analysing links, the search engines can discover how pages are related to each other and in what ways. The development of complex algorithms has allowed the engines to build nuanced evaluations of sites and pages from this information: based on links, they can assess the popularity of a website or page from the number and popularity of the pages linking to it, along with metrics like trust, spam and authority.

Trustworthy sites and spammy sites do not operate in the same way: trustworthy sites tend to link to other trusted sites, while spammy sites receive very few links from trusted sources.

Link Signals

Different factors are taken into account by professional SEOs when measuring link value and a site's link profile.

  • Global Popularity: If a site is very popular, the engines treat it as important.
  • Local/Topic-Specific Popularity: The concept of "local" popularity suggests that links from sites within a topic-specific community matter more than links from general or off-topic sites.
  • Anchor Text: The use of the right keywords in anchor text is essential. If dozens of links point to a page with the right keywords, that page has a very good probability of ranking well for the phrase targeted in that anchor text.
  • Trust Rank: In order to avoid spammy sites, search engines use systems for measuring trust, many of which are based on the link graph. Earning links from highly trusted domains can result in a significant boost to this scoring metric.
  • Link Neighborhood: Spam links often go both ways, so choose the sites you link to carefully and be equally selective about the sites you attempt to earn links from.
  • Freshness: It's important to keep links fresh - not only to earn links to the website, but to continue earning additional links over time. Search engines use the freshness signals of links to judge current popularity and relevance.

Link building basics

Three basic types of link acquisition:

  • "Natural" Editorial Links: Links that are given naturally by sites and pages that want to link to your content or company. These links require no specific action from the SEO.
  • Manual "Outreach" Link Building: The SEO creates these links by emailing bloggers for links, submitting sites to directories, or paying for listings of any kind. The SEO often creates a value proposition by explaining to the link target why creating the link is in their best interest.
  • Self-Created, Non-Editorial: These links offer the lowest value, but can, in aggregate, still have an impact for some sites. In general, search engines continue to devalue most of these types of links, and have been known to penalize sites that pursue these links aggressively. Today, these types of links are often considered spammy and should be pursued with caution.

Starting a Link Building Campaign

The first step in any link building campaign is the creation of goals and strategies. Unfortunately, link building is one of the most difficult activities to measure. SEOs rely on a number of signals to help build a rating scale of link value:

  • Ranking for Relevant Search Terms: One of the best ways to determine how well a search engine values a given page is to search for some of the keywords and phrases that page targets (particularly those in the title tag and headline).
  • Moz mozRank: mozRank (mR) shows how popular a given web page is on the web. Pages with high mozRank (popular) scores tend to rank better. The more links to a given page, the more popular it becomes.
  • Domain Authority: SEOmoz Domain Authority (or DA) is a query-independent measure of how likely a domain is to rank for any given query. It is calculated by analysing the internet's domain graph and comparing it to tens of thousands of queries in Google.
  • Competitor's Backlinks: By examining the backlinks of a website that already ranks well for your targeted keyword phrase, you gain valuable intelligence about the links that helped it achieve that ranking.
  • Number of Links on a Page
  • Potential Referral Traffic: Link building should never be solely about search engines. Links that send high amounts of direct click-through traffic not only tend to provide better search engine value for rankings, but also send targeted, valuable visitors to your site (the basic goal of all internet marketing).

Some samples of Link Building Strategies

  • Get customers to link to your website: If you have partners you work with regularly, or loyal customers that love your brand, you can use this to your advantage by sending out partnership badges - graphic icons that link back to your site.
  • Build a company blog: This content and link building strategy is so popular and valuable that it's one of the few recommended personally by the engineers at Google. Blogs have the unique ability to contribute fresh material on a consistent basis, participate in conversations across the web, and earn listings and links from other blogs, including blog rolls and blog directories.
  • Be newsworthy: Earning the attention of the press, bloggers and news media is an effective, time honored way to earn links.

Search engine tools and services

SEOs tend to use a lot of tools. Some of the most useful are provided by the search engines themselves. Search engines want webmasters to create sites and content in accessible ways, so they provide a variety of tools, analytics and guidance.

Common Search Engine Protocols

  • Sitemaps: Think of a sitemap as a list of files that gives hints to the search engines on how they can crawl your website (a sitemap-generation sketch follows this list). Sitemaps come in three varieties:
    • XML (recommended format): Extremely easy for search engines to parse and can be produced by a plethora of sitemap generators. The trade-off is relatively large file sizes.
    • RSS: Easy to maintain - RSS sitemaps can be coded to update automatically when new content is added - but harder to manage, precisely because of those updating properties.
    • Text file: Extremely easy - the text sitemap format is one URL per line, up to 50,000 lines - but it does not provide the ability to add meta data to pages.
  • Robots.txt: The robots.txt file, a product of the Robots Exclusion Protocol, is a file stored in a website's root directory. It gives instructions to automated web crawlers visiting your site, including search spiders.
  • Meta Robots: The meta robots tag creates page-level instructions for search engine bots. The meta robots tag should be included in the head section of the HTML document.
  • Rel="Nofollow":This attribute allows to link to a resource. Literally, "nofollow" tells search engines not to follow the link, but some engines still follow them for discovering new pages.
  • Rel="canonical": Often, two or more copies of the exact same content appear on your website under different URLs. The canonical tag solves this problem by telling search robots which page is the singular "authoritative" version which should count in web results.

Search Engine Tools

  • Settings such as Geographic Target, Preferred Domain, URL Parameters and Crawl Rate.
  • Diagnostics such as Malware, Crawl Errors and HTML Suggestions.
  • Some statistics: report keyword impressions, click-through rates, top pages delivered in search results, and linking statistics.
  • Site Configuration: an important area - submit sitemaps, test robots.txt files, adjust site links, and submit change of address requests when you move your website from one domain to another.
  • +1 Metrics: When users share content on Google+ with the +1 button, this activity is often annotated in search results.
  • Features
    • Identify Powerful Links: determine which links are most important.
    • Find the Strongest Linking Domains - the strongest domains linking to a domain.
    • Analyze Link Anchor Text Distribution - the distribution of the text people used when linking to the website.
    • Head to Head Comparison View - compare two websites to see why one is outranking the other.
    • Social Share Metrics - Measure Facebook Shares, Likes, Tweets, and +1's for any URL.

Myths and Misconceptions about search engines

In classical SEO times (the late 1990s), search engines had "submission" forms that were part of the optimisation process. Webmasters and site owners would tag their sites and pages with keyword information and "submit" them to the engines.

Unfortunately, the submissions were often spam, and the practice eventually gave way to purely crawl-based engines. Since 2001, search engine submission has not only been unnecessary, it is virtually useless. The best practice is to earn links from other sites; this exposes your content to the engines naturally.

Meta Tag

Meta tags (in particular, the meta keywords tag) were once an important part of the SEO process: you would include the keywords you wanted your site to rank for, and when users typed in those terms your page could come up for the query. This process was quickly spammed to death, and the tag was eventually dropped by all the major engines as an important ranking signal.

Keyword Stuffing and Density

A persistent myth in SEO revolves around the concept that keyword density - the number of instances of a given keyword divided by the total number of words on a page - is used by the search engines for relevancy and ranking calculations. This is completely untrue, yet many SEO tools still promote keyword density as an important metric. The better approach is simply to use keywords intelligently and with usability in mind.
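
For completeness, the calculation itself is trivial, which is part of why it is a poor proxy for relevance; a minimal sketch:

    # Keyword density as usually defined: keyword occurrences divided by total words.
    # Easy to compute, but (per the above) not something the engines reward in itself.
    def keyword_density(text: str, keyword: str) -> float:
        words = text.lower().split()
        if not words:
            return 0.0
        return words.count(keyword.lower()) / len(words)

    sample = "vacuum cleaners clean carpets better than any other home vacuum"
    print(f"{keyword_density(sample, 'vacuum'):.1%}")  # 2 of 10 words -> 20.0%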

Search Engine Spam

As long as there is search, there will always be spam. The practice of spamming the search engines - creating pages and schemes designed to artificially inflate rankings or abuse the ranking algorithms employed to sort content - has been on the rise since the mid-1990s.

  • Users hate spam, and the search engines have a financial incentive to fight it.
  • Search engines have done a remarkable job identifying scalable, intelligent methodologies for fighting spam manipulation, making it dramatically more difficult to adversely impact their intended algorithms. Not only do manipulative techniques not help in most cases, but often times they cause search engines to impose penalties on a site.

Page Level Spam Analysis

  • Keyword Stuffing: One of the most common spamming techniques, keyword stuffing is the repetition of keyword terms or phrases in a page in order to make it appear more relevant to the search engines. The thought behind this - that increasing the number of times a term is mentioned can considerably boost a page's ranking - is generally false. The engines have very obvious and effective ways of fighting this: scanning a page for stuffed keywords is not massively challenging, and the engines' algorithms are all up to the task.
  • Manipulating Linking: One of the most popular forms of web spam, manipulative link acquisition relies on the search engines' use of link popularity in their ranking algorithms to attempt to artificially inflate these metrics and improve visibility. This is one of the most difficult forms of spamming for the search engines to overcome because it can come in so many forms.
  • Cloaking: A basic tenet of all the search engine guidelines is to show the same content to the engine's crawlers that you'd show to an ordinary visitor. When this guideline is broken, the engines call it "cloaking" and take action to prevent these pages from ranking in their results.
  • Low-value Pages: Although they may not technically be considered "web spam", the engines all have methods to determine whether a page provides unique content and "value" to its searchers before including it in their web indices and search results. The most commonly filtered types of pages are "thin" affiliate content, duplicate content, and dynamically generated pages that provide very little unique text or value. The engines avoid including these pages and use a variety of content and link analysis algorithms to filter "low value" pages out of the results.

Domain Level Spam Analysis

  • Linking Practices: The engines can monitor the kinds of links and quality of referrals sent to a website. Sites that clearly engage in manipulative activities in a consistent or seriously impactful way may see their search traffic suffer, or even have their sites banned from the index.
  • Trustworthiness: For the search engines, trust most likely has a lot to do with the links your domain has earned; if you publish low-quality, duplicate content on your personal blog and then buy several links from spammy directories, your site is likely to suffer. Trust built through links is also a useful signal for the engines: a little duplicate content and a few suspicious links are far more likely to be overlooked if your site has earned hundreds of links from high-quality, trusted sources.
  • Content Value: Search engines constantly evaluate the effectiveness of their own results. They measure when users click on a result, quickly hit the "back" button on their browser, and try another result. This indicates that the result they served didn't meet the user's query.

Getting Penalties Lifted

Some recommendations:

  • Register the site with the engine's Webmaster Tools service (Google's and Bing's). This registration creates an additional layer of trust and connection between your site and the webmaster teams.
  • Make sure to thoroughly review the data in your Webmaster Tools accounts, from broken pages to server or crawl errors to warnings or spam alert messages.
  • Send your re-consideration/re-inclusion request through the engine's Webmaster Tools service rather than the public form
  • Full disclosure is critical to getting consideration. The engines, particularly Google, want the details, as they'll apply this information to their algorithms for the future.
  • Remove/fix everything you can. If you've done any manipulation on your own site (over-optimized internal linking, keyword stuffing, etc.), get it off before you submit your request.
  • Get ready to wait - responses can take weeks, even months, and re-inclusion itself, if it happens, is a lengthy process.
  • Re-inclusion can sometimes be faster if you can speak directly to an individual at the engine, for example at a conference or event.

Measuring and tracking Success

In search engine optimization, measurement is critical to success. Professional SEOs track data about rankings, referrals, links and more to help analyze their SEO strategy and create road maps for success.

1. Search Engine share of referring visits

It's critical to keep track of the contribution of each traffic source for a site:

  • Direct Navigation: Typed in traffic, bookmarks, email links without tracking codes, etc.
  • Referral Traffic: From links across the web or in trackable email, promotion & branding campaign links
  • Search Traffic: Queries that sent traffic from any major or minor web search engine

Knowing both the percentages and the exact numbers helps you identify weaknesses and serves as a comparison over time for trend data.
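
A minimal sketch of that split is shown below: it buckets visits into direct, search and referral traffic based on the referrer, then reports both the exact counts and the percentages. The referrer strings and the search engine list are illustrative stand-ins for whatever your analytics export contains.

    # A minimal traffic-source split; referrers and the engine list are illustrative.
    from collections import Counter

    SEARCH_ENGINES = ("google.", "bing.", "yahoo.", "duckduckgo.")

    def traffic_bucket(referrer: str) -> str:
        if not referrer:
            return "direct"
        if any(engine in referrer for engine in SEARCH_ENGINES):
            return "search"
        return "referral"

    visits = ["", "https://www.google.com/search?q=seo", "https://news.example.org/article", ""]
    counts = Counter(traffic_bucket(referrer) for referrer in visits)
    total = sum(counts.values())
    for bucket, count in counts.most_common():
        print(f"{bucket}: {count} visits ({count / total:.0%})")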

2. Visits referred by specific search engines

Measuring the contribution of search traffic from each engine is critical for several reasons:

  • Compare Performance vs. Market Share: By tracking not only search engines broadly, but by country, you'll be able to see exactly the contribution level of each engine in accordance with its estimated market share. Keep in mind that this depends on the type of business: technology and internet services are likely to skew more heavily towards Google than cooking, sports or real estate.
  • Get Visibility Into Potential Drops: If your search traffic should drop significantly at any point, knowing the relative and exact contributions from each engine will be essential to diagnosing the issue.
  • Uncover Strategic Value: It's very likely that some efforts you undertake in SEO will have greater positive results on some engines than others. If you can identify the tactics that are having success with one engine, you'll better know how to focus your efforts.

3. Visits referred by specific search engine terms and phrases

The keywords that send traffic are another important piece of analytics. The goal is to identify new trends in keyword demand, gauge performance on key terms and find terms that are bringing significant traffic for which you're potentially under-optimised. It's also possible to track search referral counts for terms outside the "top" terms/phrases - those that are important and valuable to the business.

4. Conversion rates by search query term/phrase

The conversion rate by search query is one of the most important pieces of information. With it, you can do two things:

  • Check rankings and work to improve the position of converting terms, which will undoubtedly lead to more conversions.
  • With tools such as http://moz.com/tools, find out what page these visitors landed on and focus your efforts on that page to improve the visitor experience.

5. Number of pages receiving at least one visit from search engines

Search engine traffic is an essential metric for monitoring overall SEO performance. With this information, we can get a glimpse into indexation - the number of pages from our site that the engines are keeping in their indices. The number of pages receiving search traffic is, quite possibly, the best long tail metric around.

Analytics software: the right tools for the job

Some recommendations:

  • Testing different versions of pages on a site and making conversion rate improvements based on the results.
  • A great way to get started running tests that can inform powerful conversion rate improvements is to use Google's Content Analytics.

Some software options:

  • Paid: Omniture, Fireclick, Mint, Sawmill Analytics, Clicktale, Coremetrics, Unica NetInsight
  • Free: Yahoo! Web Analytics (formerly Indextools), Google Analytics, Clicky Web Analytics, Piwik Open Source Analysis, Woopra Website Tracking, AWStats

Metrics for Measuring

  • Google Tools
    • Google Site Query: useful to see the number and list of pages indexed on a particular domain.
    • Google Trends: this tool shows keyword search volume/popularity data over time. If you're logged into your Google account, you can also get specific numbers on the charts, rather than just trend lines.
    • Google Trends for Websites: This tool shows traffic data for websites according to Google's data sources (toolbar, ISP data, analytics and others may be part of this).
    • Google Insights for Search: this tool provides data about regional usage, popularity and related queries for keywords.
    • Blog Search Link Query: Blog search link query shows generally high quality data and can be sorted by date range and relevance.
  • Bing Tools
    • Bing Site Query: Bing allows for queries to show the number and list of pages in their index from a given site. Unfortunately, Bing's counts are given to wild fluctuation and massive inaccuracy, often rendering the counts themselves useless.
    • Bing IP Query: this query will show pages that Microsoft's engine has found on the given IP address. This can be useful in identifying shared hosting and seeing what other sites are hosted on a given IP address.
    • Microsoft Ad Intelligence: A great variety of keyword research and audience intelligence tools are provided by Microsoft, primarily for search and display advertising.
  • Ask Site Query: Ask.com is a bit picky in its requirements around use of the site query operator.
  • Moz - Page Specific Metrics
    • Page Authority: Page Authority predicts the likelihood of a single page ranking well, regardless of its content.
    • mozRank - mozRank is very similar in purpose to the measures of static importance that are used by the search engines.
    • mozTrust - Like mozRank, mozTrust is distributed through links. Trustworthy “seeds” are identified to feed the calculation of the metric. Websites that earn links from the seed set are then able to cast (lesser) trust-votes through their links.
    • Number of Links: The total number of pages that contain at least one link to this page.
    • Number of Linking Root Domains: The total number of unique root domains that contain a link to this page.
    • External mozRank - Whereas mozRank measures the link juice (ranking power) of both internal and external links, external mozRank measures only the amount of mozRank flowing through external links (links located on a separate domain). Because external links can play an important role as independent endorsements, external mozRank is an important metric for predicting search engine rankings.
  • Moz - Domain Specific Metrics
    • Domain Authority: Domain Authority predicts how well a web page on a specific domain will rank. The higher the Domain Authority, the greater the potential for an individual page on that domain to rank well.
    • Domain mozRank: Domain-level mozRank (DmR) quantifies the popularity of a given domain compared to all other domains on the web.
    • Domain mozTrust: Domain-level mozTrust is like mozTrust but instead of being calculated between web pages, it is calculated between entire domains.
    • Number of Links: the quantity of pages that contain at least one link to the domain.
    • Number of Linking Root Domains: the quantity of different domains that contain at least one page with a link to any page on this site.

Applying that Data - To a Campaign

Below are some of the most common directional signals provided by tracking data points, and how to respond to them to improve or capitalise on opportunities.

  • Fluctuation in Search Engine Page and Link Count Numbers: The numbers reported in "site:" and "link:" queries are rarely precise, so we strongly recommend not getting too worried about fluctuations showing massive increases or decreases unless they are accompanied by traffic drops.
  • Falling Search Traffic from a Single Engine: If you have considerably less traffic from one engine, a small number of possibilities exist:
    • You're under a penalty at that engine for violating search quality or terms of service guidelines.
    • You've accidentally blocked access to that search engine's crawler. Double-check your robots.txt file and meta robots tags and review the Webmaster Tools for that engine to see if any issues exist.
    • That engine has changed their ranking algorithm in a fashion that no longer favors your site.
  • Individual Ranking Fluctuations: Gaining or losing rankings for a particular term/phrase, or even several, happens millions of times a day to millions of pages and is generally nothing to be concerned about.
  • Positive Increases in Link Metrics Without Rankings Increases: Many site owners assume that when they've done some "classic" SEO - on-page optimisation, link acquisition, etc. - they can expect instant results. Sadly, this rarely happens immediately: link metrics often improve well before rankings do.

Using SEO for Videos

1) KEYWORDS

Naming your video is one of the first steps to help you stand out amongst the other search results. Give the highest priority to keywords and clear wording. While you may think a clever or catchy title will work best, if it doesn't get you into the top search results it's simply not going to get you as much attention. So keep in mind the most likely search terms your target audience will type, and include them in your title in a logical sequence.

2) TRANSCRIBING CONTENT

Transcribing all of the text in a video helps search engines improve relevancy when delivering content to the audience. It's also a great opportunity to include all of the keywords from your video that you couldn't fit into the title or description. Accompanying your video with the transcribed text in a blog post not only helps with SEO, it also caters to people who prefer to read rather than watch, or who want to skim through while watching your video.

3) GOOGLE’s RICH SNIPPET

Thumbnails, transcripts, video length and other contextual information can add great appeal for a reader and help you stand out on the search page. The best way to do this is to use the VideoObject schema, available at schema.org. Add the markup to all of your embedded videos to ensure they come up with a strong presence on the search pages every time.
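
Here is a minimal sketch of what that markup can look like, generating a schema.org VideoObject block as JSON-LD; every field value is a placeholder for your own video's details.

    # A minimal VideoObject JSON-LD sketch; all field values are placeholders.
    import json

    video = {
        "@context": "https://schema.org",
        "@type": "VideoObject",
        "name": "Six steps of SEO explained",
        "description": "A short walkthrough of the six SEO steps.",
        "thumbnailUrl": "https://www.example.com/thumbnails/seo-video.jpg",
        "uploadDate": "2013-01-15",
        "duration": "PT3M20S",
        "contentUrl": "https://www.example.com/videos/seo-video.mp4",
    }

    snippet = f'<script type="application/ld+json">{json.dumps(video, indent=2)}</script>'
    print(snippet)  # paste this into the page where the video is embedded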

4) MULTI PLATFORM PROMOTION

One of the key things search engines pick up on is popularity, so the more channels you use to promote your video, the more likely it is to rank well. With Google and YouTube being the top two search engines for video, you'll need a strategy for ranking high on both. You can do this by embedding your video on your website with all the accompanying metadata, while also uploading the video to YouTube with a slightly different title. This allows you to market each platform separately, giving you a good presence on both Google and YouTube.



SEO resources