GOOGLE SEO Part 3
Keyword Research is ESSENTIAL
The first step in any SEO campaign is to do some keyword research.
There are many tools on the web to help with basic keyword research (including the free Google keyword research tool), and there are more useful third-party tools to help you dig deeper.
You can use these keyword research tools to quickly identify opportunities to get more traffic to a page (a small triage sketch follows the table):
| Example Keyword | Search Volume |
|---|---|
| web search tutorial for beginners | 1900 |
| ecommerce search optimization tutorials | 1600 |
| how to seo a website | 880 |
| seo tutorial step by step | 720 |
| how to seo your website | 720 |
| google seo tutorial | 320 |
| best seo tutorial for novices | 260 |
| free optimization tutorials | 210 |
| on page seo tutorial | 170 |
| seo tutorials for beginners | 170 |
| all in one optimization tutorial | 170 |
| website optimize tutorial | 140 |
| how to seo website | 140 |
| seo basics tutorial | 110 |
| how to seo my website | 110 |
| ethical seo tutorial download | 91 |
| joomla optimizer tutorial | 91 |
| online seo tutorial | 91 |
| diy seo tutorial | 91 |
| define optimization tutorial free | 73 |
| optimize tutorial | 73 |
| best seo tutorial | 58 |
| basic seo tutorial | 58 |
| bing optimization tutorial | 58 |
| step by step seo tutorial | 46 |
| beginners seo tutorial course | 46 |
| seo tutorial google | 46 |
| optimisation definition | 36 |
| search engine optimization tutorial | 36 |
| optimize website for free tutorial | 28 |
| website seo tutorial | 28 |
| easy optimisation tutorial for beginners lesson 1 | 28 |
| website seo tutorial youtube | 22 |
| google seo tutorials | 22 |
| black hat optimisation tutorial | 22 |
| seo tips and tricks tutorial | 16 |
| local optimisation tutorial | 16 |
| learn search tutorial blog | 16 |
| expert seo tutorial 2014 | 12 |
| seo tutorial online | 12 |
| seo training tutorial printable | 12 |
| natural optimization tutorial beginners | 12 |
| online website optimisation tutorials | |
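One simple way to triage a list like this is to filter for a theme you can actually compete on and sort by search volume. Below is a minimal Python sketch of that idea, using a few rows from the table above; the data and the "tutorial" filter are illustrative placeholders for whatever your own keyword tool exports.

```python
# Triage keyword research data: keep phrases matching a target theme,
# then sort by search volume (rows copied from the table above).

keywords = [
    ("how to seo a website", 880),
    ("seo tutorial step by step", 720),
    ("google seo tutorial", 320),
    ("on page seo tutorial", 170),
    ("seo basics tutorial", 110),
]

theme = "tutorial"  # placeholder theme filter

opportunities = sorted(
    (pair for pair in keywords if theme in pair[0]),
    key=lambda pair: pair[1],
    reverse=True,
)

for phrase, volume in opportunities:
    print(f"{volume:>4}  {phrase}")
```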
Google Analytics Keyword ‘Not Provided’
Google Analytics was the very best place to look at keyword opportunity for some (especially older) sites, but that all changed a few years back.
Google stopped telling us which keywords send traffic to our sites from the search engine back in October 2011, citing privacy concerns for its users.
Google will now begin encrypting searches that people do by default, if they are logged into Google.com already through a secure connection. The change to SSL search also means that sites people visit after clicking on results at Google will no longer receive “referrer” data that reveals what those people searched for, except in the case of ads. Google Analytics now displays keyword “not provided” instead.

“In Google’s new system, referrer data will be blocked. This means site owners will begin to lose valuable data that they depend on, to understand how their sites are found through Google. They’ll still be able to tell that someone came from a Google search. They won’t, however, know what that search was.” – Search Engine Land

You can still get some of this data if you sign up for Google Webmaster Tools (and you can combine it with Google Analytics), but even there the data is limited and not always entirely accurate. The keyword data can be useful, though – and access to backlink data is essential these days.
This is another example of Google making ranking in organic listings HARDER – a change for ‘users’ that seems to have the most impact on ‘marketers’ outside of Google’s ecosystem – yes – search engine optimisers.
Now, consultants need to be page-centric (abstract, I know), instead of just keyword-centric when optimising a web page for Google. There are now plenty of third-party tools that help when researching keywords, but most of us miss the kind of keyword intelligence we used to have access to.
Proper keyword research is important because getting a site to the top of Google eventually comes down to your text content on a page and the keywords in external & internal links. Altogether, Google uses these signals to determine where you rank, if you rank at all.

There’s no magic bullet to this.
At any one time, your site is probably feeling the influence of some sort of algorithmic filter (for example, Google Panda or Google Penguin) designed to keep spam sites under control and deliver relevant results to human visitors.
One filter may be kicking in, keeping a page down in the SERPs, while another filter is pushing another page up. You might have poor content but excellent incoming links, or vice versa.
Try and identify the reasons Google doesn’t ‘rate’ a particular page higher than the competition – the answer is usually on the page or in backlinks pointing to the page.
Too few quality incoming links? Too many incoming links? No keyword-rich text? Too much keyword-rich text? Linking out to irrelevant sites? Too many ads above the fold? Affiliate links on every page of your site, found on a thousand other websites?
Whatever they are, identify the issues and fix them – a rough audit sketch follows below.
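If you want to sanity-check a few of those questions programmatically, a rough single-page audit might look like this. It is only a sketch: it assumes the third-party requests and beautifulsoup4 libraries, the URL is a placeholder, and the thresholds are arbitrary guesses rather than figures from Google.

```python
# Rough single-page audit for a few of the questions above: missing title,
# thin text content, excessive outbound links, odd heading structure.
# Requires: pip install requests beautifulsoup4

from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup

def audit_page(url: str) -> list[str]:
    issues = []
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    # Title element present and non-empty?
    if not (soup.title and soup.title.string and soup.title.string.strip()):
        issues.append("missing or empty title element")

    # One h1 per page is a common convention (an assumption, not a Google rule).
    h1_count = len(soup.find_all("h1"))
    if h1_count != 1:
        issues.append(f"expected one h1, found {h1_count}")

    # Thin-content check; 300 words is an arbitrary threshold.
    words = len(soup.get_text(separator=" ", strip=True).split())
    if words < 300:
        issues.append(f"text looks thin ({words} words)")

    # Outbound links to other domains; 100 is an arbitrary threshold.
    site = urlparse(url).netloc
    outbound = [a["href"] for a in soup.find_all("a", href=True)
                if urlparse(a["href"]).netloc not in ("", site)]
    if len(outbound) > 100:
        issues.append(f"high outbound link count ({len(outbound)})")

    return issues

if __name__ == "__main__":
    for issue in audit_page("https://example.com/"):  # placeholder URL
        print(issue)
```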
Get on the wrong side of Google and your site might well be flagged for MANUAL review – so optimise your site as if, one day, you will get that website review from a Google Web Spam reviewer.
The key to a successful campaign, I think, is persuading Google that your page is most relevant to any given search query. You do this with good, unique, keyword-rich text content and by getting “quality” links to that page. The latter is far easier said than done these days!
Next time you’re developing a page, consider that what looks spammy to you is probably spammy to Google. Ask yourself which pages on your site are really necessary. Which links are necessary? Which pages are getting the “juice” or “heat”? Which pages would you ignore?
You can help a site along in any number of ways (including making sure your page titles and meta tags are unique) but be careful. Obvious evidence of ‘rank modifying’ is dangerous.
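On the unique page titles and meta tags point, a duplicate-finder is easy to sketch. Again this assumes requests and beautifulsoup4, and the URL list is a placeholder for your own pages.

```python
# Group pages that share a title or meta description so duplicates stand out.
# Requires: pip install requests beautifulsoup4

from collections import defaultdict

import requests
from bs4 import BeautifulSoup

urls = [
    "https://example.com/",         # placeholder URLs - use your own pages
    "https://example.com/about",
    "https://example.com/contact",
]

titles = defaultdict(list)
descriptions = defaultdict(list)

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.string.strip() if soup.title and soup.title.string else "(none)"
    titles[title].append(url)
    meta = soup.find("meta", attrs={"name": "description"})
    descriptions[meta.get("content", "(none)") if meta else "(none)"].append(url)

for label, index in (("title", titles), ("meta description", descriptions)):
    for value, pages in index.items():
        if len(pages) > 1:
            print(f"duplicate {label} {value!r} on: {', '.join(pages)}")
```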
I prefer simple SEO techniques, and ones that can be measured in some way. I have never just wanted to rank for competitive terms; I have always wanted to understand the reasons why I ranked for these key phrases. I try to create a good user experience for humans AND search engines. If you make high-quality text content relevant and suitable for both these audiences, you’ll more than likely find success in organic listings and you might not ever need to get into the technical side of things, like redirects and URL rewriting.
To beat the competition in an industry where it’s difficult to attract quality links, you have to get more “technical” sometimes – and in some industries – you’ve traditionally needed to be 100% black hat to even get in the top 100 results of competitive, transactional searches.
There are no hard and fast rules to long-term ranking success, other than developing quality websites with quality content and quality links pointing to them. The less domain authority you have, the more text you’re going to need. The aim is to build a satisfying website and build real authority!
You need to mix it up and learn from experience. Make mistakes and learn from them by observation. I’ve found getting penalised is a very good way to learn what not to do.
Remember there are exceptions to nearly every rule, and in an ever-fluctuating landscape you probably have little chance of determining exactly why you rank in search engines these days. I’ve been doing it for over 10 years, and every day I’m trying to better understand Google, to learn more and learn from others’ experiences.
It’s important not to obsess about granular ranking specifics that have little return on your investment, unless you really have the time to do so! THERE IS USUALLY SOMETHING MORE VALUABLE TO SPEND THAT TIME ON. That’s usually either good backlinks or great content.
There ARE some things that are evident, with a bit of experience on your side:
Keep it simple. Don’t Build Your Site With Flash or HTML Frames
Well… not entirely in Flash, and especially not if you know very little about the ever-improving accessibility of Flash.
Flash is a proprietary plug-in created by Macromedia to deliver (admittedly) fantastically rich media on your websites. The W3C advises you to avoid the use of such proprietary technology to construct an entire site. Instead, build your site with CSS and HTML, ensuring everyone, including search engine robots, can sample your website content. Then, if required, you can embed media files such as Flash in the HTML of your website.
Flash, in the hands of an inexperienced designer, can cause all types of problems at the moment, especially with:
- Accessibility
- Search Engines
- Users not having the Plug-In
- Large Download Times
HTML5 is the preferred option over Flash these days for most designers. A site built entirely in Flash can cause an unsatisfactory user experience and can affect your rankings, especially in mobile search results – a quick check for leftover Flash embeds is sketched below. For similar accessibility and user satisfaction reasons, I would also say don’t build a site with HTML frames.
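If you have inherited a site and want to find where Flash is still lurking, one crude check is to flag .swf files referenced by object or embed tags – the text inside those movies is exactly what search engine robots can’t read. A sketch, with a placeholder URL, again assuming requests and beautifulsoup4:

```python
# Flag markup that embeds Flash (.swf) via <object>/<embed> elements.
# Requires: pip install requests beautifulsoup4

import requests
from bs4 import BeautifulSoup

def find_flash_embeds(url: str) -> list[str]:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    hits = []
    for tag in soup.find_all(["object", "embed"]):
        src = (tag.get("data") or tag.get("src") or "").lower()
        mime = (tag.get("type") or "").lower()
        if src.endswith(".swf") or "shockwave-flash" in mime:
            hits.append(src or str(tag)[:80])
    return hits

if __name__ == "__main__":
    print(find_flash_embeds("https://example.com/"))  # placeholder URL
```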
Keep It Simple, Stupid
As in any form of design, don’t try and re-invent the wheel when simple solutions will suffice. The KISS philosophy has been around since the dawn of design.
KISS does not mean boring web pages. You can create stunning sites with smashing graphics – but you should build these sites using simple techniques – HTML & CSS, for instance. If you’re new to web design, avoid things like Flash and JavaScript, especially for elements like scrolling news tickers, etc. These elements work fine on TV – but generally only cause problems for website visitors.
Keep layouts and navigation arrays consistent and simple too. Don’t spend time, effort and money (especially if you work in a professional environment) designing fancy navigation menus if, for example, your new website is an information site.
The same goes for website optimisation – keep your documents well structured, keep your page title elements and text content relevant, use heading tags sensibly and try and avoid leaving too much of a footprint – whatever you are up to.