Do you use underscores or hyphens in your URLs?
Have you ever wondered if one method is better than the other?
Have you ever considered whether Google interprets an underscore differently from a hyphen?
Well, the answer is yes, it does. According to Google's Matt Cutts, hyphens are recognised as word separators, whereas underscores are not.
This may not be important if your URL does not contain any major keywords. However, if you are targeting a keyword within your URL, you will need to ensure that the relevant keywords are separated by hyphens rather than underscores.
For example, www.professional_carpet_cleaning.co.uk would be interpreted as www.professionalcarpetcleaning.co.uk in the eyes of Google. This is a missed opportunity, as there are two primary keywords (“Professional Carpet Cleaning” and “Carpet Cleaning”) that would not be recognised by Google.
However, if this URL had been structured as www.professional-carpet-cleaning.co.uk, then Google would recognise the individual keywords, which would help your Web Page Optimisation efforts.
So, there you have it: for maximum benefit, always use hyphens rather than underscores in your URLs.
All of the major search engines use a wide range of highly developed algorithms to check the integrity of your website. Attempting to manipulate the search engine results, by using dubious techniques on your web pages, will almost certainly result in your site being banned!
The top 7 techniques to avoid at all costs are as follows:
- Do not purchase links from paid-for link sites or link farms.
- Do not place hidden links on your website.
- Do not set up links from known spam sites.
- Do not stuff your meta tags or web page content with massive amounts of misleading or irrelevant keywords.
- Do not attempt to place hidden keywords within your web page content, for example white text on a white background.
- Do not copy content from another website or plagiarise existing content and claim it as your own.
- Do not create one set of web page content for your visitors and a separate set of content dedicated solely to search engines (known as cloaking).
If you use any of these methods, you may experience improved search engine rankings in the short term. However, you will eventually get found out and your site will suddenly disappear from the search engine results, without any warning.
In the long term it is far more sensible to compete on a level playing field and only use Professional SEO Techniques that are within the published editorial guidelines.
To learn more about search engine optimisation or for further information on our Link Building Service, please visit our website.
More than 60% of the websites I review contain poorly written Title Tags. Many simply contain the company name or trading name of the website.
From a search engine perspective, your Title Tag is the most important Meta Tag and should contain the top two or three keyword phrases that you are targeting. Title Tags such as “Welcome to PetsRUs” or “The Acme Insurance Website” will not help your search engine rankings and will not produce high volumes of traffic to your site.
It is easy to achieve a high ranking for your trading name.
In most cases, search engines will list your website within the first few pages, when searched by your company name, especially if your domain name matches your trading name. That’s the easy bit. The more difficult challenge is to be found by users who are searching for the products or services you provide, using a more generic keyword search term, relevant to your business.
Including your company name within your Title Tag is a wasted opportunity.
Think about it: if a customer knows your company or brand name, it is relatively easy for them to find you, not just via the internet but by more traditional methods such as Yellow Pages, offline advertising or even Directory Enquiries! So why waste valuable space promoting your company name within your Meta Tag?
As a guideline, your Title Tag should contain a maximum of ten individual words. Do not waste this space: choose your keywords carefully and make sure that they are relevant to the page content.
Make sure your Title Tag includes your targeted keywords.
I recommend that you review all of your Title Tags and if necessary change them to include at least a couple of the top keywords you are targeting. You also need to ensure that each page of your website contains a unique Title Tag. Although simply creating a well written, search engine friendly Meta Tag is not sufficient to propel you to the first page of Google, it will make a difference and you should see some improvement in your search engine rankings over time.
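To illustrate, here is a before-and-after sketch of a Title Tag, using a hypothetical carpet cleaning business (the business name and keyword phrases are invented for illustration):

```html
<!-- Weak: the tag is wasted on the trading name alone. -->
<title>Welcome to Smiths Carpet Care</title>

<!-- Better: leads with the keyword phrases being targeted,
     stays within the ten-word guideline, and keeps the
     brand name at the end. -->
<title>Professional Carpet Cleaning and Stain Removal | Smiths Carpet Care</title>
```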
More and more people are beginning to understand that search engines are unable to recognise graphics, images and flash files contained on a web page. Text is the only element of a web page that Google and other search engines can reliably index. Therefore ‘text is king’ and plays a huge part in the relevance and ranking of search results.
However, if you are struggling to create web pages that contain unique content, do not be tempted to copy sections from other parts of your site, as search engines will recognise this as duplicate content. Even worse, do not be tempted to produce an exact copy of your website, hosted on a different domain, in the hope that this will increase your chances of achieving a high ranking.
Why does Google care about Duplicate Content?
Google and other search engines include a Duplicate Content clause in their editorial guidelines. This clause states that they will not list a web page or website that contains substantially the same content as another site. This is perfectly reasonable from Google's perspective, as its users would not be impressed with search results that simply produced a list of duplicated sites. Duplicate content occurs when two websites display identical or very similar content on any number of web pages. Duplicated sites are particularly prevalent in the MLM industry, where typically a company will duplicate a website thousands of times for use by their individual distributors.
Does Google penalise sites with Duplicate Content?
Unless Google believes that the duplicate content has been created as a deliberate attempt to manipulate search engine rankings, it is unlikely that your site will be banned from the search results. However Google will only list one version of a web page or website. The version that is listed will normally be the version with the most authority in terms of link popularity. This tends to be the earliest version that was published.
The Google Webmaster Blog provides information on a range of methods webmasters can use to proactively address duplicate content issues. However, the best advice would be to keep your website free of duplicate content as far as possible.
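For example, one method covered in Google's webmaster documentation is the canonical link element, which tells Google which version of a page you consider to be the preferred one. A minimal sketch, using a hypothetical URL:

```html
<!-- Placed in the <head> of a duplicate or near-duplicate page,
     this points Google at the preferred version of the content.
     The URL below is hypothetical. -->
<link rel="canonical" href="http://www.example.co.uk/carpet-cleaning/" />
```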
To learn more about Web Page Optimisation visit our main website or view our range of Search Engine Optimisation Articles on our Blog.
It is important to check that your website is compatible with all major web browsers. My preferred browser is Firefox. I review hundreds of different websites each week, and it is not unusual to view web pages that appear to have missing text or images, overlapping text or specific functionality that simply does not work.
Inevitably, the reason for this is that the site is not compatible with Firefox and the webmaster has not bothered to check whether their website works in the many different browsers that exist. Some web design companies are also guilty of not checking that a new site will function properly in all of the major browsers.
Because Internet Explorer is deemed to be the major player in terms of market share, there is a tendency to forget the many internet users on alternative browser platforms. If your website does not perform properly when viewed through all of the major browsers, then you are potentially losing out on a significant slice of business.
It is therefore vitally important to ensure that your site will function properly and look its best, no matter what browser your visitors are using.
As of December 2008 (source: NetApplications.com), Internet Explorer held roughly 70% of the browser market, with Firefox, Safari, Chrome and Opera sharing the remainder.
As you can see, if your website only functions properly with Internet Explorer, then potentially a whopping 30% of visitors are unable to view your site correctly.
You will need to download the latest version of each browser and install it on your computer, in order to check that your site can be viewed correctly and is functioning properly. If you identify a problem with your site, you should report it to your web designer/developer, who should be able to fix it. Most compatibility issues are easily resolved once they have been identified.
If you require further information on any Web Site Design issue, please Contact Us.
Ensuring that each page of your website contains unique, well-crafted and search engine friendly Meta Tags is a vitally important part of the Web Page Optimisation process.
Obviously, in my job, I review many websites each day, and fewer than 20% contain correctly structured Meta Tags. Missing Title, Description and Keywords Tags are quite common, and it is not unusual to see Keywords Tags stuffed with hundreds of individual keywords.
Things you should not do in your Title Meta Tags
- Do not use generic web page titles such as “Untitled” or “New Page”.
- Do not use web page titles that have no relation to the content on the page.
- Do not repeat a keyword search term.
- Do not use very long titles that do not help your users.
- Do not stuff unnecessary keywords in your title tag.
- Do not use the same title for more than one web page.
- Do not use more than 10 to 12 individual words.
Things you should not do in your Description Meta Tags
- Do not use descriptions that have no relation to the web page content.
- Do not use generic descriptions such as “My web page” or “About Goldfish”.
- Do not use descriptions that only contain a list of keywords.
- Do not repeat the same keyword.
- Do not copy the entire content of your web page to use as your description.
- Do not use all capital letters.
- Do not use the same description on more than one web page.
- Do not use more than 30 to 40 individual words.
Things you should not do in your Keywords Meta Tags
- Do not repeat exactly the same keyword.
- Do not use capital letters.
- Do not use spaces between keywords, separate only with commas.
- Do not use more than 15 to 20 keywords.
- Do not use the same list of keywords on every page.
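Pulled together, a set of Meta Tags that follows the guidelines above might look like the sketch below. The business name, keywords and description are all hypothetical:

```html
<head>
  <!-- Title: unique to this page, keyword-led, around ten words. -->
  <title>Tropical Fish Tanks and Aquarium Supplies | PetsRUs</title>

  <!-- Description: a readable sentence, not a bare keyword list. -->
  <meta name="description"
        content="Buy tropical fish tanks, aquarium supplies and accessories online with free UK delivery." />

  <!-- Keywords: lower case, separated only by commas, no repeats. -->
  <meta name="keywords"
        content="tropical fish tanks,aquarium supplies,fish tank accessories" />
</head>
```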
Once Google has found your website, it will return on a regular basis to search for new or updated content. The frequency of visits will depend on various factors, such as the size of your website (measured in number of pages), your Google Page Rank score (Google will visit higher ranked sites more frequently) and how often your site is updated with new content.
When the Google robot visits your site what does it see?
The Google robot does not see web pages as you see them in your browser. Google cannot see colours, graphics or images, including Flash generated images and videos. The Google robot can only see text, which it will extract from your web page HTML code. It is important, therefore, that the HTML code of your web pages contains everything in the right place, so that the Google robot can find all the relevant information to save in its database.
A single web page has many elements that can be understood by the Google robot. The most important elements where Google will look for information include your Title Tag, Description Tag (which will be used as the basis for the description shown in the search results), Keywords Tag, Headline Tags, Image Tags, Link Text and, of course, the main body text. These elements must contain the keyword(s) you wish to target, at an appropriate keyword density, in order to achieve high rankings on Google.
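As a sketch, those elements sit within the HTML of a page roughly as follows. The business name, file names and keywords are hypothetical:

```html
<head>
  <!-- Title Tag: the most important Meta Tag. -->
  <title>Professional Carpet Cleaning | Smiths Carpet Care</title>
  <!-- Description Tag: basis for the search result snippet. -->
  <meta name="description"
        content="Professional carpet cleaning for homes and offices across the UK." />
</head>
<body>
  <!-- Headline Tag -->
  <h1>Professional Carpet Cleaning</h1>
  <!-- Image Tag: the alt text is the part Google can read. -->
  <img src="van.jpg" alt="Carpet cleaning van" />
  <!-- Main body text -->
  <p>Our professional carpet cleaning service covers homes and offices.</p>
  <!-- Link Text -->
  <a href="prices.html">Carpet cleaning prices</a>
</body>
```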
It is important to ensure that Google is able to navigate through all of your internal web pages.
The structure of your individual web pages must be Google friendly, and Google must be able to navigate through the entire structure of your website, moving from page to page. The internal link structure of your website is vitally important, and your site should contain easy-to-follow text links to every page that you want Google to see.
If your website is poorly designed or if it does not link to all of your internal pages, then the Google robot will skip these pages. If your website has been designed in Flash or if you place most of your web page content in images, then the Google robot will be unable to see your content.
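A plain text navigation, which the Google robot can follow from page to page, can be as simple as the sketch below (the page names are hypothetical):

```html
<!-- Simple text links: crawlable by the Google robot, unlike
     navigation built entirely from Flash or images. -->
<ul>
  <li><a href="index.html">Home</a></li>
  <li><a href="services.html">Our Services</a></li>
  <li><a href="prices.html">Prices</a></li>
  <li><a href="contact.html">Contact Us</a></li>
</ul>
```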
It is important to make it as easy as possible for the Google robot to index your web pages in order to achieve the best possible rankings.
For further information please visit our SEO Packages page.