Website Content Duplication Issues

The lazy webmasters, those allergic to original thought, and those who detest writing the most are almost always vehement that unique content is over-rated, unnecessary or even pointless.

Of all the website content frustrations I endure, the most frequent is the failure of others to understand the inherent value of descriptive accuracy and uniqueness in page content. There is intransigence in that respect, particularly on the part of some creators of e-commerce and content management systems. That is indicative of ignorance, arrogance and stupidity in almost equal measure…

I personally believe it is important that every page can express its reason for existence clearly. That means using software that provides containers for accurate content and meta-tag descriptors in all the anticipated locations. Then you, as webmaster, designer or owner, must populate those accurately… If you cannot achieve that simple task, how on earth could anyone who wants what you have be expected to find your content?

Equally important, how can search engine robots and spiders, operating in a fully automated, rules-based fashion, be expected to determine what the site is about?
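By way of illustration, here is a minimal sketch of the kind of page-level containers I have in mind, for a hypothetical gift-store category page (the shop, product range and wording are all invented for the example):

    <!DOCTYPE html>
    <html lang="en">
    <head>
      <!-- Unique, page-specific title: says what THIS page is about -->
      <title>Hand-painted Ceramic Vases | Example Gift Store</title>
      <!-- Unique description, written for this page alone -->
      <meta name="description"
            content="Hand-painted ceramic vases in pastel glazes, made by UK artisans. Gift wrapping and next-day delivery available.">
      <!-- Keywords that actually match the on-page content -->
      <meta name="keywords" content="hand-painted ceramic vases, artisan pottery gifts">
    </head>
    <body>
      <!-- Accessible, descriptive on-page text, not words baked into an image -->
      <h1>Hand-painted Ceramic Vases</h1>
      <p>Each vase in this range is thrown and painted by hand, so no two are identical…</p>
    </body>
    </html>

The particular wording matters far less than the principle: every element describes this page, and no other.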

1: Ecommerce Gift Store

This site’s top-level pages were beautifully rendered in elegant verbiage, all of it embedded in delicate, pastel-coloured images… Yes, in its entirety, on all Category pages and the Home page! Not a single word of explanatory, descriptive text! Text embedded in images is such a fundamental error of judgement that I am amazed the client’s website designers did not vehemently urge against doing it that way… Incredible!
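To be clear about the alternative (a sketch only, with invented page names and wording): the decorative imagery can stay, but the descriptive words belong in real, machine-readable text alongside it:

    <!-- The decorative banner remains an image, with a meaningful alt attribute -->
    <img src="/images/pastel-banner.jpg"
         alt="Pastel watercolour banner for the wedding gifts category">

    <!-- The actual description lives in text that robots and screen readers can read -->
    <h1>Wedding Gifts</h1>
    <p>Personalised keepsakes, engraved glassware and hand-made albums for the happy couple.</p>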

And of course, many pages shared global meta-data… At least the Category pages were possessed of editable titles and meta-tags… The “informational” pages, on the other hand, were bereft of any meta-tag editing facility, as the designer was not of the opinion that this was relevant in the age of Web 2.0! He was eventually disabused of that notion and, after weeks of prompting, finally deigned to add this most basic but fundamental facility.

2: Prominent City Legal Practice

This site languished below the Google radar, despite its website designers having an “SEO expert” on board – a Microsoft- and Google-certified one, allegedly. I built links to expand the keywords associated with the site, and hand-edited 30 of the main pages, out of 100+ in total. That helped a lot at Yahoo and MSN, but it did not get the site out of the doldrums at Google.

The designers were adamant that the site must have been black-listed in some way, and wanted me to identify the problem and tell them how to resolve it. I explained at the outset that duplicated content was an issue, but the designers were emphatic that it must be something far more sinister: a legacy of the previous incumbents, who had transgressed in some indiscernible, arcane, black-hat-clad manner…

So I instructed the office manager that she would have to override the objections of the designers and compel them to ensure that every single page had accurate Titles, Descriptions and Keywords. Basically, they had too high a percentage of “cookie-cutter” pages that all shared global meta-tags, and in most cases those meta-tags contradicted the on-page content. As soon as that was sorted out, the hand-brake came off at Google HQ and the site popped up into page-one SERPs for almost all relevant search phrases…
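For the avoidance of doubt, the “before” and “after” looked something like this (the firm name and wording here are invented, purely to illustrate the pattern):

    <!-- Before: the same global head block stamped onto every page -->
    <title>Smith &amp; Jones Solicitors</title>
    <meta name="description"
          content="Smith &amp; Jones Solicitors – legal services for business and private clients.">

    <!-- After: each page describes its own subject and matches its on-page content -->
    <title>Employment Law Advice for Employers | Smith &amp; Jones Solicitors</title>
    <meta name="description"
          content="Practical employment law advice for employers: contracts, disciplinary procedures and tribunal representation.">
    <meta name="keywords" content="employment law advice, employer tribunal representation">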

Copied Content Conclusion…

Duplicate page content in all its forms is (and always has been) a sin as far as search engines are concerned. Every page ought to be accurately described using the meta-data elements provided expressly for that purpose. Each page must contain accessible, unique content in both its visible text and its supporting meta-data elements…

Surely this should not be such a hard concept to grasp? If you cannot accurately describe what your site is all about, in your own words, and place accurate information into all the areas where Google et al look for clues as to content and purpose, how can you reasonably expect to prosper online?

  • If your website software does not allow you to thoroughly and accurately describe your product and services, you should be concerned, fearful even…
  • If your website designer does not think that search engines are at all relevant in the 21st century, you should be very, very afraid…

Of course, you should also bear in mind that nothing in the virtual world is set in concrete… Never, ever be afraid to start again… There are good designers out there, ethical men and women with great website software, who approach their task with intelligence, diligence and an open-minded awareness of the possibilities. You always have the freedom to make an informed choice… so don’t settle for second best!