Website Content Duplication Issues

Website content duplication is something Google dislikes. It’s a waste of valuable space… Several content management systems, including WordPress, are able to spew forth vast amounts of duplicated content. Date archives, author archives, image attachment pages and bad use of Tags and Categories can combine to produce 1,000 pages of completely useless crud. Lazy webmasters, those allergic to original thought, and those who detest writing are almost always the most vehement that unique content is over-rated, unnecessary or even pointless. Couple that with bad content management, poor organisation and no plan, and a website becomes worthless. That’s where an SEO expert excels, shining a light on the dodgy stuff going on beneath the surface…

Descriptive accuracy and uniqueness

Of all the website content duplication frustrations I endure, the most frequent is the failure on the part of others to understand the inherent value of descriptive accuracy and uniqueness of page content. There is intransigence in that respect, particularly on the part of some creators of e-commerce and content management systems. That is indicative of ignorance, arrogance and stupidity, in almost equal measure…

I personally believe that it’s important that every page is able to express its reason for existence clearly. That means using software that provides containers for accurate content and meta-tag descriptors in all anticipated locations. Then you, as webmaster/designer/owner, must populate those accurately… If you cannot achieve that simple task, how on earth could anyone who wants what you have be expected to find your content? Good SEO optimisation services guide you through these issues.

Equally important, how can search engine robots/spiders, operating in a fully automated, rules-based way, be expected to determine what the site is about?
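If you want to see roughly what those robots see, a minimal sketch along the following lines will pull the title and meta description out of a handful of pages so you can read them side by side and judge whether each page really does express its reason for existence. The URLs are placeholders and it assumes the third-party requests and beautifulsoup4 packages; treat it as an illustration, not a recommendation of any particular toolset.

    # Minimal sketch: print each page's title and meta description side by side.
    # The URLs below are placeholders, not a real site.
    import requests
    from bs4 import BeautifulSoup

    PAGES = [
        "https://example.com/",
        "https://example.com/category/gifts/",
        "https://example.com/about/",
    ]

    for url in PAGES:
        html = requests.get(url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        title = soup.title.string.strip() if soup.title and soup.title.string else "(no title)"
        meta = soup.find("meta", attrs={"name": "description"})
        description = meta["content"].strip() if meta and meta.get("content") else "(no meta description)"
        print(url)
        print("  title:      ", title)
        print("  description:", description)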

Examples of duplication

1: E-commerce Gift Store

This site’s top-level pages were beautifully rendered in elegant verbiage, all of it embedded in delicate, pastel-coloured images… Yes, in its entirety, on all Category pages and the Home page! Not a single word of explanatory, descriptive text that a search engine could actually read! Embedding text in images is such a fundamental error of judgment that I am amazed the client’s website designers did not vehemently urge against it… Incredible!

And of course, many pages shared global meta-data… At least the Category pages were possessed of editable titles and meta-tags… The “informational” pages, on the other hand, were bereft of any meta-tag editing facility, as the designer was not of the opinion that this was relevant in the age of Web 2.0! He was eventually disabused of that notion and, after weeks of prompting, finally deigned to add this most basic but fundamental facility.

This site languished below the Google radar, despite its website designers having an “SEO expert” on board – a Microsoft and Google certified one, allegedly. I built links to expand the keywords associated with the site and hand-edited 30 of the main pages, out of 100+ pages. That helped a lot at Yahoo and MSN, but it did not get the site out of the doldrums at Google.

The designers were adamant that the site must have been black-listed in some way, and wanted me to identify the problem and tell them how to resolve it. I explained at the outset that duplicated content was an issue but the designers were emphatic that it must be something far more sinister; a legacy of the previous incumbents who had transgressed in some indiscernible, arcane, black-hat-clad manner…

So, I instructed the office manager that she would have to override the objections of the designers and compel them to ensure that every single page had accurate Titles, Descriptions and Keywords. Basically, they had too high a percentage of “cookie-cutter” pages that all shared global meta-tags. In most cases, the meta-tags contradicted the on-page content. As soon as that was sorted out, the handbrake came off at Google HQ, and the site popped up on page 1 of the SERPs for almost all relevant search phrases…
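For anyone who wants to run the same “cookie-cutter” check on their own site, here is a minimal sketch of the idea: group pages by their title and description pair, and flag any pair shared by more than one URL. The page_meta data below is entirely made up for illustration; in practice you would gather it by crawling the site, for example with the earlier sketch.

    # Minimal sketch: flag "cookie-cutter" pages that share identical meta-data.
    from collections import defaultdict

    # Illustrative data only; fill this by crawling your own site.
    page_meta = {
        "https://example.com/red-vase/":  ("Gift Store", "Lovely gifts for every occasion"),
        "https://example.com/blue-bowl/": ("Gift Store", "Lovely gifts for every occasion"),
        "https://example.com/about/":     ("About the Gift Store", "Who we are and how we ship"),
    }

    shared = defaultdict(list)
    for url, meta in page_meta.items():
        shared[meta].append(url)

    for (title, description), urls in shared.items():
        if len(urls) > 1:
            print(f"{len(urls)} pages share the title '{title}' and description '{description}':")
            for url in urls:
                print("  " + url)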

2: Website content duplication x 1,500

Not so long ago, I worked on a site that had 160 pages/posts of reasonable content. Unfortunately, the owner had been wildly extravagant with Category creation and had then recklessly generated over 1,200 different Tags. Four or five ridiculous Tags per page/post! Each Category and Tag archive displayed the FULL post/page, so nearly 1,500 useless pages overwhelmed the 160 core pages.
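A rough way to measure how badly archive pages outnumber the real content is to read the sitemap and count the URLs that look like Tag, Category or other archive pages. The sketch below does just that; the sitemap URL and the archive URL patterns are assumptions about a typical WordPress-style permalink structure, so adjust them to suit your own site.

    # Rough sketch: count how many sitemap URLs look like auto-generated archives.
    import urllib.request
    import xml.etree.ElementTree as ET

    SITEMAP = "https://example.com/sitemap.xml"   # placeholder URL
    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    # Assumed permalink patterns; adjust to match your own site's structure.
    ARCHIVE_HINTS = ("/tag/", "/category/", "/author/", "/page/")

    with urllib.request.urlopen(SITEMAP, timeout=10) as response:
        tree = ET.parse(response)

    urls = [loc.text for loc in tree.findall(".//sm:loc", NS)]
    archives = [u for u in urls if any(hint in u for hint in ARCHIVE_HINTS)]

    print(f"{len(urls)} URLs in the sitemap, of which {len(archives)} look like archive pages")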

How to find Duplicate Content?

Well, that’s actually easy-peasy… There’s a website for that: https://www.siteliner.com/

Siteliner will analyse every page on your site, identify every crumb of content duplication AND show you comparisons between pages so you can reduce the problem to acceptable levels. It will show you how your site compares to other sites… It will also show your “common content” and “unique content” percentages.

[Screenshot: Siteliner summary showing common, duplicate and unique content percentages]
[Screenshot: Siteliner duplicate content comparisons between pages]

Your goal is to be better than your competitors. Above-average scores in your niche can only help you.
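Siteliner’s own algorithm is not public, so the sketch below is not it; it is simply a rough illustration of the same idea, boiling two pages down to their visible text and reporting how similar they are. Again, the URLs are placeholders and the requests and beautifulsoup4 packages are assumed.

    # Rough sketch (not Siteliner's method): compare the visible text of two pages.
    from difflib import SequenceMatcher

    import requests
    from bs4 import BeautifulSoup

    def visible_text(url):
        """Fetch a page and return its visible text, with scripts and styles removed."""
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        for tag in soup(["script", "style"]):
            tag.decompose()
        return " ".join(soup.get_text(separator=" ").split())

    page_a = visible_text("https://example.com/category/gifts/")     # placeholder URLs
    page_b = visible_text("https://example.com/category/homeware/")
    similarity = SequenceMatcher(None, page_a, page_b).ratio()
    print(f"The two pages are roughly {similarity:.0%} similar")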

Website Content Duplication Conclusions…

Website content duplication in all forms is (and always has been) a sin as far as search engines are concerned. Every page ought to be accurately described using the meta-data elements provided expressly for that purpose. Each page must contain accessible and unique content in both on-page and off-page elements…

This surely should not be such a hard concept to grasp. If you can’t accurately describe what your site is all about, in your own words, and place accurate information into all the areas where Google et al. look for clues as to content and purpose, how can you reasonably expect to prosper online? You need to consider optimising for helpful content rewards.

  • If your website software does not allow you to thoroughly and accurately describe your product and services, you should be concerned, fearful even…
  • If your website designer does not think that search engines are at all relevant in the 21st century, you should be very, very afraid…

Of course, you should also bear in mind that nothing in the virtual world is set in concrete… Never, ever be afraid to start again… There are good designers out there, ethical men and women with great website software. They approach their task with intelligence, diligence, and an open-minded awareness of the possibilities. You always have the freedom to make an informed choice… so don’t settle for being 2nd best!

References:

Find Duplication Issues: https://www.siteliner.com/

Page last updated on Thursday, October 12, 2023 by the author Ben Kemp