Duplicate Page Content and SEO – a little story

Back in 2005 I had a great idea to increase the effectiveness of my PPC campaigns. It was one of those lightbulb moments in the shower.
You see, I had been running PPC for 18 months with great success, due mainly to the low bid prices, which allowed us to make mistakes back then (what costs £1.30 per click now cost 17p back then).
The question in my mind was how to drive laser-targeted rather than general traffic to my landing pages, something I am amazed to see still does not happen 8 years later.
I had discovered that the majority of my traffic was coming from keywords with a geographical angle, people looking for “[services KeyWord] [town]” rather than just “[services KeyWord]”. It struck me that if these people could land on a page that was specifically optimised for their [town], this would increase CTRs, reduce CPCs, raise ads in the listings and increase conversion rates.
I needed a solution fast. There are over 6,000 towns, villages and counties in the UK, so the idea of writing an individual page for each place was daunting and practically impossible at the time.
What we came up with was simply a database behind the site(s) that dynamically generated new pages by filling in optimised page elements like this:
PAGE TITLE: [services KW] [town] [brand]
META-DESCRIPTION: [call to action] for [Keyword] from [brand]. [Trust sentence]
META KEYWORDS: [services KW] [town]
H1: [services KW] in [town] [conversion copy]
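The template-filling approach described above can be sketched in a few lines. This is a minimal illustration, not the original system: the field names ([services KW], [town], [brand] and so on) come from the article, but the function, variable names and sample values here are hypothetical.

```python
# Hypothetical sketch of generating per-town page elements from templates.
# Placeholder names mirror the article's [services KW], [town], [brand] etc.
TEMPLATES = {
    "title": "{kw} {town} {brand}",
    "meta_description": "{cta} for {kw} from {brand}. {trust}",
    "meta_keywords": "{kw} {town}",
    "h1": "{kw} in {town} {conversion}",
}

def build_page_elements(kw, town, brand, cta, trust, conversion):
    """Fill every on-page element template for one town."""
    values = {"kw": kw, "town": town, "brand": brand,
              "cta": cta, "trust": trust, "conversion": conversion}
    return {name: tpl.format(**values) for name, tpl in TEMPLATES.items()}

# Example values (invented for illustration):
page = build_page_elements(
    kw="Skip Hire", town="Leeds", brand="Attwood Digital",
    cta="Get a free quote", trust="Trusted by thousands of customers.",
    conversion="- book online in minutes",
)
print(page["title"])  # Skip Hire Leeds Attwood Digital
```

Looping this over a table of 6,000+ place names is what turns one template into thousands of geo-targeted landing pages.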
It worked like a charm, achieving all the objectives in PPC and putting us years (literally) ahead of the competition.
What I did not expect was that these dynamically generated pages would start to rank in the SERPs, and rank all over the place. One website went from 5,000 unique organic visitors to over 40,000 within the space of a few months. For a while we suffered from a bizarre problem – too many leads. Our sales team were burning through the calls and emails that came in, sifting for the juicy ones and forgetting what my Grandad taught me: “Look after the pennies and the pounds will look after themselves.”
This was a great ride for a while and we soon got down to building links to some of our higher traffic pages, which I was now referring to as “geo-longtails”.
Then Penguin and Panda hit, and these pages started dropping. Logical really – they weren’t really adding value and did contain a lot of duplicate content, something Google had long stated it didn’t like.
We’ve since gone back to the drawing board and implemented a complete overhaul of all longtail pages to:
1. Be static, permanent pages (with dynamic content)
2. Contain more than 80% unique content
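One way to sanity-check rule 2 is to measure how much of a page's text overlaps with the shared template. The sketch below is a hypothetical check (not our actual tooling), comparing word shingles between a generated page and its boilerplate and reporting the share that is unique.

```python
# Hypothetical uniqueness check for the "more than 80% unique content" rule.
# Compares n-word shingles of a page against the shared template text.

def shingles(text, n=3):
    """Return the set of n-word shingles in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def unique_content_ratio(page_text, template_text, n=3):
    """Fraction of the page's shingles NOT found in the template (0.0-1.0)."""
    page = shingles(page_text, n)
    if not page:
        return 0.0
    shared = page & shingles(template_text, n)
    return 1 - len(shared) / len(page)

# A page that merely repeats the template scores 0.0; a fully rewritten
# page scores 1.0. Anything below 0.8 would fail the rule above.
```

Flagging pages below the 0.8 threshold before publishing is a cheap way to catch template-heavy pages before Google does.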
If you are getting this kind of feedback from your SEO reports:
[Screenshot: duplicate page content warnings in an SEO report – attwooddigital.com]
…then you may benefit from spending a couple of minutes watching this video from Matt Cutts at Google on the subject:

Mark, the founder of Attwood Digital, is a digital marketing veteran who has been working online since before the dotcom boom. He created the world's first online skip hire service in 2003, has created multiple online courses, lectured on digital marketing and even written a book on the subject. He is also an ICO advisor and crypto-enthusiast.