More and more website owners are concerned that they might be penalized, inadvertently or otherwise, for duplicate content. For example, if you run mirror sites, will search engines ban you? If you have listings that are similar in nature, is that an issue?
What happens if you syndicate content through RSS? Will other sites be considered the “real” site and rob you of a rightful place in the search results? This Search Engine Strategies San Francisco session looks at the issues and explores solutions.
Moderator:
Adam Audette, President, AudetteMedia, Inc.
Speakers:
- Shari Thurow, Founder & SEO Director, Omni Marketing Interactive
- Kathleen Pitcher, Senior Manager, Acquisitions Marketing, Pogo.com/Electronic Arts, Inc.
- Michael Gray, Owner, Atlas Web Service
Shari Thurow, Founder & SEO Director, Omni Marketing Interactive
What can happen with duplicate content?
It lowers your indexation count in Google, which means your best-converting pages might not appear in search results.
Web pages from your shared-content partners’ sites may actually end up getting better search visibility.
For searchers, duplicate content means seeing the same pages over and over again, which translates into a poor user experience.
How do you deal with duplicate content?
1. Information architecture, site navigation and page interlinking
- Are URLs linked to consistently?
- Are the links labeled consistently?
2. Robots.txt file
- Are you preventing duplicate pages from being spidered? (See the sketch after this list.)
3. Robots meta tag
- If articles are shared across a network of sites, are you implementing noindex, nofollow appropriately?
4. Canonical tag (see the examples after this list)
5. Ensure redirects are good (301s)
6. Use of webmaster tools
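To make the robots.txt and robots meta tag items concrete, here is a minimal sketch; the directory path is illustrative, not a recommendation for any particular site:

```
# robots.txt - keep spiders out of a directory that only holds
# duplicate printer-friendly copies (illustrative path)
User-agent: *
Disallow: /print/
```

And on an individual page whose content is duplicated elsewhere, the robots meta tag keeps that copy out of the index:

```
<!-- placed in the <head> of the duplicate copy -->
<meta name="robots" content="noindex, nofollow">
```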
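Similarly, a canonical tag tells the engines which URL is the preferred version, and a 301 redirect permanently points a duplicate URL at the preferred one. The domain below is a placeholder, and the redirect assumes an Apache server with mod_rewrite enabled:

```
<!-- in the <head> of every duplicate or variant URL -->
<link rel="canonical" href="http://www.example.com/widgets/">
```

```
# .htaccess - 301 the non-www duplicate host to the preferred www version
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```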
The idea behind all of this is consistency. Don't say one thing in webmaster tools and then submit a sitemap that says the opposite. Be consistent with search engines and they will reward you.
Kathleen Pitcher, Senior Manager, Acquisitions Marketing, Pogo.com/Electronic Arts, Inc.
What does duplicate content look like?
There are two camps when it comes to duplicate content. The first is the malicious, bad kind of duplicate content. The second is the good kind that serves a purpose.
Good:
- The same content appearing at different URLs
- RSS
- Blogs
- Forums
- Retail site with products in multiple categories
Bad:
- The same content multiplied across your site
- Blatantly stealing content from other sites
What are the consequences?
There are no specific penalties, but you may notice your organic search visibility slipping, i.e., content could get filtered into the engines' supplemental index.
Learnings and best practices
1. Determine if you have duplicate content (see the quick check after this list)
2. Leverage resources
- Talk to other departments
- Consult with your agency
- Research industry sites
- Review webmaster forums
- Talk to industry peers
3. Be proactive
- Write unique page content
- Identify authority pages
- Be aware of engine updates
- Manage syndicated content
4. Manage syndicated content effectively
- Allow ample time for your content to get indexed
- Require links back (a sketch follows this list)
- Require condensed versions
- Use generic metadata
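One quick, low-tech way to handle step 1 is to search for an exact sentence from your page in quotes and see which URLs come back; the phrasing below is a placeholder:

```
"a distinctive sentence copied verbatim from your page"

site:example.com "a distinctive sentence copied verbatim from your page"
```

The first query shows who else has the text; the second, using Google's site: operator, surfaces internal duplicates on your own domain.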
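And as a sketch of what "require links back" and "require condensed versions" can look like in practice (the URL and wording are illustrative):

```
<!-- condensed excerpt offered to syndication partners, with a
     required attribution link back to the original article -->
<p>Duplicate content rarely draws a true penalty, but it can quietly
erode your search visibility. [...]</p>
<p>Read the full article at
<a href="http://www.example.com/duplicate-content/">example.com</a>.</p>
```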
And remember not to freak out; there are always solutions.
Michael Gray, Owner, Atlas Web Service
As opposed to the other presenters, who shared how to fix duplicate content, Michael discussed how to make it work for you.
There are some circumstances where duplicate content is a good thing, if you use it as a weapon. When you syndicate content, many sites will take it “in whole.” Use this as an opportunity to gain links, especially if the sites picking up your content are more trusted and authoritative. Try to set up arrangements with people in your space (blogs, magazines, etc.) so they will pick up your content.
How to potentially outrank someone for their own content:
- Take content or a data feed that someone else has legally syndicated or allowed to be re-used
- Place that content on a different domain
- Build inbound links with very keyword-focused anchor text to the content
- If you can build more trust than the original website, you may be considered the originator in the eyes of Google
- This is a very common tactic in shady/aggressive affiliate industries
Why I love web scrapers
Most web scrapers are stupid. They search for keywords and leave whatever links they find in posts in place. Use this to your advantage by linking to yourself with high-value, keyword-focused anchor text in every post.
As long as you offset these low-value links with high-quality ones, this works in your favor. Always insert links back to the original website and/or the original page.
Further, change the anchor text, link, and surrounding text of links inserted after your content (e.g., from something like Yoast’s RSS footer plugin) every 3-4 months so you’re getting different links to different parts of the website.
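A self-referencing footer appended to each feed item might look like the fragment below; the anchor text and URLs are placeholders you would rotate every few months:

```
<!-- appended to every RSS item; scrapers that republish the feed
     wholesale carry these links back to you -->
<p>The post <a href="http://www.example.com/blue-widgets/">blue widgets</a>
appeared first on <a href="http://www.example.com/">Example Widgets</a>.</p>
```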
Takeaways:
- Look for opportunities to syndicate your duplicate content, gain attention, trust and links
- Refine your copy to target more keywords
- Be on the lookout for people who may be reusing your content who aren’t helping you
- Allow your blog and RSS feed to be syndicated with self-referencing and keyword-focused links to commercial pages