The role of duplicate content is one of the trickiest elements of SEO according to many experts. Some SEOs claim Google imposes a duplicate content penalty, a claim Google engineers refute. These conflicting assertions have led some site owners to grow lax about watching for duplicate content, with damaging SEO results. This article looks at duplicate content in detail and tries to answer whether or not it is bad for SEO.
When duplicate content is bad for SEO
The best way to determine whether or not duplicate content is bad for SEO is to look at the intent behind it. When your intention is to inflate page views or to manipulate the search engines into thinking you have more great content than you do, it's bad. It's as simple as that.
In fact, according to Google, action can be taken against a website that appears to be manipulating its way into the top rankings. Search engines can also act against a website whose duplicated content is intended for spamming. Either way, the negative consequences of duplicate content hinge on manipulative intent.
The primary concern of the search engines is their users; their obligation is to create value for those users. When users submit queries, Google wants to give them a variety of useful answers. Duplicated content does not deliver this value.
When duplicate content is good for a website
Sometimes, duplicate content occurs on a website but causes no SEO problems. For example, when the same page can be reached through different URLs, search engines can treat the duplication as accidental. Duplicates created for mobile sites and printer-friendly versions of pages can likewise be treated as accidental. None of these instances is subjected to the duplicate-content filters, so the SEO losses are negligible.
What to do about duplicate content on the website
When your website has duplicate content issues, the following are some of the actions you can take to save the day.
- Use permanent redirects – a 301 redirect tells the search engines that a page has moved permanently to a new location. You can redirect the weaker version of a page to the more authoritative version of it.
- Link to the index page consistently – when creating backlinks to the website's index page, it's best to use one uniform URL. This gives the algorithms insight into which version you consider important.
- Use a duplicate content checker like Plagiarisma or Plagiarism Checker to investigate duplicate content on your website and across the wider web.
- Use the noindex directive on content that is scraped or copied from elsewhere.
- Avoid repeating lengthy footer information across the website.
- Expand pages with similar content – it's better practice to create fewer, bigger pages than many near-duplicates of each other. Combine pages that can be combined as much as possible.
- Publish content that is unique to your website.
- Use the rel="canonical" tag to specify the preferred URL for each page.
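To make the redirect, noindex, and canonical tips above concrete, here is a minimal sketch in plain Python of how they map to HTTP responses. The paths, the `/copied/` prefix, and the example.com domain are all hypothetical illustrations, not anything prescribed by a search engine:

```python
# Weaker duplicate URLs mapped to the authoritative version (hypothetical paths).
REDIRECTS = {
    "/index.html": "/",
    "/home": "/",
}

def respond(path):
    """Return (status, headers) for a request path, handling duplicate URLs."""
    if path in REDIRECTS:
        # 301 tells search engines the page has moved permanently.
        return 301, {"Location": REDIRECTS[path]}
    headers = {}
    if path.startswith("/copied/"):
        # X-Robots-Tag: noindex keeps copied content out of the index.
        headers["X-Robots-Tag"] = "noindex"
    # rel="canonical" declares the preferred URL for this page.
    headers["Link"] = '<https://example.com%s>; rel="canonical"' % path
    return 200, headers
```

The same ideas apply whatever server or framework you use; only the place you set the status code and headers changes.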
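Dedicated checkers like the ones named above scan the wider web, but a first pass over your own site can be as simple as hashing normalized page text and grouping identical fingerprints. A minimal sketch, with made-up page texts as input:

```python
import hashlib

def fingerprint(text):
    # Normalize case and whitespace so trivial differences don't hide duplicates.
    normalized = " ".join(text.lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def find_duplicates(pages):
    """pages: {url: page text}. Returns groups of URLs with identical content."""
    groups = {}
    for url, text in pages.items():
        groups.setdefault(fingerprint(text), []).append(url)
    return [urls for urls in groups.values() if len(urls) > 1]
```

Exact hashing only catches literal copies; near-duplicates with small edits need fuzzier techniques such as shingling, which the dedicated tools provide.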