Webpage content and Web 2.0 content are the same: is it harmful for ranking or not?

TausifAlHossain

Member
Registered
Joined
Nov 4, 2015
Messages
15
Points
0
Guys, I need help. I have a question! I have a webpage with unique content, and it already ranks well. But I want it to rank better than it does now. My question is: if I use the same content, without changing a single word, to create 50+ Web 2.0 properties, will that help my rank, or will it destroy my present rank? I'm a bit confused about this. Can you guys please help me figure this out?

Thanks in advance! :)
 

RDO Servers

Well-known member
Registered
Joined
Apr 3, 2015
Messages
1,027
Points
83
You have a website with unique content.
You want to make 50+ other websites using the same content?

If I understand you right, then no, this is a bad idea. The content on those 50+ sites will be treated as plagiarized and will bring you no benefit.
 

ulterios

Well-known member
Registered
Joined
Nov 25, 2015
Messages
481
Points
0
If you are trying to do what RDO described above, that's definitely NOT something you want to do. Since your unique content is already indexed and ranking well, any further posting of the same content will be seen as plagiarized and will not rank well at all.

Search engines treat the first instance of any content they find as the original. If they later find the same content on another site, they assume (more than they know) that it came after the first instance they saw, and right away it will rank worse than the page where they first saw that particular content.

Many people rewrite or spin their content into similar-but-different articles/posts and put those up on different sites, but be careful with this: the search engines have gotten much better at detecting spun content and will not rank it well. As a rough sketch of the general idea, see the snippet below.
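To make that concrete, here is a toy Python sketch of one classic near-duplicate technique, shingle overlap with Jaccard similarity. This is only an illustration of the technique; the threshold is invented for the example and nothing here is Google's actual method.

```python
# Toy near-duplicate detector: compare word-level shingles of two texts.
# The 0.5 threshold is an arbitrary illustration, not a published rule.

def shingles(text: str, k: int = 3) -> set:
    """Return the set of k-word shingles of a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity: |A & B| / |A | B|."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

original = "fresh unique article about red apples and how to grow them"
spun = "fresh unique post about red apples and how you can grow them"

similarity = jaccard(shingles(original), shingles(spun))
print(f"similarity: {similarity:.2f}")
if similarity > 0.5:  # illustrative threshold only
    print("flag as near-duplicate / likely spun")
```

Even a lightly spun article keeps most of its shingles, which is why spinning is easy to catch.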

If you meant something else, give us a bit more clarification and we can help you better.
 

PTTed

Well-known member
Registered
Joined
Jul 15, 2015
Messages
329
Points
0
In my experience, this is not how it actually works. Google does not automatically credit the site/page it found first. Instead, it chooses which page to rank based on the standard ranking algorithm.

Under the standard ranking algorithm, if you have two identical pages, the one that outranks the other will be the page with the most inbound link juice and/or the most relevant inbound anchor text. If no links point to either page, whether from within the domain itself or from other domains, then the page that ranks highest will be the one published on the domain with the most authority.
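A purely illustrative Python sketch of that tie-break, with hypothetical field names and scores (Google's real signals are not public):

```python
# Hypothetical tie-break among identical pages: link equity first,
# then domain authority. All numbers are made up for illustration.

from dataclasses import dataclass

@dataclass
class Page:
    url: str
    inbound_link_equity: float  # hypothetical aggregate "link juice"
    domain_authority: float     # hypothetical host-level authority

def pick_winner(duplicates: list) -> Page:
    """Choose which of several identical pages to rank."""
    return max(duplicates, key=lambda p: (p.inbound_link_equity,
                                          p.domain_authority))

pages = [
    Page("https://original.example/post", inbound_link_equity=0.0,
         domain_authority=35.0),
    Page("https://copy.example/post", inbound_link_equity=12.0,
         domain_authority=20.0),
]
print(pick_winner(pages).url)  # the boosted copy wins, as in the hijack study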

This was demonstrated a few years back in a public study where a page on a well-known website was copied to another domain and republished verbatim. The "copied" page on the second site was then boosted with link juice. The result was that Google replaced the original, authentic page in the rankings with the page that copied it. Essentially, the ranking was hijacked: the original content was stolen, and more link juice was pushed at the stolen page.

This is why Google advises that if you are going to allow people to republish your work, you should have them link back to the original source page. If they don't, you run the risk of losing your ranking and having the other publisher outrank you.
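If you syndicate, you can check this yourself. A small sketch, assuming the copy should either declare the original as cross-domain canonical or at least link back to it (the URLs are hypothetical; it uses the well-known requests and BeautifulSoup libraries):

```python
# Check whether a syndicated copy points back at the source page,
# via rel="canonical" or an ordinary anchor link.

import requests
from bs4 import BeautifulSoup

ORIGINAL = "https://mysite.example/red-apples"  # hypothetical source URL

def links_back_to_original(copy_url: str) -> bool:
    html = requests.get(copy_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # Best case: the copy declares the original as canonical.
    canonical = soup.find("link", rel="canonical")
    if canonical and canonical.get("href") == ORIGINAL:
        return True

    # Weaker but still useful: a plain link back to the source.
    return any(a.get("href") == ORIGINAL
               for a in soup.find_all("a", href=True))

print(links_back_to_original("https://someweb20.example/red-apples-copy"))
```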

I think it still works that way, unless something has changed in the last year or so that I didn't hear about, which is possible. That study was repeated multiple times, and the hijack succeeded every time.
 

elcidofaguy

Well-known member
Registered
Joined
Jan 13, 2015
Messages
866
Points
0
TausifAlHossain said:
If I use the same content, without changing a single word, to create 50+ Web 2.0 properties, will it help my rank or destroy it?
I think the best way to answer that would be to ask the following questions...

Do you really think Google will rank those other pages with duplicate content? Let's say I search for "red apples" and the top 51 results are your website plus 50 other Web 2.0 properties, all carrying the same duplicate content. If that were the case, would people continue to use Google?

Put yourself in Google's shoes. What is it that Google wants to do? How would you provide value to your customers? Would you serve the same duplicate content 50 times over in the same search results?

Finally, do you really think your webpage is ranking purely on the basis of its content?
 

ulterios

Well-known member
Registered
Joined
Nov 25, 2015
Messages
481
Points
0
Meaning: if you have the same duplicate content on other sites, don't link them to one another. If you have the same content on multiple sites and link them all together, to the search engines it will look like a bunch of spam sites, since they all have the same content.

Linking other sites that carry the duplicate content back to your first, ranking site will hurt that original site. The sketch below shows why that footprint is so easy to spot.
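A toy Python illustration of why this pattern is easy to detect: several domains serving the same content fingerprint while linking to each other look like one network. The domains and crawl data are made up.

```python
# Flag domain pairs that share a content hash AND interlink.

import hashlib
from itertools import combinations

# domain -> (page text, set of domains it links to); hypothetical crawl
crawl = {
    "site-a.example": ("same article", {"site-b.example", "site-c.example"}),
    "site-b.example": ("same article", {"site-a.example", "site-c.example"}),
    "site-c.example": ("same article", {"site-a.example", "site-b.example"}),
    "honest.example": ("a different article", set()),
}

def fingerprint(text: str) -> str:
    return hashlib.sha256(text.encode()).hexdigest()

for d1, d2 in combinations(crawl, 2):
    same_content = fingerprint(crawl[d1][0]) == fingerprint(crawl[d2][0])
    interlinked = d2 in crawl[d1][1] and d1 in crawl[d2][1]
    if same_content and interlinked:
        print(f"possible spam network: {d1} <-> {d2}")
```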
 

PTTed

Well-known member
Registered
Joined
Jul 15, 2015
Messages
329
Points
0
TausifAlHossain said:
If I use the same content, without changing a single word, to create 50+ Web 2.0 properties, will it help my rank or destroy it?
If the Web 2.0 sites you publish this content on are simply made up of content copied from somewhere else (say, one-page Web 2.0 sites containing nothing but your copied article), then those sites aren't going to rank at all. One of them might rank for a short while if you point some links at it, but the Google Panda algorithm will quickly demote all of the duplicate content. So they won't rank and won't get any Google traffic at all.

If you used those Web 2.0 sites to link back to your original article, it could boost your ranking for a while but ultimately end up hurting it. It might help when Google first discovers those new inbound links to your original article, but eventually Google will realize it is simply a link network and nuke everything. If you use the same anchor text in all the links, your original page will likely get demoted by the Google Penguin algorithm.
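A toy Python illustration of that over-optimization pattern: when one exact-match phrase dominates a page's inbound anchor text, the profile looks manipulated. The 60% threshold is invented for the example, not a known rule.

```python
# Measure how much the single most common anchor dominates the profile.

from collections import Counter

inbound_anchors = [
    "buy red apples", "buy red apples", "buy red apples",
    "buy red apples", "https://mysite.example/red-apples", "this post",
]

counts = Counter(inbound_anchors)
top_anchor, top_count = counts.most_common(1)[0]
share = top_count / len(inbound_anchors)

print(f"'{top_anchor}' makes up {share:.0%} of anchors")
if share > 0.6:  # illustrative threshold only
    print("anchor profile looks unnatural")
```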

If you added enough unique content to each of those pages in addition to the republished content, and used different anchor text each time to link back to the original page (or perhaps the bare URL), then it is possible Google would count those inbound links for a long, long time, which could boost the ranking of your original page. If the Web 2.0 sites you publish on are not identified as low quality (as with Google Panda) and the links are not identified as intentionally manipulative (as with Google Penguin), then you stand a chance of boosting your ranking.

You really don't want to syndicate your content across low quality sites at all.

About 8-9 years ago, there were link networks that syndicated your exact article across many different sites. You would pay x dollars per month, publish one article, and it would get sent out and republished verbatim on a couple hundred other sites. This worked very well at the time and could boost your rankings way up. Google nuked all the sites those kinds of services published on, well before the Panda or Penguin updates ever came out. That type of service was rendered useless somewhere around 2007 or 2008, give or take.

Then new services popped up that did the same thing, except this time they spun your article so that the content on each network site was mostly unique, and they dripped that content out over time so the links appeared to happen naturally. Software was also written that you could buy for your computer to spin your articles and mass-submit them to article directories or splogs (spam blogs). These services worked really well for a long time. Now most of them are dead, and most of that software sits idle on internet marketers' computers.

The spam industry was forced to redesign things yet again: either get content published on sites where Google would count the links, or produce entire websites that Google misidentifies as useful and therefore counts links from. Some of it works well and a lot of it doesn't. I digress.

To shorten what is getting to be a really long story:

Don't republish your content verbatim on crappy sites that nobody is going to read. You are more likely to hurt yourself doing that than to help yourself in any way.
 

TausifAlHossain

Member
Registered
Joined
Nov 4, 2015
Messages
15
Points
0
If I'm not wrong, it will hurt the main website's rank, right?
 

PTTed

Well-known member
Registered
Joined
Jul 15, 2015
Messages
329
Points
0
Sorry dude. Either you didn't read through all of the great, highly specific, well-thought-out advice everyone in this thread just gave you, or you just don't understand English. I can't help you at this point. :bash:

My intuition tells me you should just give up on whatever hopes you have of doing any kind of internet marketing or making money with websites. You will save yourself a great deal of future frustration.
 

SEOPub

Well-known member
Registered
Joined
Mar 15, 2015
Messages
654
Points
0
Honestly, Google does not care if the content is identical. They index and rank identical content all of the time. Look at sites that contain quotes, music lyrics, news articles, etc.

There is a chance that some identical content will get pushed into Google's supplemental results, but that is a completely different issue.

The real issue with your plan is that Web 2.0 sites suck for backlinks. They are extremely weak unless you work to boost them. And if you boost them while using identical content, you run the risk of Google deciding that one of them is the more authoritative version and ranking it above your website.

The plan is lousy, not so much because of the identical content (although that is not the best idea either), but because the link sources stink.

Stop being lazy/cheap and create new content.
 