Guys, I need help with a question. I have a webpage with unique content, and it already ranks well, but I want it to rank better than it does now. My question is: if I use the same content, without changing a single word, to create 50+ Web 2.0 sites, will that help my ranking? Or will it destroy my present rank? I'm a bit confused about this. Can you guys please help me figure this out?
If the Web 2.0 sites you are publishing this content on consist entirely of content copied from somewhere else - say, one-page Web 2.0 sites containing nothing but your copied page - then those sites aren't going to rank at all. One of them might rank for a short while if you point some links at it, but the Google Panda algorithm will quickly demote all of the duplicate content. So they won't rank and won't get any Google traffic at all.
If you used those Web 2.0 sites to link back to your original article, it might boost your ranking temporarily when Google first discovers the new inbound links, but it would ultimately end up hurting it. Eventually Google will realize it is simply a link network and nuke everything. And if you use the same anchor text in all the links, your original page will likely get demoted by the Google Penguin algorithm.
If you added enough unique content to each of those pages in addition to the republished content, and you used different anchor text each time to link back to the original page (or perhaps the bare URL), then it is possible Google might count those inbound links for a long, long time, which could boost the ranking of your original page. If the Web 2.0 sites you are publishing on are not identified as low quality (as with Google Panda) and the links are not identified as intentionally manipulative (as with Google Penguin), then you stand a chance at boosting your ranking.
You really don't want to syndicate your content across low quality sites at all.
About 8-9 years ago, there were link networks that syndicated your exact article across many different sites. You would pay "x" amount of dollars per month, publish one article, and it would get sent out and republished verbatim on a couple hundred other sites. This worked very well at the time and could boost your rankings way up. Then Google nuked all the sites those kinds of services published on - well before the Panda or Penguin updates ever came out. So that type of service was rendered useless somewhere around 2007 or 2008, give or take.
Then new services popped up that did the same thing, except this time they spun your article so that the content across each of the network sites was mostly unique, and they would drip that content out over time so the links appeared to be happening naturally. After that came software you could buy for your own computer that let you spin your articles and mass-submit them to article directories or splogs (spam blogs). These services worked really well for a long time. Now most of those services are dead, and most of that software sits idle on internet marketers' computers.
The spam industry was forced to redesign things yet again: either get content published on sites where Google will still count the links, or build software that produces entire websites Google misidentifies as useful and therefore counts links from. Some of it works well and a lot of it doesn't. But I digress.
To shorten what is getting to be a really long story:
Don't republish your content verbatim on crappy sites that nobody is going to read. You are more likely to hurt yourself doing that than to help your rankings in any way.