There is a LOT of duplicate content out there on the web. If you search for product reviews on Google, the first 10 pages of search results will contain very similar (or even outright duplicate) articles, where the copier did not even bother to change the content after copying!
Does that pose a problem in itself? Here is one opinion.
But before you start believing everything you read, consider this: even if you are careful to publish unique content on your website, ANYONE can copy it and use it for themselves. This means you have virtually no control over your public content. The search engines understand this and, in my opinion, take it into consideration when assigning you SERPs.
Another point to consider: when you get directory submissions done, the same article is copied out to hundreds or even thousands of directories! Does that mean you get penalized? Far from it! If you get links from high-quality, high-PR article directories, your SERPs will improve, which shows that duplication across different sites is not a major issue.
But you should always be sure to never have the same content on two different pages of your website. It will most likely kill your rankings!
In essence, search engines check whether the duplication is under your control or not. If it isn't, chances are you will never get penalized. But if it is under your control (and you still do it, for example by having the same content on multiple pages of your website), you can get penalized.
I personally have never worried too much about duplicate content, but I do try my best to reduce it when I can:
1. When you do social bookmarking, change your titles and tags a little bit for every submission.
2. Try not to submit the same article to too many directories at the same time.
3. NEVER have the same content on any 2 pages of your website.
4. You might get penalized if you copy content from other websites, so stay away from it.
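Point 3 is the one you can easily check yourself. Below is a minimal sketch (my own illustration, not a tool mentioned above) that flags pages on a site sharing identical text, after normalizing case and whitespace so trivially reformatted copies still match:

```python
import hashlib
from collections import defaultdict

def content_fingerprint(text: str) -> str:
    """Hash the page text, ignoring case and whitespace differences."""
    normalized = " ".join(text.lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def find_duplicate_pages(pages: dict) -> list:
    """Group page URLs that share identical normalized content."""
    groups = defaultdict(list)
    for url, text in pages.items():
        groups[content_fingerprint(text)].append(url)
    return [urls for urls in groups.values() if len(urls) > 1]

# Hypothetical example pages: two reviews differ only in whitespace.
pages = {
    "/review-a": "Great product. Highly recommended!",
    "/review-b": "Great  product.\nHighly recommended!",
    "/about":    "We write honest reviews.",
}
print(find_duplicate_pages(pages))  # the two review pages are flagged
```

This only catches exact (whitespace-insensitive) copies; near-duplicates with light rewording would need a fuzzier comparison, but for catching accidental reuse of the same page content it is usually enough.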
Check out link building services, which improve your SERPs without relying on duplicate content.