Rotten Tomatoes presents different information and different statistics than IMDb does. For example, take Robin Hood: Prince of Thieves.
IMDb gives a 6.9/10 from 119,115 users. That's the only real statistic it gives us; there is no Metascore for this movie. (Personally, I think the Oscar nomination should be mentioned next to this score, but it's buried further down the page.)
Rotten Tomatoes, on the other hand, shows us several numbers. The Tomatometer is at 50% for "All Critics", with an average rating of 5.7/10 and 52 reviews. The "Top Critics" Tomatometer is at 36%, with an average rating of 5.9/10 and 14 reviews. The Audience rating, however, is 73%, with an average rating of 3.4/5 from 333,273 users.
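Since these sites report ratings on different scales (out of 10, a percentage, out of 5), one quick sanity check is to normalize everything to a common 0-100 scale before comparing. A minimal Python sketch using the Robin Hood figures above (the helper name is my own invention):

```python
def to_100(score, scale):
    """Normalize a rating given on an arbitrary scale to 0-100."""
    return score / scale * 100

# Robin Hood: Prince of Thieves, figures quoted above
imdb_users = to_100(6.9, 10)    # IMDb user rating     -> 69
rt_critics = to_100(5.7, 10)    # RT critics' average  -> 57
rt_audience = to_100(3.4, 5)    # RT audience average  -> 68
```

Note that the 50% Tomatometer is the share of positive reviews, not an average rating, so it isn't directly comparable to these averages.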
--
The above example shows how you can sometimes use IMDb's general user rating as a middle reference point between the audience numbers and the critics' numbers. But this doesn't always paint the best picture. Let's take another example: the recent (and generally accepted as a flop) 47 Ronin.
Here, IMDb gives us a 6.3/10 rating from 60,849 users, but it also gives a Metascore of 29/100. Yet the number displayed in a gold star, in larger type, is just the '6.3'. So even though this movie has a dramatically lower Metascore, IMDb features only the general user rating.
The story is much more dramatic on Rotten Tomatoes. It has a 13% 'All Critics' Tomatometer rating from 72 reviews, with an average rating of 4.1/10. The 'Top Critics' Tomatometer shows 0% from 13 reviews, with an average rating of 2.9/10; not a single one of the 13 top critics liked this film! Yet the audience rating is 51%, with an average of 3.3/5 from 53,921 ratings.
We can see how IMDb got its '6.3' rating here: both sites show a middle-of-the-road score from general audiences. But Rotten Tomatoes' critic stats show this movie isn't worth two hours of your life. The gamble of whether you'll like the movie becomes a much worse bet once you take the critics' reviews into consideration.
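To make the audience/critic gap concrete, put the 47 Ronin figures on the same 0-100 scale (a sketch; the normalization helper is my own, and the Tomatometer percentages are excluded since they aren't averages):

```python
def to_100(score, scale):
    """Normalize a rating given on an arbitrary scale to 0-100."""
    return score / scale * 100

# 47 Ronin, figures quoted above
ratings = {
    "IMDb users":      to_100(6.3, 10),  # ~63
    "Metascore":       29.0,             # already out of 100
    "RT critics avg":  to_100(4.1, 10),  # ~41
    "RT audience avg": to_100(3.3, 5),   # ~66
}

# Sort lowest to highest: the two critic numbers land well below
# the two general-audience numbers.
for source, score in sorted(ratings.items(), key=lambda kv: kv[1]):
    print(f"{source}: {score:.0f}/100")
```

The IMDb user score (~63) sits right next to the RT audience average (~66), while both critic measures sit 20-35 points lower, which is exactly the pattern described above.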
--
As we can see above, even within each website there are not only greatly varying statistics but also missing or hidden information (Robin Hood lacking a Metascore, and the Oscar nod not being prominently displayed). People weigh their options based on the data they have at hand, so the information you give people - along with its context - will greatly change their minds, regardless of the source.
This is why I think it's much more realistic to look at multiple sites. You need as much information as you can get, and no single site gives you all the relevant data, as it varies from film to film and from user community to user community. It would seem you can't depend on 'the crowd' alone to give someone an accurate idea of whether they will like a film; a survey would probably be better, but nobody's going to run one for every film they watch.