Recently, I have found this "review" of the 50 years El Dorado rum:
"El Dorado Grand Special Reserve commemorates the 50th anniversary of Guyana gaining independence from the United Kingdom in 1966. A blend of rums aged for between 33 and 50 years: 65% was distilled in 1966, 25% between 1966 and 1976, and 10% comes from 1983 and packaged in a crystal decanter and I received this bottle from my wife as a retirement gift and being the 90th. Bottle in my collection, I have yet to open this bottle to share my views. Being a huge fan of El Dorado rums, it would be a shame to give anything but a 10 pending a tasting of this bottle, but will revise accordingly at a later date."
Yes, it's the only one there (mere mortals obviously do not have spare cash to throw at bottles designed to be overpriced), and it's not a proper review, because the author admits he did not taste the rum: he only parrots the marketing fluff from the bottle/leaflet, then adds that it "must be superb", giving it a 10 based on... nothing! This way, I could "review" bottles by looking at them through the shop's window! Oh, that one looks fancy, brown-like, must smell nice, 7 out of 10! What rubbish!
A review should convey an authentic experience of tasting, comparing and evaluating the particular rum expression - an experience the "reviewer" does not have, because he did not open the damned bottle! I have reported this "review" and asked for it to be removed; IMO it has no business being displayed.
Thanks for the comments, yes I'm aware a handful of members have in essence created ratings 'placeholders', saying they haven't rated them yet but plan to shortly.
If they're reported, I typically ping the user saying they need to either try the rum and update their rating, or it will be deleted. Over the years I've had to delete a few but more folks update them pretty quickly.
When you see these, please report them (flag to the bottom right of each rating), and I'll ping the user and set a reminder to check back in 1 week. If it's not updated, then it's easy to justify deleting it.
You are right, reviews just stating "there is a rum in my cabinet waiting to be reviewed, so here is my '10'" are worthless. In all honesty, there are also a lot of reviews that aren't much better, with no real information at all ("I expected it sweeter", "not my taste", "strong alcohol burn", "the best I tasted" - from someone with just 1 or 2 ratings) and a super high or super low score.
I learned on day one of my RumRatings account that the average score does not mean much; dry rums score 1-2 full points lower than sweet ones, so all you can do is dig into the reviews and try to find the helpful ones. Thankfully there are a lot of those as well!
This is annoying to new rum drinkers like myself who are still trying to get the lay of the land, so to speak. Just today, I’ve edited two of my ratings based on nothing more than experience with more rum.
But even my first ratings were based on actual tasting, and therefore have some value. OK, maybe not, but you really do have to do your homework regarding ratings.
@harrie - I know what you are trying to say, but out of principle, I cannot accept that.
We here are not professionals, or at least we are not required to be (some of us may be). Thus we are not required to provide credentials, certificates or documents proving our ability to review rums. We simply trade opinions - explore, drink, compare, experience and share what we have seen and tasted.
But the very minimal requirement for that process is the Experience! To be able to judge the qualities of the rum, to say if it's sweet or dry, bitter or easy on the palate, young or well aged, to be able to do that, you must first open the bottle, pour a glass, smell it and taste it. In my opinion, you should also swallow it - I know professional tasters don't, and I don't blame them, since alcohol is a poison in the end, but a normal consumer does not spit premium, aged alcohol into a bucket. To finish judging what the alcohol brought you, IMO you should also appreciate the finish, and what it does with your system afterwards - how fast you become tipsy, and what is the resulting mood - that may, of course, depend on many other parameters, but I still consider it part of the experience.
Then, and ONLY then, one can say: "I have been there", and only then one can write reviews.
I totally agree with you, and I take "professional" alcoholic beverage reviews with a grain of salt. Yes, most of those reviewers spit the tasting into a bucket, which means they cannot possibly apply the ultimate test of a good beverage. I don't care how good a rum or other alcoholic beverage tastes; if it wreaks havoc on my system the next day, then it is not recommended to drink. I have lowered quite a few of my initial ratings because of this.
I’ve missed this discussion till now. I find it very interesting, because around a year ago I became pretty sure that quite a lot of reviews are fake or of no use. One person had bought around one hundred bottles, and tasted even more, during 10 days or so, and his reviews are more or less copy and paste with added photos of unopened bottles taken at supermarkets. If he isn’t a fake, it would be interesting to know what he did with all the bottles (he was on vacation) and how he felt after drinking that much during those days.
There are also examples of members who review the same rum two or three times under different names.
If Andy thinks it’s hard to throw people like this off his great site, give him ammo by telling him your opinion, and also boycott these kinds of people.
I agree with the other posters here. One cannot review a rum without tasting and experiencing it. The whole purpose of this site, as I use it and understand it, is to be educational by providing information about rums based on having tried them - you know, an actual “review”, not a “preview”. It is impossible to rate something you haven’t tried. I have a bottle of Zafra 30 sitting in my cabinet that I cannot wait to review, but I am waiting for a celebratory event to open it given its rarity. I can’t review it yet because I have never actually tried it, even though I own a bottle of it. It would be disingenuous to do so.
I think the average ratings are not bad; of course, it would be better to remove the placeholders.
The best way of using the ratings is to find a reviewer with the same preferences.
If you find one who loves the same rums you do, it's easy to follow his recommendations.
Hey there, great discussion and really applicable points all around. There's definitely a consensus that a small number of raters aren't 'value-add' (eg haven't rated) or at worst represent a rum company - which throws off things like the average rating of a rum. A few features I'm prioritizing in the near future are:
1) Better recommendations - so the system understands the rums a user has rated and comes up with some smart suggestions based on all the data RR has. EG rated these 3 rums high, try this other rum based on other raters with similar ratings.
2) 'Smarter' averages - many users have suggested (in general) the more ratings a user has the more they often trust their ratings. It's not always the case, but a user with 1 rating typically has a lot less experience than one with 50+ ratings. So I'm thinking through either removing users below a minimum number of ratings (say 3 or so) from the averages, and/or weighting the average towards users with more ratings.
Open to thoughts if you think either/both of these would help :)
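To make the two options concrete, here is a minimal Python sketch of both (the data shape and function names are my own illustration, not RumRatings code): each score is paired with how many total ratings its author has posted.

```python
def filtered_average(ratings, min_count=3):
    """Option 2a: drop raters with fewer than min_count total ratings."""
    kept = [score for score, n in ratings if n >= min_count]
    return sum(kept) / len(kept) if kept else None

def weighted_average(ratings):
    """Option 2b: weight each score by the rater's total rating count."""
    total_weight = sum(n for _, n in ratings)
    if total_weight == 0:
        return None
    return sum(score * n for score, n in ratings) / total_weight

# Seven '10's from one-rating accounts and a '3' from a 60-rating veteran:
suspect_rum = [(10, 1)] * 7 + [(3, 60)]
print(filtered_average(suspect_rum))  # 3.0 - only the veteran survives
print(weighted_average(suspect_rum))  # ~3.73 - the veteran dominates
```

Either approach pulls the displayed average toward experienced raters; the filter is simpler to explain, while the weighting degrades more gracefully when a rum has no veteran ratings at all.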
That 2nd point especially seems like it would help a lot and may even “solve the issue”. I definitely trust a review coming from someone with more than 30 reviews more than one coming from someone with 1 review.
I have learned to ignore the average score. I sort the reviews by Rating and then look for reviewers on that rum with whom I respect their opinion. If a review looks appealing to me and I am not familiar with that reviewer, I then check that one's cabinet to see how they rated other rums that I am familiar with. But I agree, those with only 3 reviews or less that are more than one year old should not be computed into the average. I once looked for the worst overall rum based on those with multiple reviews. I was not about to spend anything on Captain Morgan Grapefruit just to add my own two cents worth.
It’s definitely so that a reviewer with 50+ reviews is normally more trustworthy than someone with just a few reviews.
But it can also be the other way around when someone more or less reviews in some kind of industrial way, with lots of copy and paste.
I mentioned one reviewer when I replied last time. This guy bought and tasted 133 bottles and besides that tasted 213 different rums at distilleries, in total 346 reviews during around 14 days. To these reviews he added around 300 photos of unopened bottles, most of them taken at supermarkets or shops.
That’s more reviews than most of us do in a lifetime.
During one day he tasted around 40 different rums!
Is this trustworthy?
No, no, no, not in any way!
We don’t need people like him on this site, which I like a lot.
I wholeheartedly agree with you and I know exactly who you are talking about! I don't respect any of his reviews because they are obviously mass produced. He takes just one sip (if that much) and immediately calls it crap. His reviews also should not be calculated as part of the overall score. When I look at a rum that I am thinking of buying and see one very low rating for that rum, most of the time it is his.
Really like your "smarter averages" idea. When I am looking at a new rum to purchase I use these ratings to help make my decision. I did this yesterday. For many rums especially those with just a handful of reviews one of these "non reviews" can have a really big impact on a decision to purchase. In general though, this is a great community and the suggestions above are appreciated. It is a tribute to the sincerity of Rum Ratings that such issues are freely and openly addressed. Bravo all.
Point nr. 2 looks interesting. I believe there are some very good rums (even a few hidden gems) that are too specific (dry, high esters etc.) to be liked by beginners, but deserve a higher average rating than they have...
I think it’s a good idea to exclude ratings from people with very few ratings from the average, but as we both know, that isn’t the only problem.
So my solution is to take away the extreme rating scores, just as they do in ski jumping.
For example, the 5 or 10% highest and lowest ratings. Or a combination of your point 2 and that.
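The ski-jump-style trimming suggested above is a standard trimmed mean; a short Python sketch (a hypothetical helper, not site code):

```python
def trimmed_mean(scores, trim_frac=0.10):
    """Average after dropping the top and bottom trim_frac of scores,
    like discarding the extreme judges' marks in ski jumping."""
    s = sorted(scores)
    k = int(len(s) * trim_frac)  # number of scores to drop at each end
    kept = s[k:len(s) - k] if k else s
    return sum(kept) / len(kept)

scores = [3, 7, 7, 8, 8, 8, 9, 9, 9, 10]
print(sum(scores) / len(scores))   # plain mean: 7.8
print(trimmed_mean(scores, 0.10))  # drops the 3 and the 10: 8.125
```

Note that with very few ratings `int(len(s) * trim_frac)` is zero and nothing gets trimmed, which is one reason trimming alone doesn't fix rums that only have a handful of ratings.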
I personally like the idea of a minimum number of reviews before a person's scores get weighted into the averages. I think that minimum should be higher than 3; I would say about 8.
I'm not sure I agree that someone with hundreds of reviews should get a higher weighting, because not everyone's tastes are the same. I have over 400 reviews, and I tend to like sweeter rums. But that's not everyone's preference, so I don't think I necessarily deserve a higher weighting.
I think our ratings are normally distributed.
Therefore it would be better to use a percentile, let's say 95%, to reduce the distortion.
We need a sufficiently large number of reviews to use percentiles, though. I have also seen distortion with a small number of ratings; I'm not sure a percentile approach works on such a small number of reviews.
I can guarantee you my first ratings were crap. Completely subjective and uninformed. A few months and many bottles of rum later, I found that my ideas about rum and ratings have changed drastically.
I’ve since gone back to edit most of my ratings to more accurately reflect my experience.
That being said, I like what Paul said about one shot not really being a valid means of rating. Sure, pros can probably be accurate that way, but I don’t really know a rum until I’ve spent some time with it, usually a bottle or two.
In my last reply I suggested that Andy should do like they do in judged sports and remove the extreme ratings, for example the 5% lowest and highest, from the average, which I still think is a good idea.
Another way to reduce the impact of extreme ratings is to drop the average value and instead use the median.
I know there's a lot of ways to do this, but I personally don't like removing 'extreme' ratings, because then the Candela Mamajuana would be a perfect '10' (right now it has seven '10's and one '3'). The reason I don't want to eliminate the 'extreme 3' is because it's from an experienced taster, while all the '10's are from people with only one or two ratings. That's why I suggested requiring a certain minimum number of reviews before your scores get counted.
Again, I realize there's going to be lots of differences of opinion on how best to do this, but I wanted to share my own thoughts.
The special case you refer to already has an average rating of 9.1 after just eight ratings.
This makes at least me check who has rated it, and in this particular case I would not trust any of them as seven of them are quite inexperienced and the eighth and I have diametrically different views on rum.
So I mean it doesn’t matter whether the average or the median value is 9.1 or 10; when the reviewers are so few, such a value must always be checked before using it as a guide.
I think Paul wrote that the average grade can be ignored and that the important thing is to keep track of who made the reviews and that they roughly have the same view of Rum as oneself.
I completely agree with him (at least when the reviewers are very few): if someone wants guidance, it’s more important to look at reviews made by someone who has the same preferences as oneself, if available.
I forgot to tell you that I also agree that reviewers must post a minimum number of reviews before their scores get counted.
Absolutely. I agree that one of the best ways to find a new rum you like is to find a reviewer with similar tastes.
We both agree that reviews by people who have rated only one or two rums shouldn't be counted, so that would give the Candela Mamajuana a score of 3, regardless of whether 'extreme' reviews were counted or not (because it would only have 1 review that was counted).
Again, I think all reviews from experienced reviewers should be counted, because some people simply have different tastes, and I don't think any experienced reviewer's taste should be ignored; other tasters might share it.
I finally looked at all of the reviews for Candela Mamajuana. None of them would encourage me to buy a bottle from way down in Florida. Ironically, the "10" ratings are obviously trying to push their product, but wind up achieving the exact opposite. The only review to be believed on this one is the "3". Without having tried it, I would say that a "5" rating would be closer to the truth. It is less than $30, so that alone leads one to question all of the "10" ratings.
I think Paul’s analysis of “the Candela case” most likely is completely correct.
Using a median value does not help in “the Candela case“, with seven "overbids" and probably one "underbid", because that will give a median of “10”.
Reviewers with few ratings always tend to set too-high ratings, whether out of ignorance or to push up the value for some other reason, but I don’t know if the best way to handle “the Candela case” is to exclude all reviewers with few ratings, which gives a value of “3”.
In order to get a reasonably well-weighted value, I think it should include at least two ratings when possible, even if that means including one rating from someone with too few ratings.
In this case that will give a value of “6.5”, both as average and median.
This is probably a little too high, but maybe better than “3”, and definitely better than “10” or “9.1“.
About using a median value: it doesn’t ignore any value, it only reduces the influence of extreme values.
The reason that I estimated the Candela case as a "5" is not because of mathematics. Instead, the only reliable reviewer in this case always tends to rate rums much lower while adding phrases like "add 2 points" for such and such drinkers.
In my own case, I discovered that I had way too many rated as a "7". Since they are all ranked in order (as best as possible for 333), I wound up changing the ratings from the lowest half of my 7's to all 6's. So now, most of my ratings are a "6" since these reviews help us to avoid buying crap. I now have just as many 7's and 5's. I also have similar amounts for 8 and above as well as for 4 and less. This makes for a much better normal distribution on my part. My overall ranking and price helps me to decide which rums to buy again, which is only 10 percent of all that I have reviewed.
Paul, do you think price is really a valid rating point? If it’s good, it’s good regardless of cost, right?
I do not consider price at all in my ratings, unless it is like my most recent hundred dollar rum that cannot hold a candle to my best ones. Price is always a separate part of my "rebuy equation".
Since I am retired and do not have unlimited funds, I have had to consider price from the very beginning. When I started this rum journey in December of 2017, I had a very complex formula to compute the QPR. It has since been modified to something much simpler: taste, which gives my overall ranking, counts twice as much as price. I even tried multipliers of 2.5 and 3.0 for taste, with price always being 1.0. The 2 to 1 ratio works fine for me. If I go higher than 2.0 for rankings, then some rather expensive rums will wind up on my repeat purchases list. This would also rule out some of the cheaper rums with lower rankings. Even though I can afford some of the expensive rums, many just do not seem like a good buy when all is said and done. It is literally pissing money away. But with no more concerts and no more travel in this pandemic, this expensive rum hobby is all that seems to be left to look forward to until a reliable vaccine is found. Thank goodness for my wonderful dog!
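One plausible reading of that simplified formula, as a hedged Python sketch; how the price gets converted to a 1-10 score isn't stated, so `price_score` here is purely illustrative (10 = bargain, 1 = very expensive):

```python
def rebuy_score(taste_score, price_score, taste_weight=2.0, price_weight=1.0):
    """Blend taste and price (both on a 1-10 scale), with taste
    counting twice as much as price by default."""
    total_weight = taste_weight + price_weight
    return (taste_score * taste_weight + price_score * price_weight) / total_weight

print(rebuy_score(9, 4))  # tasty but pricey: (18 + 4) / 3, about 7.33
print(rebuy_score(6, 9))  # decent and cheap: (12 + 9) / 3 = 7.0
```

Raising `taste_weight` to 2.5 or 3.0, as described above, pulls the expensive high scorers up the rebuy list, which matches the effect Paul observed.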
Paul, I understand exactly why you chose the value 5 in “the Candela case”, because I too have read many of the “reliable” reviewer's reviews (especially about Agricole) that most often end with something like "Agricole fans can easily add two points to my score".
I also have a hard time understanding why someone tastes hundreds of different Agricoles when he so obviously dislikes that drink (except for some of the more expensive ones), and in that way affects the average ratings negatively, but that’s another question.
Paul, I also understand what you mean by the alarm bells ringing when a $30 rum gets seven “10”s, but here the “reliable” reviewer has done a good job of drawing a possibly uncritical future buyer's attention to the fact that something is probably wrong, because we all want to find an exceptionally good rum that’s very cheap.
Another interesting rum is the "Marque Reserve Exxtra Anejo 8-Year". It has 13 reviews with an average score of 9.8. Yet only one reviewer of the 13 has more than 1 review.
The alarm bell rings even louder, even though this rum costs $40-45.
Average rating ZERO until someone credible takes pity on it and rates it.
Normally, I don't trust the reviewer with the 3 rating for the Candela, but something is better than nothing at all. In this case, it is the only reliable review, just enough for me to best estimate it at a 5 rating. This certainly does not justify me driving about 600 miles to try it because they cannot ship spirits to Louisiana.
Hi everyone, thanks again for the input!
I've just rolled out an update so only members with 5 or more ratings are counted in averages. 5 can easily change to 3, 7, or whatever minimum rating number makes the most sense... I thought even a 'fake' rater might rate a few rums quickly, but they're unlikely to rate 5 (from what I've seen). You'll find the https://rumratings.com/brands/6691-candela-mamajuana average is now 3.0 and https://rumratings.com/brands/11295-marque-reserve-exxtra-anejo-8-year is 9.0.
I can appreciate the bell curve idea and taking out extremes, but haven't come up with a good way to accommodate rums with limited ratings. It could be applied to say rums with 20+ total ratings, but that gets messy in the code. Hopefully this min rating solution at least gets us 90% of the way to a much more trustworthy average.
Next up I'm planning a premium feature to sort by minimum rating similar to https://rumratings.com/brands?order_by=average_rating, but instead of minimum total ratings it will be minimum user ratings. I'm also planning the ability to export cabinets and search results, in addition to smarter recommendations based on your individual likes.
Keep the suggestions coming! If anything else you'd like to see changed or added comes to mind, feel free to either post a new discussion to see what others think or ping me at firstname.lastname@example.org :)
Great! A good step to make your fantastic site even better.
I agree. Great update!
I agree too. Thank you @andy
Thanks for all your consideration Andy
Brilliant moves, Andy!!!!!
Hello again, another followup to this thread - a new feature for premium members just rolled out to alter the minimum number of user ratings. So now, instead of needing to rely on the default 5 minimum user ratings you can change this to 10, 20 etc on the Explore page: https://rumratings.com/brands?order_by=quality.
In the future I'm hoping to have a slider here for more granular selections - EG calculate the average from users with between 6-50 ratings.
I found it under User Experience with options of counting averages from users having anywhere from 5 to 50 reviews. I found no tab for order_by=quality. But I do like what I saw! If you were to move the minimum number of reviews down to 1, we might be able to find more of the fake reviews.
I use the wish list shelf in my cabinet to park the bottles that I have not yet tasted or purchased. As for the over under rating discussion, I look at the cabinet of those rating 10s and generally find that if they have only a few ratings, they are just starting out. If they have 50+ ratings and they have some of the same highly rated rums that I have, then the rating is real. They like that rum. Others might not but that is the beauty of this site. Lots of information, opinions and discussion. Everyone has a slightly different palate. Cheers to everyone.
It seems that fake news has found its way into Rum Ratings. I am not going to worry about it because it’s easy to spot. The reviews, ratings and comments on the Rum Ratings site make it the best on the web!
I trust and respect the reviews of my fellow rum raters. I don’t always agree with them, as some prefer dry rums and I prefer sweeter rums; I have both types in my collection, but my favourites are the sweeter rums.
I also like that people can have a different opinion on a rum and aren’t slammed for their review.
I am a new reviewer with a palate "in training", so to speak. I would not mind at all if my first ratings were not counted. I can hardly imagine that I will feel confident or be considered relevant before having tasted at least 30 rums or so. The use of a median makes sense to me too.
As a beginner, I am also likely to wait before I post ratings. I need time to compare and make up my mind. So I can end up posting even 5 ratings at a time.
A great discussion.