You know, and this applies to reviews of any audio gear - reviewers seem to avoid direct comparisons with competing products. Yes, there are exceptions, but most reviews end up as a flowery-language opinion of the product under review. Maybe they need to rethink the method and treat reviews like an athletics meet? i.e. gang up 4 or 5 key products and compare them directly in the SAME system, with the SAME music files, to arrive at a performance score. And throw in some blind tests with friends at a meet. That would give fact-hungry audiophiles data points they could actually use when advancing to a demo or even a future purchase decision.
Without such a review, we're left with a romantic novel about 'music floating in the air' and how it inspired the reviewer to gush over it.
Also, we RARELY see a bad review... wonder why?
Anyway, we have this forum, where we can get real data points by asking owners of said product, or those who demo'ed it against other products in the same setting and at the same time. This, IMO, is genuinely useful information.
To sum up, IMO: read magazine reviews as the start of the story, as entertainment. Then get down to the serious stage afterwards, using other methods.