Another ho-hum subwoofer review at Hometheaterhifi

Ned

Supporting Actor
Joined
Feb 20, 2000
Messages
838
http://www.hometheaterhifi.com/volum...r-10-2004.html

Why am I calling it ho-hum?

The good:

They show distortion/SPL readings for 15, 20, and 25 Hz.


The bad:

They don't give max SPL readings with THD for those individual frequencies. Instead they mix 20, 31.5, and 50 Hz together for a single figure of 120 dB. That could be mostly 50 Hz output with a meagre contribution from the lower bands. Who knows? Useless and misleading.
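
A quick illustration of why a combined-tone figure is hard to interpret: the SPLs of individual tones add on a power basis, so one loud band can dominate the total. The sketch below uses made-up band levels (not the review's actual per-frequency figures) and assumes the tones sum as uncorrelated RMS power.

```python
import math

def combined_spl(levels_db):
    """Total SPL of several simultaneous tones, summed on a power (RMS) basis."""
    return 10 * math.log10(sum(10 ** (level / 10) for level in levels_db))

# Hypothetical per-tone levels for 20, 31.5, and 50 Hz -- illustrative only.
# A strong 50 Hz tone plus much weaker low-frequency tones still "reads" 120 dB.
print(round(combined_spl([108.0, 112.0, 119.0]), 1))  # -> 120.1 dB combined
```

In other words, a sub could post that combined number while contributing relatively little at 20 Hz, which is exactly the ambiguity being complained about here.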

If they (hometheaterhifi) are going to give objective measurements they need to do so consistently among all subwoofer reviews. If you read through their last 10 sub reviews you will see about 10 different methodologies. I like their site but they need to get their act together.
 

paul clipsel

Stunt Coordinator
Joined
May 31, 2004
Messages
163
The reviews over there are so variable that I no longer read them, except perhaps some of the DVD player reviews. I agree they don't have consistent, good reviewers, so the reviews have become meaningless. They started out well enough, but keeping up with commercialism seems to have become a bigger priority than consistency.

PC
 

Kevin C Brown

Senior HTF Member
Joined
Aug 3, 2000
Messages
5,723
I think an Earthquake sub was reviewed in one of Keith Yates' reviews in HT Mag, or S&V, or somewhere. I remember it didn't do so great on distortion results...
 

Ned

Supporting Actor
Joined
Feb 20, 2000
Messages
838
Wayne-

Why should I do this quietly? They are a public website offering information to those interested in home theater. I think it is appropriate to raise issues on a public forum.

For example, would you ever post an SPL result that was an amalgamation of 20, 31, and 50 Hz? What does such a figure tell you? Do you see where such a figure ("120 dB") would be misleading to someone who doesn't fully understand such measurements? After all, a single SVS CS Ultra can only do 112 dB at 20 Hz, but the Earthquake can do "120 dB". There will undoubtedly be people who interpret the numbers that way, hence my complaints.

By the way, my SVS CSi and CS Plus subs are capable of 120 dB when I combine, oh say, 10 Hz and 50 Hz. Impressive, eh?
 

Wayne A. Pflughaupt

Moderator
Premium
Senior HTF Member
Joined
Aug 5, 1999
Messages
6,818
Location
Corpus Christi, TX
Real Name
Wayne
Nothing at all wrong with that. But if the bathrooms at your local Wal-Mart are filthy, what’s the best way to get something done about it?
  1. Kick it around with your buddies when they come over for a game.
  2. Complain to the store manager.

Regards,
Wayne A. Pflughaupt
 

oliverLim

Agent
Joined
Sep 15, 2004
Messages
28
I too have been very disappointed with the variability of their reviews. It just seems like they want to please the manufacturers by fudging their reviews recently.

Sad that it has become so.

Oliver
 

Edward J M

Senior HTF Member
Joined
Sep 22, 2002
Messages
2,031
Aside from the maximum output figure, this review is very similar to any of John's previous subwoofer reviews.

With one exception (that I'm aware of), he tests all his subwoofers indoors. He uses MLS-based software, attempts to gate out room reflections on the FR sweeps, and runs THD at a fixed SPL (100 dB) at several test frequencies. In this respect, you can compare other products he has tested to the Earthquake (with some possible exceptions for varying mic distance - I'd have to check on that to be sure).
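
For readers unfamiliar with the kind of figure Ed describes (THD at a fixed 100 dB SPL at several test frequencies), here is a minimal, hypothetical sketch of the idea: play a sine at the test frequency, capture it at the mic, and compare the energy in the harmonics to the fundamental. This is not the MLS-based software Secrets uses; the sample rate and harmonic levels below are made up for illustration.

```python
import numpy as np

def thd_percent(signal, fs, fundamental_hz, n_harmonics=5):
    """Rough THD estimate: RMS of the first few harmonics relative to the fundamental."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)

    def peak(freq):
        # Magnitude of the FFT bin nearest the target frequency.
        return spectrum[np.argmin(np.abs(freqs - freq))]

    fundamental = peak(fundamental_hz)
    harmonics = [peak(fundamental_hz * k) for k in range(2, n_harmonics + 2)]
    return 100.0 * np.sqrt(sum(h ** 2 for h in harmonics)) / fundamental

# Simulated 20 Hz test tone with 5% second-harmonic and 2% third-harmonic distortion.
fs = 48000
t = np.arange(2 * fs) / fs
tone = (np.sin(2 * np.pi * 20 * t)
        + 0.05 * np.sin(2 * np.pi * 40 * t)
        + 0.02 * np.sin(2 * np.pi * 60 * t))
print(f"THD = {thd_percent(tone, fs, 20):.1f}%")  # about 5.4%
```

In an actual measurement the signal would be the microphone capture with the sub driven to the fixed reference SPL, repeated at each test frequency.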

The max SPL figure using a few combined sines was merely an attempt to convey how loud the sub can play in-room. You might not think the data is terribly useful, but it's just one line in the review, after all.


Do you have direct, irrefutable evidence - evidence that would withstand legal scrutiny in court - that John Johnson has fudged any data in this review? If you don't, your statement amounts to libel. You may not like the presentation style or the methodology of a given review, and that's fine. But that's completely different from stating a reviewer fudged the data when you have no evidence of the same. In my dealings with him thus far, I believe John Johnson to be a man of professional integrity. Unless you have evidence to the contrary (which I KNOW you don't), he deserves the benefit of the doubt. That kind of talk has no place at Home Theater Forum.

Ned, I agree that a consistent approach to testing any product is a good idea, and your point is valid. In reality, that is easier said than done. Any future subwoofer reviews I do (for example) will continue along the methodology outlined in my PB10-ISD review. This is completely different than John's test methodology, and you will probably perceive that as contributing to the problem (even if you prefer one set of test methods over another).

There isn't much John can do about that; the environment outside the Secrets labs is not conducive to ground plane work; there is too much ambient noise and there are too many nearby reflective structures. And unless Secrets lets one person do all the subwoofer reviews using the exact same methodology and equipment, there will continue to be differences in methodology and format.

I can only get one review done about every 6 weeks, and also have to take a hiatus during the winter when the snow flies. John clearly needs to review more subwoofers than that, so I'm sure we'll see more of his review format in the future.

I'm hoping to squeeze in two more sub reviews before the snow flies; I'm done ground planing the current subwoofer I'm evaluating (the Acoustic Visions MRS-10), and that review is being written. Hopefully I'll take receipt of the next sub soon so I can at least get the GP work done before the inclement weather hits. Then I can laugh at the wind-driven snow outside as I type the review by the crackling fireside. ;)

Regards,

Ed
 

oliverLim

Agent
Joined
Sep 15, 2004
Messages
28
Ed,

I never said they did. I said it seemed like it. Why?

If you look at all the sub reviews they have done over the last two years, different measurements seem to have been done in each. Some had max SPL at each frequency. Some had max SPL at less than 10% distortion. Some had no measurements at all! And the list goes on. I know there are many different reviewers and each has their own style and measurement techniques. But you cannot deny that even with the same reviewers, there always seem to be different measurements presented with each review.

If their DVD reviews have become a benchmark, it is only because we can see and know they measure the same things against the same criteria each time. They would not have become such a success if each batch of measurements measured totally different variables.

It is with that in mind that I am so disappointed with their sub reviews. I was hoping for them to do the same, with consistent criteria, so that we can compare them all together. Of course there is the subjective portion, which cannot be compared and contrasted, which I am sure everyone understands.

So I do not understand why my statement that it seemed like they decided to go down this route so as not to offend sub manufacturers is wrong. I am not saying they are. Only they can answer that themselves. But the change from naming and comparing actual model numbers against the sub under review to changing names to Vendor A and Vendor B seems to support that. Perhaps sub manufacturers have a part to play in that, by not allowing such comparisons on the grounds that they infringe on some rights or other. But it would perhaps be good if they explained why they could not be more consistent.

Your first 4 reviews were great, told us a lot, and allowed us to do some direct comparisons. I know you measured outdoors for your latest review, so we could not really directly compare it with your previous reviews. It is still a great review and I know you put a lot of hard work into it. To me (but maybe not to others), it would have been more useful if you had measured it in your room, as that would allow a better comparison. But many others will probably disagree with me.

I hope you see why I said what I said. I do not mean any disrespect to you or to the staff at Secrets, and I do read almost all their reviews the moment they are up. I just felt they have the potential to do even better in my eyes.

Oliver
 

Edward J M

Senior HTF Member
Joined
Sep 22, 2002
Messages
2,031
The word "seems" only modifies the first half of the sentence: "It just seems like they want to please the manufacturers...". You could finish the sentence with "by standing on their heads", and you'll realize what I'm getting at. The words "...by fudging the data" stand alone and are unqualified; that's what I took exception to.

And I'm not showing a sudden allegiance to Secrets because I had one review posted there. I stuck up for John because you said Secrets fudged its subwoofer reviews. I don't think that's true, and you shouldn't say so unless you have proof of that.

No question, your points are valid about the inconsistent format of the reviews. In order to properly compare any products, the test environment needs to be the same every time, either in a shoot-out or from review to review.

In fact, I was among the more vocal posters about the disparity between the B4+ and the DD18 review test methods when people tried to draw performance comparisons between those two products.

For the record, I measured both outside and inside (FR at the seating positions) for the PB12-Ultra/2, STF-2, and PB10-ISD reviews. The PB10-ISD review would have been no different in format or content had it instead been published here at HTF.

Some day, I'd like to get the PB12-ISD and the STF-3 back together (they are both available locally) for a GP session and update the data set in that review, just for consistency with my current methods. Naturally, the subjective portions of the review won't change; after all, updating a data set won't change the subwoofers themselves (unless you live in Bizarro World).

Regards,

Ed
 

oliverLim

Agent
Joined
Sep 15, 2004
Messages
28
I understand where you are coming from. Perhaps "fudge" was the wrong word to use. My bad.

I just hope the Secrets editors see why so many people have been unhappy with their reviews recently. They do not have to please the readers, but readers are ultimately what matters to them in the end.

On another note, when you measure indoors, do you actually have a checklist of what should be where? :)

The reason I asked is that moving items in the room, especially the sofa, and even the position of doors can change a room's response. I learned the hard way when, after a few hours of measuring and BFDing, my FR suddenly changed. I was troubleshooting to see what settings I had changed in my sub and my BFD when I noticed that one of my doors was not closed properly. My wife had walked in and not shut the door completely, and my FR in the 30-40 Hz range dropped by about 5 dB. :)

I would assume other stuff, like the aircon being on or off, or books and mags on the carpeted floor (probably more in the HF), would affect FR somewhat.

As you also use TrueRTA, what averaging number do you use, and why? Also, there is this speed tradeoff option. What does it do, and what is the right setting for sub measurements?

Thanks
Oliver
 
