The live-production industry is very often a world of comparisons: Is this loudspeaker a better choice than some other loudspeaker? Which console is "the best"? Which mic should I buy?
A problem (as I see it) is that many of these comparisons are tackled by highly subjective means, such as listening tests. It has gotten to the point where I find myself irked by the statement "To my ears..." It's not that the experience of others has no value in decision making; it's that the experience of others involves a potentially staggering number of variables that can't be controlled for.
To my way of thinking, a far better basis for comparing audio devices is to measure their performance via repeatable experiments that produce numerical readings. Candidate devices can then be shortlisted using that quantitative information, and later evaluated subjectively by individuals.
The specific purpose of this experiment was ultimately to evaluate how various microphones stack up to the venerable SM58. The other microphones chosen for the first round of testing all have a purchase price below that of Shure's long-time workhorse. As such, an alternative title for this shootout could have been "The Cash-Strapped Audio-Human's Guide To Affordable Mics."
The results of this experiment are shown in the interactive table below. Please be aware that the Shootout has some limitations, which are discussed below.
Longer bars indicate better performance. Blue bars indicate an average mic in that area, green is significantly above average, and orange is significantly below average.
- Click a heading to sort the table by that column; subsequent clicks invert the sort order.
- Click a color bar to "zoom" the number.
- Click a microphone model to add that model to the filter. Filter and unfilter the list with the buttons below.
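For readers curious how "significantly above average" versus "average" might be decided, one common approach is to bucket each score by how many standard deviations it sits from the mean. The sketch below is illustrative only: the 1-sigma threshold and the scores are my assumptions, not the actual rule or data behind the table.

```python
import statistics

def color_buckets(scores, threshold=1.0):
    """Bucket each mic as 'green' (well above average), 'orange' (well
    below average), or 'blue' (roughly average), based on z-score.
    The 1-sigma threshold is an illustrative assumption."""
    mean = statistics.mean(scores.values())
    stdev = statistics.stdev(scores.values())
    buckets = {}
    for model, score in scores.items():
        z = (score - mean) / stdev
        if z >= threshold:
            buckets[model] = "green"
        elif z <= -threshold:
            buckets[model] = "orange"
        else:
            buckets[model] = "blue"
    return buckets

# Hypothetical scores for one category (not real measurements):
scores = {"Mic A": 10, "Mic B": 12, "Mic C": 11, "Mic D": 18, "Mic E": 4}
print(color_buckets(scores))
```

With these made-up numbers, "Mic D" lands well above the mean and "Mic E" well below it, while the rest stay in the average band.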
| Manufacturer | Model | Condition | Price | Polar Response | Proximity Effect | Mechanical Noise | Wind Noise | Feedback Resistance | Cupping Resistance |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
The mics in this study were tested according to the methods described in this article.
It should be noted that there are two areas of uncertainty in the measurements presented above, with one area being particularly questionable.
During testing, it was difficult to get repeatable results for mechanical noise. As it turns out, dropping an object onto a microphone (even from a repeatable distance) suffers from unpredictability in exactly where the object will strike the microphone under test. Compounding this, a mic's susceptibility to mechanical noise varies at different points along its construction. As such, the overall fairness of the test is questionable; a reasonable guess is that the worst performers are actually significantly better than the results indicate. Also, some of the mics tested are billed specifically as "instrument" mics, meaning that their internal shockmounting against handling noise is probably much reduced.
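One simple way to put a number on that trial-to-trial unpredictability is the relative spread (standard deviation as a fraction of the mean) across repeated drops. The sketch below uses made-up readings; the function and figures are illustrative, not taken from the actual test rig.

```python
import statistics

def relative_spread(readings):
    """Return the sample standard deviation as a fraction of the mean:
    a rough figure of merit for trial-to-trial repeatability."""
    return statistics.stdev(readings) / statistics.mean(readings)

# Hypothetical peak levels from five drops onto the same mic:
trials = [96.0, 104.0, 99.5, 108.0, 93.5]
spread = relative_spread(trials)
print(f"relative spread: {spread:.1%}")
# A large spread would flag the drop test as unfair to that mic.
```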
The wind noise measurement seemed relatively consistent and fair overall, but the test was by no means an accurate simulation of a plosive vocal sound.
An overall consideration is that, in this study as a whole, complex behaviors have been reduced to a single number. This is necessary for simplicity of comparison, but unavoidably fails to produce a complete picture of any single microphone's performance - especially under different sets of real-world conditions. As a follow-on to that point, it's important to recognize that "bench tests" like this don't account for many real-life factors. For instance, the feedback performance of the mics was tested with one mic routed to a single monitor speaker. This, of course, tells you very little about what happens when that same mic is routed to many monitors across a stage. Further, a particular "cupping peak" for a mic might cause me enormous trouble with my system tuning, but be just fine for you.
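The information loss from single-number reduction is easy to demonstrate. The toy example below collapses two hypothetical (invented, not measured) response-deviation curves into one "worst peak" figure: both score identically, yet one is a broad bump and the other a narrow spike, which would behave very differently during system tuning.

```python
def worst_peak_db(response_deviation_db):
    """Collapse a whole deviation curve (dB per frequency band) into a
    single number: the largest boost. Very different curves can tie."""
    return max(response_deviation_db)

# Two hypothetical cupping responses with identical single-number scores:
smooth_bump = [0, 1, 2, 3, 2, 1, 0]
narrow_spike = [0, 0, 0, 3, 0, 0, 0]
print(worst_peak_db(smooth_bump), worst_peak_db(narrow_spike))
```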
For these reasons, I made the suggestion I did in the introduction: this study is useful as a way to "shortlist" purchase contenders, but subjective testing specific to your particular situation is necessary for making a final decision. Please remember that this study tells you nothing about durability, tonality, or whether a mic will be accepted by the musicians you're working with!
In my interpretation, this study reveals why the SM58 remains an industry standard: It is very strong at being an all-around performer. A brand-new SM58 would seem to be a mic that can be counted on to have very good proximity effect resistance and feedback characteristics, and to be quite acceptable in all other areas. The only other mics fitting the same overall pattern (exceptionally good in two areas, with average performance in all others) were the DRV 200 and the Pro 61.
There appears to be surprisingly high value-for-money available at the inexpensive end of the spectrum. A DRV100 can be had for less than $20 per unit, yet manages to rate as average in all areas. A D38 falls short in proximity effect and cupping resistance, but acceptability of those parameters can be very subjective, whereas mechanical noise and feedback are areas of high performance. The D8000M also distinguished itself as displaying standout behavior in feedback resistance, while being solidly average everywhere else.
This study was made possible, in part, by my Patreon supporters. If this was useful to you and you'd like me to do more of this kind of thing, please support my efforts. I do work in various areas of interest, not just audio.