INFORMATION TECHNOLOGY IS COMPLICATED. It’s a simple fact you deal with
every day, just as we do in our labs, and vendors do in creating their products.
When NETWORK COMPUTING sets out to compare products, we understand the
task isn’t trivial and, for the vendors involved, the stakes are high. If products
do poorly in our tests, we know and the vendors know they’ll spend the year
explaining the results to their customers. The stakes are high for us, too. If we
ask the wrong questions or report misleading or incorrect test results, we’ll
lose your trust—and that’s NETWORK COMPUTING’s most valuable asset.
When we invited Cisco Systems and Meru Networks to a head-to-head
shoot-out of WLAN gear, we knew the results would be contentious. We knew
Meru claims superior performance based on a novel approach. We also knew it’s
rare for Meru to participate in such tests, so we were eager to get it right.
For every review we perform, planning starts months before the issue
date. We invite the vendors and share our test plan. We then create the test
bed and the vendors ship us their products. We welcome the participants
to our labs, so that vendor reps can educate us about the products and ver-
ify that we’ve set them up correctly. The reps can even tune their products’
performance in our environment. Once everyone is satisfied with the test
environment, the participants leave and we perform the tests.
Once the tests are complete, we share our results with the vendors and ask
them to explain any unusual results. In this case, Cisco attributed those results
in part to standards violations by the Meru gear. As our authors
worked to verify these claims and to understand how each vendor’s product
does what it does, the story became less about the results and more about the
difficulty we had in arriving at a reasonable explanation for the results.
Although Meru strongly rejects any claims that it violates IEEE 802.11
standards, it will go only so far in explaining how its products work.
Since Meru so strongly disagre