Introduction

As will be shown, the SSD benchmark testing “lies” are more ones of omission than of commission. But despite honorable intentions, any result must be considered a “lie” when it provides meaningless or, worse yet, misleading information. The underlying cause of these lies is a familiar one: accurate testing takes time. So shortcuts are taken. Important procedures get skipped. Key considerations are ignored. Results are read before performance stabilizes. Important tests are run without proper preparation, or not run at all.

The situation with performance testing of solid state drives is not unlike what occurred when the U.S. Environmental Protection Agency (EPA) introduced its gasoline mileage rating system in the 1970s. The test simulations were not representative of the way most people drive, so the results were notoriously high. Virtually no one got the highway or city mileage “determined” by the testing, and while regulators and auto manufacturers alike acknowledged the problem, they largely ignored it. Subsequent enhancements have dramatically improved the accuracy of the results, but the changes were slow in coming. For example, highway speed limits increased from 55 to 65 MPH in 1987, but the EPA tests did not take this into account until the 2008 model year, 21 years later!

The “your mileage may vary” caveat applies equally to SSD benchmark testing today. Some testing is robust, and the results accurately predict the performance that can be expected in the real world. All too often, though, the results are far off the mark. Here are the three key factors that determine whether SSD benchmark testing…