In an interview with the researcher who actually conducted the study in 1993, it turns out that yes, he did find that listening to Mozart was associated with better spatial ability. What's the catch? First, the effect lasted 10-15 minutes. That's right, listening to Mozart was associated with better spatial abilities for about 10 minutes afterward. So, unless you plan to have your child listen to Mozart right before an IQ test, it's probably not going to make a huge difference.
Second, the study was conducted with college students. Like many psychology studies run in academic settings, it relied on a convenience sample of college students. Reporters and the public assumed the finding applied to young children as well, but who knows? Young children were not the age group studied.
So how does a modest research finding get distorted into a media frenzy and fuel the marketing of hundreds of baby Mozart CDs? The original researcher reported the findings honestly in an academic journal. He did not exaggerate the results. It was simply a matter of the media taking a modest finding and running with it. Apparently, the study's author was misquoted several times, and before you know it, "The Mozart Effect" was born.
Besides being an interesting case in itself, this situation is a perfect example of how research often gets misinterpreted in the media. We, as humans, and especially the media, love sound bites. We love it when stories are easy, clear-cut, black-and-white. We love it when something as simple as music seems to be the parenting remedy that will ensure that our children are intelligent. Unfortunately, child development research is rarely clear-cut or black-and-white. There are usually shades of grey and nuances in the findings that make them complicated and sometimes difficult to understand. After all, we are studying human behavior, and we humans (especially children) are notoriously difficult to study, even using the best scientific methods we have. Some reporters simply don't take the time, or don't have the training, to understand these complexities. All this is to say: you might want to be cautious the next time you read about a research finding in the news. If something seems too good to be true, it probably is.