Monday, August 24, 2009

Ruminations of an Old Goat

I found an interesting article while waiting for a database upgrade to complete at work. It reports on a study by the Centers for Disease Control and Prevention concerning video game players. The study claims the average video gamer is 35, overweight, introverted, and prone to depression.

There is definitely a temptation to simply respond, "Duh!" But is a stereotype from the '80s really accurate?

The article doesn't go into too much depth about the study, but I was able to dig up more information by Googling it. That was revealing, all by itself. The study was conducted solely in the Seattle-Tacoma area of Washington state with a sample size of 552 adults, aged 19 to 90. Slightly less than half of the participants were found to be video gamers. Fifty-six percent of the gamers were men. The men were found to be overweight more often and the women more prone to depression and overall poorer health. Both sexes were more likely to socialize digitally rather than in person.
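To get a sense of how small the subgroups in this study really are, here's a quick back-of-the-envelope calculation. (The 45% figure comes from one of the articles; treating "slightly less than half" as exactly 45% is my assumption.)

```python
# Rough subgroup sizes from the reported figures.
# Assumes "slightly less than half" means roughly 45% of participants.
participants = 552
gamer_share = 0.45   # fraction identified as video gamers
male_share = 0.56    # fraction of gamers who were men

gamers = round(participants * gamer_share)  # about 248 gamers
men = round(gamers * male_share)            # about 139 male gamers
women = gamers - men                        # about 109 female gamers

print(gamers, men, women)  # 248 139 109
```

So the claims about female gamers' health rest on roughly a hundred people from a single metropolitan area.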

I expect most of us know at least one person who fits the profile from the study. Stereotypes come from somewhere, after all. But just how much attention should be paid to this study? Let's consider some things that crossed my mind while reading about the study.

First, 552 people make for a very small sample size. A study that attempts to draw broad conclusions about the population in general requires a much larger sample before the results can be considered reasonable. I'll admit that some studies can be quite accurate with a sample size of 500 or fewer. In those cases, though, the study is very focused and the sample represents a reasonable percentage of those affected. An example would be a study concerning a rare disease, or people with a specific disease who had all received the same treatment. As video-game-playing adults in the U.S. likely number more than fifty million, the sample size of this study is far too small to support conclusions about the wider population.

Even if the sample size were larger, the geographic area is too limited. Whose brilliant idea was it to limit the study to Seattle-Tacoma? That area is one of the high tech centers in the U.S., similar to the Research Triangle area in which I live. I have no idea how many people in Seattle are involved in the IT industry, but most of the people I run into around Raleigh seem to earn a living in IT. By its very nature, IT draws more than its share of introverts who never were particularly good at athletics. And, of course, people who make their living in IT are far more likely to play video games. Heck, video games are probably what drew them into IT in the first place. Seattle-Tacoma also has the highest level of Internet usage of any metropolitan area in the country. So any sampling from Seattle-Tacoma is almost certainly going to skew high for sedentary introverts.

While I can't access the study itself (it's on a subscription-only site), none of the articles define what constitutes a video game player for the study. This is particularly important, as the study has to have some way to differentiate between someone who plays a game of solitaire or Minesweeper every now and then and someone who plays video games regularly and often. The percentage of video game players in the sample population -- stated at 45% -- is too small to include all people who play games on their computer.

So how much game playing does the CDC consider necessary to call someone a video gamer? One hour per day seems much too low. Two hours per day seems closer but still a bit short, in my opinion. If I had been the person asked, I'd have said a person would have to spend three or more hours per day playing video games to qualify as a video gamer. But I would guess that fewer than 45% of the population plays video games for three or more hours per day, even in Seattle-Tacoma. So let's go with two or more hours per day.

Think about just how much spare time you have each day. During the week, you spend a lot of time getting ready for work, commuting to and from work, and working. Throw in an hour for lunch and you're looking at a minimum of 10 hours per day associated with work. (For stay-at-home mothers, the numbers are probably higher.) Toss out another hour for dinner and you're up to 11 of 16 waking hours already used up. That leaves these people only five hours per weekday to get in their video game fix and to do any chores -- shopping, laundry, whatever -- that require their attention. Assuming my guess at the study's definition of a video gamer is correct, the video gamer is almost guaranteed to be sedentary, simply because there isn't enough time in the day to do what is needed to survive, play video games, and still get exercise or have a social life.
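The weekday arithmetic above can be laid out explicitly. This is a rough sketch; the hour figures are the estimates from the paragraph (plus an assumed 8 hours of sleep), not data from the study:

```python
# Rough weekday time budget for a working adult, per the estimates above.
waking_hours = 16   # assumes 8 hours of sleep
work_related = 10   # getting ready, commuting, working, lunch
dinner = 1

free_hours = waking_hours - work_related - dinner
print(free_hours)   # 5 hours left for gaming, chores, and everything else

gaming = 2          # the guessed two-hour threshold for "video gamer"
remaining = free_hours - gaming
print(remaining)    # 3 hours for chores, exercise, and socializing
```

Three leftover hours a day doesn't leave much room for the gym, which is the point: the sedentary result may be baked into the definition.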

Finally, the study claims video gamers are more prone to depression than non-video gamers. While the study doesn't claim video games cause depression, the headlines and, most likely, the publicity release for the study will cause the casual reader to come to that conclusion. The study's authors can state, honestly, that they never claimed that video games cause depression. But with so many kids playing video games these days, there's bound to be a nice, big grant available if enough casual readers start worrying about how video games "cause" depression. That certainly seems to be how "science" is done these days.

I could be wrong, but this is the kind of "study" that attracts lots of attention on TV and in the newspapers. It's the kind of "study" that leads to all sorts of ideas, most of them wrong, getting into the minds of the public. Once those ideas take root, future studies that disprove those ideas will be dismissed or ignored because they contradict what "everyone knows."

In the world of science there is real science and there is junk science. The best way to tell the two apart is by the number of news articles you see in the popular press concerning a study. I suspect there is an inverse relationship between the number of articles and the validity of the science. If I'm right about that, this study is junk, not science.