The Looming Credibility Of Game-Based Learning by Terry Heick, TeachThought.com
Fire and ice.
Texting and driving.
Video games and learning.
That's the commonly-held perception anyway--and it's powerful. For decades now, the connotation of video games has been skewed juvenile. It didn't help that they were thrust into the public consciousness by kid-friendly arcades and Mario back in the 1980s.
First impressions being what they are, here in (almost) 2014 educators still struggle to articulate the possibility of video games--in part because we don't fully understand their extraordinary potential ourselves.
Instead, we bang along, trying to mash nuanced on-screen digital avatars together with often-dated curriculum standards that barely hint at what players have been doing in video games for years.
But in many ways that's changing, thanks in large part to mobile games and the dynamic technology embedded in new consoles like the Xbox One. Improved voice and motion recognition from Kinect, digital media integration, and diverse gaming experiences ranging from indie titles to AAA blockbusters are all evidence of progress within the world of video games.
More important, however, is the changing definition of a video game. Gamification is a term you've heard a lot over the last couple of years, and it's only becoming more ubiquitous with improving mobile hardware, notification alerts, and social media integration.
In short, just as you one day won't be able to tell the difference between a television and a computer monitor, separating games from simulations from sentient apps from a simple shopping experience will also be difficult.
There will be growing pains here, and certainly some trade-offs we need to be aware of as a culture. But the end result may be that we no longer have to justify teaching design with Minecraft, or tone and mood with Fallout 3.
Because the students will likely have already created video games of their own to teach us.