
April 5, 2024

https://www.gcu.edu/blog/teaching-school-administration/how-using-technology-teaching-affects-classrooms
https://drexel.edu/soe/resources/student-teaching/advice/how-to-use-technology-in-the-classroom/
Topic Choice: 
The Role of Technology in Education: Examine the impact of technology on education. Discuss whether technology enhances or hinders the learning experience for students. 
These two texts are from a source that requires a login; the text is reproduced below.

From “Candy Crush” to “Call of Duty,” some 150 million Americans play video games, including all but a small fraction of children. The global spread of technology and migration of video games to mobile devices have helped propel the industry to record sales — $61 billion worldwide in 2015. Among the biggest converts to video games are educators, who are using them to teach such subjects as history, geography, science and math and to hone students’ critical-thinking abilities. Meanwhile, developers are creating games to improve attention skills in children with ADHD, delay cognitive decline in adults, help recovering stroke victims and improve corporate customer service. But questions remain about video games’ effectiveness at enhancing learning and cognition and whether games promote addiction or shorten attention spans. And while half of players are women, critics say gaming culture discourages female participation, a problem that could have ramifications for education as games become more prominent in schools.
A fifth-grader plays “Minecraft” to help him understand what he reads in “The Hobbit” at Quest Academy in Palatine, Ill. Many educators are using video games to help teach history, science and other subjects as well as to sharpen students’ critical-thinking skills and improve attention skills.
(Getty Images/The Chicago Tribune/Chuck Berman)
Overview
Elizabeth Box’s seventh-grade civics students in rural Okeechobee, Fla., were struggling, but not many seemed too bothered by failing grades or poor understanding of the concepts.
Parents didn’t seem to be concerned, either. Many in the small, lower-income city had not completed secondary education, and Spanish, not English, was the primary language for a good number of them.
“We had a real problem in our community with general apathy towards academics,” says Box.
Box searched for new ways to engage her students, eventually creating a video game involving an apocalyptic future world in which the United States has been mostly destroyed and a dictatorship has taken over. The players discover ancient documents — including the Constitution — and are tasked with various missions to rebuild the U.S. government.
Box’s “Give Me Liberty” was a hit. The students, who already were playing games on their smartphones, jumped at the opportunity to compete against themselves and their classmates.
However, state exam scores for her class have been mixed: 65 percent passed in the 2013–14 school year — the first year the game was used — but only 48 percent in 2014–15, according to Box, as students still wrestled with civics concepts.
Despite the mediocre scores, student engagement has improved markedly, she says. “They love it because they aren’t being forced to move forward when they don’t understand,” Box says. Instead, they can just keep trying the missions until they get it right. “I’ve seen kids not just working in the classroom but going home at night and pursuing the work there,” she says. “If my way of doing things encourages them to enjoy learning and take pride in their work, then that is what I consider success.”
Nearly half of the nation’s population plays video games, and the games increasingly are being played on smartphones and other mobile devices. As more educators adapt to gaming, questions remain about games’ effectiveness at enhancing learning and cognition, and whether games promote addiction or shorten attention spans.
(Getty Images/Bloomberg/Simon Dawson)
Box is leveraging two powerful trends to engage her students — recreational gaming and “gamification,” or the use of games by educators and companies to turn work into a game that engages players by having them earn points or gifts. Educators use such games to help teach humanities, science and math. So-called serious games are also used for training and simulation and to improve health or manage diseases and even explore subjects such as climate change, cancer and moral dilemmas.
Gamification is on the rise. According to a Pew Internet & American Life Project survey, half of the experts surveyed predicted that gamification would become a major factor in education, health care and the workplace by 2020.1 Students play digital games at least once a week in their classrooms, according to more than half of the teachers surveyed in 2013 by the Joan Ganz Cooney Center, the nonprofit organization that developed the “Sesame Street” television show.2
Recreational video games are a bigger phenomenon than gamification. The global spread of technology and the migration of video games to mobile devices helped propel the industry to record sales of $61 billion worldwide in 2015.3 Americans alone spent $22.4 billion on games, hardware and accessories in 2014, according to the Entertainment Software Association (ESA). More than 150 million Americans — nearly half the population — play video games on a personal computer (PC), TV, game console or portable device, with 42 percent playing at least three hours a week, ESA said.4 About 10 percent of Americans identify themselves as “gamers,” according to Pew.5
Overall, men and women play at equal rates, but differences emerge by age: men ages 18-29 and women over 50 each play more than their counterparts of the opposite gender, said Pew.6 And 97 percent of children play computer and video games.7
The term “video games” encompasses best-selling PC-based games (“Minecraft”), social media-embedded games (“Candy Crush”), smartphone apps (“Angry Birds”), console-based action games (“Grand Theft Auto”) and massively multiplayer online games (“World of Warcraft”). Games also can be streamed on YouTube, and enthusiasts can watch others play games on websites such as Twitch.
“What was once considered a weird hobby of computer nerds or a part of youth culture in the golden age of arcade games has become a leading entertainment sector of the mainstream culture,” wrote Rachel Kowert, a research psychologist on the board of the Finland-based Digital Games Research Association, and Thorsten Quandt, a professor of communication studies at the University of Münster, Germany, in their 2016 book, The Video Game Debate.8
Now several foundations — including the Cooney Center, the Bill and Melinda Gates Foundation, the MacArthur Foundation and the Robert Wood Johnson Foundation — are pouring millions of dollars into figuring out how to use video games for education. And the National Science Foundation (NSF), an independent federal agency that promotes scientific research, has provided millions of dollars in grants for game development.9 The U.S. Department of Education is also supporting the development of educational games, and President Obama in 2011 called for investments in digital education technologies, including games.10
“Ten years ago, we had a lot of questions about whether you could get anything serious out of a game,” says Chris Hoadley, an NSF program director. Now, he says, it’s clear that “serious games can enhance not only acquisition of facts or specific onscreen skills but also some of the more fuzzy, squishy, 21st-century things like leadership, teamwork and agency” — the sense that they are making their own free choices.
“Games give children autonomy and agency, helping them design their own solutions, collaborate with friends and create natural ’affinity groups’ that help bring learning alive outside the classroom,” wrote journalist Greg Toppo, in The Game Believes in You.11 “For the skills-and-assessment type, games scratch an equally essential itch: They frontload massive amounts of content, offer focused and efficient drill-and-practice, build on prior knowledge, strengthen grit and, at the end of the day, deliver a personalized performance data stream that would make the most hard-assed psychometrician smile,” said Toppo, referring to someone who measures the psychological attributes, skills and abilities needed to work in a particular job or profession.
“Video games clearly offer opportunities for learning,” says Marcia Linn, a professor of development and cognition at the University of California, Berkeley. However, she says, “not all games help people learn.”
Ron Smith, a program coordinator at Helen Bernstein High School in Los Angeles, agreed. “The ’game theory’ school is overblown,” he told the Pew Research Center in 2010. “I am not convinced that there is a correlation between gaming and academic success in low-achieving students, and high achievers will thrive anywhere.”12
Meanwhile, decades-long debates continue about the potential harmful effects of games, including whether some contribute to violent behavior and whether gaming might lead to social isolation or addiction to gaming. Growing screen time is leading to worries about shortening the attention spans of developing brains. Thirteen-to-18-year-olds consume an average of nine hours of digital media a day, and 8-to-12-year-olds six hours — not counting time spent doing school or homework online, reported Common Sense Media, a San Francisco nonprofit that promotes safe technology and media use for children.13
“Children now spend more time with digital media than with any other single influence,” said the American Academy of Pediatrics in October 2015.14 The organization is expected to issue new recommendations by November on managing children’s digital media consumption.
Others worry that as educators adopt games as a teaching tool, lack of access to broadband, Wi-Fi or computers could put rural or lower-income schools at a disadvantage, exacerbating the so-called digital divide.15
Women have fought for recognition and acceptance in game design and game play and have become increasingly vocal about wanting equal consideration. In 2014, the growing public debate about the lack of diversity in gaming triggered a digital culture war that included online harassment and death threats against those — mostly women — who spoke out. The “Gamergate” controversy has yet to subside.16
Some educators and academics say the exclusion of women and minorities, including LGBT people, and the lack of female and minority characters could affect the learning sphere. “The cultural pressures that are preventing women from getting into these games are hurting our country’s ability to be competitive in science, technology, engineering and math fields,” says Rabindra Ratan, an assistant professor in the Department of Media and Information at Michigan State University.
As research about potential benefits or downsides of video games evolves, here are some issues under debate:
Do video games help people learn?
For years, video game enthusiasts have said that well-designed games can impart learning through play alone. Games are especially good at motivation due to their reinforcing reward systems, they say. But the debate on whether — and how — games can enhance learning is far from finished.
Well-designed games contain good principles for learning, according to James Paul Gee, author of What Video Games Have to Teach Us About Learning and Literacy. The professor of literacy studies at Arizona State University said such games:
Offer players strong identities;
Make players think like scientists;
Lower the consequences of failure;
Enable players to practice challenges until they get it right; and
Encourage players to think about relationships, instead of just isolated facts and events.17
What people learn in video games is not always good, but it is often good learning, Gee said. Good learning comes from “social and interactional systems within which a powerful technology like video games is placed, not from the game all by itself,” he said.18
“A crappy game is not going to be good for learning, but neither is a crappy book,” agrees Constance Steinkuehler, a Gee disciple and co-director of the Games+Learning+Society (GLS) center at the Wisconsin Institute of Discovery at the University of Wisconsin, Madison.
Games for learning need to be “beautifully built, engaging and with child development in mind,” Steinkuehler says.
Douglas Clark, an associate professor in the Department of Teaching and Learning at Vanderbilt University’s Peabody College, said the important question is “how games can support learning.”19 In a meta-analysis of studies on games and learning published from 2000 to 2012, Clark and his colleagues found that game-playing students outperformed nonplayers “in terms of cognitive, intrapersonal and interpersonal learning outcomes.” But they said they would “argue against simplistic quotations of findings suggesting that games universally outperform non-game learning approaches.”20
Studies showing that games can improve learning are abundant. Another meta-analysis reported in 2013 in an American Psychological Association journal found many benefits, including improvements in spatial navigation, reasoning, memory and perception.21
Games designed for education, such as “River City,” are “not only engaging but also help learners acquire deep science-inquiry skills and conceptual knowledge,” said the National Science Foundation-funded Center for Innovative Research in Cyber-Learning.22 In the game, middle school students try to figure out why so many residents of a 19th-century river town have become ill.
But most game developers are still struggling to find the right formula for learning, said John L. Sherry, an associate professor in the Department of Communication and the Cognitive Science Program at Michigan State University. There is no game equivalent to “Sesame Street” despite decades of effort, he said. “We have a large and growing catalog of games that are pedagogically or scientifically sound but lack fun, or games that are fun to play but lack necessary content or pedagogy,” he said. Sherry also said learning studies are riddled with flaws, and that results are too general to have much meaning.23
Richard Mayer, a professor of psychology at the University of California, Santa Barbara, agrees that good evidence is hard to find. “In contrast to the claims made by game advocates, researchers are charged with the much less glamourous task of testing the claims in rigorous scientific studies,” he wrote.24
“There’s some evidence in some cases that people can learn from educational games, but they have to be designed based on sound instructional design principles,” Mayer says.
Brain training games have received extra scrutiny. In October 2014, scientists from the Stanford Center on Longevity, the Max Planck Institute for Human Development in Berlin and other institutions warned that the games were merely exploiting older people’s fear of losing their memory.25
“To date, there is little evidence that playing brain games improves underlying broad cognitive abilities, or that it enables one to better navigate a complex realm of everyday life,” the scientists said. The statement elicited anger from more than 100 researchers who wrote to the Stanford group and said they strongly disagreed.26
The Federal Trade Commission has taken a dim view of brain training. In January 2016, Lumos Labs agreed to pay a $2 million fine for making what the federal agency said were deceptive and unfounded claims that its Lumosity games improve memory and reduce cognitive decline.27
Pediatricians also have been dubious about video games, especially when it comes to toddlers, whose developing brains may be more susceptible to benefits and harms.
“There is hope that these media might provide a new platform for learning for young children,” says David Hill, chairman of the Council on Communications and Media at the American Academy of Pediatrics. But, he notes, “the data we have available at this time do not point in that direction.”
Do video games have harmful effects?
Video games frequently induce what psychologists call a “moral panic” — when public fears and government interventions exceed the actual threat posed — primarily over the violent or sexually explicit content of some games, according to authors Kowert and Quandt.28
“Mortal Kombat” — in which players are encouraged to maim and kill opponents in bloody hand-to-hand combat — triggered a public backlash when it was introduced in the early 1990s. In early versions of the game, a player rips off his opponent’s head and pulls out his spine, part of a signature “fatality” finishing move. A 1993 Senate hearing on “Mortal Kombat” and other violent video games helped persuade the industry to rate most games sold in the United States and Canada. Game makers now voluntarily label games “Teen” (for 13 and up), “Mature” (for 17 and up) or “Adults Only” (18 and over; content may include intense violence, graphic sex and/or gambling with real currency).29
Bloody first-person “shooter” games like “Doom” — reportedly played obsessively by the two teens who killed 13 people at a Columbine, Colo., high school in 1999 — also caused an uproar. In 2012, a devotee of the bloody war game “Call of Duty” killed 26 people, including 20 children, at Sandy Hook Elementary School in Newtown, Conn. Such mass shootings prompted calls to restrict or ban violent video game sales. But since the mid-2000s various courts — including the U.S. Supreme Court — have ruled in 13 cases that video games are protected speech.30
Characters from “Assassin’s Creed” are presented at the annual Electronic Entertainment Expo, or E3, last June in Los Angeles. Gaming is more popular among older women than older men. In the over-age-50 bracket, 38 percent of women play games compared with 29 percent of men. Older women generally play more “casual” games, including “Candy Crush,” “Just Dance” and “The Sims,” while men favor action and shoot-’em-ups like “Creed.”
(Getty Images/David McNew)
Aggression has been an ongoing concern. Some of the most commercially successful games — such as “Call of Duty” and “Grand Theft Auto” — are graphically violent. It’s a hotly debated topic, with many observers vehemently denying a link between gaming and violence and others equally convinced there is a cause and effect.
The Pew Research Center found that 40 percent of Americans think violent games induce violence in players, while 53 percent disagreed. About a third of game players agreed there was a connection between games and violence.31
The American Psychological Association has said playing violent games is linked to increased aggression but stopped short of saying that play leads directly to violence.32 “To date, there is very limited research addressing whether violent video games cause people to commit acts of criminal violence,” said Mark Appelbaum, chair of an association task force that reviewed studies published between 2005 and 2013.33
The American Academy of Pediatrics has concluded that media violence “is a risk factor for aggressive behavior.”34 But the manifestation of that aggression depends on many factors, including “the child’s home life, self-esteem, support network, health and temperament,” the group said.35
Hill, of the academy’s Council on Communications and Media, says “there are truly overwhelming data supporting a strong correlation between aggressive behavior and aggressive thought patterns and violent media consumption, including violent video games.” If video games indeed do teach, “it’s logical to accept the premise that they’re good at whatever they’re designed to teach” — including violent behavior, he says.
Chris Ferguson, who has been researching games and violence since 2004, says there are only “small correlations” between video games and aggression, and that “correlation doesn’t mean causation.” Ferguson, co-chair of the Department of Psychology at Stetson University in DeLand, Fla., sees inherent biases in how video games are studied and in how doctors perceive games because of what he says is a lack of familiarity with gaming and preconceived notions about harmful effects. Ferguson found in a 2015 survey that older researchers and doctors were more likely to view games negatively.36
Hill, however, likens Ferguson to a climate change denier, saying, “I remain skeptical of his methodology, and I do not find what he’s published convincing.”
Technologies “do not have any effects, good or bad, all by themselves,” said Arizona State University’s Gee. What matters most is how games are used and in what context, he said.37 For instance, children raised in violent or abusive households may use violent games as an outlet for their anger, he said.
“Most of the studies that say games are harmful don’t hold up in terms of their validity,” says Scot Osterweil, creative director of the MIT Education Arcade, which funds game development and research. Like Gee, he says games alone should not be blamed for addiction or antisocial or obsessive behavior, but that different types of media are frequently singled out. “When I was a kid, parents worried about kids who read all the time because that was not a well-socialized kid,” he says.
Addiction appears to be a problem in less than 10 percent of players, but experts have not formed a consensus on how to define or diagnose the condition, often known as Internet gaming disorder.38 Before publishing its most recent edition of the authoritative Diagnostic and Statistical Manual of Mental Disorders in 2013, the American Psychiatric Association said more study was required before gaming addiction could be deemed an official condition.39 Current criteria for the disorder include preoccupation with Internet games, withdrawal symptoms when gaming is taken away and continued excessive use of games despite knowledge of psychosocial problems.40
Some have suggested that gaming addiction is a compulsive behavior driven by personality traits, such as introversion, neuroticism and low emotional intelligence.41 Another study found that extended game play may cause hyperconnectivity in the brain — an overload of communications connections — that may feed underlying psychiatric issues.42
Douglas Gentile, associate professor of psychology at Iowa State University, said the ready availability of games fuels what he thinks is a growing addiction problem. Gentile has found other potentially harmful effects, such as attention problems, stemming from both excessive television viewing and video game exposure.43
Pediatricians worry about other harms. “Language acquisition is impaired in young children who watch a lot of passive screen media,” says Hill. In very young children, “we know that face-to-face interaction is best,” he says, adding, “To the extent that these devices distract from face-to-face interaction it’s likely it’s robbing children of the best opportunity to learn.”
A 2015 study found that 10-to-16-month-olds who played with electronic toys used fewer adult words and vocalized less than a comparison group that used traditional toys and books.44
Is there a diversity gap in gaming?
Are video games for straight, white men only? That question emerged in a fiery debate in 2014 that started off ostensibly as an online discussion about ethics in gaming journalism and quickly grew into a mostly misogynistic backlash against demands for more diversity in gaming.
“On one side are independent game-makers and critics, many of them women, who advocate for greater inclusion in gaming,” wrote Washington Post technology columnist Caitlin Dewey. “On the other side of the equation are a motley alliance of vitriolic naysayers: misogynists, anti-feminists, trolls, people convinced they’re being manipulated by a left-leaning and/or corrupt press, and traditionalists who just don’t want their games to change.”45
Dewey was describing “Gamergate,” the name given to the hate storm — punctuated by threats of death or rape — that erupted against critics, particularly women, who asserted that females are underrepresented as developers and executives in the gaming industry and characterized in games primarily as hypersexualized bimbos or targets for violence.
The issues raised by Gamergate may not be familiar to the general public. Pew found that more than 40 percent of Americans were unsure whether women and minorities were portrayed poorly in video games. The group found that blacks, whites and Hispanics play video games at about the same rates, but that Hispanics are more likely to identify as gamers.46
The games themselves in 2015 were slightly more diverse, featuring more female leads — and more who weren’t amply-busted damsels in distress — but the number of playable minority characters remained “pitiful,” according to the tech website Mashable.47
Academics, game players and educators have long debated whether gaming has a diversity gap. The Gamergate controversy, which continued into 2015, highlighted long-running complaints that gaming culture is not only misogynistic but frequently racist and homophobic, and that the diversity gap could have long-term negative consequences by discouraging women and minorities from going into science, technology, engineering and math (STEM) careers.
Software engineer Brianna Wu founded the game studio Giant Spacekat, which makes games featuring female protagonists. Wu and others who criticized what they see as misogyny in the gaming industry faced abuse and death threats from gamers in the “Gamergate” controversy. The critics asserted that females are underrepresented as developers and executives in the gaming industry and are characterized in games primarily as hypersexualized bimbos or targets for violence.
(Getty Images/The Boston Globe/Joanne Rathe)
“Gaming culture has been pretty misogynistic for a long time now. There’s ample evidence of that over and over again,” Kate Edwards, executive director of the International Game Developers Association, said about Gamergate. “What we’re finally seeing is that it became so egregious that now companies are starting to wake up and say, ’We need to stop this. This has got to change.’”48
Some gamers, developers and company executives condemned the attacks, saying an abusive, exclusionary environment was bad for gaming, and the controversy spurred some companies to start new programs to bring more women and minorities into the industry. For instance, Intel pledged $300 million to help transform itself into a more diverse company by hiring more women and minorities in its games division.49 And last June, the company announced it was establishing a $125 million fund for female- and minority-led tech start-ups and supporting the game developers’ association initiative to double the number of female developers by 2025.50
At least one developer, Brad Wardell, said gaming absolutely is a male-dominated field but that it is not intentional. “Demographically speaking, core gaming is 95 percent-plus men,” said Wardell, president and CEO of Stardock Corp. in Plymouth, Mich. The lack of women in gaming “has nothing to do with being inclusive,” he said. “If a universal utility is overwhelmingly used by men, there’s no scenario where core gaming is going to do better at attracting women. Women simply have different hobbies than men. And that’s fine.”51
The statistical gap between the percentage of men and women who play video games is closing, but differences remain. Among those ages 18 to 29, Pew finds that 77 percent of players are male. One-third of young men call themselves “gamers,” compared with only 9 percent of young women.52 Common Sense Media found that teen boys spend an average of 56 minutes a day playing video games, while girls spend an average of seven minutes.53
Gaming is more popular among older women than older men, with 38 percent over age 50 playing games, compared with 29 percent of men in that age bracket. Women — especially older women — generally play more “casual” games, including “Candy Crush,” “Just Dance,” and “The Sims,” while men seem to favor action and shoot-’em-ups like “Assassin’s Creed,” “Halo” and “Call of Duty,” according to the video game market research company Newzoo.54
Stephanie Llamas, a senior analyst for the New York-based games data company SuperData Research, takes exception to that simple characterization. Women are the largest gaming demographic for the noncasual PC role-playing games (54 percent) and represent almost 40 percent of massively multiplayer online (MMO) game players and digital console gamers, she said. MMOs allow huge numbers of people to compete against each other, no matter where they live or what language they speak.
“Not only have women grown their stake as gamers in general, they have shown a desire to be more active participants in the gaming community as a whole,” she said.55
Newzoo confirms that the number of girls and women who played console-based video games (not casual games) five or more days a week grew from 1.2 million in 2011 to 5 million in 2014.56
But a cultural divide exists between male and female players, in part due to concerted industry marketing to boys, says Erin Robinson Swink, creative director of a master’s degree program in games and playable media at the University of California, Santa Cruz. “I don’t think it was a natural thing that happened. It was a very artificial boundary of what a game is and who wants to play them,” says Swink, who previously was an independent game designer.
Adrienne Shaw, an assistant professor in Temple University’s Department of Media Studies and Production, agreed the industry has focused too much on marketing to people who identify as “gamers,” marginalizing women, minorities and LGBT people. People who play games are much more than just gamers, Shaw said. “We could see much better games if people felt like this was not a medium that caters only to super fans,” she wrote.57
Ratan, of Michigan State University, says when women do play they may be reluctant to fully engage or fade into the background because of pressure they feel from male players. In one study, he found that even though women gained skill at the same rate as men in the MMO game “League of Legends,” they were less confident of their skills and focused more on helping a male partner advance than themselves.58
“The culture around these competitive games is what causes the gap,” says Ratan. In addition, he says, women who feel stereotyped as poor performers or less-skilled in games tend to believe that they, or other women, are less suited for careers in STEM fields.
UC Berkeley’s Linn says she, too, thinks that such stereotypical views in gaming could reinforce ideas that women can’t succeed in STEM careers. But she also says lack of access to technology could be a bigger obstacle to increasing diversity in gaming.
Box, the Florida middle school teacher, sees no differences between boys and girls. “Some of the girls are just as cutthroat about their gaming as the boys are,” she says. She only sees individual variations in approaches to the game, not a gender divide.
Background
From Tennis to Angry Birds
As television and computers evolved, so did video games. Some say video games helped drive the computer revolution.
“No one can deny that the ubiquitous invasion of computers into the home was started by the video game console,” said Ralph H. Baer, a German-born American engineer who in 1966 invented the idea of playing games on a TV. Baer said the rise of video games, with its demand for better processing speeds and memory capacity, increased graphics capability, improved art design and better user interfaces, helped to shift the United States into “a truly technological society.”59
Scientists first created a crude arcade-type game in 1940, displaying it at the New York World’s Fair. By 1950, the renowned British mathematician Alan Turing and a colleague had each separately written the first chess-playing computer programs.60
Throughout the 1950s, computer scientists created more game-playing computers, but many consider the antecedent of video games to be “Tennis for Two,” developed by William A. Higinbotham, a physicist who helped design the first atomic bomb.61 Higinbotham was working at the Brookhaven National Laboratory — a nuclear research facility — on Long Island when he debuted the ping pong-like game, displayed on a monochrome oscilloscope screen and employing two control boxes with knobs to serve and hit a simulated ball. Brookhaven employees spent hours waiting to play the novelty, which was only displayed for two years.
The game had a greater purpose. Higinbotham built it as an attraction for visitors, who could play it during public tours of the lab. He hoped it would put the public at ease about the lab’s nuclear weapons work.62
In 1962, members of the model railroad club at the Massachusetts Institute of Technology used a new, high-speed, mini-computer called the PDP-1 to fashion the two-player game “Spacewar!” The game, in which players controlled spaceships that shot torpedoes at each other, became so popular “that some rules had to be laid down: no ’Spacewar’ except during lunch hour and after 5 pm,” said club member J. M. Graetz.63
Although “Spacewar” was first played by geeks, it eventually was loaded onto every PDP computer built by Digital Equipment and influenced the Atari arcade game “Asteroids.”64
Baer, who became known as “the Father of Video Games,” began thinking about how to capitalize on the growing number of TV sets in American homes.65 By 1967, he had come up with the Brown Box, a unit covered in faux-wood-patterned vinyl with two controls and program cards for games such as ping-pong, checkers, sports games and target shooting. It was licensed to Magnavox, which sold it as the Odyssey in 1972.66
It was the first, albeit crude, video game console, a device that connected to a TV, functioned like a computer and was designed for interactive video game display and play.67 Odyssey, however, failed to entrance many Americans — only 200,000 units were sold in three years.68
Magnavox soon had competition: Atari’s “Pong.” The ping-pong-like game was introduced in 1972 as a coin-operated arcade game. The home game, sold exclusively through Sears, was a massive hit, selling 150,000 games in its first year, 1975.69 By 1980, sales of Atari’s next-generation home gaming console, the Atari 2600, were booming, with 2 million sold, fueled by the extremely popular “Space Invaders,” which came with the console.70 Atari eventually sold more than 27 million of the 2600 units, making it one of the first big home gaming successes.71
Video games soon hit a bump, however: U.S. Surgeon General C. Everett Koop in 1982 declared them a health hazard to children, warning that they could be addictive and promote violence.72 As the home video market became saturated, Americans seemed to be bored with gaming. Retailers thought video games were a passing fad and started pulling game-related merchandise from their shelves.73 The industry crashed.
But the crash was short-lived. With the introduction of the hugely popular “Pac-Man,” “Donkey Kong” and “Mario Bros.” at the arcade and games for the growing number of home use platforms, the 1980s eventually became known as the “Golden Age” of video games. Sales reached billions of dollars.
Game makers then decided to target children. Companies like Nintendo told parents they could monitor and restrict game playing through special built-in controls on their consoles. Video games started to be viewed as toys, helping to reboot the industry.74
In the meantime, computers — such as the Commodore, Apple II and Macintosh — became smaller and were brought into the home, creating a huge market for video games for those devices and expanding the audience for gaming.
In 1989 Nintendo introduced the Game Boy, a hand-held battery-powered mini-console that made gaming almost ubiquitous for children of a certain age and led to successive generations of portable gaming. Home gaming took another leap with the introduction of more compact, newer-generation consoles made by Nintendo and Sega in the mid-1980s and early ’90s. In 1995, Sony unveiled its PlayStation, which became the market leader. The PlayStation 2 is the best-selling console in history, with more than 150 million units sold.75
In the mid-2000s, Nintendo’s motion-sensing Wii console burst onto the scene, allowing players to virtually bowl, play tennis, golf or have dance-offs in their living rooms. Sales reached more than 100 million units worldwide.
In 2000, Sega crossed the next frontier: consoles that could be connected to the Internet. Sega’s Dreamcast allowed players to download games and play with millions of others around the world. Although that device failed commercially, it paved the way for the next generation — Microsoft’s Xbox Live and Sony’s PlayStation Network — which helped popularize massively multiplayer online games and allowed players to chat and strategize in real-time.76
By 2009, 41 percent of American households owned a video game console, a number that has remained steady through 2015, according to the Pew Research Center.77
The Internet also revolutionized gaming on computers. Players could go to game sites and play online or download games. According to ESA, the computer has overtaken the dedicated console as the device of choice for gaming in the United States.78
The introduction of the iPhone in 2007 and the iPad in 2010 created more ways to play games, leading to a big shift in the market. The simplistic game “Angry Birds” leveraged the popularity of handheld devices and was one of the first smartphone applications to become a giant hit among many different players. By 2011 — two years after it was introduced — “Angry Birds” had been downloaded 50 million times.79
Smartphones are now used by 35 percent of American game players, and one-third play social games — those involving more than one player. Fourteen percent play puzzle, card, and board games; only 5 percent of the games played on handheld mobile devices are action games.
Serious Play
Not all games were developed for recreational purposes. So-called serious games arose alongside recreational ones, first on computers and then on consoles, smartphones and tablets.
Serious games include recruitment, training and simulation tools for the military, corporations and health care; educational games used in K-12 classrooms; and “Games for Good,” which teach, train or generate awareness or critical thinking about issues or societal problems, such as immigration, inequality, race or religious conflict.80
Serious game designers “utilize strategizing, hypothesis testing or problem solving, usually with higher order thinking rather than rote memorization or simple comprehension,” said Kathy Sanford, a professor of language and literacy at the University of Victoria, British Columbia. “Characteristics of such games include a system of rewards and goals to motivate players, a narrative context that situates activity and establishes rules of engagement, learning content that is relevant to the narrative plot and interactive cues that prompt learning and provide feedback,” she said.81
The U.S. military was among the first to develop and use computer games for serious purposes. A Johns Hopkins University research center created the first computer war game, “Hutspiel,” for the military in 1955. It allowed players to test the impact of nuclear weapons in a simulated battle between NATO and the Soviet Union.82
More war games were created throughout the 1960s. By the ’70s, as the video game industry was expanding, the serious games industry also began to expand. Even inventor Baer included some educational games with the Magnavox Odyssey.83
Many consider “The Oregon Trail,” released in 1971, the first serious game for education. Developed by three history teachers in Minnesota, it was initially text-only and played on a teletype machine.84 Players are pioneers who attempt to follow the Oregon Trail from Independence, Mo., to Oregon’s Willamette Valley in 1848. The Minnesota Educational Computer Consortium eventually distributed it nationally, starting in 1978, and released a commercial version in 1985.85
While conceived to teach geography and history, the game became a big commercial success, selling 65 million copies.86
In 1986, the Learning Co. released “Reader Rabbit,” the first educational game for young children. It was designed for a new IBM home computer, the PCjr.
Three years later, independent game developer Will Wright introduced “SimCity,” a popular game in which the player must develop a city while maintaining his constituents’ happiness and a budget. Wright’s next game, “The Sims,” in which a player creates virtual people and essentially directs their lives, became one of the biggest-selling video games of all time. In 2014 the fourth edition of “The Sims” was the top-selling computer game in the United States, the third edition was No. 2 and various other versions occupied slots in the top 20.87
“The Sims” was released at the cusp of a major change in the serious games market. The turn of the millennium was “a demarcation point,” says Ben Sawyer, a game developer who co-founded the Serious Games Initiative in 2002 at the Washington-based Woodrow Wilson International Center for Scholars, a congressionally chartered policy forum. The initiative aimed to encourage the development of games to address policy and management issues.
Sawyer says the pace of development quickened around 2000, as many elements converged, including the ability to make three-dimensional games and greater interest from the government, the military and other public and private-sector funders.
From 2002 to 2010, 1,265 games were released worldwide, compared with 953 during the 22-year period from 1980 to 2002. Educational games were the largest segment (66 percent) early on, but recently have dropped to about 26 percent, as advertising-related serious games expanded, along with health games and others.88
Still, the serious game market grew slowly compared to the explosion in recreational gaming. In 2004, for instance, the first meeting of Games for Change, a gathering of serious games enthusiasts, attracted only 35 people. By 2012, more than 800 people attended, and 11,000 watched the webcast, but those numbers pale in comparison to the tens of thousands who attend recreational industry conventions.89
Serious game development has been hampered by its economic model, said Damien Djaouti, an associate teacher of computer science at the University of Montpellier, in France.90 In the past, serious games makers tried to retail them like a recreational game. While that may have worked for big titles such as “The Sims,” it was unsuccessful for most serious games.
Now, serious games can be tailored for a particular client who will pay the studio for the work. Or corporations and schools can buy games for a flat or per-user fee, meaning developers don’t have to depend on the vagaries of the retail market.
This new model “is likely to enable the current wave of ’Serious Games’ to last longer and embrace more public recognition than their ancestors,” said Djaouti.
______________________________________
Digital technology is becoming increasingly commonplace in K-12 education, and many researchers argue that it will save money and transform schools into more effective institutions. But other experts contend that the evidence so far is slim on exactly what computers can accomplish in the classroom. The dominance of standardized testing means digital technologies must raise students’ test scores to levels administrators and policymakers deem significant. But computer-based learning may not be well suited for that task, and further efforts to computerize education may require schools to shift away from standardized testing, experts say. Until now, most successful computer-learning initiatives have required specialized training for teachers. But experts say developing technology that will be easy for nonspecialists to use remains a challenge. Meanwhile, despite the debate over the effectiveness of computerized education, all-online K-12 schools are proliferating nationwide, and enrollment in online courses is soaring.
Ten-year-old Mirei Hosono uses an iPad in his English-as-a-second-language class at Center Grove Elementary School in Greenwood, Ind., on Oct. 28, 2011. The school received a $200,000 grant from the state to buy 230 of the devices. Education experts say computers and other digital devices increasingly are taking on roles once filled solely by teachers.
(AP Photo/Daily Journal/Scott Roberson)
Overview
Students learning to read have long followed a familiar routine: They read a passage of text aloud in class and wait for the teacher to correct their pronunciation.
But in the digitized world of 21st-century education, computers are increasingly taking on the teachers’ role. Computers can now “hear” students speak, for example, correct their pronunciation and evaluate their progress over time, says Michael L. Kamil, a professor emeritus at the Stanford University School of Education. “Until recently, computers couldn’t listen to oral reading and understand it,” he says. But new programs make it possible.
Such advances are part of a much bigger movement to integrate technology into classrooms, creating what education scholars call a “blended learning environment.” As computers increasingly dominate every realm of business and life, experts say schools must prepare young people not only to use digital technology but also to understand how to program it, how it shapes culture and behavior and how it can be harnessed to perform tasks once considered the sole realm of humans.
Yet, while digital devices have become ubiquitous worldwide, debate is raging over whether — and which — technologies have proved their worth as learning tools. Some school systems have fully embraced technology, for example by providing every student with a laptop computer. But critics argue that money for such programs would be better spent on teachers.
And in some localities, technology is threatening teachers’ very jobs. In cash-strapped Ohio, for example, schools could attain a 50-1 student-teacher ratio — more than twice the conventional 20 or so pupils per teacher — by combining live teaching with large amounts of online study, Robert Sommers, director of the Office of 21st Century Education in the Ohio governor’s office, told the state legislature last spring.1 Similar proposals are surfacing in many other states.
Corina Dill and other low-income middle-school students in San Angelo, Texas, participate in a new program on Oct. 8, 2011, that teaches them to use open-source software. The students can take the computers home after completing three Saturday workshops at Angelo State University. Debate is raging over whether digital technologies have proven their worth as learning tools.
(AP Photo/San Angelo Standard-Times/Kimberly Parker)
“I teach a class for aspiring school administrators, and the first thing I tell them is that the schools you are in today are not the schools you are going to be leading,” says James Lerman, director of the Progressive Science Initiative, a program at Kean University in Union, N.J., which helps experienced teachers become certified to teach math and science. “What happened to the music industry and the publishing industry” as the digital revolution turned their business models upside down “is just beginning to happen to schools.”
Digital learning has been getting a boost in localities across the nation this year. For example, Idaho became the first state to require high-school students to complete two or more online courses to receive a diploma.2 And a mere two years after spending $500 million to upgrade Internet access in its public schools, New York City announced it will spend the same amount in 2012 on more technical improvements.3
Many education specialists are somewhere in the middle on the issue of computerized education. Decades of experience make clear that computer software can effectively train people to perform certain complex tasks, says David Moursund, an emeritus professor of education at the University of Oregon, at Eugene. “We’ve known for a long time that computers could take on part of the task of the human teacher or tutor,” notably by teaching basic skills such as multiplication or spelling, and do the job as well as the average teacher, says Moursund.
The military and the airline industry, he notes, both use computer simulations to train people for tough, high-stakes jobs such as distinguishing between incoming missiles and harmless radar-screen blips, and servicing jet aircraft. “With enough money, you can develop simulation that’s quite good, nearly indistinguishable from the real thing,” Moursund says. Similarly, software programs that tutor students in subjects such as arithmetic are customizable for any skill level and thus uniquely helpful in schools, said John Danner, co-founder of Palo Alto, Calif.-based Rocketship Education, which operates a network of well-regarded K-5 charter schools in low-income Northern California communities. “When students learn things that are developmentally appropriate for where each of them are, they learn things much faster than if you teach to the middle,” as classroom teachers typically must do, he said.4
Nevertheless, computers can never replace the human touch in elementary and high school classrooms, experts say. Teachers do what technology can’t, “such as being a live person who cares about you,” says Grover J. Whitehurst, director of the Brown Center on Education Policy at the Brookings Institution, a centrist think tank in Washington.
“Blended models” of schooling that combine computer-based learning with live classes seem to be emerging as the most common model, Whitehurst says. In fact, as computers increasingly take over routine tasks and the Internet provides easy access to unlimited streams of information, demands for teachers to possess more sophisticated conceptual skills will increase, some analysts say. But education specialists worry that teachers aren’t receiving adequate training to function in this new, digitally dominated world.
“The teacher of the future helps you navigate the ocean of information” that the online world provides, says Paulo Blikstein, an assistant professor of education at Stanford University and director of its Transformative Learning Technologies Lab. “I can go to Wikipedia to memorize historical figures’ names, but I need somebody to talk with me about power relations” and other concepts, “to help me make sense” of the facts. Teachers will “need to know much more about learning how to learn, about how to help students make sense of these huge amounts of information, where you need to interpret what you see,” Blikstein says. “But we’re not training teachers to help with these things.”5
Some digital-technology enthusiasts argue that computer games tailored for learning could be an education booster. But many education-technology scholars say that, so far, most games developed as teaching tools don’t actually teach much.6
Learning claims for games such as the popular “Oregon Trail” — a simulation game developed in the 1970s to teach about pioneer life — are overblown and rest on the too-frequent misunderstanding that student motivation guarantees learning, says Kamil of Stanford’s School of Education. For players to learn from a game, winning and enjoying the game must both depend on whether the player learns something that the game intends to impart, he says. “If you watch a bunch of boys play ‘Oregon Trail,’ they spend all their time shooting deer,” clearly enjoying themselves, but not accruing any history-related skills or knowledge.
Too many games can be won by using non-learning-related strategies such as repeated blind guessing, Kamil says. “They may get kids engaged, but they don’t get them engaged in an actual learning task.”
Nevertheless, some games do contain the seeds of very effective learning, though researchers are only just identifying the principles that underlie such games, education-technology analysts say.
A game in which a player enters a virtual world and advances through it by solving challenges that involve uncovering the rules of the place offers the very highest form of learning, wrote James Gee, a professor of literacy studies at Arizona State University. A game in which a player solves a science mystery, for example, could be a much more fruitful learning experience than an ordinary biology course in which a student learns facts and repeats them on a test. In fact, “decades of research have shown that students taught under such a regime … cannot actually apply their knowledge to … understand the conceptual lay of the land in the area they are learning,” said Gee.7
By contrast, a computer game can closely approximate an activity such as practicing biology in real life, Gee wrote. “Biology is not a set of facts but a ’game’ certain types of people ’play’” by doing certain activities, using particular tools and languages, holding certain principles and playing “by a certain set of ’rules,’” all activities that games players do in virtual game worlds. “Keep in mind that … ’Full Spectrum Warrior’” — a computer-simulation game about anti-guerrilla fighting — “is a game when I buy it off the rack but serious learning when a soldier ’plays’ the professional training version,” Gee wrote.8
As policymakers and schools struggle to keep up with ever-advancing digital technology, here are some of the questions that are being asked:
Are computers in schools improving education?
Hopes have been high for decades that computer games, tutoring software and other digital technologies could make students more engaged and effective learners. But with many schools now coming online with high-speed Internet connections, the evidence on learning outcomes remains mixed.
For elementary-school students, decades of research demonstrate that “we can develop computer programs that teach kids to do more mundane things” — such as add, subtract and multiply — better than the average classroom teacher can, says Moursund of the University of Oregon.
Computers’ strength as skills instructors lies partly in data-gathering and data-analysis abilities that humans can’t match, says Moursund. For example, to learn to type on a keyboard, “a program can time how long you touch a key, tally mistakes, note your fast and slow fingers” and adjust the task in real time to provide additional exercise for an individual’s weak spots. “A human tutor can’t possibly adjust so much” and thus is less efficient, he says.
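The adaptive loop Moursund describes can be sketched in a few lines of code. What follows is a minimal, hypothetical illustration in Python — not taken from any product mentioned in this report — of a typing drill that times each response, tallies mistakes per key and weights future practice toward the keys a learner misses or types slowly.

import random
import time
from collections import defaultdict

# Hypothetical sketch of an adaptive drill: time each response, tally
# mistakes, and steer practice toward the learner's weak spots.
KEYS = list("asdfjkl;")
errors = defaultdict(int)      # mistakes per key
seconds = defaultdict(float)   # cumulative response time per key
attempts = defaultdict(int)    # attempts per key

def weight(key):
    # Keys with more errors or slower responses are drilled more often.
    if attempts[key] == 0:
        return 1.0
    return 1.0 + errors[key] + seconds[key] / attempts[key]

def drill(rounds=20):
    for _ in range(rounds):
        key = random.choices(KEYS, weights=[weight(k) for k in KEYS])[0]
        start = time.time()
        typed = input(f"Type '{key}': ").strip()
        seconds[key] += time.time() - start
        attempts[key] += 1
        if typed != key:
            errors[key] += 1
    # Report weak spots, as a tutoring program might for a teacher.
    for k in sorted(KEYS, key=weight, reverse=True):
        if attempts[k]:
            print(f"{k}: {errors[k]} errors, "
                  f"{seconds[k] / attempts[k]:.2f}s average response")

if __name__ == "__main__":
    drill()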
“If you asked me to bet” on whether “picking an elementary teacher at random or a million-dollar piece of software” would produce better learning outcomes for “30 young kids learning an essential” basic skill such as adding or recognizing how different combinations of letters sound, “I’d pick the software,” says Brookings’ Whitehurst.
But technology is often put into classrooms with little technical support and thus is seldom as effective as it might be, says Paul Resta, director of the Learning Technology Center at the University of Texas, Austin. “If a teacher has technical problems [with operating a software program] more than once” and can’t get a quick remedy from an information-tech specialist, which many schools don’t have, “guess what’s going to become of that software” after that? “I call it the dark-screen phenomenon.”
Some research on computer-based learning initiatives shows small or no learning gains.
In a 2009 study for the Texas state government, analysts at a nonprofit research group found that a pilot project that “immersed” some high-need middle schools in technology by providing a wireless computer for every teacher and student increased participants’ ability to use technology and modestly improved math scores, especially among higher-achieving students. But the technology didn’t improve students’ ability to direct their own learning, apparently worsened their school-attendance rates and had no apparent effect on reading-test scores.9
In the technology-intensive Kyrene School District in Chandler, Ariz., classrooms have numerous laptop computers with Internet access and interactive software that provide a wide variety of instructional opportunities: drills in every subject, individual-study programs and multimedia projects that help students create blogs and social-networking profiles for books they read in class. But according to one key benchmark — standardized test scores — the technology hasn’t helped learning. Since 2005, the district’s math and reading scores have remained stagnant, even as scores statewide have risen.10
The results baffle local school leaders. “My gut is telling me we’ve had growth. But we have to have some measure that is valid, and we don’t have that,” said Kyrene school Superintendent David K. Schauer.11
In a review of high school math programs that blend customized tutoring software with in-class lessons, both developed by researchers from Pittsburgh’s Carnegie Mellon University, the U.S. Department of Education found that the programs had “no discernible effect” on students’ math-test scores.12 The widely used and highly regarded “Cognitive Tutor” software, developed by Pittsburgh-based Carnegie Learning Inc., a startup created by cognitive and computer scientists, also came up short in other federal analyses. Carnegie Learning was recently bought by Apollo Group Inc., the owner of the online, for-profit University of Phoenix.13
Can computers replace classroom teachers?
In search of budget savings, some public officials are touting online learning and so-called “blended” classes that use both computer-based and in-person instruction as potential means of saving money on teacher salaries. However, some technology experts say getting rid of teachers is a mistake. Instead, they say, school districts should be helping students navigate the digital world by searching out the best learning technology and hiring more teachers who are well trained in using it.
Still, financial strains and demands for better performance by schools mean that schools must — and ultimately will — replace some teaching slots with digital technologies, says Christopher Dede, a Harvard University professor of learning technologies. A “perfect storm” of trends is driving toward that outcome, he says.
Because of “permanent” financial problems in K-12 education, Dede wrote, “student-teacher ratios are climbing to levels unworkable for even the best conventional instruction. We cannot solve this problem by the personal heroism of individual teachers,” on whom schools have largely relied up to now to succeed in difficult conditions. Instead, Dede added, administrators “must find technology-based strategies effective for classroom teaching and learning with large numbers of pupils.”14
But, he adds, so far few technologies have been up to the task. While many computer-based learning programs require well-trained, intensely committed teachers to be effective, “large-scale educational improvement requires more,” Dede wrote. He urged education researchers to double their efforts to create learning technologies that will work even in the worst of circumstances, including in schools with scant resources and many ill-prepared teachers, since those are conditions many students face.15
Experience outside of education proves this is achievable, Dede wrote. “All other professions are successfully transforming to affordable models that use technology to empower typical professionals to be effective,” so there’s no reason technology can’t be developed to help average teachers spur strong student learning, too, he wrote.16
“We may be at a transition point” at which “we can offload some teachers’ responsibilities” onto software, to free teachers to do other tasks, such as working with students on special projects, says Brookings’ Whitehurst. At least a few college-level institutions, notably the private, nonprofit Salt Lake City-based Western Governors University, seem to have mastered the knack of delivering low-cost computer-based education that works, so there seems no reason that good technology-based approaches can’t be developed for K-12 as well, he says.
Demands to improve student learning — perhaps using technology — fall heaviest on the lowest-performing schools, most of which enroll many disadvantaged students. Yet, in these schools it’s particularly unlikely that technology could replace staff, says Stanford’s Blikstein. “This is the critical part of the story. These kids need so much help to be brought up to speed. I don’t think this kind of technology could replace a teacher.”
At the college level, where computer-based courses have taken a stronger hold, nearly 64 percent of public-university faculty who have taught both online and traditional courses said in a 2009 survey that it took “somewhat more” or “a lot more” effort to teach online than in person. Nearly 85 percent said it takes more effort to develop online courses than regular ones.17
Making good use of digital technology requires substantial change in how teachers view their roles. “It comes back to authority and control,” says Christine Greenhow, an assistant professor in the School of Education and Information Studies at the University of Maryland, College Park. “If you see your job as pouring knowledge into the minds of students who are empty vessels,” that doesn’t mesh with the technology revolution, she says. Today’s students have cell phones, laptops and other devices on which they can research anything and everything on their own, she says.
Eventually, computer-teaching systems will diagnose students’ learning problems on the spot, based on data collected from the students’ interaction with the software, then design appropriate interventions. Those interventions might include calling for a live teacher, many of whom, in the future, may act more like “coaches” who address particular problems that learning software has identified, predicts Paul Kim, chief technology officer at Stanford’s School of Education.
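A rough sketch of the hand-off logic Kim predicts might look like the following Python fragment. It is an assumption-laden illustration rather than a description of any existing system: the thresholds, field names and intervention labels are all hypothetical.

from dataclasses import dataclass

# Toy rule for deciding when tutoring software should escalate to a live
# teacher acting as a "coach." All thresholds and names are hypothetical.

@dataclass
class SkillRecord:
    attempts: int = 0
    correct: int = 0
    hints_used: int = 0

    @property
    def accuracy(self) -> float:
        return self.correct / self.attempts if self.attempts else 0.0

def next_step(record: SkillRecord) -> str:
    """Choose the software's next move after a batch of practice items."""
    if record.attempts < 5:
        return "keep practicing"            # not enough data yet
    if record.accuracy >= 0.8:
        return "advance to next skill"      # mastery threshold met
    if record.accuracy < 0.5 and record.hints_used > record.attempts:
        return "flag for live coaching"     # likely learning problem detected
    return "assign targeted review items"   # try an automated intervention first

# Example: heavy hint use plus low accuracy triggers a call for a live coach.
print(next_step(SkillRecord(attempts=10, correct=3, hints_used=12)))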
Tomorrow’s teachers will have to both tailor instruction more individually and deal with deeper, more conceptual learning, many analysts say. For example, “one challenge, especially at upper grade levels, is to come up with questions for which Wikipedia won’t supply good answers,” says Stanford education Professor Daniel Schwartz.
Within a few decades, teachers may be sharply divided into an elite class of professionals who are savvy at both technology and teaching and a second, less-prestigious group who act more or less as babysitters, managing students in classrooms, wrote Whitehurst. A teacher will be “either … an expert on the design and delivery of instruction through technology or … the equivalent of a hall monitor or a tutor for struggling students, with commensurate salaries.”18
Are computer games effective for learning?
From computer games’ earliest development, in the 1950s and ’60s, it was clear that they motivated players to commit time and energy to conquering their challenges to a degree that school lessons seldom do. This discovery, together with computers’ ability to hold massive amounts of text, pictures and sound, encouraged development of games especially tailored for learning. However, not every game that has academic content and motivates students to play it actually provides a learning experience, some scholars say.
In many current games, players, alone or in groups, enter virtual worlds — such as Yellowstone National Park, in the game “WolfQuest” — or real-world sites that they visit while accessing added digital information about the place via technology such as smartphones. The idea is to explore the real or virtual place and solve problems there, explains a new report on games and learning by the National Research Council (NRC), the operating arm of the congressionally chartered National Academies, which advise the government on science.19 In the well-regarded game “River City,” developed by Harvard’s Dede, for example, players explore a highly detailed simulation of a 19th-century American city to uncover and solve a public-health crisis.20
The NRC said games that challenge students to solve complicated problems in rule-based virtual worlds have the potential to kick-start the kind of inquiry- and project-based scientific learning that many education theorists have sought for decades. Such games can help students “visualize, explore and formulate scientific explanations for scientific phenomena” that they wouldn’t otherwise be able to observe and manipulate, the NRC said. The games also tend to “spark high levels of engagement, encourage repetition and practice and motivate learners with challenges and rapid feedback,” it said.21
Still, many researchers say they’re less interested in figuring out how to increase the supply of educational computer games than in discovering the principles that fuel enthusiasm and hard work by students.
“I don’t think we should make school into a game,” says Barry Fishman, an associate professor of learning technologies at the University of Michigan. “My objective is to find out why people work so hard at games” and then figure out how the same principles might be applied to many kinds of learning situations.
Hoping to find out why his 6-year-old son enjoyed computer games so much, Gee says he “failed many times” at the first game he tried, one he picked randomly from a store shelf: “The New Adventures of the Time Machine.” He says he “had to engage in a virtual research project via the Internet to learn some things I needed to know” to play. Gee grew amazed that “lots of young people pay lots of money” to get this difficult experience and “realized that this was just the problem our schools face: How do you get someone to learn something long, hard and complex and yet enjoy it?”22
Research is revealing underlying principles of effective learning games, says Eric Klopfer, an associate professor of education at the Massachusetts Institute of Technology (MIT). Such games allow for many different solutions to the problems and questions they pose; encourage both collaboration with other players and independent action on the part of players; set up novel problems for players to solve and provide feedback to help players advance, he says. A compelling narrative and characters to identify with also are important, he says.
But many games don’t operate on those principles, and some don’t teach much, or anything, of value, critics say.
“In trivial games, you solve a problem and then get a reward,” but the learning and the other aspects of the game aren’t connected, so that the game only provides some traditional drill-type instruction rather than deep learning, says Klopfer. In the popular “MathBlaster” game, for example, players earn opportunities to participate in an outer-space adventure video game by giving the right answers to math questions, but the questions aren’t conceptually connected to the game’s story.
Adding elements of play or contest to all learning activities, including rote memorization, is what some education theorists call for when they suggest “gamifying everything.” But that’s a shallow use of game principles and an approach that may even be inferior to more traditional educational methods, Klopfer suggests.
In fact, not all researchers find that games are useful at motivating and engaging students. In a 2007 study based on student surveys and interviews, Nicola Whitton, a research fellow in educational-games technology at Manchester Metropolitan University in England, found that “a large proportion” of students “do not find games motivational at all” and that “there is no evidence of a relationship between an individual’s motivation to play games recreationally and his or her motivation to use games for learning.”23
Serious attempts to develop highly effective learning games are in their infancy, experts say.
One barrier is that “gamers and educators are very different cultures, and you need to get them together” to have a real shot at figuring out how the principles of the two disciplines may intersect, says Stanford’s Schwartz. The two sides often resist such cross-disciplinary discussions, he says.
Mike Kerr, principal of the KIPP Empower Academy in Los Angeles, said kindergarteners participated in an experiment last year with “blended learning,” which uses both computers and classroom teachers. Results from the trial year were so promising that school administrators decided to continue using computers in kindergarten. The charter school serves minority and low-income students in impoverished south Los Angeles.
(Getty Images/McClatchy Tribune/Jonathan Alcorn)
Furthermore, the effectiveness of any education technology, including games, “depends on a combination of the technology and the context in which it’s delivered,” which includes school and classroom conditions, teacher skills and more, says MIT’s Klopfer. Generally, for a game to succeed as a learning tool, a teacher or a community of people must be available to support and help players navigate it, he says.
Currently, teachers often don’t use games to optimize learning and in many cases aren’t equipped to do so, the NRC said. In the “River City” game, for example, players are supposed to explore the town, then formulate and test original hypotheses about what’s causing disease there. But “some teachers have asked students to use the curriculum to simply confirm correct answers that the teachers provided in advance,” essentially canceling out the opportunity for intellectual initiative, the NRC said. Behind teachers’ misuse of the game lie lack of time, pressure to prepare for high-stakes standardized tests and a lack of the “deep-content knowledge and effective teaching strategies” suitable for inquiry-based learning, the group said.24
Background
A Digital World
The Information Age is only decades old. But many scholars argue that, eventually, digital technology will change everything, including concepts of learning, as surely as the greatest upheavals in history have done.25
“We can liken this age to the age of the invention of the printing press, and I don’t think that’s an exaggeration,” says Kean University’s Lerman. Especially in these early days, however, “there’s more than one way things can change,” he says.
Technology can be employed to “do old things in new ways,” Lerman says. For example, he says, teachers can learn to give more effective lectures, and students can learn from master teachers they’ll never meet if outstanding lectures are archived on YouTube. However, digital technology also can encourage “doing new things” to transform education into the student-driven, lifelong enterprise that many scholars see as the wave of the future, Lerman says.
Experts say that many characteristics of the Information Age will transform schools and learning, and each raises important questions about the future of education.
For example, “in the Age of Information, everything can be customized, and the last frontier is education,” says Stanford’s Blikstein. One need only pick up an American pre-calculus, biology or history textbook to see that the number of possible subjects of study is huge and beyond the ability of any one student or class to cover, even within a single discipline. Rather than trying to cram in as many as possible, as schools tend to do today, future schools with extensive access to online and other computer-learning technologies can allow students to pursue subjects of special interest. “Apart from the very basic things” — such as reading and basic math — “you should learn things that relate to your life and community,” with one student studying trigonometry and another studying statistics, for example, Blikstein says.
Furthermore, with digital devices ubiquitous, “we’re emerging into the era of student as content creator,” says Lerman. “That has profound implications for almost everything we do in schools.” How, he asks, does one assess learning when students create their own projects? What, Lerman continues, is the role of a teacher, if not as the sole “expert dispenser of validated knowledge?”
As much of the world’s information moves online, learning facts becomes less important than knowing how to find and use them. “Should we require students to regurgitate facts they’ve assimilated from classes, or should we allow students to access on a test any information they want” and use it to analyze a problem and propose a solution? asks Lerman. “In business, we’d call that collaboration. In school, we call it cheating.”
Tutor, Tool, Tutee
Electronic computers were invented in the early 1940s and used as early as 1943 for a wartime educational purpose — as flight simulators whose mock aircraft “controls” responded to pilots’ actions the same way controls on real planes did. In the 1960s computers entered K-12 classrooms after software was developed to lead students step-by-step through a process such as long division.
Soon the number of computers in schools began rising. In 1963, just 1 percent of high schools used computers for instruction. By 1975, 55 percent had computers, though only 23 percent used them primarily for learning. The rest used them for administrative purposes.26
Robert P. Taylor, a professor at Columbia University Teachers College in New York City, identified three ways computers can aid learning.27
First, step-by-step instructional software can “tutor” students in some subjects. In 1963, computer giant IBM partnered with Stanford’s Institute for Mathematical Studies in the Social Sciences to develop programmed-learning software for elementary schools, jointly created by computer scientists and learning experts. In 1966, IBM introduced its Model 1500 computer, especially designed to run instructional programming. The computer had unusual-for-the-time features such as audio capability and a “light pen” that allowed users to write on the computer screen.
Second, Taylor wrote, a computer can serve as a tool, such as a calculator or word processor. In fact, he said, outside of schools, “tool-mode computing is popularly seen as synonymous with computer use, period.”28 (Nevertheless, schools often have ignored the potential usefulness of digital tools such as database and spreadsheet software for homework and vocational training, instead expecting students to learn such programs on their own, says Stanford’s Kamil.)
Finally, Taylor wrote, a student can learn to program a computer to do new tasks, effectively acting as the machine’s “tutor.”
“Because you can’t teach what you don’t understand, the human tutor” — the programmer — “will learn what he or she is trying to teach the computer,” wrote Taylor. Furthermore, “learners gain new insights into their own thinking through learning to program.”29
This argument — that students should learn the inner workings of computers and learn to “teach,” or program them — proved persuasive. In the 1970s and ’80s, many schools installed computer labs and required every high school student to take a programming course, the aim being to teach thinking skills and prepare young people for computer-science careers.
Special programming languages were developed for beginners. Many high school courses used BASIC, invented in 1964, which featured short programs and simple-to-understand error messages. In the early days, the classes seemed successful. Student programmers have “taught” computers to “tutor younger students in arithmetic operations, to drill students on French verb endings, to play Monopoly, to calculate loan interest, … draw maps” and “to generate animated pictures,” Taylor wrote in 1980.30
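The drill programs Taylor mentions were written in BASIC, but the idea translates directly into any modern language. The sketch below, in Python for readability, is a hypothetical reconstruction of the sort of program a student “tutor” might write to quiz a younger pupil on multiplication facts; it is not taken from Taylor’s own examples.

import random

# Hypothetical reconstruction of a student-written drill program: the
# student "teaches" the computer to quiz a younger pupil on arithmetic.

def arithmetic_drill(questions=5):
    score = 0
    for _ in range(questions):
        a, b = random.randint(2, 12), random.randint(2, 12)
        answer = int(input(f"What is {a} x {b}? "))
        if answer == a * b:
            print("Correct!")
            score += 1
        else:
            print(f"Not quite. {a} x {b} = {a * b}.")
    print(f"You got {score} out of {questions}.")

if __name__ == "__main__":
    arithmetic_drill()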
LOGO, a language created in 1967 to extend the supposed benefits of programming to elementary-school students, allowed students to move a cursor — called a “Turtle” — around a screen to draw simple pictures.
A child gradually learns the different programming commands — expressed in words and numbers typed on a keyboard — that move the Turtle around the screen to draw a picture, wrote LOGO inventor Seymour Papert, an MIT mathematician. The challenge of drawing on the screen by typing out a series of programming commands “is engaging enough to carry children through” the lengthy process of ferreting out how to write LOGO programs to create any design that they envision, said Papert.31
Ultimately, the process “can change the way they learn everything,” by encouraging habits such as exploring new situations to figure out the rules by which they operate and accepting mistakes as inevitable consequences of exploration that are correctable with patience and logic, Papert said.32
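Python’s standard-library turtle module is modeled directly on LOGO’s Turtle, so Papert’s idea can be shown with a few lines of modern code. The example below is a simple illustration of the command-by-command drawing he describes, not a LOGO program itself.

import turtle

# The turtle module echoes LOGO: each command moves or turns an on-screen
# cursor, so a child can watch small instructions build up a picture.

t = turtle.Turtle()

def square(side_length):
    """Draw a square by repeating the same forward-and-turn step four times."""
    for _ in range(4):
        t.forward(side_length)
        t.left(90)

square(100)      # one square...
t.left(30)
square(100)      # ...the same procedure reused at a new angle

turtle.done()    # keep the drawing window open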
Research has failed, however, to produce evidence that problem-solving skills used in programming classes transfer to other types of learning or even to later programming work, wrote Roy D. Pea, director of Stanford’s Center for Innovations in Learning. Children who studied programming engaged in “very little preplanning” when they worked on new programs, he said. Rather than using logic to “debug” nonworking programs — as programming classes teach — they usually just erased them and started over from scratch, Pea reported. “Transfer of problem-solving strategies between dissimilar problems” proved “notoriously difficult … even for adults.”33
By the 1990s, enthusiasm for teaching programming to students all but died out for a “whole host of reasons,” says Yasmin B. Kafai, a professor of learning and technology at the University of Pennsylvania’s Graduate School of Education.
For one thing, most schools “had not integrated programming into the rest of the curriculum,” leaving it without obvious applications to other activities, Kafai says. Then, beginning around 1990, multimedia CD-ROMS provided a more immediately attractive use for computers, with games to play and videos to view.
Many teachers weren’t up to the task of teaching programming adequately, says Oregon’s Moursund. When training teachers to teach LOGO, Moursund says he found that “many had no insight into problem solving” and thus couldn’t teach students the deeper thinking skills that programming could impart.
Proponents of getting students to program didn’t give up, however. In the 1990s and 2000s, new languages for beginners emerged. Perhaps the most prominent is Scratch, a free programming language and online community designed to teach programming concepts by letting users create and post videos, music, graphics and computer games. Scratch’s developers at MIT, backed by the National Science Foundation, aimed to make the language a favorite hobby rather than a school subject.34
“Kids only spend 18 percent of their waking hours in school, so there’s lots of time outside that can be leveraged,” says Kafai, a Scratch developer and researcher. In the past, students had no access to programming resources except through schools, “but now the situation has flipped. Every child has a smartphone” that can be used to program. As of October, the website had 921,785 registered members, 270,318 of whom had created more than 2.1 million projects.35
After enrollment surged in the 1980s and ’90s, the percentage of high schools offering elective introductory courses in computer science dropped from 78 percent to 65 percent between 2005 and 2009. During the same period the percentage offering Advanced Placement (AP) courses also declined, from 40 to 27 percent.36 The College Board, which had offered two levels of computer-science exams, ended its more advanced AP exam after the May 2009 tests.37
College computer-science enrollment also fell, from a record per-department average of 100 newly enrolled students in 2000 to 50 by 2007.38
Enrollments have remained “in a trough” in recent years, says Joan Peckham, a professor of computer science at the University of Rhode Island, in Kingston. Part of the problem is image. “Research finds that students have a very poor image of computing” as “boring” and “full of these nerdy people facing a screen all day,” she says. There’s a lot to lose should interest remain low, Peckham says. “We have a technical and an interdisciplinary world” in which virtually every profession depends on sophisticated computer applications.
Connected Computers
Perhaps the heaviest blow to programming came from the Internet. As schools gained online access, networked computers’ potential to serve as tools of hitherto unimagined power for accessing information and communicating quickly outpaced other computer uses. Internet-connected learning provides a tantalizing glimpse of the world of personalized study that many scholars say the Information Age will ultimately bring.
The Internet allows students and teachers to try out different ways of learning, something that was hard to do when every learning methodology was available only as a pricey textbook or software purchase, says Stanford’s Schwartz. For example, the Khan Academy offers a large number of short lectures posted on YouTube on such subjects as solving quadratic equations and learning the parts of a cell. MIT graduate Salman Khan’s videos are “a good example of stuff that’s easy to use” and that multiplies learning options cheaply, Schwartz says.
Original documents, maps, archived film footage and interviews with people involved in historical events began appearing online in the past 15 years, providing opportunities for more in-depth, self-directed learning, says Resta of the University of Texas. For example, with primary sources online, “you can have kids actually practice historical thinking” by using original sources to construct their own versions of how and why some historical events happened, he says.
As the Internet has provided learning opportunities, it has increased pressure on schools to provide Internet-connected devices.
“An infrastructure for learning should support learning in and out of the classroom,” and thus an effective, modern education system should find a way to supply students and educators with Internet-access devices for around-the-clock use, said the federal government’s most recent national education-technology plan, issued in 2010.39 Some school districts and one state, Maine, as well as countries such as Uruguay and Peru, have implemented one-digital-device-per-student programs, typically dispensing laptop computers to students in some grades.
Maine debuted the biggest U.S. program in 2002, placing a laptop into the hands of every middle-school student. The program didn’t mandate specific uses but provided training for teachers to help them integrate the computers into the curriculum. “If you just drop the computers on the kids’ desks, it won’t work,” said Gov. Angus King, an Independent. “It’s a fundamentally different way of teaching. It’s not standing up in front of the classroom lecturing.”40
Research on one-child, one-device programs supports King’s contention, scholars say.
Few studies show that laptop programs raise standardized-test scores significantly. However, “greater quantity and improved quality of writing; more teacher and peer feedback on student work; wider opportunities to access information from a wide variety of sources; and deeper exploration of topics through in-depth research” are demonstrated outcomes of programs that are integrated into the curriculum, according to Mark Warschauer, a professor of education and informatics at the University of California, Irvine.41
Some studies do show test-score improvement. For example, between 2000 and 2005 the percentage of Maine’s eighth-graders who met the state’s proficiency standard for writing rose from 29.1 percent to 41.4 percent, and classes that used laptops for drafting and editing outperformed those that didn’t.42
Other one-child, one-device programs operate on the principle that ownership of computers is enough by itself to improve learning. “When every child has a connected laptop, limits are erased as they can learn to work with others around the world, to access high-quality, modern materials, to engage their passions and develop their expertise,” according to the Cambridge, Mass.-based One Laptop Per Child Foundation, which distributes laptops free to children in developing countries.43
But research fails to back up that contention, some scholars contend.
In Birmingham, Ala., researchers found that two years into a program that gave students computers but didn’t formally integrate them into curricula, only 20 percent used the laptops “a lot” in class, while 60 percent used them “a little” and 20 percent said they never used them.44
In Uruguay, which received laptops from the foundation in 2007, “only about 25 percent of the kids are bringing them to class,” says Kim, at Stanford’s School of Education. He cites the limited use as evidence that before students can be motivated to use free laptops in class, educators must actively engage them in projects that encourage them to do their own Internet research.
In the past few years, as cell phones have acquired as much memory as computers, some schools have been flirting with the notion of bring-your-own-technology programs. Such initiatives generally allow students to use their own devices — usually smartphones — in class, while letting students who don’t own Internet technology borrow devices that belong to the school. In a survey of school administrators in the fall of 2010, nearly two-thirds said they were unlikely to allow students to use their own mobile devices in class. However, just under a quarter said they were likely to do so.45
Using bring-your-own-technology programs to save schools money and encourage student engagement raises fears, however. Besides worrying about unfairness to students who don’t own high-tech phones, administrators see murky areas of legal liability if students access inappropriate Web pages, cheat or disrupt classes using their own equipment.46
“You can see the tension as some schools say, ‘We have to ban personal cell phones in class,’” says Kean University’s Lerman. But many young phone owners are discovering phones’ productive capabilities, doing “unbelievable things,” he says. “Some have written novels.” Schools should encourage such innovations, not ban them, he says.
