
D-Lib Magazine
September/October 2008

Volume 14 Number 9/10

ISSN 1082-9873

The Effectiveness of a Web-based Board Game for Teaching Undergraduate Students Information Literacy Concepts and Skills

Karen Markey, Fritz Swanson, Andrea Jenkins, Brian J. Jennings, Beth St. Jean, Victor Rosenberg, Xingxing Yao, Robert L. Frost
University of Michigan

Point of Contact for this article: Karen Markey, <karen.markey@umich.edu>


Abstract

To teach incoming undergraduate students information literacy skills, a research team at the University of Michigan School of Information developed the Defense of Hidgeon, a web-based board game. We opted for a game in lieu of other approaches because what people do when they play good games is good learning. This article describes the game's backstory, how to navigate its 34-space game board, and special game-play features. The research team invited a class of undergraduate students to play the game, gave monetary awards to winning teams, and interviewed students about their game-play experiences to determine what they learned and to gather their suggestions for improving the game. The authors offer three premises for the redesign of the Defense of Hidgeon and discuss these premises with regard to the design of future information literacy games.

The Problem

Now more than ever, incoming undergraduate students need to know how to conduct research. Digitization is radically expanding the information universe. Printed sources are migrating into digital forms, unprecedented amounts of new information are being born digital, and entirely new Internet-based tools – search engines, directories, library portals, current alert services, metasearch engines, review services, relevance rankings, popularity-based retrieval – are proliferating. As a consequence, all information seekers are now in the same deep and rich pool of information where academics have trod alone for years.

The digitization that has led to today's data-saturated environment has separated information users from information experts. Instead of going to libraries, students search using Google and other Internet search engines (Fast and Campbell, 2004; Head, 2007). Bereft of expert knowledge of the disciplines, students are totally in the dark about where to start, how to build on a good start, how to evaluate what they find, and how to navigate the complicated interrelationships between different research and discovery tools.

The Solution

Games hold much promise for teaching incoming students information literacy skills and concepts. What people are doing when they are playing good games is good learning (Gee, 2003, 199). Johnson (2006, 31) praises games for their ability to help us "find order and meaning in the world and make decisions that create order." Squire and Jenkins (2003, 29) promote games for good learning because they "encourage collaboration among players and thus provide a context for peer-to-peer teaching and for the emergence of learning communities." Gee (2003, 26) argues that "games are potentially particularly good places where people can learn to situate meanings through embodied experiences in a complex semiotic domain and meditate on the process" and presents three dozen learning principles that are built into good games (207-212).

But equally important is that games are available. Games can be with the student when an information expert cannot be. Games can be in the dorm room, at the coffee house, and anywhere else that the Internet can be accessed. Games are a way to bring information expertise to the users where they are already working.

A Web-based Board Game That Teaches the General-to-Specific Model

The Storygame Project team developed and evaluated a prototype web-based board game called the Defense of Hidgeon: The Plague Years that teaches incoming undergraduates information literacy skills – specifically, the General-to-Specific Model for conducting library research (Kirk, 1974). Our updated model advises students to start research with broad overview tools such as the web and general and discipline-specific encyclopedias, handbooks, and histories so they develop a general understanding of their topics. It then advances students to finding tools – bibliographies, abstracting-and-indexing databases, and catalogs – that locate the specific information on which they can build a foundation of understanding. Finally, it advances the few students who want to specialize and achieve depth in their topics to forward-chaining tools – citation indexes – to find cutting-edge research. The game's backstory places players in the 14th century at the height of the Black Death. Students play on two- to four-person teams, roll a digital die, and travel around the medieval Duchy of Hidgeon, where they are tasked with scrutinizing information in Duchy libraries about plagues past, present, and future in order to help the Duchy's ruler develop a plan of action. Students must prove they are the Duchy's richest, fastest, and most efficient team of researchers.

Game Board Navigation

Game pieces travel the web-based board (see Figure 1) in a clockwise direction. Teams click the electronic die (top left of the board), which moves their game piece forward 1 to 6 spaces. The board comprises these 34 game spaces: 17 Monastery Library spaces, 4 Sage Advice spaces, 2 Garrison spaces, 2 Library Study spaces, 2 Oracle spaces, and one each of the following: Fox Hunt, Manor House, Tavern, Well, Public House, Hospital, and Castle of the Duke.
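
The navigation mechanic is simple enough to sketch in a few lines of code. The Python fragment below is a minimal illustration of the die roll and clockwise movement on a 34-space board; the function names, zero-based positions, and wrap-around behavior are assumptions made for illustration, not the game's actual implementation.

    import random

    BOARD_SIZE = 34  # total game spaces, per the composition listed above

    def roll_die():
        # Simulate the electronic die: a uniform roll of 1 to 6.
        return random.randint(1, 6)

    def advance(position, roll):
        # Move a piece clockwise, wrapping past the last space back to the
        # start (hypothetical; the game may handle the board's end differently).
        return (position + roll) % BOARD_SIZE

    # Example turn: a piece at space 31 that rolls a 5 wraps around to space 2.
    print(advance(31, 5))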

Figure 1. Defense of Hidgeon's game board

After a team's piece lands on a game space, the game issues instructions about how to proceed. Teams reap the greatest benefit from landing on Monastery Library spaces because these spaces pose questions for which teams are rewarded for correct answers; however, teams must do library research to answer the questions correctly. In Figure 2, the "peasant" game piece is located at the Eastern Oracular Library of St. Jerome.

Figure 2. Game piece at the Eastern Oracular Library of St. Jerome

Teams have to examine web pages, online encyclopedia articles, books, edited works, and journal articles from subject-oriented databases and citation indexes to give correct answers to the questions they receive at monastery libraries. For example, players must consult 3 books on reserve and compare their contents to answer the game's question about Black Death mortality rates (Figure 3). The game awards a golden scroll to the team for a correct answer to this monastery library question.

Figure 3. Question from the Eastern Oracular Library of St. Jerome

Special Game Features

Additional spaces vary game play and add research-related information. For example, landing on the Fox Hunt space sends game pieces to the Hospital, which requires players to complete a task at campus libraries to earn their release. An example of a Hospital task is:

  • Go to any campus library. Ask the reference librarian to pretend that she or he has just given you a comprehensive tour of the library's resources and services. Ask the librarian to summarize by telling you the three most important things that you should remember about the library.

Teams have to collect a quota of 18 golden scrolls at the game's monastery library spaces. The game's scoring algorithm penalizes teams for incorrect answers. Correct answers give teams the opportunity to purchase exclusive licenses to monastery libraries and to add 3 citations to their team bibliography. Teams use their bibliographies during challenges to take over an opponent's exclusive licenses. Teams can check top scores on the game's leader board. Although game assets such as gold, exclusive licenses, and golden scrolls are important, a team's accuracy in answering questions receives the heaviest weight in the game's scoring algorithm. A YouTube movie demonstrates game play ("Navigating," 2007). The research team's final report describes and illustrates the game's full functionality and explains its complex scoring algorithm (Markey et al., 2008).
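
The game's actual scoring algorithm is documented in the final report (Markey et al., 2008); the Python sketch below conveys only its general shape as described here. Every weight, name, and formula is a hypothetical assumption illustrating a score in which answer accuracy dominates game assets.

    def team_score(correct, attempted, gold, licenses, scrolls):
        # Hypothetical weights: accuracy dominates, assets contribute modestly.
        # The actual algorithm and weights appear in Markey et al. (2008).
        accuracy = correct / attempted if attempted else 0.0
        assets = gold + 10 * licenses + 5 * scrolls
        return 100 * accuracy + assets

Under weights like these, a team that answers accurately but holds few assets still outranks an asset-rich team that guesses, consistent with accuracy receiving the heaviest weight.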

Game Play

This section tells how the research team recruited undergraduate students to play the game, monitored game play, and collected data during the game.

Recruiting Students to Play the Defense of Hidgeon

The research team recruited students from SI 110, "Introduction to Information Studies," to play the game. The course, taught by this article's eighth author, admits 75 undergraduate students at all levels and from a wide range of majors.

Frost mentioned game play to students only in passing at the beginning of the semester and again one week before this article's first author visited the class. Our inclination was to downplay the game, preferring to gauge student enthusiasm on the game itself, not on a special buildup. On October 30, 2007, Frost introduced Markey to SI 110 students. Her remarks about the game were brief because she did not want to predispose students to thinking about the game in a particular way; instead, she wanted students to develop their own ideas about what the game was teaching them. Her introduction included a demonstration of game play, a summary of the monetary prizes for first ($400 per team), second ($200 per team), and third place ($100 per team), encouragement to sign up on teams to play the game, and notice that she would ask students questions about their game-play experiences after the game ended. Game play began on November 3 and ended on November 29, 2007.

Monitoring Game Play Activity

Of the 75 students enrolled in SI 110, 29 students signed up on 8 teams of 2 to 4 students each. After game play began, the project team studied logs of game-play activity. Only one team played the game over the first weekend, answering 12 of 14 questions correctly and acquiring 8 of the game board's 17 exclusive licenses to monastery libraries. To encourage game play, SI 110's instructor issued an incentive: students would receive a half-letter grade increase if they answered 40% or more of the questions correctly in the course of collecting all 18 scrolls. In response, an additional 20 students signed up on 5 new teams. Overall, 49 students (65%) signed up on 13 teams to play the game. By the following Monday morning (November 12), 6 teams had signed onto the game and attempted to answer at least one question.

Collecting Data about Game Play

While SI 110 students played the game, the project team logged selected transaction data about game play such as questions attempted and answered correctly by type, licenses purchased by type, and time elapsed since the start of the game. When the game ended, we transferred logged data to an Excel workbook for data analysis. Additionally, project team members attended SI 110's three regularly scheduled weekly discussion groups on November 27 and 28, 2007, and conducted focused group interviews. We asked students whether their teams played together or individually, what they learned from playing the game, why some signed-up teams failed to play, and why some students never signed up at all; we also asked them to suggest improvements to the game and to comment on using games to learn about library research and academic topics generally.
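
The logged fields suggest a simple event-record structure. The sketch below shows one plausible way to capture such records in Python; the field names, event labels, and CSV format are assumptions, since the article does not publish the project's actual log schema.

    import csv
    import time

    GAME_START = time.time()  # epoch seconds at the game's start

    FIELDS = ["team", "event", "resource_type", "correct", "elapsed_s"]

    def log_event(writer, team, event, resource_type, correct=None):
        # Record one game-play event, stamped with time elapsed since the start.
        writer.writerow({"team": team, "event": event,
                         "resource_type": resource_type, "correct": correct,
                         "elapsed_s": round(time.time() - GAME_START)})

    with open("gameplay_log.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        log_event(writer, "InfoHunters", "question_answered", "web", True)
        log_event(writer, "InfoHunters", "license_purchased", "monastery")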

Results of Game Play

The Storygame Project team considered "successful teams" to be the 6 teams that met the criteria for the instructor's grade increase – that is, collecting the quota of 18 golden scrolls with at least a 40% accuracy rate – and "unsuccessful teams" to be the 7 teams that failed to meet the criteria.

Game Play Patterns

An analysis of daily logs revealed these 5 game-play patterns:

  1. Instant starters: Teams that began game play immediately after the game's start and, in the absence of competition from opposing teams, met the quota of 18 scrolls and purchased all monastery library licenses within a week of the game's start. Example: InfoHunters team only.
  2. Dropouts: Teams that signed up for game play but dropped out, some failing to earn any scrolls, and others earning one or two scrolls. Examples: Hail, Best, and Blue teams.
  3. Testing the waters: Teams with low levels of game-play activity that eventually became dropouts (#2 above), played in spurts (#4 below), or rushed at the last minute (#5 below). Examples: Conquerors, Wolverines, and Warriors teams.
  4. Pre-Thanksgiving dashers: Teams that played the game in spurts before Thanksgiving break. Some of these teams dropped out entirely; others continued playing only to fulfill the objectives connected with the instructor's incentive. Examples: Warriors and Victors teams.
  5. Last-minute rushers: Teams that rushed to complete game play before the game ended on November 29 so they could meet the criteria for the instructor's incentive. Examples: Authorities and Valiant teams.

Unsuccessful teams exhibited patterns 2, 3, and 4; successful teams exhibited patterns 1, 4, and 5. Two of the game's 7 unsuccessful teams never signed onto the game; 4 of the 5 remaining unsuccessful teams answered a total of 28 questions with a 21% accuracy rate. (The estimated probability of guessing right answers to questions was 0.30.) Except for the Warriors, unsuccessful teams appeared to be testing the waters – that is, trying to determine whether they should invest time and effort in game play. The Warriors attempted 28 questions, answering half of them correctly. Most likely, the Warriors had every intention of meeting the instructor's incentive, but a combination of competing priorities and technical problems that suspended game play during Thanksgiving break prevented them from doing so.

Giving Correct Answers to the Game's Questions

To find correct answers to monastery library questions, teams consulted six types of resources – (1) the web, (2) encyclopedias, (3) books, (4) edited works, (5) journal-article databases, and (6) citation databases – and searched different information retrieval systems: web search engines, library catalogs, subject-oriented databases, and citation indexes.

Successful teams answered 50.9% of monastery library questions correctly. Percentages of correct answers were highest for web, online encyclopedia, and database questions at 67%, 62%, and 62%, respectively – a little over twice what would be expected by chance. Percentages of correct answers were considerably lower for books (43%), edited works (39%), and citation databases (42%).

Doing the Research to Answer Questions Correctly

Players did not have to leave their personal computers to do the research to answer questions for the web, online encyclopedias, and databases. Their higher accuracy rates for these questions may be due to the convenience of doing research online at their computers. Asked to identify the game's easiest questions, students immediately replied "web questions." In fact, students agreed that any question that kept them online at their computers was an easy question.

  • "Web questions. All you had to do was copy the term if they say Google and usually it was transparent in the first link and that was it."
  • "Going to a web site."

Asked about the game's most difficult questions, students chose books and edited works, that is, questions that required them to go to the U-M Library:

  • "I was doing other work, and I just didn't feel like going to the library."
  • "It's just having to get up and go somewhere it's like ... I didn't have enough time [to go to the library]."

Prior to game play, the research team put books and edited works on reserve. After game play, reserves staff confirmed that only 5 of the 51 items we placed on reserve were borrowed, and even these items were checked out a total of 6 times. In interviews, students confirmed that they had not consulted the books and edited works on reserve, answering our direct questions with a chorus of "noes." The evidence is overwhelming – game players guessed at most answers that required them to go to the library.

Game players told us that answering citation database questions was difficult. Their accuracy rates for citation database questions were as low as those for books and edited works questions, despite the online availability of citation databases. One focus-group interviewee attributed the difficulty to the database's interface:

  • "I had more trouble with the citing questions ... which was the ISI Web of Science or something. It would come up with a person's name and there would be 19 of the one topic and then there would be 1 under the same topic name and, like, I would always choose all of the ones that were under the same topic and I would get the answer wrong and I think that would be why. I just felt like maybe the answers were like weird like they were not clear."

Citation searching may be new to undergraduate students. Accustomed to search engines, where they scan lists of web sites and journal articles on the same or similar subjects as their entered words and phrases, students must instead scan lists of journal articles, books, and book chapters that cite the author names they entered. Differences in abbreviations, page numbers, and page ranges produce multiple listings for the same work, requiring searchers to scrutinize intermediary results – a perplexing task for some students, as described above. The bottom line is that, despite being online, citation databases deviate from traditional database searching and require patience, attention, and effort to learn to use effectively.

A Cavalier Attitude about Physical Library Collections

During our discussion of visiting the library, a few students assumed a cavalier attitude about whether the items in the library's physical collections should figure into their research. For them, research meant searching the web or online journal-article databases for recently published works and downloading them to their computer's desktop.

  • "I think realistically the way we do research now as college students applies much more to finding the article on a website through databases and much less through actual books. For example, I used some books for a paper last year and they were checked out in 2000. Like it had been 7 years since the book had been checked out and that's how we're kind of like shifting and I think books may be less relevant towards our needs in terms of what we actually need to know. We're kind of losing the art of finding the book and using the library in that sense but that's just the way it seems it's going."
  • "[Why bother with books when you have] the Internet ... It is kind of ... an easy out. You get all these libraries all over. There's like JSTOR, and you can go through a document, and you can search through that document using 'find,' and you can look for all the relevant words and really quickly skim through hundreds of documents in the time it would take to get to the library."

The Hospital: A Real Show-Stopper

Our intentions with regard to Hospital tasks were good. Tasks introduced players to campus libraries where they could learn how librarians and specific library collections could help them now and in the future. Unfortunately, Hospital tasks were disruptive and ruined the overall flow of the game. Here is what students said about the disruptive nature of the Hospital:

  • "People on [my] team were all pretty friendly so we just kind of sat in a room one day and tried to play but we ran into issues of all of us sitting in the same room trying to play. Like we all had one computer on the game and the rest were doing research ... like when you got put in the Hospital and we kept getting in the Hospital and it's really like a big pain to go get yourself out of the Hospital. We tried to play ... and we all ended up in the same room but two turns in we ended up in the Hospital again, and it just ruined that whole session."
  • "The Hospital that was a big pain in the you-know-where if you're on it and you already have good momentum and all of a sudden you have to stop and go and track down a librarian who like I was once there for 15 minutes so they could try to figure out where the code was. But I mean the idea of having to do that extra thing is a nice idea maybe just incorporate it so [it works]."

A redesign of the Defense of Hidgeon would put an upper limit (perhaps 3) on the number of times a team must perform the Hospital task and comparable tasks that deviate from online game administration.

Team versus Individual Game Play

The research team encouraged students to play the Defense of Hidgeon in teams rather than individually. Students followed our advice, but the Hospital task repeatedly interrupted the online ebb and flow of team play.

Students' suggestions for improving the game included giving one player control while team members observe his or her actions and allowing the controlling player to pass control to fellow team members. Such capabilities were not possible within our time and budget constraints, but, given more time and financial resources, they would be a top priority in a future enhancement of the game.

Learning the General-to-Specific Model

When students were asked what they learned from game play, their comments focused on the "how-to" connected with library research such as learning the names of databases, becoming familiar with a particular database's content, choosing databases using the library portal's Search Tools capability, retrieving useful results, and searching particular databases. In this comment, a student gives us advice about how to order monastery libraries on the game board:

  • "I wonder if you could order the [monastery libraries] like how good their resources were. Web resources might not be as reliable as the last ones were I think databases or encyclopedias. I'm not sure ... Encyclopedias seem like more reliable than websites ... The information is accurate, and it has relevance, it has more relevance because it's more ... more has to be true than Wikipedia, like you can just go to Wikipedia, and it might not be true. Kind of like Monopoly – the first [properties] are always really cheap and then at the end it's really expensive."

In fact, ordering the game's monastery libraries was a very deliberate decision of the Storygame Project team, made in keeping with the General-to-Specific Model. The least reliable sources – libraries specializing in web resources – were the first ones players encountered on the board. Next were libraries whose retrievals were broad-based overviews of topics in the form of encyclopedias and books. At the end of the board were libraries with technical, in-depth, and scholarly treatments of topics. That game players did not recognize this ordering of the monastery libraries troubled us as researchers. It calls into question the very design of the game. It also demonstrates the need for instructors to be forthright about adding information literacy games to the course curriculum: telling students the learning objectives before game play begins, encouraging students to talk about what they are learning during game play, and debriefing students at the end of the game.

Premises for the Development of Information Literacy Games

Based on an analysis of game play and evaluation data, the research team generated premises for the development of information literacy games. Three premises are featured here in a discussion that tells how the project team would improve the Defense of Hidgeon or design a new game to be in sync with each premise.

Premise #1: Game Play That Counts toward Students' Grades in the Course

SI 110 students were slow to sign up on teams and play the game because they did not see a direct connection between the game's Black Death theme and the course's information science content. Although students who played the game concluded that they could apply their newly acquired online searching skills and knowledge to the research for the course's two required papers, and, possibly, research for papers assigned in other courses, students wanted to know up front exactly how they would benefit from game play. Only after the instructor issued an incentive that affected students' final grades did they sign up and play the game. To be in sync with premise #1, instructors would have to tell students what they can expect to learn from game play and grade them on game play.

Premise #2: Game Play That Gives Players Mastery Over One Key Concept at a Time

The Defense of Hidgeon was ambitious and functionally rich. It gave students hands-on experience using several different types of information retrieval systems, exposed them to a wide variety of information sources, and put them in situations in which they made decisions about credibility, audience level, and discipline. No wonder they failed to recognize the General-to-Specific Model – too many other things were happening at the same time! The project team yielded to the temptation to pack more and more into the game because we wanted to teach students as much as possible while we had their attention. Based on our experience, we recommend that future information literacy games give players mastery over one key concept, task, or procedure at a time.

If we were to redesign the Defense of Hidgeon, we would streamline the game, eliminating game spaces or rethinking them so that they reinforce the General-to-Specific Model. We would also rethink the challenge. It could be simplified, requiring teams to post their single best citation in response to a scenario. Alternatively, the challenge could be eliminated entirely and replaced by a "Synthesis" game space at the end of the board that asks players questions designed to reveal the General-to-Specific Model to them.

Premise #3: A Payoff for Leaving the Computer Behind

Students must have concrete evidence that leaving their computer to do research will have a payoff in terms of improving their research or affecting their grades. Putting an upper limit on the requirement for teams to perform Hospital tasks would remedy the ill-will that game players developed toward the Defense of Hidgeon's Hospital feature. To learn how books and edited works compare to other resources, students would have to experience them first-hand in the context of an ongoing information-seeking episode to determine what these resources have to offer. This means that students have to go to the library. Perhaps enhancing Hidgeon's interface to allow passing control between players would make the task of visiting libraries less burdensome for them. However, convincing game players that items in a library's physical collections are still relevant now that so many other items are online will not be an easy task.

A Totally New Game Concept

After generating these premises, the research team tried to think beyond possible modifications to Hidgeon, and toward a totally new game concept that could meet student players where they were. Here are some initial ideas generated as responses to the above premises.

Let us consider a new information literacy game that is a pervasive and unobtrusive presence beside the online tools students use to research, write, and document a writing assignment. The new game would be a single game made up of a collection of narrowly focused mini-games, sensing active players and challenging them to play various mini-games based on student contributions to a shared bibliography on an instructor-assigned topic in an online citation manager. For example, a credibility mini-game would extract a citation from a shared bibliography, present it to one or two active players, and give them a limited amount of time to rate the citation according to its credibility. The game would calculate the closeness of their respective ratings and award points accordingly. Feedback would be immediate; that is, the game would report to players how closely they matched the ratings of their peers so that players could make adjustments to produce better ratings in the future. Occasionally, the game could ask players why they should be concerned with credibility and provide feedback to players who answer incorrectly. Not only would students learn about source credibility assessment by doing it during game play, they would also leave behind a trail of credibility ratings that they could use to decide whether to consult a source from the shared bibliography to complete a course assignment.
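
The closeness calculation described above could be as simple as the following Python sketch, in which the rating scale, the linear formula, and all names are hypothetical illustrations rather than a specification of the proposed game.

    def closeness_points(rating_a, rating_b, scale_max=5, max_points=10):
        # Award more points the closer two players' credibility ratings are,
        # on a 1-to-scale_max scale (all values hypothetical).
        gap = abs(rating_a - rating_b)
        return round(max_points * (1 - gap / (scale_max - 1)))

    # Players rating the same citation 4 and 5 earn 8 of 10 possible points;
    # identical ratings earn the full 10.
    print(closeness_points(4, 5))   # 8
    print(closeness_points(3, 3))   # 10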

If the design of this new game involved competitive bibliography-building (i.e., who can build the biggest, most credible bibliography in X time), it is conceivable that students would exhaust the web and online databases searching for citations and online texts on particular topics and eventually give in to searching the online library catalog, which retrieves items in physical collections. Mini-games could be programmed to award bonuses to players who contribute citations from the online library catalog and even bigger bonuses to players who add summaries to those citations. It may be too ambitious to expect game play alone to reveal the benefits of physical collections. Instead, instructors may have to argue the case for physical collections in discussions following game play.

Conclusion

An evaluation of how undergraduates played a web-based board game and responded to calls for its improvement revealed the potential that games have for teaching students information literacy skills and concepts. Students tolerated game-play requirements that temporarily suspended online game play, but repeated interruptions ruined the flow of the game and their attempts to play as a team. Game play must be targeted with specific, limited objectives. Comprehensive games may fail to hit the mark, confusing students and making them question the value of the exercise. Finally, students want their efforts to be recognized in final grading. For most of the students who played the Defense of Hidgeon, the opportunity to improve their grades was much more important than receiving a monetary award. In fact, they adopted game-play strategies that were more in sync with meeting the instructor's extra-credit incentive than with winning the game.

This article has offered three premises and told how the project team would improve the Defense of Hidgeon or design a new game to be in sync with each premise. Additionally, instructors should brief and debrief students before, during, and after the game to reveal, clarify, and reinforce game objectives, key concepts, and other important ideas that are difficult for students to grasp but essential to their understanding of what the game is teaching them.

Acknowledgments

The authors thank the Gladys Krieble Delmas Foundation and Trustee David H. Stam who provided the support that enabled us to develop the Defense of Hidgeon game, sponsor and evaluate game play, and give monetary awards to game winners.

References

Karl V. Fast and D. Grant Campbell, 2004. 'I still like Google': University student perceptions of searching OPACs and the web. Proceedings of the ASIS Annual Meeting 2004, Information Today, Medford, NJ, pp. 138-146.

James Paul Gee, 2003. What video games have to teach us about learning and literacy. New York: Palgrave Macmillan.

Allison J. Head, 2007. Beyond Google: How do students conduct academic research? First Monday, volume 12, number 8 (August) at <http://firstmonday.org/issues/issue12_8/head/index.html>, accessed 30 July 2008.

Steven Johnson, 2006. Everything bad is good for you: How today's culture is actually making us smarter. New York: Riverhead Books.

Thomas Kirk, 1974. Problems in library instruction in four-year colleges, in Educating the library user, John Lubans, Jr., ed., pp. 83-103. New York: R. R. Bowker.

Karen Markey et al., 2008. Engaging undergraduates in research through a storytelling and gaming strategy: Final report to the Delmas Foundation. Ann Arbor, Mich.: School of Information, University of Michigan at <http://hdl.handle.net/2027.42/58630>, accessed 30 July 2008.

Navigating the Defense of Hidgeon: The Plague Years, 2007 at <http://www.youtube.com/watch?v=u76tW-ne-yY>, accessed 30 July 2008.

Kurt Squire and Henry Jenkins, 2003. Harnessing the power of games in education. Insight, volume 3: 5-30, at <http://www.iaete.org/insight/articles.cfm?&id=26>, accessed 30 July 2008.


Copyright © 2008 Karen Markey, Fritz Swanson, Andrea Jenkins, Brian J. Jennings, Beth St. Jean, Victor Rosenberg, Xingxing Yao, and Robert L. Frost

doi:10.1045/september2008-markey