
D-Lib Magazine
January 2006

Volume 12 Number 1

ISSN 1082-9873

The Digital Library for Earth System Education Provides Individualized Reports for Teachers on the Effectiveness of Educational Resources in Their Own Classrooms

 

Kim A. Kastens
Lamont-Doherty Earth Observatory &
Department of Earth & Environmental Sciences
Columbia University
<kastens@ldeo.columbia.edu>

Neil Holzman
Lamont-Doherty Earth Observatory
Columbia University
<nholzman@ldeo.columbia.edu>


Abstract

We have developed and tested a system in which teachers and their students who have used an educational resource in the Digital Library for Earth System Education (DLESE) both submit on-line reviews of that resource using DLESE's Community Review System. We aggregate the students' reviews and generate an individualized report for the instructor on how his or her own students view the resource. The report for science teachers is formatted to show how well, on several dimensions, the resource worked. The report for science education professors is formatted to highlight how well students reflect on their own learning processes and identify what makes an educational resource effective.

Introduction

The Digital Library for Earth System Education [1] is designed to provide: (a) ready access to high-quality educational materials about the Earth and environment, (b) information, tools and services to maximize the usefulness of the materials provided; and (c) a community center that fosters interaction, collaboration and sharing among educators, learners and resource creators [2, 3, 4].

DLESE's Community Review System (CRS) [5, 6, 7] is a mechanism to gather, aggregate and disseminate information about DLESE resources based on the experience of people who have used the resources for teaching and learning. CRS's on-line questionnaires [8] probe the resource's scientific accuracy, ease of use, robustness, quality of documentation, ability to motivate students, overall pedagogical effectiveness, and effectiveness with specific audiences [9]. In addition, we solicit teacher-to-teacher or learner-to-learner tips about how to make best use of the resource. There are two tracks through the CRS review process: one for teachers and one for learners. The tracks address similar broad topics, but the teacher track includes some specific questions that are not asked of students, for example, how readily the resource can be adapted for different audiences.

CRS disseminates information by several means. We email individual community and specialist reviews to the resource creator. We web-disseminate Teaching Tips, a graphic summary of ratings on the community reviews, a tabulation of the community's recommendations about specific audiences, and an Editor's summary. We also provide annotation metadata, which library builders use to generate links to CRS-provided Comments, Teaching Tips and review components [10].

The audiences for CRS's information have historically included potential users of the resource deciding whether to try it, current users of the resource seeking tips on how to use it better, the resource creator for guidance in improving the resource, and the CRS Editorial Review Board, who combine community feedback with specialist reviews to decide which resources to showcase in the DLESE Reviewed Collection. This article reports on a new service aimed at an additional audience: instructors seeking insights about how their own students evaluate a DLESE resource.

Sources and Audiences for CRS Information

Sources:
  • Teachers who have used a DLESE resource to teach
  • Learners who have used a DLESE resource to learn

Audiences:
  • Potential users of the resource, to decide if it is well suited for their use
  • Current users, for tips on how to use the resource effectively
  • CRS Editors, as input into the Reviewed Collection selection process
  • (NEW) Instructors, for insights about their own students' evaluation of the resource

This project represents one tendril of a growing effort among digital library builders to provide services that are individualized for specific users, including individualized portals [11], digital reference desks [12], and library-maintained repositories for individuals' data and documents [13]. It also represents one approach to the puzzle of how to increase participant involvement in the growth of educational digital libraries [14] by aligning library services with the incentive structure that motivates teachers and students.

Instructor's Individualized Reports

Goals for the Instructor's Individualized Reports

For teachers of Earth or Environmental Science, our goal was to provide insights about how well a specific DLESE resource was working in their own classrooms based on feedback from their own students.

For teachers of science education, our goal was to reveal how insightfully their students could reflect upon and articulate their own learning processes, and how wisely they could select digital educational resources.

For students, we hoped that the process of articulating whether an educational resource helped them learn would increase their metacognitive awareness of their own learning processes, and thus help them learn better [15].

For the broader DLESE community, we wished to increase the flux of user reviews into the CRS, and thus build up the information content of the publicly disseminated CRS annotation collection.

For resource creators, we hoped to catalyze dialog between resource creators and teachers/users around the insights emerging from the instructor's report.

Finally, for education researchers, we wondered whether there might be consistent, systematic differences in how teachers and students in the same classroom perceive the same educational activity.

How the Individualized Reports Are Constructed

Teachers assign their students to fill out CRS learner-track reviews for resources that they have used for learning. Students use the standard review form, including a "group-identifier" code provided by their teacher. CRS then aggregates all the reviews into an Instructor's Individualized Report. We produce reports in two formats: one for science teachers and one for science education instructors.

For science teachers (Figure 1), the report is designed to reveal how well an individual resource worked. For each rubric, we generate a bar graph showing how the student group rated that resource on that parameter. Below the bar graph, we compile students' and teacher's text comments. To encourage candor, students' names are not associated with individual comments or quantitative rankings; however, the instructor is provided with a list of the students who provided a review of that resource so they can receive credit for completing the activity.

Image of a page from an Instructor's Individualized Report for a science teacher

Figure 1: One page of an Instructor's Individualized Report for a Science instructor. The data are aggregated across students within each rubric to provide a visual summary of where the students, as a group, thought the strengths and weaknesses of the resource were. The instructor's review is shown on the same graph to highlight similarities and differences between the teacher's and students' evaluation of the resource.
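
The aggregation behind this report format is straightforward to sketch. The following Python fragment is a minimal illustration (not the CRS codebase) of how learner-track reviews for one class and one resource might be grouped into per-rubric rating histograms and pooled, anonymized comments; the field names are assumptions made for the sake of the example.

    from collections import Counter, defaultdict

    def aggregate_for_science_teacher(reviews, group_id, resource_id):
        """Aggregate one class's learner reviews of one resource.

        `reviews` is assumed to be a list of dicts with keys "group_id",
        "resource_id", "student_name", "ratings" (rubric -> numeric rating
        or None) and "comments" (rubric -> free text).
        """
        selected = [r for r in reviews
                    if r["group_id"] == group_id and r["resource_id"] == resource_id]

        histograms = defaultdict(Counter)   # rubric -> rating value -> count
        comments = defaultdict(list)        # rubric -> comment texts, no names attached
        reviewers = set()                   # listed separately so students get credit

        for review in selected:
            reviewers.add(review["student_name"])
            for rubric, rating in review["ratings"].items():
                if rating is not None:              # skip "not applicable" answers
                    histograms[rubric][rating] += 1
            for rubric, text in review["comments"].items():
                if text.strip():
                    comments[rubric].append(text)   # kept anonymous in the report

        return {"histograms": dict(histograms),
                "comments": dict(comments),
                "reviewers_for_credit": sorted(reviewers)}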

For instructors of science education courses (Figure 2), the report is designed to reveal how insightful the students were about their own learning process and how well they were able to identify and articulate the strengths and weaknesses of digital education resources. In this case, we aggregate the data by student rather than by resource, and we don't assume that the whole class has reviewed the same resource.

Image of a page from an Instructor's Individualized Report for a science education teacher

Figure 2: One page of an Instructor's Individualized Report for a Science Education instructor. In this format, the review material is aggregated by student and the emphasis is on the qualitative comments rather than the quantitative ratings. This format is designed to reveal whether pre-service teachers have reflected insightfully on their own learning processes, and how discerning they are in their evaluation of digital educational resources.
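
For the education-format report, the same review records are simply re-keyed. A minimal sketch, again with hypothetical field names, of grouping reviews by student rather than by resource:

    from collections import defaultdict

    def aggregate_for_education_instructor(reviews, group_id):
        """Group one class's reviews by student; each student may have
        reviewed a different resource."""
        by_student = defaultdict(list)
        for r in reviews:
            if r["group_id"] == group_id:
                by_student[r["student_name"]].append(
                    {"resource_id": r["resource_id"],
                     "ratings": r["ratings"],
                     "comments": r["comments"]})
        return dict(by_student)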

The reports are generated on-the-fly from a Web-based system that uses dynamic HTML code for graphics and formatting. Data are drawn from a database system which stores reviews indexed by course, instructor, resource, reviewer, and date. Reports are produced as PDF files to facilitate dissemination by email.
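
The record structure and index keys implied by that description can be sketched as follows. This is an illustration of the kind of schema involved, not the actual CRS database; all table and column names are assumptions.

    import sqlite3

    conn = sqlite3.connect("crs_reviews.db")
    conn.executescript("""
    CREATE TABLE IF NOT EXISTS review (
        review_id   INTEGER PRIMARY KEY,
        course_id   TEXT NOT NULL,      -- instructor-supplied group identifier
        instructor  TEXT NOT NULL,
        resource_id TEXT NOT NULL,      -- DLESE resource identifier
        reviewer    TEXT NOT NULL,
        track       TEXT NOT NULL,      -- 'educator' or 'learner'
        submitted   DATE NOT NULL,
        ratings     TEXT,               -- rubric ratings, e.g. stored as JSON
        comments    TEXT                -- free-text comments, e.g. stored as JSON
    );
    -- indexes matching the lookups made when a report is requested
    CREATE INDEX IF NOT EXISTS idx_review_course   ON review (course_id, resource_id);
    CREATE INDEX IF NOT EXISTS idx_review_resource ON review (resource_id, submitted);
    """)
    conn.close()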

Pilot Project: Methods

College Pilot

In the fall of 2004, we recruited two college professors for an initial trial of the Instructor's Individualized Report service: one teaching an introductory course in a Geoscience department and one teaching an Earth Science course for undergraduates in a Science Education department.

The Geoscience professor assigned DLESE resources both as in-class laboratory activities and as readings in a Virtual Textbook. Students were offered extra credit for completing reviews. To facilitate the review process, we created a version of the Virtual Textbook Table of Contents [16] that had links directly into the CRS, with the group identifier and resource ID pre-embedded in the link URL.
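
The mechanics of such a pre-built link are simple: the group identifier and resource ID travel as query parameters, so the student never has to type them. The entry-point URL and parameter names below are hypothetical, for illustration only.

    from urllib.parse import urlencode

    def review_link(group_id, resource_id,
                    base="http://crs.dlese.org/review"):   # assumed entry point
        """Build a CRS review link with the class and resource pre-filled."""
        return base + "?" + urlencode({"group": group_id, "resource": resource_id})

    # One such link for each chapter of the Virtual Textbook table of contents:
    print(review_link("GEO101-F2004", "EXAMPLE-RESOURCE-ID"))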

The Science Education professor assigned students to find resources of interest in the DLESE Discovery System. They entered the CRS via the "Review this Resource" link in the Discovery System's resource description.

We kept in email contact with the professors throughout the college pilot. One report, covering the entire semester, was prepared for each professor at the end of the pilot. We debriefed the Geoscience professor informally by phone and the Science Education professor in a face-to-face meeting.

K-12 Pilot

In Spring 2005, we expanded the project to include five K-12 teachers and their students from public schools in Maryland, New Jersey (2), Connecticut and Illinois. Each of the teachers selected one or more DLESE resources for use in teaching their eighth or ninth grade Earth Science class(es); in most cases, these were resources they had used successfully in previous years. We facilitated entry to the review process by creating a pilot project web area [17] with links to view and review each teacher's resources.

The pilot yielded a total of 15 teacher-resource pairings, or "enactments." For each enactment, the teacher submitted an educator-track review, and the students submitted learner-track reviews. In some cases, the student reviews were created by small groups of students working together at one computer. On average, 17.2 student reviews were submitted per enactment, with a range from 6 to 57. Where a teacher taught more than one section of the same class, the sections shared a group identifier and were pooled for analysis. Rather than one large report at the end of the semester, K-12 teachers received separate reports for each resource.

After asking the teachers' permission, we also forwarded reports to the resource creators. Students', teachers', and school names were removed from the version of the report sent to resource creators. The K-12 teachers were debriefed in semi-structured telephone interviews, one at a time. We also interviewed three resource creators.

Pilot Project: Observations

Report for Science Education Class

We have only one instantiation of an Instructor's Report for a science education class. However, we can say that the education-format report makes patently obvious which students have reflected deeply on their learning process as users of the digital resource, and which students have not. The format also reveals students who think substantively about one aspect of resources (e.g., "bugs and technical difficulties") while having little to say about other aspects (e.g., "pedagogical effectiveness").

The instructor reported that the education students liked doing CRS reviews. They valued having the concrete product of the formal review, which was echoed back to them by email, as documentation of their completion of this evaluative activity. They felt proud that their professional opinion was being sought in an authentic context, for use by an audience beyond their professor.

The best reviews from the science education students are among the most substantive that the CRS has received. These reviewers are from a generation that takes web resources seriously; they are in a pre-professional program attuned to the effectiveness of educational methods; and the format of the assignment apparently motivated them to invest appropriate time and effort. Science education classes seem to be fertile ground for recruiting insightful reviewers.

Science Educators' Reports

In the college Geoscience course, the combination of assigning a large number of DLESE resources and making the reviews an extra-credit activity meant that the distribution of reviews was spotty. The professor also noted that the students who gravitated to the extra-credit activity were either at the high end or the low end of the student continuum. In the future, we will advise interested instructors to choose one or a few resources and to structure the reviewing activity as an assignment rather than as an extra-credit opportunity.

Providing single-resource reports as each resource is completed was more satisfactory than providing one large report at the end of the semester. The instructors can digest the information more readily, they can potentially modify their teaching practice while they still have the same students in the classroom, and the single-resource reports are suitable for dissemination to resource creators.

Eighth and ninth grade students are capable of creating interesting individual reviews that display some insight about what makes a good learning resource and some reflection on how they and other students learn. For example, one student wrote: "I liked how I had to click on all the time periods or pictures in order to continue on. This makes you have to read the information. This is good for kids who don't like to read from the textbook." Two teachers commented that building up the importance of the review process in the minds of the students was essential to getting useful written comments. Even those participants who had initially questioned the value of an individual 8th grader's review felt that meaningful trends and preferences could emerge when dozens of young learners' reviews were aggregated.

In the evaluation interviews, teachers indicated that they found the reports helpful in understanding the interplay between the resource and their students. One teacher commented that the student reviews pointed out things he hadn't been aware of in class; another said the report provided answers to questions the instructor would not have asked in class; a third said he was surprised to find that students were having difficulties. Some teachers reported benefits for students as well as for themselves, e.g., doing reviews "...got the kids to think about how they're learning...a great aspect of having the kids do reviews."

Resource creators also found value in the reports. They seemed most interested in the side-by-side comparison between teacher's and students' evaluations, and in the written comments. The written comments, rather than the quantitative ratings, seemed most likely to influence future revisions of the resource. All resource creators interviewed said they would be willing to engage in a dialog with teachers who have used the resource with their students.

Do Teachers and Students Agree?

We made some basic comparisons of the ratings from teachers and learners in the K-12 data set, posing the question: "Do teachers and learners observing the same resource in the same class perceive a similar quality of educational experience?" There are eight rubrics that appear on both the educator-track and the learner-track questionnaires, addressing bugs and technical difficulties, factual errors, technical documentation, scientific documentation, site organization and layout, generation of interest and attention on the part of the student, generation of curiosity on the part of the student, and evidence of student learning.

To get an overall measure of students' and teachers' satisfaction with each resource used, we first averaged each student's ratings across rubrics to create a mean rating for that student, and then averaged across students within an enactment to calculate an overall student mean rating for that enactment. Likewise, we averaged teacher ratings across rubrics within each enactment to calculate an overall teacher mean rating. Answers of "not enough information to judge" and "not applicable" were not included. Figure 3 shows a comparison of the overall student mean rating and the overall teacher mean rating for each enactment. These comparisons need to be viewed with caution, since the wording of the rubric choices seen by teachers and students differed. However, taken at face value, these data suggest that in an intriguing number of cases, teachers have a better overall impression of an educational resource than do the students in the same classroom.

Scatterplot showing a comparison between teacher mean scores and overall student mean scores

Figure 3: Scatterplot comparing overall teacher mean scores and overall student mean scores. Each data point represents one class-resource combination (i.e., one "enactment"). If a data point falls above the 1:1 diagonal line, the teacher gave the resource higher scores than did students in the same classroom. Asterisks mark enactments where the teacher and student overall mean scores differed significantly according to a t-test. This comparison needs to be viewed with caution, because the wording necessarily differs on the student and teacher rubrics. But taken at face value, these data suggest that teachers may often have a better overall impression of resources than do students in the same classroom.
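
A minimal sketch of this two-stage averaging, assuming the ratings have already been loaded into a simple in-memory structure (enactment -> teacher ratings and a list of per-student ratings, with None standing in for the excluded answers):

    from statistics import mean

    def enactment_means(ratings):
        """ratings: {enactment: {"teacher": {rubric: score or None},
                                 "students": [{rubric: score or None}, ...]}}"""
        results = {}
        for enactment, data in ratings.items():
            # per-student mean across rubrics, then mean across students
            student_means = [mean(v for v in s.values() if v is not None)
                             for s in data["students"]
                             if any(v is not None for v in s.values())]
            overall_student = mean(student_means)
            # teacher mean across the same rubrics
            overall_teacher = mean(v for v in data["teacher"].values() if v is not None)
            results[enactment] = {"teacher": overall_teacher, "students": overall_student}
        return results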

To investigate whether teachers and students differed in what they thought were strong and weak aspects of the group of resources, we averaged within each rubric across enactments to create a student mean rating and a teacher mean rating for each rubric. Figure 4 shows these comparisons. Taken as a group, neither teachers nor students found much to criticize about the resources' documentation, ease of use or technical robustness. The most striking differences are in the "motivational or inspirational for learners" categories. Once again, these analyses need to be viewed with caution because the numbers are small and the wording of the rubrics necessarily differs on the teacher and learner questionnaires. Nonetheless, a red flag should go up when, under "Generation of Interest and Attention," a teacher review states "(4) Interest was widespread and sustained throughout use of this resource" while a majority of students in that teacher's class selected "(2) It was OK" or "(1) I was bored by this resource."

Bar chart showing rubrics for both teacher and learner surveys

Figure 4: For all of the rubrics that appear on both teacher and learner surveys, we averaged across all the students who filled out that rubric and all the teachers who filled out that rubric. Bar chart compares the mean teacher and student ratings for each rubric. The most dramatic difference is on the two "Motivational or inspirational for learners" rubrics; the teachers ranked the resources significantly higher on these rubrics than did the learners.
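
The per-rubric comparison in Figure 4 pools ratings the other way: within each shared rubric, across everyone who answered it. A sketch using the same hypothetical data structure as above:

    from collections import defaultdict
    from statistics import mean

    def rubric_means(ratings):
        """Mean teacher and mean student rating for each rubric, pooled
        across all enactments."""
        teacher_scores = defaultdict(list)
        student_scores = defaultdict(list)
        for data in ratings.values():
            for rubric, v in data["teacher"].items():
                if v is not None:
                    teacher_scores[rubric].append(v)
            for s in data["students"]:
                for rubric, v in s.items():
                    if v is not None:
                        student_scores[rubric].append(v)
        return {rubric: {"teacher": mean(teacher_scores[rubric]),
                         "students": mean(student_scores[rubric])}
                for rubric in teacher_scores if rubric in student_scores}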

Conclusion

The Instructors' Individualized Reports from DLESE's Community Review System leverage the digital character of the library to generate insights for instructors about the nature and quality of the interaction between resources in the library and a particular group of resource users: the students in that instructor's own class. Results from the pilot project provide evidence that teachers find insights that may help them modify their instructional practice, that resource creators find insights that may help them revise their resources, and that some students are drawn to reflect metacognitively on their own learning processes.

Acknowledgements

We note, with admiration, the technical prowess of Robert Arko, John Weatherley, and Katy Ginger, who built the infrastructure for gathering, aggregating, disseminating, and displaying CRS information. We thank the following teachers in our college and K-12 pilots for their time and insights: Dr. Russanne Low, DLESE Program Center, Boulder, CO (formerly University of Minnesota, Minneapolis, MN); Dr. Christopher DiLeonardo, Foothill College, Los Altos Hills, CA; Mr. Tim McCollum, Charleston Middle School, Charleston, IL; Mrs. Christine Girtain, Toms River South High School, Toms River, NJ; Ms. Margaret A. Holzer, Chatham High School, Chatham, NJ; Mr. Keith McKain, Colonel Richardson High School, Federalsburg, MD; and Mr. Jeffrey Thomas, Fairfield Warde High School, Fairfield, CT. Susan Buhr and Tim Weston of the DLESE Evaluation Core Services group helped us plan and interpret the evaluative interviews.

This work was supported by the National Science Foundation through grants DUE00-85827, DUE02-26292, and EAR03-05092. This is Lamont-Doherty Earth Observatory contribution number 6865.

Notes and References

[1] DLESE home page, <http://www.dlese.org>.

[2] Marlino, M., T. Sumner, D. Fulker, C. Manduca, D. Mogk (2001). The Digital Library for Earth System Education: Building Community, Building the Library. Communications of the ACM, v.44 n.5, p.80-81, May 2001. On-line at <http://doi.acm.org/10.1145/374308.374356>.

[3] Manduca, C. and D. Mogk (2000). The Digital Library for Earth System Education (DLESE): A Community Plan. Final Report to the National Science Foundation (NSF), Grant 99-06648. June 2000. On-line at <http://www.dlese.org/documents/plans/CommPlanFinal_secure.pdf>.

[4] Wright, M., M. Marlino, T. Sumner (2002). Meta-Design of a Community Digital Library. D-Lib Magazine, May 2002, 8(5). On-line at <doi:10.1045/may2002-wright>.

[5] DLESE's Community Review System, <http://crs.dlese.org>.

[6] Kastens, K. A., and John C. Butler (2001). How to identify the "best" resources for the reviewed collection of the Digital Library for Earth System Education, Computers and the Geosciences, v. 27(3), 375-378. On-line at: <http://www.ldeo.columbia.edu/DLESE/collections/CGms.html>.

[7] Kastens, K. (2005). "The DLESE Community Review System: Gathering, Aggregating, and Disseminating User Feedback about the Effectiveness of Web-based Educational Resources." Journal of Geoscience Education, v. 53, pp.37-43. On-line at <http://www.nagt.org/files/nagt/jge/abstracts/Kastens_v53n1.pdf>.

[8] The CRS on-line questionnaires are viewable at <http://crs.dlese.org/examples/rubric_teacher.html> and <http://crs.dlese.org/examples/rubric_learner.html>.

[9] DLESE Reviewed Collection (DRC) Best Practices, <http://www.dlese.org/Metadata/collections/drc-best-practices.htm>.

[10] Arko, R., K. Ginger, K. A. Kastens, and J. Weatherley (2006), in preparation for D-Lib Magazine.

[11] Gibbons, S. (2003). "Building upon the MyLibrary concept to better meet the information needs of college students." D-Lib Magazine, 9(3). On-line at <doi:10.1045/march2003-gibbons>.

[12] Lankes, R. D. (2003). "Current state of digital reference in primary and secondary education." D-Lib Magazine, 9(2). On-line at <doi:10.1045/february2003-lankes>.

[13] Foster, N. F. and S. Gibbons (2005). "Understanding faculty to provide content recruitment for institutional repositories." D-Lib Magazine, 11(1). On-line at <doi:10.1045/january2005-foster>.

[14] Giersch, S., E. A. Klotz, F. McMartin, B. Muramatsu, K. A. Renninger, W. Shumar and S. A. Weimar (2004). "If you build it, will they come? Participant involvement in digital libraries." D-Lib Magazine, 10(7/8). On-line at <doi:10.1045/july2004-giersch>.

[15] Donovan, M. S. and J. D. Bransford, Eds. (2005). How Students Learn: Science in the Classroom. Washington, DC, National Research Council, Division of Behavioral and Social Sciences and Education.

[16] The Introductory Geoscience Virtual Textbook. View at <http://crs.dlese.org/testbed/Textbook/>.

[17] K12 Pilot Project. On-line at <http://crs.dlese.org/K12pilot/>.

Copyright © 2006 Kim A. Kastens and Neil Holzman

doi:10.1045/january2006-kastens