Pilot Study

To gather information on the effectiveness of using Multi-User Virtual Environments as assessment tools, the SAVE Science project conducted a pilot study using its first completed assessment module, “Sheep Trouble.”

SAVE Science researchers hypothesized that students’ interactions in the “Sheep Trouble” assessment module would reveal their level of understanding of the adaptation of organisms to a given environment and provide detailed information on misconceptions that remained after completion of the related classroom-based curriculum.

In this module, students are sent back in time to solve a problem on a medieval sheep farm in a virtual world called Scientopolis. They are asked to gather information around the farm and determine why newly imported sheep are not thriving. A local judge has decided that these sheep must be destroyed so their “bad magic” does not spread, unless each student can formulate a convincing hypothesis with supporting evidence.

The module is designed to assess understanding of the concept of adaptation of organisms to a local environment by asking students to investigate possible causes of illness among the newly imported sheep. In completing the assessment task, students can interact with two computer-based characters (a farmer and his brother) and with the sick new sheep and the healthy indigenous sheep. Students can measure the sheep’s legs, bodies and other features, and can access information on each sheep’s weight, age and gender. They can also observe embedded data sources such as signs and posters, as well as tacit clues such as terrain and vegetation.
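
To make these in-world data sources concrete, the sketch below shows one way the observable attributes of a single sheep could be represented. It is a minimal illustration only; the class name and fields are our own assumptions and do not reflect the module’s actual implementation.

    from dataclasses import dataclass

    @dataclass
    class ObservableSheep:
        """One sheep a student can examine in-world. The field names are
        illustrative assumptions, not the module's actual data model."""
        sheep_id: str
        origin: str            # e.g. "imported" or "indigenous" (assumed labels)
        healthy: bool
        leg_length_cm: float   # measurable by the student
        body_length_cm: float  # measurable by the student
        weight_kg: float       # available on inspection
        age_years: float       # available on inspection
        gender: str            # available on inspection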

Students collect data, conduct analyses and take notes using a personal “Sci-Tools” electronic tool kit.

Nineteen students participated in the pilot study. Following the pilot, we examined the data automatically collected about student actions in the assessment module. First, we found that all students gathered data before reaching conclusions. They asked questions of the two non-player characters, Farmer Brown and Bill Jennings Brian (multiple times in some cases), and they observed sheep characteristics. Further, when students answered the question at the end, they used their observations to support their assertions. For example, when asked if they knew why the sheep were dying, several students said: “grass is dead in a lot of areas” or “not the type of land they [new sheep] are used to – mostly flat where they came from, while hills here.”
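
An ordering claim like this can be checked directly against the logged actions. The sketch below is a minimal illustration that assumes a hypothetical log of (student_id, timestamp, action) records; the field names and action labels are our assumptions, not the module’s actual logging format.

    # Assumed labels for actions that count as data gathering.
    GATHERING_ACTIONS = {"talk_to_npc", "measure_sheep", "read_sign"}

    def gathered_before_answering(events):
        """Return the students whose first data-gathering action precedes
        their first submitted answer, given (student_id, timestamp, action)
        records."""
        first_gather, first_answer = {}, {}
        for student_id, timestamp, action in sorted(events, key=lambda e: e[1]):
            if action in GATHERING_ACTIONS:
                first_gather.setdefault(student_id, timestamp)
            elif action == "submit_answer":
                first_answer.setdefault(student_id, timestamp)
        return {s for s, t in first_gather.items()
                if s in first_answer and t < first_answer[s]}

In the pilot data, all nineteen students would pass such a check, since each of them talked with a farmer or measured sheep before submitting an answer.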

Some students showed a misunderstanding of the concept of data as evidence by answering with an inference as opposed to an observation, e.g. “[they] need different grass” or “[they are] not eating enough.”

We included two questions from the district assessment test in an embedded interview given by Farmer Brown at the end of the assessment module, to see whether giving those questions a context improved students’ ability to answer them relative to the success rate on the district tests. We found that, on average, students who took the time to look at multiple sheep, sometimes examining the same ones more than once, were more likely to answer these questions correctly. This indicates to us that contextualizing the questions does improve students’ ability to answer them.
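
The comparison behind this finding amounts to grouping students by how much of the available evidence they examined. The sketch below assumes hypothetical per-student summaries, pairs of (sheep_examined, answered_correctly), and an arbitrary threshold; it illustrates the kind of grouping involved rather than the project’s actual analysis.

    def correct_rate_by_sheep_examined(records, threshold=2):
        """Compare the share of correct answers between students who examined
        fewer than `threshold` distinct sheep and those who examined at least
        that many. `records` holds (sheep_examined, answered_correctly) pairs;
        this summary format is an illustrative assumption."""
        groups = {"fewer_sheep": [], "more_sheep": []}
        for sheep_examined, answered_correctly in records:
            key = "more_sheep" if sheep_examined >= threshold else "fewer_sheep"
            groups[key].append(1 if answered_correctly else 0)
        return {key: (sum(vals) / len(vals) if vals else None)
                for key, vals in groups.items()}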

In terms of engagement, we found that 11 of the 19 students went back into the virtual world and measured more sheep or talked with a farmer again after completing the embedded interview questions, supporting our classroom observations of student interest.

Students’ Interactivity in Sheep Trouble — What We Know Now

Problem Solving

ALL students:

  • gather data using the Sci-Tools;
  • “talk” with both virtual farmers;
  • observe multiple sheep (Range: 4-14, Mean: 9.05); and
  • turn the data they collected into information to answer the problem.