OU blog

H809: Activity 12.4 & 12.5: Second Life research in education


I looked at a few journal papers and some more general articles on accessibility in Second Life, as this is an area that impacts my work and in which I am especially interested. I have included two journal articles and two magazine reports in my blog.

The research I examined all seemed to involve practitioner-researchers using SL as a tool within their own educational settings, with analysis based on the assessments the learners completed, questionnaires/interviews and practitioner perceptions. Little account was taken of the effects that being assessed, and the researchers' close involvement with the learners, could have on the results. All the research I looked at was qualitative and studied constructivist, problem-based learning.

 

Good, J., Howland, K. and Thackray, L. (2008) 'Problem-based learning spanning real and virtual words: a case study in Second Life', Research in Learning Technology, 16(3), pp. 163-172.

Project teams were paired with clients from Sussex Learning Network partner institutions to design learning experiences that corresponded to real curriculum needs within the vocational learning arena. The clients were asked to identify issues which were difficult, dangerous or impossible to teach in real life, to hold an initial meeting with the student team to outline their problem area, without offering a solution, and to provide additional input if required during the project.

Assessment by portfolio:
Production of a machinima (a short film shot within SL) which showed the highlights of their learning experience
A group document describing the project overall
An individual document grounding the learning experience in the relevant literature, reflecting on the overall experience, providing a critique of their learning experience and engaging in a broader discussion of the value of IVWs for learning.

The project was supported by eight two-hour sessions, in which students were introduced to a range of learning theories, given initial orientation and building classes in SL, and mentored by a staff team experienced in interactive technologies, learning theory and SL.

Schiller, S.Z. (2009) 'Practicing Learner-Centered Teaching: Pedagogical Design and Assessment of a Second Life Project', Journal of Information Systems Education, 20(3), pp. 369-381.

The Second Life project was implemented in an MBA-IS course in which thirty-two students were randomly assigned to eight teams. Each team managed an avatar and completed a series of business-related activities.

Teacher facilitation with guided activities
Snapshots
Chat transcripts
Reflection essays
Group presentation in class
Post-activity survey

As this was part of an official course, the assessments were valuable to the students and also gave feedback to the practitioner-researchers. Results may be biased, however, as students may have perceived a need to give positive feedback in order to pass the course.

 

Springer, R. (2009) 'Speech in a Virtual World, Part II', Speech Technology Magazine, 14(7), p. 42.

Programs have been designed specifically to integrate assistive technologies with SL so disabled users can participate. Two of these are TextSL and Max, the Virtual Guide Dog.

TextSL, a free download, harnesses the JAWS engine from Freedom Scientific to enable visually impaired users to access SL using the screen reader. TextSL supports commands for moving one's avatar, interacting with other avatars, and getting information about one's environment, such as the objects or avatars in the vicinity. It will also read the text in the chat window. The program, created by Eelke Folmer, an assistant professor of computer science and engineering at the University of Nevada, Reno, runs on Windows, Mac OS, and Linux.

Max, the Virtual Guide Dog, was created as a proof of concept to show that SL could be made accessible to people with all types of disabilities. Max attaches to one's avatar and uses radar to move the user and interact with objects. Max can tell users what they can reach out and touch, printing the information into the chat window. It can also help a user find a person or place and transport the user to a desired location. If a device or object has a .WAV file associated with it, Max can play the audio file as well.

 

Springer, R. (2009) 'Speech in a Virtual World', Speech Technology Magazine, July/August.

http://www.speechtechmag.com/Articles/Column/Voice-Value/Speech-in-a-Virtual-World-55189.aspx

An estimated 50 million to nearly 200 million people use virtual worlds like Second Life (the range can be attributed to an overlap of users among sites and a differentiation between registered and active users). We don't have hard numbers regarding how many users have disabilities, but statistics on video gaming offer insight. As many as 20 percent of the estimated 300 million to 400 million video gamers are disabled, a 2008 survey by PopCap revealed. Considering that roughly 15 percent of the U.S. population is disabled, people with disabilities are overrepresented in the gaming market. Those surveyed reported more significant benefits from playing video games than their nondisabled counterparts.

 
