Using Embedded Agents to Support Learning

By: Alise Brann, Tracy Gray, Kristin Ruedel, and PowerUp WHAT WORKS


Animated agents and tutors can give your students with disabilities targeted, just-in-time supports for learning.

Agents and tutors are the lifelike characters in multimedia software and online applications that pop up on the screen to explain rules, provide hints, or prompt the user to use the program's features. They can be human or nonhuman, animated or static. You may have encountered agents in the form of the Microsoft Office paperclip assistant ("Clippy") or while shopping online. Some businesses, such as IKEA, have developed agents with some degree of artificial intelligence to help shoppers find the information and answers they are looking for without having to call customer service.

Using Agents in Your Classroom

As technologies improve, more software programs and websites are using agents to support users with tasks such as navigation, solving problems, finding resources, or contacting support.

Students can make use of these agents in a variety of ways. Multimedia agents have been found to:

  • Increase interest and lessen content difficulty
  • Serve as an effective mentor or tutor
  • Simulate peer tutoring

Multimedia environments with well-designed agents can provide just-in-time prompts that support students' learning of content. Most students enjoy agents and find their advice valuable.

Although not all software tools make use of agents or helpers, many tools are adding this feature. The use of multimedia agents can be a great way of providing multiple options for representation, engagement, and action.

Digital agents or tutors can provide students with supplemental instruction and guided practice on any number of academic skills. For example, many reading programs now use agents to help students build skills and fluency or to prompt the application of comprehension strategies. Similarly, agents are embedded in simulation software that teaches mathematics and science.

Multimedia environments that use agents can support students' understanding of content area concepts and the relationships among ideas and concepts in a discipline.

Choosing Programs with Animated Agents

When evaluating multimedia programs for your classroom, determine both the type of interaction the agent has with users and the quality of interaction it solicits from them. Look for programs in which:

  • Agents ask questions that promote higher-order thinking (why, what if, and how questions).
  • Agents do not limit or define student thinking about a topic — beware of agents that provide a student with shallow definitions of key concepts and look instead for tools that represent multiple perspectives and encourage deep thinking and reflection.
  • Agents are personalized, referring to the student by name or as "you" (e.g., "Now, I'm going to help you get started …").

What the Research Says

A series of side-by-side comparison studies with youth and young adults, reviewed in Moreno (2005), has shown that students learn more and interact more when they work with an agent programmed to demonstrate emotions, act in somewhat unpredictable ways, and speak in a personalized tone (using "you" and "me/we") than when they work with static graphic agents or voice narration alone. Some examples follow.

Atkinson (2002) evaluated student interactions with Peedy, a parrot with a personality, in a multimedia program designed to assist students with algebra word problems. Undergraduate students working with Peedy reported less difficulty and had higher posttest scores than students in a control condition working with the same narration but without the Peedy agent. Moreover, students who worked with a talking version of Peedy benefited on posttests more than students who worked with a Peedy that presented written explanations in a thought bubble. Students in a study by Moundridou and Virvou (2002) also reported experiencing less difficulty and greater enjoyment with a multimedia program featuring an agent that helped students solve algebraic equations rather than the program without the agent.

In another study, middle school students worked with one of several versions of an animated agent called "Herman the Bug" within a multimedia program about botany (Lester, Stone, & Stelling, 1999). Changes in test scores demonstrated that all students learned the material, but students who worked with a speaking Herman reported higher levels of interest and engagement than their counterparts who worked with less interactive versions of Herman.

Researchers at the University of Memphis are designing agents that may increase reading comprehension by prompting students to self-explain their learning — asking themselves why, what if, and how questions — and by engaging them in an interactive dialogue that reinforces reading strategies. AutoTutor and iSTART are two Web-based prototypes that incorporate such agents. Both have been found effective at increasing comprehension of science content text for youth and young adults (Graesser, McNamara, & VanLehn, 2005; Graesser et al., 2004). AutoTutor uses a human-like head to provide explanations, while iSTART uses a collection of three-dimensional agents, each performing a different function in the training module. Interactive dialogues are incorporated directly into the program through the agent or developed through peer interaction within student pairs. Peer dialogue around a multimedia learning experience has been shown elsewhere to improve learning for young adults (Craig, Driscoll, & Gholson, 2004).

An animated agent is an integral part of the commercial program Thinking Reader® (Tom Snyder Productions, Scholastic). The program embeds strategy instruction into award-winning novels for intermediate and middle school students and is based on research conducted with struggling adolescent readers (Dalton, Pisha, Eagleton, Coyne, & Deysher, 2001). The books are digitized and embedded with multiple supports, including human voice narration, text-to-speech, a multimedia glossary, background knowledge links, strategy instruction, and a work log. Agents prompt students to "stop and think" (apply reading strategies) and provide corrective feedback on students' performance. The use of these books has been shown to significantly improve the reading comprehension of struggling readers compared with traditional reciprocal teaching instruction (Dalton et al., 2001).

Bosseler and Massaro (2003) developed a multimedia training environment called the Language Wizard/Player that includes an agent, Baldi, who serves as a speech-language tutor. This agent provides specific feedback on students' vocabulary and speech production. Baldi's skin can be made transparent to show the articulatory movements in the mouth and throat. Young children with autism who worked with Baldi increased their vocabulary and generalized their new words to natural settings.



References

Atkinson, R. K. (2002). Optimizing learning from examples using pedagogical agents. Journal of Educational Psychology, 94(2), 416–427.

Bosseler, A., & Massaro, D. (2003). Development and evaluation of a computer-animated tutor for vocabulary and language learning in children with autism. Journal of Autism and Developmental Disorders, 33(6), 653–672.

Craig, S. D., Driscoll, D. M., & Gholson, B. (2004). Constructing knowledge from dialog in an intelligent tutoring system: Interactive learning, vicarious learning, and pedagogical agents. Journal of Educational Multimedia and Hypermedia, 13(2), 163–183.

Dalton, B., Pisha, B., Eagleton, M., Coyne, P., & Deysher, S. (2001). Engaging the text: Reciprocal teaching and questioning strategies in a scaffolded learning environment. Final report to the U.S. Department of Education. Peabody, MA: CAST.

Graesser, A. C., McNamara, D. S., & VanLehn, K. (2005). Scaffolding deep comprehension strategies through Point&Query, AutoTutor, and iSTART. Educational Psychologist, 40(4), 225–234.

Graesser, A. C., Lu, S., Jackson, G. T., Mitchell, H., Ventura, M., Olney, A., & Louwerse, M. M. (2004). AutoTutor: A tutor with dialogue in natural language. Behavior Research Methods, Instruments, & Computers, 36, 180–192.

Lester, J. C., Stone, B., & Stelling, G. (1999). Lifelike pedagogical agents for mixed-initiative problem solving in constructivist learning environments. User Modeling and User-Adapted Interaction, 9, 1–44.

Moreno, R. (2005). Multimedia learning with animated pedagogical agents. In R. E. Mayer (Ed.), The Cambridge handbook of multimedia learning (pp. 507–523). New York: Cambridge University Press.

Moundridou, M., & Virvou, M. (2002). Evaluating the personal effect of an interface agent in a tutoring system. Journal of Computer Assisted Learning, 18, 253–261.

Alise Brann, Tracy Gray, Kristin Ruedel, and PowerUp WHAT WORKS (2009)