Experiment to outfit classroom with sensors

Electrical engineering professor Mani Srivastava’s
seven-year-old daughter, Megha, provided the initial inspiration for
a research project that may yield groundbreaking results in the
fields of education and computer science.

Srivastava’s purchase of a wireless educational toy that
allows parents to monitor their child’s interactions through a
PC spurred him to imagine the technology’s larger implications.

Along with his team of faculty assembled from the departments of
electrical engineering and computer science and the Graduate School
of Education and Information Studies, Srivastava plans to outfit an
entire first-grade classroom, from inanimate objects like
wooden building blocks and tabletops to the students themselves,
with tiny electronic sensors.

“We want to use these devices in a classroom setting to
see what we can infer from students’ interactions and how
they are associated with academic performance,” Srivastava
said.

The sensors are part of a new generation of devices that create
sensor networks to sample physical environments and collect
data.

The lessons this experiment may offer, including
potential insight into teaching techniques, children’s speech,
and the application of software and hardware in novel environments,
were deemed important enough for the National Science
Foundation to provide $1.8 million in grant funding.

Students will wear caps with sensors called
“iBadges” pinned to them, Srivastava said. These badges
will track each child’s location and the physical orientation
of the child’s head, as well as capture speech with
small microphones.

Objects, such as puzzle pieces or board games, will be wired
with sensors and used on task tables with magnetic systems under
them to track location and usage. This will enable researchers to
study the processes a student uses to complete tasks set by
instructors.
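To make the kind of data concrete, here is a minimal sketch of how readings from the iBadges and tagged objects might be represented. The article does not describe the project’s actual data format, so the field names, units and identifiers below (such as "badge-07" and "block-A3") are assumptions made purely for illustration.

```python
from dataclasses import dataclass
from typing import Optional, Tuple
import time

@dataclass
class SensorEvent:
    """One reading from an iBadge or a tagged object (hypothetical schema)."""
    source_id: str                       # e.g. "badge-07" or "block-A3"
    timestamp: float                     # seconds since the epoch
    position: Tuple[float, float]        # (x, y) location on the task table or in the room
    orientation: Optional[float] = None  # head orientation in degrees, badges only
    audio_clip: Optional[str] = None     # path to a captured speech sample, if any

# A badge reading and an object reading as they might arrive from the classroom.
events = [
    SensorEvent("badge-07", time.time(), (2.4, 1.1),
                orientation=35.0, audio_clip="clips/badge-07-0001.wav"),
    SensorEvent("block-A3", time.time(), (0.6, 0.3)),
]
```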

In addition, a series of microphones and cameras will be placed
at various locations around the classroom to further monitor
students’ activities. Srivastava said sound clips gathered
from the microphones would enable researchers to study the speech
of children, particularly those who are bilingual.

“With the microphones we can tell, for instance, when the
students will switch from using English to Spanish or vice
versa,” he said.

All data collected by sensors, cameras and microphones is routed
through a central computer system utilizing software called Sylph,
designed by computer science professor Richard Muntz.

“This isn’t the traditional kind of data; it
is both multimedia and sensor data, which is not very
precise,” Muntz said. “Capturing it and being able to
process it is a complex problem.”

Muntz said the program is designed to query sensors for data,
store the data and query it again once it has been
archived. Most importantly, he said, the program includes data-mining
capabilities, which involve identifying patterns in the collected
data.

“Data mining has been a growing field in the last
decade,” he said. “Data collections are too
overwhelming for humans to study, so we are now using programs to
help in the assessment.”
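As a rough illustration of the collect, store, query and mine steps Muntz describes, here is a minimal sketch of such a pipeline. Sylph’s actual design and interfaces are not detailed in this article, so the class and method names below are invented for illustration and are not the real system.

```python
from collections import Counter

class ToySensorStore:
    """A toy stand-in for a Sylph-style pipeline: collect, store, query, mine.
    The real system is far more sophisticated; this only sketches the idea."""

    def __init__(self):
        self.records = []   # archived sensor/multimedia records

    def collect(self, record):
        """Archive an incoming record (a dict with 'source', 'time' and a payload)."""
        self.records.append(record)

    def query(self, source=None, start=None, end=None):
        """Return archived records filtered by source and time window."""
        out = []
        for r in self.records:
            if source is not None and r["source"] != source:
                continue
            if start is not None and r["time"] < start:
                continue
            if end is not None and r["time"] > end:
                continue
            out.append(r)
        return out

    def mine_frequent_sources(self, top_n=3):
        """A trivial 'data-mining' step: which sensors report most often?"""
        counts = Counter(r["source"] for r in self.records)
        return counts.most_common(top_n)

# Example: archive a few readings, then query and mine them.
store = ToySensorStore()
store.collect({"source": "badge-07", "time": 10.0, "speech": "en"})
store.collect({"source": "badge-07", "time": 12.5, "speech": "es"})
store.collect({"source": "block-A3", "time": 11.0, "position": (0.6, 0.3)})
print(store.query(source="badge-07", start=9.0, end=13.0))
print(store.mine_frequent_sources())
```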

Researchers from the GSE&IS’s UCLA Center for the Study of
Evaluation, which assesses the quality of education and
standardized testing in the United States, have also been working
with Srivastava to determine how the classroom application of
sensor technology will affect student learning.

“It’s like developing a new thermometer to measure
kids’ interactions,” said Gregory Chung, a senior researcher
for the CSE.

Chung added that sensors would allow teachers to pay attention
to the problems of individual students through the assessment of
their performance in small-group interaction scenarios.

“The problem for teachers is that they cannot usually pay
attention to each student across all groups,” he said.
“The feedback will allow teachers to better instruct their
students.”

Full deployment of the sensors in a classroom will begin next
spring, Srivastava said. As of now, only preliminary testing with
groups of four or five subjects has occurred.

Srivastava said the project itself has further implications for
the future role of computers in human life.

“This will be an example of how humans will use computers
to create smart environments,” he said. “The use of
sensors in this manner will allow people to talk and interact with
the physical world.”
