Bingen Cortazar takes a leaf, slides it into a small plastic box and holds it in front of his face. He looks at it a moment, then presses a button on the side of the red eyeglass frame-turned-computer.
With a quick photograph, the graduate student in electrical engineering determines the chlorophyll level and health of the leaf he is studying, without harming the plant.
Researchers in the lab of Aydogan Ozcan, Chancellor’s Professor of electrical engineering and bioengineering at UCLA, developed an app for diagnosing plant health using Google Glass, a wearable, voice-activated computer mounted on an eyeglass frame. The app noninvasively measures chlorophyll concentrations from photos of a sample leaf, leaving the plant intact.
The project aims to let farmers and researchers rapidly assess plant health in the field, from any location and without damaging the plant, using mobile connectivity and inexpensive tools. According to the team’s paper, previous techniques involved destroying part of the plant and transporting it to a lab, a process that could take hours.
The Google Glass device consists of a small computer, microphone, camera and screen mounted on an eyeglass frame. But Google Glass is not currently available for consumer purchase, and the lab does not intend to distribute its software or manufacture the apparatus.
Ozcan leads a team focused on developing compact, mobile and cost-effective diagnostic tools, from microscopes mounted on mobile phones to the most recent Glass app.
“Glass is charming because it is hands-free, voice-activated and doesn’t obscure the user’s vision,” Ozcan said. “This is promising for emergency response applications, as well as those involving research in the field.”
The system pairs a hand-held imaging apparatus with a custom app for Google’s Glass platform to provide accurate measurements of chlorophyll concentrations in leaf samples. Chlorophyll is the primary light-absorbing pigment plants use during photosynthesis, and reduced chlorophyll concentrations can indicate disease or exposure to environmental toxins, according to the lab’s paper published earlier this month by the Royal Society of Chemistry.
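The optical idea behind the measurement can be illustrated with a brief sketch, and it is only that, an illustration rather than the team’s published algorithm; the calibration slope below is hypothetical. Because chlorophyll absorbs light, the fraction of backlight that passes through a leaf falls as pigment concentration rises, so an absorbance-style number computed from two brightness readings tracks chlorophyll.

```python
import math

def chlorophyll_index(transmitted, reference, slope=50.0):
    """Map an absorbance-style reading to a relative chlorophyll index.

    transmitted: mean brightness measured through the leaf
    reference:   mean brightness with no leaf in the holder
    slope:       hypothetical calibration constant; a real device would
                 fit this against leaves measured in the lab
    """
    absorbance = -math.log10(transmitted / reference)  # Beer-Lambert-style
    return slope * absorbance

# A darker (more strongly absorbing) leaf yields a higher index.
print(chlorophyll_index(transmitted=80, reference=200))   # ~19.9
print(chlorophyll_index(transmitted=150, reference=200))  # ~6.2
```

In practice, such an index would be calibrated against leaves whose chlorophyll content had already been measured in a lab.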
“Measuring chlorophyll levels is already widely used in botanical research,” said Steve Feng, an engineer employed by the lab. “Our system is innovative in that it provides a faster and more cost-effective alternative to lab-conducted diagnostics by using inexpensive materials and mobile technology.”
Feng said he thinks the cost of wearables will hinder development only in the short term, because the technology is becoming less expensive.
Developing for Google Glass had its challenges.
Developers had to account for every possible change in lighting, from the time of day to whether a researcher is indoors or outdoors, said Hatice Koydemir, a postdoctoral fellow who works with the Ozcan lab.
The researchers addressed this problem by developing a hand-held leaf holder, a small piece of plastic that holds the sample in place while a picture is taken using Google Glass. The device uses LED lights to illuminate the sample from behind with wavelengths of light that chlorophyll can easily absorb.
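As a rough sketch of how such a backlit photo could be reduced to a reading, the app would only need the average brightness of the leaf region compared with a reference shot of the empty holder; that transmitted fraction could then feed an absorbance calculation like the one sketched above. The file names, crop box and use of Pillow and NumPy here are assumptions for illustration, not the lab’s released code.

```python
import numpy as np
from PIL import Image

def mean_red_intensity(image_path, box=(100, 100, 400, 400)):
    """Average red-channel brightness inside a fixed region of the photo."""
    image = Image.open(image_path).convert("RGB")
    region = np.asarray(image.crop(box), dtype=float)
    return region[:, :, 0].mean()  # red light, which chlorophyll absorbs strongly

# Brightness through the leaf versus through the empty, backlit holder.
leaf = mean_red_intensity("leaf_in_holder.jpg")
blank = mean_red_intensity("empty_holder.jpg")
print("fraction of backlight transmitted:", leaf / blank)
```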
The entire apparatus can be manufactured for less than $30. So far, the devices have been made only in small batches on the lab’s industrial-grade 3-D printer.
Cortazar said he thinks the project will help farmers quickly identify health problems with plants or soil.
“The agricultural field will benefit significantly from tools such as this. Wearable technology will allow farmers not only to communicate, but to do diagnostics,” Cortazar said.
The UCLA team does not plan to continue developing or marketing the equipment. Instead, the researchers hope to focus on other biomedical applications for Google Glass, Feng said.