A CSI-like analysis of everyday objects could be coming to smartphones by the end of 2017. On Thursday, the Fraunhofer Institute, a German research organization, announced HawkSpex Mobile, an app that can conduct a spectral analysis without any accessories.
Traditional spectral analysis cameras use prisms and specialized sensors to read how an object reflects different wavelengths of light. Since different materials reflect light differently, that information tells the camera what the object is made of, such as whether or not an apple has been sprayed with pesticides.
Since smartphone cameras don't have that prism, apps that analyze objects have required accessories: expensive add-on cameras that need to be carried around with the smartphone. The research group instead reversed the idea. Rather than using a prism to separate the reflected light into its wavelengths, the smartphone's screen emits one wavelength at a time while the camera reads how much of that wavelength is reflected.
If the screen emits only red light, the object can only reflect red light, and how much of that light comes back gives the camera clues as to what the object is made of. By repeating that process at different wavelengths, the camera can analyze the object's content without needing a built-in prism.
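The cycling process described above can be sketched in a few lines of Python. This is a simplified illustration, not Fraunhofer's actual algorithm: the screen colors, the `capture_reflectance` stand-in for a camera reading, and the toy apple data are all assumptions made for the example.

```python
# Illustrative sketch of the screen-as-illuminant idea: cycle through
# the colors the display can emit, record how strongly the object
# reflects each one, and collect the readings into a signature.

ILLUMINANTS = ["red", "green", "blue"]  # colors a phone screen can emit

def capture_reflectance(obj, color):
    """Stand-in for a camera frame: fraction of `color` light reflected."""
    return obj.get(color, 0.0)

def spectral_signature(obj):
    """Illuminate with each color in turn and record the reflection."""
    return [capture_reflectance(obj, color) for color in ILLUMINANTS]

# A red apple reflects red strongly and absorbs most green and blue.
apple = {"red": 0.9, "green": 0.2, "blue": 0.1}
print(spectral_signature(apple))  # [0.9, 0.2, 0.1]
```

The resulting list of per-wavelength readings is the object's spectral fingerprint, which is what the app then interprets.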
HawkSpex is currently only a laboratory model, but Fraunhofer says that by developing different applications based on the technology, a consumer version could arrive before the end of the year. The group has to use reference scans to teach the app what the reflected light means and program it for specific purposes. For example, to teach the app whether produce has been sprayed with pesticides, the group first has to show the program what an apple without pesticides looks like.
While the app could certainly come in handy in a number of applications, it would require a different app or mode for each type of scan because of those reference scans. That means users would need to tell the app whether they are scanning an apple or a head of lettuce.
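One way to picture this reference-scan approach is a nearest-match lookup against signatures collected in the lab. The labels and numbers below are invented for illustration; the article does not describe how Fraunhofer's matching actually works.

```python
import math

# Hypothetical reference library: signatures collected ahead of time
# for one specific scan type (here, apples). Each type of scan would
# need its own library, which is why the app needs a mode per object.
REFERENCES = {
    "apple (untreated)": [0.90, 0.20, 0.10],
    "apple (pesticide residue)": [0.80, 0.25, 0.30],
}

def distance(sig_a, sig_b):
    """Euclidean distance between two spectral signatures."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(sig_a, sig_b)))

def classify(scan):
    """Return the reference label whose signature is closest to the scan."""
    return min(REFERENCES, key=lambda label: distance(REFERENCES[label], scan))

print(classify([0.88, 0.21, 0.12]))  # apple (untreated)
```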
The institute says the technology has so many different applications that it will launch a Wikipedia-like platform where users can suggest which reference scans it should tackle next, so it can release versions of the app for more specific purposes. While many types of scans require a reference, some will not, such as comparing two items directly. To see if a car has been in an accident, for example, users could scan the paint on one section to see if it matches another section; no pre-programmed reference is needed, since the app is only comparing two scans.
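The reference-free case boils down to comparing two signatures against each other rather than against a library. A minimal sketch, with an invented tolerance and made-up paint readings:

```python
import math

def signature_distance(sig_a, sig_b):
    """Euclidean distance between two spectral signatures."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(sig_a, sig_b)))

def same_paint(sig_a, sig_b, tolerance=0.05):
    """Treat two scans as matching if their signatures are close enough."""
    return signature_distance(sig_a, sig_b) <= tolerance

door = [0.31, 0.42, 0.55]
fender = [0.30, 0.43, 0.54]   # same factory paint, slight measurement noise
hood = [0.48, 0.33, 0.61]     # a repainted panel would read differently

print(same_paint(door, fender))  # True
print(same_paint(door, hood))    # False
```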
Fraunhofer says the app could be used for more than just checking the accuracy of an organic label — commercially, the app could be used for quality control or to allow farmers to see if their crops need fertilizer.