This camera that sees in real time could mean safer driverless cars and drones

Nanyang Technological University Singapore
Driverless cars, drones, and other unmanned vehicles can only react to potential hazards if they “see” them fast enough. A team from Nanyang Technological University in Singapore recently developed a camera called Celex with enough of a speed boost to see in real time.

Conventional video cameras record as many as a few hundred images per second, strung together to create a video. While cameras keep getting faster, current options are limited by how quickly the attached computer can make sense of all that data and process that many large files. Essentially, the camera captures the information, but the computer behind it cannot process it fast enough.
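The scale of that bottleneck is easy to see with back-of-envelope arithmetic. The sketch below uses illustrative numbers (the resolution, frame rate, and bit depth are assumptions, not Celex or any specific camera's specifications):

```python
def frame_camera_rate(width, height, fps, bytes_per_pixel=1):
    """Raw data rate, in megabytes per second, for a frame-based camera."""
    return width * height * fps * bytes_per_pixel / 1e6

# A hypothetical 1280x720 grayscale sensor at 300 frames per second:
rate = frame_camera_rate(1280, 720, 300)
print(f"{rate:.1f} MB/s")  # 276.5 MB/s of raw pixels, every second
```

Even a modest high-speed sensor produces hundreds of megabytes per second that an onboard computer must move and analyze before the vehicle can react.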

The research group from NTU developed a camera that records changes in light with nanosecond timing, rather than capturing traditional frames, allowing the system to adjust to changing light faster than a typical camera. Instead of taking a large number of photos per second to create a video feed, Celex doesn't capture an entire image at all; it reads out only the pixels whose brightness has changed since the previous reading. Because the camera processes only those changes instead of full images, its speed is dramatically increased.
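The change-only readout can be sketched in a few lines. This is a minimal frame-differencing illustration of the general event-camera idea, not NTU's actual circuit (which detects changes at each pixel asynchronously, in hardware); the function name and threshold are assumptions for the example:

```python
import numpy as np

def events_from_frames(prev, curr, threshold=10):
    """Emit (x, y, polarity) events only where brightness changed by more
    than `threshold`, instead of re-reading every pixel in the frame."""
    diff = curr.astype(int) - prev.astype(int)
    ys, xs = np.nonzero(np.abs(diff) > threshold)
    return [(x, y, 1 if diff[y, x] > 0 else -1) for x, y in zip(xs, ys)]

prev = np.zeros((4, 4), dtype=np.uint8)
curr = prev.copy()
curr[1, 2] = 200  # a single pixel brightens
print(events_from_frames(prev, curr))  # [(2, 1, 1)] — one event, not 16 pixels
```

When the scene is mostly static, almost nothing is transmitted, which is why the downstream computer can keep up in real time.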

The camera also uses a built-in computer to analyze what is in the foreground, or what is close to the camera, and what is in the background. This optical flow computation helps the system determine what is part of the moving scenery and what is actually moving on its own toward a potential collision path.
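One simple way to make that foreground/background separation concrete: when the vehicle itself is moving, most pixels share a dominant "ego-motion" flow, and pixels whose motion vectors deviate from it are likely objects moving on their own. The sketch below assumes a dense 2D flow field is already available; the function name and the median-based background estimate are assumptions for illustration, not the details of Celex's onboard circuit:

```python
import numpy as np

def independently_moving(flow, deviation=2.0):
    """Flag pixels whose motion vector deviates from the dominant
    (background / ego-motion) flow by more than `deviation` pixels."""
    background = np.median(flow.reshape(-1, 2), axis=0)  # dominant flow
    residual = np.linalg.norm(flow - background, axis=-1)
    return residual > deviation

# 3x3 flow field: the whole scene drifts right by 1 px as the camera moves,
# except one object that is also moving upward on its own.
flow = np.tile([1.0, 0.0], (3, 3, 1))
flow[1, 1] = [1.0, -5.0]
print(independently_moving(flow)[1, 1])  # True: flagged as a moving object
```

Everything matching the background drift is treated as passing scenery, while the outlier is a candidate for collision-path analysis.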

The research team, led by assistant professor Chen Shoushun, says the camera system is also better than traditional options for night driving, as well as driving in bad weather, because of the onboard circuit that processes all the data. “Our new camera can be a great safety tool for autonomous vehicles, since it can see very far ahead like optical cameras but without the time lag needed to analyze and process the video feed,” Chen said. “With its continuous tracking feature and instant analysis of a scene, it complements existing optical and laser cameras and can help self-driving vehicles and drones avoid unexpected collisions that usually happen within seconds.”

Of course, since the camera focuses only on changes to keep file sizes small, the technology isn't likely to wind up in consumer cameras. But the enhanced speed could improve safety in applications where the camera serves as a pair of eyes rather than an artistic tool, like in driverless cars and drones.

Work on Celex started in 2009 and the group launched a startup based on the technology. According to the researchers, the system, now in its final prototype stage, could hit the market before the end of 2017.

Hillary K. Grigonis