MIT develops wearable tech for the visually impaired with a super chip and a 3D camera

Folks at MIT have come up with a super chip that is much smaller and consumes a thousand times less power than conventional computer chips, yet is capable of performing equally complex computations. Coupled with a 3D camera and a Braille interface, a device powered by the chip could replace the guide dog for visually impaired people and help them navigate their surroundings with ease. Researchers say it will be about the size of a binoculars case and is meant to be worn around the neck. The chip processes information from the 3D camera and relays the distance to obstacles in the wearer's line of sight. Actual images of the device have not yet been released.


This device trumps previous initiatives, which were too bulky to be practical. The researchers' primary goal was to make it compact, hence the low-power microchip, which the team says can be shrunk further. They developed an algorithm that translates the feed from the 3D camera into a 3D representation called a point cloud, identifies flat walking surfaces, and calculates an obstacle-free path along with the distance to obstacles, as sketched below.
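To make the idea of a point cloud concrete, here is a minimal sketch of how a depth image from a 3D camera is typically back-projected into 3D points using a pinhole camera model. This is a generic illustration, not MIT's implementation; the intrinsics fx, fy, cx, cy are placeholders rather than values from the actual device.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Convert a depth image (in metres, shape H x W) into an N x 3 point cloud.

    Generic pinhole-camera back-projection, shown only to illustrate what a
    "point cloud" is; fx, fy, cx, cy are assumed camera intrinsics.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth
    x = (u - cx) * z / fx   # horizontal offset from the optical axis
    y = (v - cy) * z / fy   # vertical offset from the optical axis
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)
```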

The algorithm maps surfaces by sampling points and comparing them with neighboring points; if they lie on the same plane, the area is treated as the same surface. Accessing the chip's memory is the most power-consuming operation, so to minimize it the researchers configured the algorithm to always start at the upper-left corner of the point cloud and scan along the top row, comparing each point only with the point to its left. Once the top row is done, it moves to the second row, comparing each point only with the ones to its left and directly above.
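The sketch below shows what such a single-pass, raster-order scan over an organized point cloud could look like. It is an illustrative assumption, not the paper's exact method: the coplanarity test (normal angle plus point-to-plane distance) and the precomputed surface normals are stand-ins for whatever criterion the MIT chip actually uses.

```python
import numpy as np

def label_planar_surfaces(points, normals, angle_tol_deg=5.0, dist_tol=0.02):
    """Single-pass raster scan over an organized point cloud (H x W x 3).

    Each point is compared only with its left neighbour and, from the second
    row on, with the point directly above, so the cloud is read from memory
    once. Thresholds and the coplanarity test are assumptions for this sketch.
    """
    h, w, _ = points.shape
    labels = np.full((h, w), -1, dtype=int)
    next_label = 0
    cos_tol = np.cos(np.radians(angle_tol_deg))

    def same_plane(p, q, n_p, n_q):
        # Same surface if the normals are nearly parallel and q lies close
        # to the plane passing through p.
        return (np.dot(n_p, n_q) > cos_tol and
                abs(np.dot(n_p, q - p)) < dist_tol)

    for i in range(h):
        for j in range(w):
            p, n_p = points[i, j], normals[i, j]
            label = -1
            if j > 0 and same_plane(points[i, j - 1], p, normals[i, j - 1], n_p):
                label = labels[i, j - 1]              # merge with left neighbour
            elif i > 0 and same_plane(points[i - 1, j], p, normals[i - 1, j], n_p):
                label = labels[i - 1, j]              # merge with the point above
            if label == -1:
                label, next_label = next_label, next_label + 1   # new surface
            labels[i, j] = label
    # A full implementation would also merge labels that turn out to belong
    # to the same surface (e.g. with union-find); omitted here for brevity.
    return labels
```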

The chip also compares each captured frame with the preceding one. If only a small number of points have changed, the device concludes that the wearer is at rest and instructs the 3D camera to capture at a lower frame rate, thereby saving power.
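Here is a hedged sketch of that power-saving idea: compare consecutive point clouds and drop the camera's frame rate when the wearer appears stationary. All thresholds and frame rates below are illustrative assumptions, not the device's actual values.

```python
import numpy as np

def choose_frame_rate(prev_cloud, curr_cloud, dist_thresh=0.05,
                      moving_fraction=0.1, high_fps=30, low_fps=1):
    """Pick the 3D camera's frame rate from inter-frame motion.

    prev_cloud and curr_cloud are point arrays of identical shape (..., 3).
    If only a small fraction of points moved more than dist_thresh metres,
    assume the wearer is at rest and return the low frame rate.
    """
    moved = np.linalg.norm(curr_cloud - prev_cloud, axis=-1) > dist_thresh
    if moved.mean() < moving_fraction:
        return low_fps    # wearer appears stationary: save power
    return high_fps       # wearer is moving: keep full temporal resolution
```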

The research was jointly funded by the Andrea Bocelli Foundation, founded by blind singer Andrea Bocelli, and Texas Instruments, which provided the 3D camera. The chip was manufactured by Taiwan Semiconductor Manufacturing Company. The team behind this research includes Dongsuk Jeon, Anantha Chandrakasan, Professor Daniela Rus and Nathan Ickes of Apple.

Source: A virtual “guide dog” for navigation | MIT News | Massachusetts Institute of Technology
