International Top 20

AISIG

A device that enables the visually impaired to cross pedestrian crossings safely, using AI-based image recognition to determine the color of the signal and communicate it to the user.

  • It is attached to the grip of a white cane as shown in the figure.

What it does

AISIG addresses the problem that visually impaired people cannot cross pedestrian crossings that lack acoustic signals. AI-based image recognition determines the color of the traffic signal and communicates it to the user through vibration.


Your inspiration

It all started when a visually impaired friend of mine was having a hard time crossing the street. Acoustic signals are not widely used because they are difficult to install. So I thought that if individuals could get traffic signal information from a device they carry with them, they would be able to cross at any pedestrian crossing.


How it works

A camera on the AISIG captures the scene in front of the device. An AI model trained on 300 traffic-light images then determines whether the signal is red or green. Accuracy is further improved by using OpenAI to perform additional image processing, such as determining color components, layered on top of the model's judgment. The result of the AI's judgment is transmitted to the user by vibration motors: if the signal is red, the vibration is long; if it is green, the vibration is short and repeated. This difference in vibration pattern allows even a blind person to understand the signal status.
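As an illustration only, the sketch below shows one way such a pipeline could be wired together: an HSV color-component vote is combined with a hypothetical trained classifier, and the resulting label drives the two vibration patterns described above. The camera index, the classifier interface, and the motor-driving callback are assumptions, not part of the actual product.

# Illustrative sketch under the assumptions stated above, not the authors' code.
import time
import cv2

def color_component_vote(frame_bgr):
    """Rough red/green vote based on the frame's HSV color components."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Red hue wraps around 0, so two ranges are combined; green sits near hue 60.
    red_mask = (cv2.inRange(hsv, (0, 120, 120), (10, 255, 255)) |
                cv2.inRange(hsv, (170, 120, 120), (180, 255, 255)))
    green_mask = cv2.inRange(hsv, (45, 120, 120), (90, 255, 255))
    return "red" if red_mask.sum() > green_mask.sum() else "green"

def classify_signal(frame_bgr, model=None):
    """Combine a trained classifier (assumed interface) with the color vote.
    Without a model, fall back to the color vote alone."""
    vote = color_component_vote(frame_bgr)
    if model is None:
        return vote
    label = model.predict(frame_bgr)          # placeholder for the real classifier
    return label if label == vote else vote   # simple agreement rule

def signal_to_vibration(label, pulse):
    """Drive the motor via pulse(on), a hardware-specific callback."""
    if label == "red":
        pattern = [(1.5, 0.5)]                # one long pulse
    else:
        pattern = [(0.2, 0.2)] * 5            # short, repeated pulses
    for on_s, off_s in pattern:
        pulse(True)
        time.sleep(on_s)
        pulse(False)
        time.sleep(off_s)

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)                 # assumed camera index
    ok, frame = cap.read()
    if ok:
        label = classify_signal(frame)
        signal_to_vibration(label, pulse=lambda on: None)  # swap in a GPIO call
        print("signal:", label)
    cap.release()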


Design process

Based on the concept of "using a device carried by the individual to determine the status of a traffic light," we began by training the AI. We went to actual crossings, photographed the traffic lights, and updated the dataset with those images. Once a certain level of accuracy was achieved, we began to consider the shape of the device and how it would be used. Initially we developed a ring-shaped device, but because it was difficult to point its camera toward the traffic light and it got in the way when not in use, we changed to a device worn on a white cane. In addition, so that the user does not have to make any special movements, we added an acceleration sensor that determines the device's orientation. This allows image recognition to start automatically when the white cane is held perpendicular to the ground at a pedestrian crossing, and to stop when the user starts walking.
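The orientation trigger can be illustrated with a short sketch as well. The version below is an assumption-laden example, not the device's firmware: read_accel(), start_recognition(), stop_recognition(), and the 15-degree tolerance are placeholders, and the cane's long axis is taken to be the sensor's z axis.

# Illustrative sketch only: starts recognition when the cane is roughly
# perpendicular to the ground and stops it once the cane tilts as the user walks.
import math
import time

VERTICAL_TOLERANCE_DEG = 15.0  # assumed threshold for "perpendicular to the ground"

def tilt_from_vertical(ax, ay, az):
    """Angle in degrees between the cane's long axis (taken as z) and gravity."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0:
        return 90.0
    return math.degrees(math.acos(min(1.0, abs(az) / g)))

def recognition_loop(read_accel, start_recognition, stop_recognition):
    """Poll the accelerometer and toggle image recognition on orientation changes."""
    running = False
    while True:
        ax, ay, az = read_accel()
        vertical = tilt_from_vertical(ax, ay, az) < VERTICAL_TOLERANCE_DEG
        if vertical and not running:
            start_recognition()   # cane held upright at the crossing
            running = True
        elif not vertical and running:
            stop_recognition()    # user has started walking
            running = False
        time.sleep(0.1)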


How it is different

A competing product is a smartphone app that recognizes signal images, but it requires the user to hold the smartphone in one hand, which is dangerous because that hand is not free to break a fall. AISIG is attached to a white cane, so it can be used with the same movements as usual.


Future plans

In future development, the most important goal is to improve recognition accuracy. At present the recognition accuracy is 80%, and we are aiming for 99% or higher. We also believe AISIG could support safer walking if the AI were able to recognize not only traffic signals but also cars and obstacles.


Awards

GUGEN 2022 Grand Prize

