What it does
My app helps low-vision users and people with Parkinson's complete daily tasks, such as setting alarms or navigating, through voice commands and audio feedback. It also includes an SOS feature that alerts emergency contacts, offering safety and independence.
Your inspiration
The idea came to me one day while I was crossing the road and saw three visually impaired girls walking in a train-like line, holding on to each other for support. The one in front had some vision, and the others were completely dependent on her. It made me realize how scary and risky daily navigation can be for them. Later, I saw a girl using her phone with voice feedback on; it was slow and frustrating to watch, but she was so patient. That moment really stayed with me. It made me want to design something better, something that truly supports them with dignity and ease.
How it works
Eyedentify is a voice-based app designed for blind and low-vision users. Instead of relying on touchscreens, users speak to the app to check the time, set alarms, read text, or navigate, and the app replies with audio, like a personal assistant on your phone. For example, "What's the time?" speaks the time, "Set alarm for 6 AM" sets the alarm, and "Read this page" reads printed text aloud. The app also supports users with partial vision through bold colors, large buttons, and a simple layout. One key feature is the SOS system. If the user says "Help me" or taps a button, the app sends an alert to a saved contact, shares live location via GPS, and notifies nearby volunteers who have signed up to assist. These volunteers receive a quick alert and can help in real situations, like crossing roads or getting home safely. All features use built-in tools (mic, GPS, and speech technology), so no extra devices are needed. It's simple, caring, and made for real-life needs. The sketch below illustrates this command flow.
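Since Eyedentify is still a concept, the following is only a minimal sketch of how the voice-command routing and SOS trigger described above could work, assuming the speech input has already been transcribed to text. The speak and send_sos functions are hypothetical stand-ins for the phone's text-to-speech, messaging, and GPS services, and the command patterns are illustrative, not the app's actual logic.

```python
import datetime
import re

def speak(text: str) -> None:
    """Stand-in for the app's audio reply; a real build would call the
    phone's text-to-speech engine instead of printing."""
    print(f"[spoken] {text}")

def send_sos(contact: str) -> None:
    """Hypothetical SOS handler: the real app would text the saved
    emergency contact a live GPS link and ping nearby volunteers."""
    speak(f"Alert sent to {contact} with your current location.")

def handle_command(utterance: str, emergency_contact: str = "saved contact") -> None:
    """Route one transcribed voice command to the matching action."""
    text = utterance.lower().strip()
    if "help me" in text:
        send_sos(emergency_contact)
    elif match := re.search(r"set alarm for (\d{1,2}\s?(?:am|pm))", text):
        # A real app would register the alarm with the system clock here.
        speak(f"Alarm set for {match.group(1).upper()}.")
    elif "time" in text:
        now = datetime.datetime.now().strftime("%I:%M %p")
        speak(f"The time is {now}.")
    else:
        speak("Sorry, I didn't catch that. Please try again.")

# Example session mirroring the commands described above.
handle_command("What's the time?")
handle_command("Set alarm for 6 AM")
handle_command("Help me")
```

Checking "Help me" first reflects the design priority described above: an emergency phrase should never be misread as an ordinary command.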
Design process
This project started with real-life observation: I saw visually impaired girls walking in a chain, and later another user struggling with voice-over on her phone. It made me question how technology could better support their independence. I began with research, including visits to a blind center and studying conditions like glaucoma and tunnel vision. My first prototype was UI-heavy, but I soon realized that wasn't practical for blind users. So I shifted to a voice-first approach, using large buttons and bold contrast only where needed for low-vision users. I built wireframes in Figma, focused on core tasks: checking the time, setting alarms, reading, and navigation. After testing and feedback, I simplified the design, reduced screen clutter, and added an SOS feature that alerts emergency contacts and nearby volunteers using voice commands. This process taught me to design with empathy, not just for screens but for real, everyday challenges.
How it is different
Eyedentify stands out because it wasn't just designed to look good; it's built from real empathy and observation. Most apps for blind users focus on one task or need complex interfaces or devices. I wanted something more natural. After seeing how visually impaired users struggle with basic tasks and slow voice-over systems, I designed Eyedentify to be fully voice-first. It feels like a conversation, letting users check the time, set alarms, read text, and navigate hands-free. Another thing that sets it apart is the SOS and volunteer network. Most emergency tools stop at sending a message to one contact. Eyedentify also alerts nearby volunteers who've signed up to help, turning the app into a bridge between people who need support and people who want to offer it. I didn't just want to design something cool. I wanted to design something kind, something that understands, supports, and empowers users to feel more confident in their everyday lives.
Future plans
As a first-year design student, my next step is to improve Eyedentify based on feedback from real users, especially those who are visually impaired. I hope to build a working prototype and collaborate with developers to bring it to life. I also want to test it in real environments, learn more about inclusive tech, and maybe one day launch it as a real app. For now, I aim to keep learning, refining my skills, and designing with empathy at the core.