EyeCheck is an endeavour to provide low-cost vision care to the millions of people currently underserved in their regions.
A little over a year ago we had the great pleasure of working with four classmates on a professor’s side project. Our third-year engineering design professor challenged our class to come up with a mobile phone-based solution for assessing myopia (near-sightedness) in the developing world. Perhaps naively, we decided to take it on. After four months we did not have much: our software was not accurate and our hardware did not work. Still, Daxal and I were not discouraged, and we had learned a great deal.
In our research we learned that for hundreds of millions of people, clear vision means going to an eye camp, where highly trained volunteers can assess vision. Most of these people live in the developing world. Some do not attempt the journey to an eye camp because they cannot afford it or because the journey is too arduous. Others make the journey but return without even being assessed. Sadly, a prescription is often the only remaining obstacle to clear vision for these people, since manufacturing glasses has become so affordable. Existing attempts to serve them have not made their mark. Our competitors have excellent solutions, but not for the developing world: their devices are expensive, bulky, or require too much user input.
Eye camps need a way to maximize their resources and increase the number of people they see. EyeCheck is a solution for these people. While remaining affordable, our solution is fast and easy, cutting the time taken for an eye exam in half. Its portability and ease of use also enable mobile and remote distribution of glasses, extending access even further. This all leads to far more people safely participating in job sites, classrooms, and communities throughout the developing world.
Our co-op placements during our engineering degree taught us a great deal and gave us many interesting experiences, but none of them brought us the joy or challenge that EyeCheck did. For the first time in our lives, we felt like engineers: we learned the science and we applied it. My co-founder Daxal and I see a very simple problem: close to half a billion people simply waiting for a prescription. The two of us cannot imagine our lives without the ability to see clearly, and we feel everyone should have access to clear vision to take on the challenges they face daily.
We solve the problem of providing prescriptions using a smartphone app, a standalone camera, and server-side image processing. The app scans a long queue of people, pointing out those with healthy vision and those who need urgent attention. For those with vision problems, our standalone camera is used to produce a prescription. The two devices work by shining different kinds of light into the eyes and analyzing the reflections coming back. What we offer is an automatic, portable, easy route to an eyeglass prescription.
* Working with users and experts in the field to develop sound needs assessments for people in the developing world.
* Working with the School of Optometry at University of Waterloo, MBET, engineering advisors, and library staff to develop a thorough understanding of the science and technology needed to solve this problem.
* Understanding our competitors and how they have positioned themselves in the field, why they chose their design directions, and why their products are not being used in the developing world.
The purpose was to create a system that shone light into the eye, captured an image, and detected any refractive error present. If the person was nearsighted, a crescent would appear at the top of the pupil image; if farsighted, a crescent would appear at the bottom. A person with healthy eyes would show no crescent at all.
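The crescent test above can be sketched in code. This is a minimal, hypothetical illustration, not our actual image-processing pipeline: it assumes the pupil has already been located and cropped to a grayscale array, and it classifies the eye by comparing the brightness of the pupil's top and bottom halves. The function name and threshold are illustrative.

```python
import numpy as np

def classify_refraction(pupil, threshold=0.15):
    """Classify a cropped grayscale pupil image by crescent position.

    pupil: 2D numpy array of pixel intensities, cropped to the pupil.
    Returns "myopia" if the bright crescent sits in the top half,
    "hyperopia" if it sits in the bottom half, "normal" otherwise.
    """
    half = pupil.shape[0] // 2
    top = float(pupil[:half].mean())
    bottom = float(pupil[half:].mean())
    total = top + bottom
    if total == 0:
        return "normal"  # completely dark image: no crescent detected
    # Relative brightness imbalance between the two halves of the pupil.
    imbalance = (top - bottom) / total
    if imbalance > threshold:
        return "myopia"      # crescent at the top of the pupil image
    if imbalance < -threshold:
        return "hyperopia"   # crescent at the bottom of the pupil image
    return "normal"
```

A real system would of course need robust pupil detection, calibration of the threshold against known prescriptions, and handling of reflections and eyelid occlusion; this sketch only captures the top-versus-bottom decision rule.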
* Iteration 1: The first iteration used a Nexus 4 smartphone with film from a floppy disk attached to its camera. The goal was to test the floppy-disk film as an infrared filter and use the existing flash and camera of the Nexus 4 to capture an image of the eye good enough for triaging a patient. The iteration failed because the floppy film was not a good infrared filter and no image suitable for triaging a patient could be captured.
* Iteration 2: Next, eight infrared LEDs were tested with an SLR camera to capture an image of the eye good enough for triaging a patient. This failed as well: the distance between the LED source and the SLR camera was too large, and the LED output was too low for pupil patterns to appear in the captured images.
* Iteration 3: Next, a circuit of 36 infrared LEDs arranged in a triangular pattern was built on a custom circuit board. This significant change came from research papers and suggestions from our advisors. The SLR camera was replaced with a small Raspberry Pi (a single-board computer) with an attached camera module. As a result, we were using an off-the-shelf product (the Pi) to take pictures and a custom LED array to supply the infrared light. The iteration was tested on an artificial eye, and the pupil patterns were visible in the images taken. However, the prototype did not meet the safety standards set out for devices emitting infrared light.
* Iteration 4: This iteration will use low-power infrared LEDs to ensure that the infrared output meets safety standards. Iteration 4 will also have a 3D-printed enclosure for the Raspberry Pi and the infrared LED board.
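The safety concern driving Iteration 4 can be made concrete with a back-of-envelope irradiance estimate. This sketch is purely illustrative: it uses a simple point-source model, and all parameter values (LED power, beam solid angle, working distance) are assumptions, not numbers from our design or from the applicable photobiological safety standard, against whose exposure limits the result would actually be compared.

```python
def irradiance_at_eye(num_leds, power_per_led_mw, distance_m, beam_solid_angle_sr):
    """Rough corneal irradiance estimate (mW/cm^2) from an IR LED array.

    Point-source model: irradiance = radiant intensity / distance^2.
    All inputs are illustrative placeholders, not datasheet values.
    """
    # Total optical power spread over the beam's solid angle -> mW/sr.
    radiant_intensity = num_leds * power_per_led_mw / beam_solid_angle_sr
    # Inverse-square falloff gives irradiance in mW/m^2 at the eye.
    irradiance_mw_per_m2 = radiant_intensity / distance_m ** 2
    return irradiance_mw_per_m2 / 10_000  # convert m^2 to cm^2
```

With hypothetical inputs of 36 LEDs at 10 mW each, a 0.5 sr beam, and a 1 m working distance, the model gives about 0.072 mW/cm²; in practice each value, and the safety limit it is checked against, must come from component datasheets and the relevant standard.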
Entered: Velocity Fund Finalists Spring 2014
Entered: Inaugural members of Velocity Foundry
Awarded: Recipients of the Engineers of the Future Trust