Closing the gaps in public transportation for visually impaired people
We use artificial intelligence and smartphone technology to help close the gaps in public transportation for blind people.
GPS-based navigation is known to have a “last 30-feet problem” due to its limited accuracy. This small gap can cause blind people to miss the train or bus entirely. We are developing a smartphone app that helps blind people in Indianapolis reach bus stops precisely.
Describe who will use your solution (1,000 characters)
Although GPS-based navigation apps can aid visually impaired users when taking public transportation, they lack the pinpoint accuracy needed to get users exactly where they need to be. This “last 30-feet problem” can cause people to miss the train or bus, simply because they cannot get close enough to the stop.
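To make the gap concrete: consumer smartphone GPS is typically accurate to only about 5-10 meters (16-33 feet), which is comparable to the entire remaining distance to the stop. A minimal sketch of the distance involved, using the standard haversine formula (the coordinates and error figures below are illustrative placeholders, not measurements from our data):

```python
import math

def haversine_ft(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in feet."""
    r_ft = 20_902_231  # mean Earth radius (6,371 km) in feet
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r_ft * math.asin(math.sqrt(a))

# Illustrative fixes near downtown Indianapolis, roughly 30 feet apart:
stop = (39.7684, -86.1581)       # hypothetical bus stop location
reported = (39.76848, -86.1581)  # GPS fix offset ~0.00008 degrees north
gap = haversine_ft(*reported, *stop)  # residual walking gap, in feet
```

A gap of this size is trivial for a sighted rider to close visually, but without vision assistance it can mean standing on the wrong side of an intersection when the bus arrives.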
We propose a solution that precisely locates public transit stops using the latest artificial intelligence technology. We have released a free vision assistance app, SuperVision Search, which helps users search for bus route numbers and street names with their phone cameras. In 2019, we obtained funding from a donor to broaden its functionality to search for bus stop signs, thus addressing the last 30-feet gap. This function is already available in Boston and Los Angeles, and rolling it out in Indianapolis will require no changes to existing infrastructure.
For more on apps, visit: https://www.masseyeandear.org/makeagift/supervision.
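Camera-based sign search of this kind typically runs a neural detector on each video frame and then filters the overlapping candidate boxes it proposes. The sketch below shows only that generic post-processing step (greedy non-maximum suppression); the detector, thresholds, and example boxes are hypothetical stand-ins, not the app's production code:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def filter_detections(detections, score_thresh=0.5, iou_thresh=0.45):
    """Keep confident, non-overlapping boxes (greedy non-max suppression)."""
    kept = []
    for box, score in sorted(detections, key=lambda d: -d[1]):
        if score < score_thresh:
            break  # remaining candidates are even lower-confidence
        if all(iou(box, k) < iou_thresh for k, _ in kept):
            kept.append((box, score))
    return kept

# Hypothetical candidate boxes for one camera frame:
candidates = [((10, 10, 50, 90), 0.92),    # bus stop sign
              ((12, 12, 52, 92), 0.81),    # duplicate box on the same sign
              ((200, 30, 240, 70), 0.30)]  # low-confidence clutter
```

Here the duplicate box and the low-confidence clutter are discarded, leaving a single detection that can be announced to the user.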
Describe your solution's stage of development
Pilot - you have implemented your solution in a real-world scenario
Ready to Scale - you have completed and expanded your pilot and are seeing adoption of your solution by your intended user
Insights from previous testing (500 characters)
We have demonstrated that the SuperVision app effectively helps with searching for small targets such as street names and bus route numbers, results that have been published in a peer-reviewed journal. Additionally, a pilot test in Boston and Los Angeles showed that the bus stop sign recognition module can detect signs from 25-30 feet away in real time. These findings indicate that our search framework is robust when tested outdoors in challenging urban environments.
Tell us about your team or organization (500 characters)
Our team is from a vision research lab at Massachusetts Eye and Ear, a teaching hospital of Harvard Medical School. Members include experts on vision rehabilitation, mobility, mobile app development, and computer vision. We have a successful track record of developing technologies for people with visual impairments. We have released three free vision assistance apps with approximately 793,000 downloads to date.
Team leader profile: https://researchers.masseyeandear.org/details/327
Size of your team or organization
Rough Budget (500 characters)
The requested budget will cover labor costs associated with neural network training for the app (image collection, image labeling services, cloud services for network training) and field testing of the app to ensure a successful build-out for the city of Indianapolis.
Describe how you would pilot your idea (1000 characters)
The pilot includes two components. The first is a field testing study, explained in the following section on success measurement. In the second part of the pilot, we will release the app to the public and invite targeted users in Indianapolis (particularly those with moderate to severe vision loss, who can be identified through the DMV and other government agencies) to test and use it in their daily travel. Recruitment will be conducted through advertisement and in partnership with the City of Indianapolis, Ford Mobility, and the City:One Challenge keynote sponsors: AT&T, Dell Technologies, and Microsoft.
Describe how you would measure the success of your pilot (1000 characters)
For the field study, we will work in collaboration with a local Orientation & Mobility expert to evaluate the utility of the app in a group of 12 blind users in Indianapolis. Participants will use the app and their habitual methods (including long cane and GPS-based navigation apps) to locate bus stops in unfamiliar areas. Each participant will find 12 bus stops with the app and another 12 bus stops without it, in a randomized and balanced order. Their performance will be quantified by the time it takes to walk to each bus stop from 30 feet away, as well as the number of times they successfully find (physically stand beside) the bus stops.
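The randomized, balanced trial order described above can be generated ahead of time for each participant. A minimal sketch, where the stop IDs and random seed are placeholders:

```python
import random

def assign_conditions(stop_ids, seed=0):
    """Randomly split 24 stops into 12 'app' and 12 'habitual' trials,
    then shuffle the combined list so the two conditions are interleaved."""
    rng = random.Random(seed)  # fixed seed makes the schedule reproducible
    stops = list(stop_ids)
    rng.shuffle(stops)  # randomize which stops land in which condition
    half = len(stops) // 2
    trials = ([(s, "app") for s in stops[:half]]
              + [(s, "habitual") for s in stops[half:]])
    rng.shuffle(trials)  # randomize trial order
    return trials

# One participant's schedule over 24 hypothetical stop IDs:
schedule = assign_conditions(range(24))
```

Balancing the split and randomizing the order in this way guards against learning and fatigue effects biasing the comparison between conditions.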
During daily use testing, the app will collect usage data, including frequency of use, GPS location, and search attempts per location. After using the app for a month, users will provide qualitative feedback via a predesigned questionnaire, which will be analyzed along with the data collected by the app.
Sustainability Plan (500 characters)
As a non-profit organization, we will release the SuperVision Search app to the public for free, so commercial sustainability is not a concern for us. We already maintain three free vision assistance apps with the help of philanthropic support.
We can transfer the know-how to organizations interested in further developing assistive technologies for people with visual impairments, or partner with other organizations to help make public transit more accessible to visually impaired individuals.