Feb. 11th, 2021

from this week's Top Tech Tidbits newsletter

Date: February 12th, 2021
Time: 9AM PT, 10AM MT, 11AM CT, 12PM ET, 5PM GMT
Description:
Accessible Pharmacy Services for the Blind and Be My Eyes are hosting a free webinar about talking glucose meters for blind and low vision diabetics. They will be joined by the team from Prodigy Diabetes Care to discuss their products and the role that Accessible Pharmacy and Be My Eyes can play in supporting patients. All attendees can get a free talking glucose meter to try. Register here:
https://www.bemyeyes.com/diabetes-care-webinar
from CoolBlindTech
https://coolblindtech.com/ai-project-to-support-blind-and-partially-sighted-people/

AI project to support blind and partially sighted people
FEBRUARY 8, 2021 9:21 AM

Heriot-Watt and the Royal National Institute of Blind People (RNIB) have teamed up to support blind and partially sighted people in the UK using AI technology called Alana.

What is Alana?
Alana is artificial intelligence (AI) software that can understand and respond to users in a human-like, conversational way, and it can serve as a new tool for people with sight loss.

How does Alana work?
The tech delivers conversation based on context, device, and location, learning who the user is and remembering previous conversations. It then adapts to provide a personal experience for each person.
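The article doesn't describe how Alana implements any of this, but a toy sketch of a conversational agent that carries per-user memory between exchanges might look something like the following. Every name here (UserMemory, respond, and so on) is made up for illustration; none of it is Alana's actual API.

from dataclasses import dataclass, field

@dataclass
class UserMemory:
    # Per-user state the agent carries between conversations.
    name: str
    location: str = "unknown"
    history: list = field(default_factory=list)

def respond(memory, utterance):
    # Reply conditioned on who the user is and what they said before,
    # then record the utterance so later turns can recall it.
    previous = memory.history[-1] if memory.history else None
    memory.history.append(utterance)
    if "weather" in utterance.lower():
        return f"Here's the weather near {memory.location}, {memory.name}."
    if previous:
        return f"Welcome back, {memory.name}. Last time you asked: {previous!r}"
    return f"Hello {memory.name}, how can I help?"

mem = UserMemory(name="Kes", location="Boston")
print(respond(mem, "What's the weather like?"))  # uses stored location
print(respond(mem, "Any news today?"))           # recalls the earlier question

A production system would persist that memory across sessions and devices, which is what lets the assistant "remember previous conversations" as the article describes.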

How will Alana be integrated with the RNIB?
Alana will initially be used to enhance the existing support offered by RNIB. Through its Sight Loss Advice Service, the charity currently offers support over the phone, in eye clinics and digitally.

It also provides information on eye conditions, legal rights, education, technology, and employment alongside emotional well-being services and signposting to services and resources offered by local societies.

AI has the potential to transform the way blind and partially sighted people access information. For example, the spin-out is developing a tool that will identify objects and find further information about one’s physical environment, automating the Be My Eyes app, which connects people who have sight loss with fully sighted volunteers.

The Heriot-Watt spin-out has already seen previous success with its innovative AI technology. In March 2020, the team saw a huge jump in demand as the national lockdown came into effect.

Alana’s ‘touch-free’ interface helped many users stay connected when the coronavirus kept them from conversing with others as they normally would.

The plan for the project is to support more than two million people with sight loss in the UK.

Source
Conversational AI Software Solutions
https://alanaai.com/
Kes: Because building in accessibility isn't a flaw, it's a feature.

App from VEMI Lab group will help people with visual impairments, seniors enjoy ride-sharing with self-driving cars 
https://umaine.edu/news/blog/2021/01/29/app-from-vemi-lab-group-will-help-people-with-visual-impairments-seniors-enjoy-ride-sharing-with-self-driving-cars/

A research group led by the Virtual Environments and Multimodal Interaction Laboratory (VEMI Lab) at the University of Maine is developing a smartphone app that provides the navigational assistance needed for people with disabilities and seniors to enjoy ride-sharing and ride-hailing, collectively termed mobility-as-a-service, with the latest in automotive technology. The app, known as the Autonomous Vehicle Assistant (AVA), can also be used for standard vehicles operated by human drivers and enjoyed by everyone.

AVA will help users request, find and enter a vehicle using a multisensory interface that provides guidance through audio and haptic feedback and high-contrast visual cues. The Autonomous Vehicle Research Group (AVRG), a cross-institutional collective led by VEMI Lab with researchers from Northeastern University and Colby College, will leverage GPS technology, real-time computer vision via the smartphone camera, and artificial intelligence to support the functions offered through the app.

....Users will create a profile in AVA that reflects their needs and existing methods of navigation. The app will use the information from their profiles to find a suitable vehicle for transport, then determine whether one is available.
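The article only sketches that profile-matching step, so the following is a rough, made-up illustration of what it could involve; none of these field or function names come from AVA itself.

from dataclasses import dataclass

@dataclass
class RiderProfile:
    name: str
    needs: set          # e.g. {"audio_guidance", "step_free_entry"}

@dataclass
class Vehicle:
    vehicle_id: str
    features: set
    available: bool

def find_suitable_vehicle(profile, fleet):
    # Return the first available vehicle whose features cover every
    # need in the rider's profile, or None if nothing suitable is free.
    for vehicle in fleet:
        if vehicle.available and profile.needs <= vehicle.features:
            return vehicle
    return None

fleet = [
    Vehicle("AV-1", {"audio_guidance"}, available=False),
    Vehicle("AV-2", {"audio_guidance", "step_free_entry"}, available=True),
]
rider = RiderProfile("Kes", {"audio_guidance"})
match = find_suitable_vehicle(rider, fleet)
print(match.vehicle_id if match else "No suitable vehicle available")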

When the vehicle arrives, AVA will guide the user to it using the camera and augmented reality (AR), superimposing high-contrast lines on the smartphone's view of the environment to highlight the path, along with verbal guidance such as compass directions, street names, addresses and nearby landmarks. The app also will pinpoint environmental hazards, such as low-contrast curbs, by emphasizing them with contrasting lines and vibrating when users approach them. It will then help users find the door handle to enter the vehicle awaiting them.
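That hazard-alert behavior (vibrate as the user nears a flagged curb) suggests simple proximity logic. Here is a guess at what it might look like; vibrate() stands in for a platform haptics API, and the positions and radius are invented for the example.

import math

def vibrate(intensity):
    # Stand-in for a real haptics API; intensity runs from 0.0 to 1.0.
    print(f"vibrate at intensity {intensity:.2f}")

def check_hazards(user_pos, hazards, alert_radius=3.0):
    # Vibrate more strongly the closer the user gets to any flagged
    # hazard within the alert radius (positions in meters).
    for label, pos in hazards.items():
        d = math.hypot(user_pos[0] - pos[0], user_pos[1] - pos[1])
        if d <= alert_radius:
            vibrate(1.0 - d / alert_radius)

# Example: a curb flagged 1.5 m ahead triggers a mid-strength buzz.
check_hazards((0.0, 0.0), {"low-contrast curb": (1.5, 0.0)})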

“This is the first project of its kind in the country, and in combination with our other work in this area, we are addressing an end-to-end solution for AVs (autonomous vehicles) that will improve their accessibility for all,” says Nicholas Giudice, chief research scientist at VEMI Lab and lead on the AVA project.
“Most work in this area only deals with sighted passengers, yet the under-represented driving populations we are supporting stand to benefit most from this technology and are one of the fastest growing demographics in the country.”

AVRG studies how autonomous vehicles can meet various accessibility needs. VEMI Lab itself has explored tactics for improving consumer trust in this emerging technology:
https://umaine.edu/news/blog/2019/08/23/umaine-research-project-on-improving-trust-in-autonomous-vehicles-using-human-vehicle-collaboration/

AVA advances both groups’ endeavors: it not only provides another means for people with visual impairments, other disabilities and seniors to access self-driving vehicles, but also increases their trust in them. The project also builds on a seed grant-funded, joint effort between UMaine and Northeastern University to improve accessibility, safety and situational awareness within the self-driving vehicle. Researchers from both universities aim to develop a new model of human-AI vehicle interaction to ensure people with visual impairments and seniors understand what the autonomous vehicle is doing and that it can sense, interpret and communicate with the passenger.

The app will offer modules that train users how to order and locate rides, particularly through mock pickup scenarios. Offering hands-on learning gives users confidence in themselves and the technology, according to researchers. It also gathers data AVRG can use in its iterative, ongoing development of AVA and its integration into autonomous vehicles.

“We are very excited about this opportunity to create accessible technology which will help the transition to fully autonomous vehicles for all. The freedom and independence of all travelers is imperative as we move forward,” says VEMI lab director Richard Corey.

VEMI Lab, co-founded by Corey and Giudice in 2008, explores technology solutions to unmet challenges. Its prime areas of research and development are self-driving vehicles, the design of bio-inspired tools to improve human-machine interaction and functionality, and new technology to improve environmental awareness, spatial learning and navigational wayfinding.
