kestrell: (Default)
According to this article, the i stands for "internet" because, back in 1998 when the iMac was released, a computer with the built-in ability to connect to the Internet was pretty radical, as was spelling Internet with a lowercase i.
https://www.howtogeek.com/784151/what-does-the-i-in-iphone-stand-for/
kestrell: (Default)
Kes: I admit it, I didn't listen to this before I posted the link because I was busy doing homework, but now that I have listened to it...Didn't I write this thesis over thirteen years ago?? Like, word for word and example for example in many places. It's as if I am so far ahead, I feel behind.

Seeing AI is possibly the most useful app for blind and low vision users, and the developer is a really dynamic speaker who has lots of great stories about users who have found their own uses for the app.

Seeing AI is a talking camera app for people who are blind/low vision. It describes the text, people, and things around you. Come hear about our latest developments, leveraging AI+AR to provide an immersive audio AR experience.

Speaker: Saqib Shaikh
At Microsoft, Saqib Shaikh leads teams of engineers to blend emerging technologies with natural user experiences to empower people with disabilities to achieve more - and thus to create a more inclusive world for all.
YouTube video
https://www.youtube.com/watch?v=KfmC1PAe8o8
kestrell: (Default)
Kes: This is one of the most useful apps for visually impaired people--I use it for everything from reading package labels to identifying my beer in the drinks cabinet. The developer is a great speaker, and he has lots of fascinating stories about how visually impaired people have found different uses for the app than he ever imagined when he initially developed it.

Seeing AI: Describing the world for people who are blind/low vision

Date: Tuesday, February 15, 2022

Description: Seeing AI is a talking camera app for people who are blind/low vision. It describes the text, people, and things around you.
Come hear about our latest developments, leveraging AI+AR to provide an immersive audio AR experience.

Speaker: Saqib Shaikh

At Microsoft, Saqib Shaikh leads teams of engineers to blend emerging technologies with natural user experiences to empower people with disabilities to achieve more - and thus to create a more inclusive world for all.
The Seeing AI project enables someone who is visually impaired to hold up their phone, and hear more about the text, people, and objects in their surroundings. It has won multiple awards, and been called "life changing" by users. Shaikh has demonstrated his work to the UK Prime Minister, and to the House of Lords. The video of the original prototype (http://youtu.be/R2mC-NUAmMk) has been viewed over three million times.

Sign up and find more info at
https://www.meetup.com/hololens-mr/events/282678622/

Hosted by: The Microsoft HoloLens and Mixed Reality Meetup
kestrell: (Default)
This is a great little article on the history of optical character recognition (OCR) for the blind, including how it took forty years for it to become anything close to affordable, which many contend it still isn't for many visually impaired people.
https://accessibility-insights.com/2021/11/06/live-text-new-in-ios-15-is-amazing-but-it-took-us-45-years-of-technical-advancements-to-get-there/
kestrell: (Default)
Kes: Purrhaps this explains why 90% of all images get identified as cats.

From MIT Technology Review

April 1, 2021
The 10 most cited AI data sets are riddled with label errors, according to a new study out of MIT (https://arxiv.org/pdf/2103.14749.pdf), and it's distorting our understanding of the field's progress.

Data sets are the backbone of AI research, but some are more critical than others. There is a core set of them that researchers use to evaluate machine-learning models as a way to track how AI capabilities are advancing over time. One of the best-known is the canonical image-recognition data set ImageNet, which kicked off the modern AI revolution. There's also MNIST, which compiles images of handwritten numbers between 0 and 9. Other data sets test models trained to recognize audio, text, and hand drawings.

In recent years, studies have found that these data sets can contain serious flaws. ImageNet, for example, contains racist and sexist labels (https://excavating.ai/) as well as photos of people's faces obtained without consent.
The latest study now looks at another problem: many of the labels are just flat-out wrong. A mushroom is labeled a spoon, a frog is labeled a cat, and a high note from Ariana Grande is labeled a whistle. The ImageNet test set has an estimated label error rate of 5.8%. Meanwhile, the test set for QuickDraw, a compilation of hand drawings, has an estimated error rate of 10.1%.
How was it measured? Each of the 10 data sets used for evaluating models has a corresponding data set used for training them. The researchers, MIT graduate students Curtis G. Northcutt and Anish Athalye and alum Jonas Mueller, used the training data sets to develop a machine-learning model and then used it to predict the labels in the testing data. If the model disagreed with the original label, the data point was flagged up for manual review. Five human reviewers on Amazon Mechanical Turk were asked to vote on which label—the model’s or the original—they thought was correct. If the majority of the human reviewers agreed with the model, the original label was tallied as an error and then corrected.
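For readers who want to see the shape of that procedure, here is a toy sketch of the flag-and-review idea. It is my own illustration rather than the study's actual pipeline (the authors' code is linked further down), and the data set (scikit-learn's digits), the model (logistic regression), and the split sizes are all placeholder choices.

```python
# Toy sketch of the flag-and-review idea: train on the training split,
# predict the test split, and flag test examples whose given label
# disagrees with the model's prediction so a human can review them.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)              # stand-in for a real benchmark
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

model = LogisticRegression(max_iter=2000)        # placeholder model
model.fit(X_train, y_train)

pred = model.predict(X_test)
flagged = np.flatnonzero(pred != y_test)         # candidates for human review

print(f"{len(flagged)} of {len(y_test)} test labels flagged for review")
# In the study, flagged items went to Mechanical Turk reviewers, and a label
# only counted as an error when a majority of reviewers sided with the model.
```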

Does this matter? Yes. The researchers looked at 34 models whose performance had previously been measured against the ImageNet test set. Then they remeasured each model against the roughly 1,500 examples where the data labels were found to be wrong. They found that the models that didn’t perform so well on the original incorrect labels were some of the best performers after the labels were corrected. In particular, the simpler models seemed to fare better on the corrected data than the more complicated models that are used by tech giants like Google for image recognition and assumed to be the best in the field. In other words, we may have an inflated sense of how great these complicated models are because of flawed testing data.

Now what? Northcutt encourages the AI field to create cleaner data sets for evaluating models and tracking the field's progress. He also recommends that researchers improve their data hygiene when working with their own data. Otherwise, he says, "if you have a noisy data set and a bunch of models you're trying out, and you're going to deploy them in the real world," you could end up selecting the wrong model. To this end, he open-sourced the code he used in his study for correcting label errors (https://github.com/cgnorthcutt/cleanlab), which he says is already in use at a few major tech companies.
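Since the cleanlab repo is linked above, here is a minimal usage sketch. It assumes a recent cleanlab release (2.x), where label-issue detection lives in cleanlab.filter.find_label_issues; the API has changed between versions, so check the project's README, and the predicted probabilities and labels below are made-up placeholders you would replace with out-of-sample predictions from your own model.

```python
# Minimal cleanlab sketch (assumes cleanlab 2.x; function names and arguments
# may differ in other releases -- check the repo's README before relying on it).
import numpy as np
from cleanlab.filter import find_label_issues

# Placeholder inputs: out-of-sample predicted probabilities (one row per
# example, one column per class) plus the given, possibly noisy, labels.
pred_probs = np.array([
    [0.95, 0.05],   # confidently class 0, labeled 0 -> looks fine
    [0.10, 0.90],   # confidently class 1, labeled 0 -> likely label error
    [0.85, 0.15],   # confidently class 0, labeled 0 -> looks fine
    [0.20, 0.80],   # confidently class 1, labeled 1 -> looks fine
    [0.70, 0.30],   # leaning class 0, labeled 1 -> borderline
    [0.05, 0.95],   # confidently class 1, labeled 1 -> looks fine
])
labels = np.array([0, 0, 0, 1, 1, 1])

issue_indices = find_label_issues(
    labels=labels,
    pred_probs=pred_probs,
    return_indices_ranked_by="self_confidence",  # worst-looking labels first
)
print("Indices flagged as likely label errors:", issue_indices)
```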
kestrell: (Default)
Kes: I was telling a housemate about this app and pondering possible applications, such as: you could post secret signs around a building just for friends who have the app and can access your tags, or, if you have a large old house like ours, you could throw a themed party, such as an Alice in Wonderland party, and have the tags give different quotes and themes for different rooms.

NaviLens – The new smart digital signage for everyone! Or, as we say, QR Codes on Steroids! Free app on iOS and Android
by Blind Abilities Team
http://blindabilities.com/?p=6584#genesis-content

Show Summary:
Jeff and Pete are in the studio again, this time to chat with Javier Pita, founder and CEO of NaviLens Corp., and their remarkable product, NaviLens.
NaviLens is a new and enhanced kind of QR code, but unlike existing QR codes, which can only be read from a short distance, a NaviLens tag can be detected by your smartphone camera from up to 60 feet away within a 160-degree wide-angle range. This means that you can now detect NaviLens tags that are almost at your 9 o'clock or 3 o'clock direction, making them simple to find if you are blind or visually impaired. For example, you can now find an indoor room as you walk down a hallway, or another destination as you approach a bus stop or store front. And with NaviLens, you don't even have to aim or point your phone as precisely as before. Jeff and Pete sometimes refer to them as "QR Codes on Steroids."
In addition, there are free home-use NaviLens tags that you can print out on your own printer and customize for whatever purpose you want: labeling your record or CD collection, items in your fridge or pantry, clothing, medications, or anything else. Then you can find things as easily as opening the NaviLens app on your phone. Again, totally free.
NaviLens is also available with pre-labeled tag kits for schools and other packages for businesses. It would be great to see this product adopted by more schools and businesses around the country. So get the word out!
The company is also rolling out a new and more dynamic feature called NaviLens 360 Vision which gives you detailed step by step, foot by foot guidance to your destination with all kinds of really cool navigational assistance, such as audible tones to guide you left or right or straight to your destination. While this new feature will be available later this month, Javier gives us a sneak preview on today’s podcast, so be sure to give it a listen, begin using the tags yourself, and think about reaching out to a local business, school or governmental agency to implement the use of NaviLens in their location - it will benefit them as well as you!
You can find out much more on their website, NaviLens.com
You can also follow them on Twitter: @navilens
kestrell: (Default)
This morning I had a Zoom meeting and found myself saying first "Hi, Sara," and then "Hi, Paul," but it seems that Siri now also answers to the name Sara (or perhaps Sara is her evil twin?) and believes that "Hi, Paul" means that she should go ahead and phone Paul, and meanwhile Alexa mutters that she doesn't know how to respond to that name.

I'm contemplating renaming the aerye the Singularity.
kestrell: (Default)
Over the past week, I've been playing with the Microsoft Soundscape app
https://www.microsoft.com/en-us/research/product/soundscape/#banner
which provides navigational information about the real world for visually impaired people, using binaural audio to create the effect of 3D sound (spatial audio).

I used this app on my first gen iPhone SE along with a pair of Bose Frames bluetooth audio sunglasses, Rondo version
https://www.thurrott.com/microsoft/230909/microsoft-soundscape-now-supports-bose-frames-to-better-help-the-blind

As a blind technology user, I'm used to hearing a lot of promises and inflated marketing about tech that supposedly helps visually impaired users navigate the real world, so I was really skeptical about both the app and the bluetooth sunglasses, but...
THEY ARE TOTALLY AWESOME!!

I was surprised by how much info the audio interface provides, and by how much control the user has over what kind of info is spoken (for example, there are filters for public transportation, stores, food and restaurants, and things to do). It not only gives information about your current location and destination, but also provides info about which direction you are facing, cross-streets and intersections, and landmarks you are passing as you move, with spatial audio sounding in either your left or right ear to indicate which side a landmark is on and how close it is.
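To make that left-or-right cue concrete, here is a toy sketch of how an app might turn a landmark's bearing into a stereo pan and its distance into loudness. This is purely my own illustration of the idea, not how Soundscape actually implements its spatial audio, and the function name and numbers are made up.

```python
# Toy illustration of direction-and-distance audio cues (not Soundscape's
# actual implementation): map a landmark's bearing relative to the user's
# heading onto a left/right stereo pan, and its distance onto loudness.
import math

def landmark_cue(user_heading_deg, bearing_to_landmark_deg, distance_m,
                 max_audible_m=200.0):
    """Return (pan, gain): pan in [-1 left, +1 right], gain in [0, 1]."""
    # Angle of the landmark relative to where the user faces, wrapped into [-180, 180).
    relative = (bearing_to_landmark_deg - user_heading_deg + 180) % 360 - 180
    pan = math.sin(math.radians(relative))              # -1 = hard left, +1 = hard right
    gain = max(0.0, 1.0 - distance_m / max_audible_m)   # closer = louder
    return pan, gain

# A cafe 40 meters away, 30 degrees to the user's right:
print(landmark_cue(user_heading_deg=0, bearing_to_landmark_deg=30, distance_m=40))
# -> pan of about 0.5 (toward the right ear), gain of 0.8 (fairly close, so fairly loud)
```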

It also provides many other features which I am still learning how to use. One is the ability to add audio beacons to locations, such as your home, your starting location, or your destination; users can also add personalized audio tags to locations.

There's also a fantastic feature called Street Preview, which is described as "providing an innovative tool for virtually exploring the world! With it, you can select any location in the world to preview the area at street level in order to familiarize and build a mental map of the space." I had often thought about how useful a feature like this would be, and the first time I used it I went to the Cafe du Monde in New Orleans and walked around Jackson Square, listening to the app tell me about the streets I was on and what was around me. I couldn't get the music or the smells of all that great New Orleans food and chicory coffee but, if I could, this app would really be perfect.

As an XR device, the Bose bluetooth sunglasses serve the same function as headsets do for sighted people, providing an experience that is both immersive and intuitive, with natural movement and actions. The sound is delivered as a beam toward the ear, so the audio quality is fantastic; plus, it actually sounds as if it is inside your head, so the sense of immersion is really vivid. You can easily turn them on by pressing a tiny button on the right arm of the frame, and turn them off by removing them and flipping them upside down for two seconds.

The form factor was perfect for me: I typically wear sunglasses anyway, and the shape of the Rondo style sunglasses I was using is identical to the classic Ray Ban Wayfarers that I favor, while weighing only slightly more. I'm also one of those people who has trouble finding earbuds which stay in their ears, so the Bose sunglasses are a great alternative. I also have a hearing impairment in my left ear, but the sound quality was so good that I had no problem hearing the audio cues in that ear.
Note: I am a short person with a small face, and the Rondo style fit me *perfectly*. I'm not sure if they would be comfortable on a larger person, and all the other Bose Frames styles are much larger, and way too large for my face.

The MS Soundscape app is constantly being improved, and has regular updates
https://iphone.apkpure.com/microsoft-soundscape/com.microsoft.soundscape

What's missing:
1. Microsoft seems to be showing some bias against making Soundscape available on Android: granted, the iPhone has a huge following amongst visually impaired people, but keeping Soundscape platform-specific seems like a jerk move on MS's part.
2. The app is free, but it does require the user to share usage info with Microsoft, which could be a privacy concern for some users.
3. Like everyone else, visually impaired or sighted, I wish there were a way for these to work inside large buildings, like campuses, hospitals, or office buildings.
4. If Soundscape could also deliver the sounds and smells of New Orleans through the app, I would be in heaven.
kestrell: (Default)
1. Microsoft Soundscape
Microsoft Soundscape is a research project that explores the use of innovative audio-based technology to enable people, particularly those with blindness or low vision, to build a richer awareness of their surroundings, thus becoming more confident and empowered to get around. Unlike step-by-step navigation apps, Soundscape uses 3D audio cues to enrich ambient awareness and provide a new way to relate to the environment. It allows you to build a mental map and make personal route choices while being more comfortable within unfamiliar spaces.
https://www.microsoft.com/en-us/research/product/soundscape/
Traveling to Essential Services using Microsoft Soundscape
https://www.microsoft.com/en-us/research/group/enable/articles/travelling-to-essential-services-using-microsoft-soundscape/
Annotating and sharing markers on Microsoft Soundscape
https://www.microsoft.com/en-us/research/group/enable/articles/annotating-and-sharing-markers-with-family-and-friends-through-microsoft-soundscape/#main
Getting FreshAir with Microsoft Soundscape
https://www.microsoft.com/en-us/research/group/enable/articles/getting-fresh-air-with-microsoft-soundscape/

2. Seeing AI app by Microsoft
A multi-purpose app which assists visually impaired users in scanning and recognizing documents, recognizing currency, identifying objects in a picture, and more
https://www.microsoft.com/en-us/ai/seeing-ai
Hadley School for the Blind instructional video on using Seeing AI
https://www.youtube.com/watch?v=xRFGU2os7Og

3. The BARD mobile app from National Library Service has recently been updated
More information on the Eyes on Success podcast
https://eyesonsuccess.net/
BARD mobile User Guide
https://nlsbard.loc.gov/apidocs/BARDMobile.userguide.iOS.1.2.html

4. Webinar: A Beginner's Guide to Google Docs and Calendar with JAWS
by the Maryland State Library for the Blind Technology Group
Saturday, 13 June 2020 at 14:00 GMT
You can find schedule info along with recordings of past webinars here
https://www.marylandlibraries.org/Pages/Technology%20User%20Group.aspx

5. Pinning and unpinning apps on the Start menu using a screen reader
https://www.youtube.com/watch?v=7OBdN3qWLz8&list=PLeXWOOlSQiVTecgn86zE0CO-yxzTXue1W
More helpful videos for blind screen reader users by Catch These Words
https://www.youtube.com/channel/UCrx961Vuddmymk1rhlKyA8g
kestrell: (Default)
Yesterday my iPhone developed a new glitch in which the volume drops to a really tiny level. Originally, this happened only when I asked it something, such as "What song is this?" Then I asked Alexx to see if he could fix it, and it began doing it whenever it felt like it.
Today I woke up and my iPhone had developed a new voice. It's male and British and reads messages as soon as they arrive, and I've never heard this voice before.

I think my iPhone has decided that now is the perfect moment to take over. I don't know if it's trying to take over my brain, my body, Melville Keep, or the entire world, but don't say I didn't warn you.
kestrell: (Default)
This article
https://www.howtogeek.com/662339/how-to-join-a-zoom-meeting/

lists multiple ways to join a Zoom meeting.

However, I've found that using my PC involves tweaking way too many settings, so instead I have whoever sets up the meeting send me a text with the one-tap link in it.
Troubleshooting I have had to figure out so far:
1. Sometimes, if the person sending the text includes other text besides the link, it makes it difficult for me to click on the link.
2. Sometimes, when I get into the meeting, VoiceOver says that I am muted, and no matter how many times I use the *6 unmute command, I just get muted again; at some point my onscreen keyboard dims and I can't get it undimmed to unmute.
I haven't found a way to tweak the keyboard dimming setting, so every few minutes I flip my phone up and then back down to the horizontal position, which refreshes the keyboard.
There has to be a better way to accomplish this, but VoiceOver changes many of the default commands, so it's difficult to find troubleshooting fixes that also work with VoiceOver.
Many users who are more confident using an iPhone turn VoiceOver off while using other apps, but I'm not at that level yet.

Finally, let me once more plug the National Braille Press (nbp.org) for providing a pile of technology guides for blind and low-vision users. Anna Dresner's books, in particular, are both detailed and clearly written, and you will definitely get your money's worth from any of her guides.
Note that, although it is called the "Braille" Press, you can get these guides in all sorts of accessible formats, including DAISY, and you can buy these books already loaded on a thumb drive so that you can just plug it into whatever device you use to read your books.
kestrell: (Default)
I saw these two screen magnifiers on a list of stocking stuffers for movie fans, but thought they would be a fun option for low vision folks; plus, they're less expensive than the usual disability-tax-priced options.

1. Luckies of London Retro Screen Magnifier
2x magnification plus enlarges your smartphone screen to 8 inches
https://www.amazon.com/Phone-Screen-Magnifier-8-Inch-Magnification/dp/B01KV3BFD6/?tag=whtgh-20
2. A 12-inch screen option
https://www.amazon.com/Screen-Magnifier-Smartphone-Compatible-Smartphones/dp/B07VC383LJ/?tag=whtgh-20&th=1
kestrell: (Default)
I think I may be ready to try out some of the blind-accessible navigational apps. However, I would rather not walk around with a cane in one hand and my iPhone in the other, and have to worry about dropping my iPhone.

I've heard of iPhone sleeves: can anyone recommend one of these? I have a first-generation iPhone SE, which is one of the smaller ones, so a sleeve should work pretty easily.
kestrell: (Default)
If you are blind or visually impaired and haven't checked out the technology guides published by the National Braille Press, you should definitely do so
http://www.nbp.org/ic/nbp/publications/apple.html

I'm constantly overwhelmed by all the new tech that is becoming accessible to blind people and, after having my iPhone for over two months, I still don't access more than a handful of its features. Plus, all those new apps that offer assistance with travel navigation? Clueless. But I just found a bunch of guides to talk me through learning how to use these features.

If you are a visually impaired smartphone user, you have probably spent as much time as I have trying to find documentation for accessibility features, with frustratingly little profit, but these guides make it quick and easy to learn what you want to learn.

There are guides for both iPhone and Android users, and there are guides on particular subjects such as navigation and GPS apps, ebook apps, and writing on your smartphone.
kestrell: (Default)
Also, there is a Voice Dream Reader app for Android. Interesting that this line of apps was not originally designed to be accessible; the design was just so good that visually impaired users could use it.
https://www.afb.org/aw/20/5/16444
kestrell: (Default)
My new iPhone arrived on Tuesday morning and I have to admit, I am truly impressed by the level of accessibility offered by both iOS and Apple as a company. The least accessible part of the whole process of purchasing, receiving, and getting the iPhone running and registered was finding the link to purchase the phone I wanted on the Apple website.

Full disclosure: I had sighted help with that part and with a couple of other parts of the process, such as getting Siri started and registering the phone. However, Apple provides a service which is particularly useful for users with disabilities: as soon as Apple is alerted that your phone has been delivered--and the user will also get an alert for this--an email is sent to the user's mailbox with a link to schedule time with a live Apple tech support person, who can assist the user through the starting and registering process, including turning on the phone's accessibility features and transferring information from a previous phone to the new phone.

Once the phone is charged and on and Siri has been turned on, the user can tell Siri to turn on VoiceOver, the iOS screen reader; with the use of both Siri and VoiceOver, it's relatively easy to learn a number of the most basic tasks from the first day.
Read more... )
