A visual impairment does not stop a woman from applying makeup, but it can prevent her from seeing clearly how the makeup looks. Now there’s an app for that, and it fits neatly into her smartphone alongside the other accessibility apps she uses daily. The inclusive beauty technology is developed by one of the biggest players in the world of cosmetics, the Estee Lauder Companies.
The ELC Voice-Enabled Makeup Assistant was recognized in Fast Company’s Innovation by Design Awards, winning in the Beauty and Fashion category and named a finalist in the Artificial Intelligence category. And we’ll give it a prize for inclusion. The Virtual Makeup Assistant works with all makeup brands, and it is not restricted to people with vision loss, or to women for that matter; everyone is welcome!
It’s good to have the objectivity of AI give you an individual scan to review the makeup on your face, eyes and lips, then tell you whether blending or touch-up is needed. And it’s utterly affirming to hear you look “Fabulous!” The VMA uses the phone’s camera for a face scan, directing the user to position the face properly. The app assesses the makeup’s uniformity, boundaries and coverage. The assistant makes specific touch-up recommendations, if applicable, then reassesses.
Woman checking makeup application with the Estee Lauder virtual Makeup Assistant. Image credit: Vogue Business.
The assistant is completely voice-enabled and talks you through the process each time. As the app opens, the VMA says, “Welcome back, to begin checking your makeup, say ‘start.’” The next question is, “Should I check foundation, eye shadow or lipstick first?” When there is a pause, the assistant will turn off the camera and tell you to tap the “Continue” button at the bottom of the screen when ready to resume. With each scan the voice provides guidance, for example: “Look to the right for 3 seconds, then to the left for 3 seconds,” or “Tilt your head up slightly.” Feedback includes a specific description and location of any area that needs attention. Positive feedback tells you the makeup looks “Beautiful!” and you’re good to go. It sure beats walking out the door and having someone inform you there’s a smudge on your face.
The app works automatically with or without a screen reader. Change the voice speed or submit feedback about your VMA experience by saying, “Open Settings” and selecting Voice Speed or Feedback.
The VMA is compatible with iPhone 8 or later running iOS 14.1 or later, and is currently available in the UK and US. It is coming soon to other markets and to Google Play.
Technology overwhelms most of us to some degree. A visual impairment and an aging brain can most certainly compound the problem. The endless stream of messages, notifications, alerts, sounds and haptics disrupts focus and elevates angst. But the upside, so far, still outweighs the downside, and life without technology presents many other difficulties. There are ways to meet it in the middle while maintaining a sense of control.
The truth is, technology can help us compensate for vision loss in many ways. Take charge and prioritize the apps and programs you use on a daily basis. It’s impossible to know everything, because it’s not always intuitive and it is always changing, sometimes for the better, sometimes not. It’s all about adjusting. And don’t forget, learning new things is good for the brain.
Giving up technology is never the best option, so consider these steps to keep tech in check.
Accessibility Support by Phone.
This is the #1 tip, so don’t hesitate to use it. Call for expert assistance to personalize vision settings and to help resolve other issues, such as a changed interface or navigation. Here is OE’s coveted list:
Accessibility support specialist helps customer with vision settings.
Embrace Small Changes.
Inertia is the enemy of automation; there is no freezing it in place. Allowing small, incremental updates averts an eventual and painful total shutdown, where the changes are never small. It’s not worth the risk: Don’t Avoid Software Updates.
Learn One Thing At A Time.
Practice is key to learning; the more we do, the better we get. It’s important to be very selective about the tech you touch every day. Learning new skills is, in fact, essential exercise for the brain. Here’s how to approach the process of Learning At Any Age With Vision Loss.
Reduce Digital Clutter.
We tend to carry a lot of unnecessary data that will never be accessed again. It’s like keeping a storage locker filled with garbage. Yes, this too can become overwhelming. Cleaning it up is actually cathartic, and it may also help everything run more smoothly. Clean Up Your Cluttered Phone.
Minimize Chronic Distractions.
Digital messages incessantly hijack our attention and break concentration. The very creators of this problem, Apple iOS and Android, now offer ways to take back control. Read this and Reclaim Your Digital Focus.
By now we expected smart glasses to be all the rage. They’d be on lots of faces, in the same way smartphones are in (just about) every hand. Actually, they were predicted to replace the smartphone, providing the wearer with voice access to information and apps. And, of course, they’d be accessible to visually impaired or blind users, because smartphones are. It would be wonderfully inclusive and normalizing, glasses that made information accessible, whether you could see the screen or not.
Apple’s Rumored Glasses Are Unveiled
For years the rumors abounded, with stories of Apple’s smart glasses in development. They would somehow replace the iPhone, and they would be accessible, of course. Years of gossip on the subject should have taught us not to believe everything we read. Nonetheless, the rumors continue. According to Bloomberg, as published on Apple News, “Apple’s long-anticipated mixed-reality headset is an ambitious attempt to create a 3D version of the iPhone’s operating system, with eye- and hand-tracking systems that could set the technology apart from rival products.” Now said to be launching in early 2023 at a price of $3,000, the headset, they also report, will offload its battery pack, roughly the size of two iPhones, to rest in the user’s pocket at the end of a cable, which sounds rather antiquated.
The Apple Vision Pro was officially unveiled in June, but it will not be unboxed until early 2024. It doesn’t look like the smart glasses of our dreams, but hey, who are we to naysay? Apple’s track record in accessibility is as good as it gets, no one else has come close, so let’s wait and see. In Forbes, Gus Alexiou asks: could Apple’s Vision Pro significantly augment sight for people with vision loss? He notes this has been “a long held aspiration within the field of assistive technology” that has failed to materialize. “This could well be about to change” when Apple’s Vision Pro launches, “though,” he says, “the price point of the $3,499 mixed-reality spatial computer doesn’t exactly scream out accessibility.” But actually, the assistive low vision head-mounted devices he mentions have been sold at a similar price point. So it’s hard to complain about the price; hopefully this one will be worth it.
Google’s Early Glass
It was the anticipated, but short-lived, promise of Google Glass. Launched in 2013 with a $1,500 price tag, it carried the intention of creating a ubiquitous computer whose wearers would communicate with the internet via natural language. An excellent idea, but it failed to reach critical mass. Google discontinued the public product in 2015, re-introducing Glass for enterprise in 2017. Was the technology not quite ready, or was it the customer who was not in sync?
Person wearing Apple Vision Pro (image credit: apple.com)
Wearable Low Vision Devices
If anything has taught us to manage expectations, it is the classification of Wearable Low Vision Devices, also referred to as Electronic Glasses or Smart Glasses. These are head-mounted devices that enhance vision, predominantly through video magnification for people with central vision loss, or field expansion for those with a narrowed visual field, while others offer non-visual assistance. Low vision devices have also been in development for more than a decade with improvements in technology and price.
Visual assistance comes mainly in the form of Trekkie-looking headsets that are slowly coming down in size and weight. Some devices are not designed for mobility and all should be carefully evaluated for specific applications that include reading, watching TV, movies, theater, cooking, crafts, card and board games. IrisVision, Eyedaptic and eSight may serve the need at prices ranging from $2,000 to $6,000.
Non-Visual Low Vision Options
People with uncorrectable vision loss want to see better; no question about that. However, when artificial vision from clunky headsets does not do the trick, there are non-visual options. The OrCam MyEye, at $4,500, reads text and identifies colors, products, and faces. OrCam is not technically smart glasses; it’s a talking camera that clips quietly to the arm of any eyewear. Aira’s visual interpreting assistance can also be accessed through Envision Glasses for $3,000 plus a monthly service fee, now starting at $50 for 30 minutes.
No smart glasses are needed to magnify images, read, convert text-to-speech, recognize objects and faces, read barcodes and handwriting on an iPhone or most Android phones. Some people consider it less convenient as it requires holding a phone and tapping the screen.
First published Jan 31, 2023 and updated Aug 25, 2023
While researching Amazon’s Accessibility, we discovered just how desirable it can be to have Alexa read to us. Amazon’s well known, much loved, voice assistant will read Kindle e-books with her voice or play Audible audiobooks, all you have to do is ask.
This is surely one of the easiest options available for reading books. It’s a natural for people with vision loss, and a great convenience for anyone who wants to continue reading while they do other things. Ironically, Alexa’s skills do not even come under the heading of ‘accessibility,’ yet this is the functionality we’ve been waiting for.
Amazon Fire 7 tablet unboxed
We learned more about this fantastic feature in an email from our friend Steven. He wrote, “I purchased the brand new Fire 7 tablet.” Fire tablets vary in screen size, storage capacity and price starting at $60. “It is hands-free using Alexa vocal commands to open a book, pause the book, fast forward, go to a different chapter, etc. All of which I saw demonstrated at the (Amazon Books) store. Alexa is built in to the Fire tablet, no additional devices are required.” He appreciated his in-store experience. “The sales person was great, totally setting up and downloading my tablet and it was fully functional when I left the store, except for connecting it to wifi when I got home.”
Steven’s in-store approach can no longer be duplicated, since Amazon has closed all 24 of its physical bookstores. Another good option is to phone Amazon’s Accessibility Customer Service at 888-283-1678. They can help you make the purchase online and walk you through the setup, which requires entering a wifi password and an Amazon account.
The very same voice commands that control the reading on the tablet do the same on an Amazon Echo speaker. Alexa can read Kindle books that are authorized as Text-to-Speech or Screen Reader Enabled. It seems the majority of books are eligible; just be sure to check before you buy a Kindle book.
Alexa will also read your Kindle and Audible books on an iOS device or Android, although it’s not quite as hands-free or as agreeable as it is on an Amazon device. In this case you would open the Alexa app and tap the button to ask.
Amazon Alexa speaker next to smart phone with Amazon apps
How to Ask Alexa
Kindle Alexa commands:
– Read my Kindle book “To Kill A Mockingbird”
– Play
– Pause
– Resume
– Stop
– Skip Back / Skip Ahead
It is Alexa’s very own familiar voice that reads the Kindle books, and she’s a very good reader. Audible books are read by an array of professional readers including authors, actors and celebrities.
There is something magical about dictation. Spoken words are rapidly turned to text. The instantaneous nature is pretty amazing, but the best thing is dictation takes the place of typing.
Whether you have a visual impairment or not, typing on a tiny touchscreen keyboard is a tedious task. That is why more and more people are becoming dictators.
On Apple iOS devices, Dictation is available whenever there is a keyboard on screen. The Dictation button is the microphone icon at the lower right corner or to the left of the Space Bar, depending on the device. If the microphone key is not there, go to Settings, tap General, then tap Keyboard and turn on Enable Dictation.
Here are the steps for dictating with iPhone or iPad.
Tap the Dictation / microphone button below the keyboard and be prepared to start speaking after the single ding tone.
Finish speaking and tap again; you will hear a second single ding, and your spoken words will appear in the text field.
(With Apple’s VoiceOver screen reader, use a two-finger double tap to activate Dictation and a second two-finger double tap to stop; VoiceOver then reads the text aloud.)
iPhone screenshot shows microphone button on search bar and below keyboard.
Speak clearly for best results; noisy environments will interfere with accuracy. Dictate one sentence at a time for accuracy. Correct errors on imperfect transcripts, or delete it all and try, try again.
To include punctuation, just say so. Finish a sentence with a “period” or a “question mark.” Follow a salutation with a “comma” or a “colon.” Also say “new line,” “new paragraph,” “all caps,” “apostrophe,” “hyphen,” or “exclamation mark.”
Practicing can actually be fun, so go ahead and do it. You’ll be a powerful dictator in no time!
Crossing the street is a risky business when you can’t see the signal on the other side. People with vision loss develop strategies as the traffic lights become less and less visible to the eyes. Depending on the intersection and traffic patterns, stepping off the curb can be a real leap of faith. Accessible Pedestrian Signals have been around for decades; however, they are expensive to install and a challenge to maintain. In cities large and small, no matter how many audible signals exist, visually impaired people will inevitably encounter crosswalks that are not accessible. Then what? If you cannot see the pedestrian signal, it is unlikely that you can turn around and hunt for another signal that is accessible. This is where the OKO app comes in.
How OKO Works
OKO is a smart camera app that detects and reads crosswalk signals instantly, providing information as audio, visual and haptic feedback. When the camera is pointed at the opposite side of the street, the app detects the status of the pedestrian signal and immediately begins to inform the user with three types of feedback. Play the sounds in the app to familiarize yourself before heading to the street.
DON’T WALK – audible tone and haptic vibration is a slow beat and Red screen means STOP
WALK – audible tone and haptic vibration is a fast beat and Green screen means GO
COUNTDOWN – audible tone and haptic vibration is a timed beat and Yellow screen means WAIT
The app is similar to Seeing AI in its simplicity and quick response. Before you take it for a spin, you’ll have to agree to the terms of use, a reminder that OKO is not a replacement for mobility devices and training. It is currently available for iOS in the United States and Belgium, with more countries in testing mode. The app developer, AEYES, is also working on a feature that will read the signs on city transit buses.
Launched in late November 2022, ChatGPT amassed a record-setting 100 million monthly users by January 2023. TikTok took nine months to reach the same milestone, and Instagram took two and a half years. ChatGPT was created by OpenAI, an American artificial intelligence (AI) development lab that began as a nonprofit with a mission to “promote and develop friendly AI.” Now a for-profit company, OpenAI received a $13 billion investment from Microsoft, igniting a fierce competition with Google to establish the newly dominant AI internet search engine.
Sounds like just the thing we’ve been waiting for: a voice assistant with a PhD. But it is not clear the new chatbot technology will enhance visual accessibility or elevate the intelligence of the voice assistants we’ve come to depend on, like Siri, Google and Alexa. ChatGPT goes beyond answering basic questions on topics like trivia, history and pop culture, setting reminders and managing smart home devices. It is able to conduct natural-sounding conversations with users while providing more in-depth responses. ChatGPT also has a memory and the ability to draw from it for future reference. Think of it as a research assistant, geared more to work than play, and predicted to change the way we work in ways good and bad.
Plenty has been reported about the nefarious side of chatbots too. They can be mean and nasty, tell lies and spread hate. The same behavioral issues surfaced in earlier chatbots, causing them to be taken offline, but there is no indication the current bots will be fired for insubordination. In separate segments on 60 Minutes, neither Google’s Sundar Pichai nor Microsoft’s Brad Smith seemed confident in the ability to control their tech creatures. In March, more than 1,000 tech industry leaders signed an open letter calling for a 6-month moratorium on further development to minimize the dangers to humanity. Several days later, leaders from the Association for the Advancement of Artificial Intelligence released their own letter warning of the risks. This month the “godfather of AI” severed ties with Google to warn of the perils ahead, and a few days later the White House summoned the chief executives of the leading AI companies, Google, Microsoft, OpenAI and Anthropic, warning of the potentially dramatic threats AI can pose to safety and security, and its potential to infringe civil rights and privacy and erode public trust and democracy. Have they lost control of “friendly AI?”
In terms of chatbots for visually impaired and blind users, Microsoft Accessibility is partnering on the development of many new applications. Be My Eyes is testing a Virtual Volunteer meant to provide real-world audio description. We’ll see much more to come on this.
The accessibility of technology has come a long way, which is not to say it’s all smooth sailing. Issues arise and surprise, creating frustration and inconvenience, or even abruptly ending access as we’ve come to know it. Some play it safe, hoping to avoid sorrow, by ignoring software updates, but that too will come back to bite you. So when it happens, do not suffer in silence; take action, make some noise! Before you do anything, remember to close all apps and shut down the device, then reboot, because sometimes that works like magic.
If the accessibility issue is on an Apple device, that might be lucky, because Apple is well equipped to deal with it. First, report the problem to Apple Accessibility Support by phone at 877-204-3930. An accessibility specialist will be able to help you fix it or find a workaround. They will also report the trouble to software engineers and tell you whether others have registered the same complaint. For unresolved issues, it never hurts to also send feedback by email to [email protected]. The more they hear from users, the higher the issue moves in priority. When all goes well, the glitch is fixed in the next software update.
These steps can and should be exercised with issues on software from Microsoft, Google, Amazon and others. Consult OE’s inside guide to Accessibility Support Phone Lines for more.
1. Troubleshoot: turn the feature off, then on again in Settings, and/or close all apps and reboot the device to see if the issue resolves.
2. Contact the accessibility support team by phone.
3. Report the issue with your device name and software version.
Volunteerism is alive and well, as demonstrated every day in the Be My Eyes app. The concept was conceived in 2012 by furniture craftsman Hans Jørgen Wiberg, who began losing his vision twenty-five years prior. It was his idea to provide access to sighted assistance for people with low vision or blindness. The app name leaves no doubt about its purpose. I recall being incredibly impressed in the early days of BME, when there were 150,000 volunteers; today there are over 6.4 million helping more than 480,000 people with vision loss. The fact that so many are interested in lending their eyes to a total stranger is a testament to humanity.
Built on the kind assistance of humans, the BME app is now testing a “Virtual Volunteer” powered by GPT-4. If you want to be part of the beta testing process, there is a registration page in the app, but no guarantee you’ll get in; there is a waiting list. In a post on Mashable, one of the participating testers, Lucy Edwards, is reported to have used the conversational AI tool as a tour guide, food blog, reader of restaurant menus and fashion catalogs, language translator and personal trainer. It will be very interesting to see how this develops: live human kindness vs. AI chatbot.
For now, the all-live volunteers are able to offer their service at times that are convenient to them. BME creates an opportunity to give back in a sort of micro-lending kind of way, in small increments of time, free of rigid scheduling commitments. The visually impaired user is able to call for help whenever it is needed, without feeling they are imposing. The volunteers are logged in because they are ready and willing to help someone, possibly you.
The app, available for iOS and Android, is designed with a fittingly friendly user interface. The two main options on the uncomplicated homepage are “Call a Volunteer” or “Get Trained Support”. Expert company representatives are available in the categories of Assistive Technology, Beauty & Grooming, Blindness Organizations, Careers, Civic Engagement, Food & Beverage, Home & Cleaning, Personal Health and Technical Support. Participating companies include Google, LinkedIn, Microsoft, Spotify, Pantene, Hadley, Lighthouse San Francisco, Accessible Pharmacy, Rite Aid and more.
The service is active in 150 countries and available in 185 languages. When you call a volunteer, BME sends the request to the nearest available volunteers by location and language. There is no limit to the number of calls or the time spent; however, it is best to say at the start of a call if you expect it to be lengthy. The app provides a rating system to register feedback about your experience, good or bad.
Here are 100 Ways to Use BME. I have used the app for assistance reading a thermostat, setting the oven temperature, and reading handwritten notes and product directions. All these encounters with BME volunteers were pleasant, constructive and successful. There is a “Community” tab at the bottom of every page worth exploring for inspiring stories from users and volunteers.
And, by the way, all Be My Eyes Services are free.
Blue and white Be My Eyes logo with white text on black background.
Amazon’s $3.9 billion acquisition of One Medical was completed February 23, 2023, and within days the email invitations were sent to Amazon customers. Join the new monster healthcare venture and get what they’re calling “frustration-free primary care” at an introductory rate of $144 for a year, reduced from $199. This is, they say, “primary care for your body and mind.” Membership benefits are touted as 24/7 virtual care via messaging or video, online appointment booking, on-demand video chat, and in-app prescription requests and renewals. Apparently they also accept most health insurance and can “help” with common illnesses, chronic diseases and mental health concerns. Is this your “doctor’s office re-imagined?”
Amazon One Medical app
Amazon’s Echo speaker and voice assistant Alexa are a legendary duo. Introduced in 2014, the pair now boasts 100,000 skills, a mind-boggling number. In the year 2020 alone, there were 53.6 million Echo devices sold. Lately it appears Alexa is positioned to take on healthcare, which makes a lot of sense.
The same reason Alexa’s popularity has accelerated in home settings is why it can integrate with and enhance communication in healthcare settings. Liron Torres, global leader of Alexa Smart Properties, told Fierce Healthcare, “We believe that ambient computing can dramatically change and improve the way our customers use and interact with technology.” She went on to say that voice technology is “natural, intuitive, and accessible.” The idea is to simplify the way hospitals and assisted living facilities integrate and manage Alexa-enabled devices to elevate care.
Echo Alexa Speaker
Cedars-Sinai and Boston Children’s, among other hospitals and senior living communities, including Atria and Eskaton, are the first to adopt Amazon’s smart service, which enables patients to easily connect with family and communicate with care team members, by way of voice interaction with Alexa. They can also ask Alexa to play a game, turn on music, get the news, or turn on the TV. “Voice is intuitive for patients, regardless of age or tech-savviness,” said Peachy Hain, executive director of Medical and Surgical Services at Cedars-Sinai.
While it’s all just in the early stages, you can expect to encounter more and more skills being implemented that make healthcare easier to deal with. Some apps are proprietary to healthcare organizations, and others are just for the asking. Alexa has skills that help diagnose illness, manage medications, manage diabetes, manage blood pressure, get first aid advice from the Mayo Clinic, get healthy living tips from the Cleveland Clinic, and even watch over aging loved ones with Alexa Together.
In other healthcare ventures Amazon bought PillPack in 2018 and launched Amazon Pharmacy in 2020. They also partnered with JP Morgan Chase and Berkshire Hathaway to look at lowering healthcare costs and teamed with the National Institutes of Health to develop technology that connects biomedical researchers worldwide. No doubt, there’s more to come.
Post first published Jun 30, 2022 and updated Mar 9, 2023.
There have been many changes to Aira services, and we must admit, it’s all a bit hard to follow. The subscription prices have increased significantly for new subscribers in 2023, but the new pricing is not clearly published, so it is best to call Aira to clarify. There is also apparently a new app, called Aira Explore, which is currently available on Google Play but not yet on the App Store. The old app (now called Legacy) will not be updated, and users will need to install the new version at some point. Call Aira customer service: 800-835-1934.
Living with vision loss has a way of compelling us to become better problem solvers. We learn to develop compensatory strategies for getting things done on a daily basis. And we all know there are moments when the technology, the magnifiers and the light just won’t suffice; what we could really use is another pair of eyes. This is when you would consider calling upon a family member, a colleague, or a friend, but you’d prefer not to disturb anyone.
The visual interpreting service Aira (pronounced I-ra) is a possible solution to that problem, and many more. The name is an acronym for AI Remote Assistance. The service connects people with live, specially trained agents for help reading, navigating, identifying or describing. The connection is a video call made through a smartphone app, currently available in the USA, Canada, Australia and New Zealand. You’re not bothering the agents; they want you to call, it’s what they do. They offer another pair of eyes, exactly when you need them.
Home screen of Aira app.
The App
Access to Aira’s free services requires the app. Create an account and sign in as a Guest. The app will tell you what Aira Access Locations are in your vicinity. In Access Locations, which include Starbucks, Target, Bank of America, Walgreens, AT&T, 50+ airports and transit systems like all of Boston’s MBTA railroad, subway and bus stops (Aira recently announced the entire state of Connecticut), the service is paid for by the location and you enjoy guest access for free. If you already have an Aira account, you’re good to go; just make sure you have the latest version of the app.
Monthly Subscription
Upgrade to a monthly plan with a call to customer service. You may find great value in a plan that enables you to take a guided stroll through a museum. Rediscover your neighborhood restaurants and shops, or get help navigating an airport that is not yet a free access location. The possibilities are endless.
Tell Aira Where You Want Guest Access
To sponsor more free service and expand inclusivity, Aira needs to build its Guest Access Network. The most significant impact comes from businesses with many locations. If you know of an organization with interest in making Aira’s services available to their customers, send that information to: [email protected].
As navigation apps go, for people with visual impairments, Soundscape was one of the few worth discussion. It enhanced awareness of our surroundings, like walking with a friend who describes the environment. So it is disappointing to share the news that Microsoft has discontinued its development. No longer available on the App Store or on Google Play, users with the app currently downloaded to a device will have it until the end of June 2023, at which time it will stop functioning.
The announcement from Microsoft says, in part, “The Soundscape code is now available as open-source software on GitHub at https://github.com/microsoft/soundscape, so that anyone can continue to build on, and find new ways to leverage, this novel feature set for the growing navigation opportunities in today’s world. As Microsoft Research continues to expand into new accessibility innovation areas, we hope the open-source software release of the Soundscape code supports the community in further developing confidence and utility of spatial audio navigation experiences.”
We, too, hope the code helps in the development of new applications and that Microsoft Research will continue to expand into new accessibility innovation.
Microsoft Soundscape 2018: A Review Of What It Can Do
Microsoft is clearly on a mission to advance accessibility for people with vision loss, and that is a very good thing. The past six months have seen the launch of two significant new apps for iPhone users, both intended to give us the information we are unable to discern visually.
The first app, Seeing AI, has very quickly become a crowd favorite. Its ability to instantly read short text (signs, addresses, labels, and packaging) is reason enough to use it. It also reads documents, product barcodes, currency, color, handwriting and more.
The newest app in this collection is ‘Soundscape,’ a navigation tool described as a “map delivered in 3D sound.” Not to be confused with the GPS app you’ve been waiting for, it offers no turn-by-turn directions, no specific guidance from point A to point B. And, as we experience with all GPS, there is the issue of accuracy. Putting that aside, this app still has much to offer.
Soundscape provides an enhanced awareness of what’s around you. It’s kind of like walking along with a friend who is pointing out stores, restaurants, structures, and intersections. The information comes in 3D stereo sound, information about what is on your left comes to you from the left, what’s on the right comes from the right, and what’s in front comes from the center. The audio is impressive.
With progressing central vision loss, I have not been privy to the specifics of my surroundings for a while. There is something wondrous about knowing what’s around me in any given spot. You might think you would get used to the not knowing, but the curiosity never really subsides. Soundscape is a bit of a thirst quencher in this way.
Like all technology, you need to work with this and find out what it can do. At first, I must admit to being frustrated by the free-floating information about what’s “around” without any clues for getting to it. Then I started to appreciate the narrative as I learned about places near and new to me. To get this type of information before, I would ask someone I was walking with to “Please tell me everything that’s on this street.” The Soundscape app basically does that, just not exactly. You must be aware, and accepting, that there is a margin for error.
The greatest benefit I derived from this app is the telling of street names, numbers, and intersections. Before Soundscape, my best method for figuring out what street I was on was to ask Siri, “Where am I?” That approach is also not always accurate. Using ‘My Location’ lets me know what street I’m on and what intersection is coming up. I love that!
Microsoft’s Soundscape app main screen.
The Soundscape app has a cleanly designed interface with the following options:
Menu
Select, or search and save, reference points.
Manage Call Outs by selecting the information you want called out automatically, like Places and Landmarks, Intersections, Destination Distance, Bluetooth Beacons Indoors and Location Updates. Here you will also find Help and Tutorials and Settings.
Set a Beacon
Select a specific location and an audio beacon will indicate when you are facing the direction of that location; it will not set a path or take you there.
Call Outs On/Off
My Location
Gives you your current location, the direction you are facing, nearby roads and points of interest.
Around Me
Tells you about one thing in each direction: ahead, to the right, behind you, and to the left.
Ahead of Me
Helps you discover what’s coming up ahead.
Holding the phone flat in your hand with the top facing the direction you are heading will enable it to function like a compass.
Soundscape and Seeing AI apps eat up battery power, so it is well advised to always carry a backup.
It’s very safe to say this is just the beginning for Soundscape and Seeing AI. Microsoft will surely continue to develop and improve these technologies. Try this in your city. You’ll help the progress by sending feedback to [email protected].
And don’t forget, you can always call the Microsoft Disability Answer Desk for help at 800-936-5900.
Please note that this article was not paid for, affiliated with, or endorsed by any third-party companies. The views and opinions expressed in this article are solely those of the author.
This post was originally published March 16th, 2018 and updated on February 20, 2023