Maria Johnson is a mother of two, group fitness instructor, radio show host, podcaster, and blogger at girlgoneblind.com, a blog and podcast dedicated to the exploration of an enabled life with blindness.
Twitter has become a very popular social media service among the blind and visually impaired because it makes sharing short bits of text easy and accessible. Unfortunately, images shared on Twitter can be a barrier to the blind. By taking a couple of easy steps first, you can make everything you tweet fully accessible to blind and visually impaired users.
Since May 2016, Twitter has allowed you to describe any image that you include with a tweet, but you must activate this feature first. Enter the user menu by selecting your profile icon and choose “Settings and Privacy”. Next, under “General”, select “Accessibility”. Once in the “Accessibility” section, scroll down until you find a toggle labeled “Compose Image Descriptions” and turn it on.
That’s all you need to do! Once you back out and compose a Tweet,
you’ll be presented with a new box that allows you to compose an image
description for any image that you post.
This enables a blind Twitter user to use screen reader software (like VoiceOver on the iPhone or JAWS on a PC) to find out what your shared image is all about.
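For developers who post to Twitter through the API rather than the app, the same image descriptions can be attached programmatically. The sketch below is only an illustration and not part of the steps above; it assumes the third-party Tweepy library, placeholder credentials, and a hypothetical image file name.

# A minimal sketch (Tweepy 4.x, Twitter v1.1 API): attach an image
# description (alt text) to a tweeted photo so screen readers can announce it.
# The credentials and file name below are placeholders, not real values.
import tweepy

auth = tweepy.OAuth1UserHandler(
    "CONSUMER_KEY", "CONSUMER_SECRET",
    "ACCESS_TOKEN", "ACCESS_TOKEN_SECRET",
)
api = tweepy.API(auth)

# Upload the image, then add the description before tweeting it.
media = api.media_upload("family_photo.jpg")
api.create_media_metadata(
    media.media_id,
    "My two kids building a sandcastle on the beach at sunset.",
)

# Post the tweet with the described image attached.
api.update_status(
    status="A great day at the beach!",
    media_ids=[media.media_id],
)

Whether you tweet from the app or through the API, the description travels with the image, so any screen reader user who encounters the tweet hears it.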
Features like image descriptions are nice and much needed in social media. I am quite grateful that Twitter has incorporated this along with so many other accessibility options into its service.
But why hide the option so deep within the accessibility settings? Most people might not even take the time to find it, let alone turn it on.
Unfortunately, it seems as though it was just an afterthought added by the developers. An accessibility feature like this is vitally important to someone who is blind or visually impaired. A feature that is so simple and yet so important should be turned on by default. It’s a small gesture that would make a big difference to those who are not always able to be included in the family memory, political joke, or trending meme.
Happy New Year!
We begin the 20th episode of the Life After Blindness podcast with an all-new Because of My Blindness story.
Since we’ve been away, there have been many developments in blindness-related news. First, Microsoft’s Seeing AI app has added four new channels: currency, color, handwriting, and light detection.
AIRA has announced partnerships with two US airports that allow users to navigate those airports for free, without using any minutes. AIRA also announced a partnership with the ridesharing app Lyft. This puts Lyft right into the AIRA app, enabling your agent to help you contact and access your ride.
En-Vision America will be releasing a new iPhone app this year called Script Talk. Already available on Android devices, the app allows users to scan codes on prescription bottles and then reads out all the pertinent information about the prescription.
Next, Tim speaks with Mel Scott from blindalive.com about the recent release of the Eyes Free Fitness app for the iPhone. This app enables you to download and play a number of workouts offered by Blind Alive. The app also gives you direct access to their blog and podcast.
Tim then continues his conversation with Randy Rusnak. This time they are talking all about the Amazon Echo.
The podcast closes with a recording sent in by listener Rachel from the UK.