The Monarch could be the next big thing in Braille

For many people around the world, braille is their primary language for reading books and articles, and digital braille readers are an important part of that. The newest and fanciest yet is the Monarch, a multipurpose device that uses the startup Dot’s tactile display technology.

The Monarch is a collaboration between HumanWare and the American Printing House for the Blind. APH is an advocacy, education, and development organization focused on the needs of visually impaired people, and this isn’t its first braille device — but it is by far the most capable.

The device was called the Dynamic Tactile Device until it received its regal moniker at the CSUN Assistive Technology Conference, happening this week in Anaheim. I’ve been awaiting this device for a few months, having learned about it from APH’s Greg Stilson when I interviewed him for Sight Tech Global.

The device began development as a way to adapt the new braille pin (i.e. the raised dots that make up its letters) mechanism created by Dot, a startup I covered last year. Refreshable braille displays have existed for many years, but they’ve been plagued by high costs, low durability, and slow refresh rates. Dot’s new mechanism allowed for closely placed, individually replaceable, easily and quickly raisable pins at a reasonable cost.

APH partnered with HumanWare to adopt this new tech into a large-scale braille reader and writer code-named the Dynamic Tactile Device, and now known as Monarch.

These days one of the biggest holdups in the braille reading community is the length and complexity of the publishing process. A new book, particularly a long textbook, may need weeks or months after being published for sighted readers before it is available in braille — if it is made available at all. And of course once it is printed, it is many times the size of the original, because braille has a lower information density than ordinary type.

A woman holds a Monarch braille reader next to a stack of binders making up an “Algebra 1” textbook.

“To accomplish the digital delivery of textbook files, we have partnered with over 30 international organizations, and the DAISY Consortium, to create a new electronic braille standard, called the eBRF,” explained an APH representative in an email. “This will provide additional functionality to Monarch users including the ability to jump page to page (with page numbers matching the print book page numbers), and the ability [to embed] tactile graphics directly into the book file, allowing the text and graphics to display seamlessly on the page.”

The graphic capability is a serious leap forward. A lot of previous braille readers were only one or two lines, so the Monarch having 10 lines of 32 cells each allows for reading on the device more like a person would read a printed (or rather embossed) braille page. And because the grid of pins is continuous, it can also — as Dot’s reference device showed — display simple graphics.

Of course the fidelity is limited, but it’s huge to be able to pull up a visual on demand of a graph, animal, or especially in early learning, a letter or number shape.
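Because the Monarch’s display is one continuous field of pins rather than isolated cells, the same hardware can render either braille text or a simple raster graphic. Here is a minimal Python sketch of that idea — the 10-line by 32-cell dimensions follow the article, but the letter table and helper names are hypothetical illustrations, not anything from HumanWare’s or APH’s actual software:

```python
# Hypothetical sketch: a continuous pin grid like the Monarch's 10 lines x 32 cells.
# Each 8-dot braille cell is 2 pins wide x 4 pins tall, so the full field is
# 64 x 40 individual pins, modeled here as a 2D grid of booleans (True = raised).

ROWS_OF_CELLS, COLS_OF_CELLS = 10, 32
CELL_W, CELL_H = 2, 4
GRID_W, GRID_H = COLS_OF_CELLS * CELL_W, ROWS_OF_CELLS * CELL_H

# Tiny illustrative letter-to-dots table (standard braille dot numbers 1-8).
LETTER_DOTS = {"a": {1}, "b": {1, 2}, "c": {1, 4}}

# Dot number -> (col, row) offset within a cell: dots 1-3 down the left
# column, 4-6 down the right column, 7-8 along the bottom row.
DOT_POS = {1: (0, 0), 2: (0, 1), 3: (0, 2), 4: (1, 0),
           5: (1, 1), 6: (1, 2), 7: (0, 3), 8: (1, 3)}

def blank_grid():
    return [[False] * GRID_W for _ in range(GRID_H)]

def write_text(grid, line, col, text):
    """Raise the pins for each character's braille cell, starting at (line, col)."""
    for i, ch in enumerate(text):
        x0, y0 = (col + i) * CELL_W, line * CELL_H
        for dot in LETTER_DOTS.get(ch, set()):
            dx, dy = DOT_POS[dot]
            grid[y0 + dy][x0 + dx] = True

def draw_rect(grid, x, y, w, h):
    """Raise pins along a rectangle outline -- possible only because the
    pin field is continuous rather than split into fixed text cells."""
    for i in range(w):
        grid[y][x + i] = grid[y + h - 1][x + i] = True
    for j in range(h):
        grid[y + j][x] = grid[y + j][x + w - 1] = True

grid = blank_grid()
write_text(grid, 0, 0, "abc")   # a line of braille text at the top
draw_rect(grid, 10, 12, 20, 8)  # a simple tactile graphic below it
```

The key design point is that text and graphics share one coordinate space, which is what lets a textbook page mix a paragraph of braille with, say, a tactile graph directly beneath it.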

Now, you may look at the Monarch and think, “wow, that thing is big!” And it is pretty big — but tools for people with vision impairments must be used and navigated without the benefit of sight, and in this case also by people of many ages, capabilities, and needs. If you think of it more like a rugged laptop than an e-reader, the size makes a lot more sense.

There are a few other devices out there with continuous pin grids (a reader pointed out the Graphiti), but it’s as much about the formats and software as it is about the hardware, so let’s hope everyone gets brought in on this big step forward in accessibility.

The Monarch could be the next big thing in Braille by Devin Coldewey originally published on TechCrunch

Wheel the World grabs $6M to offer guaranteed accessibility, price match for hotel rooms

Wheel the World wants to open the world up to people with disabilities and has made it its mission since 2018 to provide travel accommodations and experiences that fit their needs, both in the United States and worldwide.

Today, the company announced two new booking tool features, including guaranteed accessibility for hotel rooms booked from its website and a price match offer.

These new features follow the closing of $6 million in what it is calling “pre-Series A” funding. Kayak Ventures led the investment and was joined by Detroit Venture Partners, REI Co-op Path Ahead Ventures, former Booking.com CEO Gillian Tans, Dadneo, CLIN Fund, Amarena and WeBoost.

We caught up with founders Alvaro Silberstein and Camilo Navarro again after covering their $2 million seed round in 2021. At the time, Wheel the World had offered travel packages to 50 destinations with experiences specially created for people with disabilities, seniors and their families — think activities like sky-diving, kayaking and snorkeling, but crafted to enable people to do them without limits.

Wheel the World now offers more than 200 destinations, including some new group tours that create itineraries for groups of eight to 10 with guaranteed accessibility, allowing travelers with disabilities to travel together and explore places like Costa Rica, Morocco and Alaska. The company also now price matches hotel rooms from other booking platforms, including Booking.com, Expedia and Travelocity.

“We have identified those details of accessibility and the rooms that have those details,” Silberstein said. “Through our operations, we can guarantee accessibility that other platforms cannot. If you find something that we didn’t promise, we will give you your money back. As powerful as it sounds, we allowed that a month ago, and it’s working really well.”

The concept has caught on: In the past year, Wheel the World experienced a four-fold increase in the number of travelers booking through its platform over 2021. In addition, 40% of customers booked trips through the company more than twice.

The new capital was raised via a SAFE note that will convert as part of the company’s future Series A round, Navarro told TechCrunch. It enables the company to pursue new partnerships with destination management organizations to identify new accessible travel offerings, with a goal of reaching 12,000 travelers booking through its platform by December 2024, CEO Silberstein said in an interview.

Seventy-nine percent of the company’s users come from the U.S., and the founders said over the next two years, they will focus on this audience with additional product development around customer experience in terms of booking multiple trips, places to stay and group tours. In addition, Wheel the World will continue to grow its community of travelers and offer more opportunities for them to get to know each other and travel together.

“We want to transform our product into a community-based product and help them learn how they can contribute,” Silberstein said. “We also want them to have more interaction through a referral program so users can make recommendations to other users and we can start receiving accessibility validations. Those are things that we haven’t done yet.”

Wheel the World grabs $6M to offer guaranteed accessibility, price match for hotel rooms by Christine Hall originally published on TechCrunch

Senator Markey calls on Elon Musk to reinstate Twitter’s accessibility team

After several rounds of layoffs, Twitter’s staff is down from about 7,500 employees to fewer than 2,000 — and one of the numerous cuts across the company eliminated the platform’s entire accessibility team last year.

In an open letter to Elon Musk, Senator Ed Markey (D-MA) called on the new Twitter owner to bring the accessibility team back.

“Not surprisingly, since you shut down Twitter’s Accessibility Team, disabled users have reported increased difficulty and frustration using Twitter,” Markey wrote.

Like any social platform, Twitter has had its foibles when it comes to accessibility — in 2020, Twitter didn’t even have an accessibility team and only established one after public outcry when the company rolled out voice tweets without captions. But in the few years Twitter did have an accessibility team, the company rolled out features for alt text on images, automatic captioning on videos, and captions for Spaces live audio rooms and voice tweets. Many disabled users found community on Twitter, because its built-in accessibility features made it easier to use than other social platforms.

With no accessibility team at Twitter anymore, it’s not as though the platform’s features have simply remained dormant. Captions on Twitter Spaces have disappeared altogether, making the feature unusable for any user who is Deaf or hard of hearing. To disabled users, this sends the message that accessibility is no longer part of the conversation at Twitter HQ (which, by the way, the company has stopped paying rent for).

Twitter’s accessibility was dealt another blow when the platform cut off access to its API. Now, third-party developers will have to pay yet-to-be-determined monthly fees to build on a platform that previously welcomed them for free. This means that long-beloved apps like The Iconfactory’s Twitterrific, which gave users expanded accessibility features, are no longer available.

“I received more than a few emails from Blind users who were upset and outraged because they would most likely have to stop using Twitter without accessible third-party clients like Twitterrific,” Iconfactory co-founder Gedeon Maheux told Forbes.

Markey’s letter poses a number of questions to Musk, requesting responses by March 17. Markey asks why Musk eliminated the accessibility team and whether he will reinstate it, about Twitter’s compliance with ADA and FCC accessibility regulations, why Twitter removed captioning from Spaces, and whether the platform will commit to creating user-friendly experiences for all kinds of content.

“All of these changes under your leadership signal a disregard for the needs of disabled people,” Markey wrote to Musk.

Senator Markey calls on Elon Musk to reinstate Twitter’s accessibility team by Amanda Silberling originally published on TechCrunch

MrBeast’s blindness video puts systemic ableism on display

Recently, megastar creator MrBeast posted a video to his YouTube in which he spotlights numerous blind and visually impaired people who have undergone a surgical procedure that “cures” their blindness. As of this writing, the video has been viewed more than 76 million times, and the responses have been visceral in both praise and contempt. For his part, MrBeast has taken to Twitter to publicly bemoan the fact that so many are so angry at him for putting on what amounts to a publicity stunt under the guise of selfless charity.

The truth is straightforward: The video was more ableist than altruistic.

Before delving into the many layers of why the video is troublesome, it’s important to issue a caveat. However problematic MrBeast’s premise in producing the video, the people who participated — the patients and their doctors — should not be vilified. They made the decision to go through with the surgery of their own volition. The reasoning behind making that choice goes far beyond the scope of this article.

In the broadest lens, the biggest problem with wanting to “cure” blindness is that it reinforces a moral superiority of sorts by those without disabilities over those who are disabled. Although not confronted nearly as often as racism and sexism, systemic ableism is pervasive through all parts of society. The fact of the matter is that the majority of abled people view disability as a failure of the human condition; as such, people with disabilities should be mourned and pitied. More pointedly, as MrBeast stated in his video’s thumbnail, disabilities should be eradicated — cured.

On one level, disability being viewed as a failure of the human condition is technically correct. That’s why disabilities are what they are: The body doesn’t work as designed in some way(s). If disabilities were computer software, engineers would be tasked with finding and fixing the bugs.

Yet the human body isn’t some soulless, inanimate machine that requires perfection in order to work properly or have value. I’ve been subject to a barrage of harassment on Twitter since tweeting my thoughts on MrBeast’s video. In between calls for me to imbibe a bottle of bleach, most of them have been hurling retorts at me that question why I wouldn’t want to “fix” or “cure” what prevents people from living what ostensibly is a richer, fuller life because blindness would be gone. A blind person, they said, could suddenly see the stars, a rainbow, a child’s smile or whatever other romantic notion one could conjure.

Elizabeth Barrett Browning would be proud of the way I count the ways in which this myopic perspective lacks perspective.

For one thing, the doctors shown in the video aren’t miracle workers. There is no all-encompassing cure for blindness. If the people who participated in this surgery have had their lives changed for the better by regaining their sight, more power to them.

That said, we know nothing of their visual acuities before the operation, nor do we know what the long-term prognosis is for their vision. That MrBeast proclaims to “cure” blindness is essentially baseless.

At a fundamental level, MrBeast’s video is inspiration porn, meant to portray abled people as the selfless heroes waging war against the diabolical villain known as disability. And it’s ultimately not meant for the disabled person. It’s for abled people to feel good about themselves and about disabled people striving to become more like them — more normal. For the disability community, inspiration porn often is met with such derision because the message isn’t about us as human beings; it’s about a group that’s “less than” the masses. This is where structural ableism again rears its ugly head.

Think about it: If you fell and broke your hand or your wrist, that would indeed be bad. You’d be disabled for some period of time. But the expectation during your recovery would be that you’re still human, still yourself, still reasonably able to do everything you could do before. You may find certain things inaccessible for a while and need some forms of assistive technology, but you would expect to be treated with dignity, and you wouldn’t expect someone to miraculously reset your broken bone. Yet this is what MrBeast (and his millions of minions) are peddling with this video. They don’t recognize the humanity of blind people; they only recognize the abhorrence of not being able to see.

In other words, abled people have a tendency to think disability defines us.

In many meaningful ways, yes, our disabilities do define us to a large degree. After all, no one can escape their own bodies. But what about our traits as individuals? Our families, our work, our relationships and much more? Surely people are aware of things like the Paralympics and wheelchair basketball leagues, for instance. The point is, disabled people are no different in our personal makeup than anyone else. We shouldn’t be pitied and we certainly don’t require uplifting in ways like MrBeast suggests.

I have multiple disabilities due to premature birth, but most people know me as a partner, a brother, a cousin and a friend who loves sports, likes to cook and listen to rap music, and a distinguished journalist. Everyone in my orbit is well aware of my disabilities, but they do not judge me solely based on them. They know the real me — they know my disabilities aren’t the totality of my being.

My lived experience is unique because I have so much to draw from: I have visual disabilities, physical motor disabilities and speech disabilities, and my parents were both fully deaf. Growing up as the older of two children, I served as the unofficial in-house interpreter for my parents. As a CODA, I straddled the line between the deaf and hearing worlds. I know firsthand how deaf people look at their culture and their ways of life with immense pride. Deaf culture is real. If someone “cured” deafness, what would happen to those people? The culture would fade away because there’d be no reason for sign language to exist, or for the experiences derived from it.

I had a mentor my senior year of high school who asked me the day we met in my counselor’s office if I would go back and change things in my life so that I wouldn’t have disabilities. I told him rather unequivocally that I wouldn’t. He was taken aback by my answer, but I explained my rationale was simple: It would change who I am.

Almost a quarter-century later, my feelings are unchanged. Granted, I have my moments. I curse the fact I can’t get in a car and go anywhere I want, anytime I want. Likewise, I often lament the fact that my limited range of motion caused by cerebral palsy prevents me from literally moving as freely as I need or want to sometimes.

All told, however, my disabilities have enabled me to thrive in many respects. The relationships I’ve made, the knowledge I’ve acquired, the journalism career I’ve had for close to a decade — all of this would not have been possible in an alternate universe where I wasn’t a lifelong disabled person. To me, that’s the ultimate silver lining.

I don’t presume to be an oracle when it comes to accessibility and assistive technologies. I know a lot, but I don’t know it all. Similarly, I don’t presume to speak for all blind people or the disability community at large. Blindness in particular is a spectrum, and I proclaim to know only where my eyesight sits on that line. I also know this: A cure is not the answer to “helping” blind people, let alone anyone else with a disability.

Disabled people don’t need pity. We don’t need to be uplifted. We don’t need cures from ourselves. What we desperately do need is some recognition of our basic humanity. We need abled people to start seeing us as the people we are instead of the sorrowful, burdened outcasts society likes to portray us as.

MrBeast (and his defenders) easily fall into the trap of perpetuating that deeply entrenched ableist mindset; as I wrote earlier, ableism is just as pervasive as racism and sexism. Simply put, we need allies — people who see us as real people.

Finding a cure for cancer or a cure for AIDS is one thing. Disabilities need no cure. What truly needs curing is society’s proclivity to view the disability community as little more than real-life characters from a Tod Browning film. Disabled people are not freaks. Disability isn’t a bad word. You can learn a lot from us.

MrBeast’s blindness video puts systemic ableism on display by David Riggs originally published on TechCrunch

Sony aims to make PlayStation more accessible with Project Leonardo controller

Sony has been embracing accessibility options in its games for a few years now, but one place it has lagged behind perennial rival Microsoft is in accessible hardware. It aims to change that with Project Leonardo, a new gaming controller designed to be customizable to the needs of any player.

The device was described only generally on stage, but it appears to be a hub with swappable parts and plates that let users connect various other items, such as breath tubes, pedals, and switches of all kinds to activate different buttons.

Each UFO-shaped Project Leonardo device can handle an analog joystick plus eight buttons, and they can be paired with each other or with a traditional controller to complement or offer alternatives to any function. Sony worked with accessibility experts to make sure it was useful to a wide range of people.

It’s similar to how Microsoft’s Xbox Adaptive Controller works — some stuff is built in, some you provide yourself. Everyone’s accessibility needs are a little different, and so it’s important to support the solutions people already have. Besides, that stuff is expensive!

We’re waiting to find out more about this project, when it will be available for people to use, the technical details, and the design process behind it. We’ll update this post with more info when it’s available.

Sony aims to make PlayStation more accessible with Project Leonardo controller by Devin Coldewey originally published on TechCrunch

Amazon’s Echo Show adds more accessibility features, including ‘Gestures’ and text-to-speech

Amazon today is introducing a small handful of new features for its digital assistant Alexa that aim to make the device more accessible. The company is launching two new ways to interact with Alexa without speaking, including support for Gestures on Echo Show devices that will allow users to interact with the device by raising their hand — something that can also come in handy for anyone using an Echo while cooking who wants to quickly dismiss a timer without having to speak. In addition, Amazon is rolling out text-to-speech options and a way to turn on all closed captioning features at once across devices.

The new features are the latest to arrive in a push to make Alexa a more accessible tool, and follow the fall launch of a “Tap to Alexa” option for Fire tablets that allows users to interact with the voice assistant without speaking.

With Gestures, Amazon says users will be able to hold up their hand — palm facing the camera — to dismiss timers on the Echo Show 8 (2nd Gen) or Echo Show 10 (3rd Gen) devices. Beyond enabling nonverbal customers to use the device, Amazon also envisions a common scenario where users in the kitchen are cooking while listening to music and don’t want to have to scream over their tunes to be heard by Alexa or touch the screen with messy hands. The gesture could give them an easier way to interact with Alexa, in that case.

Gestures are not enabled by default — you’ll have to visit Settings, then Device Options to access the option. (Presumably, by calling it “Gestures” and not “Gesture,” Amazon has other plans in store for this feature down the road.)

To work, Gestures uses on-device processing to detect the presence of a raised hand during an active timer, Amazon said. Users will not have to enroll in other visual identification features like Visual ID, the Echo Show’s facial recognition system, to use it.

The company is also adding text-to-speech functionality to the new Tap to Alexa feature, which today provides customers with a dashboard of Alexa commands on the Echo’s screen that they can tap to launch. With text-to-speech, customers will now be able to type out phrases on an on-screen keyboard to have them spoken aloud by their Echo Show. These commands can also be saved as shortcut tiles and customized with their own icon and colors.

The feature aims to help customers with speech disabilities or who are nonverbal or nonspeaking, who can use text-to-speech to communicate with others in their home, for example by typing out “I’m hungry.”

Image Credits: Amazon

The third new addition is called Consolidated Captions, and allows customers to turn on Call Captioning, Closed Captioning, and Alexa Captions at once across all their supported Echo Show devices. This enables customers to turn on captions for things like Alexa calls and captions for Alexa’s responses, which helps those who are deaf, hard of hearing, or who are using Alexa in loud or noisy environments, Amazon says.

This feature is enabled by tapping Settings, then Accessibility, and selecting “Captions.”

Image Credits: Amazon

The new features come at a time when Amazon is trying to determine how to proceed with Alexa, whose division at the company saw significant layoffs and, per an Insider report, is said to be on pace to lose Amazon around $10 billion this year as opportunities to monetize the platform, like voice apps known as Skills, have failed to gain traction with consumers. Alexa owners also tend to only use the device for basic tasks, like playing music, operating smart home devices, using timers and alarms, and getting weather information, among other things.

More recently, Amazon has been positioning its Echo Show devices as more of a family hub or alternative to the kitchen TV. Its wall-mounted Echo Show 15, for example, offers widgets for things like to-do lists and shopping lists and just rolled out Fire TV streaming.

Amazon says the new Echo Show features are rolling out now.

Amazon’s Echo Show adds more accessibility features, including ‘Gestures’ and text-to-speech by Sarah Perez originally published on TechCrunch

Explore accessibility via Amazon Alexa at Sight Tech Global 2022

A little-appreciated fact about Amazon’s Alexa is where the voice service’s public debut took place before it became widely available to customers. The venue was CSUN, the longstanding conference dedicated to assistive technology for people with disabilities, notably vision loss. No one had more to gain from a machine that could provide quick replies to spoken questions, and no group could provide more incisive feedback. 

At the upcoming Sight Tech Global conference on December 7 & 8 (virtual and free — register here), two of Amazon’s foremost accessibility leaders, Peter Korn, Director of Accessibility, Devices & Services, and Dr. Joshua Miele, Principal Accessibility Researcher, will discuss how Amazon continues to dig deeper into the accessibility and fairness surrounding the remarkable Alexa voice service, which is used by millions of customers around the world, billions of times each week. 

As Korn and Miele will point out, the advantages Alexa confers on blind people, for example, do not necessarily extend in the same way to people who have speech disabilities; at the same time, Alexa’s capabilities long ago escaped the bounds of speech-based interaction. Today, 30% of Alexa interactions in the home are not prompted by users’ voice commands but by Alexa’s fascinating side hustles like Hunches and Routines.

And in a nod to the reality that not everyone speaks in a way that Alexa can understand today, Amazon recently joined a consortium of technology companies, including Apple, Google, Meta and Microsoft, to launch the Speech Accessibility Project with the University of Illinois Urbana-Champaign (UIUC), which is using AI and new voice datasets to make speech recognition systems, like Alexa, and other voice services better able to understand diverse speech patterns. 

For people who work in assistive technologies, it was no surprise that Alexa’s first public debut was at CSUN. Many remarkable technologies have started with the blind. It was technology legend Ray Kurzweil who in 1976 took a huge step forward in optical character recognition (OCR), the ancestor of today’s computer vision, when he unveiled the $50,000 Kurzweil Reading Machine at a press conference hosted by the National Federation of the Blind. OCR spawned countless businesses outside of accessibility, as well as many powerful and virtually free tools used by the blind today, including many we discuss at Sight Tech Global. 

The Alexa that Amazon demoed nearly eight years ago in front of the CSUN audience has done just the same, growing into a service with a vast number of features, like Show and Tell and Notify When Nearby, that help an increasing number of people, in part because of the Amazon team’s focus on an inclusive approach that aims to leave no one behind while also making Alexa better and more helpful for everyone. 

Join Sight Tech Global for this session and many more, which you can see on the complete agenda. Now in its third year, Sight Tech Global brings together the world’s top technologists in AI and other advanced technologies to address assistive technology for people who are blind. Register today.

We’re grateful to sponsors iSenpai, Google, Amazon, LinkedIn, HumanWare, Microsoft, Ford, Fable, APH and Waymo. If you would like to sponsor the event, please contact us. All sponsorship revenues go to the nonprofit Vista Center for the Blind and Visually Impaired, which has been serving the Silicon Valley community for 75 years.

Explore accessibility via Amazon Alexa at Sight Tech Global 2022 by David Riggs originally published on TechCrunch

Google’s Reading Mode app helps visually impaired people read long-form content

Along with its Android update for December, Google has launched a new app called Reading Mode today. It helps people with visual impairments and dyslexia read the content on the screen — especially articles.

The newly released app works on any device running Android 9.0 or above. Once you install it on your phone, you will have to turn on the toggle for the app under the Accessibility settings. This allows the app to have a floating button on the screen all the time, so you can turn any app or webpage into a more accessible version.

You will have to enable the shortcut for the app from Settings to use it. Image Credits: Google

Reading Mode turns the content on the current screen into a simpler format. It features controls for adjusting contrast, font type, line spacing, and size. What’s more, you can have it read out the content on the screen and control the playback speed, and you can quickly change the reading voice. Users can also turn on a toggle to have the app highlight the current text being read by the voiceover feature.

Image Credits: Google

Google already offers a number of accessibility tools, including the TalkBack screen reader and a built-in braille keyboard. As the name suggests, Reading Mode has been specifically designed for reading or listening to long text, such as online articles. It isn’t meant to read everything that is on the screen, like buttons and their purposes.

Google’s Reading Mode app helps visually impaired people read long-form content by Ivan Mehta originally published on TechCrunch

Android’s December update includes accessible reading mode and sharable car keys

Google is rolling out its Android feature updates for December across phones, watches, and Google TV. These updates include an accessible reader mode, a new YouTube Search widget, shareable digital car keys, new action tiles for Wear OS, along with some holiday special features.

Here’s a roundup of everything Google is rolling out.

Phone

  • The Android December update brings an accessible reader mode that helps folks with dyslexia or visual impairments consume content better. The mode lets users control contrast, font type, and size for better visibility. Plus, it has a text-to-speech function with speed control so they can listen to articles online. You have to install the Reading Mode app on your phone and follow the instructions to turn on the shortcut.

Image Credits: Google

  • Google is also introducing a new YouTube home screen widget that has easy access to the search bar, Home, Shorts, Subscriptions, and Library.

  • Beginning next week, users will be able to cast a title to a compatible TV directly from the Google TV app. This allows you to play something while looking for other stuff to watch, switch to another app to check an update, and use the phone as a remote as well. The company first talked about this feature back in May at its Google IO developer conference.
  • Google Photos is adding new designs for collages, made by Australian husband-and-wife visual duo DABSMYLA and renowned watercolor artist Yao Cheng Design, for the holiday season. You can select photos in the Google Photos app, add them to a collage, and browse through these new styles to create a final frame.
  • Gboard’s Emoji Kitchen is also adding support for the blue heart 💙, snowman ⛄, and snowflake ❄ for emoji mashups during the holiday season.

Car

  • Google first started rolling out support for digital car keys to unlock your car last year. With this new update, it will now allow you to share access to your car with friends and family on Pixel and iPhones. The company said you can use the digital wallet app to view and change access to the digital car key. It added that this feature will soon be available on some other devices running Android 12 and up.

Watch

  • The Wear OS update includes new Tiles — widget-like screens for accessing information quickly — including favorite contacts and sunrise and sunset times. Wear OS already has Tiles for Google Maps, which lets you get directions to saved places like work and home, and for Google Keep, which lets you start a new note or list. The company also offers an API for third-party developers to build their own Tiles for Wear OS.

Google Tiles

  • Google has updated the Keep app on Wear OS so that notes and lists are better formatted for reading on a watch, with support for custom backgrounds, photos, and drawings. Plus, you can view the labels and collaborators on a list.
  • The search giant is adding support for the Adidas running app to Google Assistant. The update, rolling out over the next week, will allow you to use the Assistant to start more than 30 exercises through a voice command on your watch. You can say “Hey Google, start a run with Adidas Running” to start tracking your run with the app.

To enable the new features in individual apps like Google TV and Gboard, you will need to update those apps to their latest versions.

Android’s December update features includes accessible reading mode and sharable car keys by Ivan Mehta originally published on TechCrunch

Tatum is building a robot arm to help people with deafblindness communicate

Precise numbers on deafblindness are difficult to calculate. For that reason, figures tend to be all over the place. For the sake of writing an intro to this story, we’re going to cite this study from the World Federation of the DeafBlind that puts the number of severe cases at 0.2% of the global population and 0.8% of the U.S. population.

Whatever the actual figure, it’s safe to say that people living with a combination of hearing and sight loss are a profoundly underserved community. That community is the focus of the work being done by the small robotics firm Tatum (Tactile ASL Translational User Mechanism). I met with the team at MassRobotics during a trip to Boston last week.

The company’s 3D-printed robotic hand sat in the middle of the conference room table as we spoke about Tatum’s origins. The whole thing started life in summer 2020 as part of founder Samantha Johnson’s master’s thesis for Northeastern University. The 3D-printed prototype can spell out words with American Sign Language, offering people with deafblindness a window to the outside world.

From the user’s end, it operates similarly to tactile fingerspelling. The user places a hand over the back of the robot, feeling its movements to read along as it spells. When no one is around who can sign, people with deafblindness can experience a tremendous sense of isolation, as they’re able neither to watch nor to listen to the news and are otherwise cut off from remote communication. In this age of teleconferencing, it’s easy to lose track of precisely how difficult that loss of connection can be.

Image Credits: Tatum Robotics

“Over the past two years, we began developing initial prototypes and conducted preliminary validations with DB users,” the company notes on its site. “During this time, the COVID pandemic forced social distancing, causing increased isolation and lack of access to important news updates due to intensified shortage of crucial interpreting services. Due to the overwhelming encouragement from DB individuals, advocates, and paraprofessionals, in 2021, Tatum Robotics was founded to develop an assistive technology to aid the DB community.”

Tatum continues to iterate on its project through testing with the deafblind community. The goal is to build something akin to an Alexa for people with the condition, using the hand to read a book or get plugged into the news in a way that might otherwise have been completely inaccessible.

In addition to working with organizations like the Perkins School for the Blind, Tatum is simultaneously working on a pair of hardware projects. Per the company:

The team is currently working on two projects. The first is a low-cost robotic anthropomorphic hand that will fingerspell tactile sign language. We hope to validate this device in real-time settings with DB individuals soon to confirm the design changes and evaluate ease of use. Simultaneously, progress is ongoing to develop a safe, compliant robotic arm so that the system can sign more complex words and phrases. The systems will work together to create a humanoid device that can sign tactile sign languages.

Image Credits: Tatum Robotics

Linguistics: In an effort to sign accurately and repeatably, the team is looking to logically parse through tactile American Sign Language (ASL), Pidgin Signed English (PSE) and Signed Exact English (SEE). Although research has been conducted in this field, we aim to be the first to develop an algorithm to understand the complexities and fluidity of t-ASL without the need for user confirmation of translations or pre-programmed responses.

Support has been growing among organizations for the deafblind. It’s a community that has long been underserved by these sorts of hardware projects. There are currently an estimated 150 million people with the condition globally. It’s not exactly the sort of total addressable market that gets return-focused investors excited — but for those living with the condition, this manner of technology could be life changing.

Tatum is building a robot arm to help people with deafblindness communicate by Brian Heater originally published on TechCrunch