Over the decades, input devices in the video game industry have evolved from simple joysticks to sophisticated controllers that deliver haptic feedback. But with Enabled Play, a new piece of assistive tech created by self-taught developer Alex Dunn, users are embracing a different kind of input: facial expressions.
While companies like Microsoft have sought to expand accessibility through adaptive controllers and accessories, Dunn’s new device takes those efforts even further, translating users’ head movements, facial expressions, real-time speech and other nontraditional input methods into mouse clicks, keystrokes and thumbstick movements. The device has users raising eyebrows — quite literally.
“Enabled Play is a device that learns to work with you — not a device you have to learn to work with,” Dunn, who lives in Boston, said via Zoom.
Dunn, 26, created Enabled Play so that everyone — including his younger brother with a disability — can interface with technology in a natural and intuitive way. At the beginning of the pandemic, the only thing he and his New Hampshire-based brother could do together, while approximately 70 miles apart, was game.
“And that’s when I started to see firsthand some of the challenges that he had and the limitations that games had for people with really any type of disability,” he added.
At 17, Dunn dropped out of Worcester Polytechnic Institute to become a full-time software engineer. He began researching and developing Enabled Play two and a half years ago, an effort that initially proved challenging because most speech-recognition programs lagged in response time.
“I built some prototypes with voice commands, and then I started talking to people who were deaf and had a range of disabilities, and I found that voice commands didn’t cut it,” Dunn said.
That’s when he started thinking outside the box.
Having already built Suave Keys, a voice-powered program for gamers with disabilities, Dunn created Snap Keys — an extension that turns a user’s Snapchat lens into a controller when playing games like “Call of Duty,” “Fall Guys” and “Dark Souls.” In 2020, he won two awards for his work at Snap Inc.’s Snap Kit Developer Challenge, a competition among third-party app creators to innovate with Snapchat’s developer toolkit.
With Enabled Play, Dunn takes accessibility to the next level. Supporting a wider variety of inputs, the assistive device — equipped with a robust CPU and 8 GB of RAM — connects to a computer, game console or other device so users can play games in whatever way works best for them.
Dunn also spent time making sure Enabled Play was accessible to people who are deaf, as well as people who want to use nonverbal audio input, like “ooh” or “aah,” to perform an action. Enabled Play’s vowel sound detection model is based on “The Vocal Joystick,” which engineers and linguistics experts at the University of Washington developed in 2006.
“Essentially, it looks to predict the word you are going to say based on what is in the profile, rather than trying to assume it could be any word in the dictionary,” Dunn said. “This helps cut through machine learning bias by learning more about how the individual speaks and applies it to their desired commands.”
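Dunn has not published the model’s internals, but the general idea he describes — restricting recognition to the handful of phrases in a user’s profile rather than an open dictionary — can be pictured in a few lines of Python. The profile contents, the fuzzy matcher and the command names below are illustrative assumptions, not Enabled Play’s code.

```python
# Illustrative sketch only: constrain recognition to a small per-user command
# profile instead of an open-vocabulary dictionary. Not Enabled Play's actual code.
from difflib import get_close_matches

# Hypothetical per-user profile mapping spoken phrases to game actions.
PROFILE = {
    "jump": "press_space",
    "reload": "press_r",
    "open map": "press_m",
}

def resolve_command(transcript: str) -> str | None:
    """Map a rough speech-to-text transcript onto the closest profile command.

    Because the search space is only the phrases the user defined, a noisy or
    atypical pronunciation still lands on the intended command.
    """
    matches = get_close_matches(transcript.lower(), PROFILE.keys(), n=1, cutoff=0.6)
    return PROFILE[matches[0]] if matches else None

if __name__ == "__main__":
    print(resolve_command("jmup"))     # -> "press_space" despite the mis-hearing
    print(resolve_command("re load"))  # -> "press_r"
```

In a sketch like this, a garbled transcript that would confuse an open-dictionary recognizer still resolves correctly, because the only candidates are the commands the individual user chose.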
Dunn’s AI-enabled controller takes into account a person’s natural tendencies. If a gamer wants to set up a jump command every time they open their mouth, Enabled Play would identify that person’s individual resting mouth position and set that as the baseline.
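That calibration step can be pictured roughly as follows; the openness readings, the threshold margin and the jump check are hypothetical stand-ins rather than Enabled Play’s implementation.

```python
# Rough illustration of per-user baseline calibration for a "mouth open -> jump"
# mapping. All numbers and helper names are hypothetical, not Enabled Play's code.
from statistics import mean

def calibrate_baseline(resting_samples: list[float]) -> float:
    """Average a short burst of 'mouth openness' readings taken while the user rests."""
    return mean(resting_samples)

def make_trigger(baseline: float, margin: float = 0.15):
    """Return a checker that fires only when openness clearly exceeds the user's baseline."""
    def should_jump(openness: float) -> bool:
        return openness > baseline + margin
    return should_jump

if __name__ == "__main__":
    # e.g. a user whose mouth naturally rests slightly open
    baseline = calibrate_baseline([0.22, 0.25, 0.24, 0.23])
    should_jump = make_trigger(baseline)
    for reading in [0.24, 0.26, 0.55, 0.30]:
        print(reading, "-> jump" if should_jump(reading) else "-> ignore")
```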
In January, Enabled Play officially launched in six countries — its user base extending from the U.S. to the U.K., Ghana and Austria. One of Dunn’s primary goals was to fill a gap in accessibility and pricing left by other assistive gaming devices.
“There are things like the Xbox Adaptive Controller. There are things like the HORI Flex [for Nintendo Switch]. There are things like Tobii, which does eye-tracking and stuff like that. But it still seemed like it wasn’t enough,” he said.
Compared to some devices that are only compatible with one gaming system or computer at a time, Dunn’s AI-enabled controller — priced at $249.99 — supports a combination of inputs and outputs. Speech therapists say that compared to augmentative and alternative communication (AAC) devices, which are medically essential for some with disabilities, Dunn’s device offers simplicity.
“This is just the start,” said Julia Franklin, a speech language pathologist at Community School of Davidson in Davidson, N.C. Franklin introduced students to Enabled Play this summer and feels it’s a better alternative to other AAC devices on the market that are often “expensive, bulky and limited” in usability. High-tech AAC systems can range from $6,000 to $11,500, with even low-end eye-trackers running in the thousands of dollars. A person may also download AAC apps on their mobile devices, which range from $49.99 to $299.99 for the app alone.
“For many people who have physical and cognitive differences, they often exhaust themselves to learn a complex AAC system that has limits,” she said. “The Enabled Play device allows individuals to leverage their strengths and movements that are already present.”
Internet users have applauded Dunn for his work, noting that asking for accessibility should not equate to asking for an “easy mode” — a misconception often cited by critics of making games more accessible.
“This is how you make gaming accessible,” one Reddit user wrote about Enabled Play. “Not by dumbing it down, but by creating mechanical solutions that allow users to have the same experience and accomplish the same feats as [people without disabilities].” Another user who said they regularly worked with young patients with cerebral palsy speculated that Enabled Play “would quite literally change their lives.”
But the device isn’t limited to the gaming sphere. It’s also being used in schools to make computer labs more accessible. With the rise in remote work and online learning environments brought on by the pandemic, Jaipreet Virdi, a historian, author and professor at the University of Delaware, said the device may serve as a model for “inclusive participation” in schools.
“If disabled students can learn and keep up with the expected educational rate through these [assistive] technologies, then they can thus graduate with more opportunities than their disabled ancestors ever had,” Virdi said.
In some therapy programs in the U.S., specialists use Enabled Play to track facial expressions and gamify treatment sessions. Alissa McFall, a speech language pathologist and orofacial myologist in Sacramento, said it can be used to analyze how a patient’s muscles work so that health professionals can then use that feedback to develop customized treatment plans.
“The biggest value we’ve seen so far using the Enabled Play device is that it can be programmed to read natural communication movements and connect each sound or facial expression to a function that is meaningful to an individual,” McFall said.
Since its launch in January, Enabled Play has partnered with a number of organizations in the gaming and assistive tech sphere, including SpecialEffect, Makers Making Change and — more recently — Microsoft with its Designed for Xbox accessibility partners program. Next, Dunn hopes to roll out “virtual devices,” which would allow other developers to add Enabled Play’s inputs to their apps. With these additions, a person could use facial expressions and voice commands in Microsoft Word and Adobe Photoshop without buying a separate device.
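Enabled Play has not released that developer interface yet, so the sketch below is purely speculative: a minimal observer pattern in which an app registers handlers for named expression or voice events. Every class, event name and action here is invented for illustration.

```python
# Purely hypothetical sketch of how an app might consume "virtual device" events.
# Enabled Play has not published such an API; every name below is invented.
from dataclasses import dataclass
from typing import Callable

@dataclass
class InputEvent:
    source: str   # e.g. "expression" or "voice"
    name: str     # e.g. "raise_eyebrows" or "bold"

class VirtualDevice:
    """Minimal observer: an app registers handlers for named expression/voice events."""
    def __init__(self) -> None:
        self._handlers: dict[str, Callable[[InputEvent], None]] = {}

    def on(self, name: str, handler: Callable[[InputEvent], None]) -> None:
        self._handlers[name] = handler

    def dispatch(self, event: InputEvent) -> None:
        if event.name in self._handlers:
            self._handlers[event.name](event)

if __name__ == "__main__":
    device = VirtualDevice()
    device.on("raise_eyebrows", lambda e: print("scroll up in the document"))
    device.on("bold", lambda e: print("toggle bold formatting"))
    device.dispatch(InputEvent("expression", "raise_eyebrows"))
    device.dispatch(InputEvent("voice", "bold"))
```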
As developers look for ways to make tech more accessible, Dunn hopes to help drive that change, encouraging others to think far beyond the typical keyboard and mouse inputs.
“It’s a very personal mission of mine to solve these problems,” he said. “That’s the difference that I’m after, which is to build devices that change the human-computer interaction paradigm to one that’s just more inclusive.”
Amanda Florian is a journalist based between the U.S. and Shanghai. She reports on tech, culture and China’s new media scene.
Source: www.washingtonpost.com