
Android closed captioning

Live Caption, it’s called, doesn’t use the cloud. No data leaves your phone, and it can even work in Airplane Mode. Much like we’ve seen with Google’s music identification service (which identifies 70,000 songs) and Night Sight photography (which can basically see in the dark), the technology uses shrunk-down machine-learning algorithms that run right on your device.

Even though most every service allows creators to manually caption their videos, it can be laborious to do so. As a result, many videos aren’t captioned at all. Similarly, podcasts are rarely transcribed, and personal videos that friends share via text never feature closed captioning. With Live Caption, a world of otherwise inaccessible content will be made available to the deaf and hard-of-hearing community.

The project was born out of Google’s Creative Lab, which invited KR Liu, an advocate for the deaf and hard of hearing, to the office. “We brought her in, said let’s talk about the community, and workshopped things,” says Robert Wong, VP of Google Creative Lab. The lab has since dubbed this wider initiative Start with One. “You start with one person, don’t even try to solve their problem, but get with them, design with them,” Wong explains. “It’s more like, ‘You have a different take on the world, a different experience. What’s tough in your life? How do we solve that?’ It’s designing with, not designing for.” What Wong describes is almost a textbook definition of inclusive design, or bringing in people who are considered edge users of a product to spearhead design and development.

Somewhere early in the process, the Lab landed on a big idea: “We were thinking, if YouTube could caption every video, why couldn’t we do that for every piece of content on your phone?” says Nicole Bleuel, team lead on the project with the Creative Lab. Captioning would be wonderful for the deaf community. It would also be handy for anyone using their phone somewhere without sound.

Of course, there were reasons why Google couldn’t easily caption every piece of content inside Android. While the Pixel currently has features like call screening, which uses AI on the phone to detect and transcribe what someone on hold is saying, captioning everything on the device requires the Android team to recode some fundamental bits of Android’s audio architecture.
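Android Q itself added one public piece in that spirit: an AudioPlaybackCapture API that lets an app, with the user’s explicit permission, record what other apps are playing. The sketch below is only an illustration of that public API, not Google’s internal Live Caption plumbing, and it assumes a MediaProjection token has already been granted elsewhere in the app.

    // Illustration only: capture other apps' media audio with the public
    // AudioPlaybackCapture API (Android 10+). The app needs the RECORD_AUDIO
    // permission, and the MediaProjection comes from the system capture prompt.
    import android.media.AudioAttributes
    import android.media.AudioFormat
    import android.media.AudioPlaybackCaptureConfiguration
    import android.media.AudioRecord
    import android.media.projection.MediaProjection

    fun buildPlaybackCapture(projection: MediaProjection): AudioRecord {
        val captureConfig = AudioPlaybackCaptureConfiguration.Builder(projection)
            .addMatchingUsage(AudioAttributes.USAGE_MEDIA)   // videos, music, podcasts
            .addMatchingUsage(AudioAttributes.USAGE_GAME)
            .build()

        val format = AudioFormat.Builder()
            .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
            .setSampleRate(16_000)                           // a common rate for speech models
            .setChannelMask(AudioFormat.CHANNEL_IN_MONO)
            .build()

        // The caller loops over read() on the returned AudioRecord and hands each
        // PCM buffer to an on-device speech model to turn it into caption text.
        return AudioRecord.Builder()
            .setAudioFormat(format)
            .setAudioPlaybackCaptureConfig(captureConfig)
            .build()
    }

Apps can also opt their own audio out of being captured, which is part of why this kind of change lives in the platform’s audio framework rather than in any single app.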

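As for keeping the transcription itself on the phone, the closest public analogue is the SDK’s SpeechRecognizer, which since Android 6.0 can be asked to prefer offline recognition. It listens to the microphone rather than to media playback, so it is only an analogy for what Live Caption does; the showCaption callback below is a made-up stand-in for whatever view displays the text.

    // Illustration only: offline-preferring, on-device speech recognition with the
    // public SpeechRecognizer API (requires the RECORD_AUDIO permission).
    import android.content.Context
    import android.content.Intent
    import android.os.Bundle
    import android.speech.RecognitionListener
    import android.speech.RecognizerIntent
    import android.speech.SpeechRecognizer

    fun startOfflineCaptions(context: Context, showCaption: (String) -> Unit): SpeechRecognizer {
        val recognizer = SpeechRecognizer.createSpeechRecognizer(context)
        recognizer.setRecognitionListener(object : RecognitionListener {
            override fun onPartialResults(partialResults: Bundle) {
                partialResults.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION)
                    ?.firstOrNull()
                    ?.let(showCaption)               // live, still-changing hypothesis
            }
            override fun onResults(results: Bundle) = onPartialResults(results)
            // The remaining callbacks are not needed for this sketch.
            override fun onReadyForSpeech(params: Bundle?) {}
            override fun onBeginningOfSpeech() {}
            override fun onRmsChanged(rmsdB: Float) {}
            override fun onBufferReceived(buffer: ByteArray?) {}
            override fun onEndOfSpeech() {}
            override fun onError(error: Int) {}
            override fun onEvent(eventType: Int, params: Bundle?) {}
        })

        val intent = Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH)
            .putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                      RecognizerIntent.LANGUAGE_MODEL_FREE_FORM)
            .putExtra(RecognizerIntent.EXTRA_PARTIAL_RESULTS, true)
            .putExtra(RecognizerIntent.EXTRA_PREFER_OFFLINE, true)   // keep recognition on the device
        recognizer.startListening(intent)
        return recognizer
    }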

Beyond the audio architecture, there were big questions of what closed captioning on a phone would even look like. On television, where it began in the 1970s, closed captioning is pretty straightforward: there’s only one constant video stream taking up the whole screen, so sticking captions near the bottom generally works. On mobile phones, every app interface is a little bit different. Where could these captions float without getting in the way?

At first, the team mocked up something akin to Chat Heads, a Facebook UI that has also been used in some Android functions: a floating button that you could activate in the settings and tap when you needed to translate audio to text. The team shared the idea with designers who were deaf and hard of hearing, and they were remarkably receptive to it. “Even though I don’t consider this to be an accessibility feature, I’d rather start by building it for the people who need and want it the most,” says Bleuel. “That’s how you get to the point to make something universally useful and accessible.”

From that feedback, the design morphed into a relatively simple dark gray box with white text. You activate it not inside accessibility settings or some deep menu, but in a menu that appears when you tap the volume buttons of your phone. Once it’s on, it’s just on until you turn it back off. “And any time it detects an audio stream on your phone – a video on a social network, or someone sent a voice message, or a video in Google Photos – it will pull up a caption box and start captioning that in real time,” says Bleuel. Together, all of these features mean it’s easy to turn on and off, and a mere tap or two away at any time.

As for the text window itself, you can drag and drop it anywhere on the screen at any time, basically making the optimal interface for yourself. You can also adjust the box’s size, the font size, and the colors for legibility. “You can imagine with a podcast, there isn’t anything on the screen you’re looking at, so you just want more captions,” says Bleuel. Indeed, with Live Caption, it seems like you could read a podcast much like an e-book.

Live Caption will launch on the Google Pixel this fall, and on the wider Android Q ecosystem some time in the future.
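A floating, draggable caption box of the kind described above can also be approximated with public APIs: an overlay window combined with the system-wide caption preferences exposed by CaptioningManager. This is a minimal sketch under those assumptions, not the actual Live Caption view; the CaptionOverlay class name is invented, and the overlay window type requires the separate draw-over-other-apps permission.

    // Illustration only: a draggable caption box drawn over other apps, styled from
    // the user's system caption preferences. Requires SYSTEM_ALERT_WINDOW consent.
    import android.content.Context
    import android.graphics.PixelFormat
    import android.view.Gravity
    import android.view.MotionEvent
    import android.view.WindowManager
    import android.view.accessibility.CaptioningManager
    import android.widget.TextView

    class CaptionOverlay(context: Context) {
        private val windowManager =
            context.getSystemService(Context.WINDOW_SERVICE) as WindowManager
        private val captioning =
            context.getSystemService(Context.CAPTIONING_SERVICE) as CaptioningManager

        private val params = WindowManager.LayoutParams(
            WindowManager.LayoutParams.WRAP_CONTENT,
            WindowManager.LayoutParams.WRAP_CONTENT,
            WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY,
            WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE,
            PixelFormat.TRANSLUCENT
        ).apply { gravity = Gravity.TOP or Gravity.START }

        private val box = TextView(context).apply {
            // Honor the size and colors the user picked in caption settings.
            val style = captioning.userStyle
            textSize = 16f * captioning.fontScale
            setTextColor(style.foregroundColor)
            setBackgroundColor(style.backgroundColor)

            // Dragging the box just moves the overlay window under the finger.
            setOnTouchListener { view, event ->
                if (event.action == MotionEvent.ACTION_MOVE) {
                    params.x = (event.rawX - view.width / 2).toInt()
                    params.y = (event.rawY - view.height / 2).toInt()
                    windowManager.updateViewLayout(view, params)
                }
                true
            }
        }

        fun show() = windowManager.addView(box, params)
        fun update(caption: String) { box.text = caption }
    }

In practice the box would be fed by whatever recognizer is producing the text, for instance the startOfflineCaptions sketch above.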












