Opening Up Our World Through Closed Captioning

This article originally appeared in the March 2023 edition of Inside Equal Access. 

By: James Trombka

One of the many changes to come from the pandemic was the shift of live events to remote. From school and education, to work meetings and concerts, the world quickly learned to adapt to a virtual environment. During this time, many of those events included subtitles, or open captions, to accommodate viewers who were hard of hearing or deaf.

While many events have moved back to an in-person setting, the use of closed captioning has remained. A number of recent studies have shown that more and more viewers are opting in to closed captioning regardless of whether they are hard of hearing, especially Gen Z viewers. A study reported on by the BBC states that 80% of viewers ages 18 to 24 use subtitles most of the time, while only about 25-35% of viewers who are 45 or older tend to use them. Moreover, many older viewers report that they find subtitles and captions distracting from their viewing experience. As the study showed, “young people are almost 4 times more likely than older viewers to watch TV shows with subtitles, despite having fewer hearing problems.” Because of this upward shift in the use of captioning technology, especially among younger generations, captions are being integrated into many more viewing mediums, and that integration will most likely continue in the years ahead.

A recent YouTube video by the news site Vox, titled “Why we all need subtitles now,” examined changes in how movies are filmed and produced, many of which have made films harder for audiences to hear and understand. With innovations in the quality and number of microphones, actors are less likely to enunciate, which creates “mumble acting.” It then falls to sound mixers and editors to clarify the audio as best they can. Even then, movie audio is edited and tailored to be run through the sound systems of Dolby Atmos theaters, which use up to 128 audio channels. When movies are played on devices with only two audio channels, such as most televisions, computers, and phones, that audio must be condensed down, leaving it garbled and muddier. Much of this conversation was sparked by the pandemic release of Christopher Nolan’s Tenet, as many moviegoers complained that the dialogue was difficult to understand regardless of whether they were hard of hearing.


There are two main categories of captioning: open and closed. Open captioning is text that is “baked” into a video and cannot be removed by the viewer. Seeing a foreign film at a movie theater with captions translating the dialogue into English is an example of open captioning. Closed captioning, on the other hand, can be turned on or off by the viewer. With modern technology, closed captioning has become readily available through most streaming and online digital platforms. While there has been a push in many states to require movie theaters to offer open captioning for all of their films, especially after the release of Tenet, there has also been pushback from viewers who say the captions are too distracting and take away from the movie.

Traditionally, a moviegoer who requests closed captioning at a standard movie showing can pick up a closed captioning device called a CaptiView at the box office. About the size of a slightly elongated smartphone, the small box device is attached to a handle and base that slides easily into the cupholder of a moviegoer’s armrest. The closed captions on the device are then synced automatically to the movie being played. Many users of these devices, however, report that it can be difficult to shift their attention back and forth between the CaptiView’s screen and the theater’s screen, and that they often miss important pieces of dialogue or visuals as a result.

Similar issues arise with live theater performances. Instead of using a device like the CaptiView, many live theater venues offer special closed-captioned performances, setting up monitors on either side of the stage to display the dialogue taking place on stage. While this solution does make theater-going more accessible, audience members still have to divert their attention between the action on stage and the dialogue displayed on the monitors. Seat placement can also be an issue depending on the size of the text being displayed and the quality of the audience member’s eyesight.

A few years ago, to address these frequently reported issues, Jonathan Suffolk, the Technical Director at the National Theatre in London, presented a solution: Smart Caption Glasses. Looking like eyewear from the future, these glasses can be worn with or without prescription glasses and display captions directly on their lenses. As with a CaptiView, the captions are synced to the dialogue of the performance; however, they are layered on top of what the viewer is seeing. Because the captions appear on the lenses, an audience member can look around freely, watching the show and reading the captions at the same time and ensuring they don’t miss key pieces of the action or dialogue on stage. Wearers can also change the size, color, position, and brightness contrast of the captions being displayed. As with a CaptiView, audience members requesting captioning can pick up the device at the box office and use it during the show for free. Since the National Theatre’s first use of the device, Suffolk reports that the share of performances accessible to those who are deaf or hard of hearing has gone from 5% (under the traditional closed captioning model) to 80% of all performances. Devices like the Smart Caption Glasses have begun to spread not only to other live theater venues but also to movie theaters throughout the world.

Closed captioning of live events has also spread to the workplace, including the Senate in Washington, D.C. John Fetterman, a U.S. senator from Pennsylvania, suffered a near-fatal stroke last May. One of the lingering effects of that stroke is a neurological condition that impairs his auditory processing. To accommodate these issues, a monitor that raises and lowers and a custom desk were installed at the center dais of the Senate chamber so that live audio-to-text transcription can be displayed for Senator Fetterman. He also uses a tablet running the same speech-to-text technology when he sits in on committee meetings or holds conversations. In the history of the Senate, there has been only one deaf senator, Samuel McEnery of Louisiana, who served from 1897 to 1910, so the transcription technology being installed is the first of its kind in the Senate. Other senators and colleagues of Fetterman “view the changes being made to accommodate him as modernizations for the Senate, a workplace like any other” (New York Times).

Closed captions and captioning technology continue to be used more widely in all areas of life, with new innovations coming out every year. From the entertainment industry and the movies we watch, to live theater productions, the workplace, and even the Senate floor, captioning technology is being integrated in ways that make our world more accessible.