Subtitles in virtual reality: Guidelines for the integration of subtitles in 360º content

Authors

Belén Agulló, Anna Matamala

DOI:

https://doi.org/10.17533/udea.ikala.v25n03a03

Keywords:

subtitles, virtual reality, immersive media, 360º content, hearing impairment, subtitling strategies

Abstract

Immersive content has become a popular medium for storytelling. It is typically accessed via a head-mounted display that places the viewer at the center of the action, free to look around and explore the scene. Criteria for subtitle position in immersive media have yet to be defined. In addition, guiding mechanisms are necessary when speakers are not visible and viewers, lacking an audio cue, require visual information to guide them through the virtual scene. The aim of this reception study is to compare different subtitling strategies: always-visible versus fixed-position subtitles, and arrows versus a radar as guiding mechanisms. To do this, feedback on preferences, immersion (measured with the igroup presence questionnaire, IPQ) and head movements was gathered from 40 participants (20 hearing and 20 hard of hearing). Results show that always-visible subtitles with arrows are the preferred option. Always-visible subtitles and arrows also achieved higher IPQ scores than fixed-position subtitles and the radar. Head-movement patterns show that participants moved more freely when subtitles were always visible than when they were in a fixed position, suggesting that always-visible subtitles make the experience more realistic because viewers do not feel constrained by the implementation of the subtitles.



Author Biographies

Belén Agulló, Universitat Autònoma de Barcelona

Ph. D. in Translation and Intercultural Studies, Universitat Autònoma de Barcelona, Spain. eLearning Director and Lead Media Researcher, Nimdzi Insights, Spain.

Anna Matamala, Universitat Autònoma de Barcelona

Ph. D. in Applied Linguistics, Universitat Pompeu Fabra, Spain. Associate professor, Universitat Autònoma de Barcelona, Spain.

References

Agulló, B. (in press). Technology for subtitling: a 360-degree turn. Hermeneus, 22.

Agulló, B., Matamala, A., & Orero, P. (2018). From disabilities to capabilities: testing subtitles in immersive environments with end users. HIKMA, 17, 195-220. https://doi.org/10.21071/hikma.v17i0.11167

Agulló, B., & Matamala, A. (2019). Subtitling for the deaf and hard-of-hearing in immersive environments: results from a focus group. The Journal of Specialised Translation, 32, 217-235. Retrieved from http://www.jostrans.org/issue32/art_agullo.pdf

Agulló, B., Montagud, M., & Fraile, I. (2019). Making interaction with virtual reality accessible: rendering and guiding methods for subtitles. Artificial Intelligence for Engineering Design, Analysis and Manufacturing, 33(4), 416-428. https://doi.org/10.1017/S0890060419000362

APANAH (n.d.). Retrieved from https://www.apanah.com/

Arnáiz-Uzquiza, V. (2012). Subtitling for the deaf and the hard-of-hearing: Some parameters and their evaluation [Doctoral thesis, Universitat Autònoma de Barcelona, Barcelona, Spain]. https://www.tdx.cat/handle/10803/117528

Asociación Española de Normalización y Certificación (AENOR). (2003). Norma UNE 153010: Subtitulado para personas sordas y personas con discapacidad auditiva. Subtitulado a través del teletexto. https://www.une.org/encuentra-tu-norma/busca-tu-norma/norma/?Tipo=N&c=N0029761

Bartoll, E., & Martínez-Tejerina, A. (2010). The positioning of subtitles for the deaf and hard of hearing. In A. Matamala, & P. Orero (Eds.), Listening to subtitles: Subtitles for the deaf and hard of hearing (pp. 69-86). Peter Lang.

BBC (n.d.). Reel World. 360° Videos from the BBC. Retrieved from https://www.bbc.com/reel/playlist/reel-world-360-videos-from-the-bbc

Brown, A., Turner, J., Patterson, J., Schmitz, A., Armstrong, M., & Glancy, M. (2017). Subtitles in 360-degree video. Proceedings of the 2017 ACM International Conference on Interactive Experiences for TV and Online Video, Hilversum, Netherlands, 3-8. https://doi.org/10.1145/3084289.3089915

Brown, A., Turner, J., Patterson, J., Schmitz, A., Armstrong, M., & Glancy, M. (2018). Exploring subtitle behaviour for 360° video (White Paper WHP 330). https://www.bbc.co.uk/rd/publications/whitepaper330

Catapult Digital. (2018). Growing VR/AR companies in the UK: A business and legal handbook. https://www.pwc.co.uk/intelligent-digital/vr/growing-vr-ar-companies-in-the-uk.pdf

Cayrol, A. (Producer), & Zandrowicz, P. (Director). (2016). I, Philip [Video file]. Retrieved from https://www.arte.tv/sites/webproductions/en/i-philip/

D’Ydewalle, G., Pollet, J., & van Rensbergen, J. (1987). Reading a message when the same message is available auditorily in another language: The case of subtitling. In J. K. O’Regan, & A. Lévy-Schoen (Eds.), Eye movements: From physiology to cognition (pp. 313-321). Elsevier Science Publishers. https://doi.org/10.1016/B978-0-444-70113-8.50047-3

Díaz-Cintas, J. (2013). The technology turn in subtitling. In M. Thelen, & B. Lewandowska-Tomaszczyk (Eds.), Translation and meaning: Part 9 (pp. 119-132). Zuyd University of Applied Sciences.

Díaz-Cintas, J. (2014). Technological strides in subtitling. In Chan Sin-wai (Ed.), The Routledge encyclopedia of translation technology (pp. 632-643). Routledge.

Díaz-Cintas, J., Orero, P., & Remael, A. (Eds.). (2007). Media for all: Subtitling for the deaf, audio description, and sign language. Rodopi. https://doi.org/10.1163/9789401209564

European Broadcasting Union (EBU). (2017). Virtual reality: How are public broadcasters using it? https://www.ebu.ch/publications/virtual-reality-how-are-public-broadcasters-using-it

Graham (2015). Jaunt and RYOT announce VR documentary series Holy Land. Retrieved from https://www.vrfocus.com/2015/12/jaunt-and-ryot-announce-vr-documentary-series-holy-land/

igroup presence questionnaire (n.d.). igroup presence questionnaire overview. Retrieved from http://www.igroup.org/pq/ipq/index.php

Immersive Accessibility website (n.d.). Presentation. Retrieved from http://www.imac-project.eu/

Kruger, J. L. (2012). Making meaning in AVT: eye tracking and viewer construction of narrative. Perspectives: Studies in Translation Theory and Practice, 20(1), 67-86. https://doi.org/10.1080/0907676X.2011.632688

Kurzhals, K., Cetinkaya, E., Hu, Y., Wang, W., & Weiskopf, D. (2017). Close to the action: Eye-Tracking evaluation of speaker-following subtitles. Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, USA (pp. 6559-6568). https://doi.org/10.1145/3025453.3025772

Lessiter, J., Freeman, J., Keogh, E., & Davidoff, J. (2001). A cross-media presence questionnaire: The ITC-Sense of Presence Inventory. Presence: Teleoperators and Virtual Environments, 10(3), 282-297. https://doi.org/10.1162/105474601300343612

MacQuarrie, A., & Steed, A. (2017). Cinematic virtual reality: Evaluating the effect of display type on the viewing experience for panoramic video. Proceedings of 2017 IEEE Virtual Reality (VR), USA (pp. 45-54). https://doi.org/10.1109/VR.2017.7892230

Mangiron, C. (2013). Subtitling in game localisation: a descriptive study. Perspectives: Studies in Translation Theory and Practice, 21(1), 42-56. https://doi.org/10.1080/0907676X.2012.722653

Matamala, A., & Orero, P. (Eds.) (2010). Listening to subtitles: Subtitles for the deaf and hard of hearing. Peter Lang.

Monotype (2017). The virtual frontier [Blog post]. Retrieved from https://medium.com/@monotype/the-virtual-frontier-8f05bf20e92d

NYT VR (n.d.). Virtual reality. Retrieved from http://www.nytimes.com/marketing/nytvr/

NYT VR Player (n.d.). Virtual reality player. Retrieved from https://www.nytimes.com/video/360-video

Perego, E., Del Missier, F., Porta, M., & Mosconi. M. (2010). The cognitive effectiveness of subtitle processing. Media Psychology, 13(3), 243-272. https://doi.org/10.1080/15213269.2010.502873

Perkins Coie LLP (2018). 2018 augmented and virtual reality survey report: Insights into the future of AR/VR. https://www.perkinscoie.com/images/content/1/8/v2/187785/2018-VR-AR-Survey-Digital.pdf

Romero-Fresco, P. (2009). More haste less speed: Edited vs. verbatim respeaking. Vigo International Journal of Applied Linguistics, (6), 109-133. https://dialnet.unirioja.es/servlet/articulo?codigo=3207248

Romero-Fresco, P. (Ed.). (2015). The reception of subtitles for the deaf and hard of hearing in Europe. Peter Lang. https://doi.org/10.3726/978-3-0351-0888-0

Romero-Fresco, P. (2019). Accessible filmmaking: Integrating translation and accessibility into the filmmaking process. Routledge. https://doi.org/10.4324/9780429053771

Rothe, S., Höllerer, T., & Hussmann, H. (2018). CVR-analyzer: a tool for analyzing cinematic virtual reality viewing patterns. Proceedings of the 17th International Conference on Mobile and Ubiquitous Multimedia, Cairo, Egypt, 127-137. https://doi.org/10.1145/3267782.3274688

Rothe, S., Tran, K., & Hussmann, H. (2018). Dynamic subtitles in cinematic virtual reality. ACM TVX 2018, Seoul, Republic of Korea, 209–214. https://doi.org/10.1145/3210825.3213556

Schubert, T. (2003). The sense of presence in virtual environments: A three-component scale measuring spatial presence, involvement, and realness. Zeitschrift für Medienpsychologie, 15(2), 69-71. https://doi.org/10.1026//1617-6383.15.2.69

Sidenmark, L., Kiefer, N., & Gellersen, H. (2019). Subtitles in interactive virtual reality: Using gaze to address depth conflicts. Proceedings of Workshop on Emerging Novel Input Devices and Interaction Techniques. Osaka, Japan. https://eprints.lancs.ac.uk/id/eprint/132411/

Slater, M., & Usoh, M. (1993). Representations systems, perceptual position, and presence in immersive virtual environments. Presence, 2(3), 221-233. https://doi.org/10.1162/pres.1993.2.3.221

Szarkowska, A., Krejtz, I., Kłyszejko, Z., & Wieczorek, A. (2011). Verbatim, standard, or edited? Reading patterns of different captioning styles among deaf, hard of hearing, and hearing viewers. American Annals of the Deaf, 156(4), 363-378. https://doi.org/10.1353/aad.2011.0039

Szarkowska, A., Krejtz, I., Pilipczuk, Dutka, L., & Kruger, J. (2016). The effects of text editing and subtitle presentation rate on the comprehension and reading patterns of interlingual and intralingual subtitles among deaf, hard of hearing and hearing viewers. Across Languages and Cultures, 17(2), 183-204. https://doi.org/10.1556/084.2016.17.2.3

The New York Times & Within (Producers), & Solomon, B. C., & Ismail, I. (2015). The Displaced [Video file]. Retrieved from https://www.nytimes.com/video/magazine/100000005005806/the-displaced.html

Ulanoff, L. (2019, January 17). Why Gen Z loves closed captioning [Web log post]. https://onezero.medium.com/why-gen-z-loves-closed-captioning-ec4e44b8d02f

Witmer, B. G., & Singer, M. J. (1998). Measuring presence in virtual environments: A presence questionnaire. Presence: Teleoperators and Virtual Environments, 7(3), 225-240. https://doi.org/10.1162/105474698565686

Published

2020-09-12

How to Cite

Agulló, B., & Matamala, A. (2020). Subtitles in virtual reality: Guidelines for the integration of subtitles in 360º content. Íkala, Revista De Lenguaje Y Cultura, 25(3), 643–661. https://doi.org/10.17533/udea.ikala.v25n03a03

Section

Empirical Studies
