Introduction
In late September 2023, Meta unveiled its second generation of smart glasses in collaboration with Ray-Ban [1]. These smart glasses offer several improvements over the first generation, including enhanced audio, upgraded cameras, and a lighter design. The glasses are equipped with an ultra-wide 12-megapixel camera and immersive audio recording capabilities, allowing users to capture moments with a high level of detail and depth (Fig. 1) [1, 2]. These smart glasses are part of Meta's broader efforts to develop augmented reality (AR) and virtual reality (VR) technologies. In addition, the glasses are equipped with AI-powered assistants such as Meta AI [1].
Ray-Ban Meta smart glasses also represent a promising development in assistive technology for individuals with visual impairments and have the potential to significantly enhance their quality of life. The field of assistive technology has been advancing rapidly in recent years, particularly due to significant advances in artificial intelligence [3] and augmented reality [4]. Envision is currently one of the leading smart glasses developers, and its technology articulates visual information into speech for individuals with vision impairments. A recent update added GPT-4 integration [5], allowing users to ask the glasses specific questions, such as summarizing a document or reading only the vegan items from a menu. Future updates are expected to further increase the usefulness of this integration [6].
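To make this kind of workflow concrete, the following is a minimal, hypothetical sketch of how text captured by a glasses camera and a user's spoken question might be combined into a single prompt for a GPT-style model. It is not Envision's or Meta's actual implementation: the model name, prompt wording, and example menu are illustrative assumptions, and the OpenAI Python SDK is used only as a stand-in for whichever language model backend a given device integrates.

```python
# Hypothetical sketch of a text-query workflow for assistive smart glasses:
# OCR'd text from the camera plus a spoken question are combined into a
# single prompt for a GPT-style model, which answers with only the
# requested information. Names and model choice are illustrative.
from openai import OpenAI  # assumes the OpenAI Python SDK is installed

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def answer_about_scene(ocr_text: str, question: str) -> str:
    """Ask a language model a question about text captured by the glasses."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model choice
        messages=[
            {"role": "system",
             "content": "You assist a blind user. Answer briefly and concretely."},
            {"role": "user",
             "content": f"Captured text:\n{ocr_text}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content


# Example: read only the vegan items from a photographed menu.
menu = "Margherita pizza\nGrilled chicken salad\nLentil soup (vegan)\nTiramisu"
print(answer_about_scene(menu, "Which dishes are vegan?"))
```

In a deployed assistive device, the returned answer would then be routed to a text-to-speech engine so the wearer hears it rather than reads it.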
The Envision smart glasses are built on the Google Glass Enterprise Edition 2 (now discontinued), and the high price of the Google smart glasses likely posed a barrier to the adoption of this helpful technology by individuals with vision impairment. Lowering the cost of assistive technologies is essential, as previous research in the UK found a staggeringly low employment rate of 26% for blind and partially sighted working-age individuals [7].
As Meta attempts to make smart glasses a mainstream technology, their cost is likely to continue to decrease in the coming years. The advanced camera technology incorporated into the glasses can provide real-time image processing, while the built-in AI can recognize objects and convert this visual information into speech [1]. An update planned within the next year is expected to allow users to ask Meta AI questions about what they are looking at. Users can potentially interact with these assistants to receive auditory information about their environment, have text read aloud, recognize faces, or get directions, which can be invaluable for individuals with visual impairments (Fig. 2). Future incorporation of GPS navigation accompanied by audio cues could facilitate independent navigation of unfamiliar environments for individuals with visual impairments. Previous research in the UK showed that nearly 40% of blind and partially sighted individuals are not currently able to complete all of the journeys that they need or wish to make [7]. Better accessibility through the use of smart glasses can lead to greater independence for individuals with vision impairments.
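The object-recognition-to-speech pipeline described above can be illustrated with openly available components. The sketch below is an assumption-laden approximation rather than Meta's on-device software: it uses a pretrained YOLO detector from the open-source ultralytics package and the offline pyttsx3 text-to-speech engine, with a hypothetical image file standing in for a camera frame.

```python
# Illustrative sketch only, not Meta's or Ray-Ban's actual software stack:
# a single camera frame is passed to an off-the-shelf object detector and
# the detected labels are composed into a sentence that is spoken aloud.
from collections import Counter

import pyttsx3                # offline text-to-speech engine
from ultralytics import YOLO  # pretrained object detection model


def describe_frame(image_path: str) -> str:
    """Detect objects in one camera frame and compose a spoken description."""
    model = YOLO("yolov8n.pt")      # small pretrained detection model
    result = model(image_path)[0]   # run detection on a single image
    labels = [result.names[int(box.cls[0])] for box in result.boxes]
    if not labels:
        return "No familiar objects detected."
    counts = Counter(labels)
    parts = [f"{n} {name}" if n > 1 else name for name, n in counts.items()]
    return "I can see " + ", ".join(parts) + "."


def speak(text: str) -> None:
    """Convert the description to audio for the wearer."""
    engine = pyttsx3.init()
    engine.say(text)
    engine.runAndWait()


speak(describe_frame("street_scene.jpg"))  # hypothetical captured frame
```

A production system would run continuously on streamed frames and prioritize low latency and on-device inference, but the same detect-then-describe structure applies.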
Meta hopes to incorporate augmented reality into future versions of its smart glasses and describes the current generation as a stepping stone to true augmented reality. Users with vision impairments would benefit greatly from true augmented reality glasses, with potential features such as magnification, contrast enhancement, and color correction enhancing their ability to see and navigate their surroundings more effectively. Meta's future augmented reality work will inevitably be compared with the Apple Vision Pro, which also aims to make mixed reality devices mainstream [8, 9]. Further research will also be required to minimize the variability between different VR/AR devices prior to clinical use [10]. We look forward to continued advances in augmented reality with AI integration and believe this technology can revolutionize how individuals with vision impairments interact with the world.
References
1. Introducing the New Ray-Ban | Meta Smart Glasses. Meta. 2023. https://about.fb.com/news/2023/09/new-ray-ban-meta-smart-glasses/.
2. Iqbal MZ, Campbell AG. Adopting smart glasses responsibly: potential benefits, ethical, and privacy concerns with Ray-Ban Stories. AI Ethics. 2023;3:325–327.
3. Waisberg E, Ong J, Paladugu P, Kamran SA, Zaman N, Lee AG, et al. Challenges of artificial intelligence in space medicine. Space Sci Technol. 2022;2022:1–7.
4. Masalkhi M, Waisberg E, Ong J, Zaman N, Sarker P, Lee AG, et al. Apple Vision Pro for ophthalmology and medicine. Ann Biomed Eng. 2023;51:2643–2646.
5. Waisberg E, Ong J, Masalkhi M, Kamran SA, Zaman N, Sarker P, et al. GPT-4: a new era of artificial intelligence in medicine. Ir J Med Sci. 2023. https://doi.org/10.1007/s11845-023-03377-8.
6. Paladugu PS, Ong J, Nelson N, Kamran SA, Waisberg E, Zaman N, et al. Generative adversarial networks in medicine: important considerations for this emerging innovation in artificial intelligence. Ann Biomed Eng. 2023;51:2130–2142.
7. Slade J, Edwards R. My Voice 2015: the views and experiences of blind and partially sighted people in the UK. Accessed 10 Oct 2023.
8. Waisberg E, Ong J, Masalkhi M, Zaman N, Sarker P, Lee AG, et al. Apple Vision Pro and why extended reality will revolutionize the future of medicine. Ir J Med Sci. 2023. https://doi.org/10.1007/s11845-023-03437-z.
9. Waisberg E, Ong J, Masalkhi M, Zaman N, Sarker P, Lee AG, et al. The future of ophthalmology and vision science with the Apple Vision Pro. Eye. 2023. https://doi.org/10.1038/s41433-023-02688-5.
10. Sarker P, Zaman N, Ong J, Paladugu P, Aldred M, Waisberg E, et al. Test–retest reliability of virtual reality devices in quantifying for relative afferent pupillary defect. Transl Vis Sci Technol. 2023;12:2.
Author information
Contributions
EW—Writing. JO—Writing. MM—Writing, figure development. NZ—Review, intellectual support. PS—Review, intellectual support. AGL—Review, intellectual support. AT—Review, intellectual support.
Ethics declarations
Competing interests
The authors declare no competing interests.
Additional information
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Waisberg, E., Ong, J., Masalkhi, M. et al. Meta smart glasses—large language models and the future for assistive glasses for individuals with vision impairments.
Eye (2023). https://doi.org/10.1038/s41433-023-02842-z
Received: 17 October 2023
Revised: 01 November 2023
Accepted: 10 November 2023
Published: 04 December 2023
DOI: https://doi.org/10.1038/s41433-023-02842-z