A driving simulator as a tool for benchmarking optical lenses

Picture: Pixabay: ID 12019

Conception of the Aalen Mobility Perception & Exploration Lab (AMPEL)

The Aalen Mobility Perception & Exploration Lab (AMPEL) is part of the Competence Center “Vision Research” in the Innovation Centre (Inno-Z) at the University of Applied Sciences in Aalen, Germany. A virtual research environment was developed there that allows night driving experiments under highly standardized conditions – and thus also the benchmarking of optical lenses.
By Judith Ungewiss, Michael Wörner, Ulrich Schiefer

About one third of all road traffic accidents occur at night. At first glance, this seems simply to correspond to the share of hours of darkness. However, taking into account that only approximately 25% of the yearly mileage is driven at nighttime, the risk of a traffic accident with a fatal outcome is increased by about 50% at night in comparison to the same distance traveled during the day (data collected for the Federal Republic of Germany) [1][2]. Nighttime driving ability is therefore of particular importance in road traffic.
Given the effort involved in setting up a driving simulator, the question arises whether this is worthwhile for assessing nighttime driving ability. In this respect – even though we feel that the value of each human life is “inestimable” and thus cannot be quantified in monetary terms – it should be noted that the statistical value of a (traffic) accident death in the Federal Republic of Germany is estimated at an average of about 4.5 million Euros [3]. In comparison, the costs of driving simulator experiments, which in the best case contribute to saving human lives in the future, seem manageable and legitimate. Furthermore, such a simulator has the important advantage of displaying scenarios in a highly immersive way under standardized conditions (cited as in [4]).

Setup and interior of the Aalen Mobility Perception & Exploration Lab (AMPEL)

The Aalen Mobility Perception & Exploration Lab (AMPEL), operated by the Competence Center “Vision Research” in the Innovation Centre (Inno-Z) on the campus of the Aalen University of Applied Sciences, was launched in 2015 and consists of two large laboratories with associated measurement and workstations.

The nighttime driving simulator situated in the AMPEL lab contains a completely retrofitted Audi A4 (Audi AG, Ingolstadt, Germany) with a steering and pedal unit (SensoDrive GmbH, Wessling, Germany) and two fully digital displays (instrument panel and navigation monitor). Two high-performance projectors (Zeiss Velvet 1600, Zeiss AG, Jena, Germany), of a type commonly used in planetariums, project the driving route onto a cylindrical 180° screen with a radius of 3.20 m. A monitor (KD 65 XF 9005 BAEP, Sony, Tokyo, Japan) behind the trunk of the car provides an almost realistic view through the rear-view mirror. A vehicle can be brought into the laboratory via a sliding window and, if necessary, exchanged for another vehicle or another test arrangement (see Fig. 1 and Fig. 2). Driving scenarios are implemented using the SILAB simulation software developed by WIVW (Würzburg Institute for Traffic Sciences, Veitshöchheim, Germany). Traffic routes can also be visualized from existing GPS data.

@image: Fig.1 hier

@image: Fig.2 hier

Speed-dependent driving noise can be generated by a loudspeaker (BeoPlay Beolit 15, Bang & Olufsen, Struer, Denmark). This noise provides acoustic feedback and thus supports drivers in controlling their speed.

Calibration procedures are regularly executed prior to the start of a study (Spectroradiometer CAS 140 VIS/UV, Instrument Systems GmbH, Munich, Germany and Minolta Luminance Meter LS160, Konica Minolta Holdings K.K., Tokyo, Japan).

@subhead: Simulator vs. on-road experiments
Why are experiments not carried out directly as (real) on-road driving, instead of setting up a simulator with great effort with the aim of creating a driving environment as close to reality as possible?
The main reason is the complexity, and thus the usually inadequate standardization, of on-road experiments: it is almost impossible to realize identical conditions with respect to road surface, weather, lighting, wind, etc. across several experimental runs (which may additionally be executed on different days). The personnel expenditure for on-road experiments is also considerable: on the routes used, street lighting may have to be switched off, and the routes must be secured by roadblocks or supervisory staff so that passers-by do not endanger themselves or the experiments. Additionally, for insurance reasons, such experiments are only permitted in a vehicle with dual controls under the constant supervision of a driving instructor. Experiment supervisors and operators (responsible for the technical implementation of the experiments) are also required.
In a simulator, on the other hand, only one test supervisor and one operator are required to carry out the corresponding experiments. A driving simulator thus allows highly standardized examinations without compromising the safety of test persons [4].

@subhead: Technical and experimental opportunities
The driving simulator in the AMPEL laboratory enables the determination of visual acuity and contrast sensitivity while driving. Individual local thresholds are assessed using eight-position Landolt Cs. The stimuli can be presented at various locations and distances within the setup:
• at the projection screen (right road side, 4.66 m)
• at the monitor just behind the trunk via the center of the rear-view mirror (3.40 m)
• at the center of the navigation monitor (0.75 m)
• at the center of the instrument panel (0.72 m) (see Fig. 3).

@image: Fig. 3 hier
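As an illustration of the stimulus geometry above, the physical size of a Landolt C at each presentation distance follows from the standard visual-angle convention: at a decimal visual acuity of 1.0, the gap of the C subtends 1 arcminute. A minimal sketch in Python, using the four distances listed above (the function name is our own, not part of the AMPEL software):

```python
import math

def landolt_gap_mm(distance_m: float, decimal_va: float) -> float:
    """Physical gap size (mm) of a Landolt C whose gap subtends
    1/decimal_va arcminutes at the given viewing distance."""
    gap_arcmin = 1.0 / decimal_va
    gap_rad = math.radians(gap_arcmin / 60.0)
    return 2 * distance_m * 1000 * math.tan(gap_rad / 2)

# Presentation distances of the AMPEL setup (see list above)
for name, d in [("projection screen", 4.66),
                ("rear-view mirror path", 3.40),
                ("navigation monitor", 0.75),
                ("instrument panel", 0.72)]:
    print(f"{name}: {landolt_gap_mm(d, 1.0):.2f} mm gap at VA 1.0")
```

At 4.66 m the gap for visual acuity 1.0 is about 1.36 mm (overall C diameter: five times the gap), which shows why the stimulus generation must be calibrated per display surface.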

The appearance of the visual stimulus can be announced to the test participant by an audio signal. The orientation of the Landolt C’s opening has to be indicated verbally by the test person and is recorded by the test supervisor or, alternatively, by a miniaturized microphone. In this way it is possible not only to check the correctness of each response but also to record its latency. Beyond Landolt Cs, obstacles (e.g. pedestrians or animals, each at various contrast levels) can be presented.

The simulator is equipped with an eye tracking system (SmartEye Pro, sampling rate 120 Hz, gaze accuracy of 0.5° under ideal conditions, SmartEye AB, Gothenburg, Sweden), which enables the contact-free recording and evaluation of head and eye movements. For this purpose, the driver is observed via (a minimum of) three infrared cameras while driving. The driving scenario is recorded by an additional scene camera. With the help of this setup, the driver’s head movements, eye movements, and fixations can be assigned and annotated to certain objects or regions of interest.
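The assignment of gaze samples to regions of interest mentioned above can be sketched as follows. This is a simplified illustration of the principle, not the SmartEye or annotation software itself; the ROI names and coordinates are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class ROI:
    """Rectangular region of interest in degrees of visual angle."""
    name: str
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

# Hypothetical regions of interest in the driving scene
ROIS = [
    ROI("oncoming_vehicle", -10.0, -2.0, -3.0, 3.0),
    ROI("road_ahead", -3.0, -5.0, 3.0, 2.0),
    ROI("instrument_panel", -5.0, -20.0, 5.0, -12.0),
]

def annotate(gaze_samples):
    """Assign each (x, y) gaze sample to the first matching ROI."""
    return [next((r.name for r in ROIS if r.contains(x, y)), "other")
            for x, y in gaze_samples]

print(annotate([(0.0, 0.0), (-6.0, 1.0), (30.0, 0.0)]))
# → ['road_ahead', 'oncoming_vehicle', 'other']
```

In practice the ROIs move with the scene (e.g. an approaching pedestrian), so the rectangles would be updated per video frame before each lookup.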

A realistic display of glare, for example from the headlights of oncoming vehicles, is an essential factor for the representation of a close-to-reality driving environment. Simply projecting virtual headlights on the screen of the simulation environment does not achieve the necessary luminance values. Instead, a patented mobile glare device was developed for the AMPEL laboratory: with the help of cable robots, WiFi-controlled wireless LED arrays are moved in both horizontal and vertical directions. Power is supplied by small lithium-polymer batteries, as used in model aircraft construction [4].

In order to validate such a simulator, it must be determined whether a virtual scenario actually measures what it is supposed to measure. Such validation is achieved by comparing simulator results with those of a real on-road driving test under comparable conditions.
At the AMPEL lab, the on-road course on the “Burren” campus of Aalen University, next to the driving simulator, has been transferred into the virtual environment using the SILAB software package (Würzburg Institute for Traffic Sciences, WIVW, Veitshöchheim, Germany) [4].

@subhead: Benchmarking optical lenses
Optical lenses are usually evaluated by ray tracing methods or questionnaires: ray tracing is well suited to characterizing image quality at the level of the retina, but it does not take into account image processing along the subsequent visual pathway. Questionnaires are subjective tools with an inherent lack of standardization.

A new, patented approach was set up at the AMPEL laboratory: a psychophysical test records the individual’s visual performance or impairment in a location-specific manner within the highly standardized environment of a driving simulator. LED arrays, either static or moved via cable robots, serve as glare sources [5]. Static and dynamic optotypes, moving along so-called vectors at constant angular velocity, are presented for this purpose. The vector origins can be placed at the center of a glare source (current setup: visual angle 0.3°, luminance 60 kcd/m², 6.2° left of and 1.1° below the fixation mark, corresponding to the left headlight of an oncoming car [Golf VII, Volkswagen AG, Wolfsburg, Germany]). In this way, the individual, location-related extent of visual impairment due to halo or starburst can be assessed (see Fig. 3). Local threshold variability and individual response time are assessed by repeated presentations and by vector placements within unaffected visual field areas on the meridians 0°, 45°, 90°, 135°, 180°, 225°, 270°, and 315° (in random order).
The displacement of the isopters under the glare condition, compared to the initial condition without glare, represents the magnitude of impairment due to glare (halo size; see Fig. 4) [6]. Patient responses can be recorded in order to extract reaction times either from these recordings or from keypad inputs.
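Assuming the impairment is summarized as the mean radial displacement of the detection isopter across the eight meridians, the computation can be sketched as follows (the threshold eccentricities are purely illustrative, not study data):

```python
MERIDIANS = [0, 45, 90, 135, 180, 225, 270, 315]  # degrees

def halo_magnitude(baseline: dict, glare: dict) -> float:
    """Mean outward displacement (deg) of the detection isopter
    under glare, relative to the baseline condition without glare."""
    return sum(glare[m] - baseline[m] for m in MERIDIANS) / len(MERIDIANS)

# Illustrative threshold eccentricities (deg) per meridian
baseline = {m: 1.0 for m in MERIDIANS}
glare = {0: 3.0, 45: 2.5, 90: 2.0, 135: 2.5,
         180: 3.0, 225: 2.5, 270: 2.0, 315: 2.5}

print(f"halo size: {halo_magnitude(baseline, glare):.2f} deg")
# → halo size: 1.50 deg
```

Keeping the per-meridian differences instead of averaging them would additionally preserve the shape of the halo, e.g. an asymmetric starburst.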

The setup is being used to characterize the visual effects of media opacities (cataract) in motorists (ContrastVal study, ClinicalTrials.gov identifier: NCT03169855) and to benchmark the impact of any kind of optical correction, such as intraocular lenses (JJ-EYHANCE study, ClinicalTrials.gov identifier: NCT04059289), spectacle lenses, contact lenses, or other (surgical) refractive procedures.

@image: Fig.4 hier

@subhead: Conclusions
In the AMPEL lab, a virtual test environment was created that allows night driving experiments under highly standardized conditions. Benchmarking of optical lenses can be achieved by testing visual acuity or contrast sensitivity under static or dynamic conditions and at various locations. In addition, static or dynamic glare exposure of the test subjects with simultaneous time-resolved testing of visual function is possible. Compared to real on-road experiments, simulator experiments are safe, well standardized, and require comparatively little effort. Validation is possible if required.

[1] Kraftfahrtbundesamt. Erneut mehr Gesamtkilometer bei geringerer Jahresfahrleistung je Fahrzeug. https://www.kba.de/DE/Statistik/Kraftverkehr/VerkehrKilometer/2017_vk_kurzbericht_pdf.pdf?__blob=publicationFile&v=14. Updated April 18, 2020. Accessed April 18, 2020.
[2] Statistisches Bundesamt (Destatis). Verkehrsunfälle Zeitreihen 2017. 2018
[3] Spengler H. Kompensatorische Lohndifferenziale und der Wert eines statistischen Lebens in Deutschland. Zeitschrift Arbeitsmarktforschung 3:269–305. 2004
[4] Ungewiß J, Schiefer U, Wörner M. Untersuchung der Nachtfahrtauglichkeit im Simulator – Vorstellung des Aalen Mobility Perception & Exploration Lab (AMPEL). Der Augenspiegel 06/2019:38-41. 2019
[5] Eichinger P, Sauter T, Schiefer U, Schmitt U, Ungewiss J, Hirche M, Schuster M. Fahrsimulator und Verfahren zur Durchführung einer Fahrsimulation. German Patent and Trademark Office: Patent 10 2017 126 741, IPC G09B 9/02
[6] Schiefer U, Ungewiss J, Wörner M. Ortsbezogene Quantifizierung von Halo- und Streulichtbeeinträchtigung. German Patent and Trademark Office: Patent 10 2019 121 602 A1, WO 2020/089284 A1

The authors would like to thank Prof. Dr. Peter Eichinger, Prof. Dr. Ulrich Schmitt and their whole ContrastVal study team (Mechatronics, Aalen University, Germany) for the design and implementation of the glare sources and Prof. Dr. Jürgen Nolting and Prof. Dr. Günter Dittmar (AWFE Steinbeis Transfer Centre, Aalen, Germany) for their comprehensive support in light measurement tasks.

Disclosure of financial and proprietary interests:
Ulrich Schiefer is consultant of the Haag-Streit AG, Köniz, Switzerland.
Michael Wörner is Managing Director of Blickshift GmbH, Stuttgart, Germany.


Corresponding author:
Judith Ungewiss – judith.ungewiss@hs-aalen.de
Judith Ungewiss holds a M.Sc. in Ophthalmic Optics & Psychophysics. She works as scientific assistant in the Competence Center “Vision Research” in the study course ophthalmic optics & audiology at Aalen University.
@image: Judith_Ungewiss

Dr.-Ing. Michael Wörner is a software engineer who works as scientific assistant in the Competence Center “Vision Research” in the study course ophthalmic optics & audiology at Aalen University. In addition, he is Managing Director of Blickshift GmbH in Stuttgart, Germany.
@image: Michael_Wörner

Prof. Dr. med. Ulrich Schiefer is head of the Competence Center “Vision Research” (study course Ophthalmic Optics at Aalen University), which addresses the visual system and its possible dysfunctions, especially regarding the development and validation of examination and therapy procedures. After his medical studies and service in the Dept. of Ophthalmology at Military Hospital Ulm, he joined the University Eye Hospital Tübingen in 1989 where he has filled various positions in a full-time appointment until 2012 and in a part-time scientific appointment up to now. Ulrich Schiefer holds several patents on perimetric examination technique and other mainly sensory physiological examination methods.

EssilorLuxottica cooperates with Facebook for smart glasses

EssilorLuxottica and Facebook announced a multi-year collaboration to develop the next generation of smart glasses. Facebook founder and CEO Mark Zuckerberg shared the news during Facebook Connect, an annual conference held virtually from California.

The partnership aims to combine Facebook’s apps and technologies, Luxottica’s market power and iconic brands, and Essilor’s advanced lens technology to “help people stay better connected with their friends and families.” The first product will be launched under Ray-Ban, the world’s most popular eyewear brand, and is expected in 2021.

“We’re passionate about researching devices that help people better connect with those who are closest to them. Wearables have the potential to do this. With EssilorLuxottica we have an equally ambitious partner who brings their expertise and their first-class brand catalog to the first truly fashionable smart glasses,” said Andrew Bosworth, Vice President of Facebook Reality Labs.

“We are particularly proud of our collaboration with Facebook, which projects a cult brand like Ray-Ban into an increasingly digital and social future. By combining a brand that is loved and worn by millions of consumers around the world with technology that has brought the world closer together, we can redefine expectations for wearables. We are paving the way for a new generation of products designed to change the way we see the world,” said Rocco Basilico, Chief Wearables Officer, Luxottica.

Product name, specifications, software capabilities, pricing, and other details will be announced prior to launch in 2021.

Picture: Pixabay: LoboStudioHamburg

The Vision Council launches diversity, equity and inclusion task force

The Vision Council announced the launch of a Diversity, Equity and Inclusion Task Force. The launch is part of the organization’s larger brand reimagination focused on community-building and industry growth, and follows The Vision Council’s recent unveiling of a redesigned logo and new website designed to better serve and accommodate the needs of members and the vision care community at large. Comprised of 13 vision care community leaders, the task force’s mission is to amplify and embrace everyone’s voice within all segments of the vision industry, to push forward a more equitable agenda for the eyewear and eyecare industry, and to build a more diverse and inclusive community.

“In 2019, The Vision Council conducted one of the first studies of diversity in leadership for the optical industry. The results were intended to benchmark the optical leadership landscape and illuminate the opportunities companies have to promote diversity, cultivate talent and improve their performance,” said Ashley Mills, CEO of The Vision Council. “We shared these results at The Vision Council’s annual Executive Summit in January 2020, during which we also announced a five-year plan reflecting a renewed focus on community-building and promoting industry growth. The survey was an important starting point in gathering information and data that would allow us to foster a more inclusive vision care community, while the creation of a Diversity, Equity and Inclusion Task Force demonstrates our commitment to following through on this goal. The task force will be charged with developing dynamic programming and community initiatives and also identifying milestones and goals for the industry in order to provide tangible, actionable pathways for our members to access.”

“I am honored to lead this effort to create more opportunities for courageous and impactful engagements within the communities we serve,” said Tarrence Lackran, Director of Programming and Partnerships at The Vision Council. “Launching the task force now, on the heels of the debut of The Vision Council’s new brand identity, is a testament to the concrete steps The Vision Council is taking to advocate for all members of the vision care industry and promote industry growth in an inclusive way. Given the challenges the community is grappling with, fostering an industry culture that is welcoming and inclusive is of the utmost importance.”

Task force members include:
• Dr. Derrick Artis – Management Consultant at Artis Consulting
• Lanard Atkins – Opticianry Program Director at Georgia Piedmont Technical College
• Dr. Diana Canto-Sims – Co-Founder at Buena Vista Optical and Designer at La Vida Eyewear
• Gai Gherardi – Co-Founder at l.a.Eyeworks
• Rebecca Giefer – CEO, Americas at MODO Eyewear
• Dr. Millicent Knight – SVP, Customer Development Group at Essilor
• Blake Kuwahara – President/CEO, Focus Group West and Creative Director, Blake Kuwahara Eyewear
• Dr. Howard Purcell – President and CEO at New England College of Optometry
• Dr. Danielle Richardson – Independent Optometrist at Optometrix
• Tiffany Smith – Regional Director of Doctor Recruitment at National Vision
• Phernell Walker – Director of Optometric Relations at VSP Ventures
• Kyly Zak Rabin – Co-Founder at Zak.
• Christine Yeh – Executive Editor at 20/20 Magazine