The last frontier for progressive design personalization


Gaze dynamics diagnosis.

The desire to design the ideal progressive addition lens for each user has been a constant over the years.
The tools that have emerged over time to offer customization demonstrate this. One new tool is a lens calculation technology that can take into account the gaze dynamics diagnosis of each person's unique way of looking at the world.
By Pau Artús

 

 

Since the first progressive addition lenses were industrially produced in 1959, a variety of companies have offered similar technical solutions. All the designs around that time were of a similar type, giving birth to what today is known as the hard design philosophy, i.e. more abrupt transitions between the areas with and without unwanted lateral astigmatism. The first soft designs appeared in the 1980s, with a more gradual transition between those areas, but in exchange they suffered from some invasion of astigmatism into the usually clear areas. This led to trials and studies that tried to understand people's preference for one philosophy or the other. The first recommendations were based on the wearer's experience with progressive lenses.
Later the customization race began, in which the position of wear of the frame was also taken into account. Three additional parameters were required for this purpose: back vertex distance, pantoscopic tilt, and wrap angle. This contributed significantly to the user's visual quality.
In the ongoing search for personalization, the relationship between head and eye movements was also studied. For this purpose, a coefficient relating these two parameters was established, and users were classified according to whether they moved their head more or less than their eyes.
But in spite of the improvement in visual quality, progressive lenses did not provide completely comfortable vision for all users.

Then lens manufacturers began to show interest in understanding the effect of the lifestyle of presbyopic patients and in assigning designs based on their daily activities. By means of a questionnaire, the practitioner collected qualitative information about the patient's activities and lifestyle to offer a solution adapted to their specific needs. However, although this method is still widely used nowadays, it relies on subjective information that depends on the patient's perception and the circumstances they may be experiencing at that particular moment. For example, when asked "how often do you play sports," or even "how much time do you spend using electronic devices," the same person can give different answers at different times of the week, or even answers that are strongly conditioned by his or her wishes rather than by reality.
Furthermore, knowing the time spent on different types of activity does not give very relevant information about how lenses are used, since two people may behave very differently when doing the same activity. In short, the various tools developed to customize progressive lenses have no doubt improved presbyopic users’ vision but are still far from achieving the ideal solution that perfectly adapts to each patient’s needs.

Gaze dynamics – definition
Gaze dynamics can be defined as the movement of the direction of gaze over time. It involves the whole human oculomotor system (head, neck [1], eyes) while individuals follow, with their gaze, objects that draw their attention or that are relevant to the activity they are performing. This combination of head, neck and eye movements is unique to each person, as each individual has a different way of combining them [2].
For gaze dynamics to be considered reliable, the user's visual environment must be free of any visual impairment, meaning that no element in the system should prevent the individual from looking at objects naturally. So, we define the visual environment as the spatial area a person can reach by naturally performing a combination of eye and head movements.

Gaze dynamics: when it forms and how it changes
Development of perceptual [3] and oculomotor abilities of vision occurs as the necessary anatomical and physiological maturation of the visual structures is acquired. The visual system is immature at birth.
The motor response is limited due to the immaturity of the fovea, reduced visual acuity, and low contrast sensitivity. All of this limits fine control of accommodation, tracking movements, and precision of monocular and binocular alignment. A baby will be able to demonstrate a range of eye movements that, while less sophisticated than those of adults, indicate normal visual development [4].
Similarly, as a person becomes older, some muscles and joints become less and less flexible, and tracking relies increasingly on eye movements. In other words, gaze dynamics changes throughout a person's life.

Fig. 1: Tracking movements

Eye movements
Eye movements [5] depend on the oculomotor system and have two main functions: to locate the image of the objects in the visual field in the fovea and to keep the image in that position. Among the various types of eye movements [2], tracking movements (Fig. 1) are the most relevant in the study of gaze dynamics.
Tracking movements are oscillatory, rapid and have a clear fixation stimulus [6]. They are voluntary and are produced in a coordinated manner by both eyes. Their function is to maintain the gaze against involuntary eye drift. They also regulate fixation [7] in vergence movements [8].
These movements are brief and intermittent in babies, interrupted by saccadic movements that help restore fixation. Tracking movements are not smooth until 8-12 weeks of age. Horizontal tracking appears before vertical tracking (as is the case for saccadic movements). The individual also responds sooner to fixation targets moving at constant speed and only later to those moving at variable speed.

Fig. 2: Torsional VOR (Vestibulo-Ocular Reflex)

Head movements
These are necessary to observe the whole visual environment. Their function within gaze dynamics is to compensate for eye movements to ensure correct fixation on the fovea. Head movements are defined by the head-neck [1] combination. From a mechanical point of view, this is defined as a set of masses supported by a practically rigid system (the skeleton) connected by viscoelastic soft tissues.
When the head is tilted, the oculomotor system causes a torsion of the eyes intended to compensate for the loss of verticality that the retinal image would otherwise suffer. The torsional Vestibulo-Ocular Reflex (VOR) is responsible for this process (Fig. 2). So, when the head tilts to the right, the eyes rotate to the left, while tilting to the left causes a rotation of the eyes to the right.
In short, the visual system is responsible for compensating for the displacement generated in the image due to a head movement. In this way, gaze dynamics is the mechanism that ensures that the image will be as static as possible on the retina all the time.

Gaze dynamics and progressive lenses
When the concept of gaze dynamics was introduced above, we talked about the required absence of impairments in the visual environment, to make sure a person can perform eye and head movements without limitations. Things change, however, when progressive lenses are used. Progressive lenses have clear areas for all visual distances but, as a side effect, they also have areas, typically on the sides of the corridor, with aberrations that the user perceives as blurring. This is usually known as unwanted astigmatism. It means that the visual environment will be limited, and therefore the user's gaze dynamics will be altered.

Fig. 3: Virtual Reality headset

Gaze dynamics measurement tool: Virtual Reality headsets
There are many commercial devices that allow the measurement of gaze dynamics. Eye trackers are among the most common ones. These devices can trace the user's gaze direction while he or she is exposed to different kinds of stimuli. Many studies use the measurement of gaze dynamics to understand the preferences and priorities of users when they are presented with a lot of information simultaneously or when looking at different stimuli that hold their attention.
However, gaze dynamics applied to ophthalmic lenses also needs to track the user's head movement to truly understand the real use of each area of the lens. In other words, recording only the gaze direction might not give the full picture of lens use, since the position of the head will ultimately determine through what area of the lens the gaze will pass.
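To make this concrete, here is a minimal geometric sketch in Python of why head pose matters: the lens is fixed to the head, so the point where the gaze pierces the lens plane depends only on the eye's rotation relative to the head. All names and numbers (e.g. the 14 mm vertex distance) are illustrative assumptions, not values from any particular measurement system.

```python
import numpy as np

def lens_crossing_point(gaze_dir_head, vertex_distance=0.014):
    """Where a gaze ray crosses the spectacle-lens plane.

    gaze_dir_head   : unit gaze vector expressed in the head frame
                      (for a fixed target, moving the head changes this
                      vector; moving the eyes changes it directly).
    vertex_distance : assumed eye-to-lens distance in meters (14 mm).
    Returns (x, y) on the lens plane, in meters.
    """
    t = vertex_distance / gaze_dir_head[2]   # ray parameter at the lens plane
    hit = t * gaze_dir_head                  # ray starts at the eye's center
    return hit[0], hit[1]

# Example: gaze lowered 20 degrees relative to the head
# (e.g. reading without lowering the head):
theta = np.radians(-20.0)
gaze = np.array([0.0, np.sin(theta), np.cos(theta)])
x, y = lens_crossing_point(gaze)
print(f"gaze crosses the lens at y = {1000 * y:.1f} mm")   # about -5.1 mm

# If the same target is reached by lowering the head instead, the gaze
# vector in the head frame stays near (0, 0, 1) and the crossing point
# stays near the fitting point: a different lens area for the same task.
```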
Recent advances in virtual reality headsets (Fig. 3) have brought this technology to a point that makes it particularly suitable as a gaze dynamics measurement device. A virtual reality headset with specially designed software can be used to diagnose the gaze dynamics of a specific patient, i.e. by enabling precise measurement of eye and head movements in a controlled environment free of external factors.
This immersion allows users to live the experience in first person, as they see as if they were really inside the virtual world, minimizing the sensation of a user-device interface and establishing a direct relationship between the two. Virtual reality headsets are composed of cameras and sensors that integrate the subject into the virtual environment and precisely record the user's movements in it. They also ensure an optimal binocular visual field, which gives an unrestricted space by simulating natural gaze movement. Speakers are usually present to complete a fully immersive sensory experience. In addition, given the three-dimensional nature of these systems, they also allow gaze dynamics to be evaluated at many visual distances, to register whether there are significant differences while the person looks at a stimulus at near, intermediate or far distance.

Fig. 4: Example of a frequency map. Red color shows high frequency of use of that area while blue color shows low frequency of use.

Diagnosis: frequency maps
Gaze dynamics diagnosis for ophthalmic applications can be conveniently represented using frequency maps (Fig. 4). These maps can be readily calculated by taking into account eye and head movement, and they allow us to identify which areas of the lens will be used with higher or lower frequency.
Many parameters can be extracted from each of these maps. Overall size is important, since it is related to the total area of the lens that is used. The vertical-to-horizontal size ratio can also be relevant, since it shows a preference for vertical eye or head movements over horizontal ones.
And, obviously, the actual location of the maximum frequency areas should also be taken into account, since it reveals user preferences for certain parts of the lens. Finally, when maps registered at different distances are combined, the quality of the gaze dynamics study for that person can be greatly increased. All the information relevant for a fully personalized lens design can be revealed.
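As an illustration of how such maps and the parameters mentioned above could be derived, here is a minimal sketch in Python. It assumes the gaze-through-lens points have already been computed from combined eye and head tracking, and the descriptor definitions are simplified stand-ins for whatever a commercial system actually uses.

```python
import numpy as np

def frequency_map(points_mm, extent_mm=30.0, bins=60):
    """Build a lens-use frequency map from gaze-through-lens points.

    points_mm : (N, 2) array of (x, y) lens-plane crossing points in mm.
    Returns the normalized map plus simple shape descriptors.
    """
    h, _, _ = np.histogram2d(points_mm[:, 0], points_mm[:, 1], bins=bins,
                             range=[[-extent_mm, extent_mm],
                                    [-extent_mm, extent_mm]])
    h /= h.sum()                              # relative frequency of use

    used = h > 0                              # note: axis 0 of h is x
    x_idx, y_idx = np.nonzero(used)
    overall_size = used.sum()                 # total used area (in cells)
    v_to_h_ratio = (np.ptp(y_idx) + 1) / (np.ptp(x_idx) + 1)
    peak_cell = np.unravel_index(np.argmax(h), h.shape)
    return h, overall_size, v_to_h_ratio, peak_cell

# Example with simulated near-vision gaze samples (clustered low and nasal):
rng = np.random.default_rng(0)
pts = rng.normal(loc=[-2.0, -8.0], scale=[3.0, 5.0], size=(10_000, 2))
_, size, ratio, peak = frequency_map(pts)
print(size, round(ratio, 2), peak)
```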

 

Gaze dynamics provide truly personalized lenses
We know that user satisfaction with progressive lenses is subjective, i.e. it depends on each person's perception. This means that, when trying two progressive lenses of the same category (e.g. premium) but from different manufacturers, two users may feel more comfortable with one over the other.
The reason is that optical features that differ significantly between two designs can be perceived and accepted differently by different users; but this also tells us that a preferred, ideal solution may exist for each person. That is why one of the major challenges in the ophthalmic sector has been to understand and characterize users, to try to identify which parameters could reveal their preferences. Accordingly, the major challenge in the lens design field is to provide a wide enough range of optical features in progressive lens designs to fulfill user preferences.
But nowadays the progressive lens market offers a finite set of solutions. This means that a user has a limited list of lens design options available at a given point of sale, often strongly biased by the previously established business relationship between the optical shop and its supplier.
Therefore, an unprecedented lens calculation technology needs to be created that can vary optical features continuously and gradually, without jumps, to accommodate any registered gaze dynamics. Other industries, such as aeronautics, dreamed many years ago of continuously changing the geometry of physical objects, applying what they call the morphing technique. Aeronautic engineers dreamed of modeling aircraft wing flaps in such a way that geometry changes could happen like those in a bird's wings, optimizing performance at any given moment.
The optical industry needs to incorporate a similarly disruptive technology into its designs to create infinite solutions for infinite individuals, enabling the generation of the optimal solution for each of them, with his or her frequency map as one of the main inputs. The result could finally be a truly tailor-made progressive lens for a particular user, one that takes into account the gaze dynamics diagnosis of that person's unique way of looking at the world.

In summary
The combination of gaze dynamics measurements using virtual reality headsets with the appropriate design flexibility can become the last frontier of progressive design personalization. Thanks to it, a presbyopic patient can be offered a progressive solution that perfectly matches his or her visual requirements. A careful study and interpretation of the gaze dynamics allows an accurate and reliable diagnosis for each patient, to create the solution that will be the preferred one for them. And perhaps more importantly, gaze dynamics measurement using a virtual reality device can take the lens shopping experience to an unprecedented, exciting level, where the lens selling step is no longer a disorienting moment of pain in which customers must cope with the idea of paying the highest cost of the spectacles for a round, transparent piece of material that the ECP prescribes. Instead, the customer now perceives that he or she has become an active part in shaping the lens features, through an immersive and unique experience.

Acknowledgements: Ivette Bruguera and Elisabeth Meliá

References:
[1] https://es.slideshare.net/Natt-N/movimientos-del-cuello-55378548
[2] Alberich, J.; Ferre, A.; Gómez, D. Percepción Visual. Universitat Oberta de Catalunya.
[3] Wells, P.N.T.; Hendee, W.R. The Perception of Visual Information. Second Edition.
[4] Pacheco, M. Desarrollo Postnatal de la Función Visual. Universitat Politècnica de Catalunya.
[5] Motilidad y Percepción Binocular: Tipos de Movimientos Oculares. Universitat Politècnica de Catalunya.
[6] Antony, F. Coordination of Eye and Head Movements during Reading. Investigative Ophthalmology & Visual Science, July 2003.
[7] Federico, A.; Federighi, P.; Rosini, F.; Rufa, A.; Veneri, G. Evaluating gaze control on a multi-target sequencing task: The distribution of fixations is evidence of exploration optimization. Computers in Biology and Medicine.
[8] Pons, A. Fundamentos de Visión Binocular. Universidad de Alicante.
[9] Azadi, R.; De Vries, J.P.; Harwood, M.R. The saccadic size-latency phenomenon explored: Proximal target size is a determining factor in the saccade latency. Vision Research.

 

Pau Artús: The Chief Innovation Officer (CIO) at Horizons Optical studied Chemistry at the Universitat de Barcelona and received his M.Sc. in Molecular Magnetism at Indiana University. His ophthalmic career started in the R&D department of Indo in 2002, where he spent most of his professional career as a material chemist and coatings developer until becoming R&D manager in 2008. In parallel, he obtained his Ph.D. in Mechanical Properties of Plastic Materials for Ophthalmic Lenses (UPC, 2009) and a master's degree in Innovation Management (UPF, 2011). In 2017 Horizons Optical was created as a spin-off of Indo, where Artús was initially Operations Director and later became CIO.

Demystifying the dip coating process


Hard coating of ophthalmic lenses

For most ophthalmic laboratories, the most critical of the many lens manufacturing steps is certainly the hard coating process. This is probably linked to the fact that it is a chemistry-related process and not a purely mechanical one. Most laboratory managers have a good knowledge of optics, mechanics and IT, but little or no knowledge of chemistry. Besides, machine manufacturers share the same knowledge profile as their customers. Most of them believe that there is a cleaning/rinsing operation before the dip coating of the lenses, but this is not really the case. Some others may know that it is not really the case, but since they also sell consumables, including lacquers, it is in their interest to maximize the consumption of the chemicals in their machines. By François Breton

Image by artographer34

Actually, there are two parts in a dip coating machine: the first part is the lens surface preparation, and the second is the dip coating itself. The truth is that 90% of the problems observed when taking lenses out of the machine come from the first part. Why is that? Mostly because the key ingredient of this process is water, and moreover tap water! This process is not at all a cleaning process, because the lenses are already clean when they are placed in the machine. The point is that the lens substrates are made of organic polymers with a given porosity. These polymers need to be 'activated' in order to create a chemical bond with the hard coating lacquer. This activation requires a strong alkaline species (Fig. 3) in order to withdraw protons from the substrate. In order to maximize the effect of either sodium or potassium hydroxide, it is necessary to use a surfactant for good wetting of the lens surface, since lens materials are more hydrophobic than hydrophilic.
Ultrasonic waves also help a lot in that activation process, although in most machines the transducers are placed under the tanks, which leads to waves going from bottom to top and therefore not hitting the lens surface directly. This is the reason why activation times may be quite long, increasing with the lens refractive index, as high-index substrates are much less chemically reactive than the standard 1.5 material. Once this activation is achieved to a significant level comes the rinsing stage, and this is where most commercial dip coating machines get it wrong …

A surfactant is like a soap

We can compare what happens to the lenses to what happens to our hands when we wash them. In such a case, it is obvious that the soap goes away much faster with warm than with cold water, and with hard than with soft water. Yet, as can be seen on those machines, the tap water rinse tanks are neither heated nor ultrasonically powered. As a consequence, surfactants are carried forward into the DI water tank, which, even when fitted with heating and ultrasonics, is not able to remove the surfactant.

What then happens is that the remaining surfactant, carried forward by the lenses, is finally removed from them by the solvent of the lacquer. This leads to a lacquer contamination which makes it expire faster than it should. This is rather easy to understand, and it points out the flawed design of many commercial dip coating machines. In some cases, it can be even worse, as there is sometimes an extra tank with an acidic soap, supposed to neutralize alkaline residues. This would make sense if the process were simply a cleaning/rinsing one. But it is not!

Rinsing with an acidic soap is nonsense, because it actually reduces the activation level at the surface of the lenses. It is much better to replace the acidic soap with straight tap water if such a tank is available. Not only will it not decrease the surface activation, but it could also improve the rinsing process, because that tank is generally fitted with heating and ultrasound.
Of course, since it is a static tank, this water will need to be changed from time to time, the frequency depending on the number of lenses that go through it. In some cases, the city water may be too hard (presence of Ca2+ and/or Mg2+ cations); in such a case, it is advisable to soften it (meaning replacing the alkaline-earth cations with Na+) in order to avoid limescale deposits on the surface of the lenses. Ideally, this should be monitored with a conductivity meter.

Measurement of the ionic strength

Conductivity is a way to measure the ionic strength, that is, the power to remove surfactants from the lenses. An ideal situation is when the city water conductivity ranges between 300 and 400 μS·cm⁻¹. If it is lower, the ionic strength is insufficient and it may be necessary to increase the temperature and/or the rinsing time. If it is higher, the resins used to maintain the low conductivity of the DI water may get saturated too fast.
It should be considered that the maximum conductivity of the DI water to ensure optimum results is 0.3 μS·cm⁻¹. When the DI water conductivity becomes higher, dots start to appear on the lens surface, as the alkaline pH value of the lens favors the formation of carbonates, given the fact that there is always a significant amount of CO2 around the machine caused by the breathing of the operators.

No water should be left at the surface

Once the lenses have been immersed in the DI water, it is important to check that there is no water left on their surface when they come out of the tank during a slow (less than 1 mm/s) lift-out. This is a sign that the lens activation and rinsing have been performed well. There might still be some water on the lens holders, and this is why good IR drying is necessary, in order not to introduce any water into the lacquer.
However, when using a water-based primer, it is obviously no longer necessary to dry the lenses prior to entering the tank. Otherwise, the IR power for drying lenses and lens holders can be set to maximum, as all remaining water needs to be removed to avoid any lacquer contamination.
Nevertheless, some lens substrates may have lower heat resistance and may distort under the combined effect of the temperature and the pressure from the lens holders. In such a case, it is necessary to reduce the IR power while increasing the drying time.
Drying times depend mostly on the relative humidity and the air flow within the machine, which is why it is impossible to set a general rule for that stage. In any case, it is necessary to allow the lens and its holder to cool down prior to entering the lacquer tank. There should not be a temperature difference of more than 10°C with the lacquer temperature, in order to avoid the formation of micro bubbles at the lens surface. These micro bubbles are actually made of fast-evaporating solvents, such as methanol or ethanol, that get trapped within the already nearly dry hard coating surface. This phenomenon is also likely to happen if the IR drying station next to the lacquer tank delivers too much power too early. That is the reason why it is desirable to have either a programmable IR pre-curing station or two contiguous IR stations, the first one delivering half power before going to full power in the second one.

The dip coating process

The most complex and critical stage in the dip coating process is, of course, the dip coating itself! Nevertheless, it has been studied intensively by two Soviet scientists, Lev Landau and Veniamin Levich, who described it back in 1942 with their famous equation for calculating the coating thickness:

h = 0.94 · (η·U)^(2/3) / (γLV^(1/6) · (ρ·g)^(1/2))

h = coating thickness, η = viscosity, U = withdrawal speed, γLV = liquid-vapour surface tension, ρ = density, g = gravity.

Fig. 1: Stages of the dip coating process: dipping of the substrate into the coating solution, wet layer formation by withdrawing the substrate and gelation of the layer by solvent evaporation.

 

As can be expected, the coating thickness is mainly defined by the withdrawal speed, the solid content and the viscosity of the liquid. The withdrawal speed must be chosen such that the shear rates keep the system in the Newtonian regime; in that case the Landau-Levich equation becomes very accurate. As shown by James and Strawbridge for an acid-catalyzed silicate sol, thicknesses obtained experimentally fit very well to the calculated values.
The interesting part of dip coating processes is that, by choosing an appropriate viscosity, the coating thickness can be varied with high precision from 20 nm up to 50 μm while maintaining high optical quality. The schematics of a dip coating process are shown in figure 1.
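As a quick illustration, the equation is easy to evaluate numerically. The sketch below uses assumed lacquer properties (they are illustrative, not from the article), and note that it gives the wet-film thickness; the dried, cured coating is thinner once the solvent has evaporated.

```python
import math

def landau_levich_thickness(eta, u, gamma_lv, rho, g=9.81):
    """Wet-film thickness h from the Landau-Levich equation.

    eta      : dynamic viscosity [Pa*s]
    u        : withdrawal speed [m/s]
    gamma_lv : liquid-vapour surface tension [N/m]
    rho      : density [kg/m^3]
    Valid for Newtonian behaviour at the chosen shear rates.
    """
    return 0.94 * (eta * u) ** (2 / 3) / (gamma_lv ** (1 / 6) * math.sqrt(rho * g))

# Assumed values: 10 mPa*s lacquer, 2 mm/s withdrawal, 25 mN/m, 1000 kg/m^3
h = landau_levich_thickness(eta=0.010, u=0.002, gamma_lv=0.025, rho=1000.0)
print(f"wet film thickness: {h * 1e6:.1f} um")   # about 13 um
```

Doubling the withdrawal speed in this regime increases the thickness by a factor of 2^(2/3), i.e. roughly 1.6, which is why the withdrawal speed is the primary control knob.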

 

Fig. 2: Gelation process during dip coating process, obtained by evaporation of the solvent and subsequent destabilization of the sol (after Brinker et al).

 

 

When reactive systems are chosen for coatings, as is the case in sol-gel types of coating using alkoxides or pre-hydrolyzed systems – the so-called sols – control of the atmosphere is indispensable. The atmosphere controls the evaporation of the solvent, and the subsequent destabilization of the sols by solvent evaporation leads to a gelation process and the formation of a transparent film, thanks to the small particle size in the sols (nm range). This is shown schematically in figure 2.

 

 

In general, sol particles are stabilized by surface charges, and the stabilization condition follows Stern's potential consideration. According to Stern's theory, the gelation process can be explained by the charged particles approaching each other to distances below the repulsion potential. The repulsion then changes to an attraction, leading to very fast gelation. This takes place at the gelation point, as indicated in figure 2.
The resulting gel then has to be densified by thermal treatment, the densification temperature depending on the composition. But due to the fact that gel particles are extremely small, the system shows a large excess energy and, in most cases, a remarkably reduced densification temperature compared to bulk systems is observed.
However, it has to be taken into consideration that alkaline diffusion in conventional glasses, like soda lime glasses, starts at several hundred degrees centigrade and, as shown by Bange, alkaline ions diffuse into the coated layer during densification. In most cases this is no disadvantage, since the adhesion of these layers becomes perfect, but the influence on the refractive index has to be taken into account in the calculations for optical systems.
Of course, this thickness calculation mostly corresponds to the value measured at the center of the lens. Ideally, to get a homogeneous thickness over the lens, a speed gradient should be applied, but only a few machines allow it. Yet this is the best way to overcome the famous tear problem on bifocal lenses along the segment.
Some machine manufacturers that cannot offer the gradient speed feature recommend either an extremely slow speed, leading to insufficient coating thickness to guarantee scratch resistance, or an extremely high speed, leading to potential cracking of the coating layer during the curing stage.
The best solution is to run those lenses at the nominal speed specified by the lacquer supplier, followed by the maximum acceleration the robot can deliver, as this increases the centrifugal effect and expels the tear off the lens. Anyway, with the spread of freeform lenses in the market, bifocal lenses are likely to become just a piece of ophthalmic history.

Back to what creates the chemical bonding of the lacquer to the lens. It is of course linked to the chemical composition of the lacquer, which must contain a certain amount of epoxy groups to ensure maximum adhesion to the lens substrate, as can be seen in figure 3. It is clear that with this covalent bonding an extremely strong adhesion level can be reached, making it difficult to remove the coating layer.

 

Fig. 3: Reactions occurring in the machine during the pre-curing of the hard coating.

 

 

 

Last but not least…

It is of prime importance to control temperature and relative humidity (RH) within the machine. However, because of the low flash point of methanol and its high evaporation rate, methanol-based lacquers require a tank temperature around 10°C. Yet, when RH is set at 40%, condensation occurs when the temperature difference with the air gets higher than 6°C. In practice, this requires operators to work in a room where the temperature is below 16°C; otherwise (as is generally the case) condensation occurs, reducing the pot life of this family of lacquers, which was originally developed for fast drying and high throughput.
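The condensation risk can be estimated with the standard Magnus dew-point approximation; the sketch below is a generic illustration of the principle (a surface colder than the dew point of the surrounding air collects condensate), with illustrative room conditions rather than the article's exact machine settings.

```python
import math

def dew_point_c(air_temp_c, rh_percent):
    """Dew point via the Magnus approximation (roughly 0-60 °C range)."""
    a, b = 17.62, 243.12
    gamma = math.log(rh_percent / 100.0) + a * air_temp_c / (b + air_temp_c)
    return b * gamma / (a - gamma)

def condenses_on(surface_temp_c, air_temp_c, rh_percent):
    """True if a surface (e.g. a 10 °C lacquer tank) sits below the dew point."""
    return surface_temp_c < dew_point_c(air_temp_c, rh_percent)

# A 10 °C methanol-lacquer tank in a 22 °C room:
print(condenses_on(10.0, 22.0, rh_percent=40.0))  # False (dew point ~ 7.8 °C)
print(condenses_on(10.0, 22.0, rh_percent=60.0))  # True  (dew point ~ 13.9 °C)
```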

Conclusions

It is clear that the outcome of a dip coating process is purely driven by physical and chemical parameters. Knowing this, there are no surprises in the results obtained from it. Machine manufacturers must understand that, from now on, actual and potential customers will no longer accept equipment that does not comply with the minimum requirements needed to ensure the maximum quality of their lens production.

Author
Breton graduated with an engineering degree in chemistry, followed by a PhD in organic chemistry and a postdoc in polymer materials. He started his career in big US companies such as GE and Goodyear before entering the optical sector back in 1999 with SDC Coatings. He successively worked with NGL and FISA before creating his own company, named Chemoptics. This company later became HPMAT (High Performance MATerials) to reflect activities outside the optical sector, and a JV was created in Russia under the name FBOptics but, as can be expected, it is now 'on hold' until further notice!

 

Tinting – from gut feeling to process control


A software-based approach to improve process control in a tinting lab

Automation and digitalization are omnipresent. Their benefits on the shop floor in terms of costs, quality and throughput are well acknowledged. Now a promising approach exists to introduce a new level of control and predictability to the process of tinting plastic ophthalmic lenses. A software tool may soon supply the missing quantitative link between the in-process parameters and the outcome, i.e. the color of the tinted glass. By Peter Weber

 

For this approach, Dr. Klaus Dahmen has merged his years of experience in ophthalmic R&D with his background in surface physics to set up a new model of diffusion processes in ophthalmic lenses. In close cooperation with tec5, manufacturer of the well-established TFM spectrometer, he now presents a software package that communicates directly with the TFM. It yields a new, intuitive representation of the pigments' effect in the lens and of the results of the tinting process.

Finally, it links spectra (= colors) to process parameters (= recipes) and vice versa. That opens completely new opportunities to make the output of the tinting labs more predictable. In parallel this will stimulate equipment suppliers to keep up with the emerging opportunities of automation and provide a new generation of tinting hardware. Recent developments on the shop floor are governed by automation and digitalization. While the second half of the last century saw the rise of information technology, nowadays we are still far from tapping its full potential. Algorithms, machine learning and artificial intelligence provide great opportunities to better control our production processes and subsequently improve quality, increase throughput and reduce costs. However, progress can only be achieved step by step. Implementation within existing processes needs the courage to invest in new ideas.

 

Beacon of automation – surfacing

A prime example in the ophthalmic industry of the consistent implementation of new concepts is the lens surfacing process. Within the last ten years, automation and in-process data acquisition have been implemented on many shop floors. Other parts of the ophthalmic process chain lag behind. Integration of the coaters into a fully automated process chain is an unsolved task. And, most remarkably, the whole tinting procedure for prescription lenses is still a fully manual process with only a minimum of automation, data acquisition and systematic process control.

 

Mostly manual – tinting

One must pay maximum tribute to the operators in the tinting labs. With years of experience, combined with a perfect gut feeling and a lot of patience, they manage to create impressive output during tinting season, year by year. But the other side of the coin: each tinting season is an ordeal, a test of the nerves of operators, process engineers and management. With only a minimum of process control, setbacks in terms of output are more the rule than the exception. Keeping up with delivery schedules puts an enormous amount of stress on the operators. And whenever setbacks cannot be absorbed, unsatisfied customers are the result. Uncontrolled rework loops create an equally uncontrolled rise in effort and costs. And eventually, the planning of manpower in a tinting lab comes close to squaring the circle. Training new operators may take months, and only operators with years of experience and frequent practice can manage the job to its full extent. This severely conflicts with the highly volatile demand for sunglasses over the year, which requires quick and flexible allocation of personnel. With a small portfolio of just a few standard colors, a tinting lab may be well manageable in the "classic" way. But if you wish to exploit the opportunities of the sunglasses market to their full extent, you need to adapt quickly to fashion and offer a wide range of colors to be in vogue.

 

Make tinting predictable

The potential for improvement in terms of stable, reliable, and predictable tinting processes was well known to Dr. Klaus Dahmen when, in mid-2020, he established his own business in the Rhine-Main area (Germany). Based on his years of R&D experience with a global player in the ophthalmic industry, he committed himself to the vision of a software that would establish a sound basis for well-controlled processes in any tinting lab. To accomplish that, the gap between the process result (i.e. the tinted glass) and the process parameters (e.g. pigment concentration, temperature, time) had to be closed. After half a year of theoretical work and programming, and reality checks with hardware expert Jürgen Krall (IPS), in January 2021 the software was ready to be calibrated in practical tests.

Visualization of the effect of tinted glasses on sunlight. Source: K. Dahmen

The approach is based on the characterization of glasses by their transmission spectrum. It embraces the TFM, the ophthalmic standard spectrometer of metrology specialist tec5. Since September, Dahmen and tec5 have been cooperating closely on aligning interfaces. Soon the software will be available with direct access to the TFM and the respective data.

 

Merging process and metrology

At tec5, Dahmen's activities were received very positively. Steffen Piecha, head of Sales, was involved in the TFM's development (with Rodenstock) from the very beginning. As tec5 specializes in process analysis by means of optical methods, in his eyes Dahmen provides the crucial missing link to process control, whereas up to now the TFM has mainly been used to ensure compliance with legal regulations or for sample quality control.

Nikolaus Petzold, Managing Director of tec5, points out that the TFM is a very much alive product, getting regular overhauls of the hardware and continuous updates of the software. Thus, the additional customer value of the TFM by means of the new software fits perfectly well into the product strategy. The benefit of the cooperation is mutual, as the TFM is one of the few spectrometers that measure valid spectra independently of the lens's refractive power.

 

About colors and wavelengths

Light is an electromagnetic wave and can be characterized by its wavelength. Our eye is, technically speaking, a sensor for light, and the color we see depends on the wavelength of the light. The detection limits of this bio-sensor called the eye define the visible range of light, reaching from roughly 380 nm to 780 nm in wavelength.

In theory, each wavelength is related to a certain color. For example, light with a wavelength of 520 nm would be seen as green by the eye. In nature, no single-wavelength light exists; all light is a mixture of wavelengths.
The most prominent example is sunlight. It consists of all the colors of the rainbow, each contributing a different amount. Our eyes recognize this mixture as white light. If you measure the amount (intensity) by which each color (i.e. each wavelength) contributes, you determine the spectrum, in this case of sunlight. If you manage to remove a particular range of colors from the sunlight's spectrum, the remaining light appears colored. This is what you actually do with sunglasses.

 

Tinting – a highly complex process

The software calculates color contributions for given tinting times and pigment concentrations to predict the outcome of the process. Source: K. Dahmen

The tinting process is generally based on three pigments: red, yellow, and blue. The concept is to disperse a certain amount of each pigment in water, put the glasses into the water, and let the pigments diffuse into the surface of the glass. A higher concentration of pigment in the water means a higher amount in the glass afterwards. Within the glass, certain pigments then absorb certain parts of the sunlight, the amount of each pigment in the glass finally determining its overall color.

As simple as the basic concept may sound, the implementation can become just as complex. The pigments are large molecules, so the diffusion process into the glass needs to be accelerated by increasing the temperature of the bath. One may also add chemicals to the water to enhance the diffusion process. Already you can see quite a number of process parameters that ultimately influence the result, i.e. the color of the lens.
Next, the different speeds of diffusion within the lens for different pigments need to be considered. And the fact that diffusion is bidirectional comes into play, meaning that, depending on the conditions, pigments may also diffuse back out of the lens into the water. To add further complexity, each refractive index of a lens has its own diffusion parameters. Even different suppliers or different batches may make a difference. For the final impression of the lens's color, the depth distribution of the pigments is not of much importance; only the overall quantity counts. But to predict process parameters and the outcome of a certain process chain in terms of color, the depth distribution actually is very important, particularly when talking about rework steps or the mixing of light colors on the lens in different baths.
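The diffusion part of this picture can be sketched with a textbook model: Fick's second law in one dimension, with the bath setting the surface concentration. The sketch below is a deliberately simplified illustration (the diffusion coefficient and geometry are assumed values, and real pigment uptake also depends on bath chemistry); note how lowering the surface value would make pigment diffuse back out, the bidirectional effect described above.

```python
import numpy as np

def tint_profile(c_bath, diff_coeff, t_total, depth=50e-6, nx=100):
    """Pigment concentration vs. depth after tinting (Fick's second law, 1D).

    c_bath     : surface concentration imposed by the bath (arb. units)
    diff_coeff : diffusion coefficient [m^2/s]; rises with bath temperature
    t_total    : tinting time [s]
    """
    dx = depth / nx
    dt = 0.4 * dx * dx / diff_coeff        # explicit-scheme stability limit
    c = np.zeros(nx + 1)
    for _ in range(int(t_total / dt)):
        c[0] = c_bath                      # surface in equilibrium with bath
        c[1:-1] += diff_coeff * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
        c[-1] = c[-2]                      # no flux deep inside the lens
    return c

profile = tint_profile(c_bath=1.0, diff_coeff=1e-14, t_total=600.0)
print(f"overall uptake (determines the color): {profile.sum():.2f}")
```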

With years of experience, operators in a tinting lab get to know their tinting parameters by heart. Nevertheless, for the aforementioned reasons, an industrial process needs a certain level of systematic backup. Dahmen is convinced that his new software approach will not make the experienced operator obsolete but will support him. It will increase and stabilize the output in terms of quality and quantity. A positive side effect: this will improve the working conditions in the tinting lab by reducing stress and taking unreasonable responsibilities off the shoulders of the operators.

The software

As Dahmen illustrates, the basic scope of the new software is to link process parameters to the spectra of tinted lenses and vice versa. As tinting is based on a dynamic diffusion process, his simulations do not only predict the amount of the three different pigments in the lens. Going one step further, he integrated a model of the time-dependent diffusion process within the lens to account for the distribution in depth. The whole simulation finally consists of three sub-models: the first sub-model describes the state of the bath and the respective diffusion process into the lens surface. The second sub-model describes the diffusion process of the pigments within the lens. And the third sub-model describes the resulting absorption spectrum as a function of the aforementioned pigment distribution in the lens.

One basic paradigm shift in Dahmen's approach is to leave the established path of the three-dimensional CIE L*a*b* representation, as found in the DIN ISO standard. As he states, for certain topics the color space is an important model, but it is far too abstract to deduce parameters for the shop floor. Instead, he chose a completely new way of describing the "coloring power" of the pigments in a glass, in terms of a relative number in percent for each of the three colors. The baseline for these relative values is a set of reference-tinted lenses, where each of the three pigments is applied separately under given standard conditions. These relative values then completely characterize a lens in terms of color with respect to the used set of pigments. The core feature of the software is to calculate these three relative values (for red, yellow, and blue) from a given spectrum, or vice versa. Then, with the three relative values for a spectrum being known, the software can deduce the necessary process parameters to reproduce exactly this spectrum.
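To make the last step tangible: if one assumes, as in the Beer-Lambert law, that the absorbances of the three pigments add linearly, the relative values can be estimated from a measured spectrum by a least-squares fit against the reference spectra. This is only a sketch of the general idea under that assumption, not Dahmen's actual algorithm.

```python
import numpy as np

def pigment_contributions(measured_t, reference_t):
    """Relative pigment values (in %) from transmission spectra.

    measured_t  : (n_wavelengths,) transmission of the tinted lens
    reference_t : (n_wavelengths, 3) transmission of reference lenses,
                  each tinted with one pigment (red, yellow, blue)
                  under standard conditions
    """
    # Work in absorbance, where contributions are assumed additive:
    ref_a = -np.log10(np.clip(reference_t, 1e-6, 1.0))
    mea_a = -np.log10(np.clip(measured_t, 1e-6, 1.0))
    weights, *_ = np.linalg.lstsq(ref_a, mea_a, rcond=None)
    return 100.0 * weights    # percent of each standard reference tint

# A result of e.g. [55, 20, 25] would read: the lens carries 55% of the
# standard red reference tint, 20% of the yellow and 25% of the blue.
```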

 

Benefits

Intuitive visualization of measurement results for the operators: relative contribution of each pigment and its respective specifications for each of the two lenses. Deviations between the two lenses are depicted as well. Source: K. Dahmen

According to Dahmen, the software will support and considerably accelerate the development of new processes by automatically suggesting recipes for given colors (i.e. spectra). Additionally, the software will be able to reconstruct the necessary tinting process for existing lenses, e.g. for old reference samples. Furthermore, the software can be used for quality control: continuous sample measurements can be evaluated, and statistical envelopes can be generated in the spectrum graphs to represent the wavelength-dependent spread of the process. This leads to control charts which will indicate upcoming process instabilities. The software can even quantify the respective amounts of pigments within a measured sample and track them in the same way. In case of color deviations, the software will make suggestions on how to adjust the process to bring it back into specification (i.e. adjustment of pigment concentration, temperature, tinting time). Another prominent application could be the scenario of switching to another supplier of pigments. Based on sample measurements with the new set of pigments, the software could calculate new recipes for the future set of pigments from the old ones.

Another application of the software: it is well known that the suppliers of pigments suffer from certain batch-to-batch variations. It may be possible to qualify a new batch by tinting standard samples and evaluating the results in comparison to earlier batches. It may even be possible to automatically create suggestions on how to compensate for these variations in the process. The systematics of the software may also provide the means to align different production sites and make sure that the same sunglasses from different sites actually look the same.
Last but not least, the software will provide support for training new operators and will assist experienced operators by comparing target to actual parameters of the tinted lenses in an intuitive graphical representation.

 

Status and outlook

Just recently, as Dahmen reports, compatibility testing and validation with the TFM spectrometer were successfully finished in cooperation with tec5. The user can now take control of the TFM, carry out measurements directly from the software and get full access to recent measurement results and data. This means that, as of now, the software with the above-mentioned features is available and usable for any owner of a TFM spectrometer. At the bottom line, this is a very promising approach to introduce a new level of process control in tinting labs. The key innovation is the establishment of a link between the measured outcomes of tinting, i.e. the spectra of the lenses, and the initial process parameters, which actually determine this outcome. The software has high potential to reduce the effort in the development of new recipes, to reduce rework, and, on the whole, to stabilize the processes in terms of cost, quality and output. As Dahmen points out, it will not provide 100% automation of the tinting processes but will support developers and operators by supplying them with measurable, quantified data that were never available up to now. Of course, this new approach also opens new opportunities for manufacturing equipment. Making process results measurable and feeding them back into control parameters will demand a next generation of hardware with enhanced in-process data acquisition as well as improved means to control the process. ◆

 

Peter Weber

The author is a professor at the University of Applied Sciences in Frankfurt and works as a journalist in the field of science and technology. In the past he managed production and engineering departments in the semiconductor and ophthalmic industries. www.connects-by-knowledge.de

 

Interview Partners

1. Dr. Klaus Dahmen, entrepreneur: expert for tinting processes, surface physics and software implementation.
2. Nikolaus Petzold, Managing Director, tec5 AG: years of experience as head of sales in the sensor business with strong scientific background in optical metrology.
3. Steffen Piecha, Head of Sales, tec5 AG: 20 years of experience with tec5.

Sustainable manufacturing for lens generating

Filtration and waste management started as a means to save companies money by reusing resources throughout their processes. As the world has turned to sustainability and improving our ecological footprint, filtration and waste management technology has evolved, offering equipment that not only helps the environment but helps the business too. In the optical and lens generating industry, we see this happening at an increasingly rapid rate. By Jamal J. El-Hindi

Optical lens generating requires hefty amounts of water and coolant to maintain clean and precise machining. As in many industries, it only made sense to invest in ways to reuse water and coolant, reducing operating costs.

The Environmental Protection Agency of the United States defines sustainable manufacturing as “the creation of manufactured products through economically-sound processes that minimize negative environmental impacts while conserving energy and natural resources”. Filtration in industry has always been an element of sustainable manufacturing. Although environmental conservation has not always been a priority in manufacturing, filtration has been a means to conserve natural resources and save money.

 

Filtration — how to deal with waste?

Fig. 1: Bigger labs generate more waste.

Optical is by far not the only industry reusing coolants, oils, water, etc. Filtration has been a profound part of many other industries' processes for a long time. So much so, that large industries turn to specialized filtration companies to provide entire filtration systems.
In the optical industry, especially in the United States, we have only recently seen a rise in smaller labs expanding in size. This transformation brings all sorts of new obstacles to overcome, making it more practical to implement the same filtration practices used in other industries.

Most smaller labs got along just fine without any special equipment for filtration or waste. As a lab gets bigger, however, it generates more waste, or plastic "swarf". Swarf generated from lens grinding takes up a lot of volume, and as labs grow, we see costs such as garbage disposal, hazardous waste removal, coolant, storage, and water increase. Just the increased volume of swarf alone can cost labs a lot of money, not to mention how much coolant is being thrown away with that swarf. There is often more alloy and spent polish that needs to be disposed of as well. Alloy, being a hazardous material, has to be taken away by specialized hazardous waste companies that treat the waste-water. In some areas polish is also regulated and must be treated as hazardous material.

These issues concern not only the bottom line but also our environment. The waste from making lenses takes up a lot of landfill space. Plastics such as polycarbonate and other thermoplastics used in lens manufacturing do not decompose for hundreds of years. If not properly dried, the swarf is also saturated with coolant, which leaches into the ground. Alloy water and polish compounds cause concern for the environment as well: they contain hazardous metals that have contaminated entire sewage systems.
Optical is certainly not the first industry to encounter these hazards. The same issues have been present in several other industries for many years. Filtration specialists have been creating solutions for these issues, and only in recent history do we see some of that technology spreading to the optical industry.

One of the first obstacles to overcome is filtration itself. Filter technology has not only allowed companies to reuse their coolants, but has also provided clarity good enough to keep their equipment running longer. In the optical industry, cleaner coolant means changing the diamond-tip tooling less often. The filter must also allow the waste swarf to be discharged with as little coolant remaining in it as possible. In most industries you will see the filter with a long, steep discharge ramp. This allows sludge or swarf to drain for as long as possible before being discharged. In industries like copper production, we go as far as adding heaters and special reclaiming methods to dry and reuse as much of the waste as possible.

 

Labs start investing in briquetting technology

Fig. 2: Some labs start investing in briquetting technology.

Copper mills will often reclaim their waste and run it back into their production. Many industries have seen the need to either reclaim or condense their waste. Briquetters or compactors have played a large role in condensing and reclaiming.
The optical industry has recently begun implementing this technology. What started as a means to reduce the volume of swarf waste quickly became a means to retrieve more coolant. We are starting to see labs investing in briquetting technology with the goal of reducing their ecological footprint and helping the environment. This has proven to save manufacturers thousands in coolant costs per month, making it a perfect choice for sustainable manufacturing.
The problem with briquetters, however, is that most are not designed for optical swarf. The first few in the industry were sawdust briquetters, which proved just how difficult lens swarf is to deal with.
After several years of development, there are now some options for compacting swarf and retrieving more coolant than ever. With swarf volumes being reduced at a 20:1 ratio, and nearly all coolant being squeezed out of the swarf, this may be one of the most environmentally friendly advances in the industry.

Waste-water treatment

Fig. 3: DAc aqua distill water/waste recycler.

Many industries have had to implement systems for treating their waste-water because of how much they produce. As optical labs grow, so does their hazardous waste production. Some labs simply outsource their waste-water treatment by having companies pick up their hazardous materials. Some labs pour those hazardous materials down the drain. As environmental regulations increase and governments turn their focus to optical manufacturing, optical labs face some tough decisions.

 

Waste-water filtration systems

Filtration companies have been producing waste-water filtration systems for many years. Most of these systems were designed for much larger industries that typically have more space and more waste-water. All technology adapted to the optical industry must be functional and small enough to fit in an already crowded optical lab. In rising to this challenge, there are now new technologies available.

Fig. 4: Waste water treatment

Waste-water evaporators and waste-water treatment systems seem to be the leading products available to labs currently [Fig. 3, 4]. Evaporators, although they take time and a lot of electricity, are great for isolating hazardous materials and removing them from the water, making hazardous waste a lot easier and less expensive to deal with. Another waste-water treatment system uses a method that large waste-water treatment plants have been using for decades.

They use an environmentally safe clay-based chemistry to latch onto the hazardous materials in the water, separating them entirely. The hazardous material can then be extracted by several different filtration methods [Fig. 5].

 

Fig. 5: Waste water filtering.

One of the prominent methods is using a centrifuge, which separates and condenses the waste material. This allows each lab to safely discharge the waste-water knowing it is free of hazardous materials. This technology is now conveniently provided as a turn-key system for labs to treat their own hazardous process water, with the benefit of a smaller footprint.

 

The industry is growing and so are the challenges

As the optical industry continues to grow, there will be more issues to face, especially those regarding our environment. This can be especially difficult when policy, practice, and procedures vary across lens manufacturers.
It is difficult to compete when regulations in every country are different and enforcement varies. It is critical, then, that sustainable manufacturing technology continues to improve, so that it not only helps protect our environment but can also be justified in cost savings. As this industry grows, it is on filtration and waste management that it will depend for sustainable manufacturing. Filtration, compacting, recycling and waste-water treatment technology will be necessary for labs to follow regulations without consequence.

Fig. 6: 100% recycled content.

As we look to the future of sustainable manufacturing, many are looking beyond simply reducing swarf waste and retrieving coolant. The industry has been asking what we can do with this waste and how to prevent it from ending up in a landfill. Many have been trying to answer these questions, and in the United States there is one group that has an answer. A development that is dramatically impacting sustainability in the lens industry is a new technology that enables the recovery and recycling of lens swarf into permanently sustainable products. From its Dallas headquarters, DEVCO Services is currently developing a North American network for the recovery of densified swarf. The company's CEO, Alex Rankin, commented: "We are focused on delivering value for the entire lens industry through recovery and conversion of swarf into a raw material used in production of quality products with positive environmental impact." Their success in developing and selling a product from compacted lens swarf has given hope to the entire industry. As optical labs allow other industries to play a part, we will see more options for a cost-effective means of helping our environment and practicing sustainable manufacturing. ◆

Author

Jamal is a sales and field engineer for Filtertech with a Master of Business Administration. As the third generation of this family-owned-and-operated company, he started at an early age building filtration equipment. After finishing his college career, Jamal began working as a field operator, fixing and maintaining filtration equipment across the world. With extensive knowledge of the mechanics and technology behind filtration, he began applying his experience in the field to engineering and design. Today, Jamal works directly with companies across the world, finding filtration solutions to meet the unique needs of multiple industries.

 

Pushing the frontier of automation


Picture by kalhh on Pixabay

Robots and digitalization for stable processes

Robotics is no rocket science anymore. Affordable and reliable modules reveal new views on automation. Experienced equipment suppliers profit from the opportunities. Based on existing process know-how, robotics merges with Industry 4.0 and empowers highly efficient and stable production equipment of the next generation. Labor cost reduction is far from the only benefit. Automation fundamentally stabilizes processes. Combining it with digital tools and smart metrology, real-time quality data can be generated and processed in an unprecedented way. Feeding this output back into the process will raise the level of process stability and quality significantly. Stable processes and a high quality level are the bedrock of competitive production in high-wage countries. In other words: robotics and digitalization are the tools to secure the future of high-quality production sites.
By Peter Weber

The complex process chain in ophthalmic lens production is subject to innumerable influencing parameters along each single process step. Individually small deviations of these parameters may add up and severely reduce the quality of the final product, the eyeglass. Thus, in theory, one gains control of the process by measuring and monitoring each parameter in a control chart and reacting instantly to significant deviations with adjustments.

 

Automation meets digitalization

Let us consider a very simple example. The control parameter “temperature” is measured by a sensor within a curing oven. This reading is fed back into the temperature-control electronics, which in turn adjust the magnitude of the parameter “heating current”, steering the measured temperature towards the target temperature. In this case, the process control is straightforward.
The next step is to monitor the more-or-less fluctuating temperature and link the actual oven temperature to each lens (or batch) in a database. This is the key to connecting deviations in quality with fluctuations along the process chain.
For this example of a simple temperature reading, data monitoring is just a technical question of connecting the sensor output to a digital database. Still, if your process depends on many such parameters, systematic monitoring becomes a challenge.
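To make this concrete, here is a minimal sketch of such a loop in Python. The functions read_temperature() and set_heater_current() are placeholders for the real sensor and control electronics, SQLite stands in for the lab database, and the target value and gain are illustrative assumptions:

    import sqlite3
    import time

    def read_temperature():
        # placeholder for the oven's temperature sensor
        return 118.7

    def set_heater_current(amps):
        # placeholder for the temperature-control electronics
        pass

    conn = sqlite3.connect("process_log.db")
    conn.execute("CREATE TABLE IF NOT EXISTS oven_log"
                 " (batch_id TEXT, ts REAL, temp_c REAL)")

    TARGET_C = 120.0    # assumed target curing temperature
    GAIN = 0.5          # proportional gain: amps per degree of deviation
    BASE_CURRENT = 4.0  # assumed steady-state heating current, in amps

    def control_step(batch_id):
        temp = read_temperature()
        # simple proportional control towards the target temperature
        set_heater_current(BASE_CURRENT + GAIN * (TARGET_C - temp))
        # link the measured temperature to the lens batch in the database
        conn.execute("INSERT INTO oven_log VALUES (?, ?, ?)",
                     (batch_id, time.time(), temp))
        conn.commit()

    control_step("batch-0815")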
At the other end of the complexity scale, consider the inspection step of the glasses after surfacing: here the measured data consist not only of a “few” overall parameters. For each glass an enormous amount of data is created, each item calling for elaborate algorithmic evaluation. On top of that, each glass needs individual handling to be positioned precisely in the sophisticated metrology setup. In this environment, the combination of robotics and digitalization unfolds its full benefit.

 

Strong process knowledge is a must

We all know that exact measurement of all parameters along a process chain can never be achieved, and consequently exact control of all influencing factors is not possible. The great challenge in mastering processes is determining which parameters influence process stability the most. Thus, no matter how sophisticated the metrology or data-analysis tools you use, the groundwork is sound process engineering with a deep understanding of the interdependencies within and between process steps. This is one core message of Jürgen Krall (IMACO Products and Solutions), based on years of experience in automation: the benefits of robotics and digitalization can only be harvested with a strong understanding of the processes.

 

Robots enable real-time process control

Proper understanding of existing processes yields well-defined interfaces for the new approaches of robotics and digitalization. We all know how tedious it can be to control a process based on three-times-a-day handwritten temperature and conductivity tables.
For real-time process control it is essential to have all the relevant data available in digital format. The lab effectively becomes an Internet of Things (IoT): every piece of equipment, every gauge, every sensor needs to be connected to a central database. And, as mentioned before, in many cases such large-scale data acquisition cannot be integrated into the existing process just “en passant”; many key measurements require extensive handling. This is why the new prospects of robotics and automation open completely new avenues for equipment suppliers.
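As a minimal sketch of what such a connection can look like (the endpoint URL and payload fields are assumptions for illustration; any central database with an HTTP interface would serve the same purpose):

    import json
    import time
    import urllib.request

    def publish_reading(sensor_id, value, unit):
        # push one gauge reading to the central lab database
        payload = json.dumps({
            "sensor": sensor_id,
            "value": value,
            "unit": unit,
            "timestamp": time.time(),
        }).encode("utf-8")
        req = urllib.request.Request(
            "http://lab-datahub.local/api/readings",  # assumed endpoint
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req)

    # e.g. a conductivity gauge on cleaning station 3
    publish_reading("cleaner-3/conductivity", 42.1, "uS/cm")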
In recent years many equipment manufacturers have extended their R&D activities into the fields of software and data management. Dr. Christian Laurent, head of development at A&R, reports that his company recently rolled out a newly developed DataHub in the field. The core of the system is a central database that communicates directly with the lab’s metrology about product quality and constantly feeds these data into intelligent analytical algorithms. By means of automation and digitalization an enormous amount of data can be evaluated statistically in real time, yielding patterns and interdependencies which in turn can be fed back to control the production process. Fluctuations in the process can thus be compensated before there is any violation of customers’ specifications.

 

Glimpse into the handling unit before brush cleaning. Source: J. Krall, IMACO Products and Solutions

Robots master complex handling steps

In the context of merging automation with metrology it is worthwhile to take a look at the interface between the surfacing process and coating. Krall describes an amazing ensemble of robotics and image recognition. In the past, an operator manually moved glasses from job trays onto the conveyor belt of the brush cleaner. In that movement, with a brief glance, the operator checked the glasses for damage or unacceptable residues after pre-cleaning. In parallel, the operator moved the job ticket from one tray to another.
The challenge lies in the complexity of this seemingly simple human work step. Here, according to Krall, the solution is a linear system for the ticket transfer, a SCARA (selective compliance assembly robot arm) system for the glass handling, and a simple CCD camera system for the inspection. Krall points out that there are many human motion sequences whose speed no robot can match.
The key point, in fact, is that a robot performs a motion sequence always in the same way, independent of factors like personal fitness or the operator’s training. For process stability this is of course the decisive advantage, and it is particularly evident for a visual inspection step. Evaluation algorithms for the CCD data define quantitative, measurable criteria for allowed and disallowed residues or damage, whereas visual inspection by operators always leaves room for subjectivity.
Thus, the combination of robotics and software-based camera inspection boosts the quality of the available process data. It becomes less probable that unsuitable glasses are processed onwards, and the quantitative data can be used for quality tracking and continuous improvement.
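As an illustration of what a quantitative, operator-independent criterion can look like (a deliberately simplified sketch; the threshold and the acceptance limit are illustrative assumptions, and real systems use far more elaborate image processing):

    import numpy as np

    DARK_THRESHOLD = 60      # gray value below which a pixel counts as residue/damage
    MAX_DEFECT_PIXELS = 50   # assumed acceptance limit

    def inspect(image: np.ndarray) -> bool:
        # image: 8-bit grayscale frame from the CCD camera
        defect_pixels = int(np.count_nonzero(image < DARK_THRESHOLD))
        # the pixel count is measurable and reproducible, and can be
        # logged for quality tracking and continuous improvement
        return defect_pixels <= MAX_DEFECT_PIXELS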

 

Central process-data management: The A&R DataHub collects and shares statistical product and process metrics. Source: C. Laurent, A&R

Robots help create mass data

Another vivid example of automation as the enabler for metrology is given by Laurent. He describes the possibilities of a control chain for the free-form surfacing process. An automated survey of each glass’s power map right after deblocking yields an error map for every single glass: a colored map of local deviations with respect to the target surface. Based on such error maps, design-deviation datasets are computed. Primarily, this allows each glass to be toleranced individually.
But the essential feature in connection with the DataHub is the statistical evaluation of the error maps of all glasses. As the expected (i.e. tolerated) deviation is not constant over the whole surface and also differs from glass to glass, it is quite sophisticated to calculate. But by analyzing each individual free-form design against the actually measured error map, smart algorithms can, by applying certain criteria, draw conclusions about the stability of the upstream free-form surfacing process.
By these analytical means it becomes possible to trace systematic deviations, in real time, back to certain process steps, certain machines and certain root causes, and to instantly adjust the related parameters in the surfacing process.
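A minimal sketch of this statistical idea (the array shapes and the bias limit are illustrative assumptions, not A&R’s actual algorithms): normalize each lens’s error map by its individual tolerance map, then look for positions where the deviation is systematically biased across many lenses rather than scattering randomly:

    import numpy as np

    def systematic_deviations(error_maps, tolerance_maps, bias_limit=0.5):
        # error_maps, tolerance_maps: shape (n_lenses, height, width);
        # local deviation from the target surface and its allowed tolerance
        normalized = np.asarray(error_maps) / np.asarray(tolerance_maps)
        mean_dev = normalized.mean(axis=0)  # average over all lenses
        # a mean clearly offset from zero points to a systematic process
        # error; random lens-to-lens scatter averages out
        return np.argwhere(np.abs(mean_dev) > bias_limit)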
To gain even more process control and to improve lens quality, Laurent suggests measuring the front surface of each semi-finished glass before it enters free-form surfacing. In his experience these surfaces tend to deviate significantly from the specifications. On the one hand, the free-form parameters for each glass could then be instantly and individually adjusted to yield better free-form results. On the other hand, knowledge of the exact initial state of the surface before free-form processing greatly increases the efficiency of the previously described process-control mechanisms. An essential factor in creating this enormous amount of data is smart automation of the complex measurement workflow.

 

Real-Time Process Control Loop stabilizes lens quality and productivity: violation of specifications (horizontal red line) is anticipated, and corrective actions (after the vertical red line) readjust the process proactively. Source: C. Laurent, A&R

 

Robots support Statistical Process Control (SPC)

Philippe Vaudeleau, CEO of FISA, a manufacturer of cleaning and hard-lacquer equipment, regards automation as part of his company’s DNA. By now, FISA equipment comes only with fully automated processes. Apart from the reduction of labor costs, Vaudeleau sees the stabilization of processes as the greatest benefit his customers gain from automation. Automation is closely linked to digitalization and process-data acquisition.

Fully automated in-line cleaner with a combination of two robots and a linear handling system. Source: P. Vaudeleau, FISA

 

The fully automated FISA equipment is designed with a range of sensors that feed all the relevant process data into a central database in real time. This opens up completely new opportunities for statistical process control (SPC): process data are directly accessible for evaluation, yielding real-time control charts for everybody, enabling statistical evaluation, and establishing a direct link to the quality data of each respective glass.
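The arithmetic behind such a control chart is straightforward; here is a sketch using the classic Shewhart three-sigma limits (the textbook choice, not necessarily FISA’s implementation):

    import numpy as np

    def control_limits(samples):
        # individuals chart: center line plus/minus three standard deviations
        mu = np.mean(samples)
        sigma = np.std(samples, ddof=1)
        return mu - 3 * sigma, mu + 3 * sigma

    def out_of_control(samples, new_value):
        lcl, ucl = control_limits(samples)
        return not (lcl <= new_value <= ucl)

    # e.g. flag a new bath-temperature reading against recent history
    history = [59.8, 60.1, 60.0, 59.9, 60.2, 60.0, 59.7, 60.1]
    print(out_of_control(history, 61.5))   # -> True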
Vaudeleau enthusiastically reports on the extra possibilities that have arisen at customers’ sites where RFID-tagged job trays are already implemented. But even with the classic paper production card, the link between each individual glass and the respective process data can easily be made via ordinary barcode scanning.

 

From robots to cobots

Vaudeleau sees the need to stay ahead of technology as one of the biggest challenges for a company like FISA in the context of robotics and automation. R&D has a strong position in the group, reporting directly to the CEO, and roughly 10% of the company’s turnover flows back into the central development site in Milan.
FISA’s developers rely on a combined strategy of linear handling systems and six-axis robots – or, more specifically, cobots. The flexibility and adaptability of the robot are needed to grab glasses from the tray, where they lie in a more-or-less undefined position. The high-precision linear system, on the other hand, is the proper choice for a critical process step like extracting the glasses from the lacquer. Concerning robots, FISA recently made the step towards cobots.
Collaborative robots are equipped with advanced sensors and safety features that allow humans to work right next to them, whereas a classic robot represents a severe hazard to personnel and needs to be fenced off separately. The rise of cobots, Vaudeleau estimates, will have a significant impact on the degree of automation in the ophthalmic industry. For small loads like eyeglasses, cobots are by now affordable enough and fast enough to play a major role.

A&R, FISA and IMACO embody decades of experience in the specialized market of eyeglass production; they represent the archetype of equipment suppliers in this segment. Based on their process knowledge, they all rely heavily on three paradigms: first, keep the hardware simple to ensure feasibility on the shop floor; second, put the complexity into the software and the data acquisition; and third, embrace the new possibilities of robotics, automation and digitalization to stay ahead in terms of quality and costs. The common expectation is that with cheap, easy-to-maintain, easy-to-program cobots, the remaining gaps in automation along the process chain of eyeglass production will soon be closed.

 

Interview Partners

[1] Dr. Christian Laurent, R&D Director Automation & Robotics: Manufacturer of equipment for accurate measurements, process control and automation in ophthalmic industry. www.ar.be
[2] Jürgen Krall, IMACO Products + Solutions GmbH: 20 years of experience as general manager in the field of industrial equipment and expert for automation. www.imaco-ps.de/
[3] Philippe Vaudeleau, CEO, FISA Group: Supplier of cleaning and hard-lacquer coating equipment for ophthalmic industry and other fields of manufacturing. www.fisa.com

 

Author

Peter Weber is a professor at the University of Applied Sciences in Frankfurt and works as a journalist in the field of science and technology. In the past he managed production and engineering departments in the semiconductor and ophthalmic industries. www.connects-by-knowledge.de

Prismatic corrections – a communication challenge between the industry and opticians

Prismatic corrections – a communication challenge between the industry and opticians

Picture by Michael Dziedzic on Unsplash

In my career of over 30 years as a lecturer in ophthalmic optics, I have been through all the stages from student to teacher to developer to expert authority – trusted with confidential knowledge about the industry. I would like to thank everyone for the trust they have shown in me, because only in this way have I been able to explain complex optical issues to opticians in a clear and understandable way. Aberrometry with its wavefront measurement is a case in point.

By Fritz Paßmann

Knowledge means recognizing connections and developing links to topics that are already understood. The conversion of prismatic prescriptions into spectacle lenses that the wearer is comfortable with is something I care deeply about – both from the point of view of the end customer, i.e. improving his vision and possibly alleviating asthenopic complaints, and from the point of view of the optician, who needs support in exercising his skills and offering his – sometimes emotional – recommendations. This calls for broad backing from the industry. This article aims to bring about a willingness to offer opticians more information, education and transparency.

This is a huge challenge: converting values determined in the refraction room into spectacle lenses that are comfortable for the end user. It requires close cooperation between all the parties involved. The optician is the link between the lens manufacturer and the customer.

Let’s start at the beginning

When looking through a spectacle lens, a prismatic effect arises outside the optical center. Normally this does not pose a problem for the spectacle wearer, as the prismatic effects of the two lenses, with their respective powers and base settings, largely cancel each other out, resulting in an “almost zero” overall prism; except where there is anisometropia, in particular in the principal meridian of the spectacle lens in the respective direction of view.

To calculate this, the optician uses Prentice’s law: P = c · S′. Charles F. Prentice is considered the father of optometry in the USA; in 1921 he recognized the connection between the distance of the light beam’s path in the spectacle lens from the optical center and the vertex power for calculating a prismatic effect.
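Expressed as a short sketch (decentration c in centimeters, vertex power S′ in diopters, prism P in cm/m):

    def prentice_prism(c_cm, vertex_power_d):
        # Prentice's law: P = c * S'
        return c_cm * vertex_power_d

    def required_decentration(prism_cmm, vertex_power_d):
        # inverse form: how far to decenter for a desired prismatic effect
        return prism_cmm / vertex_power_d

    # example: a 4.00 D lens decentered by 0.5 cm yields 2 cm/m of prism
    print(prentice_prism(0.5, 4.0))          # -> 2.0
    print(required_decentration(2.0, 4.0))   # -> 0.5 (cm)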

Anyone wanting to achieve a prismatic effect by decentering the lens should not rely on Prentice’s law alone, as the formula does not take account of the rotation of the eye when looking through the lens. For more accuracy, the ‘Weinhold formula’ or ‘extended Prentice law’ (Fig. 1), which includes the BVD (back vertex distance), must be used.

Fig. 1: The ‘extended Prentice law’.

Technically, the distance b′ between the eye’s center of rotation and the lens vertex should be taken into account, as the eye rotates around its center of rotation Z′ in the opposite direction to the base setting of the prism. But how can you measure the distance to a virtual point? This raises three fundamental questions for the lens manufacturer:

  1. Is the BVD or b′ always included when ordering prismatic lenses? A specifically developed order form including all the necessary data is recommended for this purpose.
  2. What distance does the industry use to calculate b′? The ‘standard eye’ according to Gullstrand specifies 13.5 mm.
  3. Is any conversion made between ‘Prentice’ and ‘Weinhold’?

This leads to a further issue: the difference between the measurement position and the position of use of the spectacle lens, which needs to be communicated precisely, both for the clarity and understanding of the optician and for the technical details printed on the lens bag. Admittedly, the differences in calculation will mostly show up ‘after the decimal point’, i.e. in the decimal places.

However, in problem cases several factors often occur together, or the customer is particularly sensitive; in such cases precise communication can minimize misunderstandings.

Prismatic effect through decentration and its limits

The application of Prentice’s law is already taught during training, so opticians are able to autonomously decenter lenses and determine the reference point in order to achieve the desired prismatic effect. I was proud to be able to do this early in my apprenticeship. At a time when considerable subsidies were still paid by the health insurance for spectacle lenses (editor’s note: that was the case in Germany until 2004), ordering a spherical or toric single vision lens (multifocal lenses naturally did not apply) and achieving a prismatic effect with the help of a larger uncut lens diameter was a great art, and it involved neither changing the prescription nor cheating the health insurance.

The desired effect was achieved, and afterwards it was impossible to tell from the lens how this had been done. However, this approach does have its limits, which should be clearly communicated by the industry:

  1. The vertex power and the uncut lens diameter are mutually dependent: where S′ is rather low, the blank will not have the required diameter. My recommendation for the maximum dioptric power is the number “4”. This means that above a vertex power of 4.0 D and a prismatic effect of 4 cm/m, the prism should be created industrially, i.e. by tilting the blank when blocking (see the sketch after this list). This procedure is not known to all opticians, but it explains the limits of what is feasible when the blank has only a limited edge thickness.
  2. For toric lenses, the effect in the relevant meridian must be taken into account. Where the principal meridian and the base setting of the prism to be generated are not parallel but intersect obliquely, this approach delivers only approximate results.
  3. If the lens manufacturer is not aware of the prismatic effect to be achieved, the increased aberrations that inevitably occur will not be corrected, let alone minimized. The occurrence of ‘astigmatism of oblique bundles’ in wedge-shaped lenses should at least be mentioned here. An optimized prismatic correction for the comfortable vision of the spectacle-lens wearer requires the aberrations to be corrected at the reference point.
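A small sketch of the feasibility check behind point 1 (my reading of the rule of “4”; the diameter arithmetic follows from Prentice’s law, while the function and parameter names are illustrative):

    def prism_by_decentration_feasible(vertex_power_d, prism_cmm,
                                       fitted_diameter_mm, blank_diameter_mm):
        # rule of "4": above 4.0 D and 4 cm/m, have the prism produced
        # industrially (by tilting the blank when blocking)
        if vertex_power_d > 4.0 and prism_cmm > 4.0:
            return False
        # required decentration from Prentice's law, c = P / S',
        # converted from centimeters to millimeters
        decentration_mm = 10.0 * prism_cmm / vertex_power_d
        # the uncut lens must still cover the frame shape after decentering
        return blank_diameter_mm >= fitted_diameter_mm + 2.0 * decentration_mm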

Mandatory: BVD and PD

During training to become a refractionist (not a refraction assistant), a distinction is made in the correction of latent squint (heterophoria and associated heterophoria – explaining the difference would require a chapter of its own) between pupil-center centration (PMZ) and the formula case [1] when using a trial frame [2]. When centering on the pupil center using the Viktorin method, the PD setting is retained during the entire measurement process.

On the one hand this is a straightforward procedure; on the other hand it contains inherent inaccuracies and can even reach its limits. As already mentioned, once a prism has been inserted the customer no longer looks through the optical center of the existing sphero-cylindrical correction. Two prismatic effects occur: that of the measuring prism and the unwanted prismatic effect of the spherical and toric lenses, which weakens or strengthens the overall prismatic effect.

 

The refractionist does not know the overall prism value and has to rely on the expertise of the lens manufacturer to produce the exact prismatic effect. To do this, however, as already mentioned, the PD and BVD need to be known. A prismatic prescription cannot be implemented precisely without specifying the PD and BVD.

Furthermore, the customer’s visual acuity may be impaired during refraction by the resulting aberrations, which may lead the refractionist to give up on further prismatic correction even though the customer’s binocular status could still be improved. In addition, the binocular field of vision becomes smaller.

When using the formula case for lens centration, the distance centration (not the customer’s PD) is adjusted. Two similar rules of thumb are in use, and they differ by no means only marginally: for every 3 cm/m, or for every 4 cm/m, of prism the set distance in the trial frame is corrected by 1 mm in the opposite direction to the base setting of the inserted prism. This minimizes the prismatic effect of the main lens but by no means removes it completely.

Furthermore, new questions arise regarding the practical procedure: if the formula is only applied from 3 cm/m or 4 cm/m upwards, what is the procedure below these values? With higher vertex powers a noticeable prismatic side effect quickly becomes apparent, even if the trial lenses used are of low prism. And what is the right setting for 5 cm/m on the trial frame? The readjustments that would be required – 1.67 mm (5 cm/m at 1 mm per 3 cm/m) or 1.25 mm (5 cm/m at 1 mm per 4 cm/m) – cannot be set with that accuracy.

In practice this results in a combination of the two variants: partly further readjustment, partly acceptance of the prismatic side effects of the main lens. I recommend making the size of the readjustment dependent on b′: the larger the BVD, the more likely it is that readjustment is necessary. This leads to the next requirement for the measurement protocol/order form for spectacle lenses with prismatic effect: indication of the PD at the beginning and at the end of the measurement, as well as the BVD. As previously stated, progressive lenses must always be ordered with the prismatic effect.

Potential sources of communication errors between the optician and the lens manufacturer

The question sometimes arises for the optician: “If the customer’s PD is already readjusted during refraction, do I no longer have to take it into account when centering?” (I deliberately use this imprecise wording because it is so widely used, which does not make it any more correct; the correct terminology would be to readjust the centration point distance in the trial frame.) The answer is: “It depends.”

The crux of the matter is that it depends on the optician’s workflow. Refraction is usually carried out after taking the case history and assessing the customer’s requirements. The PD setting of the trial frame is rarely recorded. The optician takes the centration data from his centration system using the – hopefully correctly adjusted – trial frame.

However, at this point the prismatic correction is not in front of the eyes, so they take the ortho position. These data are then sent to the lens manufacturer. In this case the centration correction must be carried out for the entire prismatic effect, not just for the ordered prism, regardless of whether readjustment has already occurred in the refraction room.
If the optician specifies the readjusted PD when ordering the lens, only the small difference to the newly calculated total prism needs to be changed. In my opinion it would be desirable to indicate this correction on the lens bag as the new “distance centration distance”.

Some lens manufacturers have recognized this potential source of communication errors and have for some time been taking a new approach, at least for progressive lenses: relocating the stamp by moving the engraving. The lens manufacturer calculates the new position of the reference point exactly, i.e. to an accuracy of 1/10 mm, and offsets the centration marking of the stamp accordingly, which makes sense. The optician has one less thing to worry about and continues to work as usual. However, this is not a general solution to the problem.

New questions arise: “Can this approach be used for all types of lenses in the portfolio? Is the centration correction clearly shown in the price list? At what point in time was the change made?” The last question may be of key significance when handling complaints. And since opticians usually work with a number of lens suppliers: “Which lens manufacturers offer this, and which do not?”

Prism distribution

Personally, I prefer an asymmetrical prism distribution. I find it satisfying when aesthetically pleasing lenses can be achieved by adjusting the thickness. It is up to the refractionist to decide whether a symmetrical or an asymmetrical distribution of the prismatic effect is more appropriate. Only he knows the visual acuity, the tolerance of any prism combinations and the influence of the customer’s “dominant eye”.

Only the lens manufacturer can calculate the actual resulting thickness ratios for ground lenses with a sphero-cylindrical and/or progressive surface. From these two points of view, an interaction is recommended through which all possibilities can be considered.

Taking account of the refraction conditions in the trial frame

The lens manufacturer should have a further skill and needs to communicate it to the optician: namely, modification of the overall prism in the finished lens. In the refraction trial frame, lenses are placed in different holders; for practical reasons the prismatic trial lens with its base setting is usually inserted into the front lens holder, as this lens is thicker and would otherwise scratch the other lenses or not even fit into the lens holder. In the end, however, only one lens is manufactured and ground to fit the customer’s spectacle frame, with the resulting effect corresponding to the sum of all the individual lenses.

This poses additional questions. Are the various distances between the trial lenses taken into account when calculating the spectacle lens? Is the forward-protruding prism wedge of the trial lens converted into the same effect in the spectacle lens, in which the prism wedge protrudes backwards? How is the prismatic trial lens labeled? According to DIN, the base deflection of the prism is defined in such a way that the principal ray hits the front surface at a right angle and is deflected only at the rear surface. Has the trial lens already been adjusted in this regard and, with the wedge protruding forward, does it have the same effect as required in the position of use of the corrective lens for the main direction of view, despite the principal ray being refracted twice?

Measurement position versus position of use

In addition, there is the well-known problem of distinguishing between the measurement position and the position of use. The measurement protocol/order form for spectacle lenses with prismatic effects should therefore also include information on the respective slots used. In this context the position of the edge of the ground-in spectacle lens is also important. Usually only the optician has exact knowledge of the spectacle frame in question; only the details of the lens shape are passed on to the lens manufacturer.

However, if the details of the entire frame – its material, type of manufacture and the possibility of hiding the thickness of the lens rim – are passed on, new possibilities for producing aesthetically pleasing spectacle lenses by shifting the facet open up.

In addition, the position of the spectacles on the face is important. Are there anatomical features to be considered, such as strong eyebrows, deep-set eyes or protruding cheekbones? If the standard facet is to deviate from running parallel to the front surface, i.e. to be incorporated in the middle or even in the rear third of the edge thickness, this option must also be made clear on the order form.

Interplay between lens thickness, transparency and color fringe

I would also like to see more information about the different lens materials and their Abbe numbers. The optician should then use this knowledge to advantage during the consultation. Thinner and more expensive does not always mean lighter and better!

Using easily understandable graphic charts, the interplay of lens thickness, transparency and color fringe can serve as sales-support material to find the ideal combination, and thus the greatest benefit, for the customer.

Taking account of the reference point

On the next point, too, I would like to plead for a commitment to sustained communication from the industry to the optician: “How can I (the optician) check lenses for prismatic effect?” This is only possible at the reference point. For this reason, single vision lenses should always be delivered with the reference point marked on them.

In the case of progressive lenses, the prescribed prism cannot be measured in isolation at the prism reference point: there, the prescription prism and the manufacturer’s calculated thickness-reduction prism combine, and the optician is often not aware of this combined effect. In addition, with a height prism the difference between the prismatic effects of the right and left lenses must be calculated separately.
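A sketch of the arithmetic involved (assumed here: the thickness-reduction prism is about two thirds of the addition, base down, and equal in both lenses, which is a common convention rather than a universal rule; base-down values count as positive):

    def vertical_prism_at_reference(prescribed_cmm, addition_d,
                                    thinning_factor=2.0 / 3.0):
        # at the prism reference point the lensmeter shows the prescribed
        # vertical prism plus the manufacturer's thickness-reduction prism
        return prescribed_cmm + thinning_factor * addition_d

    def height_prism_difference(right_cmm, left_cmm):
        # equal thickness-reduction prisms cancel in the difference,
        # leaving only the prescribed right/left height prism
        return right_cmm - left_cmm

    # example: 1 cm/m height prism ordered for the right eye only, add 2.00 D
    r = vertical_prism_at_reference(1.0, 2.0)   # about 2.33 cm/m measured
    l = vertical_prism_at_reference(0.0, 2.0)   # about 1.33 cm/m measured
    print(height_prism_difference(r, l))        # -> 1.0 cm/m, as prescribed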

One question after another

With regard to the previous point, the optician needs the expertise of physicists and physiologists from the industry. What does the spectacle wearer need to know about prismatic corrections? What tips and recommendations can the optician pass on to his customer? The comment “You’ll just have to get used to it!” is simply not good enough and certainly not professional.

How do increased distortion and dispersion affect vision? What can help to reduce these aberrations? How can it be that a customer reports seeing colored objects in different depth planes due to color stereopsis (chromostereopsis)? Can anything be done about this? Can micropsia (objects appearing smaller than they are) and macropsia (objects appearing larger than they are) be identified and avoided during eye testing for spectacle lenses?

Questions upon questions that can only be satisfactorily answered through a cooperative partnership where both sides bring in their expertise.

In conclusion, I hope that this article will create a better understanding of the vital need to collect all these data. A certain amount of transparency is desirable when calculating ideal lenses, both to demonstrate the quality of the lenses and to stand out from the competition. This applies equally to lens manufacturers and opticians.

Footnotes:
[1] The formula case means that the pupil distance set in the trial frame is corrected by 1 mm against the base setting for every 4 cm/m. An example: initial situation PD 32 mm, 4 cm/m base-in; the trial frame is then readjusted to 33 mm.
[2] A trial frame is an adjustable spectacle-like device containing cells that hold multiple trial lenses during subjective refraction.