Whither Optical Design?
Douglas C. Sinclair
Optics and Photonics News, June 2000

The twentieth century is likely to have witnessed both the birth and the death of traditional lens design. By traditional lens design I mean the process of balancing the aberrations of centered spherical systems to achieve maximum image quality. In traditional lens design, physical optics provides only a termination condition: when the Rayleigh diffraction limit is reached, the design is good enough.

Until recently, the dimensions of typical optical systems were much greater than optical wavelengths. Specialists in geometrical optics studied refraction using aberrations and rays, and specialists in physical optics studied images using waves, with neither group having much in common with the other. Now, however, optical wavelengths are not considered as short as they used to be. At the same time, there is a shortage of people competent in both geometrical and physical optics. We have the tools to tackle challenging and exciting problems of much greater scope and interest than ever before, but fewer and fewer people know how to use them. Nearly twenty years ago, Warren Smith wrote an article entitled “The Vanishing Lens Designer”; today, the dwindling number of expert lens designers is a problem that is even more keenly felt.

At the beginning of the century, the mathematics and physics relevant to traditional lens design were already well established, and early designs like achromatic doublets and the Petzval lens had already been developed. But it was during the first half of the twentieth century that the theory and practice of lens design were really worked out, chiefly in Europe. By the time computers appeared in the 1950s, the basic design forms used today were largely in place. It is a tribute to the inventors that lenses like the Cooke triplet, the Petzval and the Double-Gauss are still widely used, mostly in the form of computer-optimized derivatives. There are some new forms, of course.
Gradient-index and diffractive lenses, and the ubiquitous zoom lens, were known but not well developed until the second half of the century. It is interesting but perhaps not surprising that although computers have made possible the development of lenses of greater complexity, most lenses today still operate within more or less conventional guidelines. The modern lithographic lens is a sterling example of such a design. With nanometer distortion tolerances and incredible illumination and wavefront requirements, lithographic lenses are part of a recursive cycle: they are designed to produce chips fast enough to design the next generation of lenses. On the other hand, they tend to be variations of double-humped Gauss lenses, an established form.

Two mid-century inventions, the laser and the computer, have expanded the scope of optical design so much that optical design as it is traditionally conceived will likely soon be relegated to a minority position. Today, the laser provides the work for the optical designer, while the computer provides the necessary support. The importance of the laser is not to be found in the intrinsic characteristics of the device itself, but rather in its impact on optics. Until fairly recently, with the exception of a few established military, visual and photographic applications, optics was essentially a service discipline for other sciences.

As we enter the new century, optics is rapidly becoming a consumer technology. To meet with market success, consumer technology must be both “good” and “cheap.” Many lens designers are not used to working under such pressure: they’re comfortable with good, but not with cheap. In the future, designers will not be called upon to design the lenses that form the best images, but rather those that form satisfactory images and are cheapest to manufacture. This is an important distinction, one that underscores the need for the development of a whole new design methodology.
If a lens for a consumer product is overdesigned, it will be too expensive to be competitive. Yet optimizing this aspect of production is outside the boundary of traditional optical design. Tolerancing, the most challenging and traditionally the most neglected aspect of optical design, is obviously becoming more important.

In packaging, small size is increasingly in demand. The quest for smaller and smaller sizes has led to unusual geometries, and the consequent replacement of analytic design methods with numerical techniques, often tailored to the particular task at hand. In some applications, size restrictions may necessitate placing lenses in spaces where they can have dimensions of only a few wavelengths. In this case, geometrical design methods are inadequate, but current physical optics methods are too slow.

These are the types of problems that optical designers can expect to confront in the future. Designers won’t be able to solve them without wide-ranging knowledge of optics as a discipline. In the future, it will be necessary to combine the lens designer’s meticulous methodology with the broader perspective of the optical engineer.

Outside the optical design community, there is a tendency to consider lens design a solved problem: there are those who believe that if you purchase an optical design software package and press the “global optimize” key, you can be done with it. The reality, of course, is different.

The optical design problem presented to participants in the 1998 International Optical Design Conference involved the optimization of a system characterized by chromatic aberration. Participants had the choice of using a few surfaces but lots of different glasses, or a few glasses and lots of surfaces. As expected, experienced lens designers came up with the top five solutions. Four of the designers used commercial optical design software; one used an in-house proprietary program.
Although three of the designers noted that they had used the global exploration features of their software as an aid, all the solutions reflected the detailed personal involvement of the designer. In this respect, the results were comparable to those of similar contests held over the course of the past twenty years.

[Figure: Two designs submitted as solutions to the lens design problem posed at the 1998 International Optical Design Meeting in Kona, HI. John Isenberg's winning design (top) achieved superb performance, but a typical solution from an unknown designer (bottom) lacked the essential apochromatic correction. In the drawings, higher index glasses are shown in warmer hues.]

More interesting are some of the losing designs among the few dozen that were submitted. It is apparent that many of the designers were unaware of some of the basic principles of apochromatic correction, or else that they tried to let the software do the thinking for them, and in this case it didn’t work. Of course, apochromatic correction of lenses is not something you expect the average engineer to be familiar with, but we’re talking here about a group of optical design specialists. These are the people who presumably will take over optical design when today’s top lens designers vanish. It seems reasonable to ask what happened here.

The obvious response is that the software didn’t provide the answer: its standard built-in capabilities were insufficient and needed to be supplemented by designer-developed workarounds. As the scope of optical design widens, this type of situation will become increasingly common. To deal with it, we need to re-conceptualize optical design and ensure we are educating students in a way that equips them to meet the challenge.
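To make the apochromatic point concrete, here is a hedged thin-lens sketch of why an ordinary achromat is not enough. The two formulas are the standard thin-lens contact-doublet conditions; the glass data are assumed, catalog-style values for a generic crown/flint pair, not numbers from the contest.

```python
# Thin-lens sketch: achromatic vs. apochromatic correction.
# Glass values below are illustrative (BK7-like crown, SF2-like flint),
# assumed for this example only.

def achromat_powers(phi_total, V1, V2):
    """Split a total power between two glasses (Abbe numbers V1, V2)
    so that phi1 + phi2 = phi_total and phi1/V1 + phi2/V2 = 0,
    i.e. the doublet has equal power in F (486 nm) and C (656 nm) light."""
    phi1 = phi_total * V1 / (V1 - V2)
    phi2 = -phi_total * V2 / (V1 - V2)
    return phi1, phi2

def secondary_spectrum(V1, V2, P1, P2):
    """Residual g-to-F focal shift of such an achromat, as a fraction of
    its focal length: (P1 - P2) / (V1 - V2), with P the partial dispersion.
    An apochromat needs glasses with matched P but different V."""
    return (P1 - P2) / (V1 - V2)

phi = 1 / 100.0                       # total power for a 100 mm focal length
p1, p2 = achromat_powers(phi, 64.2, 33.9)
print(p1, p2)                         # strong positive crown, negative flint
print(secondary_spectrum(64.2, 33.9, 0.535, 0.583))   # about -0.0016
```

The point of the sketch is that the residual color (the secondary spectrum) is fixed by the glass choice alone, roughly f/600 here no matter how the surfaces are bent; shrinking it requires pairing glasses with matched partial dispersions, which is precisely the principle the weaker contest entries missed.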
In particular, it’s not a good idea to start by teaching geometrical optics and then progress to physical optics: better to start with waves and then introduce rays as a useful approximation when the dimensions of the elements are large compared to the wavelength. It’s true that learning is facilitated when students progress from simple to more complicated subjects, but the truth of the matter is that ray optics isn’t any simpler than wave optics.

In fact, starting with rays implies the use of Snell’s law. Snell’s law is elegant, but when you try to use it to describe imaging systems you are led to series expansions that very quickly become unmanageable. At this point, with only a limited familiarity with paraxial optics, and believing that only lens designers with a penchant for algebra can understand the subject, the vast majority of students opt out. Optical design is difficult because it involves modeling a continuous nonlinear system using discrete numbers, not because the algebra is hard. Computers can do the algebra.

Starting with waves means beginning with Huygens’ principle and arriving at geometrical optics through Fermat’s principle, an approach that leads students to think about the physics of light rather than the algebra of ray tracing. This is good because it helps them understand how waves and rays are related, as well as how the systems needed to make sensible engineering decisions really work. If your job will one day entail circumventing the limitations of canned software, it’s important to understand the physics of what you’re doing. And it’s not likely that this requirement will change in the future.

Jenkins’ and White’s folly

First published in 1937 and still in print 63 years later, Fundamentals of Optics by Jenkins and White is probably the most widely used optics text ever published.1 In a field sown with thousands of what are known as “twig books,” Fundamentals of Optics has been at the root, or at least the trunk, of the tree.
Everyone trained in optics is familiar with it. Yet the first two sentences start us off on the wrong track:

“Optics, the study of light, is conveniently divided into three fields, each of which requires a markedly different method of theoretical treatment. These are (a) geometrical optics, which is treated by the method of light rays, (b) physical optics, which is concerned with the nature of light and involves primarily the theory of waves, and (c) quantum optics, which deals with the interaction of light with the atomic entities of matter and which for an exact treatment requires the methods of quantum mechanics.”

What’s the problem? It’s not in the assertion that classical and quantum optics are based on different concepts. It’s in the assumption that the difference between geometrical and physical optics requires a “markedly different method of theoretical treatment.” Everyone agrees that the propagation of light is governed by Maxwell’s equations: it doesn’t matter whether you think in terms of wave surfaces or the normals to those surfaces.

Although Jenkins’ and White’s “folly” has been detrimental to students trying to learn optics, the real problem is that the text seems to have been adopted as a sort of bible by suppliers of optical components. Real optical designers don’t use Jenkins and White.

Lens design vs. optical engineering

A few years ago, Rudolf Kingslake was named Engineer of the Year by the Rochester Engineering Society, not for anything he did during that particular year, but for a lifetime of professional accomplishment. When I congratulated him on an award justly deserved, he replied that he felt honored indeed to have been selected, and somewhat surprised, because he had always thought of himself as a lens designer, not an optical engineer. Kingslake had a clear picture of the difference between the two.
The lens designer was the one who balanced the aberrations; the optical engineer worked on layouts and negotiated with the marketing and mechanical departments to get enough room for the design to be implemented within the laws of physics. As we enter the new century, that distinction is disappearing. At the moment, it is leaving a void that is not being filled. Both lens designers and optical engineers need to confront this issue.

The profession of lens design per se is likely to disappear. The reality today is that very few companies can afford to employ a full-time lens designer from the traditional mold. The problems these companies face are not the aberration-balancing problems solved in traditional lens design; they’re much broader in scope. At the same time, the optimization of optical systems is not a problem that lends itself to casual solution, as the results of recent lens design contests demonstrate. There’s much demand for people who can work at a professional level in optical design but who also have a broad engineering perspective.

A few years ago, the Japanese trade ministry came up with the slogan “Electronics is the science of the twentieth century—Optics is the science of the twenty-first century.” Now, at the turn of the century, optics is looking pretty good, but we need to be confident that we’re working on the right agenda. We do indeed have the opportunity to make optics the science of the twenty-first century, but it will be primarily an engineering science, and we need to understand that.

Including lens design as a part of optical engineering is not easy. Although changing the way geometrical optics is taught may be of some help, it is not sufficient. More important is raising the level of optical design taught to optical engineers. There’s a great deal to be learned from the past, and it’s important to learn it.
But there are wonderful opportunities in optical design, and surely there’s never been a better time for a young person to enter optics.