Up to GCSE level (school exams taken at age 16), one is taught about light as a "ray" which always travels in straight lines and obeys Fermat's principle: light takes the path of least travel time (strictly, a path of stationary time). This is pretty close to the Newtonian view of light as understood in around the 18th century, and it is sufficient to explain most of classical optics, such as reflection and refraction, up to calculating things like the focal length of a lens.
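As a quick sketch of what this level of theory lets you calculate, here is Snell's law of refraction (which follows from Fermat's principle) and the thin-lens equation. The specific numbers are just illustrative examples, not from the text above:

```python
import math

def snell_refraction_angle(theta_i_deg, n1, n2):
    """Refraction angle in degrees from Snell's law (n1 sin t1 = n2 sin t2),
    which can be derived from Fermat's principle of least travel time."""
    sin_t = n1 * math.sin(math.radians(theta_i_deg)) / n2
    if abs(sin_t) > 1:
        return None  # total internal reflection: no refracted ray exists
    return math.degrees(math.asin(sin_t))

def thin_lens_image_distance(f, u):
    """Image distance v from the thin-lens equation 1/f = 1/u + 1/v
    (real-is-positive sign convention)."""
    return 1.0 / (1.0 / f - 1.0 / u)

# Light entering glass (n ~ 1.5) from air at 30 degrees:
print(snell_refraction_angle(30.0, 1.0, 1.5))   # ~19.47 degrees
# Object 30 cm from a converging lens with f = 10 cm:
print(thin_lens_image_distance(10.0, 30.0))     # image 15.0 cm behind the lens
```

Nothing here needs waves, let alone photons; rays plus Fermat's principle are enough.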
At A-level (exams taken at age 18) one is introduced to the concept of diffraction, whereby light is deflected from a straight path as it passes close to an object, most commonly illustrated by passing a beam of light through a narrow slit. This is impossible to explain in terms of light being a ray or a classical particle, but it is a characteristic property of waves. Historically this led to the development of the wave theory of light and specifically
Huygens' principle, whereby one considers the propagation of a light wave via a series of "wavelets" emanating from every point along a wave front, and examines in which directions the wavelets cancel one another and in which directions they reinforce. This is the approach taught at A-level, and it allows one to correctly describe diffraction in addition to reproducing all the results of the GCSE approach, i.e. refraction and reflection.
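The wavelet-summing idea can be done numerically: add up the complex amplitudes of wavelets emitted from points across a single slit and see where they cancel. This is a minimal sketch of far-field (Fraunhofer) diffraction, with made-up example values for the slit width and wavelength:

```python
import math
import cmath

def slit_intensity(theta, slit_width, wavelength, n_wavelets=500):
    """Relative far-field intensity at angle theta from a single slit,
    computed by summing Huygens wavelets across the slit aperture."""
    k = 2 * math.pi / wavelength  # wavenumber
    total = 0j
    for i in range(n_wavelets):
        # Position of this wavelet source within the slit
        x = (i + 0.5) / n_wavelets * slit_width - slit_width / 2
        # Extra path length toward angle theta gives a phase x*sin(theta)*k
        total += cmath.exp(1j * k * x * math.sin(theta))
    amp = abs(total) / n_wavelets  # normalise so theta = 0 gives amplitude 1
    return amp ** 2

wavelength, a = 500e-9, 5e-6          # green light, 5-micron slit
theta_min = math.asin(wavelength / a)  # first minimum at sin(theta) = lambda/a
print(slit_intensity(0.0, a, wavelength))        # 1.0, the central maximum
print(slit_intensity(theta_min, a, wavelength))  # ~0: wavelets fully cancel
```

The wavelets reinforce straight ahead and cancel exactly where the textbook formula sin θ = λ/a predicts the first dark fringe, which is the point of Huygens' construction.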
At university level we covered most of this material again in my 1st year from a more formal, mathematical perspective, but we were also introduced to the paradoxes in the classical (wave theory) view of light which led to the development of quantum mechanics. The main issue is the conflict between the two classical views of light, as either a stream of particles or a continuous wave, with sound experimental evidence for both (such as diffraction indicating light must be a wave, while the
photoelectric effect suggests light must travel in discrete packets of energy). Quantum mechanics resolves the issue by describing light as photons, which in some ways behave like waves, in some ways like classical particles, but also with some entirely new properties not predicted by classical physics. This leads on to the contemporary field of Quantum Optics, which I studied briefly in my final year and covers things like entanglement, quantum cryptography and quantum teleportation.
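The photoelectric argument is a short calculation: each photon carries energy E = hf = hc/λ, and an electron is ejected only if that single-photon energy exceeds the metal's work function, regardless of the beam's total intensity. A minimal sketch (the caesium work function of about 2.1 eV is an assumed textbook figure):

```python
H = 6.626e-34         # Planck constant, J*s
C = 2.998e8           # speed of light, m/s
EV = 1.602e-19        # joules per electronvolt

def photon_energy_ev(wavelength_m):
    """Energy of a single photon, E = hc/lambda, in eV."""
    return H * C / wavelength_m / EV

def max_kinetic_energy_ev(wavelength_m, work_function_ev):
    """Max photoelectron energy from Einstein's photoelectric equation,
    KE_max = hf - phi; None if one photon cannot eject an electron."""
    ke = photon_energy_ev(wavelength_m) - work_function_ev
    return ke if ke > 0 else None

phi = 2.1  # assumed work function of caesium, eV
print(photon_energy_ev(550e-9))            # ~2.25 eV per green photon
print(max_kinetic_energy_ev(550e-9, phi))  # ~0.15 eV: electrons ejected
print(max_kinetic_energy_ev(700e-9, phi))  # None: red light ejects nothing
```

A classical wave should eject electrons at any wavelength if made bright enough; the observed sharp cutoff is what forces the discrete-packet picture.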
At postgraduate level the generally accepted model is Quantum Electrodynamics, which is basically quantum mechanics fused with special relativity. As far as I am aware this model accurately describes almost all properties of light so far discovered (though it must be combined into a larger theory if one wants to describe other properties of matter such as nuclear physics or gravity).
I suppose in a way the order in which different theories of light are taught loosely mirrors the historical development of those theories: one starts with a relatively simple model which explains the known properties of light, and that model is replaced with successively more sophisticated models to explain new properties as they are discovered. It used to annoy me that almost every year we were told "Okay, everything you've learnt up to now about light is wrong, here's how it actually works", but I realise now it is designed to help you get used to the idea that you are not trying to comprehensively describe how light "works", since no one really knows; you are simply drawing an analogy and seeing what properties of light it predicts. I have found this is an approach we use very often in research, frequently switching between different particle physics models which explain a certain aspect of physics (and very often
only that aspect) for calculational simplicity. I guess the point is that if all you want to do is calculate lens focal lengths, you don't need to learn quantum mechanics.