Consider a problem in classical electrodynamics: a monochromatic beam undergoes total internal reflection when traveling from a medium with refractive index $n>1$ into a medium with refractive index $1$ - see schematic below. Using the Fresnel equations one gets the penetration depth $$d = \frac{1}{\sqrt{k_x^2+k_y^2-(\tfrac{2\pi}{\lambda})^2}},$$ where $k_x$ and $k_y$ are the components of the wave vector parallel to the interface, and $\lambda$ is the wavelength in vacuum.
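As a quick numerical check of the formula above, here is a minimal sketch (the function name and units are my own choices, not part of the question); it also flags the propagating regime, where the square root would become imaginary and no evanescent decay occurs:

```python
import numpy as np

def penetration_depth(kx, ky, lam):
    """Evanescent penetration depth d = 1/sqrt(kx^2 + ky^2 - (2*pi/lam)^2).

    kx, ky: wave-vector components parallel to the interface.
    lam:    vacuum wavelength.
    Valid only when kx^2 + ky^2 exceeds (2*pi/lam)^2; below that cutoff
    the transmitted wave propagates and there is no evanescent decay.
    """
    k0 = 2 * np.pi / lam
    kpar2 = kx**2 + ky**2
    if kpar2 <= k0**2:
        return np.inf  # propagating regime: no exponential decay
    return 1.0 / np.sqrt(kpar2 - k0**2)
```

For example, with $\lambda = 1$ and $k_x = 2\cdot\frac{2\pi}{\lambda}$, $k_y=0$, this gives $d = \frac{1}{2\pi\sqrt{3}}$.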
At least in theory, it is possible to have an evanescent wave of arbitrary penetration depth $d$. However, in that case one needs a plane wave, i.e., a wave of unbounded spatial extent. For a beam with finite variance $\langle x^2\rangle$ (and $k_y=0$, reducing the problem to two dimensions) there appears to be a relation: $\langle d\rangle/\sqrt{\langle x^2\rangle}$ is bounded by a constant.
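The conjectured bound can be probed numerically with a toy model of my own construction (not from the question): a Gaussian spectrum of purely evanescent components centered at $k_c = k_0 + \delta$ with spectral width $\sigma_k = \delta/3$, so that essentially no weight leaks below the $k_0$ cutoff, and a minimum-uncertainty beam for which $\langle x^2\rangle = 1/(4\sigma_k^2)$:

```python
import numpy as np

lam = 1.0              # vacuum wavelength (arbitrary units)
k0 = 2 * np.pi / lam   # vacuum wavenumber 2*pi/lambda

ratios = []
for delta in np.array([0.01, 0.1, 1.0, 10.0]) * k0:
    kc = k0 + delta            # center of the evanescent spectrum
    sigma_k = delta / 3        # spectral width (toy choice)
    kx = np.linspace(k0 * (1 + 1e-6), kc + 6 * sigma_k, 200_000)
    w = np.exp(-((kx - kc) ** 2) / (2 * sigma_k**2))  # spectral weight
    d = 1.0 / np.sqrt(kx**2 - k0**2)                  # depth per component
    mean_d = (w * d).sum() / w.sum()                  # spectrum-averaged depth
    rms_x = 1.0 / (2 * sigma_k)                       # sqrt(<x^2>), min-uncertainty
    ratios.append(mean_d / rms_x)
    print(f"delta/k0 = {delta/k0:6.2f}   <d>/sqrt(<x^2>) = {ratios[-1]:.3f}")
```

In this particular family of beams the ratio indeed stays below an $O(1)$ constant: large $d$ requires $k_c$ close to $k_0$, which forces a narrow spectrum and hence a wide beam. This is only a consistency check for one toy family, not a proof of a general bound.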
The main question: is there a strict bound of the form $$\text{a measure of penetration depth}\leq f(\text{transverse beam size},n)$$ (perhaps in the style of the Heisenberg uncertainty principle, or using other moments of $x$, $y$ and $d$)?