How is Optical Density (OD) in the blocking band affected by changes in the Angle of Incidence (AOI)?

When dealing with thin-film optical bandpass filters, changing the Angle of Incidence (AOI) away from the angle it was designed for (typically 0°, or normal incidence) has a profound effect on the Optical Density (OD) in the blocking band.

The most prominent effect of increasing the AOI is that the entire spectral response of the filter shifts toward shorter wavelengths. This is commonly known as a "blue shift."

Here is a breakdown of how this AOI change specifically affects the Optical Density in the blocking band:

1. Wavelength-Specific Loss of Optical Density

Because the entire transmission and blocking profile shifts to shorter wavelengths as the AOI increases, the OD at a specific target wavelength can change drastically.

  • If you are relying on the filter to provide a high OD (e.g., OD 6) to block a specific laser line at 1064 nm, an increased AOI will shift the filter's entire spectrum, blocking band included, toward the blue.
  • If the edge of a transmission band or a sideband ripple shifts onto your target wavelength, the OD at that wavelength plummets and light leaks through.
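As a rough sketch of this effect, the standard first-order angle-shift formula (shifted wavelength = design wavelength × sqrt[1 − (sin AOI / n_eff)²]) can be used to estimate when a notch filter's blocking band slides off a target laser line. The notch half-width, design wavelength, and effective index below are illustrative assumptions, not values for any specific filter:

```python
import math

def shifted_wavelength(design_nm, aoi_deg, n_eff):
    """First-order estimate of where a filter feature lands at non-normal AOI."""
    s = math.sin(math.radians(aoi_deg)) / n_eff
    return design_nm * math.sqrt(1.0 - s * s)

# Illustrative assumptions: a notch filter designed to block 1064 nm at 0 deg,
# with a deep-blocking (e.g., OD 6) region extending +/- 12 nm around the
# notch center, and an effective refractive index of 2.0.
DESIGN_NM = 1064.0
HALF_WIDTH_NM = 12.0
N_EFF = 2.0

for aoi in (0, 10, 20, 30):
    center = shifted_wavelength(DESIGN_NM, aoi, N_EFF)
    blocked = abs(DESIGN_NM - center) <= HALF_WIDTH_NM
    print(f"AOI {aoi:2d} deg: notch center ~{center:7.1f} nm, "
          f"1064 nm {'still blocked' if blocked else 'LEAKS'}")
```

With these assumed numbers, the 1064 nm line is still inside the (shifted) notch near 10°, but leaks by 20°, which is the practical meaning of "wavelength-specific loss of OD."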

2. Polarization Splitting (S and P Polarization)

At normal incidence (0°), light behaves identically regardless of its polarization state. However, as the AOI increases, the incident light splits into two orthogonal polarization states: s-polarization and p-polarization.

  • P-polarized light experiences a much larger blue shift than s-polarized light.
  • S-polarized light shifts less, but the edges of its transmission band often become narrower, and its blocking bands can behave differently.
  • Impact on OD: Because the two polarization states shift by different amounts, the sharp transition edges between the transmission band and the blocking band become "smeared" or broadened. If you are using unpolarized light, this splitting reduces the steepness of the filter edge, effectively lowering the OD in the transition zones right next to the passband.
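The polarization splitting above can be sketched with the same first-order shift formula by giving s- and p-polarization different effective indices. The index values below are illustrative assumptions for a hypothetical hard-coated filter (p typically sees a lower effective index and therefore shifts further):

```python
import math

def shifted_edge(design_nm, aoi_deg, n_eff):
    """First-order estimate of a band edge's wavelength at non-normal AOI."""
    s = math.sin(math.radians(aoi_deg)) / n_eff
    return design_nm * math.sqrt(1.0 - s * s)

# Assumed, illustrative effective indices:
N_EFF_S = 2.1   # s-polarization: higher effective index -> smaller blue shift
N_EFF_P = 1.8   # p-polarization: lower effective index -> larger blue shift

EDGE_NM = 1064.0  # hypothetical blocking-band edge at normal incidence

for aoi in (15, 30, 45):
    edge_s = shifted_edge(EDGE_NM, aoi, N_EFF_S)
    edge_p = shifted_edge(EDGE_NM, aoi, N_EFF_P)
    # For unpolarized light the effective edge is smeared across this gap,
    # degrading OD in the transition zone next to the passband.
    print(f"AOI {aoi} deg: s-edge ~{edge_s:.1f} nm, p-edge ~{edge_p:.1f} nm, "
          f"smear ~{edge_s - edge_p:.1f} nm")
```

The growing gap between the s and p edge positions at larger angles is the "smearing" described above.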

3. Degradation of Maximum Optical Density

Thin-film interference filters are constructed using alternating layers of high and low refractive index materials, optimized for a specific angle to create perfect destructive interference in the blocking regions.

  • When the angle changes, the optical path length through these microscopic layers changes.
  • This phase mismatch reduces the efficiency of the destructive interference. As a result, the absolute maximum depth of the blocking band can degrade at higher angles. A filter that achieves OD 7 at 0° might drop to a lower overall blocking efficiency at 45° due to these phase errors.

The Shift Formula

For reference, the shift in the center wavelength (and the corresponding shift of the blocking bands) can be estimated with the standard first-order formula:

Shifted Wavelength = Design Wavelength × sqrt[1 − (sin(AOI) / Effective Refractive Index)²]

where the effective refractive index is a single design-dependent parameter characterizing the coating stack, and the design wavelength is the feature's position at normal incidence.
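In code, the formula is a one-liner. The wavelength and effective index below are illustrative assumptions (effective indices of hard-coated filters typically fall somewhere between roughly 1.5 and 2.5, depending on the design):

```python
import math

def angle_shift(design_nm, aoi_deg, n_eff):
    """Shifted Wavelength = Design Wavelength * sqrt(1 - (sin(AOI)/n_eff)^2)."""
    s = math.sin(math.radians(aoi_deg)) / n_eff
    return design_nm * math.sqrt(1.0 - s * s)

# Example with assumed values: a 532 nm feature, effective index 2.0, AOI 30 deg.
print(round(angle_shift(532.0, 30.0, 2.0), 1))  # -> 515.1
```

Note that the formula only predicts where spectral features move; the polarization splitting and loss of blocking depth discussed above are on top of this shift.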

 
