What does nm mean in infrared?

Nanometer in Infrared Context

Nanometer (nm) is a unit of length in the metric system equal to one billionth of a meter (1 nm = 10⁻⁹ m). In the context of infrared (IR) radiation, it is used to specify the wavelength of the light. Infrared radiation is a type of electromagnetic radiation, a family that also includes visible light, ultraviolet light, and X-rays, among others.
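
To make the unit concrete, here is a minimal Python sketch (the constant values and function names are purely illustrative, not taken from any particular library) that converts a wavelength from nanometers to meters and derives the corresponding frequency from c = λf:

```python
# Illustrative constants (assumed values, rounded)
SPEED_OF_LIGHT = 2.998e8   # meters per second
NM_PER_METER = 1e9         # 1 m = 10^9 nm, so 1 nm = 10^-9 m

def nm_to_meters(wavelength_nm: float) -> float:
    """Convert a wavelength from nanometers to meters."""
    return wavelength_nm / NM_PER_METER

def frequency_hz(wavelength_nm: float) -> float:
    """Frequency of light with the given wavelength, using c = lambda * f."""
    return SPEED_OF_LIGHT / nm_to_meters(wavelength_nm)

# Example: 700 nm, roughly the visible/infrared boundary
print(nm_to_meters(700))    # 7e-07 m
print(frequency_hz(700))    # about 4.28e14 Hz
```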

IR wavelengths range from about 700 nm to 1 millimeter (mm), placing infrared just beyond the red end of the visible spectrum (roughly 380 nm to 750 nm). The IR spectrum is commonly divided into three regions: near-infrared (NIR), mid-infrared (MIR), and far-infrared (FIR), with near-infrared closest to visible light and far-infrared closest to the microwave range. Exact boundary values vary by convention; the ranges below are one common division, and a short code sketch after the list shows how a wavelength maps onto these regions.

  • Near-Infrared (700 nm to 1400 nm): Primarily used for optical communications, sensing, and imaging.
  • Mid-Infrared (1400 nm to 3000 nm): Used in thermal imaging, heat sensing, and spectroscopy for chemical analysis.
  • Far-Infrared (3000 nm to 1 mm): Useful for thermal detectors, astronomy, and weather forecasting applications.
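
The region boundaries above translate directly into a simple lookup. The following minimal Python sketch, assuming the three-region division used in this post (other sources draw the lines differently), classifies a wavelength given in nanometers:

```python
def classify_ir_nm(wavelength_nm: float) -> str:
    """Map a wavelength in nm to the IR regions listed above."""
    if wavelength_nm < 700:
        return "not infrared (shorter than ~700 nm)"
    if wavelength_nm <= 1400:
        return "near-infrared (NIR)"
    if wavelength_nm <= 3000:
        return "mid-infrared (MIR)"
    if wavelength_nm <= 1_000_000:   # 1 mm = 1,000,000 nm
        return "far-infrared (FIR)"
    return "not infrared (longer than 1 mm)"

# Example: 1550 nm, a common fiber-optic telecom wavelength
print(classify_ir_nm(1550))  # mid-infrared (MIR) under this post's division
```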

Because infrared lies outside the visible spectrum, it cannot be seen by the human eye, but much of it can be felt as heat. Devices such as infrared cameras, remote controls, and fiber-optic communication systems rely on infrared radiation. Precise measurement and characterization of IR wavelengths in nanometers is crucial when designing optical systems and instruments that operate in this part of the electromagnetic spectrum.
