Bragg's law - Definition


Formulated by W.H. and W.L. Bragg in 1913, Bragg's law describes the diffraction of electromagnetic waves by a crystal. Diffraction is the bending and interference of waves when they encounter an obstacle that is not totally transparent to them. Bragg's law relates the distance separating the atomic planes of a crystal to the angles at which X-rays striking the crystal are most strongly diffracted. The law is expressed by the equation 2d sin θ = nλ, where "d" is the distance between two atomic planes of the crystal, "n" is the order of diffraction (a positive integer), "λ" is the wavelength of the incident radiation, and "θ" is the glancing angle between the incident ray and the atomic planes.
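As a minimal numerical sketch of the equation, the snippet below solves 2d sin θ = nλ for the angle θ; the helper name and the sample values (Cu Kα radiation, λ ≈ 1.5406 Å, and a plane spacing d = 3.135 Å) are illustrative assumptions, not part of the original text.

```python
import math

def bragg_angle(d, wavelength, n=1):
    """Glancing angle theta (in degrees) satisfying 2*d*sin(theta) = n*lambda.

    d and wavelength must share the same length unit (e.g. angstroms).
    Returns None when no real solution exists (n*lambda > 2*d).
    """
    s = n * wavelength / (2 * d)
    if s > 1:
        return None  # this diffraction order cannot be observed for this spacing
    return math.degrees(math.asin(s))

# Illustrative example: Cu K-alpha X-rays (lambda = 1.5406 angstroms)
# on atomic planes spaced d = 3.135 angstroms, first-order diffraction (n = 1).
theta = bragg_angle(3.135, 1.5406, n=1)  # roughly 14 degrees
```

Because sin θ can never exceed 1, only a finite number of orders n produce a diffracted beam for a given spacing and wavelength, which the `None` return makes explicit.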
Published by Jeff. Latest update on October 29, 2013 at 07:16 AM by Jeff.
This document, titled "Bragg's law - Definition," is available under the Creative Commons license. Any copy, reuse, or modification of the content should be sufficiently credited to CCM Health.