Formulated by W.L. Bragg and W.H. Bragg in 1913, Bragg's law describes the diffraction of electromagnetic waves by a crystal. Diffraction is the bending and interference of waves when they encounter an obstacle or aperture comparable in size to their wavelength. Bragg's law establishes a relation between the spacing of the atomic planes in a crystal and the angles at which X-rays striking the crystal are most strongly scattered. The law is expressed by the equation 2d sin a = nλ, where "d" is the distance between two adjacent atomic planes of the crystal, "n" is a positive integer giving the order of diffraction, "λ" is the wavelength of the incident radiation, and "a" is the angle between the incident ray and the atomic planes.
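As a small illustration of the relation 2d sin a = nλ, the sketch below solves for the diffraction angle given a plane spacing and wavelength. The specific values used (Cu Kα radiation at about 1.5406 Å and an NaCl plane spacing of about 2.82 Å) are common textbook figures chosen here only for demonstration:

```python
import math

def bragg_angle(d, wavelength, n=1):
    """Return the Bragg angle in degrees for diffraction order n,
    or None if no diffraction occurs at that order.

    d and wavelength must be in the same length units (e.g. angstroms).
    """
    # From 2 d sin(a) = n * lambda, solve for sin(a).
    s = n * wavelength / (2 * d)
    if s > 1:
        # n * lambda exceeds 2d: the sine would exceed 1,
        # so no diffraction maximum exists for this order.
        return None
    return math.degrees(math.asin(s))

# Example values (illustrative): Cu K-alpha X-rays on NaCl (200) planes.
angle = bragg_angle(d=2.82, wavelength=1.5406)
print(angle)  # roughly 16 degrees for the first-order maximum
```

Note that higher orders (larger n) require larger angles, and once nλ exceeds 2d no diffraction maximum can occur, which is why visible light (λ of thousands of angstroms) cannot be diffracted by crystal planes a few angstroms apart.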