Bragg's law was introduced by Sir W. H. Bragg and his son W. L. Bragg. The law states that when a beam of X-rays of wavelength λ enters a crystal, the maximum intensity of the reflected ray occurs when
nλ = 2d sin θ
where θ is the glancing angle between the incident ray and the crystal planes (the complement of the conventional angle of incidence), n is a positive integer, and d is the distance between the layers of atoms.
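As a quick numerical illustration, the short Python sketch below solves Bragg's law for θ given λ, d, and n. The wavelength and interplanar spacing used here (roughly the Cu Kα line and the NaCl (200) spacing) are assumed illustrative values, not taken from the text.

import math

def bragg_angle(wavelength, d_spacing, n=1):
    """Glancing angle theta (in degrees) satisfying n*lambda = 2*d*sin(theta)."""
    ratio = n * wavelength / (2 * d_spacing)
    if ratio > 1:
        # No real solution: the n-th order reflection does not occur.
        raise ValueError("no reflection: n*lambda exceeds 2*d")
    return math.degrees(math.asin(ratio))

wavelength = 154.1e-12  # lambda ~ 154.1 pm (Cu K-alpha, assumed for illustration)
d_spacing = 282.0e-12   # d ~ 282 pm (NaCl (200) planes, assumed for illustration)
for n in (1, 2, 3):
    print(f"n={n}: theta = {bragg_angle(wavelength, d_spacing, n):.2f} degrees")

Each successive order n gives a larger glancing angle, up to the point where nλ exceeds 2d and no reflection is possible.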
Consider a single crystal with parallel planes of lattice points separated by a distance d. Two parallel monochromatic X-rays strike the crystal at a glancing angle θ: one is reflected at a point A₁ on the top plane, the other at the point A₂ directly below it on the next plane, so that A₁A₂ = d. Both rays are reflected at the same angle θ. Drop perpendiculars from A₁ onto the incident and reflected portions of the lower ray, meeting them at points B and C respectively.
The ray reflected at A₂ travels farther than the ray reflected at A₁ by the extra path
BA₂ + A₂C
For the two reflected rays, which travel parallel to each other, to interfere constructively, this extra distance must equal an integral multiple n of the wavelength λ:
nλ = BA₂ + A₂C
Now consider the right triangle A₁BA₂, right-angled at B. The angle at A₁ equals θ, so sin θ is the ratio of the opposite side BA₂ to the hypotenuse A₁A₂ = d, and by the symmetry of incidence and reflection A₂C = BA₂. Hence,
BA₂ = d sin θ
A₂C = d sin θ
Substituting these into the interference condition gives nλ = 2d sin θ, which is Bragg's law.
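As a sanity check on the geometry, the following minimal sketch (with arbitrary illustrative values for d and θ) computes BA₂ and A₂C as projections of A₁A₂ onto the incident and reflected ray directions and confirms that their sum equals 2d sin θ.

import math

def path_difference(d, theta_deg):
    """Extra path BA2 + A2C travelled by the ray reflected on the lower plane."""
    theta = math.radians(theta_deg)
    a1 = (0.0, d)       # lattice point on the upper plane
    a2 = (0.0, 0.0)     # lattice point directly below, on the lower plane
    incident = (math.cos(theta), -math.sin(theta))   # ray travelling down at glancing angle theta
    reflected = (math.cos(theta), math.sin(theta))   # ray travelling up at the same angle
    # B and C are the feet of the perpendiculars from A1 onto the two rays,
    # so the segment lengths are projections of A1A2 onto the ray directions.
    v = (a1[0] - a2[0], a1[1] - a2[1])
    ba2 = abs(v[0] * incident[0] + v[1] * incident[1])
    a2c = abs(v[0] * reflected[0] + v[1] * reflected[1])
    return ba2 + a2c

d, theta_deg = 1.0, 25.0    # arbitrary illustrative values
print(path_difference(d, theta_deg))              # extra path travelled
print(2 * d * math.sin(math.radians(theta_deg)))  # 2 d sin(theta): should match

Both printed values agree, confirming that each perpendicular-foot segment contributes d sin θ to the path difference.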