Most optical under-display fingerprint scanning systems use light emitted by the device's display to illuminate the user's fingertip; that light reflects off the fingerprint and travels back through tiny openings between the display pixels. A sensor beneath the display can then read the fingerprint and authenticate the user.
Due to the "low light throughput and diffraction" caused by the display stack, the fingerprint image is liable to suffer from low contrast and low signal-to-noise ratio, making it harder to read the fingerprint and potentially increasing the time it takes to authenticate a user.
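To see why low throughput and stray light degrade the capture, the short Python sketch below computes Michelson contrast and a simple signal-to-noise figure for an idealized ridge pattern. The throughput, stray-light, and noise values are illustrative assumptions, not figures from the patent.

```python
import numpy as np

# Idealized ridge/valley pattern, values in 0..1 (full contrast at the finger)
x = np.linspace(0, 10 * np.pi, 2000)
ridges = 0.5 + 0.5 * np.sin(x)

throughput = 0.03    # fraction of signal light surviving the display stack (assumed)
stray_light = 0.05   # diffracted/scattered background carrying no fingerprint detail (assumed)
read_noise = 0.01    # fixed sensor noise floor (assumed)

captured = throughput * ridges + stray_light   # what reaches the sensor, before noise

# Michelson contrast: 1.0 at the finger, much lower once the background is added
contrast = (captured.max() - captured.min()) / (captured.max() + captured.min())
# Simple SNR figure: mean fingerprint signal over the fixed noise floor
snr = (throughput * ridges).mean() / read_noise

print(f"contrast ~ {contrast:.2f}, SNR ~ {snr:.1f}")
```

With these assumed numbers the contrast drops from 1.0 to roughly 0.2 and the SNR sits near 1.5, which is the kind of degradation the patent attributes to the display stack.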
To overcome this problem, Apple proposes a system in which the off-axis angular light from the finger is captured via a series of "angle-dependent filtering options between the display and the sensor." This method can "improve the contrast of fingerprint impressions and maintain the compactness of the entire sensing system," according to Apple.
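As a rough illustration of the idea (an interpretation of the filed description, not Apple's actual implementation), the sketch below models fingerprint-bearing light as arriving off-axis and stray display light as arriving near normal incidence; an angular band-pass filter then sharply improves the ratio of signal to background. All angular distributions and cutoffs are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n_rays = 100_000

# Assumed angular distributions (degrees from the sensor normal):
signal_angles = rng.normal(30, 5, n_rays)      # fingerprint-bearing light arriving off-axis
background_angles = rng.normal(0, 10, n_rays)  # stray display light near normal incidence

def angular_filter(angles, lo=20.0, hi=40.0):
    """Idealized angle-dependent filter: transmit only rays whose angle lies in [lo, hi] degrees."""
    a = np.abs(angles)
    return (a >= lo) & (a <= hi)

# Signal-to-background ratio before and after the filter (equal ray counts, so 1.0 before)
before = 1.0
after = angular_filter(signal_angles).sum() / max(angular_filter(background_angles).sum(), 1)

print(f"signal-to-background, unfiltered: {before:.1f}")
print(f"signal-to-background, angle-filtered: {after:.1f}")
```

In this toy model the filter rejects most of the on-axis background while passing nearly all of the off-axis signal, which is the mechanism by which angle-dependent filtering could raise contrast without adding optical thickness.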