Introduction
Diffraction patterns are a common sight on mobile phone screens, appearing as colored or shimmering effects that vary with the lighting environment. This optical phenomenon is caused by the wave nature of light and the finite size of the pixels; it can degrade image quality, particularly on high-resolution displays, and become a visual distraction.
In computer graphics (CG), the interaction of light with matter is usually described by geometric optics. Phenomena belonging to the wave-optics domain, such as iridescence or the diffraction seen on the surface of a compact disc, are therefore difficult to simulate. To reproduce the iridescent effect caused by the diffraction of light on a reflective surface, we derive a geometric BRDF model that allows diffraction effects to be simulated in a bidirectional tracer otherwise governed by geometric optics. This geometric model reproduces the diffraction effects based on measurements.
In the following sections we describe the approach, the measurement procedure, and the Ocean BSDF node.
General Principles
Pixel diffraction: The optical phenomenon that occurs when light interacts with the edges of individual pixels on a display screen or imaging sensor
Wave Nature of Light: Pixel diffraction is a consequence of the wave nature of light. When light waves encounter an obstacle or aperture, such as the edges of pixels, they can bend or spread out, creating interference patterns.
Interference Patterns: As light passes through or around the edges of pixels, it interferes with itself, resulting in the formation of patterns. These patterns can manifest as lines, rings, or other geometric shapes, and they can produce various optical effects.
Appearance of Patterns: The patterns generated by pixel diffraction can appear as colored or shimmering effects on a display, particularly when viewing fine details, small text, or high-resolution images. These effects can be more pronounced on screens with smaller pixels or when the viewer is close to the display.
Pixel Size and Resolution: The degree of pixel diffraction is influenced by the size of the pixels and the resolution of the display or imaging sensor. Smaller pixels and higher resolutions can lead to more pronounced diffraction patterns.
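The dependence on pixel size can be illustrated with the grating equation d·sin θ = m·λ, treating the pixel array as a periodic grating. The sketch below (pitch values are illustrative, not measurements of any real display) shows that halving the pitch doubles the first-order diffraction angle:

```python
import math

def first_order_angle_deg(pitch_um: float, wavelength_nm: float) -> float:
    """First-order (m = 1) diffraction angle for a periodic pixel array,
    from the grating equation d * sin(theta) = m * lambda."""
    d = pitch_um * 1e-6          # pixel pitch in metres
    lam = wavelength_nm * 1e-9   # wavelength in metres
    return math.degrees(math.asin(lam / d))

# Illustrative pitches: a coarse and a fine display (example values only).
for pitch_um in (50.0, 25.0):
    print(pitch_um, round(first_order_angle_deg(pitch_um, 550.0), 2))
```

Smaller pixels push the diffracted orders to larger angles, which is why the effect is more visible on high-resolution screens.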
Modelling Approach
Our approach is mainly based on the paper by Antoine Toisoul and Abhijeet Ghosh, "Practical Acquisition and Rendering of Diffraction Effects in Surface Reflectance", ACM Trans. Graph. 36, 5, Article 166 (October 2017), https://doi.org/10.1145/3012001. For an exhaustive description, please refer to the paper.
The model is based on a texture/look-up table used to derive a diffraction pattern. It exploits the Huygens-Fresnel principle, in which the diffracted field is expressed as a sum of elementary wavelets (see Section 4.1 of the paper cited above for more details).

Figure 1 - Diagram of the production of the diffraction phenomenon on a screen by reflection on a surface. From A. Toisoul and A. Ghosh.
We won’t go into all the calculation steps here. However, the contribution of diffraction to the BRDF (with ωi and ωo as incident and outgoing directions) can be written schematically as:

f_diff(ωi, ωo, λ) = F(ωi, ωo) · G(ωi, ωo) · D(ωi, ωo, λ)

Here we find the function F, which corresponds to the Fresnel term (the intensity reflected by the pattern), and the geometric term G of Stam’s model [Diffraction Shaders, 1999].
The function D we are interested in is called the diffracted irradiance.
We are particularly interested in its formulation, which shows that the reflected radiance pattern P_λ0 is captured at a master wavelength λ0 and that a dispersion power p is introduced around this wavelength in a given interval [λmin, λmax].
In summary, the key elements that drive the diffraction BSDF are:
- P_λ0: diffraction pattern at normal incidence (view from above).
- λ0: master wavelength at which this pattern was obtained.
- p: dispersion power around this wavelength (which extrapolates the spectral behavior).
- [λmin, λmax]: spectral interval of validity for this pattern.
- R(λ): total reflected intensity spectrum.
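To make the role of these elements concrete, here is a toy evaluation of a diffraction lobe built from a pattern lookup. This is a simplified sketch, not Ocean's actual implementation; the function names and the nearest-neighbour lookup are illustrative assumptions.

```python
import math

def diffracted_brdf(pattern, theta_max_deg, wavelength_nm,
                    lambda0_nm, dispersion_p, u, v,
                    fresnel=1.0, geom=1.0):
    """Toy sketch of f = F * G * D: D is read from the pattern captured at
    the master wavelength lambda0, with the lookup coordinates rescaled by
    (lambda0/lambda)**p to model dispersion. `pattern` is a square
    list-of-lists covering [-theta_max, +theta_max] degrees on both axes;
    (u, v) are angles in degrees around the specular peak.
    All names here are illustrative, not Ocean's API."""
    scale = (lambda0_nm / wavelength_nm) ** dispersion_p
    su, sv = u * scale, v * scale          # dispersion shifts the lookup
    n = len(pattern)
    def to_index(angle_deg):               # map angle -> row/column index
        t = (angle_deg + theta_max_deg) / (2.0 * theta_max_deg)
        return min(n - 1, max(0, int(round(t * (n - 1)))))
    d = pattern[to_index(sv)][to_index(su)]
    return fresnel * geom * d

# Tiny 3x3 pattern over +-9 degrees with a bright central peak:
pat = [[0.0, 0.1, 0.0],
       [0.1, 1.0, 0.1],
       [0.0, 0.0, 0.0]]
print(diffracted_brdf(pat, 9.0, 550.0, 550.0, 1.0, 0.0, 0.0))  # central peak
```

At the master wavelength the scale factor is 1 and the pattern is read unchanged; at other wavelengths the lookup is stretched or compressed around the central peak.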
Input Data
As explained above, there are two main inputs for simulating a diffraction pattern: the pattern itself and the reflected intensity.
Reflective intensity:
For reflected intensity, total reflectance is measured using an integrating sphere to recover all the intensity reflected by the diffractive surface.

Figure 2 - Diagram of an integrating sphere in "total reflectance" configuration
It is best to carry out this measurement with angular dependence if the device supports it. However, in Ocean it is still possible to extrapolate a single measurement at normal incidence using Schlick’s approximation, from Christophe Schlick, “An Inexpensive BRDF Model for Physically-based Rendering”, Computer Graphics Forum 13 (1994).
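Schlick's approximation extrapolates a reflectance measured at normal incidence to any incidence angle. A minimal sketch (the 1% normal-incidence value is only an example):

```python
import math

def schlick_reflectance(r0: float, theta_deg: float) -> float:
    """Schlick's approximation: R(theta) = R0 + (1 - R0) * (1 - cos(theta))^5,
    extrapolating the normal-incidence reflectance r0 to angle theta."""
    c = math.cos(math.radians(theta_deg))
    return r0 + (1.0 - r0) * (1.0 - c) ** 5

print(schlick_reflectance(0.01, 0.0))   # normal incidence -> r0
print(schlick_reflectance(0.01, 80.0))  # grazing incidence -> rises toward 1
```

The single measured value fixes R0; the angular falloff then comes entirely from the approximation.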
Diffraction pattern
All the parameters listed previously are intended to allow several approaches to using a diffraction pattern. It is not possible to be exhaustive about the ways of acquiring such a pattern; moreover, each acquisition method imposes certain parameters.
We will describe the impact of the different parameters to illustrate how to obtain a pattern.
Firstly, we identified three methods of obtaining a diffraction pattern, each with its advantages and disadvantages:
- Empirical generation: with a Python script, we periodically position peaks (e.g. Gaussians) with different intensities and save this distribution as an .exr file.
- Fast & flexible
- Not predictive
- Simulated method: using numerical methods (FDTD, Python, MATLAB, etc.), we calculate the diffraction pattern for a given wavelength. This distribution is saved in .exr format.
- Requires a diffraction solver
- Predictive
- Measuring method: by capturing the display’s reflection, we obtain a projection of the diffraction pattern. This pattern is then placed in the correct frame of reference (angle around the central peak), and the image is cleaned of noise (threshold, low-pass filter, etc.) before being saved as .exr.
- Lots of post-processing
- Predictive
The computational methods (1 & 2) offer the freedom to define the maximum diffraction angle, the desired wavelengths, and a certain level of control over the relative intensities between peaks.
The measured method (3) requires a certain amount of experimental equipment, particularly for spectral filters, and extensive post-processing to achieve a good-quality distribution (noise, intensity, dynamic range, etc.). The experimental setup directly constrains the parameters: wavelength, pattern size, etc.
For the rest of this article, we’re going to follow method (1) so that we can share all the elements.
Diffraction Map in Ocean
Reflective intensity curve
Here, we have generated a reflection spectrum with three Gaussian peaks centered on different wavelengths (410 nm, 555 nm & 620 nm), which reproduce the reflection of blue, green & red LEDs respectively. The maximum intensity has been arbitrarily set to 1%. For the angular behavior, we used the Schlick interpolation available in Ocean.

Figure 3 - Left: Reflection spectrum for three different LEDs

Figure 4 - Right: Angular reflection curve at 550 nm.
We therefore entered these reflection curves in an avspectrum node.
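The spectrum described above can be sketched as follows. The Gaussian width is an assumption (the article does not state one); only the peak centers and the 1% cap come from the text:

```python
import math

def led_reflection_spectrum(wavelengths_nm, peak=0.01, sigma_nm=15.0,
                            centers_nm=(410.0, 555.0, 620.0)):
    """Sum of three Gaussian peaks mimicking blue, green and red LED
    reflections, capped at `peak` (1% here). sigma_nm is an assumed width."""
    spectrum = []
    for lam in wavelengths_nm:
        r = sum(peak * math.exp(-0.5 * ((lam - c) / sigma_nm) ** 2)
                for c in centers_nm)
        spectrum.append(min(r, peak))
    return spectrum

wl = [380.0 + 5.0 * i for i in range(81)]   # 380..780 nm in 5 nm steps
spec = led_reflection_spectrum(wl)
print(max(spec))  # capped at 0.01 (1%) at the peak centers
```

The resulting table of (wavelength, reflectance) pairs is the kind of curve entered into the avspectrum node.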
Diffraction pattern
For this part, we generated with Python a periodic pattern on a triangular raster, with a broader central peak at maximum intensity. Below, a view of the map (right) and a cross-section at y = 0 (left).

Figure 5 - Left: cross-section of the intensity at y = 0. Right: intensity map in false color (blue to red); the red line marks y = 0.
As can be seen from the graph, the pattern was generated from -9° to +9°. This distribution is in fact a plan view of a hemispherical projection.
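A simplified version of the generation script is sketched below. For brevity it places the peaks on a square lattice rather than the triangular raster used for the article's map, all sizes and widths are illustrative, and the final write to .exr (e.g. with imageio) is omitted:

```python
import math

def gaussian_peak_map(size=101, theta_max=9.0, period=3.0,
                      sigma=0.25, central_sigma=0.75):
    """Periodic Gaussian peaks over [-theta_max, +theta_max] degrees on both
    axes, with a broader, full-intensity central peak. Simplified sketch:
    square lattice instead of the article's triangular raster."""
    axis = [-theta_max + 2.0 * theta_max * i / (size - 1) for i in range(size)]
    img = [[0.0] * size for _ in range(size)]
    for j, y in enumerate(axis):
        for i, x in enumerate(axis):
            # nearest lattice peak (period in degrees)
            px = round(x / period) * period
            py = round(y / period) * period
            central = (px == 0.0 and py == 0.0)
            s = central_sigma if central else sigma   # central peak is wider
            amp = 1.0 if central else 0.2             # and at max intensity
            r2 = (x - px) ** 2 + (y - py) ** 2
            img[j][i] = amp * math.exp(-0.5 * r2 / s ** 2)
    return img

img = gaussian_peak_map()
print(img[50][50])  # central peak -> 1.0
```

Each pixel of the map corresponds to a pair of angles around the central peak, which is exactly the angle domain shown in Figure 6.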

Figure 6 - This diagram shows that the map corresponds to the angle domain for the pattern. By varying θmax, we spread or narrow the pattern around the central peak perceived by an observer.
As we saw in the previous section, we generated a complete spectrum for the three LEDs (blue, green, red). We therefore assume that a single map is sufficient to represent the pattern and we have chosen a “central” wavelength, arbitrarily 550nm, that will be our master wavelength. This assumes that the spectral range to be considered represents the entire reflection spectrum: 380 to 780 nm.
This strong assumption implies that the pattern does not change whatever the wavelength, i.e. the type of LED. In practice, the positioning and shape of the LEDs will cause the pattern to vary for different wavelengths.
The last criterion corresponds to the dispersion power. As a reminder, this spreads a peak as a function of wavelength. It is necessary when the spectral dependence is incomplete (as is the case here). If the user can obtain one map per wavelength and wishes to follow that procedure, the dispersion power is unnecessary and should be set to zero.

Figure 7 - From left to right, the power of dispersion is increased.
As can be seen in the image above, the green peak hardly moves at all. This is because we defined our master wavelength at 550 nm, which corresponds to the green peak; the spectrum is therefore spread around the green.
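The behavior of the dispersion power can be illustrated with a toy formula, assumed here for illustration: a peak captured at angle θ0 for the master wavelength reappears at θ0·(λ/λ0)^p for another wavelength λ.

```python
def dispersed_peak_angle(theta0_deg, wavelength_nm, lambda0_nm=550.0, p=1.0):
    """Illustrative dispersion rule: theta(lambda) = theta0 * (lambda/lambda0)**p.
    At the master wavelength (or with p = 0) the peak does not move."""
    return theta0_deg * (wavelength_nm / lambda0_nm) ** p

for lam in (410.0, 555.0, 620.0):   # blue, green, red LED wavelengths
    print(lam, round(dispersed_peak_angle(3.0, lam), 3))
```

With p = 1 the blue peak moves inward and the red peak outward, while the green peak at 555 nm, being close to the 550 nm master wavelength, barely shifts; with p = 0 nothing moves at all.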

Figure 8 - Interface for the diffraction node in Ocean. The parameters appear when BRDF is selected. The reflection spectrum is in intlaw (Angle spectrum er), the map is in 'map' (External file).
The parameters explained previously can be found in Ocean’s graphical interface:
- θmax = 9°: maximum angle, defined by the map-generation Python script
- dispersion_power = 1.0: free parameter for the spectral spreading of the peaks
- lambda_map = 550 nm: wavelength corresponding to the diffraction pattern, defined arbitrarily here
- lambda_min = 380 nm: minimum wavelength of the pattern, defined here in the visible range
- lambda_max = 780 nm: maximum wavelength of the pattern, defined here in the visible range
Examples of pixel diffraction patterns and simulations generated with Ocean™
Explore various diffraction patterns such as:


The images below represent three simulations of reflective diffraction on a phone with a user-defined diffraction pattern (.exr image) at different dispersion powers.
The following images show the same reflective diffraction configuration with a different diffraction pattern map:


Conclusion
In Ocean, we’ve introduced an advanced tool that accurately simulates diffraction patterns observed on phone screens. This tool permits users to customize patterns by specifying their intensity in different regions. What distinguishes this tool is its capacity for spectral diffraction, even without complete spectral data. This addition leverages Ocean’s technical capabilities, enabling users to delve into diffraction phenomena and study material interactions with precision.