J. Eur. Opt. Society-Rapid Publ., Volume 19, Number 1, 2023 (EOSAM 2022)
Article Number: 25
Number of pages: 6
DOI: https://doi.org/10.1051/jeos/2023024
Published online: 17 May 2023
Research Article
Two-wavelength digital holography through fog
1 Institute of Applied Optics, University of Stuttgart, Pfaffenwaldring 9, Stuttgart 70569, Germany
2 Institut für Lasertechnologien in der Medizin und Messtechnik, Helmholtzstrasse 12, Ulm 89081, Germany
* Corresponding author: alexander.groeger@ito.uni-stuttgart.de
Received: 15 February 2023
Accepted: 17 April 2023
Interferometric detection enables the acquisition of the amplitude and phase of the optical field. By making use of the synthetic wavelength as a computational construct arising from digital processing of two off-axis digital holograms, it is possible to identify the shape of an object obscured by fog and further increase the imaging range due to the increased sensitivity in coherent detection. Experiments have been conducted inside a 27 m long fog tube filled with ultrasonically generated fog. We show the improved capabilities of synthetic phase imaging through fog and compare this technique with conventional active laser illumination imaging.
Key words: Digital holography / Phase imaging / Imaging through scattering media / Two-wavelength holography / 3D shape measurement
© The Author(s), published by EDP Sciences, 2023
This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
1 Introduction
Digital holography is an interferometric detection technique which provides the complex-valued wavefront of an optical field transmitted or reflected by the object of interest. Numerical processing of the recorded interferogram yields the amplitude and phase of the optical field. Since a strong reference beam can modulate a weak object signal, interferometric detection benefits from a heterodyne gain which increases the signal-to-noise ratio [1]. This enables holography at the shot noise limit [2–4]. Soon after the invention of holography in the 1960s, its ability to image through scattering media was studied [5, 6]. Especially in applications where only weak illumination powers are allowed, or in scattering environments, the increased sensitivity of holographic imaging by far exceeds the limits of conventional imaging [1, 7–10]. Today, the major strategy for imaging through scattering media, especially in large-scale applications, is to isolate ballistic photons (i.e. photons travelling in a straight line through the scattering medium) by discriminating them against photons that are scattered and therefore experience a change in their physical properties (direction, polarization, coherence, etc.) [11–13]. Holography offers two main advantages here: on the one hand the signal amplification, and on the other hand coherence gating as an excellent filtering mechanism [6].

Furthermore, we can use the phase information provided by digital holography to measure the shape of an object obscured by fog. However, the optical wavelength needs to be synthetically enlarged in order to measure the shape of meso- and macro-scale objects. In addition, a surface roughness on the order of the optical wavelength produces large random phase changes in the reflected wave. To circumvent such problems, a synthetic wavelength

Λ = λ1 λ2 / |λ1 − λ2|  (1)

can be constructed using two digital holograms of the same object recorded with the different wavelengths λ1 and λ2. The synthetic wavelength can potentially be orders of magnitude larger than the optical wavelength. A similar mechanism can be observed in acoustics, where it is referred to as a beat (i.e. the interference pattern between two sounds of slightly different frequencies).

Two-wavelength techniques were proposed for shape measurements of three-dimensional objects having rough surfaces. In [14], it is shown how classical two-wavelength holography can be used for shape measurements. Digital holographic shape measurement was carried out using different types of lasers, e.g. by tuning the wavelength of a ruby laser [15]. The combination of results obtained by recording digital holograms at different wavelengths allows an unambiguous and fine reconstruction of the object shape (see [16] for more details). The method can also be used when the objects have discontinuities (i.e. steps). An implementation of lensless multi-wavelength digital holography with auto-calibration of temporal phase shifts and artificial wavelengths is presented in [17]. There are basically two different strategies to generate a synthetic wavelength hologram: the first is to record two holograms in two sequential acquisitions [15, 18], and the second is to record both holograms simultaneously in one frame [18–20]. Today, multi-wavelength digital holography is entering the realm of industrial applications for inline measurement [21].
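As a quick numerical check of equation (1) (a minimal sketch of our own, not part of the published processing), the wavelength differences used later in Sections 4.1 and 4.2 translate into synthetic wavelengths as follows:

```python
# Synthetic wavelength of Eq. (1): Lambda = lambda1 * lambda2 / |lambda1 - lambda2|
def synthetic_wavelength(lam1: float, lam2: float) -> float:
    """Both wavelengths in metres; returns the synthetic wavelength in metres."""
    return lam1 * lam2 / abs(lam1 - lam2)

lam1 = 780.0e-9                      # optical wavelength of the tunable laser (m)
for dlam in (4e-12, 1e-12):          # wavelength differences used in Sections 4.1 and 4.2
    lam2 = lam1 + dlam
    print(f"dlambda = {dlam * 1e12:.0f} pm -> Lambda = {synthetic_wavelength(lam1, lam2) * 100:.0f} cm")
    # prints roughly 15 cm and 61 cm, close to the approx. 15 cm and 60 cm quoted in the experiments
```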
One very important aspect of two-wavelength holography in combination with objects having a rough surface is the correlation of the speckle patterns of both wavelengths. How spectral correlations in the scattered wavefronts can be exploited to image an obscured object is shown in [22].
In this paper, we show that the synthetic wavelength obtained by processing two off-axis digital holograms recorded with two different wavelengths makes it possible to identify the shape of an object obscured by fog and further increases the imaging range due to the improved sensitivity of interferometric detection. The individual holograms are recorded sequentially using a tunable laser. In order to reduce speckle decorrelation, the optical setup should remain steady for the duration of the measurement.
To demonstrate the increased sensitivity of two-wavelength holographic imaging compared to conventional imaging, we perform comparative measurements inside a 27 m long tube with a diameter of 60 cm filled with ultrasonically generated fog.
In our experiments presented in [1], we showed that holographic imaging requires around 30 times less laser illumination power than conventional imaging (i.e. with the reference beam blocked in our setup) for the same imaging range through fog. Here we show that two-wavelength holography enables locating objects placed at different distances. Furthermore, by adjusting the wavelength difference, the technique allows contouring (shape measurement) of the object. Notice that in this case a phase unwrapping algorithm has to be applied to remove the 2π discontinuities in the phase map. We also observed that the contrast of the synthetic phase maps is improved compared to plain amplitude/intensity images, which allows easier recognition of the object located inside the scattering medium.
2 Experimental setup
We use the same experimental setup as described in our previous work [1]. The setup is illustrated in Figure 1.
Fig. 1 Off-axis digital holographic setup in image plane configuration. The object is located inside a tube with a length of 27 m filled with ultrasonically generated water mist.
The object, which is located inside a fog tube, is illuminated by a tunable continuous-wave laser (Toptica TA pro 780) with an adjustable optical output power of up to 4 W. The overall distance from camera to object is 30 m. The object is imaged onto the detector (SVS-VISTEK eco655MVGE in 8-bit mode with 2448 × 2050 px and a pixel size of 3.45 × 3.45 μm²) via a lens (L2) with a focal length of 450 mm. The clear aperture of L2 is 50 mm. The resulting small field of view of about 0.5° strongly suppresses multiply scattered light due to the small acceptance angle. To further reduce the detection of back-scattered light, the illumination beam is spatially separated from the imaging beam. The illumination cone generated by L1 is kept as small as possible, just big enough to properly illuminate the test objects. Overall, the amount of scattered light on the detector is small compared to the number of ballistic photons. Therefore, the fog in our experiment acts mainly as an absorber, essentially reducing the object signal strength. The light reflected from the object is superimposed with a reference beam to generate an interferogram on the detector. The coherence length of the laser used in the experiment is approx. 150 m. For stable operation, a polarization-maintaining single-mode fiber is used to guide the reference beam to the pupil plane of the imaging lens L2. The angle between object and reference beam is adjusted to achieve sufficiently high sampling of the interference fringes (see Fig. 3). This is referred to as an off-axis holographic setup in image plane configuration, which allows us to acquire holograms and conventional images directly one after the other by simply blocking the reference beam. In order to adjust the wavelength for each individual hologram required for synthetic phase imaging, we incorporate a stepper motor attached to the wavelength adjustment screw of the laser. The stepper motor is controlled by a host PC running a measurement script and is synchronized with the camera shutter. The time between both acquisitions is typically less than 0.5 s.
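To illustrate the fringe-sampling requirement mentioned above, the following back-of-envelope sketch (our own estimate based on the Nyquist criterion of at least two pixels per fringe, not a value stated in the paper) bounds the admissible off-axis angle for the given pixel pitch:

```python
import numpy as np

wavelength = 780e-9     # optical wavelength (m)
pixel_pitch = 3.45e-6   # detector pixel size (m)

# The fringe period on the sensor is approximately lambda / sin(theta); requiring
# at least two pixels per fringe (Nyquist) gives an upper bound on the angle
# between object and reference beam.
theta_max = np.degrees(np.arcsin(wavelength / (2 * pixel_pitch)))
print(f"Maximum off-axis angle for Nyquist-sampled fringes: {theta_max:.1f} deg")
# approx. 6.5 deg; in practice the angle is set below this limit so that the
# modulated object spectrum stays separated from the zero-order term.
```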
The scattering medium inside the fog tube is ultrasonically generated water mist. The fog density is continuously measured with a separate 780 nm laser diode and a power meter (Thorlabs PM160) in a one-way arrangement. The signal intensity of a light beam traveling the distance d through the scattering medium is reduced according to Lambert-Beer's law [23]:

I(d) = I0 e^(−εd),  (2)

where I0 is the initial signal intensity and ε is the wavelength- and medium-dependent attenuation coefficient. As described in [1], one attenuation length (AL) is the distance over which the signal strength is reduced by a factor of e^(−1). It is defined as:

AL = 1/ε.  (3)
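As a minimal sketch of this conversion (with made-up power readings, not measured data), a transmittance value of the monitoring laser can be expressed in attenuation lengths:

```python
import numpy as np

# Eq. (2): I(d) = I0 * exp(-eps * d). The number of attenuation lengths over the
# path d is therefore eps * d = ln(I0 / I), with one AL = 1/eps (Eq. (3)).
P_clear = 1.0e-3   # monitoring-laser power without fog (W), hypothetical value
P_fog   = 18.3e-6  # power measured through the fog (W), hypothetical value
d       = 27.0     # one-way path length through the fog tube (m)

n_AL = np.log(P_clear / P_fog)   # attenuation lengths accumulated along the path
eps  = n_AL / d                  # attenuation coefficient (1/m)
print(f"{n_AL:.1f} AL over {d:.0f} m, eps = {eps:.3f} 1/m, 1 AL = {1 / eps:.1f} m")
# with these example numbers: about 4.0 AL, comparable to the densest case in Fig. 5
```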
The fog density in our experimental results is represented in attenuation lengths. For our measurements, we use life-size styrofoam heads as test objects (see Fig. 2).
Fig. 2 Two life-size styrofoam heads used as test objects.
A hologram is given by the intensity of the interference between the wave O coming from the object under investigation and a reference wave R. According to [24], the resulting intensity can be expressed as

I_H = |O + R|² = |O|² + |R|² + O R* + O* R.  (4)
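The following toy simulation (our own sketch with illustrative parameters, not the experimental data) generates an intensity pattern according to equation (4) for a weak speckled object field and a tilted plane reference wave, similar in character to the hologram of Figure 3a:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 512
y, x = np.mgrid[0:N, 0:N]

# Weak object field: complex random speckle with |O| << |R|
O = 0.05 * (rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N)))

# Tilted plane reference wave giving an off-axis carrier of ~0.12 cycles/pixel
fx, fy = 0.12, 0.12
R = np.exp(2j * np.pi * (fx * x + fy * y))

# Eq. (4): I = |O + R|^2 = |O|^2 + |R|^2 + O R* + O* R; the last two terms
# carry the object information shifted to +/- the carrier frequency.
hologram = np.abs(O + R) ** 2
```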
Figure 3a shows a hologram (i.e. interferogram) recorded with one wavelength by using the arrangement described in Figure 1.
Fig. 3 (a) Digitally recorded hologram; (b) absolute value of the Fourier spectrum of (a) calculated using the FFT algorithm.
The object light reaching the detector is weak compared to the reference beam (|O| ≪ |R|), because the fog scatters most of the photons of the illumination beam. Even at low object signal intensities, interference occurs and the modulated object spectrum becomes visible in the Fourier domain (see Fig. 3b).
3 Data processing for holographic image reconstruction
The data processing is done in several steps. After recording, the spectra of the holograms are calculated, filtered, and inverse Fourier transformed. The resulting phase maps are subtracted from each other to obtain a difference phase map, which is then filtered and unwrapped.
3.1 Calculating the spectrum of a hologram
First, the 2D FFT of the digitally recorded hologram is calculated. The result contains the spectra of the object and reference signals as well as the object signal modulated by the reference signal. The modulated signal appears as two lobes symmetrically shifted away from the zero-frequency location. We refer to each of these lobes as the modulated object spectrum. The size of these lobes is proportional to the aperture of the imaging lens.
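A minimal sketch of this first step, applied to a recorded camera frame or to the simulated `hologram` above:

```python
import numpy as np

# Centred 2D FFT of the hologram. The spectrum contains the self-interference
# terms around zero frequency and the two modulated object lobes shifted to
# +/- the carrier frequency (cf. Fig. 3b).
spectrum = np.fft.fftshift(np.fft.fft2(hologram))
log_magnitude = np.log1p(np.abs(spectrum))   # compressed magnitude for display
```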
3.2 Filtering the object signal in the frequency domain
Secondly, a circular digital binary mask is used to isolate one modulated object spectrum (i.e. one lobe), which corresponds to spatial filtering. The mask diameter is chosen to match the diameter of the modulated object spectrum in order to maximize the object signal strength and spatial resolution. The filtered spectrum is shifted numerically to the zero-frequency point and an inverse 2D FFT is applied to obtain the complex amplitude, which contains the amplitude and phase images of the object.
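A sketch of this filtering step; the lobe position and radius below are illustrative values that would in practice be read off the spectrum (e.g. from a display such as Fig. 3b):

```python
import numpy as np

def filter_object_lobe(spectrum, lobe_center, lobe_radius):
    """Isolate one modulated object lobe, re-centre it and return amplitude and phase."""
    ny, nx = spectrum.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    mask = (yy - lobe_center[0]) ** 2 + (xx - lobe_center[1]) ** 2 <= lobe_radius ** 2

    # Shift the masked lobe so that its centre lands on the zero-frequency bin
    shifted = np.roll(spectrum * mask,
                      (ny // 2 - lobe_center[0], nx // 2 - lobe_center[1]),
                      axis=(0, 1))
    field = np.fft.ifft2(np.fft.ifftshift(shifted))
    return np.abs(field), np.angle(field)          # amplitude and wrapped phase

# Hypothetical lobe coordinates, roughly matching the simulated hologram above
amplitude, phase = filter_object_lobe(spectrum, lobe_center=(317, 317), lobe_radius=40)
```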
3.3 Calculating the phase map, filtering and unwrapping
Lastly, the reconstructed 2D phase maps of the two holograms, which correspond to the two different wavelengths employed, are subtracted to obtain the difference phase map. Fringes of equal phase represent the intersection of the longitudinal extent of the test scene surface with equidistant planes spaced by Λ/2. The obtained phase map usually contains speckle noise due to the use of coherent light sources. We reduce this noise by applying a sine/cosine average filter with a kernel size of 5 × 5 multiple times (130 iterations), as discussed in [25]. Furthermore, the phase map is wrapped into the interval (−π, π]. To resolve the 2π ambiguity, a phase unwrapping algorithm is applied [25].
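A sketch of the sin/cos averaging step (our own illustration, using SciPy's uniform mean filter as the 5 × 5 averaging kernel; the iteration count follows the value quoted above):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def sin_cos_filter(phase, kernel=5, iterations=130):
    """Smooth a wrapped phase map without destroying its 2*pi wraps."""
    for _ in range(iterations):
        s = uniform_filter(np.sin(phase), size=kernel)
        c = uniform_filter(np.cos(phase), size=kernel)
        phase = np.arctan2(s, c)   # recombine; result stays within the principal interval
    return phase
```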
3.4 Calculating the object shape from phase map
We denote by ϕ1(x, y) and ϕ2(x, y) the phase values obtained after processing the digital holograms recorded using the wavelengths λ1 and λ2, and their difference is

Δϕ(x, y) = ϕ1(x, y) − ϕ2(x, y),  (5)

where x and y denote pixel coordinates. From Δϕ and the synthetic wavelength Λ (see Eq. (1)), a height map h(x, y) (containing the shape) of the object can be calculated according to the relation

h(x, y) = (Λ / 4π) Δϕ(x, y).  (6)
Equation (6) is valid when the sample is illuminated with a plane wave parallel to the observation direction. If this is not the case (e.g. illumination with an off-axis spherical wave), a correction factor needs to be included. In our experiment the sample was located at a large distance (27 m) from the recording system, so the illumination can be regarded as approximately a plane wave. The angle between illumination and observation was less than 1°. Thus, we may determine the shape with good approximation using equation (6). Notice that the phase difference Δϕ(x, y) is wrapped (mod 2π) into the interval (−π, π] and needs further processing to obtain the unwrapped phase.
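A compact sketch combining equations (5) and (6) (our own illustration; `unwrap_2d` stands for any 2D phase unwrapper such as PUMA, and `sin_cos_filter` refers to the sketch in Section 3.3):

```python
import numpy as np

def height_map(phi1, phi2, synthetic_wavelength, unwrap_2d):
    """Shape of the object from two reconstructed phase maps, Eqs. (5) and (6)."""
    delta_phi = np.angle(np.exp(1j * (phi1 - phi2)))   # Eq. (5), rewrapped to the principal value
    delta_phi = sin_cos_filter(delta_phi)              # speckle reduction (Sect. 3.3)
    return synthetic_wavelength * unwrap_2d(delta_phi) / (4 * np.pi)   # Eq. (6)
```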
4 Experimental results
The presented technique enables the adjustment of the synthetic wavelength to the longitudinal size of a single object or of a scene consisting of several objects. To validate our holographic imaging setup, we conducted two different sets of measurements. In both experiments, the objects are located at the far end of the fog tube and thus approximately 30 m away from the recording system. First, we tested the shape measurement using only one of the life-size styrofoam heads in clear view (no fog). Afterwards, we performed shape measurements of a scene consisting of both heads at different fog densities.
4.1 Phase imaging of a single object in clear view
In order to reveal smaller details such as the ear, a smaller synthetic wavelength of approx. 15 cm is chosen (Δλ ≈ 4 pm). Since the resulting unambiguous depth range of Λ/2 ≈ 7.5 cm is smaller than the depth of the head, phase ambiguities occur. The measurement is shown in Figure 4.
Fig. 4 Imaging of one head in clear view: photograph (a); holographic reconstruction of one of the fundamental holograms (b); wrapped phase map of the synthetic hologram (c); sin-cos filtered wrapped phase map (d); unwrapped phase map (e); false-color image with scale (f).
The reconstructed amplitude in Figure 4b shows how the speckle pattern resulting from the coherent illumination causes rather poor image quality. Nevertheless, the image quality can be improved by using the synthetic phase instead. However, for the synthetic phase maps additional noise sources come into play. In particular, the decorrelation of the speckle patterns due to spectral and mechanical changes between the hologram acquisitions introduces significant noise, as shown in Figure 4c. To reduce the speckle noise present in the phase map, we apply a sin-cos filter as proposed in [25]. To resolve the phase ambiguities, we additionally apply a phase unwrapping algorithm (PUMA [25]). The resulting phase maps are presented in Figures 4e and 4f. The filtered phase map is free from high-frequency speckle noise. However, the steep regions in proximity to the nose, where abrupt phase changes occur, are distorted. A closer look at the unwrapped phase map around the nose shows that the unwrapping algorithm has difficulty producing reasonable results there. A direct comparison of Figures 4b and 4e demonstrates the benefit of synthetic phase measurement. The proposed technique is able to produce a smooth shape measurement containing even small details (e.g. the ear). If the requirement is to detect objects as parts of a whole scene, the synthetic wavelength can be enlarged to the longitudinal size of the scene and the phase unwrapping can be omitted.
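For readers without access to a PUMA implementation, scikit-image's reliability-sorting unwrapper can serve as a stand-in in a sketch like the following (`filtered_phase` and `synthetic_wavelength` are assumed to come from the processing steps of Section 3; results near steep regions such as the nose may differ from the graph-cut based PUMA):

```python
import numpy as np
from skimage.restoration import unwrap_phase

# Resolve the 2*pi ambiguities of the filtered synthetic phase map and convert
# the unwrapped phase to a height map via Eq. (6).
unwrapped = unwrap_phase(filtered_phase)                    # 2D array in radians
height = synthetic_wavelength * unwrapped / (4 * np.pi)     # height map in metres
```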
4.2 Phase imaging of a scene of objects through fog
In ref. [1], we demonstrated the increased sensitivity of holographic imaging through fog: conventional imaging with active illumination would require 30 times more laser illumination power for the same imaging range. In the following set of measurements, we demonstrate the capabilities of two-wavelength holography at weak object signals and compare the synthetic phase maps to conventional images. Phase imaging of multiple objects requires the ability to separate the individual objects and to quantify the longitudinal distance between them. We therefore adjusted the synthetic wavelength to approx. 60 cm (Δλ ≈ 1 pm), resulting in a clear separation of the objects in the synthetic phase map. Figure 5 shows the results of measuring the two heads through fog. The heads are longitudinally separated by approx. 20 cm. The fog gradually becomes denser towards the bottom, resulting in a very low signal-to-noise ratio. Here, the significant effect of the sin-cos filter, which eliminates most of the speckle noise, becomes clearly visible. A comparison between the reconstructed amplitude in Figure 5a and the filtered phase map in Figure 5c shows the advantage of synthetic phase imaging. The individual objects are clearly distinguishable from each other and from the background.
Fig. 5 Holographic measurements through fog at 4.0 AL: holographic reconstruction (a); synthetic phase map (b); sin-cos filtered synthetic phase map (c).
In Figure 6, we show comparative measurements of the presented technique and conventional imaging at different levels of fog density. The contrast of the conventional images is adjusted to improve visibility. Again, the vertical gradient of the fog density becomes visible. In the holographic images in Figure 6a, only the upper parts of both heads are recognizable. In the corresponding conventional image, virtually nothing is visible except the forward-scattered light of the fog density monitoring laser (bright spot on the right-hand side). The scene is correctly reproduced in the phase maps in Figures 6b and 6c. While in conventional imaging the objects become visible only in the third image (Fig. 6c), the filtered phase map already reveals more detailed structures such as the ear. The phase values (shown in gray) of both heads in Figure 6a differ from those in Figures 6b and 6c. This is due to slight deviations in the wavelength difference between consecutive measurements. The wavelength difference in Figure 6a is smaller than 1 pm, resulting in a larger synthetic wavelength.
Fig. 6 Filtered synthetic phase images, reconstructed amplitude and conventional images at different fog densities: dense fog (a); intermediate fog (b); light fog (c). The gray-scale values in the phase maps, ranging from −π to π, correspond to half the synthetic wavelength, i.e. approx. 300 mm.
5 Discussion and conclusion
In ref. [1], we already demonstrated that, due to the coherent gain, digital holography has improved sensitivity compared to conventional imaging. It thus allows retrieval of a shot-noise-limited image even when only few photons are reflected or transmitted by the sample. This makes it a suitable technique for applications where fog or smog reduces visibility. In this paper we demonstrated that the detection of actively illuminated objects hidden in fog can be further improved by processing two digital holograms recorded at two slightly different wavelengths (the wavelength difference is typically a few picometers). For our investigations, we used a tunable laser and the holograms were recorded sequentially, but they could also be recorded in one frame by using two lasers emitting at different wavelengths. Two lasers simultaneously emitting stable wavelengths (without drift of the wavelength difference) separated by a few picometers were not available for our investigations; for this reason, sequential recording was applied. Another way to obtain the two wavelengths for synchronous hologram recording would be to use an acousto-optic modulator as a frequency shifter. The integration time of the camera for the hologram recording in our experiment was around 500 μs. For the investigation of highly dynamic events (e.g. application of the technique as imaging assistance for automotive driving in difficult visibility conditions), it would be suitable to use pulsed lasers emitting at slightly different wavelengths. The resulting phase images provide the contours of each individual object as well as their longitudinal separation. Such data, in combination with deep learning, can potentially set new standards in object detection and situational awareness in challenging visibility conditions. In ongoing research projects we combine two-wavelength digital holography and deep learning.
Conflict of interest
The authors declare no conflict of interest.
Acknowledgments
The authors gratefully acknowledge financial support from the Baden-Württemberg Stiftung.
References
- Gröger A., Pedrini G., Claus D., Alekseenko I., Gloeckler F., Reichelt S. (2023) Advantages of holographic imaging through fog, Appl. Opt. 62, D68.
- Tippie A.E., Fienup J.R. (2012) Weak-object image reconstructions with single-shot digital holography, in: Biomedical Optics and 3-D Imaging, Optical Society of America, Washington, D.C., p. DM4C.5.
- Gross M., Atlan M. (2007) Digital holography with ultimate sensitivity, Opt. Lett. 32, 909.
- Verpillat F., Joud F., Atlan M., Gross M. (2010) Digital holography at shot noise level, J. Display Technol. 6, 455.
- Stetson K.A. (1967) Holographic fog penetration, J. Opt. Soc. Am. 57, 1060.
- Lohmann A., Shuman C. (1973) Image holography through convective fog, Opt. Commun. 7, 93.
- Dykes J., Nazer Z., Mosk A.P., Muskens O.L. (2020) Imaging through highly scattering environments using ballistic and quasi-ballistic light in a common-path Sagnac interferometer, Opt. Express 28, 10386.
- Locatelli M., Pugliese E., Paturzo M., Bianco V., Finizio A., Pelagotti A., Poggi P., Miccio L., Meucci R., Ferraro P. (2013) Imaging live humans through smoke and flames using far-infrared digital holography, Opt. Express 21, 5379.
- Marron J.C., Kendrick R.L., Thurman S.T., Seldomridge N.L., Grow T.D., Embry C.W., Bratcher A.T. (2010) Extended-range digital holographic imaging, in: Turner M.D., Kamerman G.W. (eds.), Laser Radar Technology and Applications XV, Vol. 7684, International Society for Optics and Photonics (SPIE), Washington, D.C., pp. 493–498.
- Kanaev A.V., Watnik A.T., Gardner D.F., Metzler C., Judd K.P., Lebow P., Novak K.M., Lindle J.R. (2018) Imaging through extreme scattering in extended dynamic media, Opt. Lett. 43, 3088.
- Dunsby C., French P. (2003) Techniques for depth-resolved imaging through turbid media including coherence-gated imaging, J. Phys. D: Appl. Phys. 36, R207.
- Kijima D., Kushida T., Kitajima H., Tanaka K., Kubo H., Funatomi T., Mukaigawa Y. (2021) Time-of-flight imaging in fog using multiple time-gated exposures, Opt. Express 29, 6453.
- Caimi F., Dalgleish F. (2010) Performance considerations for continuous-wave and pulsed laser line scan (LLS) imaging systems, J. Europ. Opt. Soc. Rap. Public. 5, 10020s.
- Friesem A.A., Levy U. (1976) Fringe formation in two-wavelength contour holography, Appl. Opt. 15, 3009.
- Pedrini G., Fröning P., Tiziani H.J., Gusev M.E. (1999) Pulsed digital holography for high-speed contouring that uses a two-wavelength method, Appl. Opt. 38, 3460.
- Wagner C., Osten W., Seebacher S. (2000) Direct shape measurement by digital wavefront reconstruction and multi-wavelength contouring, Opt. Eng. 39, 79.
- Carl D., Fratz M., Pfeifer M., Giel D.M., Höfler H. (2009) Multiwavelength digital holography with autocalibration of phase shifts and artificial wavelengths, Appl. Opt. 48, H1.
- Pedrini G., Alekseenko I., Jagannathan G., Kempenaars M., Vayakis G., Osten W. (2019) Feasibility study of digital holography for erosion measurements under extreme environmental conditions inside the International Thermonuclear Experimental Reactor tokamak, Appl. Opt. 58, A147.
- Kühn J., Colomb T., Montfort F., Charrière F., Emery Y., Cuche E., Marquet P., Depeursinge C. (2007) Real-time dual-wavelength digital holographic microscopy with a single hologram acquisition, Opt. Express 15, 7231.
- Claus D., Alekseenko I., Grabherr M., Pedrini G., Hibst R. (2021) Snap-shot topography measurement via dual-VCSEL and dual wavelength digital holographic interferometry, Light: Adv. Manuf. 2, 403.
- Fratz M., Seyler T., Bertz A., Carl D. (2021) Digital holography in production: an overview, Light: Adv. Manuf. 2, 283.
- Willomitzer F., Rangarajan P.V., Li F., Balaji M.M., Christensen M.P., Cossairt O. (2021) Fast non-line-of-sight imaging with high-resolution and wide field of view using synthetic wavelength holography, Nat. Commun. 12, 6647.
- Beer A. (1852) Bestimmung der Absorption des rothen Lichts in farbigen Fluessigkeiten, Ann. Phys. 162, 78.
- Gabor D. (1948) A new microscopic principle, Nature 161, 777.
- Valadão G., Bioucas-Dias J.M. (2007) PUMA: Phase Unwrapping via MAx flows, in: Proceedings of Conference on Telecommunications – ConfTele, Peniche, Portugal, pp. 609–612.