J. Eur. Opt. Society-Rapid Publ.
Volume 19, Number 2, 2023
Article Number: 44
Number of pages: 10
DOI: https://doi.org/10.1051/jeos/2023042
Published online: 07 December 2023
Research Article
Three-dimensional temperature field measurement method based on light field colorimetric thermometry
School of Optoelectronic Engineering, Xidian University, Xi’an, Shaanxi 710071, China
^{*} Corresponding author: yuanying@xidian.edu.cn
Received: 4 October 2023
Accepted: 10 November 2023
This paper proposes a novel method for three-dimensional (3D) temperature measurement using light field colorimetric thermometry, aiming to overcome the challenges associated with the intricate system structure and the limited availability of 3D information in traditional radiation temperature measurement methods. Firstly, the correlation between corresponding image points and the positions of the 3D object in the light field imaging system is analyzed using the ray tracing method. The 3D position acquisition model and the light field colorimetric thermometry model are established, enabling simultaneous acquisition of the spatial coordinates and radiometric information of the 3D object. Then, a radiation calibration experiment of the light field camera is conducted, and the 3D temperature field is obtained by applying colorimetric thermometry to each corresponding image point of the same object point. Finally, a light field camera is employed to measure and reconstruct the temperature of candle flames. The accuracy of the temperature measurement is 3.31%, confirming the feasibility and effectiveness of the proposed method.
Key words: Temperature measurement / Optical field imaging / Colorimetric thermometry / 3D reconstruction / Radiometric calibration
© The Author(s), published by EDP Sciences, 2023
This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
1 Introduction
Radiometric thermometry, characterized by its fast response speed, high upper temperature limit, and wide dynamic range, holds great potential for various applications in the field of temperature measurement [1–3]. The traditional single-camera system, while simple in structure and easy to implement, faces challenges in acquiring and reconstructing 3D spatial information of the radiation field. The camera-array thermometry system [4] involves arranging multiple cameras at various locations and angles around the temperature field, enabling simultaneous imaging to capture radiation images from multiple perspectives. This system offers the benefit of superior spatial resolution; however, it is characterized by high system complexity and challenging calibration.
Light field imaging technology [5–7] encompasses the acquisition and data processing of optical fields. By incorporating a microlens array (MLA) onto the optical path of a conventional camera, it becomes feasible to capture the complete optical radiation information in a single image. Through data processing techniques such as transformation and integration, the position, direction, radiation spectrum, and other pertinent information of the 3D object can be calculated. This innovative approach presents a novel concept for 3D temperature field measurement, with the potential to surpass the limitations of point and surface temperature measurements and achieve breakthroughs in 3D temperature measurement.
In recent years, numerous scholars have conducted research on methods for measuring and reconstructing 3D radiation temperature. Daniel et al. [8] proposed a spectral measurement method utilizing curve-fitting techniques to accurately measure the temperature of a target within the range of 800–1200 K. Hertz and Faris [9] employed the principles of spectroscopy to simultaneously capture radiation projections of the transient fireball temperature field from multiple directions, and further integrated the MART algorithm to achieve a two-dimensional reconstruction of the radiation intensity. Bheemul et al. [10] employed three CCD cameras to simultaneously capture target emission intensity projections in three directions, enabling quantitative reconstruction of gas radiation properties. Upton et al. [11] developed a 12-direction high-temperature fireball radiation intensity acquisition system based on six CCD cameras, facilitating 3D reconstruction of the radiation field. Floyd et al. [12] utilized an experimental circular table with 5 CCDs positioned at 36° intervals, allowing light field information from 2 directions to be projected simultaneously onto one CCD. Consequently, projection information in 10 directions could be obtained simultaneously, achieving 3D reconstruction of the turbulent impulsive temperature field with high spatial and temporal resolution. Ishino et al. [13] constructed an emission spectral tomography device with 40 projection acquisition directions, equipped with 40 imaging lenses, enabling 3D reconstruction of the propane temperature field. Furthermore, Ishino and Ohiwa [14] stacked the imaging lenses into 4 layers, resulting in a device with 158 imaging lenses, a number that set a Guinness world record, further enhancing the reconstruction accuracy of the transient combustion field. Sun et al. [15] proposed a novel geometric calibration method for focused light field cameras to accurately track the paths of flame radiance and reconstruct the 3D temperature distribution of a flame. Huang et al. [16] introduced a reconstruction technique for determining the 3D temperature distribution and radiative properties of participating media using the multispectral light-field imaging technique. Yuan et al. [17] and Li et al. [18] simplified the reconstruction process by simulating the light field imaging of non-uniform temperature distributions using a previously developed multi-focus plenoptic camera model. The aforementioned research achievements have laid a solid foundation for the study of the light field colorimetric temperature measurement method in this paper.
This paper proposes a 3D temperature field measurement method based on light field colorimetric thermometry. The mathematical relationship between the spatial positions of the 3D object and the corresponding image points of the elemental image array (EIA) in the integral imaging system is theoretically analyzed. Corresponding image points refer to the multiple points on the focal plane generated when a point in object space is captured by the MLA in an integral imaging system. These points record the light field information of the same 3D object from different angles, hence the name corresponding image points. Additionally, the 3D temperature distribution of the object is calculated by combining colorimetric temperature measurement with the light field imaging system parameters. The rest of the paper is organized as follows. The theoretical analysis of colorimetric temperature measurement with a light field camera is derived in Section 2. Section 3 introduces the blackbody radiation calibration principle of the light field camera and gives the experimental calibration results. Experimental results are given in Section 4, and Section 5 concludes this paper.
2 Principle of light field imaging colorimetric temperature measurement
2.1 Principles of 3D information acquisition in light field imaging
Light field imaging incorporates an MLA between the main lens and the image sensor, enabling light from various directions of the object to be projected onto the image sensor, thereby generating an EIA. This technique captures views of objects from multiple perspectives in a single shot, providing a ray recording and acquisition technology with enhanced dimensions and a more comprehensive information representation. This paper uses a focused light field camera to obtain passive 3D information and radiation information. Figure 1 shows a schematic diagram of 3D information acquisition with the focused light field camera. In contrast to conventional light field cameras, the MLA of a focused light field camera is positioned away from the image plane of the main lens. The image plane of the main lens, which is not captured by the image sensor, is referred to as the virtual image plane. This virtual image plane is the conjugate plane of the object plane with respect to the main lens, while the CCD plane of the image sensor is the conjugate plane of the virtual image plane with respect to the MLA. Therefore, the imaging process of a focused light field camera can be divided into two distinct imaging processes: the first is the imaging of the object plane by the main lens, and the second is the imaging of the virtual image plane by the MLA. By employing this approach, focused light field cameras effectively reduce directional sampling and enhance the resolution of spatial sampling.
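The two-stage imaging chain described above is simply two cascaded thin-lens conjugates. A minimal sketch, where `conjugate_distance` is a hypothetical helper; the main-lens values echo the experiment in Section 4, while the MLA values are purely illustrative:

```python
def conjugate_distance(f, d_obj):
    """Thin-lens image distance d_img for object distance d_obj:
    1/d_img + 1/d_obj = 1/f (all distances positive, object outside focus)."""
    return f * d_obj / (d_obj - f)

# Stage 1: object plane -> main lens -> virtual image plane (L1 -> L2)
L2 = conjugate_distance(35.0, 300.0)   # f_main = 35 mm, L1 = 300 mm
beta1 = 300.0 / L2                     # magnification of the main lens

# Stage 2: virtual image plane -> MLA -> sensor plane (L3 -> L4)
L4 = conjugate_distance(0.5, 2.0)      # illustrative f_MLA and L3, in mm
beta2 = 2.0 / L4                       # magnification of the MLA
```

Each stage is independent, which is why the derivation below can treat the main lens (equation (1)) and the MLA (equation (2)) separately.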
Figure 1 Schematic diagram of 3D information acquisition technology of focused light field camera. 
In Figure 1, the x–y–z 3D coordinate system is established with the coordinate origin O at the center. The point A_1 in 3D space is projected through the main lens and MLA to obtain an EIA. Here, L_1 represents the distance from the object point to the main lens, L_2 the distance from the main lens to the virtual image plane, L_3 the distance from the virtual object point A_2 to the lens array, and L_4 the distance from the elemental image array to the lens array. L_1 and L_2 are conjugate with respect to the main lens, while L_3 and L_4 are conjugate with respect to the MLA. The focal length of the main lens is denoted as f_main, and the focal length of the MLA as f_MLA. D(m, n) represents the sub-lens at position (m, n) in the MLA, and Image(m, n) represents the elemental image corresponding to D(m, n). Point A_1 is projected through D(m, n) onto Image(m, n) at point A_3(m, n), and the set of all A_3(m, n) in the elemental image array is defined as the set of corresponding image points of A_1.
This section employs the ray tracing method to analyze the correlation between corresponding image points. The coordinates of point A_1 are denoted as A_1(X_{A_1}, Y_{A_1}, Z_{A_1}), while the coordinates of the virtual point A_2 are represented as A_2(X_{A_2}, Y_{A_2}, Z_{A_2}). The optical center coordinates of the (m, n)th microlens D(m, n) within the MLA are given by (X_{D(m,n)}, Y_{D(m,n)}, Z_{D(m,n)}). Furthermore, the coordinates of the corresponding image point A_3(m, n) of object point A_1 and virtual object point A_2 in the elemental image corresponding to D(m, n) are expressed as (X_{A_3(m,n)}, Y_{A_3(m,n)}, Z_{A_3(m,n)}). The microlens is considered to be a square lens, with adjacent microlenses spaced p apart in the MLA. Based on the imaging equation of the main lens, the relationship between A_1, O, and A_2 can be described as follows:$$\left\{\begin{array}{c}\frac{X_O-X_{A_1}}{X_{A_2}-X_O}=\frac{Z_O-Z_{A_1}}{Z_{A_2}-Z_O}=\frac{L_1}{L_2}=\beta_1,\\ \frac{Y_O-Y_{A_1}}{Y_{A_2}-Y_O}=\frac{Z_O-Z_{A_1}}{Z_{A_2}-Z_O}=\frac{L_1}{L_2}=\beta_1,\\ \frac{1}{Z_O-Z_{A_1}}+\frac{1}{Z_{A_2}-Z_O}=\frac{1}{f_{\mathrm{main}}},\end{array}\right.$$(1)where β_1 = L_1/L_2 represents the magnification of the main lens.
According to the lens imaging equation, the coordinate relationship between A_2, D(m, n), and A_3(m, n) is:$$\left\{\begin{array}{c}\frac{X_{D(m,n)}-X_{A_2}}{X_{A_3(m,n)}-X_{D(m,n)}}=\frac{Z_{D(m,n)}-Z_{A_2}}{Z_{A_3(m,n)}-Z_{D(m,n)}}=\frac{L_3}{L_4}=\beta_2,\\ \frac{Y_{D(m,n)}-Y_{A_2}}{Y_{A_3(m,n)}-Y_{D(m,n)}}=\frac{Z_{D(m,n)}-Z_{A_2}}{Z_{A_3(m,n)}-Z_{D(m,n)}}=\frac{L_3}{L_4}=\beta_2,\\ \frac{1}{Z_{A_3(m,n)}-Z_{D(m,n)}}+\frac{1}{Z_{D(m,n)}-Z_{A_2}}=\frac{1}{f_{\mathrm{MLA}}},\end{array}\right.$$(2)where β_2 = L_3/L_4 represents the magnification of the MLA.
For periodically arranged MLA, the relationship between the optical center coordinates of D(m, n) and D(m + i, n + j) within the MLA can be expressed as follows:$$\{\begin{array}{c}{X}_{D(m+i,n+j)}={X}_{D(m,n)}+i\times p,\\ {Y}_{D(m+i,n+j)}={Y}_{D(m,n)}+j\times p,\\ {Z}_{D(m+i,n+j)}={Z}_{D(m,n)}.\end{array}$$(3)
The aforementioned equation represents the correlation between the optical center coordinates of any two microlenses within an MLA. By using equations (1)–(3), the coordinates of the corresponding image points A_3(m, n) and A_3(m + i, n + j) formed by two sub-lenses D(m, n) and D(m + i, n + j) can be derived as follows:$$\left\{\begin{array}{c}X_{A_3(m,n)}=\frac{X_{D(m,n)}-X_{A_2}}{\beta_2}+X_{D(m,n)},\\ X_{A_3(m+i,n+j)}=\frac{X_{D(m+i,n+j)}-X_{A_2}}{\beta_2}+X_{D(m+i,n+j)}.\end{array}\right.$$(4)
By simultaneously solving equations (3) and (4), the relationship between any two corresponding image points can be obtained:$$\left\{\begin{array}{c}X_{A_3(m,n)}=X_{A_3(m+i,n+j)}-ip\times \left(1+1/\beta_2\right),\\ Y_{A_3(m,n)}=Y_{A_3(m+i,n+j)}-jp\times \left(1+1/\beta_2\right).\end{array}\right.$$(5)
When i = 1 and j = 1, the relationship between two corresponding image points obtained by adjacent lenses can be derived as follows:$$\left\{\begin{array}{c}X_{A_3(m+1,n+1)}=X_{A_3(m,n)}+p\times \left(1+1/\beta_2\right),\\ Y_{A_3(m+1,n+1)}=Y_{A_3(m,n)}+p\times \left(1+1/\beta_2\right),\end{array}\right.$$(6)where X_{A_3(m,n)} and X_{A_3(m+1,n+1)} represent the coordinates of the corresponding image points A_3(m, n) and A_3(m + 1, n + 1), respectively. By simultaneously solving equations (2)–(6), the coordinates of the virtual object point A_2 can be obtained:$$\left\{\begin{array}{c}X_{A_2}=X_{D(m,n)}-\frac{p\left(X_{A_3(m,n)}-X_{D(m,n)}\right)}{X_{A_3(m+1,n+1)}-X_{A_3(m,n)}-p},\\ Y_{A_2}=Y_{D(m,n)}-\frac{p\left(Y_{A_3(m,n)}-Y_{D(m,n)}\right)}{Y_{A_3(m+1,n+1)}-Y_{A_3(m,n)}-p},\\ Z_{A_2}=Z_{D(m,n)}+\frac{f_{\mathrm{MLA}}\left(Z_{A_3(m,n)}-Z_{D(m,n)}\right)}{f_{\mathrm{MLA}}+Z_{D(m,n)}-Z_{A_3(m,n)}},\end{array}\right.$$(7)where the coordinates (X_{A_3(m,n)}, Y_{A_3(m,n)}, Z_{A_3(m,n)}) of A_3(m, n) and the coordinates (X_{A_3(m+1,n+1)}, Y_{A_3(m+1,n+1)}, Z_{A_3(m+1,n+1)}) of A_3(m + 1, n + 1) can be obtained by extracting the corresponding elements from the elemental image array. The parameters p and L_4 can be directly measured. Given the optical center coordinates X_{D(m,n)} and Y_{D(m,n)} of D(m, n), the spatial coordinates (X_{A_1}, Y_{A_1}, Z_{A_1}) of object point A_1 can be obtained by simultaneously solving equation (1):$$\left\{\begin{array}{c}X_{A_1}=X_O+\frac{f_{\mathrm{main}}\left(X_{A_2}-X_O\right)}{f_{\mathrm{main}}+Z_O-Z_{A_2}},\\ Y_{A_1}=Y_O+\frac{f_{\mathrm{main}}\left(Y_{A_2}-Y_O\right)}{f_{\mathrm{main}}+Z_O-Z_{A_2}},\\ Z_{A_1}=Z_O+\frac{f_{\mathrm{main}}\left(Z_{A_2}-Z_O\right)}{f_{\mathrm{main}}+Z_O-Z_{A_2}}.\end{array}\right.$$(8)
Based on the aforementioned analysis, the spatial coordinates of object point A_1 can be acquired once the focal lengths of the light field camera system, the microlens pitch p of the MLA, and the distance from the elemental image array to the MLA are known. This facilitates 3D shape acquisition through light field imaging.
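As a numerical sanity check of equations (7) and (8), the sketch below recovers a virtual object point from two corresponding image points and back-projects it through the main lens. All coordinates, the pitch p, and the focal lengths are invented toy values chosen to satisfy the thin-lens relations, not the paper's parameters:

```python
import numpy as np

def virtual_object_point(a3, a3_next, d, p, f_mla):
    """Recover virtual object point A2 from corresponding image points
    A3(m, n) and A3(m+1, n+1) behind microlens D(m, n) (equation (7)).
    a3, a3_next, d are (x, y, z) arrays; p is the microlens pitch."""
    denom = a3_next[:2] - a3[:2] - p            # disparity minus pitch
    xy = d[:2] - p * (a3[:2] - d[:2]) / denom
    z = d[2] + f_mla * (a3[2] - d[2]) / (f_mla + d[2] - a3[2])
    return np.array([xy[0], xy[1], z])

def object_point(a2, o, f_main):
    """Back-project A2 through the main lens (optical center o) to the
    object point A1 (equation (8))."""
    s = f_main / (f_main + o[2] - a2[2])
    return o + s * (a2 - o)

# Toy geometry: microlens at the origin, f_MLA = 2, virtual point at z = -6,
# so the image forms at z = +3 with MLA magnification beta2 = 2.
a3 = np.array([-0.5, -0.25, 3.0])       # A3(m, n)
a3_next = np.array([-0.2, 0.05, 3.0])   # A3(m+1, n+1), pitch p = 0.2
d = np.zeros(3)                         # optical center of D(m, n)
a2 = virtual_object_point(a3, a3_next, d, p=0.2, f_mla=2.0)
a1 = object_point(a2, np.array([0.0, 0.0, -10.0]), f_main=3.0)
```

In this toy case `a2` comes back as (1.0, 0.5, −6.0), the very point used to generate the two image points, and the recovered `a1` satisfies the main-lens imaging equation (1).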
2.2 Principle of 3D temperature field colorimetric temperature measurement
In order to measure the 3D temperature distribution of the object's surface, a colorimetric temperature measurement method is employed for each corresponding pixel on the light field image sensor. According to Planck's law, the spectral radiance of non-blackbody radiation is:$$L_{\lambda}=\frac{C_1}{\pi \lambda^5}\epsilon (\lambda ,T)\left(e^{C_2/\lambda T}-1\right)^{-1},$$(9)where ε(λ, T) < 1 represents the emissivity of non-blackbody spectral radiation, L_λ is the spectral radiance at wavelength λ, and T is the blackbody temperature. C_1 = 3.7418 × 10^{−16} W·m^2 and C_2 = 1.4388 × 10^{−2} m·K are the first and second radiation constants, respectively. When the temperature of an object changes, the peak wavelength of the object's spectral radiance also changes: the higher the temperature, the smaller the peak wavelength. When T is less than 3000 K, e^{C_2/λT} is significantly greater than 1, and equation (9) can be simplified as:$$L_{\lambda}=\frac{C_1}{\pi \lambda^5}\epsilon (\lambda ,T)e^{-C_2/\lambda T}.$$(10)
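A quick numerical check of the Wien simplification in equation (10): even in the worst case for the visible band used here (λ = 780 nm, T = 3000 K), dropping the −1 changes the radiance by well under one percent.

```python
import math

C2 = 1.4388e-2           # second radiation constant, m*K
lam, T = 780e-9, 3000.0  # worst visible-band case: longest wavelength, highest T
x = C2 / (lam * T)
planck = 1.0 / (math.exp(x) - 1.0)   # exact Planck factor
wien = math.exp(-x)                  # Wien approximation of equation (10)
rel_err = abs(wien - planck) / planck
```

Here x ≈ 6.1, so the relative error e^{−x} is roughly 0.2%; at shorter wavelengths or lower temperatures it shrinks further.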
The peak spectral radiance increases with temperature, and the peak spectral radiance at a given temperature can only be detected within specific wavelength bands. The wavelength range of the CCD sensor is 380–780 nm, with red, green, and blue as the three primary colors. In computer graphics, the primary colors R, G, and B of a single pixel are represented by values ranging from 0 to 255. By combining different brightness values of the R, G, and B primary colors in specific proportions, any desired brightness value L can be achieved:$$L=l\left(R\right)+l\left(G\right)+l\left(B\right),$$(11)where l(R), l(G), l(B) represent the brightness values of the three primary colors. The normalized grayscale values lie in the range [0, 1]. Assuming the spectral response function of the visible-light CCD sensor in the light field camera is denoted as Y(λ), the grayscale value H of the light field image can be defined as follows:$$H=\frac{1}{4}\cdot \eta \mu t\cdot {\left[\frac{2a}{f^{\prime}}\right]}^2\cdot \int_{380}^{780}K_T\left(\lambda \right)L\left(\lambda ,T\right)Y\left(\lambda \right)\mathrm{d}\lambda ,$$(12)where η represents the conversion coefficient between the input current of the CCD and the grayscale value of the image; μ denotes the photoelectric conversion coefficient of the CCD sensor, i.e., the conversion between the output voltage of the photosensitive unit and the image grayscale value; t represents the integration time of the camera; a is the aperture of the light field camera; f′ denotes the focal length; K_T(λ) represents the optical transmittance of the main lens of the light field camera; and L(λ, T) represents the radiance received by the CCD image sensor. At a specific wavelength λ, K_T(λ) undergoes minimal changes and is generally treated as a constant.
Therefore, equation (12) can be rewritten as:$$\begin{array}{c}H=A\int_{380}^{780}L\left(\lambda ,T\right)Y\left(\lambda \right)\mathrm{d}\lambda ,\\ A=\frac{1}{4}\cdot \eta \mu t\cdot {\left[\frac{2a}{f^{\prime}}\right]}^2.\end{array}$$(13)According to the spectral response functions associated with the R, G, and B channels of the visible-light CCD image sensor, the correlation between the grayscale values of the three primary-color signals and the temperature of the measured object can be derived as follows:$$\left\{\begin{array}{c}R=A\int_{380}^{780}L\left(\lambda ,T\right)r\left(\lambda \right)\mathrm{d}\lambda ,\\ G=A\int_{380}^{780}L\left(\lambda ,T\right)g\left(\lambda \right)\mathrm{d}\lambda ,\\ B=A\int_{380}^{780}L\left(\lambda ,T\right)b\left(\lambda \right)\mathrm{d}\lambda ,\end{array}\right.$$(14)where r(λ), g(λ), b(λ) represent the spectral response functions of the R, G, and B channels. The light field colorimetric temperature measurement method involves comparing the spectral radiance values at two wavelengths:$$\frac{L_1({\lambda}_1,T_1)}{L_1({\lambda}_2,T_1)}=\frac{L({\lambda}_1,T_C)}{L({\lambda}_2,T_C)},$$(15)where T_1 represents the actual temperature of the non-blackbody thermal radiation material, and T_C represents the temperature of the blackbody.
By combining equation (10) with the expression for color temperature (15), the temperature of the object point corresponding to any pixel of the light field image can be determined as follows:$$T=\frac{C_2\left(\frac{1}{\lambda_2}-\frac{1}{\lambda_1}\right)}{\mathrm{ln}\frac{L({\lambda}_1,T)}{L({\lambda}_2,T)}-\mathrm{ln}\frac{\epsilon ({\lambda}_1,T)}{\epsilon ({\lambda}_2,T)}-5\mathrm{ln}\frac{\lambda_2}{\lambda_1}},$$(16)where L(λ_1, T) and L(λ_2, T) represent the monochromatic radiance at the two wavelengths. For substances that can be considered equivalent to a gray body, the spectral emissivity remains relatively constant within a narrow band of the same spectrum. Therefore, it is reasonable to assume that ε(λ_1, T) = ε(λ_2, T), so the emissivity term ln[ε(λ_1, T)/ε(λ_2, T)] in equation (16) vanishes. The CCD colorimetric temperature measurement model thus does not require precise measurement of emissivity; instead, the emissivity ratio at two wavelengths eliminates uncertainties arising from unknown variables, such as the measurement environment. Based on this assumption, the CCD colorimetric temperature measurement equation for a gray body is as follows:$$T=\frac{C_2\left(\frac{1}{\lambda_2}-\frac{1}{\lambda_1}\right)}{\mathrm{ln}\frac{L({\lambda}_1,T)}{L({\lambda}_2,T)}-5\mathrm{ln}\frac{\lambda_2}{\lambda_1}}.$$(17)
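Equations (10) and (17) can be wired together in a short round-trip sketch. The constant C_1/π is dropped since it cancels in the ratio, and the gray-body emissivity is assumed equal at both wavelengths:

```python
import math

C2 = 1.4388e-2  # second radiation constant, m*K

def wien_radiance(lam, T, eps=1.0):
    """Spectral radiance under the Wien approximation (equation (10)),
    omitting the constant C1/pi, which cancels in the ratio below."""
    return eps * lam ** -5 * math.exp(-C2 / (lam * T))

def colorimetric_temperature(L1, L2, lam1, lam2):
    """Gray-body colorimetric temperature from radiances at two
    wavelengths (equation (17))."""
    return C2 * (1.0 / lam2 - 1.0 / lam1) / (
        math.log(L1 / L2) - 5.0 * math.log(lam2 / lam1))
```

Radiances generated at 1800 K with λ_1 = 610 nm and λ_2 = 530 nm (the R and G wavelengths used later in Section 3) feed back exactly the same temperature, since equation (17) is the algebraic inverse of the Wien ratio.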
This method acquires 3D information of object points through light field imaging and utilizes the colorimetric temperature measurement method to obtain radiation field temperature information. By capturing both the 3D position information of the target object in a single shot and the radiation temperature information, it overcomes the limitations of existing technologies that necessitate multiple measurements from different perspectives to obtain both position and radiation information of the same target. This approach offers the advantages of 3D imaging and multispectral detection, while maintaining a simple system structure and convenient data sampling.
3 Spectral radiation calibration of light field camera
A blackbody can serve as a standard radiation source for calibrating the flame radiation temperature. By employing a light field camera to capture a light field image of a blackbody with a known temperature, the radiance values received by each pixel in the image and the temperature values of the blackbody can be related through equation (10) based on Planck’s law. Subsequently, the grayscale value of the image can be fitted with the corresponding radiance value at that temperature, thereby enabling the calibration of the image sensor of the light field camera for radiation measurements.
This paper utilizes a Raytrix R8 focused light field camera for data acquisition. The R, G, and B channels of the Raytrix R8 image sensor have response wavelengths of 610 nm, 530 nm, and 460 nm, respectively. The experiment used a Lumasense M360 standard blackbody, which features a cavity made of graphite tube targets. The blackbody has an effective emissivity of 1.0 and can measure temperatures ranging from 5 °C to 1200 °C. Radiation calibration experiments were conducted under dark conditions without any illumination to eliminate the potential influence of stray light. As depicted in Figure 2, the light field camera is positioned directly in front of the central cavity of the blackbody at a distance of 370 mm.
Figure 2 Photos of blackbody calibration device. 
Due to the use of a pentagonal diaphragm in the camera, the smallest image unit captured is also pentagonal, which leaves gaps between the elemental images in the original light field image. To avoid collecting invalid grayscale data, only the grayscale values within the pentagonal area are read, and the average grayscale value is taken from the recorded data. An unreasonable exposure time may overexpose the captured image and thereby distort it. To prevent overexposure of the captured raw light field image, an exposure time of 0.3 ms was employed. In the dark calibration laboratory at a room temperature of 22.5 °C, the blackbody calibration temperature ranged from 800 °C to 1000 °C, with images of the central cavity of the blackbody furnace captured at intervals of 50 °C, as shown in Figure 3.
Figure 3 Original light field image of the center of the blackbody furnace. 
Figure 4 shows three channel images of R, G, and B in the center cavity of the blackbody. According to the colorimetric temperature measurement model, it is essential to extract multiple grayscale values of the R and G channels from each color image within the central calibration area of the blackbody. Subsequently, the average grayscale values of different temperature images under the two channels are calculated, as illustrated in Table 1.
Figure 4 Three channel images of R, G, and B in the center cavity of the blackbody. 
Table 1 Grayscale values of the central cavity image of the blackbody at different temperatures.
By utilizing the grayscale values presented in Table 1, a correlation between image grayscale values and radiation intensity can be established through the following fitting process:$$\begin{array}{c}I_R=134660+1.07\times 10^6\times R+9.46\times 10^7\times R^2-4.61\times 10^7\times R^3,\\ I_G=18173+1.59\times 10^6\times G-6.05\times 10^6\times G^2+8.27\times 10^6\times G^3,\\ I_B=17595+6.603\times 10^5\times B-1.69\times 10^6\times B^2+1.91\times 10^6\times B^3,\end{array}$$(18)where R, G, and B represent the normalized grayscale values of the three channel images, while I_R, I_G and I_B represent the corresponding radiation intensity values of the three channels. The colorimetric temperature measurement equation for the light field temperature measurement system can be expressed as follows:$$T=\frac{C_2\left(\frac{1}{\lambda_g}-\frac{1}{\lambda_r}\right)}{\mathrm{ln}\frac{I_R}{I_G}-5\mathrm{ln}\frac{\lambda_g}{\lambda_r}},$$(19)where λ_r = 610 nm and λ_g = 530 nm. The colorimetric temperature value of the light field temperature measurement system can be obtained directly from the grayscale values of the R and G channel images.
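The calibration-then-measurement workflow of equations (18) and (19) might be sketched as follows. The grayscale-intensity pairs below are placeholders, not the paper's Table 1 data, and `fit_gray_to_intensity` is a hypothetical helper:

```python
import numpy as np

C2 = 1.4388e-2                  # second radiation constant, m*K
LAM_R, LAM_G = 610e-9, 530e-9   # R and G channel response wavelengths

def fit_gray_to_intensity(gray, intensity, deg=3):
    """Least-squares cubic mapping grayscale -> radiation intensity,
    the functional form of equation (18)."""
    return np.polynomial.Polynomial.fit(gray, intensity, deg)

def colorimetric_T(i_r, i_g):
    """Colorimetric temperature from R and G channel radiation
    intensities (equation (19))."""
    return C2 * (1.0 / LAM_G - 1.0 / LAM_R) / (
        np.log(i_r / i_g) - 5.0 * np.log(LAM_G / LAM_R))
```

In practice the fit would be run once per channel against the blackbody data, after which `colorimetric_T` converts any pair of calibrated R and G intensities into a temperature.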
4 Results and discussion
The light field imaging colorimetric temperature measurement system comprises three main components: the main lens, the MLA and CCD sensor, and a computer image processing system. As depicted in Figure 5, the light field temperature measurement system captures the spectral radiance information of candle flames, converts it into digital flame-image signals through the photoelectric and analog-to-digital conversion processes of the light field CCD sensor, and then transmits these signals to the computer image processing system over a USB 3.0 data interface, ultimately presenting the original flame light field image. This system employs a main lens with a short focal length of 35 mm and an F-number of 1.4. The F-number of the MLA is 2.8, and the single-pixel size of the CCD is 2.24 μm.
Figure 5 Schematic diagram of candle flame temperature measurement experiment. 
The candle was positioned at a distance of 300 mm from the light field camera. To mitigate the impact of stray visible light, the experiment was conducted in a controlled laboratory environment in complete darkness. The impact of environmental stray light and absorption effects on the results was found to be minimal. The light field colorimetric thermometry method can also be applied to long-distance measurements; however, this requires a telephoto lens. When conducting experimental measurements at long distances in outdoor settings, the influence of environmental stray light and atmospheric absorption must be considered in order to improve measurement accuracy.
Similar to the blackbody calibration experiment, a standardized exposure time of 0.3 ms was employed for the light field camera. Figure 6a displays the original flame light field image captured by the light field camera. The pentagonal aperture of the main lens results in a corresponding pentagonal micro unit image projected onto the CCD sensor of the camera through the MLA.
Figure 6 Light field imaging acquisition and reconstruction image: (a) original flame light field image, (b) R channel grayscale image, (c) G channel grayscale image, (d) B channel grayscale image and (e) flame refocused color image. 
The measurement and reconstruction of flame temperature field in this paper consists of the following three steps:

Step 1: Capture the original light field image, as depicted in Figure 6a. The original light field image encompasses the R, G, and B channels of the CCD sensor.

Step 2: Using the 3D information acquisition method outlined in Section 2.1, generate monochromatic 3D reconstructed images for the R, G, and B channels of the light field image. These images are depicted in Figures 6b–6d. Subsequently, employ the RGB color space model to fuse the individual R, G, and B channel images, resulting in the flame refocused color image displayed in Figure 6e.

Step 3: By utilizing the grayscale values of the R and G channels, in conjunction with the radiation calibration equation (18) and the colorimetric temperature measurement model, derive the temperature value of each pixel within the flame image. Figure 7a shows the plane distribution of the flame temperature field, and Figure 7b shows the 3D distribution of the flame temperature field. The flame temperature field was divided into isothermal layers, and the layer temperatures align with the actual temperature distribution of the flame, indicating that the reconstructed temperature field distribution is reasonable.
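Step 3 above can be vectorized over whole channel images; the calibration curves are passed in as callables. This is a sketch, not the paper's code, and `temperature_map` is a hypothetical helper:

```python
import numpy as np

C2 = 1.4388e-2                  # second radiation constant, m*K
LAM_R, LAM_G = 610e-9, 530e-9   # R and G channel response wavelengths

def temperature_map(img_r, img_g, gray_to_ir, gray_to_ig):
    """Map R and G grayscale images to a per-pixel temperature field:
    apply the calibration curves (equation (18)), then the colorimetric
    formula (equation (19)) elementwise."""
    i_r = gray_to_ir(img_r)
    i_g = gray_to_ig(img_g)
    return C2 * (1.0 / LAM_G - 1.0 / LAM_R) / (
        np.log(i_r / i_g) - 5.0 * np.log(LAM_G / LAM_R))
```

Because the formula is purely elementwise, the same call works for a single pixel, a refocused slice, or the full 3D stack of reconstructed channel images.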
Figure 7 Reconstruction results of flame temperature field: (a) plane distribution diagram of flame temperature field and (b) 3D distribution diagram of flame temperature field.
To validate the accuracy of the temperature measurements obtained from the system, a thermocouple thermometer with a temperature measurement accuracy of 0.2% was employed to measure the contact temperature of the same flame. Figure 8a illustrates the configuration of the thermocouple flame temperature measurement device, which comprises two components: a thermocouple instrument and a thermocouple detection line. The detection line consists of 8 sub-lines, enabling simultaneous temperature measurements at 8 different positions within the flame. The positions of the measurement points are depicted in Figure 8b. Multiple measurements were conducted at each point, and the average value was taken as the actual temperature at that point. Table 2 presents a comparison between the reconstructed flame temperature field and the measured data. The maximum deviation of the flame colorimetric temperature measurements from the thermocouple readings is 35 K, and the maximum relative error of the colorimetric temperature values measured by the system is 3.31%, thereby confirming the accuracy of the colorimetric temperature measurement model in comparison with the monochromatic temperature measurement model.
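The two error figures quoted above (a 35 K maximum deviation and a 3.31% maximum relative error) correspond to simple per-point statistics against the thermocouple reference; a sketch with invented values, not the paper's Table 2 data:

```python
import numpy as np

def temperature_errors(t_ref, t_meas):
    """Maximum absolute deviation (K) and maximum relative error of
    measured temperatures t_meas against thermocouple references t_ref."""
    t_ref = np.asarray(t_ref, dtype=float)
    t_meas = np.asarray(t_meas, dtype=float)
    dev = np.abs(t_meas - t_ref)
    return dev.max(), (dev / t_ref).max()
```

Note that the point with the largest absolute deviation need not be the one with the largest relative error, which is why both figures are reported separately.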
Figure 8 (a) Thermocouple contact temperature measurement and (b) schematic diagram of 8 temperature point locations. 
Table 2 Comparison results between the reconstructed flame temperature field and the thermocouple contact temperature measurement.
5 Conclusion
In this work, we present a novel method for measuring the 3D temperature field using light field colorimetric thermometry. This method enables the simultaneous acquisition of both the 3D position information and the radiation temperature of the measured objects. The mathematical relationship between the spatial positions of the 3D object and the corresponding image points of the EIA in the integral imaging system is theoretically analyzed. The 3D temperature distribution of the object is calculated by combining colorimetric thermometry with the parameters of the light field imaging system. The effectiveness and feasibility of the proposed method have been validated experimentally. In response to the limitations identified in this study, future research will concentrate on two areas: (1) the light field thermometry method proposed here has been investigated primarily for objects that behave approximately as gray bodies; the next phase of research will measure and study non-gray objects, with the objective of establishing a temperature measurement model suitable for such bodies. (2) While the maximum error rate of the proposed method, 3.31%, represents an improvement over traditional monochromatic light field thermometry, there remains significant scope for further advancement toward high-precision 3D temperature measurement systems.
Conflict of interest
The authors declare no conflict of interest.
Acknowledgments
This research is supported by the National Natural Science Foundation of China (62005204, 62075176 and 62005206) and the Fundamental Research Funds for the Central Universities (ZYTS23124, SY22033I, ZYTS23127, QTZX22004, JPJC2112).