CA3079552C - Method for restoring underground image on basis of ray reverse tracing technology - Google Patents
- Publication number: CA3079552C (application CA3079552A)
- Authority
- CA
- Canada
- Prior art keywords
- rays
- image
- light intensity
- intersection point
- refraction
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- H—ELECTRICITY; H04—ELECTRIC COMMUNICATION TECHNIQUE; H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/81—Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/60—Noise processing, e.g. detecting, correcting, reducing or removing noise
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
Abstract
A method for restoring an underground image based on a ray reverse tracing technology is disclosed. The method comprises: setting an underground camera as a light source emitting point and emitting rays into an underground scene; recording all intersection points of the rays with underground objects, and calculating the intersection point closest to a view point among them; calculating the directions of the new rays generated after reflection and refraction by the objects at the position of the intersection point; tracing each newly generated ray; recording the rays that reach a view plane after the strong light emitted from the camera's position is reflected or refracted, and calculating their light intensity; converting the light intensity into a pixel value; and eliminating the pixel value of the strong light emitted from the camera to obtain the restored underground image.
Description
METHOD FOR RESTORING UNDERGROUND IMAGE ON
BASIS OF RAY REVERSE TRACING TECHNOLOGY
FIELD OF THE INVENTION
[0001] The present invention belongs to the field of underground image restoration, and particularly relates to a method for restoring an underground image on the basis of a ray reverse tracing technology.
DESCRIPTION OF RELATED ART
[0002] Ray tracing is a technique for rendering a three-dimensional (3D) scene on a two-dimensional (2D) screen. It is widely applied in games and computer graphics at present and produces highly realistic results. The light source is modeled as a point source that emits tens of thousands of rays in all directions; when these rays strike different objects, they are reflected, refracted, absorbed (attenuated) or give rise to fluorescence. Ray tracing is a general technique from geometrical optics: a model of the path each ray travels is obtained by tracing the rays that interact with optical surfaces. However, tens of thousands of rays are emitted, and the rays produced after reflection, refraction, absorption and fluorescence are countless, so the calculation amount of forward ray tracing is enormous.
Therefore, a ray reverse tracing method has gradually come into people's sight.
The calculation amount is greatly reduced if the camera lens is used as the ray emitting point and only the rays that enter the view plane are calculated.
[0003] Most explosion-proof cameras currently used underground are black-and-white cameras. Owing to the special underground environment of a coal mine, the all-weather artificial illumination, and the influence of factors such as dust and dampness, underground video has the characteristics of low image illuminance and nonuniform illumination distribution; these special conditions result in low quality and poor resolution of the collected video. When a strong light source such as a safety mine lamp appears in the field of view of a mine camera, the collected image exhibits a dazzle light phenomenon, so the quality of the video image is greatly reduced and safety accidents may even result. Therefore, applying the ray reverse tracing technology to underground image restoration to improve image readability is of great importance.
Date Recue/Date Received 2020-04-27
SUMMARY OF THE INVENTION
Technical Problem
[0004] Aiming at the above problems, the present invention provides a method for restoring an underground image on the basis of a ray reverse tracing technology. Under the conditions of low illuminance and much dust in a coal mine, a suddenly appearing strong light source may interfere with the original video image, so that the black and white level contrast of the monitoring image becomes too great and the information in the video image cannot be recognized. Aiming at this phenomenon, a ray reverse tracing method is used, and the pixel value of the strong light source in a view plane is eliminated, so that the interference of the strong light source with the original video image is eliminated.
Technical Solution
[0005] In order to achieve a goal of the present invention, the present invention adopts a technical solution that a method for restoring an underground image on the basis of a ray reverse tracing technology includes the following steps:
[0006] step 1: supposing an underground camera as a light source emitting point, i.e., a view point, and emitting rays into an underground scene;
[0007] step 2: recording all intersection points of all rays and underground objects, and calculating one intersection point closest to the view point in the intersection points;
[0008] step 3: according to illumination, object materials and a normal direction, calculating light intensity of reflection rays or refraction rays in the closest intersection point determined in the step 2;
[0009] step 4: calculating a direction of rays newly generated after the rays are reflected and refracted by the objects in a position of the intersection point;
[0010] step 5: tracing the rays newly generated in the step 4, and judging whether the third time reflection or refraction rays are emitted onto a view plane right in front of a safety mine lamp or not; if so, calculating the third time reflection light intensity and/or refraction light intensity; and otherwise, returning to the step 2 to redetermine the closest intersection point, and repeating the step 3 to the step 5;
[0011] step 6: converting the light intensity in the step 5 into a pixel value through a camera CCD photosensitive element, emitting rays obtained after the third time reflection and/or refraction of the rays emitted from the camera onto the view plane, and performing imaging on the view plane; and
[0012] step 7: eliminating the pixel value of strong light emitted from the camera in an image finally shown on the view plane to obtain an image after strong light source influence elimination.
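Steps 6 and 7 above can be sketched in miniature. This is an illustrative sketch only: the saturation constant I_MAX, the 8-bit range and the function names are assumptions, not values from the patent.

```python
I_MAX = 10.0  # assumed ray intensity that saturates the CCD (illustrative)

def intensity_to_pixel(intensity):
    """Step 6: convert a traced light intensity to an 8-bit pixel value."""
    return min(255, max(0, round(intensity / I_MAX * 255)))

def restore_pixel(total_intensity, strong_light_intensity):
    """Step 7: subtract the pixel value contributed by the simulated
    strong light source from the imaged pixel value."""
    return max(0, intensity_to_pixel(total_intensity)
                  - intensity_to_pixel(strong_light_intensity))
```

For example, a pixel fully saturated by the lamp (both intensities at I_MAX) restores to 0, while a pixel with no lamp contribution is left unchanged.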
[0013] In the step 3, the light intensity of the reflection rays or refraction rays in the closest intersection point determined in the step 2 is calculated according to the following method:
[0014] calculating the light intensity of the reflection rays in the position of the intersection point through a formula (1):
I_r = I_a K_a + I_i (N·L) dω (K_d R_d + K_s R_s)    (1)
[0015] wherein I_r represents the light intensity of the reflection rays; I_a K_a represents an influence value of environment light at the position of the intersection point; I_i represents the light intensity of incident light; K_d represents a specular reflectivity coefficient; K_s represents a diffuse reflectivity coefficient; R_d represents specular reflectivity; R_s represents diffuse reflectivity; and N, L and dω respectively represent the object surface normal vector, the ray direction unit vector and the solid angle;
[0016] or calculating the light intensity of the refraction rays in the position of the intersection point through a formula (2):
I_t = (cos θ2 / cos θ1)(I_i − I_r)    (2)
[0017] wherein I_t represents the light intensity of the refraction rays, and θ1 and θ2 are the incidence angle and the refraction angle.
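Formulas (1) and (2) can be written directly as small functions. The parameter names mirror the symbols defined above; the numeric values used in any call are arbitrary illustrations, not values from the patent.

```python
import math

def reflection_intensity(Ia_Ka, Ii, n_dot_l, d_omega, Kd, Rd, Ks, Rs):
    """Formula (1): I_r = Ia*Ka + Ii*(N.L)*d_omega*(Kd*Rd + Ks*Rs)."""
    return Ia_Ka + Ii * n_dot_l * d_omega * (Kd * Rd + Ks * Rs)

def refraction_intensity(Ii, Ir, theta1, theta2):
    """Formula (2): I_t = (cos(theta2)/cos(theta1)) * (Ii - Ir)."""
    return (math.cos(theta2) / math.cos(theta1)) * (Ii - Ir)
```

Note that formula (2) conserves what formula (1) leaves over: the refracted intensity is the incident intensity minus the reflected intensity, scaled by the cosine ratio of the refraction and incidence angles.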
[0018] In the step 5, the rays newly generated in the step 4 are traced according to the following methods:
[0019] (1) if the rays do not intersect with any object, giving up the tracing; if the intersection point is on a nontransparent object, only calculating the light intensity of the reflection rays; if the intersection point is on a transparent object, calculating the light intensity of the reflection rays and the light intensity of the refraction rays, and tracing the rays obtained by reflecting or refracting the initial rays for three times; if the rays obtained by reflecting or refracting the initial rays for three times are emitted onto the view plane right in front of the safety mine lamp, calculating the light intensity of the rays; and if not, giving up the tracing, and entering the step (2); and
[0020] (2) if all reflection and refraction rays generated by the initial rays are not emitted onto the view plane right in front of the safety mine lamp, determining an intersection point second closest to the view point in the intersection points of the initial rays and the objects; repeating the step (1); if the second closest intersection point does not meet conditions, sequentially calculating the next closest intersection point until the intersection point found meets the conditions.
[0021] In the step 7, the pixel value of the strong light emitted from the camera is eliminated in the image finally shown on the view plane to obtain the image after the strong light source influence elimination according to the following methods:
[0022] besides the light of the safety mine lamp simulated by the light emitted from the underground camera, i.e., a light source A, other artificial lamp light, i.e., a light source B, also exists; meanwhile, the environment light, i.e., an artificial light source C, also exists.
[0023] When the third time reflection rays and/or refraction rays are irradiated onto the view plane, the image on the view plane is shown as the following formula:
P(x,y) = R(x,y)·S(x,y)·L(x,y)    (3)
[0024] wherein P(x,y) represents the image finally shown on the view plane;
R(x,y) represents an image shown on the view plane when the camera does not emit light, i.e., the image shown on the view plane when the light source B and the light source C are overlapped; S(x,y) represents an image on the view plane when only the camera emits light; and L(x,y) represents an image of the environment light, i.e., the light source C, on the view plane.
[0025] I(x,y) = R(x,y)·S(x,y)    (4) is set,
[0026] the logarithm is taken at both sides to obtain ln P(x,y) = ln I(x,y) + ln L(x,y)    (5),
[0027] and the environment light L(x,y) is expressed through the convolution of P(x,y) with a Gaussian kernel G(x,y):
L(x,y) = P(x,y) * G(x,y)    (6)
[0028] wherein G(x,y) = λ·e^(−(x²+y²)/c²),
[0029] c represents the Gaussian surround scale, and λ is a scale factor that makes ∫∫G(x,y)dxdy = 1 always hold. Through the formulas (4), (5) and (6), it can be obtained:
ln R(x,y) = ln P(x,y) − ln(P(x,y) * G(x,y)) − ln S(x,y)
[0030] wherein S′(x,y) = e^(ln R(x,y)) is set,
[0031] and S'(x,y) is the image after the strong light source influence elimination.
Advantageous Effect
[0032] Compared with the prior art, the technical solution of the present invention has the following beneficial technical effects:
[0033] the present invention changes the conventional approach to image processing by utilizing ray reverse tracing. For the sudden appearance of a strong light source, conventional methods mostly use linear transformation, gamma correction, histogram equalization, unsharp masking, homomorphic filtering, tone mapping, dark channel algorithms and the like, and the processing effect is not obvious. The ray reverse tracing technology can effectively eliminate the interference of the strong light source, restore the original underground image, and ensure smooth proceeding of underground work and the life safety of operators.
BRIEF DESCRIPTION OF THE DRAWINGS
[0034] Fig. 1 is a schematic diagram of the solid angle dω subtended by a unit area towards a light source;
[0035] Fig. 2 is a schematic diagram of reflection and refraction receiving of ray reverse tracing of the present invention; and
[0036] Fig. 3 is a process of eliminating strong light source interference by ray reverse tracing of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
The technical solutions of the present invention are further described below with reference to the accompanying drawings and embodiments.
[0037] According to the method for restoring an underground image on the basis of a ray reverse tracing technology of the present invention, under the conditions of low illuminance, much dust and high dampness in a coal mine, a suddenly appearing strong light source may interfere with the original video image, so that the black and white level contrast of the monitoring image becomes too great and the information in the video image cannot be recognized. Aiming at this phenomenon, a ray reverse tracing method is used, and the pixel value of the strong light source in a view plane is eliminated, so that the interference of the strong light source with the original video image is eliminated. As shown in Fig. 3, the process of eliminating strong light source interference by ray reverse tracing of the present invention concretely includes the following steps.
[0038] Step 1: the underground camera is taken as a light source emitting point, i.e., a view point, and rays are emitted into the underground scene. The intensity of these rays is equal to the light intensity of the rays emitted from a safety mine lamp.
[0039] Step 2: all intersection points of all rays and underground objects are recorded, and an intersection point closest to the view point in the intersection points is calculated.
[0040] Step 3: according to illumination, object materials and a normal direction, light intensity of reflection rays or refraction rays in the closest intersection point determined in the step 2 is calculated.
[0041] The light intensity of the reflection rays in the position of the intersection point is calculated through a formula (1):
I_r = I_a K_a + I_i (N·L) dω (K_d R_d + K_s R_s)    (1).
[0042] I_r represents the light intensity of the reflection rays. I_a K_a represents an influence value of environment light at the position of the intersection point. I_i represents the light intensity of incident light. K_d represents a specular reflectivity coefficient. K_s represents a diffuse reflectivity coefficient. R_d represents specular reflectivity. R_s represents diffuse reflectivity. N, L and dω respectively represent the object surface normal vector, the ray direction unit vector and the solid angle. As shown in Fig. 1, the horizontal axis direction represents the object surface, and the vertical axis direction represents the normal vector direction of the object surface; the solid angle is defined as the angle subtended at the observation point by the projection of an underground object onto a spherical surface, the three-dimensional spherical surface being formed with the camera as the observation point.
[0043] Or, the light intensity of the refraction rays in the position of the intersection point is calculated through a formula (2):
I_t = (cos θ2 / cos θ1)(I_i − I_r)    (2).
[0044] I_t represents the light intensity of the refraction rays, and θ1 and θ2 are the incidence angle and the refraction angle.
[0045] The light and shade effect is determined jointly by the normal direction of the first intersected object surface, the material, the view point, the illumination direction and the illumination intensity; in plain ray casting, second-layer and deeper rays are not considered, so shadow, reflection, refraction and fluorescence effects are absent.
[0046] Step 4: a direction of rays newly generated after the rays are reflected and refracted by the objects in a position of the intersection point is calculated. The direction of the rays newly generated is jointly determined by an incidence light direction, an object surface normal direction and media.
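The patent does not spell out the direction computation in step 4. A standard sketch, assuming the usual mirror-reflection formula and the vector form of Snell's law with relative refractive index eta = n1/n2 (these formulas are conventional optics, not taken from the patent text), is:

```python
def reflect(d, n):
    """Mirror reflection of unit incident direction d about unit normal n:
    r = d - 2(d.n)n."""
    dn = sum(a * b for a, b in zip(d, n))
    return tuple(a - 2 * dn * b for a, b in zip(d, n))

def refract(d, n, eta):
    """Vector form of Snell's law; d and n are unit vectors with n facing
    against d. Returns None on total internal reflection."""
    cos_i = -sum(a * b for a, b in zip(d, n))
    k = 1 - eta * eta * (1 - cos_i * cos_i)
    if k < 0:
        return None  # total internal reflection: no refracted ray exists
    return tuple(eta * a + (eta * cos_i - k ** 0.5) * b for a, b in zip(d, n))
```

This matches the statement above: the new direction depends only on the incident direction d, the surface normal n, and the media (through eta).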
[0047] Step 5: the rays newly generated in the step 4 are traced, and whether the third time reflection or refraction rays are emitted onto the view plane right in front of a safety mine lamp is judged; if so, the third time reflection light intensity and/or refraction light intensity is calculated; otherwise, the flow returns to the step 2 to redetermine the closest intersection point, and the step 3 to the step 5 are repeated.
[0048] After the rays are emitted from the camera, ray tracing is performed as follows:
the rays may intersect with transparent objects and nontransparent objects or may not intersect with any object in the scene after being emitted from the camera.
[0049] (1) If the rays do not intersect with any object, tracing is given up.
If the intersection point is on the nontransparent object, only the light intensity of the reflection rays is calculated. If the intersection point is on the transparent object, the light intensity of the reflection rays and the light intensity of refraction rays are calculated, and the rays obtained by reflecting or refracting the initial rays for three times are traced. If the rays obtained by reflecting or refracting the initial rays for three times are emitted onto the view plane right in front of the safety mine lamp, the light intensity of the rays is calculated. If not, the tracing is given up, and flow enters the step (2).
[0050] (2) If none of the reflection and refraction rays generated by the initial rays is emitted onto the view plane right in front of the safety mine lamp, the intersection point second closest to the view point among the intersection points of the initial rays and the objects is determined, and the step (1) is repeated. If the second closest intersection point does not meet the conditions, the next closest intersection point is calculated in sequence until an intersection point that meets the conditions is found.
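The branching in steps (1) and (2) can be sketched as a small decision function. Hit is a toy stand-in for an intersection record, an assumption for illustration rather than a structure from the patent:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Hit:
    transparent: bool  # toy intersection record: only the property used here

def spawned_rays(hit: Optional[Hit]):
    """Which secondary rays a traced ray generates under rule (1)."""
    if hit is None:
        return []                            # no intersection: give up tracing
    if hit.transparent:
        return ["reflection", "refraction"]  # transparent object: both rays
    return ["reflection"]                    # nontransparent object: reflection only
```

Rule (2) then amounts to iterating this check over intersection points in order of distance from the view point until a qualifying one is found.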
[0051] As shown in Fig. 2, an example of calculating the light intensity of the reflection rays and the light intensity of the refraction rays is concretely given as follows.
[0052] It is supposed that in the underground scene the camera is located at the view point, light is emitted from the camera, and a transparent object O1 and a nontransparent object O3 exist. First, an initial ray E is emitted from the view point and intersects O1 at P1, generating a reflection ray R1 and a refraction ray T1. The light intensity of R1 conforms to the formula I_r1 = I_a1 K_a1 + I_i (N1·L1) dω1 (K_d1 R_d1 + K_s1 R_s1), and since R1 no longer intersects other objects, its tracing ends. The light intensity of T1 conforms to the formula I_t1 = (cos θ2 / cos θ1)(I_i − I_r1). T1 intersects the inside of O1 at P2, generating a reflection ray R2 and a refraction ray T2. The light intensity of R2 conforms to the formula I_r2 = I_a2 K_a2 + I_t1 (N2·L2) dω2 (K_d2 R_d2 + K_s2 R_s2), and the light intensity of T2 conforms to the formula I_t2 = (cos θ4 / cos θ3)(I_t1 − I_r2). Recursion may be continuously performed to trace R2 and T2. For example, T2 intersects O3 at P3, and since O3 is nontransparent, only a reflection ray R3 is generated. The light intensity of R3 conforms to the formula I_r3 = I_a3 K_a3 + I_t2 (N3·L3) dω3 (K_d3 R_d3 + K_s3 R_s3). R3 finally enters the view plane.
[0053] θ1 and θ2 are the incidence angle and the refraction angle at the position P1. θ3 and θ4 are the incidence angle and the refraction angle at the position P2. I_a1 K_a1, I_a2 K_a2 and I_a3 K_a3 respectively represent influence values of the environment light at the positions P1, P2 and P3. I_i represents the light intensity of the ray E, i.e., the light intensity of the incident initial ray. K_d1, K_d2 and K_d3 respectively represent specular reflectivity coefficients at the positions P1, P2 and P3. K_s1, K_s2 and K_s3 respectively represent diffuse reflectivity coefficients at the positions P1, P2 and P3. R_d1, R_d2 and R_d3 respectively represent specular reflectivity at the positions P1, P2 and P3. R_s1, R_s2 and R_s3 respectively represent diffuse reflectivity at the positions P1, P2 and P3. N1, N2 and N3 respectively represent normal vectors of the object surface at the positions P1, P2 and P3. L1, L2 and L3 respectively represent unit vectors of the ray directions of the initial ray E, the refraction ray T1 and the refraction ray T2. dω1, dω2 and dω3 respectively represent the solid angles generated at the positions P1, P2 and P3.
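The intensity chain in the example above (E through P1, P2 and P3) can be reproduced numerically with formulas (1) and (2). Every coefficient and angle below is an arbitrary assumption chosen for illustration, not a value from the patent:

```python
import math

def refl(Ia_Ka, Ii, n_dot_l, d_omega, Kd, Rd, Ks, Rs):
    """Formula (1) evaluated at one intersection point."""
    return Ia_Ka + Ii * n_dot_l * d_omega * (Kd * Rd + Ks * Rs)

def refr(Ii, Ir, th_in, th_out):
    """Formula (2) evaluated at one intersection point."""
    return (math.cos(th_out) / math.cos(th_in)) * (Ii - Ir)

Ii = 10.0                                                   # intensity of ray E
Ir1 = refl(0.2, Ii, 0.8, 1.0, 0.5, 0.6, 0.3, 0.4)           # R1 at P1 (trace ends)
It1 = refr(Ii, Ir1, math.radians(30), math.radians(20))     # T1 at P1
Ir2 = refl(0.2, It1, 0.7, 1.0, 0.5, 0.6, 0.3, 0.4)          # R2 at P2
It2 = refr(It1, Ir2, math.radians(20), math.radians(30))    # T2 at P2
Ir3 = refl(0.2, It2, 0.9, 1.0, 0.5, 0.6, 0.3, 0.4)          # R3 at P3, enters view plane
```

Note how each stage feeds the refracted intensity of the previous stage in as its incident intensity, mirroring the I_t1 and I_t2 terms in the formulas above.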
[0054] Step 6: the light intensity in the step 5 is converted into a pixel value through a camera CCD photosensitive element. The rays obtained after the third time reflection and/or refraction of the rays emitted from the camera are emitted onto the view plane.
Imaging is performed on the view plane.
[0055] Step 7: the pixel value of strong light emitted from the camera is eliminated in an image finally shown on the view plane to obtain an image after strong light source influence elimination according to the methods as follows.
[0056] Besides the light of the safety mine lamp simulated by the light emitted from the underground camera, i.e., a light source A, other artificial lamp light, i.e., a light source B, also exists; meanwhile, the environment light, i.e., an artificial light source C, also exists.
[0057] When the third time reflection rays and/or refraction rays are irradiated onto the view plane, the image on the view plane may be shown as the following formula:
P(x,y) = R(x,y)·S(x,y)·L(x,y)    (3).
[0058] P(x,y) represents the image finally shown on the view plane. R(x,y) represents an image shown on the view plane when the camera does not emit light, i.e., the image shown on the view plane when the light source B and the light source C are overlapped.
S(x,y) represents an image on the view plane when only the camera emits light.
L(x,y) represents an image of the environment light, i.e., the light source C, on the view plane.
[0059] I(x,y) = R(x,y)·S(x,y)    (4) is set,
[0060] the logarithm is taken at both sides to obtain ln P(x,y) = ln I(x,y) + ln L(x,y)    (5),
[0061] and the environment light L(x,y) may be expressed through the convolution of P(x,y) with a Gaussian kernel G(x,y):
L(x,y) = P(x,y) * G(x,y)    (6)
[0062] wherein G(x,y) = λ·e^(−(x²+y²)/c²),
[0063] c represents the Gaussian surround scale, and λ is a scale factor that makes ∫∫G(x,y)dxdy = 1 always hold. Through the formulas (4), (5) and (6), it can be obtained:
ln R(x,y) = ln P(x,y) − ln(P(x,y) * G(x,y)) − ln S(x,y)
[0064] wherein S′(x,y) = e^(ln R(x,y)) is set,
[0065] and S'(x,y) is the image after the strong light source influence elimination.
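The elimination in formulas (4) through (6) amounts to a single-scale-Retinex-style correction: estimate the environment light by blurring P with a Gaussian, then subtract it and the camera-light image S in the log domain. The NumPy sketch below follows that reading; the blur sigma, kernel radius and epsilon guard are assumptions of mine, not values from the patent:

```python
import numpy as np

def gaussian_kernel(sigma, radius):
    """1-D kernel of the patent's exp(-(x^2+y^2)/c^2) form; the division
    by g.sum() plays the role of the scale factor that normalizes G."""
    x = np.arange(-radius, radius + 1, dtype=float)
    g = np.exp(-(x * x) / (sigma * sigma))
    return g / g.sum()

def blur(P, sigma):
    """Separable approximation of the convolution P*G in formula (6)."""
    k = gaussian_kernel(sigma, 3 * int(sigma))
    rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, P)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, rows)

def eliminate_strong_light(P, S, sigma=2, eps=1e-6):
    """S'(x,y) = exp(ln P - ln(P*G) - ln S): the image after strong
    light source influence elimination."""
    P = P.astype(float) + eps      # eps guards the logarithms against zeros
    S = S.astype(float) + eps
    L = blur(P, sigma) + eps       # estimate of the environment light L(x,y)
    return np.exp(np.log(P) - np.log(L) - np.log(S))
```

On a uniform scene with no camera-light contribution (S equal to one everywhere), the interior of the output is approximately one, since the blurred estimate of the environment light then accounts for the whole image.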
[0066] The present invention utilizes the ray reverse tracing technology. While greatly reducing the calculation amount of ray tracing, it effectively reduces the dazzle light phenomenon caused by a strong light source in a low-illuminance underground video image, thereby achieving the effect of restoring the video image.
Claims (4)
1. A method for restoring an underground image on the basis of a ray reverse tracing technology, comprising the following steps:
step 1: supposing an underground camera as a view point, and emitting rays into an underground scene;
step 2: recording all intersection points of all rays and underground objects, and calculating one intersection point closest to the view point in the intersection points;
step 3: according to illumination, object materials and a normal direction, calculating light intensity of reflection rays or refraction rays in the closest intersection point determined in the step 2;
step 4: calculating a direction of rays newly generated after the rays are reflected and refracted by the objects in a position of the intersection point;
step 5: tracing the rays newly generated in the step 4, and judging whether a third time reflection or refraction rays are emitted onto a view plane right in front of a safety mine lamp or not; if so, calculating a third time reflection light intensity or refraction light intensity; and otherwise, returning to the step 2 to redetermine the closest intersection point, and repeating the step 3 to the step 5;
step 6: converting the light intensity in the step 5 into a pixel value through a camera CCD photosensitive element, emitting rays obtained after the third time reflection or refraction of the rays emitted from the camera onto the view plane, and performing imaging on the view plane; and
step 7: eliminating the pixel value of strong light emitted from the camera in an image finally shown on the view plane to obtain an image after strong light source influence elimination.
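Steps 1 and 2 of claim 1 (emitting rays from the camera view point and selecting the intersection closest to it) can be sketched as a closest-hit search. The sphere scene, function name and tolerance below are illustrative assumptions, not part of the claim; ray directions are assumed to be unit vectors:

```python
import math

def closest_intersection(origin, direction, spheres):
    """Step 2: among all ray-object intersections, return the one closest
    to the view point, as (distance, sphere), or None if the ray misses.
    Each sphere is (center, radius); `direction` is assumed unit length."""
    best = None
    for center, radius in spheres:
        # Solve |origin + t*direction - center|^2 = radius^2 for smallest t > 0.
        oc = tuple(o - c for o, c in zip(origin, center))
        b = 2.0 * sum(d * v for d, v in zip(direction, oc))
        c = sum(v * v for v in oc) - radius * radius
        disc = b * b - 4.0 * c  # quadratic coefficient a = 1 for a unit direction
        if disc < 0:
            continue  # ray misses this sphere
        t = (-b - math.sqrt(disc)) / 2.0
        if t > 1e-6 and (best is None or t < best[0]):
            best = (t, (center, radius))
    return best
```

A ray along +z from the origin toward a unit sphere centered at (0, 0, 5) hits its near surface at distance 4; a ray that intersects nothing returns None, matching the "give up the tracing" case.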
2. The method for restoring the underground image on the basis of the ray reverse tracing technology according to claim 1, wherein in the step 3, the light intensity of the reflection rays or refraction rays in the closest intersection point determined in the step 2 is calculated according to the following method:
calculating the light intensity of the reflection rays in the position of the intersection point through a formula (1):
I_r = I_aK_a + I_i(N · L)dω(K_dR_d + K_sR_s) (1)
wherein I_r represents the light intensity of the reflection rays; I_aK_a represents the influence value of the environment light at the position of the intersection point; I_i represents the light intensity of the incident light; K_d represents a specular reflectivity coefficient; K_s represents a diffuse reflectivity coefficient; R_d represents the specular reflectivity; R_s represents the diffuse reflectivity; and N, L and dω respectively represent the object surface normal vector, the ray direction unit vector and the solid angle;
or calculating the light intensity of the refraction rays at the position of the intersection point through a formula (2):
I_t = (cos θ_2 / cos θ_1)(I_i − I_r) (2)
wherein I_t represents the light intensity of the refraction rays, and θ_1 and θ_2 are the incidence angle and the refraction angle.
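Formulas (1) and (2) transcribe directly to scalar arithmetic. The sketch below assumes all intensities and coefficients are scalars and that N and L are unit 3-vectors; the parameter names are chosen for illustration and are not from the claim:

```python
import math

def reflected_intensity(Ia, Ka, Ii, N, L, d_omega, Kd, Rd, Ks, Rs):
    """Formula (1): I_r = Ia*Ka + Ii*(N . L)*d_omega*(Kd*Rd + Ks*Rs)."""
    n_dot_l = sum(n * l for n, l in zip(N, L))  # N . L for 3-vectors
    return Ia * Ka + Ii * n_dot_l * d_omega * (Kd * Rd + Ks * Rs)

def refracted_intensity(Ii, Ir, theta1, theta2):
    """Formula (2): I_t = (cos theta2 / cos theta1) * (Ii - Ir),
    with theta1 the incidence angle and theta2 the refraction angle."""
    return (math.cos(theta2) / math.cos(theta1)) * (Ii - Ir)
```

For example, with ambient term 1.0 × 0.1, incident intensity 2.0, N parallel to L, unit solid angle, and all reflectivity terms 0.5, formula (1) gives 0.1 + 2.0 × 0.5 = 1.1; at normal incidence (both angles zero) formula (2) then gives 2.0 − 1.1 = 0.9.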
3. The method for restoring the underground image on the basis of the ray reverse tracing technology according to claim 1 or 2, wherein in the step 5, the rays newly generated in the step 4 are traced according to the following methods:
(1) if the rays do not intersect with any object, giving up the tracing; if the intersection point is on a nontransparent object, only calculating the light intensity of the reflection rays; if the intersection point is on a transparent object, calculating the light intensity of the reflection rays and the light intensity of the refraction rays, and tracing the rays obtained by reflecting or refracting the initial rays three times; if the rays obtained by reflecting or refracting the initial rays three times are emitted onto the view plane right in front of the safety mine lamp, calculating the light intensity of the rays; and if not, giving up the tracing, and entering the step (2); and
(2) if none of the reflection and refraction rays generated by the initial rays are emitted onto the view plane right in front of the safety mine lamp, determining the intersection point second closest to the view point among the intersection points of the initial rays and the objects, and repeating the step (1); if the second closest intersection point does not meet the conditions, sequentially calculating the next closest intersection point until the intersection point found meets the conditions.
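The next-closest fallback in rules (1) and (2) is an ordered scan over the candidate intersections. In this sketch, `reaches_view_plane` is a hypothetical caller-supplied predicate standing in for the full third-bounce test against the view plane in front of the safety mine lamp:

```python
def select_intersection(intersections, reaches_view_plane):
    """Walk intersections from closest to farthest (rule (2)) and return the
    first one whose third-bounce rays reach the view plane (rule (1)).
    `intersections` is a list of (distance, point) pairs; returns None when
    no intersection qualifies, i.e. the tracing is given up."""
    for _dist, point in sorted(intersections):
        if reaches_view_plane(point):
            return point
    return None
```

When the closest hit fails the test, the scan simply moves on to the second closest, then the third, exactly as the claim describes.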
4. The method for restoring the underground image on the basis of the ray reverse tracing technology according to claim 1 or 2, wherein in the step 7, the pixel value of the strong light emitted from the camera is eliminated in the image finally shown on the view plane to obtain the image after the strong light source influence elimination according to the following method:
when the third time reflection rays or refraction rays are irradiated onto the view plane, the image on the view plane is shown as the following formula:
P(x, y) = R(x, y) · S(x, y) · L(x, y) (3)
wherein P(x, y) represents the image finally shown on the view plane; R(x, y) represents the image shown on the view plane when the camera does not emit light; S(x, y) represents the image on the view plane when only the camera emits light; and L(x, y) represents the image of the environment light on the view plane;
I(x, y) = R(x, y) · S(x, y) (4)
is set, and the logarithm is taken at both sides to obtain
ln P(x, y) = ln I(x, y) + ln L(x, y) (5);
the environment light L(x, y) is obtained through the convolution of P(x, y) with a Gaussian function G(x, y) = λe^(−(x² + y²)/C²):
L(x, y) = P(x, y) * G(x, y) (6)
wherein C represents the Gaussian surround scale, and λ is a scale factor; through the formulas (4), (5) and (6), it can be obtained that
ln R(x, y) = ln P(x, y) − ln(P(x, y) * G(x, y)) − ln S(x, y)
wherein S′(x, y) = e^(ln R(x, y)) is set, and S′(x, y) is the image after the strong light source influence elimination.
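Formulas (3) through the final expression amount to a single-scale-Retinex-style correction: estimate the ambient term L as a Gaussian blur of P, then subtract it and ln S in the log domain. A minimal pure-Python sketch, assuming small positive-valued images stored as nested lists, a sampled normalized kernel, and clamp-to-edge borders (all my assumptions, not stated in the claim):

```python
import math

def gaussian_kernel(size, c):
    """G(x, y) = lam * exp(-(x^2 + y^2) / C^2), with lam chosen so the
    sampled kernel sums to 1 (the scale-factor role of lambda in formula (6))."""
    half = size // 2
    raw = [[math.exp(-(x * x + y * y) / (c * c)) for x in range(-half, half + 1)]
           for y in range(-half, half + 1)]
    total = sum(sum(row) for row in raw)
    return [[v / total for v in row] for row in raw]

def remove_glare(P, S, size=3, c=1.0):
    """S'(x, y) = exp(ln P - ln(P * G) - ln S), per the final formula of claim 4."""
    G = gaussian_kernel(size, c)
    half = size // 2
    h, w = len(P), len(P[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            # L(x, y) = P(x, y) * G(x, y): Gaussian-weighted surround, formula (6)
            acc = 0.0
            for di in range(-half, half + 1):
                for dj in range(-half, half + 1):
                    ii = min(max(i + di, 0), h - 1)  # clamp-to-edge borders
                    jj = min(max(j + dj, 0), w - 1)
                    acc += P[ii][jj] * G[di + half][dj + half]
            out[i][j] = math.exp(math.log(P[i][j]) - math.log(acc) - math.log(S[i][j]))
    return out
```

On a uniform image the blur returns the image itself, so ln P − ln(P * G) vanishes and S′ reduces to 1/S times P's reflectance, which makes the log-domain cancellation easy to verify by hand.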
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2019100067663 | 2019-01-04 | ||
CN201910006766.3A CN109862209B (en) | 2019-01-04 | 2019-01-04 | Method for restoring underground image based on light ray inverse tracking technology |
PCT/CN2019/091631 WO2020140397A1 (en) | 2019-01-04 | 2019-06-18 | Method for restoring downhole image based on reverse ray tracing technology |
Publications (2)
Publication Number | Publication Date |
---|---|
CA3079552A1 CA3079552A1 (en) | 2020-07-04 |
CA3079552C true CA3079552C (en) | 2021-03-16 |
Family
ID=66893940
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA3079552A Active CA3079552C (en) | 2019-01-04 | 2019-06-18 | Method for restoring underground image on basis of ray reverse tracing technology |
Country Status (5)
Country | Link |
---|---|
CN (1) | CN109862209B (en) |
AU (1) | AU2019395238B2 (en) |
CA (1) | CA3079552C (en) |
RU (1) | RU2742814C9 (en) |
WO (1) | WO2020140397A1 (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109862209B (en) * | 2019-01-04 | 2021-02-26 | 中国矿业大学 | Method for restoring underground image based on light ray inverse tracking technology |
CN114286375B (en) * | 2021-12-16 | 2023-08-18 | 北京邮电大学 | Mobile communication network interference positioning method |
CN114549339B (en) * | 2022-01-04 | 2024-08-02 | 中南大学 | Blast furnace burden surface image restoration method and system in severe environment |
CN116051450B (en) * | 2022-08-15 | 2023-11-24 | 荣耀终端有限公司 | Glare information acquisition method, device, chip, electronic equipment and medium |
CN116681814B (en) * | 2022-09-19 | 2024-05-24 | 荣耀终端有限公司 | Image rendering method and electronic equipment |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6005916A (en) * | 1992-10-14 | 1999-12-21 | Techniscan, Inc. | Apparatus and method for imaging with wavefields using inverse scattering techniques |
US7389041B2 (en) * | 2005-02-01 | 2008-06-17 | Eastman Kodak Company | Determining scene distance in digital camera images |
US7764230B2 (en) * | 2007-03-13 | 2010-07-27 | Alcatel-Lucent Usa Inc. | Methods for locating transmitters using backward ray tracing |
WO2009086836A1 (en) * | 2008-01-11 | 2009-07-16 | Danmarks Tekniske Universitet | A touch-sensitive device |
WO2011066275A2 (en) * | 2009-11-25 | 2011-06-03 | Massachusetts Institute Of Technology | Actively addressable aperture light field camera |
KR101395255B1 (en) * | 2010-09-09 | 2014-05-15 | 한국전자통신연구원 | Apparatus and method for analysing propagation of radio wave in radio wave system |
RU125335U1 (en) * | 2012-11-07 | 2013-02-27 | Общество с ограниченной ответственностью "Артек Венчурз" | DEVICE FOR MONITORING LINEAR SIZES OF THREE-DIMENSIONAL OBJECTS |
US9041914B2 (en) * | 2013-03-15 | 2015-05-26 | Faro Technologies, Inc. | Three-dimensional coordinate scanner and method of operation |
KR101716928B1 (en) * | 2013-08-22 | 2017-03-15 | 주식회사 만도 | Image processing method for vehicle camera and image processing apparatus usnig the same |
JP2015132953A (en) * | 2014-01-10 | 2015-07-23 | キヤノン株式会社 | Image processor and method thereof |
US9311565B2 (en) * | 2014-06-16 | 2016-04-12 | Sony Corporation | 3D scanning with depth cameras using mesh sculpting |
WO2016002578A1 (en) * | 2014-07-04 | 2016-01-07 | ソニー株式会社 | Image processing device and method |
US9977644B2 (en) * | 2014-07-29 | 2018-05-22 | The University Of North Carolina At Chapel Hill | Methods, systems, and computer readable media for conducting interactive sound propagation and rendering for a plurality of sound sources in a virtual environment scene |
KR101592793B1 (en) * | 2014-12-10 | 2016-02-12 | 현대자동차주식회사 | Apparatus and Method for Correcting Image Distortion |
KR20160071774A (en) * | 2014-12-12 | 2016-06-22 | 삼성전자주식회사 | Apparatus, Method and recording medium for processing image |
AU2017250112B2 (en) * | 2016-04-12 | 2020-09-17 | Quidient, Llc | Quotidian scene reconstruction engine |
CN106231286B (en) * | 2016-07-11 | 2018-03-20 | 北京邮电大学 | A kind of three-dimensional image generating method and device |
CN109118531A (en) * | 2018-07-26 | 2019-01-01 | 深圳大学 | Three-dimensional rebuilding method, device, computer equipment and the storage medium of transparent substance |
CN109862209B (en) * | 2019-01-04 | 2021-02-26 | 中国矿业大学 | Method for restoring underground image based on light ray inverse tracking technology |
- 2019-01-04 CN CN201910006766.3A patent/CN109862209B/en active Active
- 2019-06-18 CA CA3079552A patent/CA3079552C/en active Active
- 2019-06-18 AU AU2019395238A patent/AU2019395238B2/en active Active
- 2019-06-18 WO PCT/CN2019/091631 patent/WO2020140397A1/en active Application Filing
- 2019-06-18 RU RU2020115096A patent/RU2742814C9/en active
Also Published As
Publication number | Publication date |
---|---|
CN109862209A (en) | 2019-06-07 |
CA3079552A1 (en) | 2020-07-04 |
RU2742814C9 (en) | 2021-04-20 |
CN109862209B (en) | 2021-02-26 |
WO2020140397A1 (en) | 2020-07-09 |
RU2742814C1 (en) | 2021-02-11 |
AU2019395238A1 (en) | 2020-07-23 |
AU2019395238B2 (en) | 2021-11-25 |