AU2019395238B2 - Method for restoring underground image on basis of ray reverse tracing technology - Google Patents


Info

Publication number
AU2019395238B2
Authority
AU
Australia
Prior art keywords
rays
image
camera
light intensity
underground
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
AU2019395238A
Other versions
AU2019395238A1 (en)
Inventor
Xiaoyu Li
Bowen Liu
Peng Liu
Xuliang LU
Lei SI
Chao TAN
Zhongbin WANG
Honglin Wu
Yue Wu
Hongya Zhou
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China University of Mining and Technology CUMT
Xuzhou Goldfluid Hydraulic Technology Development Co Ltd
Original Assignee
China University of Mining and Technology CUMT
Xuzhou Goldfluid Hydraulic Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China University of Mining and Technology CUMT, Xuzhou Goldfluid Hydraulic Technology Development Co Ltd filed Critical China University of Mining and Technology CUMT
Publication of AU2019395238A1 publication Critical patent/AU2019395238A1/en
Application granted granted Critical
Publication of AU2019395238B2 publication Critical patent/AU2019395238B2/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • H04N23/81Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/60Noise processing, e.g. detecting, correcting, reducing or removing noise
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Generation (AREA)
  • Geophysics And Detection Of Objects (AREA)
  • Image Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present invention discloses a method for restoring an underground image on the basis of a ray reverse tracing technology. The method includes the following steps: setting an underground camera as a light source emitting point, and emitting rays into an underground scene; recording all intersection points of all rays and underground objects, and calculating the one intersection point closest to a view point among the intersection points; calculating a direction of rays newly generated after the rays are reflected and refracted by the objects at the position of the intersection point; respectively tracing the rays newly generated; recording rays irradiated onto a view plane after the strong light emitted from the position of the camera is reflected or refracted three times, and calculating the light intensity of the rays; converting the light intensity into a pixel value by a camera CCD photosensitive element; and eliminating the pixel value of strong light emitted from the camera in an image finally shown on the view plane to obtain an image after strong light source influence elimination. The method provided by the present invention can effectively eliminate the interference of a strong light source, restore the underground image, and ensure the smooth proceeding of underground work and the life safety of operators.

Description

METHOD FOR RESTORING UNDERGROUND IMAGE ON BASIS OF RAY REVERSE TRACING TECHNOLOGY FIELD OF THE INVENTION
[0001] The present invention belongs to the field of underground image restoration, and particularly relates to a method for restoring an underground image on the basis of a ray reverse tracing technology.
DESCRIPTION OF RELATED ART
[0002] Ray tracing is a method for showing a three-dimensional (3D) scene on a two-dimensional (2D) screen; it is widely applied in games and computer graphics at present and brings a more vivid effect to viewers. A light source is modelled as a point light source capable of randomly emitting tens of thousands of rays to its surroundings, and those rays are reflected, refracted or absorbed (attenuated), or generate fluorescence, after touching different objects. Ray tracing is a general technique from geometrical optics: a ray-path model is obtained by tracing the rays that interact with each optical surface. However, tens of thousands of rays exist, and the rays produced after reflection, refraction, absorption and fluorescence generation are countless, so the calculation amount of forward ray tracing is great. Therefore, the ray reverse tracing method has gradually come into view: the calculation amount is greatly reduced if the camera lens is used as the light source emitting point and only the rays entering the view plane are calculated.
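The reduction in ray count can be illustrated with the usual pinhole-camera setup: in reverse tracing, one ray per view-plane pixel suffices. A minimal numpy sketch, where the field of view and the camera model are illustrative assumptions, not from the patent:

```python
import numpy as np

def camera_rays(width, height, fov_deg=60.0):
    """Generate one backward-traced ray per pixel, treating the camera
    as the emitting point (reverse tracing), so only rays that can
    reach the view plane are ever created."""
    aspect = width / height
    half = np.tan(np.radians(fov_deg) / 2.0)
    rays = []
    for j in range(height):
        for i in range(width):
            # Map pixel centre to the view plane in camera coordinates
            x = (2.0 * (i + 0.5) / width - 1.0) * half * aspect
            y = (1.0 - 2.0 * (j + 0.5) / height) * half
            d = np.array([x, y, -1.0])
            rays.append(d / np.linalg.norm(d))   # unit direction
    return rays

rays = camera_rays(4, 3)
# A 4x3 view plane needs exactly 12 rays, versus the "tens of
# thousands" a forward-traced point light source would emit.
```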
[0003] Due to the fact that most explosion-proof cameras used underground at present are black-and-white cameras, the special underground environment of a coal mine, all-weather artificial illumination, and the influence of factors such as dust and dampness, an underground video has the characteristics of low image illuminance and nonuniform illumination distribution; these special conditions cause low quality of the collected video and poor resolution. When a strong light source such as a safety mine lamp appears in the view field of a mine camera, the collected image exhibits a dazzle light phenomenon, so that the quality of the video image is greatly reduced and safety accidents may be caused. Therefore, applying the ray reverse tracing technology to underground image restoration to improve image readability is of great importance. It is an object of the present invention to overcome one or more of the above disadvantages, at least to an extent.
SUMMARY OF THE INVENTION
[0004] An embodiment of the present invention aims to provide a method for restoring an underground image on the basis of a ray reverse tracing technology. Aiming at the phenomenon that, under the conditions of low illuminance and much dust in a coal mine, a suddenly occurring strong light source may interfere with the original video image, so that the black and white level contrast of the monitoring image is too great and information in the video image cannot be recognized, a ray reverse tracing method is used, and the pixel value of the strong light source in a view plane is eliminated, so that the interference of the strong light source with the original video image is eliminated.
[0005] In order to achieve a goal of the present invention, an embodiment of the present invention adopts a technical solution that a method for restoring an underground image on the basis of a ray reverse tracing technology includes the following steps:
[0006] step 1: supposing an underground camera as a light source emitting point, i.e., a view point, and emitting rays into an underground scene;
[0007] step 2: recording all intersection points of all rays and underground objects, and calculating one intersection point closest to the view point in the intersection points;
[0008] step 3: according to illumination, object materials and a normal direction, calculating light intensity of reflection rays or refraction rays in the closest intersection point determined in the step 2;
[0009] step 4: calculating a direction of rays newly generated after the rays are reflected and refracted by the objects in a position of the intersection point;
[0010] step 5: tracing the rays newly generated in the step 4, and judging whether the third time reflection or refraction rays are emitted onto a view plane right in front of a safety mine lamp or not; if so, calculating the third time reflection light intensity and/or refraction light intensity; and otherwise, returning to the step 2 to redetermine the closest intersection point, and repeating the step 3 to the step 5;
[0011] step 6: converting the light intensity in the step 5 into a pixel value through a camera CCD photosensitive element, emitting rays obtained after the third time reflection and/or refraction of the rays emitted from the camera onto the view plane, and performing imaging on the view plane; and
[0012] step 7: eliminating the pixel value of strong light emitted from the camera in an image finally shown on the view plane to obtain an image after strong light source influence elimination.
[0013] In the step 3, the light intensity of the reflection rays or refraction rays in the closest intersection point determined in the step 2 is calculated according to the following method:
[0014] calculating the light intensity of the reflection rays in the position of the intersection point through a formula (1):
I_r = I_aK_a + I_i(N · L)dω(K_dR_d + K_sR_s)   (1)
[0015] wherein I_r represents the light intensity of the reflection rays; I_aK_a represents an influence value of environment light at the position of the intersection point; I_i represents the light intensity of incident light; K_d represents a specular reflectivity coefficient; K_s represents a diffuse reflectivity coefficient; R_d represents specular reflectivity; R_s represents diffuse reflectivity; and N, L and dω respectively represent an object surface normal vector, a ray direction unit vector and a solid angle;
[0016] or calculating the light intensity of the refraction rays in the position of the intersection point through a formula (2):
I_t = (cos θ_2 / cos θ_1)(I_i − I_r)   (2)
[0017] wherein I_t represents the light intensity of the refraction rays, and θ_1 and θ_2 are an incidence angle and a refraction angle.
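Formulas (1) and (2) translate directly into code. A minimal sketch, with parameter names following the symbols above; the functions and their signatures are illustrative, not the patent's implementation:

```python
import numpy as np

def reflected_intensity(Ia, Ka, Ii, N, L, d_omega, Kd, Rd, Ks, Rs):
    """Formula (1): I_r = Ia*Ka + Ii*(N . L)*d_omega*(Kd*Rd + Ks*Rs).
    N and L are the surface normal and ray-direction unit vectors."""
    return Ia * Ka + Ii * float(np.dot(N, L)) * d_omega * (Kd * Rd + Ks * Rs)

def refracted_intensity(Ii, Ir, theta1, theta2):
    """Formula (2): I_t = (cos(theta2)/cos(theta1)) * (Ii - Ir), with
    theta1 the incidence angle and theta2 the refraction angle (radians)."""
    return (np.cos(theta2) / np.cos(theta1)) * (Ii - Ir)
```

At normal incidence (N · L = 1, θ_1 = θ_2 = 0) the refracted term reduces to I_i − I_r, i.e. whatever is not reflected is transmitted.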
[0018] In the step 5, the rays newly generated in the step 4 are traced according to the following methods:
[0019] (1) if the rays do not intersect with any object, giving up the tracing; if the intersection point is on a nontransparent object, only calculating the light intensity of the reflection rays; if the intersection point is on a transparent object, calculating the light intensity of the reflection rays and the light intensity of the refraction rays, and tracing the rays obtained by reflecting or refracting the initial rays for three times; if the rays obtained by reflecting or refracting the initial rays for three times are emitted onto the view plane right in front of the safety mine lamp, calculating the light intensity of the rays; and if not, giving up the tracing, and entering the step (2); and
[0020] (2) if all reflection and refraction rays generated by the initial rays are not emitted onto the view plane right in front of the safety mine lamp, determining an intersection point second closest to the view point in the intersection points of the initial rays and the objects; repeating the step (1); if the second closest intersection point does not meet conditions, sequentially calculating the next closest intersection point until the intersection point found meets the conditions.
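The nearest-first search with fallback in steps (1) and (2) above can be sketched as a simple loop; `reaches_view_plane` is a hypothetical predicate standing in for the full three-bounce trace of step (1), and `hits` stands for the distance-sorted intersection list (both names are assumptions for illustration):

```python
def trace_initial_ray(hits, reaches_view_plane):
    """Step (2) fallback: try the closest intersection first; if no
    three-bounce ray tree from it reaches the view plane, move on to
    the next-closest intersection."""
    for hit in hits:          # closest intersection first
        if reaches_view_plane(hit):
            return hit        # this intersection meets the conditions
    return None               # no intersection leads to the view plane
```

For example, with three candidate intersections where only the second-closest one produces a ray onto the view plane, the loop skips the first and returns the second.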
[0021] In the step 7, the pixel value of the strong light emitted from the camera is eliminated in the image finally shown on the view plane to obtain the image after the strong light source influence elimination according to the following methods:
[0022] besides light of the safety mine lamp simulated by light emitted from the camera underground, i.e., a light source A, other artificial lamp light, i.e., a light source B also exists, and meanwhile, the environment light, i.e., an artificial light source C also exists.
[0023] When the third time reflection rays and/or refraction rays are irradiated onto the view plane, the image on the view plane is shown as the following formula:
P(x, y) = R (x, y) S(x, y) - L (x, y) • (3)
[0024] wherein P(xy) represents the image finally shown on the view plane; R(xy) represents an image shown on the view plane when the camera does not emit light, i.e., the image shown on the view plane when the light source B and the light source C are overlapped; S(xy) represents an image on the view plane when only the camera emits light; and L(xy) represents an image of the environment light, i.e., the light source C, on the view plane.
[0025] I(x, y) = R(x, y) · S(x, y)   (4) is set,
[0026] the logarithm is taken at both sides to obtain ln P(x, y) = ln I(x, y) + ln L(x, y)   (5),
[0027] and the environment light L(x, y) is expressed through the Gaussian kernel convolution of P(x, y) with a Gaussian function G(x, y):
L(x, y) = P(x, y) * G(x, y)   (6)
[0028] wherein G(x, y) = λ·e^(−(x² + y²)/C²),
[0029] C represents a Gaussian surrounding scale, and λ is a scale that enables ∫∫ G(x, y) dx dy = 1 to be always true. Through the formulas (4), (5) and (6), it can be obtained that:
ln R(x, y) = ln P(x, y) − ln(P(x, y) * G(x, y)) − ln S(x, y),
[0030] wherein S′(x, y) = e^(ln R(x, y)) is set,
[0031] and S′(x, y) is the image after the strong light source influence elimination.
Advantageous Effect
[0032] Compared with the prior art, the technical solution of an embodiment of the present invention has the following beneficial technical effects:
[0033] the present invention changes the conventional thinking on image processing by utilizing ray reverse tracing. Conventional methods mostly use linear conversion, gamma correction, histogram equalization, unsharp masking, homomorphic filtering, tone mapping, the dark channel algorithm and the like for the condition of sudden occurrence of a strong light source, and the processing effect is not obvious. The ray reverse tracing technology can effectively eliminate the interference of the strong light source, restore the original underground image, and ensure the smooth proceeding of underground work and the life safety of operators.
BRIEF DESCRIPTION OF THE DRAWINGS
[0034] Fig. 1 is a schematic diagram of the solid angle dω subtended by a unit area towards a light source;
[0035] Fig. 2 is a schematic diagram of reflection and refraction receiving of ray reverse tracing of the present invention; and
[0036] Fig. 3 is a process of eliminating strong light source interference by ray reverse tracing of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
The technical solutions of the present invention are further described below with reference to the accompanying drawings and embodiments.
[0037] According to the method for restoring an underground image on the basis of a ray reverse tracing technology of the present invention, aiming at the phenomenon that, under the conditions of low illuminance, much dust and high dampness in a coal mine, a suddenly occurring strong light source may interfere with the original video image, so that the black and white level contrast of the monitoring image is too great and information in the video image cannot be recognized, a ray reverse tracing method is used, and the pixel value of the strong light source in a view plane is eliminated, so that the interference of the strong light source with the original video image is eliminated. As shown in Fig. 3, the process of eliminating strong light source interference by ray reverse tracing of the present invention concretely includes the following steps.
[0038] Step 1: an underground camera is supposed as a light source emitting point, i.e., a view point, and rays are emitted into an underground scene. Intensity of the rays is equal to light intensity of rays emitted from a safety mine lamp.
[0039] Step 2: all intersection points of all rays and underground objects are recorded, and an intersection point closest to the view point in the intersection points is calculated.
[0040] Step 3: according to illumination, object materials and a normal direction, light intensity of reflection rays or refraction rays in the closest intersection point determined in the step 2 is calculated.
[0041] The light intensity of the reflection rays in the position of the intersection point is calculated through a formula (1):
I_r = I_aK_a + I_i(N · L)dω(K_dR_d + K_sR_s)   (1).
[0042] I_r represents the light intensity of the reflection rays. I_aK_a represents an influence value of environment light at the position of the intersection point. I_i represents the light intensity of incident light. K_d represents a specular reflectivity coefficient. K_s represents a diffuse reflectivity coefficient. R_d represents specular reflectivity. R_s represents diffuse reflectivity. N, L and dω respectively represent an object surface normal vector, a ray direction unit vector and a solid angle. As shown in Fig. 1, the horizontal axis direction represents the object surface, and the longitudinal axis direction represents the normal vector direction of the object surface; the solid angle is defined as the angle subtended at the observation point by the projection area of an underground object on a three-dimensional spherical surface formed by using the camera as the observation point.
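The solid angle dω used in formula (1) follows the standard definition: the area the object projects onto a sphere centred at the observation point, divided by the squared sphere radius. A one-line sketch of that definition (the function name is illustrative):

```python
import numpy as np

def solid_angle(projected_area, radius):
    """Solid angle in steradians subtended by a patch of area
    `projected_area` on a sphere of radius `radius` centred at the
    camera: omega = A / r^2. The full sphere gives 4*pi sr."""
    return projected_area / radius**2

# Sanity check: the whole sphere of radius r has area 4*pi*r^2,
# so its solid angle is 4*pi steradians regardless of r.
omega_full = solid_angle(4.0 * np.pi * 2.0**2, 2.0)
```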
[0043] Or, the light intensity of the refraction rays in the position of the intersection point is calculated through a formula (2):
I_t = (cos θ_2 / cos θ_1)(I_i − I_r)   (2).
[0044] I_t represents the light intensity of the refraction rays, and θ_1 and θ_2 are an incidence angle and a refraction angle.
[0045] In simple ray casting, the light and shade effect is determined jointly only by the surface normal direction, material, view point, illumination direction and illumination intensity of the first intersected object; rays of the second and deeper layers are not considered, so shadow, reflection, refraction and fluorescence effects do not exist.
[0046] Step 4: a direction of rays newly generated after the rays are reflected and refracted by the objects in a position of the intersection point is calculated. The direction of the rays newly generated is jointly determined by an incidence light direction, an object surface normal direction and media.
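The patent states only that the new direction is determined jointly by the incidence direction, the surface normal and the media. The standard geometric-optics vector forms of mirror reflection and Snell refraction satisfy that description and can be sketched as follows (a sketch under those assumptions, not the patent's own formulation):

```python
import numpy as np

def reflect(d, n):
    """Mirror reflection of incident unit direction d about unit normal n."""
    return d - 2.0 * np.dot(d, n) * n

def refract(d, n, n1, n2):
    """Snell refraction of unit direction d at a surface with unit normal n
    (pointing against d), passing from refractive index n1 into n2.
    Returns None on total internal reflection."""
    cos_i = -np.dot(d, n)
    eta = n1 / n2
    k = 1.0 - eta**2 * (1.0 - cos_i**2)
    if k < 0.0:
        return None          # total internal reflection: no refracted ray
    return eta * d + (eta * cos_i - np.sqrt(k)) * n
```

With identical media on both sides (n1 = n2) the refracted direction equals the incident direction, and past the critical angle `refract` correctly reports total internal reflection.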
[0047] Step 5: the rays newly generated in the step 4 are traced, and whether the third time reflection or refraction rays are emitted onto the view plane right in front of a safety mine lamp or not is judged; if so, the third time reflection light intensity and/or refraction light intensity is calculated; otherwise, the flow returns to the step 2 to redetermine the closest intersection point, and the step 3 to the step 5 are repeated.
[0048] After the rays are emitted from the camera, ray tracing is performed as follows: the rays may intersect with transparent objects and nontransparent objects or may not intersect with any object in the scene after being emitted from the camera.
[0049] (1) If the rays do not intersect with any object, tracing is given up. If the intersection point is on the nontransparent object, only the light intensity of the reflection rays is calculated. If the intersection point is on the transparent object, the light intensity of the reflection rays and the light intensity of refraction rays are calculated, and the rays obtained by reflecting or refracting the initial rays for three times are traced. If the rays obtained by reflecting or refracting the initial rays for three times are emitted onto the view plane right in front of the safety mine lamp, the light intensity of the rays is calculated. If not, the tracing is given up, and flow enters the step (2).
[0050] (2) If all reflection and refraction rays generated by the initial rays are not emitted onto the view plane right in front of the safety mine lamp, an intersection point second closest to the view point is determined among the intersection points of the initial rays and the objects. The step (1) is repeated. If the second closest intersection point does not meet the conditions, the next closest intersection point is sequentially calculated until the intersection point found meets the conditions.
[0051] As shown in Fig. 2, an example of calculating the light intensity of the reflection rays and the light intensity of the refraction rays is given concretely as follows.
[0052] It is supposed that in the underground scene, the camera is positioned at the view point; light is emitted from the camera; and a transparent object O_1 and nontransparent objects exist. Firstly, an initial ray E is emitted from the view point and intersects the O_1 at P_1, and a reflection ray R_1 and a refraction ray T_1 are generated. The light intensity of the R_1 conforms to the formula I_r1 = I_a1K_a1 + I_i(N_1 · L_1)dω_1(K_d1R_d1 + K_s1R_s1), and since the R_1 no longer intersects other objects, its tracing is ended. The light intensity of the T_1 conforms to the formula I_t1 = (cos θ_2 / cos θ_1)(I_i − I_r1). The T_1 intersects the inside of the O_1 at P_2, and a reflection ray R_2 and a refraction ray T_2 are generated. The light intensity of the R_2 conforms to the formula I_r2 = I_a2K_a2 + I_t1(N_2 · L_2)dω_2(K_d2R_d2 + K_s2R_s2), and the light intensity of the T_2 conforms to the formula I_t2 = (cos θ_4 / cos θ_3)(I_t1 − I_r2). Recursion may be continuously performed to trace the R_2 and the T_2. For example, the T_2 intersects a nontransparent object O_3 at P_3, and since the O_3 is nontransparent, only a reflection ray R_3 is generated. The light intensity of the R_3 conforms to the formula I_r3 = I_a3K_a3 + I_t2(N_3 · L_3)dω_3(K_d3R_d3 + K_s3R_s3). The R_3 finally enters the view plane.
[0053] θ_1 and θ_2 are the incidence angle and the refraction angle at the position P_1. θ_3 and θ_4 are the incidence angle and the refraction angle at the position P_2. I_a1K_a1, I_a2K_a2 and I_a3K_a3 respectively represent the influence values of the environment light at the positions P_1, P_2 and P_3. I_i represents the light intensity of the ray E, i.e., the light intensity of the incident light of the initial ray. K_d1, K_d2 and K_d3 respectively represent the specular reflectivity coefficients at the positions P_1, P_2 and P_3. K_s1, K_s2 and K_s3 respectively represent the diffuse reflectivity coefficients at the positions P_1, P_2 and P_3. R_d1, R_d2 and R_d3 respectively represent the specular reflectivity at the positions P_1, P_2 and P_3. R_s1, R_s2 and R_s3 respectively represent the diffuse reflectivity at the positions P_1, P_2 and P_3. N_1, N_2 and N_3 respectively represent the normal vectors of the object surface at the positions P_1, P_2 and P_3. L_1, L_2 and L_3 respectively represent the unit vectors of the ray directions of the initial ray E, the refraction ray T_1 and the refraction ray T_2. dω_1, dω_2 and dω_3 respectively represent the solid angles generated at the positions P_1, P_2 and P_3.
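The chained application of formulas (1) and (2) along E → P_1 → P_2 → P_3 can be checked numerically. All coefficient and angle values below are illustrative, since the patent specifies no numbers:

```python
import numpy as np

def I_reflect(IaKa, I_in, NdotL, d_omega, Kd, Rd, Ks, Rs):
    # Formula (1), with the ambient term Ia*Ka passed as one value
    return IaKa + I_in * NdotL * d_omega * (Kd * Rd + Ks * Rs)

def I_refract(I_in, I_r, th_in, th_out):
    # Formula (2): the non-reflected remainder enters the next medium
    return (np.cos(th_out) / np.cos(th_in)) * (I_in - I_r)

Ii  = 1.0                                                  # intensity of ray E
Ir1 = I_reflect(0.05, Ii, 0.9, 0.3, 0.3, 0.8, 0.4, 0.6)    # reflection at P_1
It1 = I_refract(Ii, Ir1, np.radians(30), np.radians(20))   # refraction into O_1
Ir2 = I_reflect(0.05, It1, 0.9, 0.3, 0.3, 0.8, 0.4, 0.6)   # reflection at P_2
It2 = I_refract(It1, Ir2, np.radians(20), np.radians(30))  # refraction out of O_1
Ir3 = I_reflect(0.05, It2, 0.9, 0.3, 0.3, 0.8, 0.4, 0.6)   # reflection at P_3
```

Each intensity entering an interface bounds the intensities leaving it, so the energy carried towards the view plane attenuates bounce by bounce, as the example describes.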
[0054] Step 6: the light intensity in the step 5 is converted into a pixel value through a camera CCD photosensitive element. The rays obtained after the third time reflection and/or refraction of the rays emitted from the camera are emitted onto the view plane. Imaging is performed on the view plane.
[0055] Step 7: the pixel value of strong light emitted from the camera is eliminated in an image finally shown on the view plane to obtain an image after strong light source influence elimination according to the methods as follows.
[0056] Besides the light of the safety mine lamp simulated by the light emitted from the camera underground, i.e., a light source A, other artificial lamp light, i.e., a light source B, also exists, and meanwhile, environment light, i.e., an artificial light source C, also exists.
[0057] When the third time reflection rays and/or refraction rays are irradiated onto the view plane, the image on the view plane may be shown as the following formula:
P(x, y) = R(x, y) · S(x, y) · L(x, y)   (3).
[0058] P(x, y) represents the image finally shown on the view plane. R(x, y) represents an image shown on the view plane when the camera does not emit light, i.e., the image shown on the view plane when the light source B and the light source C are overlapped. S(x, y) represents an image on the view plane when only the camera emits light. L(x, y) represents an image of the environment light, i.e., the light source C, on the view plane.
[0059] I(x, y) = R(x, y) · S(x, y)   (4) is set,
[0060] the logarithm is taken at both sides to obtain ln P(x, y) = ln I(x, y) + ln L(x, y)   (5),
[0061] and the environment light L(x, y) may be expressed through the Gaussian kernel convolution of P(x, y) with a Gaussian function G(x, y):
L(x, y) = P(x, y) * G(x, y)   (6)
[0062] wherein G(x, y) = λ·e^(−(x² + y²)/C²),
[0063] C represents a Gaussian surrounding scale, and λ is a scale that enables ∫∫ G(x, y) dx dy = 1 to be always true. Through the formulas (4), (5) and (6), it can be obtained that:
ln R(x, y) = ln P(x, y) − ln(P(x, y) * G(x, y)) − ln S(x, y),
[0064] wherein S′(x, y) = e^(ln R(x, y)) is set,
[0065] and S′(x, y) is the image after the strong light source influence elimination.
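The elimination procedure of formulas (3) to (6) can be sketched end to end in numpy. P is the glare-affected image and S the image formed by the camera's own light alone; the wrap-around convolution, the kernel radius and the default scale C are implementation assumptions, not taken from the patent:

```python
import numpy as np

def conv2_wrap(img, ker):
    """Circular 2-D convolution (wrap-around borders), numpy only."""
    out = np.zeros_like(img, dtype=float)
    kh, kw = ker.shape
    for dy in range(kh):
        for dx in range(kw):
            out += ker[dy, dx] * np.roll(img, (dy - kh // 2, dx - kw // 2),
                                         axis=(0, 1))
    return out

def remove_strong_light(P, S, C=4.0, eps=1e-6):
    """Formulas (3)-(6): with P = R*S*L, estimate the ambient image as
    L ~= P * G (Gaussian surround of scale C, normalised to sum to 1),
    then ln R = ln P - ln(P*G) - ln S, and return S' = exp(ln R)."""
    r = int(2 * C)
    y, x = np.mgrid[-r:r + 1, -r:r + 1]
    G = np.exp(-(x**2 + y**2) / C**2)   # G(x,y) = lambda*exp(-(x^2+y^2)/C^2)
    G /= G.sum()                        # lambda chosen so the kernel sums to 1
    L = conv2_wrap(P, G)                # formula (6): L = P * G
    lnR = np.log(P + eps) - np.log(L + eps) - np.log(S + eps)
    return np.exp(lnR)                  # S' = exp(ln R)
```

On a constant image the surround estimate L equals P exactly, so S′ = P/(L·S) collapses to 1/S, which gives a quick sanity check of the pipeline.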
[0066] The present invention utilizes the ray reverse tracing technology. Under the condition of greatly reducing the calculation amount of the ray tracing, the dazzle light phenomenon of the strong light source on the low-illuminance underground video image is effectively reduced, so that the effect of restoring the video image is achieved.

Claims (4)

What is claimed is:
1. A method for restoring an underground image on the basis of a ray reverse tracing technology, the method comprising the following steps:
step 1: providing an underground camera as a light source emitting point, i.e., a view point, and emitting rays from the emitting point into an underground scene;
step 2: recording all intersection points of all rays and underground objects, and calculating one intersection point closest to the view point in the intersection points;
step 3: according to illumination, object materials and a normal direction, calculating light intensity of reflection rays or refraction rays at the closest intersection point determined in the step 2;
step 4: calculating a direction of rays that are reflected and refracted by the objects in a position of the intersection point;
step 5: tracing the reflected and refracted rays in step 4, and determining whether the traced reflected or refracted rays are emitted onto a view plane right in front of a safety mine lamp or not; if so, calculating a reflection light intensity and/or a refraction light intensity; and otherwise, returning to step 2 to redetermine the closest intersection point, and repeating steps 3 to 5;
step 6: converting the reflection and/or refraction light intensity in step 5 into a pixel value through a camera CCD photosensitive element, emitting the reflected and/or refracted rays from the camera onto the view plane, and performing imaging on the view plane; and
step 7: eliminating the pixel value of light emitted from the camera in an image finally shown on the view plane to obtain an image after light source influence elimination.
2. The method for restoring the underground image on the basis of the ray reverse tracing technology according to claim 1, wherein in step 3, the light intensity of the reflected rays or refracted rays in the closest intersection point determined in the step 2 is calculated according to the following method:
calculating the light intensity of the reflected rays in the position of the intersection point through a formula (1):
I_r = I_aK_a + I_i(N · L)dω(K_dR_d + K_sR_s)   (1)
wherein I_r represents the light intensity of the reflected rays; I_aK_a represents an influence value of environment light in the position of the intersection point; I_i represents the light intensity of incident light; K_d represents a specular reflectivity coefficient; K_s represents a diffuse reflectivity coefficient; R_d represents specular reflectivity; R_s represents diffuse reflectivity; and N, L and dω respectively represent an object surface normal vector, a ray direction unit vector and a solid angle; or calculating the light intensity of the refracted rays in the position of the intersection point through a formula (2):
I_t = (cos θ_2 / cos θ_1)(I_i − I_r)   (2)
wherein I_t represents the light intensity of the refracted rays, and θ_1 and θ_2 are an incidence angle and a refraction angle.
3. The method for restoring the underground image on the basis of the ray reverse tracing technology according to claim 1 or 2, wherein in step 5, the reflected and refracted rays in step 4 are traced according to the following methods:
(1) if the rays do not intersect with any object, then tracing is stopped; if the intersection point is on a nontransparent object, only calculating the light intensity of the reflected rays; if the intersection point is on a transparent object, calculating the light intensity of the reflected rays and the light intensity of the refracted rays, and tracing the rays obtained by reflecting or refracting the rays emitted from the camera for three times; if the rays obtained by reflecting or refracting the rays emitted from the camera for three times are emitted onto the view plane right in front of the safety mine lamp, calculating the light intensity of the rays; and if not, then tracing is stopped, and entering the step (2); and
(2) if all reflected and refracted rays generated by the rays emitted from the camera are not emitted onto the view plane right in front of the safety mine lamp, determining an intersection point second closest to the view point in the intersection points of the rays emitted from the camera and the objects; repeating the step (1); if the second closest intersection point does not meet conditions, sequentially calculating the next closest intersection point until the intersection point found meets the conditions.
4. The method for restoring the underground image on the basis of the ray reverse tracing technology according to claim 1 or 2, wherein in step 7, the pixel value of the light emitted from the camera is eliminated in the image finally shown on the view plane to obtain the image after the light source influence elimination according to the following method: when the reflected rays and/or refracted rays are irradiated onto the view plane, the image on the view plane is shown as the following formula:
P(x, y) = R(x, y) · S(x, y) · L(x, y)   (3)
wherein P(x, y) represents the image finally shown on the view plane; R(x, y) represents an image shown on the view plane when the camera does not emit light; S(x, y) represents an image on the view plane when only the camera emits light; and L(x, y) represents an image of environment light on the view plane:
I(x, y) = R(x, y) · S(x, y)   (4) is set,
the logarithm is taken at both sides to obtain ln P(x, y) = ln I(x, y) + ln L(x, y)   (5),
and the environment light L(x, y) is expressed through the Gaussian kernel convolution of P(x, y) with a Gaussian function G(x, y):
L(x, y) = P(x, y) * G(x, y)   (6)
wherein G(x, y) = λ·e^(−(x² + y²)/C²),
C represents a Gaussian surrounding scale, and λ is a scale that enables ∫∫ G(x, y) dx dy = 1 to be always true; through the formulas (4), (5) and (6), it can be obtained that:
ln R(x, y) = ln P(x, y) − ln(P(x, y) * G(x, y)) − ln S(x, y),
wherein S′(x, y) = e^(ln R(x, y)) is set,
and S′(x, y) is the image after the light source influence elimination.
AU2019395238A 2019-01-04 2019-06-18 Method for restoring underground image on basis of ray reverse tracing technology Active AU2019395238B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN201910006766.3A CN109862209B (en) 2019-01-04 2019-01-04 Method for restoring underground image based on light ray inverse tracking technology
CN2019100067663 2019-01-04
PCT/CN2019/091631 WO2020140397A1 (en) 2019-01-04 2019-06-18 Method for restoring downhole image based on reverse ray tracing technology

Publications (2)

Publication Number Publication Date
AU2019395238A1 AU2019395238A1 (en) 2020-07-23
AU2019395238B2 true AU2019395238B2 (en) 2021-11-25

Family

ID=66893940

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2019395238A Active AU2019395238B2 (en) 2019-01-04 2019-06-18 Method for restoring underground image on basis of ray reverse tracing technology

Country Status (5)

Country Link
CN (1) CN109862209B (en)
AU (1) AU2019395238B2 (en)
CA (1) CA3079552C (en)
RU (1) RU2742814C9 (en)
WO (1) WO2020140397A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109862209B (en) * 2019-01-04 2021-02-26 中国矿业大学 Method for restoring underground image based on light ray inverse tracking technology
CN114286375B (en) * 2021-12-16 2023-08-18 北京邮电大学 Mobile communication network interference positioning method
CN116051450B (en) * 2022-08-15 2023-11-24 荣耀终端有限公司 Glare information acquisition method, device, chip, electronic equipment and medium
CN116681814B (en) * 2022-09-19 2024-05-24 荣耀终端有限公司 Image rendering method and electronic equipment

Citations (3)

Publication number Priority date Publication date Assignee Title
KR101592793B1 (en) * 2014-12-10 2016-02-12 현대자동차주식회사 Apparatus and Method for Correcting Image Distortion
US9558580B2 (en) * 2014-01-10 2017-01-31 Canon Kabushiki Kaisha Image processing apparatus and method therefor
US20170046821A1 (en) * 2014-07-04 2017-02-16 Sony Corporation Image processing device and method

Family Cites Families (16)

Publication number Priority date Publication date Assignee Title
US6005916A (en) * 1992-10-14 1999-12-21 Techniscan, Inc. Apparatus and method for imaging with wavefields using inverse scattering techniques
US7389041B2 (en) * 2005-02-01 2008-06-17 Eastman Kodak Company Determining scene distance in digital camera images
US7764230B2 (en) * 2007-03-13 2010-07-27 Alcatel-Lucent Usa Inc. Methods for locating transmitters using backward ray tracing
EP2245523B1 (en) * 2008-01-11 2016-11-16 O-Net Wave Touch Limited A touch-sensitive device
WO2011066275A2 (en) * 2009-11-25 2011-06-03 Massachusetts Institute Of Technology Actively addressable aperture light field camera
KR101395255B1 (en) * 2010-09-09 2014-05-15 한국전자통신연구원 Apparatus and method for analysing propagation of radio wave in radio wave system
RU125335U1 (en) * 2012-11-07 2013-02-27 Общество с ограниченной ответственностью "Артек Венчурз" DEVICE FOR MONITORING LINEAR SIZES OF THREE-DIMENSIONAL OBJECTS
US9041914B2 (en) * 2013-03-15 2015-05-26 Faro Technologies, Inc. Three-dimensional coordinate scanner and method of operation
KR101716928B1 (en) * 2013-08-22 2017-03-15 주식회사 만도 Image processing method for vehicle camera and image processing apparatus usnig the same
US9311565B2 (en) * 2014-06-16 2016-04-12 Sony Corporation 3D scanning with depth cameras using mesh sculpting
US9977644B2 (en) * 2014-07-29 2018-05-22 The University Of North Carolina At Chapel Hill Methods, systems, and computer readable media for conducting interactive sound propagation and rendering for a plurality of sound sources in a virtual environment scene
KR20160071774A (en) * 2014-12-12 2016-06-22 삼성전자주식회사 Apparatus, Method and recording medium for processing image
US10521952B2 (en) * 2016-04-12 2019-12-31 Quidient, Llc Quotidian scene reconstruction engine
CN106231286B (en) * 2016-07-11 2018-03-20 北京邮电大学 A kind of three-dimensional image generating method and device
CN109118531A (en) * 2018-07-26 2019-01-01 深圳大学 Three-dimensional rebuilding method, device, computer equipment and the storage medium of transparent substance
CN109862209B (en) * 2019-01-04 2021-02-26 中国矿业大学 Method for restoring underground image based on light ray inverse tracking technology

Also Published As

Publication number Publication date
CA3079552C (en) 2021-03-16
CA3079552A1 (en) 2020-07-04
CN109862209B (en) 2021-02-26
RU2742814C1 (en) 2021-02-11
CN109862209A (en) 2019-06-07
WO2020140397A1 (en) 2020-07-09
AU2019395238A1 (en) 2020-07-23
RU2742814C9 (en) 2021-04-20

Similar Documents

Publication Publication Date Title
AU2019395238B2 (en) Method for restoring underground image on basis of ray reverse tracing technology
Murez et al. Photometric stereo in a scattering medium
US8831334B2 (en) Segmentation for wafer inspection
CN100527165C (en) Real time object identification method taking dynamic projection as background
KR20170059469A (en) Method, visualization device, and computer program product for visualizing a three-dimensional object
CN104101611A (en) Mirror-like object surface optical imaging device and imaging method thereof
CN102519391B (en) Object surface three-dimensional image reconstruction method on basis of weak saturated two-dimensional images
WO2020019352A1 (en) Method, device and system for inspecting camera module, and machine readable storage medium
CN111833258B (en) Image color correction method based on double-transmissivity underwater imaging model
CN105096370B (en) The equivalent partition reverse sawtooth method of ray tracing
Danciu et al. Shadow removal in depth images morphology-based for kinect cameras
CN110134987B (en) Optical spherical defect detection illumination design method based on ray tracing
US20180235465A1 (en) Eye gaze tracking
Huang et al. Enhancing object detection in the dark using U-Net based restoration module
KR102291162B1 (en) Apparatus and method for generating virtual data for artificial intelligence learning
Dai et al. Nighttime smartphone reflective flare removal using optical center symmetry prior
CN104240286A (en) Real-time reflection method based on screen space
US8733951B2 (en) Projected image enhancement
CN209729457U (en) The device that movable air applied to stage is imaged
CN109612408B (en) Method and device for testing emission angle of semiconductor laser and readable storage medium
CN103982857B (en) Optical lens, image pickup device and optical touch system
US11410378B1 (en) Image processing for generating three-dimensional shape and spatially-varying reflectance of the object using a deep neural network
CN117255964A (en) External illumination with reduced detectability
CN211857172U (en) Long-focus ambient light resistant cloth curtain
CN103676150A (en) Method for eliminating center crossed lobster-eye image points

Legal Events

Date Code Title Description
FGA Letters patent sealed or granted (standard patent)