CN117994452A - Multi-camera fused 3D scene and radiation distribution drawing device and method - Google Patents

Multi-camera fused 3D scene and radiation distribution drawing device and method

Info

Publication number
CN117994452A
Authority
CN
China
Prior art keywords
radiation
scene
camera
depth
point cloud
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311805536.6A
Other languages
Chinese (zh)
Inventor
刘立业
李会
李华
樊清
林海鹏
陈法国
王崇扬
高怀众
夏三强
李德源
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Institute for Radiation Protection
Original Assignee
China Institute for Radiation Protection
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Institute for Radiation Protection filed Critical China Institute for Radiation Protection
Priority to CN202311805536.6A priority Critical patent/CN117994452A/en
Publication of CN117994452A publication Critical patent/CN117994452A/en
Pending legal-status Critical Current


Landscapes

  • Measurement Of Radiation (AREA)

Abstract

The invention discloses a multi-camera fused 3D scene and radiation distribution drawing device and method, relating to the technical field of nuclear radiation detection. The device comprises an integrated probe, a computer data processing center and a display interaction panel. The integrated probe integrates a plurality of depth cameras and radiation detectors and is used for providing RGB images, depth images and radiation information; the computer data processing center is connected with the integrated probe through a cable and provides power supply and data processing; the display interaction panel is connected with the computer data processing center and is used for analyzing measurement data, displaying analysis results and user interaction. The device utilizes a multi-camera fusion scene construction and positioning technology, combining a plurality of depth cameras with a radiation detector to form a combined probe, and can stably and efficiently construct a three-dimensional terrain point cloud of a nuclear facility site in real time, locate the gamma radiation dose rate measurement positions, reconstruct the gamma radiation field with a radiation distribution reconstruction algorithm, and locate the radiation hot spot positions.

Description

Multi-camera fused 3D scene and radiation distribution drawing device and method
Technical Field
The invention belongs to the technical field of nuclear radiation detection, and particularly relates to a multi-camera fused 3D scene and radiation distribution drawing device and method.
Background
Workers at nuclear facility sites urgently need a collaborative measurement device that can efficiently measure the on-site radiation distribution and help guide on-site operation schemes, emergency treatment schemes and digital interconnection. Existing radiation dose field distribution measurements rely on multiple instruments such as a dose rate meter, a three-dimensional scanner and a total station, require several people to work together, and depend on labor- and management-intensive data collection; a single measurement of the radiation distribution of a three-dimensional scene, together with data processing, takes several hours. The long turnaround time gives the radiation dose field poor timeliness, so when the on-site radiation field changes dynamically, the results cannot effectively guide work such as on-site radiation protection.
Taking robot environment perception and positioning technology as inspiration, radiation detection technology is combined with simultaneous localization and mapping (SLAM) to measure the three-dimensional scene, radiation dose rate and other quantities in real time and to determine the space-time position of the measurement track; combined with a three-dimensional spatial interpolation method, the three-dimensional radiation distribution can then be captured rapidly, efficiently and accurately.
Simultaneous localization and mapping (SLAM) techniques are generally based on scene sensors and inertial measurement units, and can be classified into laser SLAM, visual SLAM, laser-vision coupled SLAM, etc., depending on the scene sensor used. Laser SLAM offers high ranging accuracy, but high-line-count laser radars are expensive; visual SLAM based on depth cameras is low cost, but a single camera has a small field of view.
Disclosure of Invention
Aiming at the defects in the prior art, the invention aims to provide a multi-camera fused 3D scene and radiation distribution drawing device and method. With the device and method, the three-dimensional terrain point cloud of a nuclear facility site can be constructed stably and efficiently in real time, the gamma radiation dose rate measurement positions can be located, the gamma radiation field can be reconstructed with a radiation distribution reconstruction algorithm, and the radiation hot spot positions can then be located; the device can further serve as a digital interconnection interactive terminal, providing key information support for decision making when formulating on-site operation schemes and emergency treatment schemes.
In order to achieve the above purpose, the technical scheme adopted by the invention is as follows:
a multi-camera fused 3D scene and radiation distribution mapping apparatus comprising:
the integrated probe integrates a plurality of depth cameras and radiation detectors and is used for providing RGB images, depth images and radiation information;
the computer data processing center is connected with the integrated probe through a cable and is mainly used for providing power supply and data processing;
and the display interaction panel is connected with the computer data processing center and is used for analyzing the measurement data, displaying the analysis result and performing interaction operation of a user.
Further, the multi-camera fused 3D scene and radiation distribution mapping device is characterized in that the radiation detector is positioned at the center of the integrated probe, and a plurality of depth cameras are uniformly arranged along the axial direction of the radiation detector.
Further, in the multi-camera fused 3D scene and radiation distribution mapping device as described above, the integrated probe comprises four depth cameras.
Further, in the multi-camera fused 3D scene and radiation distribution mapping device as described above, the radiation detector is a gamma dose rate meter.
Further, in the multi-camera fused 3D scene and radiation distribution mapping device as described above, the display interaction panel is wirelessly connected to the computer data processing center through WIFI or Bluetooth.
A method of multi-camera fused 3D scene and radiation distribution mapping based on the apparatus described above, comprising the steps of:
S1, converting an RGB image and a depth image provided by each depth camera into a color 3D point cloud;
S2, aligning and splicing the 3D point cloud acquired by each depth camera and the dose rate information acquired by the radiation detector by using a time tag and space coordinate transformation to form a large-view-field 3D scene point cloud;
S3, carrying out feature extraction, pose estimation and pose optimization on the 3D scene point cloud so as to construct an accurate 3D point cloud map and equipment measurement tracks and poses;
S4, performing time alignment on the radiation measurement data and the track measurement data according to the space-time relationship between the radiation detector and the depth camera so as to obtain the track dose rate, and then obtaining the distribution of the radiation dose rate field in the scene by using a spatial interpolation algorithm;
S5, combining the basic assumption of the source item to obtain the distribution of the radiation hot spots in the 3D scene.
Further, in the multi-camera fused 3D scene and radiation distribution mapping method as described above, the color 3D point cloud conversion formula in step S1 is:
Wherein, (c_x, c_y) is the optical center in pixel coordinates; (i, j) are the pixel coordinates; (f_x, f_y) are the focal length f scaled along the u axis and v axis, i.e., (αf, βf); (x, y, z) are the coordinates in the world coordinate system; and d(i, j) is the depth value corresponding to pixel (i, j).
Further, in the multi-camera fused 3D scene and radiation distribution mapping method as described above, in step S2, when the integrated probe includes four depth cameras, a calculation formula for stitching the 3D point clouds collected by the four depth cameras is:
A = A_1 + A_2·R_90 + A_3·R_180 + A_4·R_270
Wherein, A is the stitched point cloud; A_1, A_2, A_3 and A_4 are the point clouds acquired by the four depth cameras, respectively, with the subscript denoting the depth camera number; R_90, R_180 and R_270 are the rotation matrices of the corresponding depth cameras relative to depth camera No. 1, with the subscript denoting the rotation angle.
Further, in the multi-camera fused 3D scene and radiation distribution mapping method as described above, step S3 specifically includes:
S31, extracting features from the 3D scene point cloud obtained in step S2 using the ORB algorithm;
S32, estimating the pose using the random sample consensus (RANSAC) algorithm and correcting it using the iterative closest point (ICP) algorithm;
S33, optimizing the pose graph using a local closed-loop detection algorithm and a global pose graph optimization algorithm, thereby constructing an accurate 3D point cloud map, and obtaining the device measurement track and pose from the key-frame 3D point clouds of the stitched map.
Further, the method for performing time alignment on the radiation measurement data and the track measurement data in step S4 is as follows: the radiation measurements are matched to the midpoint of the path integration over the measurement time.
Compared with the prior art, the multi-camera fused 3D scene and radiation distribution drawing device and method provided by the invention have the following beneficial effects:
1. The invention utilizes a multi-camera fusion scene construction and positioning technology, combining a plurality of depth cameras with a radiation detector to form a combined scene and radiation information acquisition probe; the data processing center, the probe and the display terminal are arranged separately to form a satchel-type structure;
2. The invention fuses a plurality of depth cameras with the radiation detector, which avoids the problem that a single depth camera is unstable and easily loses tracking during scene construction and measurement, and helps improve the stability of scene construction and positioning.
Drawings
Fig. 1 is a schematic structural diagram of a multi-camera fused 3D scene and radiation distribution mapping device according to an embodiment of the present invention;
FIG. 2 is a schematic view of the probe in the apparatus of FIG. 1;
FIG. 3 is a flowchart of a method for generating a multi-camera fused 3D scene and radiation distribution map according to an embodiment of the present invention;
FIG. 4 is another expression flow chart of the method of FIG. 3;
FIG. 5 is a schematic diagram of a method of aligning radiation measurement data with trajectory measurement data;
in the figure: 101-integrated probe, 102-computer processing center, 103-display interaction tablet, 201-hand-held rack, 202-depth camera, 203-radiation detector.
Detailed Description
The invention is described in further detail below with reference to the drawings and the detailed description.
In an embodiment of the present invention, a multi-camera fused 3D scene and radiation distribution mapping apparatus is provided. Fig. 1 shows a schematic structural diagram of the apparatus, which includes an integrated probe 101, a computer data processing center 102 and a display interaction tablet 103. The integrated probe 101 integrates a plurality of depth cameras and radiation detectors and is used for providing RGB images, depth images and radiation information such as the gamma dose rate; the computer data processing center 102 is connected with the integrated probe 101 through a cable and mainly provides power and data processing resources; the display interaction tablet 103 is wirelessly connected with the computer data processing center 102 through WIFI or Bluetooth and is used for displaying measurement and analysis results and for user interaction.
FIG. 2 shows a schematic structural view of one embodiment of the integrated probe 101 in the multi-camera fused radiation mapping terminal device. The integrated probe 101 comprises a handheld frame 201, a radiation detector 203 and four depth cameras 202. The radiation detector 203 is a gamma dose rate meter located at the center of the integrated probe 101 for collecting radiation information; the four depth cameras 202 are uniformly arranged along the gamma dose rate meter axis to provide RGB and depth maps.
Based on the above device, an embodiment of the invention provides a multi-camera fused 3D scene and radiation distribution mapping method. Using the relative spatial positions of the plurality of depth cameras 202 and the gamma dose rate meter, the scene point clouds acquired by the depth cameras are stitched and fused with the data from the radiation detector, i.e. the gamma dose rate meter, so as to reconstruct the three-dimensional scene point cloud, locate the measurement track, and reconstruct the three-dimensional radiation distribution from the dose rate information. Fig. 3 and 4 show flow charts of the method, which comprises the following steps:
S1, converting an RGB image and a depth image provided by each depth camera into a color 3D point cloud, wherein a conversion formula is as follows:
Wherein, (c_x, c_y) is the optical center in pixel coordinates; (i, j) are the pixel coordinates; (f_x, f_y) are the focal length f scaled along the u axis and v axis, i.e., (αf, βf); (x, y, z) are the coordinates in the world coordinate system; and d(i, j) is the depth value corresponding to pixel (i, j).
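The conversion formula itself appears only as an image in the original filing; for reference, the standard pinhole back-projection consistent with the symbols defined above is x = (i − c_x)·d(i, j)/f_x, y = (j − c_y)·d(i, j)/f_y, z = d(i, j), with the RGB value of the pixel carried along as the point color. The sketch below is a minimal illustrative implementation of this standard model; the function name, the NumPy-based layout, and the assignment of i to the image columns (u axis) are assumptions for illustration, not the patented code.

```python
# Sketch only: standard pinhole back-projection from an RGB image and a depth map to a
# colored 3D point cloud, consistent with the symbols defined above. Names and the
# NumPy-based implementation are illustrative assumptions, not the patented code.
import numpy as np

def rgbd_to_colored_point_cloud(rgb, depth, fx, fy, cx, cy):
    """rgb: (H, W, 3) array; depth: (H, W) array of d(i, j).
    Returns an (N, 6) array of [x, y, z, r, g, b] for pixels with valid depth."""
    h, w = depth.shape
    # Pixel grids: j indexes rows (v axis), i indexes columns (u axis) - an assumption here.
    j, i = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    d = depth.astype(np.float64)
    valid = d > 0                      # keep only pixels with a measured depth
    x = (i - cx) * d / fx              # back-project along the u axis
    y = (j - cy) * d / fy              # back-project along the v axis
    z = d                              # depth gives the z coordinate directly
    points = np.stack([x[valid], y[valid], z[valid]], axis=1)
    colors = rgb[valid].astype(np.float64)
    return np.hstack([points, colors])
```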
S2, aligning and stitching the 3D point clouds acquired by each depth camera and the dose rate information acquired by the gamma dose rate meter using time tags and spatial coordinate transformation to form a large-field-of-view 3D scene point cloud.
Because a single depth camera has a limited field of view, the 3D point cloud it acquires covers only a small area. When the probe moves quickly or rotates through a large angle during measurement, the two point cloud frames before and after the motion cannot be matched correctly to estimate the pose transformation of the device, so the local map of the SLAM-reconstructed scene is lost or map-frame artifacts appear, and such invalid information harms map construction. The invention therefore uses a plurality of depth cameras, aligns the 3D point clouds acquired by each camera with time tags, and stitches them with spatial coordinate transformation, thereby acquiring a large-field-of-view 3D scene point cloud. The scene point cloud is information-rich, so a large amount of feature information can be extracted as the device moves, which helps estimate the device's position and attitude. Similarly, the dose rate information collected by the gamma dose rate meter is aligned with time tags and registered using the spatial transformation.
When the 3D point clouds acquired by the four depth cameras are spliced, the following formula is adopted for calculation:
A = A_1 + A_2·R_90 + A_3·R_180 + A_4·R_270
Wherein, A is the stitched point cloud; A_1, A_2, A_3 and A_4 are the point clouds acquired by the four depth cameras, respectively, with the subscript denoting the camera number; R_90, R_180 and R_270 are the rotation matrices of the corresponding depth cameras relative to depth camera No. 1, with the subscript denoting the rotation angle.
The dose rate measurement data are then bound to point cloud A; the dose rate measurement point is located a distance L above the coordinate origin of point cloud A along the Z axis.
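As a concrete illustration of the stitching formula above, the sketch below rotates the clouds of cameras 2-4 into the frame of camera No. 1 by their known 90°, 180° and 270° mounting rotations and concatenates the results. The rotation axis (here the probe's Z axis) and the function names are assumptions made for illustration only.

```python
# Sketch of A = A1 + A2*R90 + A3*R180 + A4*R270: each camera's cloud is rotated into
# camera 1's frame by its known mounting rotation and the clouds are concatenated.
# Assumes rotation about the probe (Z) axis; names are illustrative.
import numpy as np

def rotation_about_z(angle_deg):
    a = np.deg2rad(angle_deg)
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0,        0.0,       1.0]])

def stitch_clouds(clouds_xyz):
    """clouds_xyz: list of four (N_k, 3) arrays from cameras 1..4 (time-aligned)."""
    angles = [0.0, 90.0, 180.0, 270.0]           # camera k is mounted k*90 deg from camera 1
    parts = [pts @ rotation_about_z(ang).T       # rotate camera k points into camera 1 frame
             for pts, ang in zip(clouds_xyz, angles)]
    return np.vstack(parts)                      # concatenated large-field-of-view cloud
```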
And S3, carrying out feature extraction, pose estimation and pose optimization on the 3D scene point cloud, so as to construct an accurate 3D point cloud map, and equipment measurement tracks and poses.
After the large-field-of-view 3D scene point cloud is obtained, features are first extracted using the ORB (Oriented FAST and Rotated BRIEF) algorithm;
then the pose is estimated using the random sample consensus (RANSAC) algorithm and corrected using the iterative closest point (ICP) algorithm;
to reduce inaccurate scene map construction caused by accumulated errors, the pose graph is optimized using a local closed-loop detection algorithm and a global pose graph optimization algorithm, thereby constructing an accurate 3D point cloud map; the device measurement track and pose are obtained from the key-frame 3D point clouds of the stitched map.
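A minimal sketch of the pose-refinement and trajectory-accumulation part of this step is given below, assuming the Open3D library is used for ICP; the coarse RANSAC estimate, closed-loop detection and pose-graph optimization are not reproduced, and all names are illustrative rather than the patented implementation.

```python
# Sketch only: ICP refinement of a coarse pose between consecutive key-frame clouds and
# chaining of the relative poses into a device trajectory. Assumes Open3D is available;
# RANSAC initialisation and pose-graph optimisation are omitted.
import numpy as np
import open3d as o3d

def refine_pose_icp(source, target, init_transform, max_corr_dist=0.05):
    """source/target: o3d.geometry.PointCloud; init_transform: 4x4 coarse estimate."""
    result = o3d.pipelines.registration.registration_icp(
        source, target, max_corr_dist, init_transform,
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation                 # refined 4x4 relative pose

def accumulate_trajectory(relative_poses):
    """Chain frame-to-frame poses into world poses of the device (measurement track)."""
    pose = np.eye(4)
    trajectory = [pose.copy()]
    for T in relative_poses:
        pose = pose @ T
        trajectory.append(pose.copy())
    return trajectory
```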
S4, obtaining track dose rate according to the space-time relationship between the depth camera and the dose rate meter, and obtaining the distribution of the radiation dose rate field in the scene by using a spatial interpolation algorithm.
The track dose rate refers to the per-frame dose rate measurement together with its position in the scene. After the track dose rate is obtained, the distribution of the radiation dose rate field in the 3D scene can be obtained using a spatial interpolation algorithm.
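The patent does not fix a particular interpolation scheme. As one plausible choice, the sketch below uses inverse-distance weighting (IDW) to spread the track dose rates over the scene point cloud; the choice of IDW, the exponent p and the function names are assumptions for illustration.

```python
# Sketch only: inverse-distance-weighted (IDW) interpolation of track dose rates onto the
# 3D scene points. IDW and its exponent p are illustrative assumptions; the patent only
# states that a spatial interpolation algorithm is used.
import numpy as np

def idw_dose_field(scene_points, track_points, track_dose_rates, p=2.0, eps=1e-6):
    """scene_points: (M, 3); track_points: (K, 3); track_dose_rates: (K,).
    Returns an (M,) array of interpolated dose rates at the scene points."""
    diff = scene_points[:, None, :] - track_points[None, :, :]   # (M, K, 3)
    dist = np.linalg.norm(diff, axis=2) + eps                    # avoid division by zero
    weights = 1.0 / dist**p
    return (weights @ track_dose_rates) / weights.sum(axis=1)
```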
S5, combining the basic assumption of the source item to obtain the distribution of the radiation hot spots in the 3D scene.
The multi-camera fused 3D scene and radiation distribution mapping method above assumes that dose rate and point cloud information are acquired for every data frame. In practice, however, the dose rate acquisition frequency is far lower than the depth camera image acquisition frequency, so a dedicated space-time alignment method is required for the radiation measurement data and the track measurement data.
Fig. 5 shows a schematic diagram of the method for time alignment of radiation measurement data and trajectory measurement data provided by the present invention. Since the radiation dose rate acquisition frequency (about 2 Hz) is much lower than the camera image acquisition frequency (about 20-25 Hz), a balance must be found between position uncertainty and count uncertainty when assigning positions to the nuclear measurement data. The position uncertainty depends on the measurement time, while the count uncertainty is inversely proportional to the measurement time. As shown in fig. 5, each dose rate measurement is matched to the midpoint of the path integration over the measurement time.
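A sketch of the midpoint matching just described: for each dose-rate sample, the position is taken at the middle of its integration window by linear interpolation along the trajectory timestamps. The data layout and names below are assumptions for illustration.

```python
# Sketch only: match each dose-rate reading to the trajectory position at the midpoint of
# its integration window, using linear interpolation between trajectory samples.
# Data layout and names are illustrative assumptions.
import numpy as np

def align_dose_to_track(traj_times, traj_positions, dose_windows, dose_rates):
    """traj_times: (T,) ascending; traj_positions: (T, 3);
    dose_windows: (K, 2) start/end time of each integration; dose_rates: (K,)."""
    mid_times = dose_windows.mean(axis=1)                         # midpoint of each window
    # Linearly interpolate each coordinate of the trajectory at the midpoint times.
    xyz = np.stack([np.interp(mid_times, traj_times, traj_positions[:, k])
                    for k in range(3)], axis=1)
    return xyz, dose_rates                                        # position per dose sample
```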
The time measurement error in parallel calculations is on the order of a few milliseconds, while the minimum integration time of a dose rate measurement is on the order of 500 milliseconds, so the error caused by assigning radiation measurements to specific time ranges is considered negligible; the main measurement positioning uncertainty arises from the motion of the instrument during the measurement integration and from the linear pose interpolation method.
The multi-camera fused 3D scene and radiation distribution drawing device and method utilize a multi-camera fusion scene construction and positioning technology, combining a plurality of depth cameras with a radiation detector to form a combined scene and radiation information acquisition probe, while the data processing center, the probe and the display terminal are arranged separately to form a satchel-type structure. At the same time, fusing the plurality of depth cameras with the radiation detector avoids the problems that a single depth camera has a small field of view and is unstable and prone to losing tracking during scene construction and measurement, thereby improving the stability of scene construction and positioning.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims (10)

1. A multi-camera fused 3D scene and radiation distribution mapping apparatus, the apparatus comprising:
An integrated probe (101), the integrated probe (101) integrating a plurality of depth cameras and radiation detectors for providing RGB map, depth map and radiation information;
the computer data processing center (102) is connected with the integrated probe (101) through a cable and is mainly used for providing power supply and data processing;
and a display interaction panel (103), the display interaction panel (103) being connected to the computer data processing center (102) and used for analyzing the measurement data, displaying the analysis results, and for user interaction.
2. The multi-camera fused 3D scene and radiation distribution mapping device of claim 1, wherein the radiation detector (203) is located in the center of the integrated probe (101), and a plurality of the depth cameras (202) are uniformly arranged along the axis of the radiation detector (203).
3. The multi-camera fused 3D scene and radiation distribution mapping device of claim 2 wherein said integrated probe (101) comprises four of said depth cameras (202).
4. A multi-camera fused 3D scene and radiation distribution mapping apparatus as claimed in claim 3, characterized in that the radiation detector (203) is a gamma dose rate meter.
5. The multi-camera fused 3D scene and radiation distribution mapping device of any of claims 1-4 wherein said display interaction tablet (103) is wirelessly connected to said computer data processing center (102) by WIFI or bluetooth.
6. A method of multi-camera fused 3D scene and radiation distribution mapping based on the apparatus of any of claims 1-5, comprising the steps of:
S1, converting an RGB image and a depth image provided by each depth camera into a color 3D point cloud;
S2, aligning and splicing the 3D point cloud acquired by each depth camera and the dose rate information acquired by the radiation detector by using a time tag and space coordinate transformation to form a large-view-field 3D scene point cloud;
S3, carrying out feature extraction, pose estimation and pose optimization on the 3D scene point cloud so as to construct an accurate 3D point cloud map and equipment measurement tracks and poses;
S4, performing time alignment on the radiation measurement data and the track measurement data according to the space-time relationship between the radiation detector and the depth camera so as to obtain the track dose rate, and then obtaining the distribution of the radiation dose rate field in the scene by using a spatial interpolation algorithm;
S5, combining the basic assumption of the source item to obtain the distribution of the radiation hot spots in the 3D scene.
7. The multi-camera fused 3D scene and radiation distribution mapping method of claim 6, wherein the color 3D point cloud conversion formula in step S1 is:
Wherein, (c_x, c_y) is the optical center in pixel coordinates; (i, j) are the pixel coordinates; (f_x, f_y) are the focal length f scaled along the u axis and v axis, i.e., (αf, βf); (x, y, z) are the coordinates in the world coordinate system; and d(i, j) is the depth value corresponding to pixel (i, j).
8. The multi-camera fused 3D scene and radiation distribution mapping method according to claim 7, wherein, in step S2, when the integrated probe includes four depth cameras, a calculation formula for stitching the 3D point clouds acquired by the four depth cameras is:
A = A_1 + A_2·R_90 + A_3·R_180 + A_4·R_270
Wherein, A is the stitched point cloud; A_1, A_2, A_3 and A_4 are the point clouds acquired by the four depth cameras, respectively, with the subscript denoting the depth camera number; R_90, R_180 and R_270 are the rotation matrices of the corresponding depth cameras relative to depth camera No. 1, with the subscript denoting the rotation angle.
9. The multi-camera fused 3D scene and radiation distribution mapping method of claim 8, wherein step S3 is specifically:
S31, extracting features from the 3D scene point cloud obtained in step S2 using the ORB algorithm;
S32, estimating the pose using the random sample consensus (RANSAC) algorithm and correcting it using the iterative closest point (ICP) algorithm;
S33, optimizing the pose graph using a local closed-loop detection algorithm and a global pose graph optimization algorithm, thereby constructing an accurate 3D point cloud map, and obtaining the device measurement track and pose from the key-frame 3D point clouds of the stitched map.
10. The multi-camera fused 3D scene and radiation distribution mapping method according to claim 8 or 9, wherein the method of time-aligning the radiation measurement data with the trajectory measurement data in step S4 is: the radiation measurements are matched to the midpoint of the path integration over the measurement time.
CN202311805536.6A 2023-12-26 2023-12-26 Multi-camera fused 3D scene and radiation distribution drawing device and method Pending CN117994452A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311805536.6A CN117994452A (en) 2023-12-26 2023-12-26 Multi-camera fused 3D scene and radiation distribution drawing device and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311805536.6A CN117994452A (en) 2023-12-26 2023-12-26 Multi-camera fused 3D scene and radiation distribution drawing device and method

Publications (1)

Publication Number Publication Date
CN117994452A true CN117994452A (en) 2024-05-07

Family

ID=90901813

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311805536.6A Pending CN117994452A (en) 2023-12-26 2023-12-26 Multi-camera fused 3D scene and radiation distribution drawing device and method

Country Status (1)

Country Link
CN (1) CN117994452A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination