CN115379182A - Bidirectional structure optical coding and decoding method and device, electronic equipment and storage medium - Google Patents
- Publication number
- CN115379182A (application CN202210997597.6A)
- Authority
- CN
- China
- Prior art keywords
- phase
- line
- transverse
- image acquisition
- longitudinal
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- H04N9/3129 — Projection devices for colour picture display using electronic spatial light modulators [ESLM], scanning a light beam on the display screen
- H04N13/133 — Equalising the characteristics of different image components, e.g. their average brightness or colour balance
- H04N13/161 — Encoding, multiplexing or demultiplexing different image signal components
- H04N13/204 — Image signal generators using stereoscopic image cameras
Abstract
The application provides a bidirectional structured light encoding and decoding method and apparatus, an electronic device, and a storage medium. The method comprises the following steps: firstly, longitudinally scanning a target through multi-frequency phase-shift sinusoidal structured light and transversely scanning the target through sinusoidal structured light; capturing images of the scanning results through an image acquisition device to obtain captured images; then performing an unwrapping operation on the wrapped phases of the second pixel points in the captured images to obtain unwrapped phase information; and finally constructing a three-dimensional point cloud reconstruction model based on the phase information and the epipolar lines determined by the projection device and the image acquisition device, and reconstructing the three-dimensional coordinates of the target based on the three-dimensional point cloud reconstruction model. For bidirectional scanning, the method and apparatus can effectively reduce the number of projection coding patterns required in one direction, and can improve the calculation efficiency of the three-dimensional point cloud reconstruction process.
Description
Technical Field
The present application relates to the field of optical measurement technologies, and in particular, to a bidirectional structured light encoding and decoding method and apparatus, an electronic device, and a storage medium.
Background
Structured light three-dimensional measurement is an active optical three-dimensional measurement mode, and with the progress of science and technology, the acquisition and processing of three-dimensional information are important research problems in the field of three-dimensional measurement. The structured light three-dimensional measurement technology is widely applied to various fields such as automatic manufacturing, bioengineering and the like due to the advantages of non-contact, high precision and high speed. A structured light three-dimensional measurement system typically consists of a projector, a camera, and a computer. The projector projects the coded pattern to the measured object, the camera captures the pattern, and the computer completes the final three-dimensional reconstruction.
At present, in the process of adopting bidirectional structured light for three-dimensional measurement, a large number of patterns is required for bidirectional scanning, and the amount of calculation in the three-dimensional point cloud reconstruction process is large; the prior art lacks a scheme for solving these problems.
Disclosure of Invention
In view of this, embodiments of the present application provide a bidirectional structured light encoding and decoding method and apparatus, an electronic device, and a storage medium, which can effectively reduce the number of projection coding patterns required in one direction for bidirectional scanning, and can improve the calculation efficiency in the process of three-dimensional point cloud reconstruction.
The technical scheme of the embodiment of the application is realized as follows:
In a first aspect, an embodiment of the present application provides a bidirectional structured light encoding and decoding method, including the following steps:
performing longitudinal scanning and transverse scanning on a target through projection equipment to obtain a scanning result, wherein the scanning result comprises a light intensity value of at least one first pixel point, the longitudinal scanning is performed through multi-frequency phase-shift structured light, and the transverse scanning is performed through sinusoidal structured light;
capturing an image of the scanning result through an image acquisition device to obtain a captured image, wherein the captured image comprises a wrapped phase of at least one second pixel point, the wrapped phase comprises a transverse wrapped phase and a longitudinal wrapped phase, the image acquisition device and the projection device are not at the same position, an intersection point exists between a second line of sight of the image acquisition device and a first line of sight of the projection device, and a projection of the second line of sight of the image acquisition device on the projection device plane forms an epipolar line;
performing an unwrapping operation on the longitudinal wrapped phase of the at least one second pixel point to obtain a longitudinal phase, calculating a transverse temporary phase according to the longitudinal phase, performing an unwrapping operation on the transverse wrapped phase based on the transverse temporary phase to obtain a transverse phase, and taking the longitudinal phase and the transverse phase as phase information corresponding to the at least one second pixel point;
and constructing a three-dimensional point cloud reconstruction model based on the epipolar line and the phase information, and reconstructing the three-dimensional coordinates of the target based on the three-dimensional point cloud reconstruction model, wherein the three-dimensional point cloud reconstruction model is used for searching a first closest point, closest to the first pixel point, on the epipolar line in the projection device plane, and for searching a second closest point, closest to the second pixel point, on the epipolar line in the image acquisition device plane.
In a possible embodiment, the performing longitudinal scanning and transverse scanning on the target through the projection device to obtain the scanning result includes:
longitudinally scanning through the multi-frequency phase-shift sinusoidal structured light and transversely scanning through the sinusoidal structured light to obtain a light intensity value of the at least one first pixel point, and encoding the light intensity value to obtain

$$I^p_{x,n}(x_p,y_p)=\alpha_p\left[\beta_p+\cos\!\left(\frac{2\pi f_x x_p}{W_p}+\frac{2\pi n}{N}\right)\right],\qquad I^p_{y,n}(x_p,y_p)=\alpha_p\left[\beta_p+\cos\!\left(\frac{2\pi f_y y_p}{H_p}+\frac{2\pi n}{N}\right)\right]$$

wherein $x_p$ is the abscissa of the first pixel point in the projection device plane, $y_p$ is the ordinate of the first pixel point in the projection device plane, $I^p_{x,n}$ is the light intensity value of the at least one first pixel point under the transverse scanning, $I^p_{y,n}$ is the light intensity value of the at least one first pixel point under the longitudinal scanning, $\beta_p$ is a balance constant ensuring a non-negative light intensity value, $f_x$ is the frequency of the projected pattern under the transverse scanning, $f_y$ is the frequency of the projected pattern under the longitudinal scanning, $f_y\in\{f_1,f_2,f_3,\ldots,f_h\}$, $f_h$ is the highest frequency of the projected pattern, $f_x=f_h$, $\alpha_p$ is a modulation constant controlling the range of the projected pattern, $n$ is the phase shift index, $N$ is the total number of phase shift steps, $W_p$ is the transverse resolution of the projection device, and $H_p$ is the longitudinal resolution of the projection device.
In a possible implementation manner, the image capturing the scanning result through the image acquisition device to obtain a captured image includes:
image capture is carried out on the scanning result through image acquisition equipment, and the captured result is modeled to obtain
Wherein x is c Is the abscissa, y, of the second pixel point in the plane of the image acquisition device c Is the ordinate of the second pixel point under the plane of the image acquisition equipment,the average light intensity of the abscissa of the at least one second pixel point collected by the image collecting device is obtained,the average light intensity of the vertical coordinate of the at least one second pixel point acquired by the image acquisition equipment is obtained;for brightness modulation of the image acquisition device in the transverse direction,modulating the brightness of the image acquisition equipment in the longitudinal direction;in order to wind the phase in the transverse direction,in order to wind the phase in the longitudinal direction,andis shown as
In a possible embodiment, the performing an unwrapping operation on the longitudinal wrapped phase of the at least one second pixel point to obtain a longitudinal phase, calculating a transverse temporary phase from the longitudinal phase, performing an unwrapping operation on the transverse wrapped phase based on the transverse temporary phase to obtain a transverse phase, and using the longitudinal phase and the transverse phase as phase information corresponding to the at least one second pixel point, includes:
solving the longitudinal wrapped phase $\varphi_y$ through a time domain unwrapping model to obtain the longitudinal phase $\Phi_y$;

determining the start point $(x^p_s,y^p_s)$ and the end point $(x^p_e,y^p_e)$ of the epipolar line, the equation of the epipolar line being expressed as

$$\frac{x_p-x^p_s}{x^p_e-x^p_s}=\frac{y_p-y^p_s}{y^p_e-y^p_s}$$

mapping the unwrapped longitudinal phase $\Phi_y$ into the space of the projection device, and obtaining the transverse temporary phase $\Phi'_x$ through the epipolar line equation

$$y_p=\frac{\Phi_y H_p}{2\pi f_h},\qquad \Phi'_x=\frac{2\pi f_x}{W_p}\left[x^p_s+\frac{(y_p-y^p_s)(x^p_e-x^p_s)}{y^p_e-y^p_s}\right]$$

and unwrapping the transverse wrapped phase $\varphi_x$ through the time domain unwrapping model based on the transverse temporary phase $\Phi'_x$.
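A minimal sketch of one time-domain unwrapping step, assuming the common fringe-order-rounding formulation; the patent does not reproduce its unwrapping equations here, so the names and the exact formulation are assumptions.

```python
import numpy as np

def temporal_unwrap(phi_wrapped, phi_coarse, f_fine, f_coarse):
    """Time-domain (multi-frequency) unwrapping: the absolute phase known at
    a coarser frequency predicts the fringe order k of the wrapped
    fine-frequency phase, which is then added back as a multiple of 2*pi."""
    phi_ref = phi_coarse * (f_fine / f_coarse)       # predicted absolute phase
    k = np.round((phi_ref - phi_wrapped) / (2 * np.pi))
    return phi_wrapped + 2 * np.pi * k               # unwrapped phase
```

Applied pairwise up the frequency ladder $f_1 \to f_2 \to \ldots \to f_h$ this yields the longitudinal phase, and one further application with the transverse temporary phase as the reference unwraps the transverse wrapped phase.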
In one possible embodiment, the constructing a three-dimensional point cloud reconstruction model based on the epipolar line comprises:
acquiring a first intersection point after making a perpendicular from the first pixel point to the epipolar line, and acquiring a first direction vector of the first line of sight of the projection device based on the first intersection point; acquiring a second intersection point after making a perpendicular from the second pixel point to the epipolar line, and acquiring a second direction vector of the second line of sight of the image acquisition device based on the second intersection point;

obtaining a first scale coefficient of the first direction vector and a second scale coefficient of the second direction vector based on a third intersection point of the first direction vector and the second direction vector;

and constructing the three-dimensional point cloud reconstruction model based on the second scale coefficient.
In a possible implementation manner, the obtaining a first intersection point after the first pixel point makes a perpendicular to the epipolar line, and obtaining a first direction vector of a first line of sight of the projection device based on the first intersection point, includes:

making a perpendicular from the first pixel point $(x_p,y_p)$ to the epipolar line, intersecting the epipolar line at a first intersection point $(\hat{x}_p,\hat{y}_p)$, wherein $(\hat{x}_p,\hat{y}_p)$ is the first closest point, expressed as

$$(\hat{x}_p,\hat{y}_p)=(x^p_s,y^p_s)+t_e\,(x^p_e-x^p_s,\;y^p_e-y^p_s)$$

wherein $(x^p_s,y^p_s)$ and $(x^p_e,y^p_e)$ are the start point and the end point of the epipolar line, and

$$t_e=\frac{(x_p-x^p_s)(x^p_e-x^p_s)+(y_p-y^p_s)(y^p_e-y^p_s)}{(x^p_e-x^p_s)^2+(y^p_e-y^p_s)^2}$$

based on the first intersection point $(\hat{x}_p,\hat{y}_p)$, obtaining a first direction vector $\mathbf{d}_p$ of the first line of sight $O_pP$ of the projection device

$$\mathbf{d}_p=(\mathbf{m}^p_1-\hat{x}_p\,\mathbf{m}^p_3)\times(\mathbf{m}^p_2-\hat{y}_p\,\mathbf{m}^p_3)$$

wherein $\mathbf{m}^p_i$ denotes the vector formed by the first three entries of the $i$-th row of the calibration matrix $M^p$ of the projection device,

$$M^p=\begin{bmatrix}m^p_{11}&m^p_{12}&m^p_{13}&m^p_{14}\\m^p_{21}&m^p_{22}&m^p_{23}&m^p_{24}\\m^p_{31}&m^p_{32}&m^p_{33}&m^p_{34}\end{bmatrix}$$
the obtaining a second intersection point after the second pixel point makes a perpendicular to the epipolar line, and obtaining a second direction vector of a second line of sight of the image acquisition device based on the second intersection point, includes:

making a perpendicular from the second pixel point $(x_c,y_c)$ to the epipolar line in the image acquisition device plane, intersecting the epipolar line at a second intersection point $(\hat{x}_c,\hat{y}_c)$, wherein $(\hat{x}_c,\hat{y}_c)$ is the second closest point, expressed as

$$(\hat{x}_c,\hat{y}_c)=(x^c_s,y^c_s)+t'_e\,(x^c_e-x^c_s,\;y^c_e-y^c_s)$$

wherein $(x^c_s,y^c_s)$ and $(x^c_e,y^c_e)$ are the start point and the end point of the epipolar line in the image acquisition device plane, and

$$t'_e=\frac{(x_c-x^c_s)(x^c_e-x^c_s)+(y_c-y^c_s)(y^c_e-y^c_s)}{(x^c_e-x^c_s)^2+(y^c_e-y^c_s)^2}$$

based on the second intersection point $(\hat{x}_c,\hat{y}_c)$, obtaining a second direction vector $\mathbf{d}_c$ of the second line of sight $O_cP$ of the image acquisition device

$$\mathbf{d}_c=(\mathbf{m}^c_1-\hat{x}_c\,\mathbf{m}^c_3)\times(\mathbf{m}^c_2-\hat{y}_c\,\mathbf{m}^c_3)$$

wherein $\mathbf{m}^c_i$ denotes the vector formed by the first three entries of the $i$-th row of the calibration matrix $M^c$ of the image acquisition device,

$$M^c=\begin{bmatrix}m^c_{11}&m^c_{12}&m^c_{13}&m^c_{14}\\m^c_{21}&m^c_{22}&m^c_{23}&m^c_{24}\\m^c_{31}&m^c_{32}&m^c_{33}&m^c_{34}\end{bmatrix}$$
the obtaining a first scale coefficient of the first direction vector and a second scale coefficient of the second direction vector based on a third intersection point of the first direction vector and the second direction vector comprises:

the first line of sight $O_pP$ is represented by

$$P=O_p+t_1\,\mathbf{d}_p$$

the second line of sight $O_cP$ is represented by

$$P=O_c+t_2\,\mathbf{d}_c$$

wherein $O_c$ is the optical center of the image acquisition device, $O_p$ is the optical center of the projection device, $(X_w,Y_w,Z_w)$ is the three-dimensional coordinates of the target, $t_1$ is the first scale coefficient, and $t_2$ is the second scale coefficient;

the first line of sight $O_pP$ and the second line of sight $O_cP$ intersect at a third intersection point $P$ in the three-dimensional space, so that

$$t_2=\frac{\left[(O_p-O_c)\times\mathbf{d}_p\right]\cdot(\mathbf{d}_c\times\mathbf{d}_p)}{\left\|\mathbf{d}_c\times\mathbf{d}_p\right\|^2}$$

the constructing the three-dimensional point cloud reconstruction model based on the second scale coefficient comprises:

the three-dimensional coordinates of the target are calculated through $t_2$, and the three-dimensional point cloud reconstruction model is

$$(X_w,Y_w,Z_w)^{\mathsf T}=O_c+t_2\,\mathbf{d}_c$$
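The line-of-sight construction above can be sketched as follows; the helper names and the cross-product form of $t_2$ are this sketch's choices (a standard closest-point intersection of two rays, consistent with the described quantities but not notation reproduced from the patent's figures), and the $3\times4$ calibration matrix is assumed to be in the usual projective form.

```python
import numpy as np

def ray_direction(M, u, v):
    """Direction of the line of sight through pixel (u, v) for a device with
    3x4 calibration matrix M: the direction annihilates the two rows
    (m1 - u*m3) and (m2 - v*m3), hence equals their cross product."""
    A = np.asarray(M, float)[:, :3]
    return np.cross(A[0] - u * A[2], A[1] - v * A[2])

def triangulate(o_p, d_p, o_c, d_c):
    """Closest-point intersection of the projector line of sight
    P = o_p + t1*d_p and the camera line of sight P = o_c + t2*d_c; returns
    t2 and the reconstructed 3-D point without any matrix inversion
    (the speed-up the method claims over least-squares reconstruction)."""
    n = np.cross(d_c, d_p)
    t2 = np.dot(np.cross(o_p - o_c, d_p), n) / np.dot(n, n)
    return t2, o_c + t2 * d_c
```

For exactly intersecting rays the returned point lies on both lines; with noisy phases it is the point on the camera ray closest to the projector ray.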
In one possible embodiment, the method further comprises:
calibrating the image acquisition device and the projection device;
after calibration, t is measured 2 The calculation result is recorded, so that when the target is detected by the bidirectional structure optical coding and decoding method, t can be obtained through recording 2 The molecule of (1).
In a second aspect, an embodiment of the present application further provides a bidirectional structured light coding and decoding apparatus, where the apparatus includes:
the scanning module is used for longitudinally scanning and transversely scanning a target through projection equipment to obtain a scanning result, wherein the scanning result comprises a light intensity value of at least one first pixel point, the longitudinal scanning is carried out through multi-frequency phase-shift structured light, and the transverse scanning is carried out through the highest-frequency sine structured light;
a capturing module, configured to perform image capturing on the scanning result through an image acquisition device to obtain a captured image, wherein the captured image comprises a wrapped phase of the at least one second pixel point, the wrapped phase comprises a transverse wrapped phase and a longitudinal wrapped phase, the image acquisition device and the projection device are not at the same position, an intersection point exists between a second line of sight of the image acquisition device and a first line of sight of the projection device, and a projection of the second line of sight of the image acquisition device on the projection device plane forms an epipolar line;
a calculation module, configured to perform an unwrapping operation on the longitudinal wrapping phase of the at least one second pixel to obtain a longitudinal phase, calculate a transverse temporary phase according to the longitudinal phase, perform an unwrapping operation on the transverse wrapping phase based on the transverse temporary phase to obtain a transverse phase, and use the longitudinal phase and the transverse phase as phase information corresponding to the at least one second pixel;
and the reconstruction module is used for constructing a three-dimensional point cloud reconstruction model based on the epipolar line and reconstructing the three-dimensional coordinates of the target based on the three-dimensional point cloud reconstruction model, wherein the three-dimensional point cloud reconstruction model is used for searching the closest point, to the pixel point, on the epipolar line in the projection device plane.
In a third aspect, an embodiment of the present application further provides an electronic device, including: the optical encoder comprises a processor, a storage medium and a bus, wherein the storage medium stores machine-readable instructions executable by the processor, when an electronic device runs, the processor and the storage medium communicate through the bus, and the processor executes the machine-readable instructions to execute the steps of the bidirectional structured light encoding and decoding method according to any one of the first aspect.
In a fourth aspect, an embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and the computer program is executed by a processor to perform the steps of the bi-directional structured light coding and decoding method according to any one of the first aspect.
The embodiment of the application has the following beneficial effects:
the epipolar geometry relation of a measuring system consisting of a projection device, an image acquisition device and a computer is utilized, so that the number of projection coding patterns required in one direction is effectively reduced for bidirectional scanning; and the intersection relation of the sight lines of the image acquisition equipment and the projection equipment in the three-dimensional space is utilized to replace the matrix inversion process after the traditional bidirectional scanning and unidirectional scanning, so that the calculation speed of the three-dimensional point cloud reconstruction is increased.
Drawings
To more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and those skilled in the art can also obtain other related drawings based on the drawings without inventive efforts.
FIG. 1 is a schematic flowchart of steps S101-S104 provided in an embodiment of the present application;
FIG. 2 is an epipolar geometry diagram of a structured light imaging system provided by an embodiment of the present application;
FIG. 3 is a schematic diagram of an approximation process of a bidirectional scanning least squares method provided by an embodiment of the present application;
fig. 4 is a schematic structural diagram of a bidirectional structured light encoding and decoding apparatus according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the purpose, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it should be understood that the drawings in the present application are for illustrative and descriptive purposes only and are not used to limit the scope of protection of the present application. Additionally, it should be understood that the schematic drawings are not necessarily drawn to scale. The flowcharts used in this application illustrate operations implemented according to some embodiments of the present application. It should be understood that the operations of the flow diagrams may be performed out of order, and that steps without logical context may be reversed in order or performed concurrently. One skilled in the art, under the guidance of this application, may add one or more other operations to, or remove one or more operations from, the flowchart.
In the following description, reference is made to "some embodiments" which describe a subset of all possible embodiments, but it is understood that "some embodiments" may be the same subset or different subsets of all possible embodiments, and may be combined with each other without conflict.
In addition, the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, as presented in the figures, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
In the following description, the terms "first/second/third" are only used to distinguish similar objects and do not denote a particular order; it should be understood that "first/second/third" may be interchanged in a specific order or sequence where permitted, so that the embodiments of the application described herein can be implemented in an order other than that shown or described herein.
It should be noted that in the embodiments of the present application, the term "comprising" is used to indicate the presence of the features stated hereinafter, but does not exclude the addition of further features.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the application and is not intended to be limiting of the application.
Referring to fig. 1, fig. 1 is a schematic flowchart of steps S101 to S104 of a bidirectional structured light coding and decoding method provided in an embodiment of the present application, and will be described with reference to steps S101 to S104 shown in fig. 1.
Step S101, performing longitudinal scanning and transverse scanning on a target through projection equipment to obtain a scanning result, wherein the scanning result comprises a light intensity value of at least one first pixel point, the longitudinal scanning is performed through multi-frequency phase-shift structured light, and the transverse scanning is performed through sinusoidal structured light;
step S102, image capturing is carried out on the scanning result through an image acquisition device to obtain a captured image, wherein the captured image comprises a winding phase of the at least one second pixel point, the winding phase comprises a transverse winding phase and a longitudinal winding phase, the image acquisition device and the projection device are not in the same position, an intersection point exists between a second sight line of the image acquisition device and a first sight line of the projection device, and a projection of the second sight line of the image acquisition device on the plane of the projection device can form an epipolar line;
step S103, performing unwrapping operation on the longitudinal wrapping phase of the at least one second pixel point to obtain a longitudinal phase, calculating a transverse temporary phase according to the longitudinal phase, performing unwrapping operation on the transverse wrapping phase based on the transverse temporary phase to obtain a transverse phase, and taking the longitudinal phase and the transverse phase as phase information corresponding to the at least one second pixel point;
step S104, a three-dimensional point cloud reconstruction model is built based on the polar line and the phase information, and a three-dimensional coordinate of the target is reconstructed based on the three-dimensional point cloud reconstruction model, wherein the three-dimensional point cloud reconstruction model is used for searching a first closest point on the polar line, which is closest to the first pixel point, in the projection equipment plane and searching a second closest point on the polar line, which is closest to the second pixel point, in the image acquisition equipment plane.
According to the bidirectional structured light encoding and decoding method, the epipolar geometry relation of the measuring system consisting of the projection device, the image acquisition device and the computer is utilized, so that for bidirectional scanning the number of projection coding patterns required in one direction is effectively reduced; and the intersection relation of the lines of sight of the image acquisition device and the projection device in three-dimensional space is utilized to replace the matrix inversion process required after conventional bidirectional and unidirectional scanning, so that the calculation speed of three-dimensional point cloud reconstruction is increased.
The above exemplary steps of the embodiments of the present application will be described below.
In step S101, a projection device performs longitudinal scanning and transverse scanning on a target to obtain a scanning result, where the scanning result includes a light intensity value of at least one first pixel, the longitudinal scanning is performed by multi-frequency phase-shift structured light, and the transverse scanning is performed by sinusoidal structured light.
In some embodiments, the performing longitudinal scanning and transverse scanning on the object through the projection device to obtain the scanning result includes:
longitudinally scanning through the multi-frequency phase-shift sinusoidal structured light and transversely scanning through the sinusoidal structured light to obtain a light intensity value of the at least one first pixel point, and encoding the light intensity value to obtain

$$I^p_{x,n}(x_p,y_p)=\alpha_p\left[\beta_p+\cos\!\left(\frac{2\pi f_x x_p}{W_p}+\frac{2\pi n}{N}\right)\right],\qquad I^p_{y,n}(x_p,y_p)=\alpha_p\left[\beta_p+\cos\!\left(\frac{2\pi f_y y_p}{H_p}+\frac{2\pi n}{N}\right)\right]$$

wherein $x_p$ is the abscissa of the first pixel point in the projection device plane, $y_p$ is the ordinate of the first pixel point in the projection device plane, $I^p_{x,n}$ is the light intensity value of the at least one first pixel point under the transverse scanning, $I^p_{y,n}$ is the light intensity value of the at least one first pixel point under the longitudinal scanning, $\beta_p$ is a balance constant ensuring a non-negative light intensity value, $f_x$ is the frequency of the projected pattern under the transverse scanning, $f_y$ is the frequency of the projected pattern under the longitudinal scanning, $f_y\in\{f_1,f_2,f_3,\ldots,f_h\}$, $f_h$ is the highest frequency of the projected pattern, $f_x=f_h$, $\alpha_p$ is a modulation constant controlling the range of the projected pattern, $n$ is the phase shift index, $N$ is the total number of phase shift steps, $W_p$ is the transverse resolution of the projection device, and $H_p$ is the longitudinal resolution of the projection device.
In step S102, an image capturing device captures an image of the scanning result to obtain a captured image, where the captured image includes a wrapped phase of the at least one second pixel point, the wrapped phase includes a transverse wrapped phase and a longitudinal wrapped phase, the image capturing device and the projection device are not at the same position, there is an intersection point between a second line of sight of the image capturing device and a first line of sight of the projection device, and a projection of the second line of sight of the image capturing device on the projection device plane forms an epipolar line.
In some embodiments, the capturing an image of the scanning result by the image acquisition device to obtain a captured image includes:
image capture is carried out on the scanning result through the image acquisition device, and the captured result is modeled to obtain

I_{x,n}^c(x_c, y_c) = A_x^c(x_c, y_c) + B_x^c(x_c, y_c) cos(φ_x^c(x_c, y_c) - 2πn/N)

I_{y,n}^c(x_c, y_c) = A_y^c(x_c, y_c) + B_y^c(x_c, y_c) cos(φ_y^c(x_c, y_c) - 2πn/N)

wherein x_c is the abscissa and y_c is the ordinate of the second pixel point in the image acquisition device plane; A_x^c is the average light intensity in the transverse direction and A_y^c the average light intensity in the longitudinal direction of the at least one second pixel point collected by the image acquisition device; B_x^c is the brightness modulation of the image acquisition device in the transverse direction and B_y^c the brightness modulation in the longitudinal direction; φ_x^c is the transverse wrapped phase and φ_y^c the longitudinal wrapped phase, and φ_x^c and φ_y^c are expressed as

φ_x^c = arctan[ Σ_{n=0}^{N-1} I_{x,n}^c sin(2πn/N) / Σ_{n=0}^{N-1} I_{x,n}^c cos(2πn/N) ]

φ_y^c = arctan[ Σ_{n=0}^{N-1} I_{y,n}^c sin(2πn/N) / Σ_{n=0}^{N-1} I_{y,n}^c cos(2πn/N) ]
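As a concrete illustration, the standard N-step arctangent estimator that recovers such a wrapped phase from the N captured images can be sketched as follows; this is the textbook least-squares form, shown only to make the model above concrete:

```python
import numpy as np

def wrapped_phase(images):
    """Wrapped phase from N images I_n = A + B*cos(phi - 2*pi*n/N),
    via phi = arctan2(sum I_n*sin(2*pi*n/N), sum I_n*cos(2*pi*n/N))."""
    n_steps = len(images)
    num = sum(img * np.sin(2 * np.pi * n / n_steps) for n, img in enumerate(images))
    den = sum(img * np.cos(2 * np.pi * n / n_steps) for n, img in enumerate(images))
    return np.arctan2(num, den)   # wrapped into (-pi, pi]
```

The estimator cancels the per-pixel average intensity A and modulation B, so only the phase survives.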
In step S103, performing an unwrapping operation on the longitudinal wrapping phase of the at least one second pixel to obtain a longitudinal phase, calculating a transverse temporary phase according to the longitudinal phase, performing an unwrapping operation on the transverse wrapping phase based on the transverse temporary phase to obtain a transverse phase, and taking the longitudinal phase and the transverse phase as phase information corresponding to the at least one second pixel.
In some embodiments, referring to fig. 2, fig. 2 is an epipolar geometry diagram of a structured light imaging system provided by embodiments of the present application. As shown in fig. 2, a ray O_p P is emitted from the optical center O_p of the projector, is reflected by the object onto the line of sight O_c P of the camera, and then reaches the optical center O_c of the camera. The projection of the camera line of sight O_c P onto the projector plane forms a corresponding epipolar line l.
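The patent later characterizes the epipolar line by its starting and end points; an equivalent and common way to obtain such a line is from the camera-to-projector fundamental matrix F, as sketched below (the use of F here is an assumption for illustration, not stated in the text):

```python
import numpy as np

def epipolar_line(F, cam_pixel):
    """Epipolar line l = F @ [x_c, y_c, 1] in the projector plane,
    returned as (a, b, c) with a*x_p + b*y_p + c = 0 and (a, b) unit-norm."""
    x_c, y_c = cam_pixel
    l = F @ np.array([x_c, y_c, 1.0])
    return l / np.hypot(l[0], l[1])   # normalise so point-line products are distances
```

Normalising (a, b) to unit norm makes a*x + b*y + c directly the signed distance of (x, y) from the line, which is convenient for the perpendicular-foot step used later.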
In some embodiments, the performing an unwrapping operation on the longitudinally wrapped phase of the at least one second pixel to obtain a longitudinal phase, calculating a transverse temporary phase from the longitudinal phase, performing an unwrapping operation on the transverse wrapped phase based on the transverse temporary phase to obtain a transverse phase, and using the longitudinal phase and the transverse phase as the phase information corresponding to the at least one second pixel includes:
solving the longitudinal wrapped phase through a time domain unwrapping model to obtain the longitudinal phase Φ_y^c;

determining the starting point (x_s^p, y_s^p) and the end point (x_e^p, y_e^p) of the epipolar line l, the equation of the epipolar line being expressed as

(x_p - x_s^p)/(x_e^p - x_s^p) = (y_p - y_s^p)/(y_e^p - y_s^p);

mapping the unwrapped longitudinal phase Φ_y^c into the space of the projection device through y_p = H_p Φ_y^c/(2π f_h), substituting y_p into the epipolar line equation to obtain x_p, and obtaining the transverse temporary phase Φ_x^temp = 2π f_h x_p / W_p;

unwrapping the transverse wrapped phase φ_x^c through the time domain unwrapping model based on the transverse temporary phase Φ_x^temp, to obtain the transverse phase Φ_x^c.
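A sketch of this transverse-unwrapping step, under two assumptions made explicit here: the linear phase-to-coordinate mapping y_p = Φ_y H_p/(2π f_h), and an epipolar line given in a*x + b*y + c = 0 form; all names are illustrative:

```python
import numpy as np

def unwrap_transverse(phi_x_wrapped, unwrapped_y, line_abc, f_h, width, height):
    """Unwrap the transverse wrapped phase using a temporary phase
    derived from the unwrapped longitudinal phase and the epipolar line."""
    a, b, c = line_abc
    y_p = unwrapped_y * height / (2 * np.pi * f_h)            # longitudinal projector coordinate
    x_p = -(b * y_p + c) / a                                  # epipolar line solved for x_p
    phi_x_temp = 2 * np.pi * f_h * x_p / width                # transverse temporary phase
    k = np.round((phi_x_temp - phi_x_wrapped) / (2 * np.pi))  # integer fringe order
    return phi_x_wrapped + 2 * np.pi * k
```

Because the fringe order k is rounded to an integer, the temporary phase only needs to be correct to within half a fringe for the unwrapped result to be exact.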
In step S104, a three-dimensional point cloud reconstruction model is constructed based on the epipolar line and the phase information, and a three-dimensional coordinate of the target is reconstructed based on the three-dimensional point cloud reconstruction model, where the three-dimensional point cloud reconstruction model is configured to find a first closest point on the epipolar line, which is closest to the first pixel point, in the projection device plane, and to find a second closest point on the epipolar line, which is closest to the second pixel point, in the image acquisition device plane.
In some embodiments, the three-dimensional point cloud reconstruction is essentially the intersection of the projector and camera lines of sight in three-dimensional space. In a structured light three-dimensional imaging system, the projection of the camera line of sight onto the projector plane forms a corresponding epipolar line, so the three-dimensional reconstruction process can be simplified as follows: in the projector plane, the intersection point of the camera line of sight with the projector plane lies on the corresponding epipolar line. Based on this relation, the three-dimensional reconstruction after unidirectional and bidirectional structured light scanning can be optimized respectively.
It should be noted that the three-dimensional coordinates of the target may be reconstructed either by finding the first closest point on the epipolar line closest to the first pixel point in the plane of the projection device, or by finding the second closest point on the epipolar line closest to the second pixel point in the plane of the image acquisition device; either one of the two may be selected.
In some embodiments, said constructing a three-dimensional point cloud reconstruction model based on said epipolar line comprises:
acquiring a first intersection point of the first pixel point after making a perpendicular line to the polar line, acquiring a first direction vector of a first sight line of the projection equipment based on the first intersection point, acquiring a second intersection point of the second pixel point after making a perpendicular line to the polar line, and acquiring a second direction vector of a second sight line of the image acquisition equipment based on the second intersection point;
respectively obtaining a first proportional coefficient of the first direction vector and a second proportional coefficient of the second direction vector based on a third intersection point of the first direction vector and the second direction vector;
and constructing the three-dimensional point cloud reconstruction model based on the second scale coefficient.
In some embodiments, the bidirectional scanning uses a least squares method to calculate the point cloud, which overcomes the problem that the camera line of sight and the projector line of sight cannot intersect at one point due to lens distortion and noise interference, and improves the imaging robustness. The least squares process finds, along the camera line of sight in the three-dimensional world coordinate system, the point closest to the projector pixel; this can be approximated by finding, on the epipolar line in the projector plane, the point closest to the first pixel point (x_p, y_p), see fig. 3, where fig. 3 is a schematic diagram of the least squares approximation for bidirectional scanning provided in an embodiment of the present application. The obtaining a first intersection point after the first pixel point makes a perpendicular to the epipolar line, and obtaining a first direction vector of the first line of sight of the projection device based on the first intersection point, includes:
for the first pixel point (x_p, y_p), making a perpendicular to the epipolar line and intersecting the epipolar line at a first intersection point (x̂_p, ŷ_p), wherein (x̂_p, ŷ_p) is the first closest point, expressed as

x̂_p = x_p - a(a x_p + b y_p + c)/(a² + b²), ŷ_p = y_p - b(a x_p + b y_p + c)/(a² + b²)

wherein a = y_e^p - y_s^p, b = x_s^p - x_e^p and c = x_e^p y_s^p - x_s^p y_e^p are the coefficients of the epipolar line equation a x_p + b y_p + c = 0;

based on the first intersection point (x̂_p, ŷ_p), obtaining the first direction vector (a_p, b_p, c_p) of the first line of sight O_p P of the projection device as

[a_p, b_p, c_p]^T = (M_p^{3×3})^{-1} [x̂_p, ŷ_p, 1]^T

wherein M_p^{3×3} is the left 3×3 block of the calibration matrix M_p of the projection device, and M_p is composed of

M_p = [ m_11^p m_12^p m_13^p m_14^p ; m_21^p m_22^p m_23^p m_24^p ; m_31^p m_32^p m_33^p m_34^p ]
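The foot of the perpendicular used above is the ordinary orthogonal projection of a point onto a line; a minimal sketch, with the epipolar line in a*x + b*y + c = 0 form:

```python
def foot_of_perpendicular(point, line_abc):
    """Closest point on the line a*x + b*y + c = 0 to (x0, y0):
    the orthogonal projection used to snap a pixel onto the epipolar line."""
    x0, y0 = point
    a, b, c = line_abc
    d = (a * x0 + b * y0 + c) / (a * a + b * b)   # signed offset along the normal
    return (x0 - a * d, y0 - b * d)
```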
obtaining a second intersection point after making a perpendicular from the second pixel point to the corresponding epipolar line in the image acquisition device plane, and obtaining a second direction vector of the second line of sight of the image acquisition device based on the second intersection point, including:

for the second pixel point (x_c, y_c), making a perpendicular to the epipolar line and intersecting the epipolar line at a second intersection point (x̂_c, ŷ_c), wherein (x̂_c, ŷ_c) is the second closest point, obtained as the orthogonal projection of (x_c, y_c) onto the epipolar line in the same manner as the first closest point;

based on the second intersection point (x̂_c, ŷ_c), obtaining the second direction vector (a_c, b_c, c_c) of the second line of sight O_c P of the image acquisition device as

[a_c, b_c, c_c]^T = (M_c^{3×3})^{-1} [x̂_c, ŷ_c, 1]^T

wherein M_c^{3×3} is the left 3×3 block of the calibration matrix M_c of the image acquisition device, and M_c is composed of

M_c = [ m_11^c m_12^c m_13^c m_14^c ; m_21^c m_22^c m_23^c m_24^c ; m_31^c m_32^c m_33^c m_34^c ]
the obtaining a first scaling factor of the first direction vector and a second scaling factor of the second direction vector based on a third intersection point of the first direction vector and the second direction vector comprises:
the first line of sight O_p P is represented as

[X_w, Y_w, Z_w]^T = [X_O^p, Y_O^p, Z_O^p]^T + t_1 [a_p, b_p, c_p]^T

the second line of sight O_c P is represented as

[X_w, Y_w, Z_w]^T = [X_O^c, Y_O^c, Z_O^c]^T + t_2 [a_c, b_c, c_c]^T

wherein (X_O^c, Y_O^c, Z_O^c) is the optical center of the image acquisition device, (X_O^p, Y_O^p, Z_O^p) is the optical center of the projection device, and (X_w, Y_w, Z_w) are the three-dimensional coordinates of the target; t_1 is the first scaling factor and t_2 is the second proportionality coefficient;

the first line of sight O_p P and the second line of sight O_c P intersect at the third intersection point P in three-dimensional space, so that

[X_O^p, Y_O^p, Z_O^p]^T + t_1 [a_p, b_p, c_p]^T = [X_O^c, Y_O^c, Z_O^c]^T + t_2 [a_c, b_c, c_c]^T

from which t_1 and t_2 are obtained in the least squares sense.
the constructing of the three-dimensional point cloud reconstruction model based on the second proportionality coefficient includes the following steps:

calculating the three-dimensional coordinates of the target through t_2, the three-dimensional point cloud reconstruction model being

[X_w, Y_w, Z_w]^T = [X_O^c, Y_O^c, Z_O^c]^T + t_2 [a_c, b_c, c_c]^T

wherein (X_O^c, Y_O^c, Z_O^c) are the coordinate values of the optical center O_c of the image acquisition device.
In some embodiments, the method further comprises:
calibrating the image acquisition device and the projection device;
after calibration, calculating the numerator of t_2 and recording the calculation result, so that when the target is detected online by the bidirectional structured light coding and decoding method, the numerator of t_2 can be obtained from the record.
As an example, in actual use, the system consisting of the projection device, the image acquisition device and the computer only needs to be calibrated once; after calibration, the numerator of t_2 can be calculated in advance in the form of a lookup table, thereby shortening the calculation time.
In the following, an exemplary application of the embodiment of the present application in a practical bidirectional structured light codec application scenario will be described.
The experimental system included a Casio XJ-155V projector with a resolution of 800×600, a Prosilica GC650C camera with a resolution of 640×480, and a desktop computer for scan control and data processing. The camera and projector are synchronized by extracting the VGA signals from the computer's video graphics array port. The proposed algorithm is implemented in C++, and the program runs on a desktop computer with a 3.10 GHz Intel i5-10500 CPU. The experiment contained three parts: (1) verifying the error of the bidirectional structured light coding and decoding method provided by the embodiment of the application against the traditional method; (2) verifying the real-time performance and accuracy of the bidirectional structured light coding and decoding method; and (3) visualizing the reconstruction result.
First, the target and a plaster figure are each scanned bidirectionally to verify the error of the bidirectional structured light coding and decoding method against the traditional method. The parameters of the encoding formula are set as: α_p = 255, β_p = 0, N = 8, f_y = {1, 8, 32}, f_x = 32. In the embodiment of the present application, only the sine wave with the highest frequency of 32 is used to scan the X direction, in order to reduce the number of projected patterns; the only difference from the traditional method, which sets f_x = f_y = {1, 8, 32}, is the transverse phase, so the feasibility of this approach is verified by comparing the transverse phases obtained in the two ways. The larger errors appear at the edges of the scanned object and where the reflectance changes sharply; the root mean square errors of the transverse solved phase of the target and the plaster figure are 6.91×10^-4 rad and 7.02×10^-4 rad respectively.
Then, three-dimensional reconstruction is performed with the resulting unwrapped phases Φ_x^c and Φ_y^c to verify the real-time performance and accuracy of the three-dimensional point cloud reconstruction after bidirectional scanning. As shown in table 1, for bidirectional scanning the proposed algorithm improves the calculation speed by a factor of 6.08 compared with the traditional least squares algorithm. Some error remains, however, in the white wall background and at the edges of the object, which are not within the measurement range of the system.
TABLE 1 Comparison of point cloud reconstruction times for bidirectional scanning
In summary, the embodiment of the application has the following beneficial effects:
by using the epipolar geometric relation of a measurement system consisting of the projection device, the image acquisition device and a computer, the number of projection coding patterns required in one direction is effectively reduced for bidirectional scanning; and the intersection relation of the lines of sight of the image acquisition device and the projection device in three-dimensional space is used to replace the matrix inversion process after traditional bidirectional and unidirectional scanning, thereby increasing the calculation speed of the three-dimensional point cloud reconstruction.
Based on the same inventive concept, the embodiment of the present application further provides a bidirectional structure optical coding and decoding apparatus corresponding to the bidirectional structure optical coding and decoding method in the first embodiment, and since the principle of the apparatus in the embodiment of the present application for solving the problem is similar to the bidirectional structure optical coding and decoding method, the implementation of the apparatus can refer to the implementation of the method, and repeated details are not repeated.
As shown in fig. 4, fig. 4 is a schematic structural diagram of a bidirectional optical codec device 400 provided in this embodiment of the present application. The bidirectional structure optical codec 400 includes:
the scanning module 401 is configured to perform longitudinal scanning and transverse scanning on a target through a projection device to obtain a scanning result, where the scanning result includes a light intensity value of at least one first pixel, the longitudinal scanning is performed through multi-frequency phase-shift structured light, and the transverse scanning is performed through a highest-frequency sinusoidal structured light;
a capturing module 402, configured to perform image capturing on the scanning result through an image acquisition device, so as to obtain a captured image, where the captured image includes a wrapping phase of the at least one second pixel point, the wrapping phase includes a transverse wrapping phase and a longitudinal wrapping phase, the image acquisition device and the projection device are not in the same position, an intersection point exists between a second line of sight of the image acquisition device and a first line of sight of the projection device, and a projection of the second line of sight of the image acquisition device on the plane of the projection device can form an epipolar line;
a calculating module 403, configured to perform an unwrapping operation on the longitudinal wrapping phase of the at least one second pixel to obtain a longitudinal phase, calculate a transverse temporary phase according to the longitudinal phase, perform an unwrapping operation on the transverse wrapping phase based on the transverse temporary phase to obtain a transverse phase, and use the longitudinal phase and the transverse phase as phase information corresponding to the at least one second pixel;
a reconstructing module 404, configured to construct a three-dimensional point cloud reconstruction model based on the epipolar line and the phase information, and reconstruct a three-dimensional coordinate of the target based on the three-dimensional point cloud reconstruction model, where the three-dimensional point cloud reconstruction model is configured to find a first closest point on the epipolar line closest to the first pixel point in the projection device plane, and find a second closest point on the epipolar line closest to the second pixel point in the image acquisition device plane.
It should be understood by those skilled in the art that the functions of the units in the bi-directional structured light codec device 400 shown in fig. 4 can be understood by referring to the related description of the bi-directional structured light codec method. The functions of the units in the bi-directional optical codec 400 shown in fig. 4 can be implemented by a program running on a processor, and can also be implemented by specific logic circuits.
In a possible implementation, the scanning module 401 scans the target longitudinally and transversely by the projection device to obtain scanning results, and includes:
longitudinally scanning through the multi-frequency phase-shift sine structured light and transversely scanning through the sine structured light to obtain a light intensity value of the at least one first pixel point, and encoding the light intensity value to obtain

I_{x,n}^p(x_p, y_p) = β_p + (α_p/2)[1 + cos(2π f_x x_p / W_p - 2πn/N)]

I_{y,n}^p(x_p, y_p) = β_p + (α_p/2)[1 + cos(2π f_y y_p / H_p - 2πn/N)]

wherein x_p is the abscissa and y_p is the ordinate of the first pixel point in the projection device plane; I_{x,n}^p is the light intensity value of the at least one first pixel point under the transverse scan, and I_{y,n}^p is the light intensity value of the at least one first pixel point under the longitudinal scan; β_p is a non-negative balance constant of the light intensity value; f_x is the frequency of the projected pattern under the transverse scan, and f_y is the frequency of the projected pattern under the longitudinal scan, with f_y = {f_1, f_2, f_3, …, f_h}, f_h being the highest frequency of the projected pattern and f_x = f_h; α_p is a modulation constant controlling the range of the projected pattern; n is the phase shift index and N is the total number of phase shift steps; W_p is the transverse resolution and H_p the longitudinal resolution of the projection device.
In a possible implementation, the capturing module 402 performs image capturing on the scanning result through an image capturing device to obtain a captured image, including:
image capture is carried out on the scanning result through the image acquisition device, and the captured result is modeled to obtain

I_{x,n}^c(x_c, y_c) = A_x^c(x_c, y_c) + B_x^c(x_c, y_c) cos(φ_x^c(x_c, y_c) - 2πn/N)

I_{y,n}^c(x_c, y_c) = A_y^c(x_c, y_c) + B_y^c(x_c, y_c) cos(φ_y^c(x_c, y_c) - 2πn/N)

wherein x_c is the abscissa and y_c is the ordinate of the second pixel point in the image acquisition device plane; A_x^c is the average light intensity in the transverse direction and A_y^c the average light intensity in the longitudinal direction of the at least one second pixel point collected by the image acquisition device; B_x^c is the brightness modulation of the image acquisition device in the transverse direction and B_y^c the brightness modulation in the longitudinal direction; φ_x^c is the transverse wrapped phase and φ_y^c the longitudinal wrapped phase, and φ_x^c and φ_y^c are expressed as

φ_x^c = arctan[ Σ_{n=0}^{N-1} I_{x,n}^c sin(2πn/N) / Σ_{n=0}^{N-1} I_{x,n}^c cos(2πn/N) ]

φ_y^c = arctan[ Σ_{n=0}^{N-1} I_{y,n}^c sin(2πn/N) / Σ_{n=0}^{N-1} I_{y,n}^c cos(2πn/N) ]
In a possible implementation manner, the calculating module 403 performs an unwrapping operation on the longitudinal wrapping phase of the at least one second pixel point to obtain a longitudinal phase, calculates a transverse temporary phase from the longitudinal phase, performs an unwrapping operation on the transverse wrapping phase based on the transverse temporary phase to obtain a transverse phase, and uses the longitudinal phase and the transverse phase as the phase information corresponding to the at least one second pixel point, including:
solving the longitudinal wrapped phase through a time domain unwrapping model to obtain the longitudinal phase Φ_y^c;

determining the starting point (x_s^p, y_s^p) and the end point (x_e^p, y_e^p) of the epipolar line l, the equation of the epipolar line being expressed as

(x_p - x_s^p)/(x_e^p - x_s^p) = (y_p - y_s^p)/(y_e^p - y_s^p);

mapping the unwrapped longitudinal phase Φ_y^c into the space of the projection device through y_p = H_p Φ_y^c/(2π f_h), substituting y_p into the epipolar line equation to obtain x_p, and obtaining the transverse temporary phase Φ_x^temp = 2π f_h x_p / W_p;

unwrapping the transverse wrapped phase φ_x^c through the time domain unwrapping model based on the transverse temporary phase Φ_x^temp, to obtain the transverse phase Φ_x^c.
In one possible implementation, the reconstruction module 404 constructs a three-dimensional point cloud reconstruction model based on the epipolar line, including:
acquiring a first intersection point of the first pixel point after making a perpendicular line to the polar line, acquiring a first direction vector of a first sight line of the projection equipment based on the first intersection point, acquiring a second intersection point of the second pixel point after making a perpendicular line to the polar line, and acquiring a second direction vector of a second sight line of the image acquisition equipment based on the second intersection point;
respectively obtaining a first proportional coefficient of the first direction vector and a second proportional coefficient of the second direction vector based on a third intersection point of the first direction vector and the second direction vector;
and constructing the three-dimensional point cloud reconstruction model based on the second scale coefficient.
In a possible implementation manner, the obtaining, by the reconstruction module 404, a first intersection point after the first pixel point makes a perpendicular to the epipolar line, and based on the first intersection point, obtaining a first direction vector of a first line of sight of the projection device includes:
for the first pixel point (x_p, y_p), making a perpendicular to the epipolar line and intersecting the epipolar line at a first intersection point (x̂_p, ŷ_p), wherein (x̂_p, ŷ_p) is the first closest point, expressed as

x̂_p = x_p - a(a x_p + b y_p + c)/(a² + b²), ŷ_p = y_p - b(a x_p + b y_p + c)/(a² + b²)

wherein a = y_e^p - y_s^p, b = x_s^p - x_e^p and c = x_e^p y_s^p - x_s^p y_e^p are the coefficients of the epipolar line equation a x_p + b y_p + c = 0;

based on the first intersection point (x̂_p, ŷ_p), obtaining the first direction vector (a_p, b_p, c_p) of the first line of sight O_p P of the projection device as

[a_p, b_p, c_p]^T = (M_p^{3×3})^{-1} [x̂_p, ŷ_p, 1]^T

wherein M_p^{3×3} is the left 3×3 block of the calibration matrix M_p of the projection device, and M_p is composed of

M_p = [ m_11^p m_12^p m_13^p m_14^p ; m_21^p m_22^p m_23^p m_24^p ; m_31^p m_32^p m_33^p m_34^p ]
obtaining a second intersection point after making a perpendicular from the second pixel point to the corresponding epipolar line in the image acquisition device plane, and obtaining a second direction vector of the second line of sight of the image acquisition device based on the second intersection point, including:

for the second pixel point (x_c, y_c), making a perpendicular to the epipolar line and intersecting the epipolar line at a second intersection point (x̂_c, ŷ_c), wherein (x̂_c, ŷ_c) is the second closest point, obtained as the orthogonal projection of (x_c, y_c) onto the epipolar line in the same manner as the first closest point;

based on the second intersection point (x̂_c, ŷ_c), obtaining the second direction vector (a_c, b_c, c_c) of the second line of sight O_c P of the image acquisition device as

[a_c, b_c, c_c]^T = (M_c^{3×3})^{-1} [x̂_c, ŷ_c, 1]^T

wherein M_c^{3×3} is the left 3×3 block of the calibration matrix M_c of the image acquisition device, and M_c is composed of

M_c = [ m_11^c m_12^c m_13^c m_14^c ; m_21^c m_22^c m_23^c m_24^c ; m_31^c m_32^c m_33^c m_34^c ]
the obtaining a first scaling factor of the first direction vector and a second scaling factor of the second direction vector based on a third intersection point of the first direction vector and the second direction vector respectively includes:
the first line of sight O_p P is represented as

[X_w, Y_w, Z_w]^T = [X_O^p, Y_O^p, Z_O^p]^T + t_1 [a_p, b_p, c_p]^T

the second line of sight O_c P is represented as

[X_w, Y_w, Z_w]^T = [X_O^c, Y_O^c, Z_O^c]^T + t_2 [a_c, b_c, c_c]^T

wherein (X_O^c, Y_O^c, Z_O^c) is the optical center of the image acquisition device, (X_O^p, Y_O^p, Z_O^p) is the optical center of the projection device, and (X_w, Y_w, Z_w) are the three-dimensional coordinates of the target; t_1 is the first scaling factor and t_2 is the second scaling factor;

the first line of sight O_p P and the second line of sight O_c P intersect at the third intersection point P in three-dimensional space, so that

[X_O^p, Y_O^p, Z_O^p]^T + t_1 [a_p, b_p, c_p]^T = [X_O^c, Y_O^c, Z_O^c]^T + t_2 [a_c, b_c, c_c]^T

from which t_1 and t_2 are obtained in the least squares sense.
the constructing of the three-dimensional point cloud reconstruction model based on the second proportionality coefficient includes the following steps:

calculating the three-dimensional coordinates of the target through t_2, the three-dimensional point cloud reconstruction model being

[X_w, Y_w, Z_w]^T = [X_O^c, Y_O^c, Z_O^c]^T + t_2 [a_c, b_c, c_c]^T
In the embodiment of the present application, t_1 may also be used to calculate the three-dimensional coordinates of the target, but the calculation through t_2 is more efficient.
In a possible implementation, the reconstruction module 404 further includes:
calibrating the image acquisition device and the projection device;
after calibration, calculating the numerator of t_2 and recording the calculation result, so that when the target is detected by the bidirectional structured light coding and decoding method, the numerator of t_2 can be obtained from the record.
The bidirectional structure optical coding and decoding device effectively reduces the number of projection coding patterns required in one direction aiming at bidirectional scanning by using polar line geometric relation of a measuring system consisting of projection equipment, image acquisition equipment and a computer; and the intersection relation of the sight lines of the image acquisition equipment and the projection equipment in the three-dimensional space is utilized to replace the matrix inversion process after the traditional bidirectional scanning and unidirectional scanning, so that the calculation speed of the three-dimensional point cloud reconstruction is increased.
As shown in fig. 5, fig. 5 is a schematic view of a composition structure of an electronic device 500 provided in an embodiment of the present application, where the electronic device 500 includes:
a processor 501, a storage medium 502 and a bus 503, wherein the storage medium 502 stores machine-readable instructions executable by the processor 501; when the electronic device 500 runs, the processor 501 communicates with the storage medium 502 through the bus 503, and the processor 501 executes the machine-readable instructions to perform the steps of the bidirectional structured light coding and decoding method according to the embodiment of the present application.
In practice, the various components of the electronic device 500 are coupled together by a bus 503. It is understood that the bus 503 is used to enable connection and communication between these components. In addition to a data bus, the bus 503 includes a power bus, a control bus, and a status signal bus. However, for clarity of illustration, the various buses are labeled as bus 503 in fig. 5.
The electronic equipment effectively reduces the number of projection coding patterns required in one direction aiming at bidirectional scanning by using the polar line geometric relation of a measuring system consisting of projection equipment, image acquisition equipment and a computer; and the intersection relation of the sight lines of the image acquisition equipment and the projection equipment in the three-dimensional space is utilized to replace the matrix inversion process after the traditional bidirectional scanning and unidirectional scanning, so that the calculation speed of the three-dimensional point cloud reconstruction is increased.
The embodiment of the present application further provides a computer-readable storage medium, where the storage medium stores executable instructions, and when the executable instructions are executed by at least one processor 501, the bidirectional structured light coding and decoding method described in the embodiment of the present application is implemented.
In some embodiments, the storage medium may be a memory such as a ferroelectric random access memory (FRAM), a read only memory (ROM), a programmable read only memory (PROM), an erasable programmable read only memory (EPROM), an electrically erasable programmable read only memory (EEPROM), a flash memory, a magnetic surface memory, an optical disc, or a compact disc read-only memory (CD-ROM); or may be various devices including one or any combination of the above memories.
In some embodiments, the executable instructions may be in the form of a program, software module, script, or code written in any form of programming language, including compiled or interpreted languages, or declarative or procedural languages, and it may be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
By way of example, executable instructions may correspond, but do not necessarily have to correspond, to files in a file system, and may be stored in a portion of a file that holds other programs or data, such as in one or more scripts stored in a Hypertext Markup Language (HTML) document, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
As an example, executable instructions may be deployed to be executed on one computing device or on multiple computing devices located at one site or distributed across multiple sites and interconnected by a communication network.
The computer readable storage medium effectively reduces the number of projection coding patterns required for one direction for bi-directional scanning using epipolar geometry of a measurement system consisting of a projection device, an image acquisition device and a computer; and the intersection relation of the sight lines of the image acquisition equipment and the projection equipment in the three-dimensional space is utilized to replace the matrix inversion process after the traditional bidirectional scanning and unidirectional scanning, so that the calculation speed of the three-dimensional point cloud reconstruction is increased.
In the several embodiments provided in the present application, it should be understood that the disclosed method and electronic device may be implemented in other ways. The above-described device embodiments are merely illustrative, for example, the division of the unit is only a logical functional division, and there may be other division ways in actual implementation, such as: multiple units or components may be combined, or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or units may be electrical, mechanical or other forms.
The modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical units, may be located in one position, or may be distributed on multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solutions of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for enabling a computer device (which may be a personal computer, a platform server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a U disk, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
The above description covers only specific embodiments of the present application, but the scope of the present application is not limited thereto; any change or substitution that a person skilled in the art can readily conceive within the technical scope disclosed by the present application shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (10)
1. A bidirectional structured light encoding and decoding method, characterized by comprising the following steps:
performing longitudinal scanning and transverse scanning on a target through projection equipment to obtain a scanning result, wherein the scanning result comprises a light intensity value of at least one first pixel point, the longitudinal scanning is performed through multi-frequency phase-shift structured light, and the transverse scanning is performed through sinusoidal structured light;
capturing an image of the scanning result through an image acquisition device to obtain a captured image, wherein the captured image comprises a wrapped phase of at least one second pixel point, the wrapped phase comprises a transverse wrapped phase and a longitudinal wrapped phase, the image acquisition device and the projection device are not in the same position, an intersection point exists between a second line of sight of the image acquisition device and a first line of sight of the projection device, and a projection of the second line of sight of the image acquisition device on the plane of the projection device forms an epipolar line;
performing an unwrapping operation on the longitudinal wrapped phase of the at least one second pixel point to obtain a longitudinal phase, calculating a transverse temporary phase according to the longitudinal phase, performing an unwrapping operation on the transverse wrapped phase based on the transverse temporary phase to obtain a transverse phase, and taking the longitudinal phase and the transverse phase as phase information corresponding to the at least one second pixel point;
and constructing a three-dimensional point cloud reconstruction model based on the epipolar line and the phase information, and reconstructing the three-dimensional coordinates of the target based on the three-dimensional point cloud reconstruction model, wherein the three-dimensional point cloud reconstruction model is used for searching, in the plane of the projection equipment, a first closest point on the epipolar line that is closest to the first pixel point, and for searching, in the plane of the image acquisition equipment, a second closest point on the epipolar line that is closest to the second pixel point.
2. The method of claim 1, wherein scanning the object longitudinally and transversely by the projection device to obtain scanning results comprises:
performing the longitudinal scanning through the multi-frequency phase-shift sinusoidal structured light and the transverse scanning through the sinusoidal structured light to obtain the light intensity value of the at least one first pixel point, and encoding the light intensity value to obtain

I_n^x(x_p, y_p) = β_p + α_p · cos(2π · f_x · x_p / W_p + 2πn/N)
I_n^y(x_p, y_p) = β_p + α_p · cos(2π · f_y · y_p / H_p + 2πn/N)

wherein x_p is the abscissa and y_p is the ordinate of the first pixel point in the projection device plane; I_n^x is the light intensity value of the at least one first pixel point under the transverse scanning and I_n^y is the light intensity value of the at least one first pixel point under the longitudinal scanning; β_p is a balance constant keeping the light intensity non-negative; f_x is the frequency of the projected pattern under the transverse scanning, f_y is the frequency of the projected pattern under the longitudinal scanning, f_y ∈ {f_1, f_2, f_3, …, f_h}, f_h is the highest frequency of the projected pattern, and f_x = f_h; α_p is a modulation constant controlling the range of the projection pattern; n is the phase-shift index and N is the total number of phase-shift steps; W_p is the transverse resolution of the projection device and H_p is the longitudinal resolution of the projection device.
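As an illustrative aid (not the claimed formulas themselves, whose images did not survive extraction), a standard N-step phase-shift fringe encoding consistent with the symbols defined in claim 2 can be sketched as follows; the cosine form, the +2πn/N shift, and all numeric parameter values are assumptions:

```python
import numpy as np

def fringe_patterns(W_p, H_p, f, N, beta=127.5, alpha=127.5, axis="x"):
    """Generate N phase-shifted sinusoidal fringe images of frequency f.

    Assumed model: I_n = beta + alpha * cos(2*pi*f*u/res + 2*pi*n/N),
    where u is the pixel coordinate along the chosen axis and res is
    the projector resolution along that axis.
    """
    if axis == "x":
        coord = np.arange(W_p)[None, :] / W_p   # normalized abscissa
    else:
        coord = np.arange(H_p)[:, None] / H_p   # normalized ordinate
    patterns = []
    for n in range(N):
        phase = 2 * np.pi * f * coord + 2 * np.pi * n / N
        # broadcast the 1-D profile to a full projector-sized image
        patterns.append(beta + alpha * np.cos(phase) * np.ones((H_p, W_p)))
    return patterns

# Transverse scan at the highest frequency, longitudinal multi-frequency scan
# (resolution and frequency values are illustrative only):
transverse = fringe_patterns(W_p=1280, H_p=800, f=64, N=4, axis="x")
longitudinal = [fringe_patterns(1280, 800, f, 4, axis="y") for f in (1, 8, 64)]
```

Projecting the longitudinal sets at increasing frequencies up to f_h, and the transverse set at f_x = f_h, matches the multi-frequency scheme the claim describes.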
3. The method according to claim 2, wherein the image capturing the scanning result by an image acquisition device to obtain a captured image comprises:
performing image capture on the scanning result through the image acquisition device, and modeling the captured result to obtain

I_n^x(x_c, y_c) = A_x(x_c, y_c) + B_x(x_c, y_c) · cos(φ_x(x_c, y_c) + 2πn/N)
I_n^y(x_c, y_c) = A_y(x_c, y_c) + B_y(x_c, y_c) · cos(φ_y(x_c, y_c) + 2πn/N)

wherein x_c is the abscissa and y_c is the ordinate of the second pixel point in the image acquisition device plane; A_x and A_y are the average light intensities of the at least one second pixel point collected by the image acquisition device in the transverse and longitudinal directions; B_x and B_y are the brightness modulations of the image acquisition device in the transverse and longitudinal directions; φ_x is the transverse wrapped phase and φ_y is the longitudinal wrapped phase, expressed as

φ(x_c, y_c) = −arctan( Σ_{n=0}^{N−1} I_n(x_c, y_c) · sin(2πn/N) / Σ_{n=0}^{N−1} I_n(x_c, y_c) · cos(2πn/N) )
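The wrapped phases of claim 3 are conventionally recovered from the N phase-shifted captures by an arctangent of two quadrature sums. The sketch below assumes the standard N-step model I_n = A + B·cos(φ + 2πn/N); the sign convention is an assumption, since the patent's formula images were not recoverable:

```python
import numpy as np

def wrapped_phase(images):
    """Recover the wrapped phase from N phase-shifted captures.

    Assumes I_n = A + B*cos(phi + 2*pi*n/N); the quadrature sums over n
    cancel A and isolate phi up to the 2*pi wrapping ambiguity.
    """
    N = len(images)
    n = np.arange(N).reshape(-1, 1, 1)          # broadcast over (N, H, W)
    stack = np.asarray(images, dtype=np.float64)
    num = np.sum(stack * np.sin(2 * np.pi * n / N), axis=0)
    den = np.sum(stack * np.cos(2 * np.pi * n / N), axis=0)
    return -np.arctan2(num, den)                # wrapped to (-pi, pi]
```

Applying this per direction yields the transverse wrapped phase from the transverse captures and the longitudinal wrapped phase from each longitudinal frequency's captures.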
4. The method according to claim 3, wherein the performing an unwrapping operation on the longitudinal wrapped phase of the at least one second pixel point to obtain a longitudinal phase, calculating a transverse temporary phase according to the longitudinal phase, performing an unwrapping operation on the transverse wrapped phase based on the transverse temporary phase to obtain a transverse phase, and taking the longitudinal phase and the transverse phase as the phase information corresponding to the at least one second pixel point comprises:
solving the longitudinal wrapped phase through a time-domain unwrapping model to obtain the longitudinal phase;
determining the starting point and the end point of the epipolar line, the epipolar line equation being expressed in terms of these two points;
mapping the unwrapped transverse phase and the unwrapped longitudinal phase into the space of the projection device to obtain

x_p = φ_x · W_p / (2π · f_h),  y_p = φ_y · H_p / (2π · f_h)

wherein φ_x is the unwrapped transverse phase and φ_y is the unwrapped longitudinal phase.
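A common realization of the time-domain unwrapping and the phase-to-projector mapping described in claim 4 is sketched below. The round-based multi-frequency unwrapping rule and the linear mapping are standard techniques assumed here, not text recovered from the patent:

```python
import numpy as np

def unwrap_temporal(wrapped, freqs):
    """Multi-frequency temporal phase unwrapping (assumed model).

    wrapped[i] is the wrapped phase at frequency freqs[i]; freqs must
    increase and start at 1, so wrapped[0] is already absolute. Each
    finer phase is unwrapped against the scaled coarser phase.
    """
    phi = wrapped[0]                       # unit-frequency phase: unambiguous
    for prev_f, f, w in zip(freqs, freqs[1:], wrapped[1:]):
        k = np.round((phi * f / prev_f - w) / (2 * np.pi))
        phi = w + 2 * np.pi * k            # absolute phase at frequency f
    return phi

def phase_to_projector(phi, f, resolution):
    """Map an absolute phase to a projector coordinate, inverting the
    assumed encoding phi = 2*pi*f*u/resolution."""
    return phi * resolution / (2 * np.pi * f)
```

With the longitudinal phase unwrapped this way, the same rule can seed the transverse temporary phase before unwrapping the transverse direction.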
5. The method of claim 1, wherein said constructing a three-dimensional point cloud reconstruction model based on said epipolar lines comprises:
acquiring a first intersection point after dropping a perpendicular from the first pixel point to the epipolar line, and obtaining a first direction vector of a first line of sight of the projection equipment based on the first intersection point; acquiring a second intersection point after dropping a perpendicular from the second pixel point to the epipolar line, and obtaining a second direction vector of a second line of sight of the image acquisition equipment based on the second intersection point;
respectively obtaining a first scale factor of the first direction vector and a second scale factor of the second direction vector based on a third intersection point of the first direction vector and the second direction vector;
and constructing the three-dimensional point cloud reconstruction model based on the second scale factor.
6. The method of claim 5, wherein the acquiring a first intersection point after dropping a perpendicular from the first pixel point to the epipolar line, and obtaining a first direction vector of a first line of sight of the projection device based on the first intersection point, comprises:
dropping a perpendicular from the first pixel point (x_p, y_p) to the epipolar line to intersect it at a first intersection point, wherein the foot of the perpendicular on the epipolar line is the first closest point;
obtaining, based on the first intersection point, a first direction vector of the first line of sight O_pP of the projection device, the first direction vector being computed through the calibration matrix M_p of the projection device;
the acquiring a second intersection point after dropping a perpendicular from the second pixel point to the epipolar line, and obtaining a second direction vector of a second line of sight of the image acquisition device based on the second intersection point, comprises:
dropping a perpendicular from the second pixel point (x_c, y_c) to the epipolar line to intersect it at a second intersection point, wherein the foot of the perpendicular on the epipolar line is the second closest point;
obtaining, based on the second intersection point, a second direction vector of the second line of sight O_cP of the image acquisition device, the second direction vector being computed through the calibration matrix M_c of the image acquisition device;
the obtaining a first scale factor of the first direction vector and a second scale factor of the second direction vector based on a third intersection point of the first direction vector and the second direction vector comprises:
expressing the first line of sight O_pP and the second line of sight O_cP in parametric form as

O_pP: (X_w, Y_w, Z_w) = O_p + t_1 · d_p,   O_cP: (X_w, Y_w, Z_w) = O_c + t_2 · d_c

wherein O_c is the optical center of the image acquisition device, O_p is the optical center of the projection device, (X_w, Y_w, Z_w) are the three-dimensional coordinates of the target, d_p and d_c denote the first and second direction vectors, t_1 is the first scale factor, and t_2 is the second scale factor;
the first line of sight O_pP and the second line of sight O_cP intersect at the third intersection point P in three-dimensional space, from which t_1 and t_2 are solved;
the constructing the three-dimensional point cloud reconstruction model based on the second scale factor comprises:
calculating the three-dimensional coordinates of the target through t_2, the resulting expression constituting the three-dimensional point cloud reconstruction model.
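Claim 6 intersects the projector and camera lines of sight to obtain the scale factors and the 3D point. The patent derives t_2 in closed form from the calibrated matrices; the sketch below instead uses a generic least-squares two-ray intersection, which agrees with such a closed form whenever the two rays truly meet:

```python
import numpy as np

def triangulate(o_p, d_p, o_c, d_c):
    """Intersect (or nearly intersect) two lines of sight.

    Solves the 3x2 system [d_p, -d_c] [t1, t2]^T = o_c - o_p in a
    least-squares sense, then returns the midpoint of the closest
    points on the two rays together with the scale factor t2.
    """
    A = np.column_stack([d_p, -d_c])
    t, *_ = np.linalg.lstsq(A, o_c - o_p, rcond=None)
    p1 = o_p + t[0] * d_p                 # point on the projector ray
    p2 = o_c + t[1] * d_c                 # point on the camera ray
    return (p1 + p2) / 2, t[1]            # reconstructed point and t2
```

With noisy calibration the two rays are skew, and the midpoint of the common perpendicular is the usual compromise; for exact data it reduces to the true intersection P.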
7. The method of claim 6, further comprising:
calibrating the image acquisition device and the projection device;
after calibration, calculating the numerator of t_2 and recording the calculation result, so that when the target is detected by the bidirectional structured light encoding and decoding method, the numerator of t_2 can be obtained from the record.
8. An apparatus for bi-directional structured light encoding and decoding, the apparatus comprising:
the scanning module is used for performing longitudinal scanning and transverse scanning on a target through projection equipment to obtain a scanning result, wherein the scanning result comprises a light intensity value of at least one first pixel point, the longitudinal scanning is performed through multi-frequency phase-shift structured light, and the transverse scanning is performed through sinusoidal structured light of the highest frequency;
the capturing module is used for performing image capture on the scanning result through an image acquisition device to obtain a captured image, wherein the captured image comprises a wrapped phase of at least one second pixel point, the wrapped phase comprises a transverse wrapped phase and a longitudinal wrapped phase, the image acquisition device and the projection device are not in the same position, an intersection point exists between a second line of sight of the image acquisition device and a first line of sight of the projection device, and a projection of the second line of sight of the image acquisition device on the plane of the projection device forms an epipolar line;
the calculation module is used for performing an unwrapping operation on the longitudinal wrapped phase of the at least one second pixel point to obtain a longitudinal phase, calculating a transverse temporary phase according to the longitudinal phase, performing an unwrapping operation on the transverse wrapped phase based on the transverse temporary phase to obtain a transverse phase, and taking the longitudinal phase and the transverse phase as phase information corresponding to the at least one second pixel point;
and the reconstruction module is used for constructing a three-dimensional point cloud reconstruction model based on the epipolar line and reconstructing the three-dimensional coordinates of the target based on the three-dimensional point cloud reconstruction model, wherein the three-dimensional point cloud reconstruction model is used for searching, in the plane of the projection equipment, the closest point on the epipolar line to the pixel point.
9. An electronic device, comprising: a processor, a storage medium and a bus, the storage medium storing machine-readable instructions executable by the processor, the processor and the storage medium communicating via the bus when the electronic device is operating, the processor executing the machine-readable instructions to perform the steps of the bi-directional structured light coding and decoding method according to any one of claims 1 to 7.
10. A computer-readable storage medium, characterized in that a computer program is stored on the computer-readable storage medium, which computer program, when being executed by a processor, performs the steps of the bi-directional structured light coding and decoding method according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210997597.6A CN115379182B (en) | 2022-08-19 | 2022-08-19 | Bidirectional structure optical coding and decoding method and device, electronic equipment and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210997597.6A CN115379182B (en) | 2022-08-19 | 2022-08-19 | Bidirectional structure optical coding and decoding method and device, electronic equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115379182A true CN115379182A (en) | 2022-11-22 |
CN115379182B CN115379182B (en) | 2023-11-24 |
Family
ID=84066353
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210997597.6A Active CN115379182B (en) | 2022-08-19 | 2022-08-19 | Bidirectional structure optical coding and decoding method and device, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115379182B (en) |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101667303A (en) * | 2009-09-29 | 2010-03-10 | 浙江工业大学 | Three-dimensional reconstruction method based on coding structured light |
US20130127854A1 (en) * | 2010-08-11 | 2013-05-23 | Primesense Ltd. | Scanning Projectors And Image Capture Modules For 3D Mapping |
CN109993826A (en) * | 2019-03-26 | 2019-07-09 | 中国科学院深圳先进技术研究院 | A kind of structural light three-dimensional image rebuilding method, equipment and system |
CN110285775A (en) * | 2019-08-02 | 2019-09-27 | 四川大学 | Three-dimensional rebuilding method and system based on structure photoperiod coding pattern |
CN110487216A (en) * | 2019-09-20 | 2019-11-22 | 西安知象光电科技有限公司 | A kind of fringe projection 3-D scanning method based on convolutional neural networks |
CN111586387A (en) * | 2020-06-23 | 2020-08-25 | 广东省航空航天装备技术研究所 | Projection assembly and three-dimensional imaging device |
CN113124779A (en) * | 2021-04-06 | 2021-07-16 | 电子科技大学 | Rapid bidirectional structured light decoding method |
CN114219866A (en) * | 2021-12-17 | 2022-03-22 | 中国科学院苏州纳米技术与纳米仿生研究所 | Binocular structured light three-dimensional reconstruction method, reconstruction system and reconstruction equipment |
CN114663597A (en) * | 2022-04-06 | 2022-06-24 | 四川大学 | Real-time structured light reconstruction method and device based on normalized extended polar line geometry |
CN114708316A (en) * | 2022-04-07 | 2022-07-05 | 四川大学 | Structured light three-dimensional reconstruction method and device based on circular stripes and electronic equipment |
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101667303A (en) * | 2009-09-29 | 2010-03-10 | 浙江工业大学 | Three-dimensional reconstruction method based on coding structured light |
US20130127854A1 (en) * | 2010-08-11 | 2013-05-23 | Primesense Ltd. | Scanning Projectors And Image Capture Modules For 3D Mapping |
CN109993826A (en) * | 2019-03-26 | 2019-07-09 | 中国科学院深圳先进技术研究院 | A kind of structural light three-dimensional image rebuilding method, equipment and system |
CN110285775A (en) * | 2019-08-02 | 2019-09-27 | 四川大学 | Three-dimensional rebuilding method and system based on structure photoperiod coding pattern |
CN110487216A (en) * | 2019-09-20 | 2019-11-22 | 西安知象光电科技有限公司 | A kind of fringe projection 3-D scanning method based on convolutional neural networks |
CN111586387A (en) * | 2020-06-23 | 2020-08-25 | 广东省航空航天装备技术研究所 | Projection assembly and three-dimensional imaging device |
CN113124779A (en) * | 2021-04-06 | 2021-07-16 | 电子科技大学 | Rapid bidirectional structured light decoding method |
CN114219866A (en) * | 2021-12-17 | 2022-03-22 | 中国科学院苏州纳米技术与纳米仿生研究所 | Binocular structured light three-dimensional reconstruction method, reconstruction system and reconstruction equipment |
CN114663597A (en) * | 2022-04-06 | 2022-06-24 | 四川大学 | Real-time structured light reconstruction method and device based on normalized extended polar line geometry |
CN114708316A (en) * | 2022-04-07 | 2022-07-05 | 四川大学 | Structured light three-dimensional reconstruction method and device based on circular stripes and electronic equipment |
Non-Patent Citations (4)
Title |
---|
FANG-HSUAN CHENG: "3D object scanning system by coded structured light", IEEE
KAI LIU: "Reconstructing 3D point clouds in real time with look-up tables for structured light scanning along both horizontal and vertical directions", OPTICS LETTERS, vol. 44, no. 24
WU Haibin: "Survey of three-dimensional reconstruction techniques for human body cavities based on computer vision", Computer Engineering, vol. 47, no. 10
SUN Weiwen: "Dual-frequency structured light encoding and decoding method for real-time three-dimensional imaging", High Power Laser and Particle Beams
Also Published As
Publication number | Publication date |
---|---|
CN115379182B (en) | 2023-11-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10911672B2 (en) | Highly efficient three-dimensional image acquisition method based on multi-mode composite encoding and epipolar constraint | |
US9879985B2 (en) | Simultaneous multiple view surface geometry acquisition using structured light and mirrors | |
CN104299211B (en) | Free-moving type three-dimensional scanning method | |
CN102750697B (en) | Parameter calibration method and device | |
US10810718B2 (en) | Method and device for three-dimensional reconstruction | |
Acosta et al. | Laser triangulation for shape acquisition in a 3D scanner plus scan | |
US10559085B2 (en) | Devices, systems, and methods for reconstructing the three-dimensional shapes of objects | |
CN111724443B (en) | Unified scene visual positioning method based on generative confrontation network | |
US10062171B2 (en) | 3D reconstruction from photometric stereo with shadows | |
CN114792345B (en) | Calibration method based on monocular structured light system | |
Song et al. | Super-resolution phase retrieval network for single-pattern structured light 3D imaging | |
CN115379182B (en) | Bidirectional structure optical coding and decoding method and device, electronic equipment and storage medium | |
CN116363302B (en) | Pipeline three-dimensional reconstruction and pit quantification method based on multi-view geometry | |
CN105181646A (en) | Computer vision based transparent medium refractivity measurement method | |
CN112504156A (en) | Structural surface strain measurement system and measurement method based on foreground grid | |
CN114708316B (en) | Structured light three-dimensional reconstruction method and device based on circular stripes and electronic equipment | |
Zhang et al. | Freight train gauge-exceeding detection based on three-dimensional stereo vision measurement | |
Goshin et al. | Parallel implementation of the multi-view image segmentation algorithm using the Hough transform | |
CN114897959A (en) | Phase unwrapping method based on light field multi-view constraint and related components | |
CN112097690B (en) | Transparent object reconstruction method and system based on multi-wavelength ray tracing | |
CN100388905C (en) | Three-dimensional feet data measuring method to sparse grid based on curve subdivision | |
Wang et al. | A novel color encoding fringe projection profilometry based on wavelet ridge technology and phase-crossing | |
Gruen et al. | DSM generation with ALOS/PRISM data using SAT-PP | |
CN113375600B (en) | Three-dimensional measurement method and device and electronic equipment | |
Inzerillo | SfM Techniques Applied in Bad Lighting and Reflection Conditions: The Case of a Museum Artwork |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||