CN102494609B - Three-dimensional photographing process based on laser probe array and device utilizing same - Google Patents


Info

Publication number
CN102494609B
CN102494609B (application CN201110367668.6A)
Authority
CN
China
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201110367668.6A
Other languages
Chinese (zh)
Other versions
CN102494609A (en)
Inventor
李志扬
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Application filed by Individual
Priority to CN201110367668.6A
Publication of CN102494609A
Application granted
Publication of CN102494609B

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a three-dimensional photographing method based on a laser probe array, and a device using the method, belonging to the field of three-dimensional photography and measurement. Based on the digital optical phase conjugation principle, thousands of laser probes are projected to predetermined positions in space; the reflections of the probes on object surfaces are monitored by ordinary two-dimensional cameras, and three-dimensional coordinate measurement is thereby realised. By combining the laser probe array with a pair of stereo cameras, additional three-dimensional coordinates can be obtained by stereo reconstruction, achieving accurate and dense three-dimensional coordinate measurement of large scenes. The laser probes improve the accuracy of stereo matching during three-dimensional reconstruction and shorten the matching time; the pictures taken by the stereo cameras can be compensated afterwards, eliminating focal-length and installation errors between the two cameras and relaxing the requirements on prior installation and focal-length synchronisation during actual shooting. The invention is applicable to three-dimensional films, robots, intelligent driving, rapid obstacle detection, automatic measurement of industrial parts, and similar fields.

Description

A three-dimensional photographing method and device based on a laser probe array
The invention belongs to the fields of three-dimensional photography and three-dimensional measurement, and relates more specifically to a three-dimensional photographing method based on a laser probe array, and to a three-dimensional photographing device based on a laser probe array. It is suitable for dense three-dimensional coordinate measurement and stereoscopic photography of large scenes, for rapid three-dimensional coordinate measurement in robotics, intelligent driving and obstacle detection, and for three-dimensional measurement of industrial parts.
Background technology
An ordinary camera projects the object onto an image plane through an optical lens and records it on film or with a CCD/CMOS image sensor; the picture so taken is two-dimensional, and the depth information is lost. With the rapid development of intelligent robots and of safety fields such as driverless vehicles, the demand for depth information is growing, and the three-dimensional coordinates of a scene urgently need to be acquired quickly; in particular, with the development of true three-dimensional display technology, the three-dimensional coordinates and colour information of a scene need to be acquired simultaneously, accurately and at high resolution. Strictly speaking, no existing camera technique can do this. For example, the stereo photography technique using two cameras can collect the colour information of a scene from two angles at high resolution in real time, and when the pictures taken by the two cameras are presented separately to the left and right eyes by a stereoscopic display, the observer perceives a strong stereoscopic effect; but no concrete three-dimensional coordinate values are involved in this process. The pictures taken by the two cameras can be processed using geometric imaging relations to compute the scene's three-dimensional coordinates by stereo reconstruction, but the stereo matching involved is complicated and time-consuming, and in regions without distinctive structural features matching fails, producing large reconstruction noise. Stereo photography with two cameras also places very harsh demands on the cameras: the two must be strictly aligned and synchronised. For example, when one camera zooms, the other must immediately produce an identical zoom, otherwise the resulting three-dimensional film is dizzying and seriously uncomfortable to watch. Closest to true three-dimensional photography are the various existing three-dimensional coordinate measuring techniques, although these generally do not record the colour information of the scene and fall short of the real-time, high-resolution, high-pixel requirement in measuring speed, range, density, precision or interference resistance. For example, measuring techniques based on laser triangulation and laser scanning techniques based on time of flight generally adopt point-by-point measurement and cannot quickly acquire dense three-dimensional coordinates over a large area; laser interferometry is very precise but is easily disturbed by laser noise and vibration and covers only a small area; and the various structured-light projection techniques, while reasonably precise, are generally limited to a measuring depth and width of no more than 5 metres and often suffer from shadowing and occlusion.
Summary of the invention
The object of the invention is to overcome the above shortcomings of the prior art by providing a three-dimensional photographing method and device based on a laser probe array, so that two-dimensional cameras can collect depth information, and the dense three-dimensional coordinates and colour information of a large scene can be obtained simultaneously at high resolution.
For achieving the above object, the present invention adopts following technical scheme:
A three-dimensional photographing method based on a laser probe array comprises the following steps:
Step 1: Set the optical centre of the laser probe generator as the coordinate origin, with the optical axis of the laser probe generator as the Z axis, the X axis perpendicular to the Z axis so that the X-Z plane is horizontal, and the Y axis perpendicular to the X-Z plane. Place the first two-dimensional camera and the second two-dimensional camera on the X axis, symmetrically about the origin, with their optical axes parallel to the Z axis, so that the two cameras form a stereo camera pair while each camera, together with the laser probe generator, forms a three-dimensional measuring apparatus based on the laser probe array.
Step 2: Set the predetermined focus points of the laser probe array emitted by the laser probe generator on a predetermined focussing plane perpendicular to the Z axis, and obtain the predetermined focus-point coordinates of every laser probe in the array.
Step 3: According to the predetermined focus-point coordinates, focus the laser probe array emitted by the laser probe generator onto the predetermined focus points.
Step 4: Calibrate the first and second two-dimensional cameras using the laser probes focused on the predetermined focus points. Place a planar object, perpendicular to the Z axis, on the focussing plane containing the predetermined focus points, and set the predetermined focal length of each camera to its shortest focal length. The positions of the laser probe reflection spots captured by the first camera are then the predetermined imaging positions of the laser probes for the first camera at its shortest focal length, and likewise the spot positions captured by the second camera are its predetermined imaging positions at the shortest focal length.
Step 5: Increase the focal length of each camera by a fixed increment. The positions of the laser probe reflection spots now captured by the first camera are its predetermined imaging positions of the laser probes at the new predetermined focal length, and likewise for the second camera.
Step 6: Repeat step 5 until the predetermined focal length of each camera reaches its longest focal length; the calibration of the first and second two-dimensional cameras is then complete.
Step 7: Illuminate the subject with the focused laser probe array produced in step 3, and photograph the subject with the two cameras calibrated in step 6 to obtain a pair of detection pictures.
Step 8: Read the actual focal lengths of the first and second cameras from the detection pictures taken in step 7. From the predetermined imaging positions recorded at the calibrated focal lengths in step 6, obtain by interpolation the predetermined imaging positions of the laser probes for each camera at its actual focal length.
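The focal-length interpolation of step 8 can be sketched as follows. This is a minimal illustration assuming calibration recorded each probe's imaging position at a series of discrete focal lengths; the function name and data layout are hypothetical, not taken from the patent:

```python
from bisect import bisect_left

def predicted_spot(calib, f_actual):
    """Interpolate a probe's predetermined imaging position at an
    arbitrary focal length from calibration samples.

    calib: list of (focal_length, x, y) tuples for one laser probe,
           sorted by focal length (hypothetical layout of the data
           recorded during calibration).
    """
    focals = [f for f, _, _ in calib]
    # Clamp outside the calibrated range.
    if f_actual <= focals[0]:
        return calib[0][1:]
    if f_actual >= focals[-1]:
        return calib[-1][1:]
    # Locate the bracketing calibration samples and interpolate linearly.
    i = bisect_left(focals, f_actual)
    f0, x0, y0 = calib[i - 1]
    f1, x1, y1 = calib[i]
    t = (f_actual - f0) / (f1 - f0)
    return (x0 + t * (x1 - x0), y0 + t * (y1 - y0))
```

In practice one such table would be kept per probe and per camera, and the interpolation repeated for every probe at the focal length read from the picture.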
Step 9: Using the predetermined imaging position of every laser probe at the actual focal length obtained in step 8, search the detection pictures taken by the first and second cameras for laser probe reflection spots around those positions. Where a reflection spot is found, compute the longitudinal distance Z from the object surface to the X-Y plane from the pixel distance by which the spot departs from the predetermined imaging position used as the search centre. After all predetermined imaging positions have been searched, proceed to step 10.
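A simplified sketch of the depth computation in step 9, under an illustrative pinhole model: the camera sits at baseline b on the X axis, and the probe beam runs from the generator origin through its predetermined focus point (X0, Y0, Z0). This geometry and the resulting formula are assumptions for illustration, not the patent's exact derivation:

```python
def depth_from_spot(x_img, f, b, X0, Z0):
    """Depth Z of the surface point intercepting one laser probe.

    The probe beam passes through the origin and its predetermined
    focus point, so at depth Z the probe sits at X = X0 * Z / Z0.
    A pinhole camera at (b, 0, 0) with focal length f images it at
        x_img = f * (X0*Z/Z0 - b) / Z = f*X0/Z0 - f*b/Z,
    which inverts to
        Z = f * b / (f*X0/Z0 - x_img).
    The spot's image coordinate x_img thus determines the depth.
    """
    return f * b / (f * X0 / Z0 - x_img)
```

Note how the observed spot position shifts away from its predetermined value as the intercepting surface moves off the predetermined focussing plane; that shift is exactly the "pixel distance" the step measures.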
Step 10: Using the actual focal lengths of the first and second cameras read in step 8, scale the detection picture taken by the second camera so that it has the same lateral magnification as the detection picture taken by the first camera. Using the reflection-spot positions found by the search in step 9, rotate the second camera's picture so that each row of laser probe reflection spots in it is parallel to the corresponding row of reflection spots in the first camera's picture.
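The scale and rotation needed in step 10 can be estimated as sketched below, assuming lateral magnification proportional to focal length and taking the rotation from one row of matched reflection spots; the function and its input layout are hypothetical:

```python
import math

def align_second_picture(f1, f2, row1, row2):
    """Scale factor and rotation angle for the second camera's picture.

    f1, f2: actual focal lengths of the two cameras (step 8).
    row1, row2: (x, y) centres of the same row of laser probe
    reflection spots in the first and second pictures (hypothetical
    input). Returns (scale, angle): scale the second picture by
    `scale` about its principal point and rotate it by `angle`
    (radians) to match the first picture's magnification and row
    orientation.
    """
    scale = f1 / f2  # lateral magnification is proportional to f

    def row_angle(row):
        (x0, y0), (x1, y1) = row[0], row[-1]
        return math.atan2(y1 - y0, x1 - x0)

    angle = row_angle(row1) - row_angle(row2)
    return scale, angle
```

A robust implementation would average over many rows and fit the transform by least squares, but the idea is the same: the known probe geometry supplies the reference that corrects the two cameras against each other.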
Step 11: Perform corner and edge feature extraction and matching on the detection picture taken by the first camera in step 7 and on the second camera's picture preprocessed in step 10, obtain the feature points of the two pictures, and compute the longitudinal distance Z of each matched feature point to the X-Y plane from the disparity of that point between the two pictures.
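The disparity-to-distance step can be illustrated with the textbook rectified-stereo relation Z = f·B/d; the function and parameter names are illustrative, not the patent's:

```python
def depth_from_disparity(f, baseline, x_left, x_right, pixel_size):
    """Rectified-stereo depth from the disparity of one feature point.

    f: focal length, baseline: separation of the two cameras (same
    units as f), x_left / x_right: image x-coordinates in pixels of
    the same feature point in the two pictures, pixel_size: sensor
    pixel pitch in the units of f. Returns Z = f * B / d.
    """
    d = (x_left - x_right) * pixel_size  # disparity in sensor units
    if d == 0:
        raise ValueError("zero disparity: point at infinity")
    return f * baseline / d
```

The smaller the disparity, the farther the point; this is why match errors in featureless regions translate into large depth noise, which the probe marks are introduced to suppress.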
Step 12: Take each pixel A1i,j of the detection picture taken by the first camera in step 7 in turn. If the pixel is a laser probe reflection-spot position found by the search in step 9, take the longitudinal distance Z computed for that spot in step 9 as the longitudinal distance Z of the object surface at A1i,j. If A1i,j is not a reflection-spot position found in step 9 but is a feature point extracted in step 11, take the longitudinal distance Z of that feature point as the distance of the object surface at A1i,j. If A1i,j is neither a reflection-spot position found in step 9 nor a feature point extracted in step 11, search for a stereo matching point in the second camera's picture preprocessed in step 10, using a similarity measure chosen in advance, restricting the search to the rectangular region bounded by the four reflection spots found in step 9 that are nearest to A1i,j above, below, left and right. If a matching point A2i',j' is found, compute the longitudinal distance Z of the object surface at A1i,j from the disparity between A1i,j and A2i',j'; if no matching point is found, compute it by linear interpolation from the longitudinal distances Z of those four nearest reflection spots.
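As a sketch of the step 12 fallback, a simple inverse-distance weighting over the four nearest reflection spots can stand in for the linear interpolation described above (the function name and data layout are hypothetical):

```python
import math

def interp_depth(px, py, spots):
    """Fallback depth for a pixel with no reflection spot, feature
    point, or stereo match.

    spots: the four nearest laser probe reflection spots around the
    pixel (px, py), each as (x, y, Z). A distance-weighted average
    stands in here for the linear interpolation of step 12.
    """
    weights, total = [], 0.0
    for x, y, z in spots:
        d = math.hypot(x - px, y - py)
        if d == 0:
            return z  # pixel coincides with a spot
        w = 1.0 / d
        weights.append((w, z))
        total += w
    return sum(w * z for w, z in weights) / total
```

Because the spots bound the pixel on all four sides, the interpolated depth stays within the range of the surrounding measured depths, which is what makes the method usable even on featureless surfaces.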
A three-dimensional photographing device based on a laser probe array comprises a first two-dimensional camera, a second two-dimensional camera, a laser probe generator and a support. The first camera, the laser probe generator and the second camera are fixed on the support in turn at equal spacing; the optical axes of the first camera, the laser probe generator and the second camera are parallel to one another and lie in the same plane; the line joining the optical centres of the lenses of the two cameras is perpendicular to their optical axes; and the optical centre of the laser probe generator lies on that line, so that the two cameras form a stereo camera pair while each camera, together with the laser probe generator, forms a three-dimensional measuring apparatus based on the laser probe array. Each camera has a built-in focal-length monitoring device, and the actual focal length at the moment of exposure is written into the captured image data.
The light emitted by the aforesaid laser probe generator is infrared, and the first and second two-dimensional cameras are dual-band two-dimensional cameras.
The aforesaid dual-band two-dimensional camera consists of a first image sensor, a second image sensor, an optical lens and a wavelength-selective beam splitter. The reflecting surface of the beam splitter is at 45 degrees to the optical axis of the lens, and the first and second image sensors are placed symmetrically on the two image planes of the lens, so that incident infrared light is reflected by the beam splitter and imaged onto the first image sensor, while incident visible light is transmitted through the beam splitter and imaged onto the second image sensor.
The aforesaid dual-band two-dimensional camera further comprises a narrow-band infrared filter, placed in front of the first image sensor.
Terms used in this specification:
(1) Two-dimensional camera: an ordinary camera projects the object onto an image plane through an optical lens and records it on film or with a CCD/CMOS image sensor; the picture so taken is two-dimensional. To distinguish such cameras from the three-dimensional photographing device proposed by the present invention, we refer to them as two-dimensional cameras.
Principle of the present invention: the applicant recently proposed an invention, "A three-dimensional measuring method and device based on a laser probe array" (application number 201110322563.9), which, based on the digital optical phase conjugation principle, projects thousands of laser probes to predetermined positions in space, observes and records with an ordinary two-dimensional camera where the probes meet the object surface, and determines the coordinates of the object from the known three-dimensional coordinates of the probes. Although that method can quickly acquire three-dimensional coordinates over a large area, the acquired coordinates are not dense enough, accounting for only about one percent of the total pixels of the two-dimensional camera. By contrast, the commonly used three-dimensional measuring method based on two cameras photographs the object from two different angles; the imaging position of the same object point differs slightly between the pictures taken by the two cameras, the so-called parallax, from which the distance of the object can be inferred and its three-dimensional coordinates computed pixel by pixel. However, because corresponding match points in the left and right pictures are difficult to determine accurately one by one, the computed coordinates, though very dense, are less reliable, with particularly large noise in regions lacking distinctive features. The core idea of the present invention is to use jointly a three-dimensional measuring apparatus based on a laser probe array and a stereo camera pair, combining the accuracy of the former with the density of the latter. But combining two complete sets of equipment is not a matter of simply putting them together; they must be merged organically, and in particular some new difficulties arising in the merging process must be solved. For example, in the simplest case a laser-probe-array measuring apparatus comprises a laser probe generator and one two-dimensional camera, while a stereo camera pair already contains two two-dimensional cameras, so the former's camera can be omitted and the stereo pair used directly to monitor the reflections of the laser probes on the object surface. This, however, creates new problems. First, in the traditional measuring method based on a laser probe array the focal length of the camera is fixed, so the predetermined imaging position of every laser probe is also fixed and can be calibrated in advance; but the focal length of a stereo camera changes constantly, zooming in and out to achieve particular artistic effects, and these changes shift the predetermined imaging positions of the probes. Second, the laser probe reflection spots superimposed on the picture impair its integrity; or the brightness of the scene may far exceed that of the reflection spots, submerging them and preventing accurate measurement.
To solve the above problems, the present invention first proposes a method of dynamically determining the camera focal length. In the laser-probe measuring method, if an ordinary zoom camera replaces the fixed-focal-length two-dimensional camera, computing the object's depth distance involves only one unknown variable: the focal length of the camera. Once the focal length is determined, the predetermined imaging positions of the laser probe reflection spots at that focal length can be precomputed, and from the pixel distance by which each reflection spot departs from its predetermined imaging position, the depth distance of each laser probe reflection point can be calculated; these reflection points reflect the geometry of the object. In most scenes the background is fixed, and although the position of the object relative to the camera may change, the geometry of the object itself usually remains unchanged, or changes slowly between two frames. Exploiting this, two focal-length values can be chosen exploratorily, as the focal lengths before and after a zoom; N laser probe reflection points Ai, i = 1, 2, …, N (N ≥ 2) are then chosen, the two frames taken before and after the zoom are processed separately, and the depth distance of the object at these N points is computed. If the object merely translated back and forth as a whole, the change of depth distance at the N points before and after the zoom should be identical; if, further, the object did not move at all, the change should be zero. This verifies whether the trial focal-length values are correct; if not, the trial values are adjusted and the computation repeated until accurate focal-length values are obtained. Since cameras shoot at least 25 frames per second, this assumption holds approximately in most cases even when objects in the picture deform, because the change between frames is relatively slow. For objects whose shape changes rapidly, including rapidly tumbling objects, the assumption no longer holds; the camera must then supply an accurate focal-length value, which requires improving existing cameras. Present two-dimensional cameras generally adjust focal length with a piezoelectric motor or gears, so the motor travel can be accumulated from the electrical drive signals sent to the motor, or a length-monitoring device can be added to measure the travel of the motor or gears and report the focal length of the camera in real time. In a modern digital video camera this focal-length value can further be written into the captured image, for example by prepending a few fixed bytes recording the focal length to each frame of image data, so that subsequent processing can read the shooting focal length directly from the image.
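The exploratory focal-length check described above can be sketched as follows, assuming a simplified pinhole model in which a probe beam of slope s = X0/Z0 images at x = f·s - f·b/Z for a camera of focal length f at baseline b (an illustrative assumption, not the patent's formula). For a rigid object, the depth change between the pre-zoom and post-zoom frames should be identical at every probe point, so a trial focal length is accepted when the spread of these changes vanishes:

```python
def depth(f, b, beam_slope, x_img):
    """Depth of one probe reflection under the simplified pinhole
    model: Z = f*b / (f*beam_slope - x_img), with beam_slope = X0/Z0
    (an assumed geometry for illustration)."""
    return f * b / (f * beam_slope - x_img)

def zoom_consistency(f_after_trial, f_before, b, slopes, spots_before, spots_after):
    """Spread of the per-point depth change across N probe points.

    spots_before / spots_after: image x-coordinates of the N probe
    reflection spots in the frames taken before and after the zoom.
    For a rigid object translating along Z, Z_after - Z_before is
    the same at every point, so the trial focal length minimising
    this spread is taken as the post-zoom focal length.
    """
    changes = [depth(f_after_trial, b, s, xa) - depth(f_before, b, s, xb)
               for s, xb, xa in zip(slopes, spots_before, spots_after)]
    return max(changes) - min(changes)
```

In use, `zoom_consistency` would be evaluated over a range of trial focal lengths and the minimiser accepted, exactly mirroring the adjust-and-repeat loop of the text.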
Once the focal lengths of the cameras are determined, the pictures taken by the two cameras can each be processed to compute the object's three-dimensional coordinates at every laser probe reflection point. Then, centred on these points, stereo matching is performed between the left and right pictures to compute the object's three-dimensional coordinates pixel by pixel. Before matching, the pictures taken by the two cameras also need preprocessing: for example, if unsynchronised zooming leaves the two cameras with different focal lengths, the pictures must be scaled so that their imaging magnifications are identical; and if the CCD or CMOS image sensors of the two cameras are installed out of parallel, or have loosened in use, the pictures must be rotated so that corresponding rows of laser probe reflection spots in the two pictures are parallel, thereby compensating installation deviations. This preprocessing improves the precision of stereo matching on the one hand and, on the other, makes the captured stereo picture pair more comfortable to watch when shown on a stereoscopic display. The preprocessing amounts to using the laser probes to compensate the cameras after the fact, eliminating the measurement errors caused by the focal-length and installation deviations of the two cameras, and thus relaxing the requirements on prior installation and focal-length synchronisation during actual shooting. Moreover, the thousands of laser probes projected onto the object surface are equivalent to marking the object; with the matching operation centred on these marks, the blind search range is reduced, saving time, while precision is improved and mismatches are reduced, lowering the three-dimensional reconstruction noise. Even where the object surface lacks feature structure, the three-dimensional coordinates of points between the marks can be determined by linear interpolation.
Further, the present invention designs a dual-band camera that separates visible light from the infrared laser probe reflection spots with a wavelength-selective half mirror, and records the visible image and the infrared reflection-spot image separately on two CCD or CMOS image sensors. This avoids, on the one hand, the reflection spots corrupting the picture; on the other hand, the two sensors can use independent hardware amplifier gains, preventing a bright image from swamping a faint one.
In summary, adding the three-dimensional reconstruction results from the stereo camera pair to the accurate measurement results based on the laser probe array achieves dense measurement of the three-dimensional coordinates of a large scene. A camera intended for professional three-dimensional photography must have a focal-length monitoring mechanism that writes the focal-length value into the captured picture, and must also meet the dual-band requirement of separating the visible image from the infrared laser probe reflection image.
Compared with the prior art, the present invention has the following advantages and beneficial effects:
1. Introducing thousands of laser probes is equivalent to marking the object, which improves the precision and speed of stereo matching in twin-camera three-dimensional measurement and reduces the noise of three-dimensional reconstruction, making simultaneous high-resolution acquisition of the three-dimensional coordinates and colour information of a scene possible.
2. The thousands of laser probes with known positions can be used to correct the pictures taken by the stereo cameras, eliminating the focal-length and installation deviations of the two cameras through preprocessing such as scaling or rotation, and greatly relaxing the requirements on prior installation and focal-length synchronisation during actual shooting.
3. Because the laser energy is concentrated in the individual laser probes rather than spread over the whole measured region, lower laser energy can be used than in other active optical measuring methods.
Description of drawings
Fig. 1 is a schematic diagram of one embodiment of the three-dimensional photographing device based on the laser probe array.
Fig. 2 is a schematic diagram of the working principle of three-dimensional measurement based on the laser probe array.
Fig. 3 is a schematic diagram of the working principle of three-dimensional measurement based on the stereo camera pair.
Fig. 4 is a schematic diagram of the stereo-matching search range in the twin-camera spatial measuring method.
Fig. 5 is a schematic structural diagram of the dual-band two-dimensional camera.
Embodiment
The technical scheme of the present invention is further elaborated below with reference to the accompanying drawings:
Embodiment 1:
Step 1, the optical centre that the laser probe generator is set are true origin, and the optical axis of laser probe generator is Z axis, and X-axis is perpendicular to Z axis, and X-axis Z axis plane is surface level, and Y-axis is perpendicular to X-axis Z axis plane; The initial point symmetria bilateralis arranges the first two-dimensional camera and the second two-dimensional camera on X-axis, the optical axis of the first two-dimensional camera and the second two-dimensional camera is parallel to Z axis, so that the first two-dimensional camera and the second two-dimensional camera consist of a pair of stereo camera, simultaneously the first two-dimensional camera and the second two-dimensional camera consist of three-dimensional measuring apparatus based on the laser probe array with the laser probe generator respectively;
Step 2: Set the predetermined focus points of the laser probe array emitted by the laser probe generator on a predetermined focusing plane perpendicular to the Z axis, and obtain the predetermined focus point coordinates of every laser probe in the array;
Step 3: According to the predetermined focus point coordinates, focus the laser probe array emitted by the laser probe generator onto the predetermined focus points;
Step 4: Calibrate the first and second two-dimensional cameras with the laser probes focused on the predetermined focus points. Place a planar object, perpendicular to the Z axis, on the focusing plane at the predetermined focus points, and set the predetermined focal lengths of the first and second two-dimensional cameras to their shortest focal lengths. The positions of the laser probe reflection image patches captured by the first two-dimensional camera at this moment are the predetermined image positions of the laser probes for the first two-dimensional camera at its shortest focal length, and likewise the positions of the reflection patches captured by the second two-dimensional camera are its predetermined image positions at its shortest focal length;
Step 5: Increase the focal lengths of the first and second two-dimensional cameras by a fixed increment. The positions of the laser probe reflection patches now captured by the first two-dimensional camera are its predetermined image positions at the new predetermined focal length, and the positions captured by the second two-dimensional camera are its predetermined image positions at the new predetermined focal length;
Step 6: Repeat step 5 until the predetermined focal lengths of the first and second two-dimensional cameras reach their longest focal lengths; the calibration of the two cameras is then finished;
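The calibration sweep of steps 4-6 can be sketched as follows; this is a minimal sketch, assuming a hypothetical `capture_spot_positions(f)` helper that stands in for setting the camera to focal length f and locating every probe's reflection patch in the captured image.

```python
def build_calibration_table(f_min, f_max, step, capture_spot_positions):
    """Record, for each predetermined focal length from the shortest to the
    longest (steps 4-6), where every laser probe is predetermined to image."""
    table = {}
    f = f_min
    while f <= f_max + 1e-9:
        # capture_spot_positions(f) stands in for setting the camera to
        # focal length f and locating each probe's reflection image patch.
        table[round(f, 6)] = capture_spot_positions(f)
        f += step
    return table

# Toy stand-in: each probe's image position drifts linearly with focal length.
positions = build_calibration_table(
    18.0, 54.0, 18.0,
    lambda f: [(10.0 + 0.5 * f, 20.0), (40.0 + 0.5 * f, 20.0)])
```

The table keyed by predetermined focal length is what step 8 later interpolates between.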
Step 7: Illuminate the subject with the focused laser probe array produced in step 3, and photograph the subject with the first and second two-dimensional cameras calibrated in step 6 to obtain the detection pictures;
Step 8: Read the real focal lengths of the first and second two-dimensional cameras from the detection pictures taken in step 7. According to the predetermined image positions of the laser probes calibrated in step 6 at the predetermined focal lengths, obtain by interpolation the predetermined image positions of the laser probes for the first and second two-dimensional cameras at their real focal lengths;
Step 9: According to the predetermined image position of every laser probe at the real focal length obtained in step 8, search the detection pictures taken by the first and second two-dimensional cameras in step 7 for laser probe reflection image patches. If a reflection patch is found, calculate the longitudinal distance Z from the measured object surface to the X-Y plane from the pixel distance by which the reflection patch departs from the predetermined image position used as the search centre. After all predetermined image positions have been searched, go to step 10;
Step 10: According to the real focal lengths of the first and second two-dimensional cameras read in step 8, scale the detection picture taken by the second two-dimensional camera so that it has the same lateral magnification as the detection picture taken by the first two-dimensional camera. Then, according to the laser probe reflection patch positions found in step 9, rotate the detection picture taken by the second two-dimensional camera so that each row of laser probe reflection patches in it is parallel to the corresponding row in the detection picture taken by the first two-dimensional camera;
Step 11: Extract and match corner and edge features in the detection picture taken by the first two-dimensional camera in step 7 and the detection picture taken by the second two-dimensional camera after the preprocessing of step 10, obtaining the feature points of the two pictures; for each matched feature point, calculate its longitudinal distance Z to the X-Y plane from its disparity between the two pictures;
Step 12: Choose in turn each pixel A1i,j in the detection picture taken by the first two-dimensional camera in step 7. If this pixel is a laser probe reflection patch position found in step 9, take the longitudinal distance Z computed for that reflection patch in step 9 as the longitudinal distance Z of the object surface at pixel A1i,j. If pixel A1i,j is not a reflection patch position found in step 9 but is a feature point extracted in step 11, take the longitudinal distance Z of that feature point as the longitudinal distance of the object surface at A1i,j. If A1i,j is neither a reflection patch position found in step 9 nor a feature point extracted in step 11, search for its stereo matching point in the detection picture taken by the second two-dimensional camera after the preprocessing of step 10, using a measure function chosen in advance; the search range is the rectangular area enclosed by the four laser probe reflection patches found in step 9 that are nearest to A1i,j above, below, left and right. If a stereo matching point A2i',j' is found, calculate the longitudinal distance Z of the object surface at A1i,j from the disparity between A1i,j and A2i',j'; if no stereo matching point is found, calculate it by linear interpolation from the longitudinal distances corresponding to the four nearest laser probe reflection patches.
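The per-pixel decision of step 12 can be sketched as follows; `probe_depths`, `feature_depths`, `stereo_match` and `interpolate` are hypothetical stand-ins for the results of steps 9-11 and the interpolation fallback.

```python
def pixel_depth(pixel, probe_depths, feature_depths, stereo_match, interpolate):
    """Step 12 sketch: choose the depth source for one pixel of the first camera.

    probe_depths / feature_depths map pixel -> Z where a laser-probe patch or a
    matched feature point was found; stereo_match(pixel) returns a depth or
    None; interpolate(pixel) falls back to the four nearest probe patches."""
    if pixel in probe_depths:          # laser probe reflection patch here
        return probe_depths[pixel]
    if pixel in feature_depths:        # corner/edge feature matched in step 11
        return feature_depths[pixel]
    z = stereo_match(pixel)            # region matching inside the probe cell
    return z if z is not None else interpolate(pixel)

# A pixel lying on a probe patch takes the step-9 depth directly.
z = pixel_depth((5, 7), {(5, 7): 2.4}, {}, lambda p: None, lambda p: 0.0)
```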
A 3-D photography device based on the laser probe array comprises a first two-dimensional camera 1, a second two-dimensional camera 2, a laser probe generator 3 and a support 4. The first two-dimensional camera 1, the laser probe generator 3 and the second two-dimensional camera 2 are fixed on the support 4 in turn at equal spacing. The optical axes of the first two-dimensional camera 1, the laser probe generator 3 and the second two-dimensional camera 2 are parallel to one another and lie in the same plane; the line connecting the optical centres of the optical lenses of the first two-dimensional camera 1 and the second two-dimensional camera 2 is perpendicular to their optical axes, and the optical centre of the laser probe generator 3 lies on this line, so that the first two-dimensional camera 1 and the second two-dimensional camera 2 constitute a pair of stereo cameras, while each of them also constitutes, together with the laser probe generator 3, a three-dimensional measuring apparatus based on the laser probe array. The first two-dimensional camera 1 and the second two-dimensional camera 2 have built-in focal length monitoring devices, and the real focal length when an image is taken is written into the captured image data. The light emitted by the laser probe generator 3 is infrared, and the first two-dimensional camera 1 and the second two-dimensional camera 2 are dual-band two-dimensional cameras. A dual-band two-dimensional camera consists of a first image sensor 7, a second image sensor 8, an optical lens 5 and a wavelength-selective beam splitter 6. The reflecting surface of the wavelength-selective beam splitter 6 is at 45 degrees to the optical axis of the optical lens 5, and the first image sensor 7 and the second image sensor 8 are placed symmetrically on the two image planes of the optical lens 5, so that incident infrared light is reflected by the wavelength-selective beam splitter 6 and imaged onto the first image sensor 7, while incident visible light is transmitted through the wavelength-selective beam splitter 6 and imaged onto the second image sensor 8. The dual-band two-dimensional camera also comprises a narrow-band infrared filter 9, placed in front of the first image sensor 7.
Embodiment 2:
Fig. 1 shows a 3-D photography device based on the laser probe array. It comprises a first two-dimensional camera 1, a second two-dimensional camera 2, a laser probe generator 3 and a support 4. The first two-dimensional camera 1, the laser probe generator 3 and the second two-dimensional camera 2 are fixed on the support 4 in turn at equal spacing; their optical axes are parallel to one another and lie in the same plane; the line connecting the optical centres of the optical lenses of the first two-dimensional camera 1 and the second two-dimensional camera 2 is perpendicular to their optical axes, and the optical centre of the laser probe generator 3 lies on this line. The first two-dimensional camera 1 and the second two-dimensional camera 2 thus constitute a pair of stereo cameras, while each of them also constitutes, together with the laser probe generator 3, a three-dimensional measuring apparatus based on the laser probe array. Both cameras have built-in focal length monitoring devices, and the real focal length when an image is taken is written into the captured image data.
In Fig. 1, the working principle of the three-dimensional measuring apparatus formed by the first two-dimensional camera 1 and the laser probe generator 3, and of that formed by the second two-dimensional camera 2 and the laser probe generator 3, including the setting of the predetermined positions of the laser probes emitted by the laser probe generator 3, is the same as set forth in the applicant's earlier invention "A method and device for three-dimensional measurement based on a laser probe array" (application number 201110322563.9) and is not repeated here. The discussion below focuses, for the three-dimensional photographing process of embodiment 1, on how to calibrate the first two-dimensional camera 1 and the second two-dimensional camera 2 with zoom function, how to determine the predetermined image positions of the laser probe reflection patches at any focal length, how to determine dynamically the real focal lengths of the two cameras, how to search for the laser probe reflection patches in the pictures taken by the two cameras, how to preprocess and stereo-match the images taken by the two cameras, and finally how to determine the three-dimensional coordinates of the object pixel by pixel.
Consider first the three-dimensional measuring apparatus formed by the first two-dimensional camera 1 and the laser probe generator 3. As shown in Fig. 2a, the laser probe generator 3 emits a laser probe passing through the predetermined position A. Draw an auxiliary ray from the centre of the optical lens of the first two-dimensional camera 1 through point A, shown as the dashed line in the figure; after imaging by the first two-dimensional camera 1, all object points along this ray fall on the same image point A'. When the object lies in plane P1 in front of point A, the laser spot photographed by the first two-dimensional camera 1 lies to the left of the predetermined image position A', as shown in Fig. 2b. When the object lies in plane P2 behind point A, the laser spot lies to the right of A', as shown in Fig. 2d. When the object is exactly at point A, the laser patch photographed by the first two-dimensional camera 1 coincides with A', as shown in Fig. 2c. In Fig. 2a, the longitudinal distance ΔZ by which the object departs from the predetermined point A is proportional to the distance Δd1 along the X direction between the laser probe and the auxiliary ray at distance Z:

Δd1 / ΔZ = D / (2 Z0)   (1)
In formula (1), D/2 is the spacing between the optical centres of the first two-dimensional camera 1 and the laser probe generator 3, and Z0 is the predetermined longitudinal distance of point A. When the laser probe generator 3 is not at the midpoint of the line connecting the optical centres of the optical lenses of the two two-dimensional cameras 1 and 2, the D/2 in formula (1) must be replaced by the distance from the optical centre of the lens of camera 1 or 2 to the optical centre of the laser probe generator 3.
The Δd1 in formula (1) is proportional to the pixel distance Δj1 by which the position of the laser patch photographed by the first two-dimensional camera 1 deviates from the predetermined image point A':

Δd1 = (W / N) Δj1 = (2 Z tan α1 / N) Δj1   (2)
In formula (2), W is the width of the field of view of the first two-dimensional camera 1 at distance Z; it covers all N pixels of the image sensor of the first two-dimensional camera 1, and α1 is the half field-of-view angle of the first two-dimensional camera 1. Let the width of the image sensor of the first two-dimensional camera 1 be W1, and let the image distance of the first two-dimensional camera at object distance Z be L1. Since a ray through the optical centre of the lens does not change direction, W1 and L1 satisfy the geometric relation:

W1 = 2 L1 tan α1   (3)

Let the focal length of the first two-dimensional camera 1 be f1; the object distance Z, image distance L1 and focal length f1 then satisfy the geometric imaging relation. Under normal conditions the object distance Z is far greater than the focal length f1, so the image distance L1 and focal length f1 are approximately equal, hence

tan α1 = W1 / (2 L1) ≈ W1 / (2 f1)   (4)
From formulas (1-2) it follows that

Z = Z0 + ΔZ = D N Z0 / (D N - 4 Z0 tan α1 Δj1)   (5)
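Formula (5) can be written as a short function; a minimal sketch with illustrative parameter values (the baseline, pixel count and half field-of-view below are assumptions, not values from the patent).

```python
import math

def depth_from_patch_offset(delta_j, D, N, Z0, alpha1):
    """Formula (5): longitudinal distance Z of the object surface from the
    pixel offset delta_j of a probe's reflection patch relative to its
    predetermined image position. D: baseline spacing, N: pixel count of the
    image sensor, Z0: predetermined distance of point A, alpha1: half
    field-of-view angle in radians."""
    return D * N * Z0 / (D * N - 4.0 * Z0 * math.tan(alpha1) * delta_j)

# With zero offset the object sits exactly at the predetermined distance Z0.
z = depth_from_patch_offset(0.0, D=0.4, N=2000, Z0=5.0, alpha1=math.radians(20))
```

A positive offset (patch to the right of A') gives Z > Z0 and a negative offset gives Z < Z0, consistent with Figs. 2b-2d.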
When a large scene is measured, the laser probe generator 3 needs to emit thousands of laser probes. Fig. 1 shows the situation when the laser probe generator 3 emits 6 laser probes, passing through the predetermined positions Ai, i = 1, 2, ..., 6. Suppose that at focal length f1 of the first two-dimensional camera 1 the reflection patch of the i-th laser probe departs from its predetermined image position by the pixel distance Δj1i; then according to formula (5) the depth distance Zi of point Ai at focal length f1 is

Zi = Z0 + ΔZi = D N Z0 / (D N - 4 Z0 tan α1 Δj1i)   (6)
Suppose that when the focal length of the first two-dimensional camera 1 becomes f1', its half field-of-view angle becomes α1', and in the image taken after zooming the reflection patch of the i-th laser probe departs from its predetermined image position by Δj1i'; then according to formula (5) the depth distance Zi' of point Ai at focal length f1' is

Zi' = Z0 + ΔZi' = D N Z0 / (D N - 4 Z0 tan α1' Δj1i')   (7)
From formulas (6-7) the movement of the object at point Ai before and after the zoom of the first two-dimensional camera 1 is

dZi = Zi' - Zi = C   (8)

If the shape of the object remains unchanged before and after the zoom of the first two-dimensional camera 1 and its position undergoes only a rigid translation, then C is a constant; if furthermore the position of the object is fixed, then C = 0.
Because formula (8) holds for all laser probes, the focal lengths f1 and f1' of the first two-dimensional camera 1 before and after the zoom can be determined by trial: assign trial values to f1 and f1', determine the predetermined image positions of the laser probes at focal lengths f1 and f1', calculate Zi and Zi' according to formulas (6-7), and check whether formula (8) holds; if not, adjust the trial values f1 and f1' until it does. If the shape of the object remains unchanged before and after the zoom and only a rigid translation occurs, in principle it suffices to examine two points to verify formula (8); if furthermore the position of the object is fixed, examining a single point suffices. Since formula (8) holds for all laser probes, however, it is advisable to examine more points, for example to take statistics over N0 laser probe reflection patches (N0 ≥ 2) and compute the mean square deviation of the change dZ of the longitudinal distance Z:

σN = sqrt( Σ_{i=1}^{N0} (dZi - C)² / N0 )   (9)
In formula (9), C is the average translation of the object, which can be approximated by the average movement over the N0 points:

C = Σ_{i=1}^{N0} (Zi' - Zi) / N0   (10)
If the mean square deviation of dZ calculated according to formulas (9-10) is smaller than a preset threshold, i.e. close to zero, the assumption that "the shape of the object remains unchanged and only a rigid translation occurs" can be considered valid, and the current trial values f1 and f1' are exactly the real focal lengths of the first two-dimensional camera 1 before and after the zoom. If the mean square deviation is larger than the preset threshold, the trial values f1 and f1' are changed and the process is repeated. If, no matter how the trial values f1 and f1' are changed, the minimum mean square error calculated according to formulas (9-10) remains larger than the preset threshold, the assumption is invalid; in that case a professional camera with a focal length monitoring device should be used, which monitors focal length changes itself and writes the new focal length into the image data, so that the real focal length of the camera when an image was taken can be read directly in subsequent processing.
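The trial method of formulas (8)-(10) can be sketched as a search over candidate focal-length pairs that minimizes the mean square deviation; the numeric parameters (baseline D = 0.4, N = 2000 pixels, Z0 = 5, sensor width W1 = 0.024) are illustrative assumptions, and a real implementation would refine the candidates iteratively rather than scan a fixed grid.

```python
import math

def depth(delta_j, f, D=0.4, N=2000, Z0=5.0, W1=0.024):
    """Formula (6)/(7), with tan(alpha) ~ W1/(2 f) from formula (4)."""
    tga = W1 / (2.0 * f)
    return D * N * Z0 / (D * N - 4.0 * Z0 * tga * delta_j)

def sigma(offsets_a, offsets_b, fa, fb):
    """Formulas (8)-(10): mean square deviation of dZi about the mean shift C."""
    dz = [depth(b, fb) - depth(a, fa) for a, b in zip(offsets_a, offsets_b)]
    c = sum(dz) / len(dz)                      # formula (10)
    return math.sqrt(sum((d - c) ** 2 for d in dz) / len(dz))

def find_focal_lengths(offsets_a, offsets_b, candidates):
    """Try every candidate pair (f1, f1') and keep the one with smallest sigma."""
    return min(((fa, fb) for fa in candidates for fb in candidates),
               key=lambda pair: sigma(offsets_a, offsets_b, *pair))
```

For a rigidly translated object the correct pair drives sigma to zero, which is the acceptance criterion stated above.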
As the above discussion shows, calculating the longitudinal distance Z from the position of a laser probe reflection patch requires calibrating the predetermined image positions of the laser probes at different camera focal lengths. For the calibration, a planar object is first placed, perpendicular to the Z axis, on the focusing plane at the predetermined focus points of the laser probes. The zoom range of the camera is then divided into M equal parts, each with focal length increment df = (fmax - fmin)/M. Starting from the minimum focal length fmin and ending at the maximum focal length fmax, the focal length of the camera is set in turn to fi = fmin + i·df, i = 0, 1, ..., M, and one image is taken at each predetermined focal length fi; the image position of the reflection patch of each laser probe in that image is the predetermined image position of the laser probe at predetermined focal length fi. From the predetermined image positions of the laser probes at focal lengths fi and fi+1, the predetermined image positions at any real focal length between fi and fi+1 can be determined by interpolation.
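The interpolation between two calibrated focal lengths can be sketched as a simple linear blend of the two recorded positions; the coordinates below are illustrative.

```python
def interpolate_position(f, f_lo, pos_lo, f_hi, pos_hi):
    """Linearly interpolate a probe's predetermined image position when the
    camera's real focal length f lies between the calibrated focal lengths
    f_lo and f_hi, whose recorded positions are pos_lo and pos_hi."""
    t = (f - f_lo) / (f_hi - f_lo)
    return tuple(a + t * (b - a) for a, b in zip(pos_lo, pos_hi))

# Halfway between the two calibrated focal lengths, halfway between positions.
p = interpolate_position(30.0, 20.0, (100.0, 50.0), 40.0, (120.0, 50.0))
```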
Once the real focal length of the camera is determined, the accurate depth distance of the object at every laser probe reflection spot can be calculated according to formulas (4-5). The three-dimensional coordinates obtained this way are generally sparse, roughly only one percent of the total pixels of the camera. The next step is to process the stereo image pair taken by the first two-dimensional camera 1 and the second two-dimensional camera 2 and calculate the three-dimensional coordinates of many more points by stereo matching. The principle of three-dimensional measurement based on twin cameras is shown in Fig. 3. For convenience of derivation, local rectangular coordinate systems X1-Y1-Z1 and X2-Y2-Z2 are first set up for the first two-dimensional camera 1 and the second two-dimensional camera 2, each with the optical centre of its lens as origin and its optical axis as a coordinate axis. At the same time a world rectangular coordinate system X-Y-Z is set up with the midpoint of the line connecting the optical centres of the lenses of the two cameras as origin; the axes of the two local coordinate systems are parallel to the corresponding axes of the world system, and the Y axis is perpendicular to the paper. In the world coordinate system, the origins of the local systems X1-Y1-Z1 and X2-Y2-Z2 are at (D/2, 0, 0) and (-D/2, 0, 0) respectively. Further suppose the image plane of the first two-dimensional camera 1 is at distance L1 from its local origin, and the image plane of the second two-dimensional camera 2 at distance L2 from its local origin. When the object is far away, L1 and L2 are approximately equal to the focal lengths f1 and f2 of the first and second two-dimensional cameras 1 and 2. Suppose a space point A has coordinates (x, y, z) in the world system; its image point A1 on the image plane of the first two-dimensional camera 1 has coordinates (x1, y1, -L1) in X1-Y1-Z1, and its image point A2 on the image plane of the second two-dimensional camera 2 has coordinates (x2, y2, -L2) in X2-Y2-Z2. From similar triangles,
(x - D/2) / z = x1 / L1   (11)

(x + D/2) / z = x2 / L2   (12)

Subtracting (11) from (12) gives:

z = D / (x2/L2 - x1/L1)   (13)
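Formula (13) is a one-line computation; a minimal sketch with illustrative numbers (a point at world coordinates x = 0.5, z = 4.0, baseline D = 0.4, image distances L1 = L2 = 0.05, giving x1 = 0.00375 and x2 = 0.00875 via formulas (11)-(12)).

```python
def stereo_depth(x1, L1, x2, L2, D):
    """Formula (13): depth z of a point imaged at local coordinate x1 (first
    camera, image distance L1) and x2 (second camera, image distance L2),
    with baseline D between the two optical centres."""
    return D / (x2 / L2 - x1 / L1)

z = stereo_depth(0.00375, 0.05, 0.00875, 0.05, 0.4)
```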
As long as the coordinates of an object point on the image planes of both two-dimensional cameras 1 and 2 are known, its depth distance can be calculated easily from formula (13). In other words, for any point in the image taken by the first two-dimensional camera 1, once the coordinates of its corresponding matching point in the image taken by the second two-dimensional camera 2 are known, the three-dimensional coordinates of the object at that point follow from (13); the key is therefore the search for stereo matching points. Many search algorithms for stereo matching points exist at present, mainly feature-based matching, region-based matching and phase-based matching. Their common characteristic is to define a measure function; when the measure function reaches a maximum or a minimum, or falls within a certain threshold range, the point is taken as the corresponding matching point. Take feature-based matching as an example: features such as corners and edges are first extracted from the left and right views, a similarity measure function is selected, the feature points of the two views are matched one by one, and the three-dimensional coordinates of each feature point are calculated with formula (13) from the disparity of the same feature point in the two views. Many feature-based matching algorithms can be found in the existing literature and, for reasons of space, are not discussed in detail here. Take region-based matching as another example: for any point Q1(i0, j0) in the image taken by the first two-dimensional camera 1, choose a rectangular window of pixel width m × n centred on Q1(i0, j0); for a point Q2(i0+Δi, j0+Δj) in the image taken by the second two-dimensional camera 2, choose a rectangular window likewise of pixel width m × n centred on Q2(i0+Δi, j0+Δj), and define the measure function:

d(Δi, Δj) = Σ_{i=1}^{m} Σ_{j=1}^{n} [Q1(i0+i, j0+j) - Q2(i0+Δi+i, j0+Δj+j)]²   (14)
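Formula (14) and the disparity search can be sketched as follows; images are plain nested lists of brightness values, the window is anchored at (i0, j0) rather than centred for brevity, and the vertical disparity Δi is held at zero, as the rectification preprocessing described below is meant to allow.

```python
def ssd(img1, img2, i0, j0, di, dj, m, n):
    """Formula (14): sum of squared brightness differences between an m x n
    window anchored at (i0, j0) in img1 and the window shifted by (di, dj)
    in img2."""
    return sum((img1[i0 + i][j0 + j] - img2[i0 + di + i][j0 + dj + j]) ** 2
               for i in range(m) for j in range(n))

def search_match(img1, img2, i0, j0, m, n, max_dj, threshold):
    """Scan horizontal disparities and keep the smallest SSD; accept it only
    if it is at or below the predetermined threshold, as the text requires."""
    best = min(range(-max_dj, max_dj + 1),
               key=lambda dj: ssd(img1, img2, i0, j0, 0, dj, m, n))
    return best if ssd(img1, img2, i0, j0, 0, best, m, n) <= threshold else None
```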
In formula (14), Q1 and Q2 also denote the image brightness at points Q1 and Q2; the physical meaning of formula (14) is to subtract the brightness distributions in the two rectangular windows pixel by pixel and accumulate the squared errors. If point Q1(i0, j0) in the image taken by the first two-dimensional camera 1 and point Q2(i0+Δi, j0+Δj) in the image taken by the second two-dimensional camera 2 are a pair of matching points, the pattern or brightness distribution around them should be identical or very close, so the measure function calculated by formula (14) should reach a minimum. The search for a matching point consists of varying Δi and Δj until formula (14) reaches a minimum smaller than a predetermined threshold. The final Δi and Δj are called the vertical and horizontal disparity; if the first camera 1 and the second camera 2 are at the same vertical height the vertical disparity is zero, but installation errors may make it non-zero. The region matching method based on formula (14) is sensitive to factors such as image scale, distortion and image noise, because all of these change the brightness distribution of the image; the images must therefore be preprocessed before matching. For example, if the real focal length f1 of the first two-dimensional camera 1 and the real focal length f2 of the second two-dimensional camera 2 are unequal, the lateral magnifications of the images they take differ, i.e. the same object has different sizes in the two images, and formula (14) cannot reach a small minimum. Likewise, if the images taken by the first two-dimensional camera 1 and the second two-dimensional camera 2 are not parallel to each other, formula (14) cannot reach a small minimum either. Therefore, during preprocessing, the detection picture taken by the second two-dimensional camera is first scaled according to the real focal lengths of the two cameras, so that it has the same lateral magnification as the detection picture taken by the first two-dimensional camera. Further, according to the laser probe reflection patch positions found by the search, the detection picture taken by the second two-dimensional camera is rotated so that each row of laser probe reflection patches in it is parallel to the corresponding row in the detection picture taken by the first two-dimensional camera. Of course, before this preprocessing the images may also be filtered to reduce image noise.
In the traditional twin-camera spatial measurement method, the search for the above-mentioned match points is the most difficult and time-consuming computation, and for images with weak structural features it often yields wrong match points, making the three-dimensional reconstruction noisy and the result unreliable. In the present invention, because the laser probe generator 3 has been introduced, the thousands of laser probes it emits are projected onto the object surface, which is equivalent to placing many marks on that surface; no stereo matching is therefore needed at these laser probe reflection spots, and the result calculated from formulas (4)-(5) can be used directly as the accurate three-dimensional coordinate of the object surface at each such point. The remaining points are stereo-matched in two steps. The first step applies a feature-based matching algorithm: corner points, edges and other distinct features in the images are extracted and paired one by one; because their features are obvious, the matching result for these corners and edges has high accuracy. The second step applies a region-based matching algorithm to the remaining areas. If matching succeeds in either step, the three-dimensional coordinate of the object surface at that point is calculated according to formula (13). If matching fails, the three-dimensional coordinate of the point is obtained by interpolating the three-dimensional coordinates of the neighboring laser probe reflection spots or feature points.
Specifically, each pixel A1(i,j) in the detection picture taken by the first two-dimensional camera 1 is considered in turn. If the pixel is exactly a laser probe reflection spot position, the longitudinal distance Z corresponding to that reflection spot is taken as the longitudinal distance Z of the object surface at pixel A1(i,j). If pixel A1(i,j) is not a laser probe reflection spot position but is a feature point found by the feature-based matching, the longitudinal distance Z corresponding to that feature point is taken as the longitudinal distance Z of the object surface at pixel A1(i,j). Otherwise a stereo match point is searched for in the preprocessed detection picture taken by the second two-dimensional camera, using the region-based matching algorithm with the pre-chosen measure function of formula (14). Fig. 4 gives a schematic diagram of the search range: the left figure is the picture taken by the first two-dimensional camera 1, the right figure is the picture taken by the second two-dimensional camera 2, and the solid dots represent laser probe reflection spots. The small dashed boxes centered on A1(i,j) and A2(i',j') indicate the rectangular windows used to evaluate the measure function: the brightness values of corresponding pixels inside the two small boxes are subtracted one by one and accumulated according to formula (14), and if the accumulated value reaches a minimum, A1(i,j) and A2(i',j') are a pair of stereo match points. For regions with weak structural features, because the object height varies slowly, the depth distance of the object surface at pixel A1(i,j) should be close to that at the neighboring laser probe reflection spots; the search range can therefore be reduced substantially, which not only speeds up processing but also improves matching accuracy. That is, the search range is the rectangular area enclosed by the four laser probe reflection spots P(m,n), P(m+1,n), P(m,n+1) and P(m+1,n+1) nearest to pixel A1(i,j) above, below, left and right. If a stereo match point A2(i',j') is found, i.e. the measure function calculated according to formula (14) reaches a sufficiently small minimum at point A2(i',j'), the horizontal pixel coordinates j and j' of A1(i,j) and A2(i',j') are converted by the linear transformations x1 = (j - N/2)W1/N and x2 = (j' - N/2)W2/N into local image-plane coordinates, where W1 and W2 are the widths of the image sensors of the first and second cameras and N is their total pixel count. The longitudinal distance Z of the object surface at pixel A1(i,j) is then calculated according to the twin-camera three-dimensional measurement formula (13). If no stereo match point is found, the longitudinal distance Z of the object surface at pixel A1(i,j) is calculated by linear interpolation from the longitudinal distances corresponding to the four nearest laser probe reflection spots P(m,n), P(m+1,n), P(m,n+1) and P(m+1,n+1).
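The per-pixel matching and triangulation described above can be sketched in code. This is a minimal illustration, not the patented implementation: the triangulation used here is the standard parallel-axis twin-camera relation Z = f·B/(x1 - x2), which is assumed to correspond to formula (13); a sum-of-absolute-differences window measure stands in for measure function (14); and all function names, the baseline B, focal length f and sensor dimensions are hypothetical values chosen for the example.

```python
import numpy as np

def triangulate(j, j2, f, B, W1, W2, N):
    """Depth from a matched pixel pair, assuming the standard parallel-axis
    stereo relation Z = f * B / (x1 - x2) corresponds to formula (13).
    j, j2 are horizontal pixel coordinates in cameras 1 and 2; W1, W2 are
    sensor widths and N the pixel count, as in the linear transforms above."""
    x1 = (j - N / 2) * W1 / N
    x2 = (j2 - N / 2) * W2 / N
    return f * B / (x1 - x2)

def sad(img1, img2, i, j, i2, j2, r):
    """Stand-in for measure function (14): brightness values inside the two
    small windows are subtracted pixel by pixel and the absolute
    differences accumulated."""
    w1 = img1[i - r:i + r + 1, j - r:j + r + 1].astype(float)
    w2 = img2[i2 - r:i2 + r + 1, j2 - r:j2 + r + 1].astype(float)
    return np.abs(w1 - w2).sum()

def match_in_range(img1, img2, i, j, search_box, r=2):
    """Search for the stereo match of pixel (i, j) of camera 1 inside the
    rectangle enclosed by the four neighbouring probe spots.
    search_box = (i_min, i_max, j_min, j_max) in camera 2's picture."""
    i_min, i_max, j_min, j_max = search_box
    best, best_pos = None, None
    for i2 in range(i_min, i_max + 1):
        for j2 in range(j_min, j_max + 1):
            m = sad(img1, img2, i, j, i2, j2, r)
            if best is None or m < best:
                best, best_pos = m, (i2, j2)
    return best_pos, best

def interp_depth(z_nw, z_ne, z_sw, z_se, u, v):
    """Fallback when no match is found: bilinear interpolation between the
    depths of the four enclosing probe spots, with (u, v) in [0, 1]."""
    return (z_nw * (1 - u) * (1 - v) + z_ne * u * (1 - v)
            + z_sw * (1 - u) * v + z_se * u * v)
```

In use, `match_in_range` would be called only for pixels that are neither probe reflection spots nor matched feature points, exactly as in the two-step procedure above; the bounded search box is what makes the region matching fast and robust.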
Embodiment 3:
Fig. 5 shows the structure of a dual-band two-dimensional camera. It consists of a first image sensor 7, a second image sensor 8, an optical lens 5 and a wavelength-selective beam splitter 6. The reflecting surface of the wavelength-selective beam splitter 6 is at 45 degrees to the optical axis of the optical lens 5, and the first image sensor 7 and the second image sensor 8 are placed symmetrically on the two image planes of the optical lens 5, so that incident infrared light is reflected by the wavelength-selective beam splitter 6 and imaged onto the first image sensor 7, while incident visible light is transmitted by the wavelength-selective beam splitter 6 and imaged onto the second image sensor 8. The dual-band two-dimensional camera further comprises a narrowband infrared filter 9, placed in front of the first image sensor 7.
In the three-dimensional photographing device based on a laser probe array shown in Fig. 1, the subject images captured by the first two-dimensional camera 1 and the second two-dimensional camera 2 carry the reflection spots of the laser probes, which is undesirable for later three-dimensional display. For this reason the laser probe generator 3 can be made to emit near-infrared laser probes with a wavelength of 0.7 to 1.2 microns, and the first two-dimensional camera 1 and the second two-dimensional camera 2 can each adopt the dual-band two-dimensional camera shown in Fig. 5, so that the infrared laser probes and the visible-light subject are imaged simultaneously but separately. A further benefit of this arrangement is that the hardware amplifier circuits of the infrared image sensor and the visible-light image sensor can use different amplification ratios, preventing the dimmer image from being drowned out by the brighter one when their brightness levels differ greatly.
To improve the interference immunity of the three-dimensional photographing device based on a laser probe array, the dual-band two-dimensional camera further comprises a narrowband infrared filter 9, placed in front of the first image sensor 7. When several laser-probe-array three-dimensional photographing devices operate at the same time, the laser probe generators in the different devices can be made to emit infrared laser light of slightly different wavelengths; the narrowband infrared filter 9 then blocks the infrared laser light emitted by the laser probe generators of the other devices and passes only the infrared laser light emitted by the device's own laser probe generator.
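The wavelength-offset anti-interference scheme can be illustrated numerically. The sketch below is hypothetical and not taken from the patent: it models the narrowband infrared filter as a simple boxcar passband, and the 850 nm / 860 nm centre wavelengths and 5 nm bandwidth are illustrative values only.

```python
def passes_filter(wavelength_nm, center_nm, bandwidth_nm):
    """Boxcar model of a narrowband infrared filter: transmit only
    wavelengths within +/- bandwidth/2 of the filter's centre."""
    return abs(wavelength_nm - center_nm) <= bandwidth_nm / 2.0

# Device A's own probes at 850 nm pass its 5 nm-wide filter; device B's
# probes, deliberately offset to 860 nm, are blocked by A's filter.
own = passes_filter(850.0, 850.0, 5.0)
other = passes_filter(860.0, 850.0, 5.0)
```

A real filter has a smooth transmission curve rather than a hard cutoff, but the principle is the same: as long as the wavelength offset between devices exceeds the filter bandwidth, each device sees only its own probe array.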

Claims (5)

1. A three-dimensional photographing method based on a laser probe array, characterized by comprising the following steps:
Step 1: set the optical center of the laser probe generator as the coordinate origin, with the optical axis of the laser probe generator as the Z axis, the X axis perpendicular to the Z axis, the X-Z plane horizontal, and the Y axis perpendicular to the X-Z plane; arrange a first two-dimensional camera and a second two-dimensional camera on the X axis symmetrically about the origin, with their optical axes parallel to the Z axis, so that the first two-dimensional camera and the second two-dimensional camera form a pair of stereo cameras, while the first two-dimensional camera and the second two-dimensional camera each form, together with the laser probe generator, a three-dimensional measuring apparatus based on the laser probe array;
Step 2: set the predetermined focus points of the laser probe array emitted by the laser probe generator on a predetermined focusing plane perpendicular to the Z axis, and obtain the predetermined focus point coordinates of every laser probe in the array;
Step 3: focus the laser probe array emitted by the laser probe generator onto the predetermined focus points according to the predetermined focus point coordinates;
Step 4: calibrate the first two-dimensional camera and the second two-dimensional camera using the laser probes emitted by the laser probe generator and focused on the predetermined focus points; place a planar object, perpendicular to the Z axis, on the focusing plane at the predetermined focus points, and set the predetermined focal lengths of the first and second two-dimensional cameras to their shortest focal lengths; the positions of the laser probe reflection spots captured by the first two-dimensional camera at this moment are the predetermined laser probe imaging positions of the first two-dimensional camera at its shortest focal length, and the positions of the laser probe reflection spots captured by the second two-dimensional camera are the predetermined laser probe imaging positions of the second two-dimensional camera at its shortest focal length;
Step 5: increase the focal lengths of the first and second two-dimensional cameras by a fixed amount each; the positions of the laser probe reflection spots captured by the first two-dimensional camera at this moment are the predetermined laser probe imaging positions of the first two-dimensional camera at the new predetermined focal length, and the positions of the laser probe reflection spots captured by the second two-dimensional camera are the predetermined laser probe imaging positions of the second two-dimensional camera at the new predetermined focal length;
Step 6: repeat step 5 until the predetermined focal lengths of the first and second two-dimensional cameras reach their longest focal lengths; calibration of the first and second two-dimensional cameras is then complete;
Step 7: illuminate the subject with the focused laser probe array produced in step 3, and photograph the subject with the first and second two-dimensional cameras calibrated in step 6 to obtain detection pictures;
Step 8: read the actual focal lengths of the first and second two-dimensional cameras from the detection pictures taken in step 7, and, from the predetermined laser probe imaging positions of the calibrated first and second two-dimensional cameras at the predetermined focal lengths obtained in step 6, obtain by interpolation the predetermined laser probe imaging positions of the first and second two-dimensional cameras at their actual focal lengths;
Step 9: according to the predetermined imaging position of every laser probe at the actual focal lengths of the first and second two-dimensional cameras obtained in step 8, search the detection pictures taken by the first and second two-dimensional cameras in step 7 for laser probe reflection spots; if a laser probe reflection spot exists, calculate the longitudinal distance Z from the measured object surface to the X-Y plane from the pixel distance by which the reflection spot departs from the predetermined imaging position taken as the search center; after all predetermined imaging positions have been searched, proceed to step 10;
Step 10: according to the actual focal lengths of the first and second two-dimensional cameras read in step 8, scale the detection picture taken by the second two-dimensional camera in step 7 so that it has the same lateral magnification as the detection picture taken by the first two-dimensional camera; according to the laser probe reflection spot positions found in step 9, rotate the detection picture taken by the second two-dimensional camera in step 7 so that each row of laser probe reflection spots in the second camera's detection picture is parallel to the corresponding row in the first camera's detection picture;
Step 11: perform corner and edge feature extraction and matching on the detection picture taken by the first two-dimensional camera in step 7 and the detection picture taken by the second two-dimensional camera and preprocessed in step 10, obtain the feature points of the two detection pictures, and calculate the longitudinal distance Z from each feature point to the X-Y plane from the disparity of the same feature point between the two detection pictures;
Step 12: choose in turn each pixel A1(i,j) in the detection picture taken by the first two-dimensional camera in step 7; if the pixel is a laser probe reflection spot position found in step 9, take the longitudinal distance Z corresponding to that reflection spot, calculated in step 9, as the longitudinal distance Z of the object surface at pixel A1(i,j); if pixel A1(i,j) is not a laser probe reflection spot position found in step 9 but is a feature point extracted in step 11, take the longitudinal distance Z corresponding to that feature point as the longitudinal distance Z of the object surface at pixel A1(i,j); if pixel A1(i,j) is neither a laser probe reflection spot position found in step 9 nor a feature point extracted in step 11, search for a stereo match point in the detection picture taken by the second two-dimensional camera and preprocessed in step 10 according to a pre-chosen measure function, the search range being the rectangular area enclosed by the four laser probe reflection spots found in step 9 that are nearest to pixel A1(i,j) above, below, left and right; if a stereo match point A2(i',j') is found, calculate the longitudinal distance Z of the object surface at pixel A1(i,j) from the disparity between A1(i,j) and A2(i',j'); if no stereo match point is found, calculate the longitudinal distance Z of the object surface at pixel A1(i,j) by linear interpolation from the longitudinal distances corresponding to the four nearest laser probe reflection spots.
2. A three-dimensional photographing device based on a laser probe array using the method of claim 1, comprising a first two-dimensional camera (1), a second two-dimensional camera (2), a laser probe generator (3) and a support (4), wherein the first two-dimensional camera (1), the laser probe generator (3) and the second two-dimensional camera (2) are fixed on the support (4) in sequence at equal intervals; the optical axes of the first two-dimensional camera (1), the laser probe generator (3) and the second two-dimensional camera (2) are parallel to one another and lie in the same plane; the line connecting the optical centers of the optical lenses of the first two-dimensional camera (1) and the second two-dimensional camera (2) is perpendicular to their optical axes; and the optical center of the laser probe generator (3) lies on that line, so that the first two-dimensional camera (1) and the second two-dimensional camera (2) form a pair of stereo cameras, while the first two-dimensional camera (1) and the second two-dimensional camera (2) each form, together with the laser probe generator (3), a three-dimensional measuring apparatus based on the laser probe array; characterized in that: the first two-dimensional camera (1) and the second two-dimensional camera (2) have built-in focal length monitoring devices, and the actual focal length at the moment an image is taken is written into the captured image data.
3. a kind of 3-D photography device based on the laser probe array according to claim 2, it is characterized in that: the light that described laser probe generator (3) sends is infrared light, and the first two-dimensional camera (1) and the second two-dimensional camera (2) are the two waveband two-dimensional camera.
4. a kind of 3-D photography device based on the laser probe array according to claim 3, it is characterized in that: described two waveband two-dimensional camera is by the first imageing sensor (7), the second imageing sensor (8), optical lens (5) and wavelength select spectroscope (6) to form, wavelength selects the reflecting surface of spectroscope (6) to become 45 degree with the optical axis of optical lens (5), the first imageing sensor (7) and the second imageing sensor (8) respectively symmetry are placed on two image planes of optical lens (5), so that the incident infrared light is selected the reflection of spectroscope (6) through wavelength, be imaged onto the first imageing sensor (7), and the incident visible light is selected the transmission of spectroscope (6) through wavelength, is imaged onto the second imageing sensor (8).
5. a kind of 3-D photography device based on the laser probe array according to claim 4, it is characterized in that: described two waveband two-dimensional camera also comprises an arrowband infrared fileter (9), and arrowband infrared fileter (9) is placed on the front of the first imageing sensor (7).
CN201110367668.6A 2011-11-18 2011-11-18 Three-dimensional photographing process based on laser probe array and device utilizing same Expired - Fee Related CN102494609B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201110367668.6A CN102494609B (en) 2011-11-18 2011-11-18 Three-dimensional photographing process based on laser probe array and device utilizing same


Publications (2)

Publication Number Publication Date
CN102494609A CN102494609A (en) 2012-06-13
CN102494609B true CN102494609B (en) 2013-09-18

Family

ID=46186453

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201110367668.6A Expired - Fee Related CN102494609B (en) 2011-11-18 2011-11-18 Three-dimensional photographing process based on laser probe array and device utilizing same

Country Status (1)

Country Link
CN (1) CN102494609B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018116305A1 (en) * 2016-12-22 2018-06-28 Eva - Esthetic Visual Analytics Ltd. Real-time tracking for three-dimensional imaging

Families Citing this family (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103512909A (en) * 2012-06-18 2014-01-15 中国航空工业第六一八研究所 X-ray focusing device
EP2972478B1 (en) * 2013-03-15 2020-12-16 Uatc, Llc Methods, systems, and apparatus for multi-sensory stereo vision for robotics
CN103792950B (en) * 2014-01-06 2016-05-18 中国航空无线电电子研究所 A kind of method that uses the stereoscopic shooting optical parallax deviation correcting device based on piezoelectric ceramics to carry out error correction
CN103759645B (en) * 2014-01-26 2016-08-17 湖南航天机电设备与特种材料研究所 A kind of aerostat capsule volume measurement method
CN103869593B (en) * 2014-03-26 2017-01-25 深圳科奥智能设备有限公司 Three-dimension imaging device, system and method
CN104243843B (en) * 2014-09-30 2017-11-03 北京智谷睿拓技术服务有限公司 Pickup light shines compensation method, compensation device and user equipment
CN106264536A (en) * 2015-05-21 2017-01-04 长沙维纳斯克信息技术有限公司 A kind of 3D anthropometric scanning apparatus and method
US10338225B2 (en) 2015-12-15 2019-07-02 Uber Technologies, Inc. Dynamic LIDAR sensor controller
US10281923B2 (en) 2016-03-03 2019-05-07 Uber Technologies, Inc. Planar-beam, light detection and ranging system
US10630884B2 (en) 2016-03-23 2020-04-21 Huawei Technologies Co., Ltd. Camera focusing method, apparatus, and device for terminal
US9952317B2 (en) 2016-05-27 2018-04-24 Uber Technologies, Inc. Vehicle sensor calibration system
CN106530343A (en) * 2016-10-18 2017-03-22 深圳奥比中光科技有限公司 Projection device and projection method based on target depth image
US11412204B2 (en) 2016-12-22 2022-08-09 Cherry Imaging Ltd. Three-dimensional image reconstruction using multi-layer data acquisition
EP3559741B1 (en) 2016-12-22 2021-09-22 Cherry Imaging Ltd Three-dimensional image reconstruction using multi-layer data acquisition
US11402740B2 (en) 2016-12-22 2022-08-02 Cherry Imaging Ltd. Real-time tracking for three-dimensional imaging
CN106931903A (en) * 2017-01-19 2017-07-07 武汉中观自动化科技有限公司 A kind of hand-held spatial digitizer of real-time generation model
US10479376B2 (en) 2017-03-23 2019-11-19 Uatc, Llc Dynamic sensor selection for self-driving vehicles
CN107026392B (en) 2017-05-15 2022-12-09 奥比中光科技集团股份有限公司 VCSEL array light source
CN107289858A (en) * 2017-07-06 2017-10-24 广州市九州旗建筑科技有限公司 The measurement apparatus and method of virtual ruler built in a kind of digital picture
US10775488B2 (en) 2017-08-17 2020-09-15 Uatc, Llc Calibration for an autonomous vehicle LIDAR module
US10746858B2 (en) 2017-08-17 2020-08-18 Uatc, Llc Calibration for an autonomous vehicle LIDAR module
CN108181610B (en) * 2017-12-22 2021-11-19 鲁东大学 Indoor robot positioning method and system
US10914820B2 (en) 2018-01-31 2021-02-09 Uatc, Llc Sensor assembly for vehicles
CN108955641B (en) * 2018-04-23 2020-11-17 维沃移动通信有限公司 Depth camera shooting method, depth camera shooting equipment and mobile terminal
CN108592791B (en) * 2018-04-27 2020-06-16 烟台南山学院 Pit inspection method
WO2019227974A1 (en) * 2018-06-02 2019-12-05 Oppo广东移动通信有限公司 Electronic assembly and electronic device
CN112101338B (en) * 2018-08-14 2021-04-30 成都佳诚弘毅科技股份有限公司 Image restoration method based on VIN image acquisition device
CN110411339B (en) * 2019-07-30 2021-07-02 中国海洋大学 Underwater target size measuring equipment and method based on parallel laser beams
CN112710253B (en) * 2019-10-24 2023-06-06 先临三维科技股份有限公司 Three-dimensional scanner and three-dimensional scanning method
CN110728852A (en) * 2019-11-07 2020-01-24 陈定良 Solar traffic light
CN111649686B (en) * 2019-12-04 2021-06-22 西华大学 High-precision vehicle collision deformation measuring method
CN111147840A (en) * 2019-12-23 2020-05-12 南京工业职业技术学院 Automatic control and communication system for video and audio acquisition of 3D camera rocker arm
CN111595254B (en) * 2020-06-04 2021-09-21 中国人民解放军陆军装甲兵学院 Method and system for measuring axial distance between lens array and LCD display screen
CN113701710B (en) * 2021-08-31 2024-05-17 高新兴科技集团股份有限公司 Laser spot positioning method, ranging method, medium and equipment applied to security monitoring
CN114046768B (en) * 2021-11-10 2023-09-26 重庆紫光华山智安科技有限公司 Laser ranging method, device, laser ranging equipment and storage medium
CN115839667B (en) * 2023-02-21 2023-05-12 青岛通产智能科技股份有限公司 Height measurement method, device, equipment and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101539422A (en) * 2009-04-22 2009-09-23 北京航空航天大学 Monocular vision real time distance measure method
CN101694370A (en) * 2009-09-15 2010-04-14 北京信息科技大学 Method for evaluating precision of large-scale industrial photogrammetry system and benchmark device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09257414A (en) * 1996-03-25 1997-10-03 Kobe Steel Ltd Object position detector
US20030035100A1 (en) * 2001-08-02 2003-02-20 Jerry Dimsdale Automated lens calibration


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
CN特开平9-257414A 1997.10.03



Similar Documents

Publication Publication Date Title
CN102494609B (en) Three-dimensional photographing process based on laser probe array and device utilizing same
US8718326B2 (en) System and method for extracting three-dimensional coordinates
EP3480648B1 (en) Adaptive three-dimensional imaging system
CN105115445A (en) Three-dimensional imaging system and imaging method based on combination of depth camera and binocular vision
US20090214107A1 (en) Image processing apparatus, method, and program
JP6251142B2 (en) Non-contact detection method and apparatus for measurement object
CN109883391B (en) Monocular distance measurement method based on digital imaging of microlens array
CN103793911A (en) Scene depth obtaining method based on integration image technology
CN109859272A (en) A kind of auto-focusing binocular camera scaling method and device
CN103090846A (en) Distance measuring device, distance measuring system and distance measuring method
CN110763140B (en) Non-parallel optical axis high-precision binocular ranging method
CN105184784A (en) Motion information-based method for monocular camera to acquire depth information
CN109819235A (en) A kind of axial distributed awareness integrated imaging method having following function
CN107036579A (en) A kind of target relative positioning method based on monocular liquid lens optical system
WO2019125427A1 (en) System and method for hybrid depth estimation
CN114359406A (en) Calibration of auto-focusing binocular camera, 3D vision and depth point cloud calculation method
JP3986748B2 (en) 3D image detection device
KR101296601B1 (en) The camera control system and method for producing the panorama of map information
CN106500729A (en) A kind of smart mobile phone self-inspection calibration method without the need for control information
CN104049257A (en) Multi-camera space target laser three-dimensional imaging device and method
CN110024365A (en) Photographic device and camera system
CN106019536A (en) Device for extending depth of field of array image sensors
CN103630118B (en) A kind of three-dimensional Hyperspectral imaging devices
CN203069176U (en) Ranging device and ranging system thereof
JP2020194454A (en) Image processing device and image processing method, program, and storage medium

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20130918

Termination date: 20211118
