CN102436657A - Active light depth measurement value modifying method based on application of the internet of things - Google Patents

Active light depth measurement value modifying method based on application of the internet of things

Info

Publication number
CN102436657A
CN102436657A (application CN2011102897535A)
Authority
CN
China
Prior art keywords
depth
value
camera
distance
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2011102897535A
Other languages
Chinese (zh)
Inventor
夏东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
HUNAN LINGCHUANG INTELLIGENT SCIENCE & TECHNOLOGY CO., LTD.
Original Assignee
夏东
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 夏东 filed Critical 夏东
Priority to CN2011102897535A priority Critical patent/CN102436657A/en
Publication of CN102436657A publication Critical patent/CN102436657A/en
Pending legal-status Critical Current

Landscapes

  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an active-light depth measurement value correction method for Internet of Things applications. Two methods for correcting depth-map measurement values are disclosed. By analyzing the imaging principle of active-light depth maps, most of a recovered depth map can be regarded as values along the Zc coordinate direction of the camera coordinate system XcYcZc; in general the differences between image coordinates are small and the non-uniformity is low, so, to simplify storage, a substantial correction can already be obtained by storing only one group of correction data. With this depth-value correction method, comparatively accurate depth measurements can be obtained, which greatly reduces the error in subsequent calculations.

Description

Active-light depth measurement value correction method for Internet of Things applications
Technical field
The present invention relates to an active-light depth measurement value correction method for Internet of Things applications.
Background technology
Modern intelligent video surveillance systems (intelligent video surveillance) all use computer-vision methods to analyze the information in a video: the various acting subjects in the video are extracted, and their behavior is analyzed. However, current intelligent video surveillance systems still face many problems. Stable detection of partially occluded targets, shadow removal, detection of low-contrast targets, and accurate segmentation of foreground targets are all difficult problems in current intelligent video surveillance. These problems are difficult precisely because the spatial relationships between objects in the scene cannot be obtained; the absence of per-pixel depth information makes it hard for intelligent video surveillance systems to reach a higher level of performance.
Depth information in a video surveillance scene is vital information, but the ordinary video surveillance process maps the three-dimensional world coordinate system onto a two-dimensional image coordinate system, so distance information is lost while the video is acquired. Because the human eye and brain have a very strong self-calibration capability and considerable additional prior knowledge, humans can understand the content of an image well even for video without direct depth information, and can distinguish the relative positions of the various parts of the image. But relying entirely on manpower to analyze video content has always been impossible, particularly in the present era when surveillance cameras are numerous and labor costs keep rising.
Other intelligent systems, such as robotics applications, also require the system to perceive the surrounding environment, in particular to perceive accurately the distances between the system and the objects around it. For all kinds of intelligent systems, perceiving the depth of the surrounding environment is therefore the key to whether the system can fully deliver its intended effect. Since depth information is lost during imaging, recovering the depth of the surrounding environment is a troublesome process. Current methods of obtaining image depth information mainly combine camera calibration with certain prior knowledge, or use multi-view vision; these methods all require considerable prior information, the calibration process is cumbersome, and when multi-view vision is used to recover depth the cameras must also be calibrated during installation.
PrimeSense is a global leader in the field of 3D machine vision; its depth cameras based on structured-light coding technology can directly obtain the image depth along each pixel direction and are extremely convenient to use. However, PrimeSense's ranging information has a certain structural error, which introduces errors into subsequent applications that require accurate localization and measurement of scene objects in the three-dimensional real world. By first correcting the collected depth information through a correction process and then carrying out the subsequent processing, the required precision of the computations can be guaranteed.
To obtain accurate correction information for a depth camera, the relationship between the actual distance, the measured distance, and the pixel direction must be accurately mapped and modeled. Once this mapping has been established, the measured real-time depth information can be corrected, and the corrected depth information can then be used in subsequent calculations. The depth correction algorithm designed here is applicable to all kinds of depth cameras, in particular the structured-light depth camera series designed by PrimeSense.
Summary of the invention
The object of the present invention is to overcome the deficiencies of the prior art and to provide an active-light depth measurement value correction method for Internet of Things applications.
The active-light depth measurement value correction method for Internet of Things applications is as follows:
Definition 1, depth map: an image, output by a depth camera, that contains the distance from the target to the camera along each pixel direction; that is, the image value f(u, v) at coordinate (u, v) represents the distance from the camera to the closest point along the line of sight through (u, v).
Definition 2, active-light depth map: depth images can be obtained in several ways; one is to use binocular or multi-view vision to obtain an image from parallax, and another is to illuminate with active light and then measure range. The active-light ranging methods mentioned in the present invention include depth images obtained by interferometry, structured-light coding, time-of-flight, and similar methods.
Definition 3, reference plane: a large-area plane, such as a wall.
Definition 4, linear interpolation: a mathematical estimation method that uses the function values at two surrounding points to estimate the function value at an intermediate point linearly: f(x2) = ((x2 − x0)/(x1 − x0)) f(x1) + ((x1 − x2)/(x1 − x0)) f(x0) for x0 < x2 < x1.
Definition 5, look-up table (English: lookup table): a data storage method that stores data to be computed in an array or similar structure for convenient calculation.
Definition 6, arc-cosine method: used to compute the hypotenuse of a right triangle once the adjacent side and the angle between the adjacent side and the hypotenuse are known, namely c = a / cos(θ), where a is the length of the adjacent side, c is the length of the hypotenuse, and θ is the angle between the adjacent side and the hypotenuse.
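As a minimal sketch (the function names are ours, not the patent's), Definitions 4 and 6 can be written as:

```python
import math

def lerp(x0, x1, f0, f1, x2):
    """Definition 4: linear interpolation of f at x2 from the two
    surrounding samples (x0, f0) and (x1, f1), for x0 < x2 < x1."""
    return (x2 - x0) / (x1 - x0) * f1 + (x1 - x2) / (x1 - x0) * f0

def hypotenuse(a, theta):
    """Definition 6 (arc-cosine method): hypotenuse c from the adjacent
    side a and the angle theta (radians) between adjacent side and hypotenuse."""
    return a / math.cos(theta)

# Halfway between the samples, lerp returns the midpoint value;
# for theta = 60 degrees the hypotenuse is twice the adjacent side.
print(lerp(1.0, 3.0, 10.0, 30.0, 2.0))  # ~20.0
print(hypotenuse(1.0, math.pi / 3))     # ~2.0
```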
The concrete steps of the active-light depth measurement value correction method for Internet of Things applications are as follows:
(1) First, mount the depth camera rigidly on a horizontal slide rail;
(2) Aim the camera perpendicularly at a reference plane, typically a wall;
(3) Move the camera along the slide rail, recording the actual camera-to-wall distance as the nominal distance and, at the same time, the target depth image value measured by the camera as the measured distance;
(4) Using the arc-cosine method together with the camera's field-of-view information, compute the accurate camera-to-wall distance along each pixel direction;
(5) Build a look-up table table of size M × N × 2 × Mark, where M × N is the image size, establishing for each pixel the relation between measured distance and actual distance at the different distances; at this point the preparatory work is finished;
(6) Acquire a depth map with the depth camera;
(7) Read the measured distance at pixel direction (u, v) as x2 in the linear interpolation; take from the look-up table entry for this pixel direction, i.e. table(u, v, ...), the two measured values closest to x2 as x0 and x1, and read the nominal distances of the measured values x0 and x1 as f(x0) and f(x1);
(8) Use the two nominal distances f(x0) and f(x1) to linearly interpolate the current measured distance x2 and obtain the estimate f(x2) of the current accurate depth value;
(9) Traverse all pixels, repeating (7) and (8) for each pixel, to obtain the complete corrected depth image.
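The nine steps above can be sketched as follows. This is an illustrative implementation under our own naming, assuming the table built in step (5) stores, for each pixel, a sorted list of measured distances together with the matching nominal distances:

```python
from bisect import bisect_left

def interp1(x, xs, ys):
    """Linear interpolation of steps (7)-(8): xs is sorted ascending,
    ys holds the matching nominal distances; values outside the table
    range are clamped to the nearest endpoint."""
    if x <= xs[0]:
        return ys[0]
    if x >= xs[-1]:
        return ys[-1]
    i = bisect_left(xs, x)
    x0, x1, y0, y1 = xs[i - 1], xs[i], ys[i - 1], ys[i]
    return y0 + (x - x0) * (y1 - y0) / (x1 - x0)

def correct_depth(depth, table):
    """Steps (6)-(9): correct an M x N depth image pixel by pixel.

    depth: M x N nested lists of measured distances.
    table: table[u][v] = (measured, nominal), two equal-length lists
           (the 2 x Mark slice of the M x N x 2 x Mark look-up table)."""
    return [[interp1(depth[u][v], *table[u][v])
             for v in range(len(depth[0]))]
            for u in range(len(depth))]
```

For example, with a per-pixel table ([1.0, 2.0, 3.0], [1.1, 2.2, 3.3]), a measured value of 1.5 is corrected to 1.65.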
This look-up-table method, which builds multiple nominal-distance/measured-distance pairs for every pixel, can recover depth very accurately; a method that corrects from complete measurement information would further need, for every pixel, an accurate look-up correspondence between the depth at each nominal depth and the depths on either side. The method is theoretically clear and simple, and the recovery precision of the image depth is high, but it must store multiple entries for every pixel and therefore requires a relatively large amount of storage.
By analyzing the imaging principle of active-light depth maps, most of a recovered depth map can be regarded as values along the Zc coordinate direction of the camera coordinate system XcYcZc. In general the differences between image coordinates are small and the non-uniformity is low, so, to simplify storage, a substantial correction can already be obtained by storing only one group of correction data.
The second method is briefly described below:
The other correction method is a model-based correction. Through a large number of measurements we found that the distance provided by the depth camera is the Z-axis measured value in the camera coordinate system, but the Z-axis measured value deviates from the true Z-axis coordinate value and needs to be corrected. The correction procedure resembles the first method: a reference plane is selected, the camera is aligned with the reference plane, and the nominal camera-to-wall distance and the camera measurement are recorded. Since the depth camera images along the Z axis, the measured values over the reference plane can be averaged to obtain the correspondence between actual measured values and nominal values, establishing a one-dimensional look-up table of nominal values versus measured values.
The correction process is as follows:
(1) First, mount the depth camera rigidly on a horizontal slide rail;
(2) Aim the camera perpendicularly at a reference plane, typically a wall;
(3) Move the camera along the slide rail, recording the actual camera-to-wall distance as the nominal distance and the average of the image values measured by the camera as the measured distance;
(4) Build a look-up table table_Average of size 2 × Mark relating the average measured Zc value to the nominal Zc value (i.e. the camera-to-wall distance), where Mark is the number of nominal values;
(5) Acquire a depth map depth of size M × N;
(6) Compute the angular distance of the current pixel from the center pixel, which requires the camera's field-of-view size and total pixel count:
θu = (u − M/2) × (fov_u / M), θv = (v − N/2) × (fov_v / N),
where fov_u and fov_v denote the field-of-view sizes in the two directions of the camera, and M and N denote the number of pixels in the u and v directions, respectively;
(7) Data correction: correct the input depth image values by linear interpolation. Read the measured distance at each pixel direction as x2, take from table_Average the two measured values closest to x2 as x0 and x1, read the nominal distances of x0 and x1 as f(x0) and f(x1), and interpolate linearly to obtain the corrected value f(x2) of the measured value x2 as the first-stage corrected image value depth_Modified_table(u, v);
(8) Angular correction: according to the pixel index, apply the angle correction to the depth values produced by the look-up-table data correction of step (7): depth_Modified(u, v) = depth_Modified_table(u, v) / (cos(θu) × cos(θv));
(9) Traverse all pixels to obtain the corrected image depth_Modified.
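Under the same caveat (names and data layout are our illustration, not the patent's), the second method — one shared 2 × Mark table followed by the angular correction of step (8) — can be sketched as:

```python
import math
from bisect import bisect_left

def interp1(x, xs, ys):
    """Shared-table data correction of step (7); clamps outside the table."""
    if x <= xs[0]:
        return ys[0]
    if x >= xs[-1]:
        return ys[-1]
    i = bisect_left(xs, x)
    return ys[i - 1] + (x - xs[i - 1]) * (ys[i] - ys[i - 1]) / (xs[i] - xs[i - 1])

def correct_depth_model(depth, measured, nominal, fov_u, fov_v):
    """Model-based correction: one table_Average (measured -> nominal)
    shared by all pixels, then the per-pixel angular correction
    depth_Modified = depth_Modified_table / (cos(theta_u) * cos(theta_v)).

    depth:        M x N nested lists of measured Zc values.
    measured:     sorted list of Mark averaged measured distances.
    nominal:      matching nominal camera-to-wall distances.
    fov_u, fov_v: field-of-view sizes (radians) along the u and v axes."""
    M, N = len(depth), len(depth[0])
    out = [[0.0] * N for _ in range(M)]
    for u in range(M):
        theta_u = (u - M / 2) * (fov_u / M)      # step (6), u direction
        for v in range(N):
            theta_v = (v - N / 2) * (fov_v / N)  # step (6), v direction
            z = interp1(depth[u][v], measured, nominal)              # step (7)
            out[u][v] = z / (math.cos(theta_u) * math.cos(theta_v))  # step (8)
    return out
```

With a zero field of view all angular factors are 1 and the result reduces to the table correction alone, which makes the two stages easy to test separately.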
Beneficial effects:
The invention provides two methods for correcting the depth values of a depth camera. Through this depth-value correction, comparatively accurate depth measurements can be obtained, which greatly reduces the error in subsequent calculations.
Embodiment
To make the technical means, creative features, workflow, method of application, objects, and effects of the present invention easy to understand, the invention is further described below in conjunction with specific embodiments.
The active-light depth measurement value correction method for Internet of Things applications is as follows:
Embodiment 1:
Definition 1, depth map: an image, output by a depth camera, that contains the distance from the target to the camera along each pixel direction; that is, the image value f(u, v) at coordinate (u, v) represents the distance from the camera to the closest point along the line of sight through (u, v).
Definition 2, active-light depth map: depth images can be obtained in several ways; one is to use binocular or multi-view vision to obtain an image from parallax, and another is to illuminate with active light and then measure range. The active-light ranging methods mentioned in the present invention include depth images obtained by interferometry, structured-light coding, time-of-flight, and similar methods.
Definition 3, reference plane: a large-area plane, such as a wall.
Definition 4, linear interpolation: a mathematical estimation method that uses the function values at two surrounding points to estimate the function value at an intermediate point linearly: f(x2) = ((x2 − x0)/(x1 − x0)) f(x1) + ((x1 − x2)/(x1 − x0)) f(x0) for x0 < x2 < x1.
Definition 5, look-up table (English: lookup table): a data storage method that stores data to be computed in an array or similar structure for convenient calculation.
Definition 6, arc-cosine method: used to compute the hypotenuse of a right triangle once the adjacent side and the angle between the adjacent side and the hypotenuse are known, namely c = a / cos(θ), where a is the length of the adjacent side, c is the length of the hypotenuse, and θ is the angle between the adjacent side and the hypotenuse.
The concrete steps of the active-light depth measurement value correction method for Internet of Things applications are as follows:
(1) First, mount the depth camera rigidly on a horizontal slide rail;
(2) Aim the camera perpendicularly at a reference plane, typically a wall;
(3) Move the camera along the slide rail, recording the actual camera-to-wall distance as the nominal distance and, at the same time, the target depth image value measured by the camera as the measured distance;
(4) Using the arc-cosine method together with the camera's field-of-view information, compute the accurate camera-to-wall distance along each pixel direction;
(5) Build a look-up table table of size M × N × 2 × Mark, where M × N is the image size, establishing for each pixel the relation between measured distance and actual distance at the different distances; at this point the preparatory work is finished;
(6) Acquire a depth map with the depth camera;
(7) Read the measured distance at pixel direction (u, v) as x2 in the linear interpolation; take from the look-up table entry for this pixel direction, i.e. table(u, v, ...), the two measured values closest to x2 as x0 and x1, and read the nominal distances of the measured values x0 and x1 as f(x0) and f(x1);
(8) Use the two nominal distances f(x0) and f(x1) to linearly interpolate the current measured distance x2 and obtain the estimate f(x2) of the current accurate depth value;
(9) Traverse all pixels, repeating (7) and (8) for each pixel, to obtain the complete corrected depth image.
This look-up-table method, which builds multiple nominal-distance/measured-distance pairs for every pixel, can recover depth very accurately; a method that corrects from complete measurement information would further need, for every pixel, an accurate look-up correspondence between the depth at each nominal depth and the depths on either side. The method is theoretically clear and simple, and the recovery precision of the image depth is high, but it must store multiple entries for every pixel and therefore requires a relatively large amount of storage. By analyzing the imaging principle of active-light depth maps, most of a recovered depth map can be regarded as values along the Zc coordinate direction of the camera coordinate system XcYcZc. In general the differences between image coordinates are small and the non-uniformity is low, so, to simplify storage, a substantial correction can already be obtained by storing only one group of correction data.
Embodiment 2:
Through a large number of measurements we found that the distance provided by the depth camera is the Z-axis measured value in the camera coordinate system, but the Z-axis measured value deviates from the true Z-axis coordinate value and needs to be corrected. The correction procedure resembles the first method: a reference plane is selected, the camera is aligned with the reference plane, and the nominal camera-to-wall distance and the camera measurement are recorded. Since the depth camera images along the Z axis, the measured values over the reference plane can be averaged to obtain the correspondence between actual measured values and nominal values, establishing a one-dimensional look-up table of nominal values versus measured values.
The correction process is as follows:
(1) First, mount the depth camera rigidly on a horizontal slide rail;
(2) Aim the camera perpendicularly at a reference plane, typically a wall;
(3) Move the camera along the slide rail, recording the actual camera-to-wall distance as the nominal distance and the average of the image values measured by the camera as the measured distance;
(4) Build a look-up table table_Average of size 2 × Mark relating the average measured Zc value to the nominal Zc value (i.e. the camera-to-wall distance), where Mark is the number of nominal values;
(5) Acquire a depth map depth of size M × N;
(6) Compute the angular distance of the current pixel from the center pixel, which requires the camera's field-of-view size and total pixel count:
θu = (u − M/2) × (fov_u / M), θv = (v − N/2) × (fov_v / N),
where fov_u and fov_v denote the field-of-view sizes in the two directions of the camera, and M and N denote the number of pixels in the u and v directions, respectively;
(7) Data correction: correct the input depth image values by linear interpolation. Read the measured distance at each pixel direction as x2, take from table_Average the two measured values closest to x2 as x0 and x1, read the nominal distances of x0 and x1 as f(x0) and f(x1), and interpolate linearly to obtain the corrected value f(x2) of the measured value x2 as the first-stage corrected image value depth_Modified_table(u, v);
(8) Angular correction: according to the pixel index, apply the angle correction to the depth values produced by the look-up-table data correction of step (7): depth_Modified(u, v) = depth_Modified_table(u, v) / (cos(θu) × cos(θv));
(9) Traverse all pixels to obtain the corrected image depth_Modified.
The basic principles, principal features, and advantages of the present invention have been shown and described above. Those skilled in the art should understand that the present invention is not limited to the embodiments described above; the embodiments and the description merely explain the principles of the invention. Without departing from the spirit and scope of the invention, various changes and improvements are possible, and all such changes and improvements fall within the scope of the claimed invention. The scope of protection of the present invention is defined by the appended claims and their equivalents.

Claims (2)

1. An active-light depth measurement value correction method for Internet of Things applications, characterized by comprising the following steps:
Definition 1, depth map: an image, output by a depth camera, that contains the distance from the target to the camera along each pixel direction; that is, the image value f(u, v) at coordinate (u, v) represents the distance from the camera to the closest point along the line of sight through (u, v);
Definition 2, active-light depth map: depth images can be obtained in several ways; one is to use binocular or multi-view vision to obtain an image from parallax, and another is to illuminate with active light and then measure range;
Definition 3, reference plane: a large-area plane, such as a wall;
Definition 4, linear interpolation: a mathematical estimation method that uses the function values at two surrounding points to estimate the function value at an intermediate point linearly: f(x2) = ((x2 − x0)/(x1 − x0)) f(x1) + ((x1 − x2)/(x1 − x0)) f(x0) for x0 < x2 < x1;
Definition 5, look-up table: a data storage method that stores data to be computed in an array or similar structure for convenient calculation;
Definition 6, arc-cosine method: used to compute the hypotenuse of a right triangle once the adjacent side and the angle between the adjacent side and the hypotenuse are known, namely c = a / cos(θ), where a is the length of the adjacent side, c is the length of the hypotenuse, and θ is the angle between the adjacent side and the hypotenuse.
The concrete steps of the active-light depth measurement value correction method for Internet of Things applications are as follows:
(1) First, mount the depth camera rigidly on a horizontal slide rail;
(2) Aim the camera perpendicularly at a reference plane, typically a wall;
(3) Move the camera along the slide rail, recording the actual camera-to-wall distance as the nominal distance and, at the same time, the target depth image value measured by the camera as the measured distance;
(4) Using the arc-cosine method together with the camera's field-of-view information, compute the accurate camera-to-wall distance along each pixel direction;
(5) Build a look-up table table of size M × N × 2 × Mark, where M × N is the image size, establishing for each pixel the relation between measured distance and actual distance at the different distances; at this point the preparatory work is finished;
(6) Acquire a depth map with the depth camera;
(7) Read the measured distance at pixel direction (u, v) as x2 in the linear interpolation; take from the look-up table entry for this pixel direction, i.e. table(u, v, ...), the two measured values closest to x2 as x0 and x1, and read the nominal distances of the measured values x0 and x1 as f(x0) and f(x1);
(8) Use the two nominal distances f(x0) and f(x1) to linearly interpolate the current measured distance x2 and obtain the estimate f(x2) of the current accurate depth value;
(9) Traverse all pixels, repeating (7) and (8) for each pixel, to obtain the complete corrected depth image.
This look-up-table method, which builds multiple nominal-distance/measured-distance pairs for every pixel, can recover depth very accurately; a method that corrects from complete measurement information would further need, for every pixel, an accurate look-up correspondence between the depth at each nominal depth and the depths on either side. The method is theoretically clear and simple, and the recovery precision of the image depth is high, but it must store multiple entries for every pixel and therefore requires a relatively large amount of storage. By analyzing the imaging principle of active-light depth maps, most of a recovered depth map can be regarded as values along the Zc coordinate direction of the camera coordinate system XcYcZc. In general the differences between image coordinates are small and the non-uniformity is low, so, to simplify storage, a substantial correction can already be obtained by storing only one group of correction data.
2. An active-light depth measurement value correction method for Internet of Things applications, characterized in that the other correction method is as follows:
The distance provided by the depth camera is the Z-axis measured value in the camera coordinate system, but the Z-axis measured value deviates from the true Z-axis coordinate value and needs to be corrected. The correction procedure resembles the method of claim 1: a reference plane is selected, the camera is aligned with the reference plane, and the nominal camera-to-wall distance and the camera measurement are recorded; since the depth camera images along the Z axis, the measured values over the reference plane can be averaged to obtain the correspondence between actual measured values and nominal values, establishing a one-dimensional look-up table of nominal values versus measured values;
The concrete steps are as follows:
(1) First, mount the depth camera rigidly on a horizontal slide rail;
(2) Aim the camera perpendicularly at a reference plane, typically a wall;
(3) Move the camera along the slide rail, recording the actual camera-to-wall distance as the nominal distance and the average of the image values measured by the camera as the measured distance;
(4) Build a look-up table table_Average of size 2 × Mark relating the average measured Zc value to the nominal Zc value (i.e. the camera-to-wall distance), where Mark is the number of nominal values;
(5) Acquire a depth map depth of size M × N;
(6) Compute the angular distance of the current pixel from the center pixel, which requires the camera's field-of-view size and total pixel count:
θu = (u − M/2) × (fov_u / M), θv = (v − N/2) × (fov_v / N),
where fov_u and fov_v denote the field-of-view sizes in the two directions of the camera, and M and N denote the number of pixels in the u and v directions, respectively;
(7) Data correction: correct the input depth image values by linear interpolation. Read the measured distance at each pixel direction as x2, take from table_Average the two measured values closest to x2 as x0 and x1, read the nominal distances of x0 and x1 as f(x0) and f(x1), and interpolate linearly to obtain the corrected value f(x2) of the measured value x2 as the first-stage corrected image value depth_Modified_table(u, v);
(8) Angular correction: according to the pixel index, apply the angle correction to the depth values produced by the look-up-table data correction of step (7): depth_Modified(u, v) = depth_Modified_table(u, v) / (cos(θu) × cos(θv));
(9) Traverse all pixels to obtain the corrected image depth_Modified.
CN2011102897535A 2011-09-27 2011-09-27 Active light depth measurement value modifying method based on application of the internet of things Pending CN102436657A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2011102897535A CN102436657A (en) 2011-09-27 2011-09-27 Active light depth measurement value modifying method based on application of the internet of things


Publications (1)

Publication Number Publication Date
CN102436657A true CN102436657A (en) 2012-05-02

Family

ID=45984699

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2011102897535A Pending CN102436657A (en) 2011-09-27 2011-09-27 Active light depth measurement value modifying method based on application of the internet of things

Country Status (1)

Country Link
CN (1) CN102436657A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105899969A (en) * 2014-01-06 2016-08-24 微软技术许可有限责任公司 Fast general multipath correction in time-of-flight imaging
WO2019075942A1 (en) * 2017-10-17 2019-04-25 深圳奥比中光科技有限公司 Method and system for correcting temperature error of depth camera
CN110868582A (en) * 2018-08-28 2020-03-06 钰立微电子股份有限公司 Image acquisition system with correction function

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008073280A2 (en) * 2006-12-07 2008-06-19 Sensormatic Electronics Corporation Method and apparatus for video surveillance system field alignment
CN101551907A (en) * 2009-04-28 2009-10-07 浙江大学 Method for multi-camera automated high-precision calibration

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008073280A2 (en) * 2006-12-07 2008-06-19 Sensormatic Electronics Corporation Method and apparatus for video surveillance system field alignment
CN101551907A (en) * 2009-04-28 2009-10-07 浙江大学 Method for multi-camera automated high-precision calibration

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
QI MEI-BIN ET AL: "Moving Object Localization with Single Camera Based on Height Model in Video Surveillance", 《2007 1ST INTERNATIONAL CONFERENCE ON BIOINFORMATICS AND BIOMEDICAL ENGINEERING》 *
QUAN TIEHAN ET AL: "High-Precision Calibration and Correction of Photogrammetric Systems", 《自动化学报》 (ACTA AUTOMATICA SINICA) *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105899969A (en) * 2014-01-06 2016-08-24 微软技术许可有限责任公司 Fast general multipath correction in time-of-flight imaging
CN105899969B (en) * 2014-01-06 2018-10-19 微软技术许可有限责任公司 Universal-purpose quick multi-path correction in flight time imaging
WO2019075942A1 (en) * 2017-10-17 2019-04-25 深圳奥比中光科技有限公司 Method and system for correcting temperature error of depth camera
US11335020B2 (en) 2017-10-17 2022-05-17 Orbbec Inc. Method and system for correcting temperature error of depth camera
CN110868582A (en) * 2018-08-28 2020-03-06 钰立微电子股份有限公司 Image acquisition system with correction function
CN110868582B (en) * 2018-08-28 2022-06-03 钰立微电子股份有限公司 Image acquisition system with correction function

Similar Documents

Publication Publication Date Title
CN108510551B (en) Method and system for calibrating camera parameters under long-distance large-field-of-view condition
US20160260250A1 (en) Method and system for 3d capture based on structure from motion with pose detection tool
US20090153669A1 (en) Method and system for calibrating camera with rectification homography of imaged parallelogram
CN112880642B (en) Ranging system and ranging method
WO2015134795A2 (en) Method and system for 3d capture based on structure from motion with pose detection tool
CN110142805A (en) A kind of robot end&#39;s calibration method based on laser radar
CN112686877B (en) Binocular camera-based three-dimensional house damage model construction and measurement method and system
CN102831601A (en) Three-dimensional matching method based on union similarity measure and self-adaptive support weighting
CN102650886A (en) Vision system based on active panoramic vision sensor for robot
CN102800127A (en) Light stream optimization based three-dimensional reconstruction method and device
CN101900531B (en) Method for measuring and calculating binocular vision displacement measurement errors and measuring system
KR101926953B1 (en) Matching method of feature points in planar array of four - camera group and measurement method based theron
CN110889873A (en) Target positioning method and device, electronic equipment and storage medium
CN113888639B (en) Visual odometer positioning method and system based on event camera and depth camera
CN111192235A (en) Image measuring method based on monocular vision model and perspective transformation
CN108269286A (en) Polyphaser pose correlating method based on combination dimensional mark
CN105551020A (en) Method and device for detecting dimensions of target object
CN102519434A (en) Test verification method for measuring precision of stereoscopic vision three-dimensional recovery data
CN102436676A (en) Three-dimensional reestablishing method for intelligent video monitoring
CN105004324A (en) Monocular vision sensor with triangulation ranging function
CN104634246A (en) Floating type stereo visual measuring system and measuring method for coordinates of object space
CN108180888A (en) A kind of distance detection method based on rotating pick-up head
Cain et al. Laser based rangefinder for underwater applications
CN115410167A (en) Target detection and semantic segmentation method, device, equipment and storage medium
CN115371673A (en) Binocular camera target positioning method based on Bundle Adjustment in unknown environment

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
ASS Succession or assignment of patent right

Owner name: HUNAN AIVAS SCIENCE + TECHNOLOGY CO., LTD.

Free format text: FORMER OWNER: XIA DONG

Effective date: 20130109

C41 Transfer of patent application or patent right or utility model
COR Change of bibliographic data

Free format text: CORRECT: ADDRESS; FROM: 410000 CHANGSHA, HUNAN PROVINCE TO: 410205 CHANGSHA, HUNAN PROVINCE

TA01 Transfer of patent application right

Effective date of registration: 20130109

Address after: Floor 2, Productivity Wealth Center, No. 2 Lujing Road, High-tech Zone, Changsha, Hunan Province 410205

Applicant after: HUNAN LINGCHUANG INTELLIGENT SCIENCE & TECHNOLOGY CO., LTD.

Address before: Block J, Trip Spring Jinyuan Building 3, No. 10 Century Ship Road, Kaifu District, Changsha, Hunan Province 410000

Applicant before: Xia Dong

AD01 Patent right deemed abandoned

Effective date of abandoning: 20120502

C20 Patent right or utility model deemed to be abandoned or is abandoned