CN109978957B - Binocular system calibration method based on quantum behavior particle swarm - Google Patents
- Publication number
- CN109978957B CN109978957B CN201910223867.6A CN201910223867A CN109978957B CN 109978957 B CN109978957 B CN 109978957B CN 201910223867 A CN201910223867 A CN 201910223867A CN 109978957 B CN109978957 B CN 109978957B
- Authority
- CN
- China
- Prior art keywords
- coordinate system
- target
- camera
- particle swarm
- quantum
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/004—Artificial life, i.e. computing arrangements simulating life
- G06N3/006—Artificial life, i.e. computing arrangements simulating life based on simulated virtual individual or collective life forms, e.g. social simulations or particle swarm optimisation [PSO]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G06T7/85—Stereo camera calibration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
- G06T2207/10012—Stereo images
Abstract
The invention belongs to the field of computer vision and relates to a binocular system calibration method based on a quantum-behaved particle swarm. A globally convergent algorithm, the quantum-behaved particle swarm optimization (QPSO) algorithm, is introduced into the calibration calculation of a binocular system. By selecting a suitable cost function, the coordinates of the feature points on a planar target in the two camera coordinate systems are obtained on the basis of cross-ratio invariance, and the globally optimal rotation matrix and translation matrix are then obtained with the QPSO algorithm from the correspondence of the feature points in the two camera coordinate systems.
Description
Technical Field
The invention belongs to the field of computer vision and relates to a binocular system calibration method based on a quantum-behaved particle swarm.
Background
A binocular vision sensor mainly comprises two cameras and performs three-dimensional measurement of feature points, feature lines, and the like in their common field of view, based on optical triangulation and the stereo parallax principle. Binocular vision measurement is non-contact, fast, flexible, and accurate, and is therefore widely applied to three-dimensional model reconstruction, three-dimensional measurement of object surface contours, measurement of key geometric parameters of objects, and similar fields.
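The triangulation principle mentioned above can be sketched numerically: for a rectified binocular pair, depth follows directly from disparity. A minimal illustration (function and parameter names are ours, not the patent's, and a rectified pair is assumed):

```python
def depth_from_disparity(f_px: float, baseline_mm: float, disparity_px: float) -> float:
    """Optical triangulation for a rectified binocular pair: Z = f * b / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of the rig")
    return f_px * baseline_mm / disparity_px
```

For example, a 700-pixel focal length, 120 mm baseline, and 35-pixel disparity place the point 2400 mm from the rig.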
Calibration of the parameters of the measurement model of a binocular vision sensor is the key to its successful application; it mainly comprises calibration of the camera intrinsic parameters and calibration of the structural parameters of the sensor. The intrinsic parameters of each camera do not change with the structure between the two cameras and can be calibrated offline; the structural parameters, by contrast, are easily affected by the installation process and must be calibrated online.
Existing calibration methods for the model parameters of a binocular vision sensor mainly include: (1) 3D target methods based on known three-dimensional coordinates; (2) planar target methods with known or unknown motion; (3) one-dimensional target methods with unknown motion. With a 3D target, high-quality calibration images can be obtained only at specific positions because the different planes affect each other's illumination, and the 3D target itself is difficult to machine and expensive to manufacture. Planar target methods with known motion need auxiliary equipment such as a high-precision movable guide rail, and the calibration procedure is complex. Planar target methods with unknown motion require an iterative root-finding process for a nonlinear equation, so the amount of computation is large and the procedure is complex. One-dimensional target methods with unknown motion are accurate and convenient, but the solution requires several matrix transformations and iterative root-finding of nonlinear equations, so the computational complexity is high and the computational error is large. In addition, existing calibration methods give no guarantee of calibration accuracy, and some require a pure translation relation between the two cameras.
Disclosure of Invention
In a binocular vision measurement system, the accuracy of the rotation matrix and translation matrix between the cameras directly affects the measurement accuracy of the system, so it is very important to accurately acquire the relative position relationship between the two cameras (generally referred to as calibration). The invention introduces a globally convergent algorithm, the quantum-behaved particle swarm optimization algorithm, into the calibration calculation of a binocular system: by selecting a suitable cost function, the coordinates of the feature points on a planar target in the two camera coordinate systems are obtained on the basis of cross-ratio invariance, and the globally optimal rotation matrix and translation matrix are then obtained with the QPSO algorithm from the correspondence of the feature points in the two camera coordinate systems.
A binocular system calibration method based on a quantum-behaved particle swarm uses two or more cameras, arranged left and right, that shoot the same object, together with a target on which the object is located;
the method comprises the following steps:
step one, establish the left and right camera coordinate systems and the left and right image plane coordinate systems; then take the left camera coordinate system as the global coordinate system of the binocular vision measurement system and the left camera image coordinate system as the image plane coordinate system of the global coordinate system;
step two, first establish the target world coordinate system O_W-X_W Y_W Z_W on the target plane; then determine the relation between the camera coordinate system and the target world coordinate system from the relation between the image plane and the target plane;
step three, calculate the coordinates of the target feature points in the left and right camera coordinate systems;
and step four, determining the optimal conversion relation between the coordinate systems of the left camera and the right camera by a quantum-behavior-based particle swarm algorithm according to the cost function and the parameters from the step one to the step three, and finishing calibration.
In step four, the quantum-behaved particle swarm algorithm is a globally convergent algorithm.
Specifically, the binocular system calibration method based on a quantum-behaved particle swarm uses two or more cameras, arranged left and right, that shoot the same object, and a checkerboard target on which the object is located;
the method comprises the following steps:
step one, establish the left and right camera coordinate systems O_L^C-X_L^C Y_L^C Z_L^C and O_R^C-X_R^C Y_R^C Z_R^C, respectively;
step two, define the left camera coordinate system as the global coordinate system O-XYZ of the binocular vision measurement system, and define the coordinate system of the image plane π as o-uv;
step three, first establish the target world coordinate system O_W-X_W Y_W Z_W on the target plane; then decompose the object into a number of spatial feature points K whose coordinates in O_W-X_W Y_W Z_W are M = (X_W, Y_W, Z_W);
step four, project the spatial feature point K onto the image plane π; its projection in the image coordinate system o-uv is p = (u, v);
step five, according to the camera projection model:

s[u, v, 1]^T = [[f_x, 0, u_0], [0, f_y, v_0], [0, 0, 1]] [R | t'] [X_W, Y_W, Z_W, 1]^T,  (1)

where f_x is the normalized focal length along the image x-axis, f_y is the normalized focal length along the image y-axis, (u_0, v_0) are the image coordinates of the camera principal point, R is the rotation matrix from the target world coordinate system to the camera coordinate system, and t' is the translation vector from the target world coordinate system to the camera coordinate system;
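As a concrete illustration of the projection model of formula (1), the following sketch (our variable names; it assumes zero lens distortion, which formula (1) does not model either) maps a target-frame point through the extrinsics and the intrinsic matrix K:

```python
import numpy as np

def project_point(K, R, t, M_world):
    """Pinhole projection of a target-frame point M = (X_W, Y_W, Z_W) to (u, v)."""
    M_cam = np.asarray(R, float) @ np.asarray(M_world, float) + np.asarray(t, float)
    uvw = np.asarray(K, float) @ M_cam   # homogeneous pixel vector s*[u, v, 1]
    return uvw[:2] / uvw[2]              # divide out the scale s = Z_c
```

With K assembled from f_x, f_y, and (u_0, v_0), this reproduces the image coordinates p = (u, v) of step four.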
step six, with the help of a target whose dimensions are accurately known, take the checkerboard corner points of the target as the feature points K;
step seven, obtain the image coordinates of the corner points with the Harris corner extraction method;
step eight, calibrate each camera according to formula (1) with Zhang Zhengyou's camera calibration method based on a 2D planar target, obtaining the camera intrinsic parameters, namely the principal point (u_0, v_0) and the normalized focal lengths f_x and f_y, together with the extrinsic parameters R and t' between the camera coordinate system and the target world coordinate system, and from these obtain the three-dimensional coordinates of the checkerboard corner points in the left and right camera coordinate systems;
step nine, establish the objective function f = ||RT × P_l − P_r||, where RT is the rotation-translation matrix from the left camera to the right camera, P_l is the representation of the feature point K in the left camera coordinate system, and P_r is its representation in the right camera coordinate system; the rotation part of RT is over-parameterized, i.e. all 9 matrix entries are used as variables instead of a minimal parameterization of the rotation's 3 degrees of freedom, so that the transformation matrix, comprising the rotation matrix and the translation matrix, is searched over 12 variables;
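The cost of step nine can be sketched as follows, with the candidate solution encoded as a flat 12-vector holding the 3×4 matrix [R | t] row by row (this encoding order is our assumption, not stated in the patent):

```python
import numpy as np

def cost(rt12, P_left, P_right):
    """Sum of per-corner residual norms for a candidate left-to-right transform."""
    RT = np.asarray(rt12, dtype=float).reshape(3, 4)   # [R | t], row-major
    R, t = RT[:, :3], RT[:, 3]
    residuals = P_left @ R.T + t - P_right             # RT applied to every corner
    return float(np.linalg.norm(residuals, axis=1).sum())
```

A perfect transform drives the cost to zero over all corner correspondences; the QPSO search of step ten minimizes exactly this quantity.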
step ten, perform a global search with the quantum-behaved particle swarm algorithm in the 12-dimensional target search space of the transformation matrix of step nine, obtaining the globally optimal solution of the transformation matrix.
In step ten, the global search with the quantum-behaved particle swarm algorithm treats each candidate solution as a particle in the sense of quantum physics and describes its motion by a wave function; in the quantum-behaved particle swarm model, the probability that a particle appears at a given position is determined by the probability density function given by the wave function;
the method specifically comprises the following steps;
step A, randomly initializing the current position of a particle, and setting the current position as the optimal position of the optimal solution representing the conversion matrix;
step B, solving a population X (t) consisting of M particles and N-dimensional spatial particle positions X for M potential problems in an N = 12-dimensional target search space for an objective function f (X) i (t) represents, i.e.
Calculating according to a formula (3) to obtain the average best position of the particle swarm;
step C, obtaining the evolutionary equation of the particles according to the formula (3) as
X i,j (t+1)=p i,j (t)αgC j (t)-X i,j (t)|gln[1/u i,j (t)]; u i,j (t)~U(0,1), (3)
Wherein the content of the first and second substances,a is the coefficient of contraction expansion, j =1, 2.., M; u (0, 1) is X in (0, 1) i,j (t) mean distribution of obeys;
updating the best position of the particle individuals in the step B and the adaptive value of the current position of the calculated particle according to the formula (4);
step D, comparing the adaptive value of the particles in the step C with the global optimal position in the step A, and updating according to a formula (4) to obtain the adaptive value
Wherein f (x) is an objective function,
obtaining the individual best position Pi (t) of the particle i according to the formula (5);
step E, obtaining the product according to the formula (5)
Calculating the new position of the particle according to the formula (6);
and F, iterating the steps A-E to obtain an optimal rotation matrix and an optimal translation matrix, and finally obtaining the global optimal position G (t) of the group by a formula (6).
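The QPSO search of steps A to F can be sketched as a short loop. This is a minimal illustrative implementation (all names, the initialization range, and the default hyper-parameters are our choices, not the patent's):

```python
import numpy as np

def qpso(f, dim=12, n_particles=30, iters=200, alpha=0.75, seed=0):
    """Minimal quantum-behaved PSO: minimize f over a dim-dimensional space."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(-1.0, 1.0, (n_particles, dim))   # step A: random initialization
    P = X.copy()                                     # individual best positions
    fP = np.array([f(x) for x in P])
    g = fP.argmin()                                  # index of the global best
    for _ in range(iters):
        C = P.mean(axis=0)                           # step B: mean best position
        phi = rng.uniform(size=(n_particles, dim))
        p = phi * P + (1.0 - phi) * P[g]             # local attractors
        u = rng.uniform(size=(n_particles, dim))
        sign = np.where(rng.uniform(size=(n_particles, dim)) < 0.5, -1.0, 1.0)
        # step C: X(t+1) = p +- alpha * |C - X| * ln(1/u)
        X = p + sign * alpha * np.abs(C - X) * np.log(1.0 / u)
        fX = np.array([f(x) for x in X])
        improved = fX < fP                           # step D: update individual bests
        P[improved], fP[improved] = X[improved], fX[improved]
        g = fP.argmin()                              # step E: update global best
    return P[g], fP[g]                               # step F: best solution found
```

In the calibration setting, f would be the step-nine cost over the 12 transform variables; the sphere function below merely checks that the search converges.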
The invention has the advantages that:
the method disclosed by the invention has the advantages of accurate and effective calibration result, lower sensitivity to image noise and good robustness.
Drawings
FIG. 1: coordinate systems of the binocular measurement system;
FIG. 2: the checkerboard target;
FIG. 3: flow chart of the calibration process.
Detailed Description
As shown in fig. 1-3, a binocular system calibration method based on quantum behavior particle swarm specifically includes the following steps:
(1) Establishment of a coordinate system
As shown in fig. 1, the left and right camera coordinate systems are O_L^C-X_L^C Y_L^C Z_L^C and O_R^C-X_R^C Y_R^C Z_R^C. In general, the left camera coordinate system is taken as the global coordinate system of the binocular vision measurement system and defined as O-XYZ, and the coordinate system of the image plane π is o-uv. Suppose a spatial feature point K has coordinates M = (X_W, Y_W, Z_W) in the target world coordinate system O_W-X_W Y_W Z_W and projects onto the image plane π at p = (u, v) in the image coordinate system o-uv; then, according to the camera projection model:

s[u, v, 1]^T = [[f_x, 0, u_0], [0, f_y, v_0], [0, 0, 1]] [R | t'] [X_W, Y_W, Z_W, 1]^T,  (1)

where f_x is the normalized focal length along the image x-axis, f_y is the normalized focal length along the image y-axis, (u_0, v_0) are the image coordinates of the principal point, R is the rotation matrix from the target world coordinate system to the camera coordinate system, and t' is the translation vector from the target world coordinate system to the camera coordinate system;
(2) Extraction of calibration target feature points
The calibration process uses a checkerboard target (see fig. 2); the checkerboard corner points of the target serve as the feature points K, and the size of each square is accurately known. The image coordinates of the corner points are obtained with the Harris corner extraction method. Each camera is then calibrated with Zhang Zhengyou's camera calibration method based on a 2D planar target, yielding the camera intrinsic parameters, namely the principal point (u_0, v_0) and the normalized focal lengths f_x and f_y, and the extrinsic parameters R and t', where R is the rotation matrix and t' the translation vector from the target world coordinate system to the camera coordinate system; from these, the three-dimensional coordinates of each checkerboard corner point M = (X_W, Y_W, Z_W) in the left and right camera coordinate systems are obtained.
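The Harris corner extraction used here can be sketched as a bare corner-response computation (our names throughout; a real implementation would add Gaussian weighting, non-maximum suppression, and sub-pixel refinement):

```python
import numpy as np

def harris_response(img, k=0.04):
    """Harris response R = det(M) - k*trace(M)^2 with a 3x3 box-sum window."""
    Iy, Ix = np.gradient(img.astype(float))          # image gradients
    Ixx, Iyy, Ixy = Ix * Ix, Iy * Iy, Ix * Iy
    def box3(a):                                     # 3x3 box filter via shifts
        p = np.pad(a, 1, mode='edge')
        return sum(p[i:i + a.shape[0], j:j + a.shape[1]]
                   for i in range(3) for j in range(3))
    Sxx, Syy, Sxy = box3(Ixx), box3(Iyy), box3(Ixy)  # structure tensor sums
    return (Sxx * Syy - Sxy ** 2) - k * (Sxx + Syy) ** 2
```

On a synthetic checkerboard junction, the response peaks at the corner and stays negative along pure edges, which is why thresholding it isolates the corner points.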
(3) Particle swarm algorithm based on quantum behaviors
The main idea of the particle swarm optimization algorithm derives from research on the behavior of bird flocks: the optimal solution is found through evolution and information sharing within the flock. However, because of the limitations of its evolution equation, the basic particle swarm optimization (PSO) algorithm easily falls into local optima. From the perspective of quantum mechanics, Sun Jun et al. therefore proposed a new PSO model, the quantum-behaved particle swarm optimization (QPSO) algorithm. The algorithm treats the particles as particles in quantum physics, replaces their Newtonian motion with a wave function, and, in the QPSO model, determines the probability that a particle appears at a given position through the probability density function given by the wave function. The QPSO algorithm gives the particles a global search capability, and its performance is greatly improved over traditional PSO.
For the objective function f(X), the M candidate solutions in the N-dimensional target search space form a population X(t) composed of M particles, the position of particle i being the N-dimensional vector X_i(t), i.e. X(t) = (X_1(t), X_2(t), ..., X_M(t)). The mean best position of the swarm is

C(t) = (1/M) Σ_{i=1}^{M} P_i(t).  (7)

The evolution equation of the particles is

X_{i,j}(t+1) = p_{i,j}(t) ± α·|C_j(t) − X_{i,j}(t)|·ln[1/u_{i,j}(t)],  u_{i,j}(t) ~ U(0, 1),  (8)

where α is the contraction-expansion coefficient, j = 1, 2, ..., N, u_{i,j}(t) obeys the uniform distribution U(0, 1) on (0, 1), and p_{i,j}(t) = φ·P_{i,j}(t) + (1 − φ)·G_j(t), φ ~ U(0, 1), is the local attractor of particle i in dimension j.

The individual best position P_i(t) of particle i is determined by

P_i(t) = X_i(t) if f(X_i(t)) < f(P_i(t−1)), otherwise P_i(t−1),  (9)

where f(x) is the objective function; the global best position G(t) of the population is then determined by

G(t) = P_g(t),  g = arg min_{1≤i≤M} f(P_i(t)).  (10)
The quantum-behaved particle swarm algorithm can find the global optimum well, without depending on a good initial value.
(4) Acquisition based on QPSO rotation matrix and translation matrix
Firstly, a left camera coordinate system and a right camera coordinate system are established, and coordinates of corresponding points of the space characteristic points in the two camera coordinate systems are obtained.
Secondly, the objective function f = ||RT × P_l − P_r|| is established, where RT is the rotation-translation matrix from the left camera to the right camera, P_l is the representation of the feature points in the left camera coordinate system, and P_r their representation in the right camera coordinate system. Here a hyper-parameterized representation is used for the rotation matrix instead of a minimal one: although a rotation matrix has only 3 degrees of freedom, all 9 entries are kept as variables, since retaining this parameter redundancy is an efficient strategy here. Together with the translation, the rotation matrix and the translation matrix thus contribute 12 search variables in total;
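Because the 9 rotation variables are searched freely, the matrix the optimizer returns need not be exactly orthonormal. A common post-processing step (our addition, not stated in the patent) unpacks the 12-vector and projects the rotation part onto the nearest true rotation via SVD:

```python
import numpy as np

def unpack(rt12):
    """Split a flat 12-vector into a free 3x3 'rotation' and a 3-translation (assumed layout: 9 rotation entries first, then 3 translation entries)."""
    v = np.asarray(rt12, dtype=float)
    return v[:9].reshape(3, 3), v[9:]

def nearest_rotation(R_free):
    """Orthogonal Procrustes projection of a 3x3 matrix onto SO(3)."""
    U, _, Vt = np.linalg.svd(R_free)
    R = U @ Vt
    if np.linalg.det(R) < 0:              # keep a proper rotation, det = +1
        R = U @ np.diag([1.0, 1.0, -1.0]) @ Vt
    return R
```

The det check guards against the projection landing on a reflection rather than a rotation.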
thirdly, in a 12-dimensional target search space required by the transformation matrix (including the rotation matrix and the translation matrix), the method introduced in (3) is used for carrying out global search to obtain a global optimal solution of the transformation matrix, and the specific steps are as follows:
I. Randomly initialize the current position of each particle and take it as its best position. Each position is a 12-dimensional vector, and the best position represents the current best solution of the transformation matrix.
II. Calculate the mean best position of the swarm according to equation (7).
III. Move each particle according to the evolution equation (8) and calculate the fitness value of its new position.
IV. Compare each particle's fitness value with its individual best position and with the global best position, and update both according to equations (9) and (10).
V. Iterate steps II to IV to obtain the optimal rotation matrix and translation matrix. The flow of the calibration process is shown in fig. 3.
Claims (2)
1. A binocular system calibration method based on a quantum-behaved particle swarm, characterized in that: the method uses two or more cameras, arranged left and right, for shooting the same object, and a target on which the object is located;
the method comprises the following steps:
step one, establish the left and right camera coordinate systems; then take the left camera coordinate system as the global coordinate system, with an image plane coordinate system based on it;
step two, set a target coordinate system on the plane where the target is located; then determine the relation between the camera coordinate system and the target coordinate system from the relation between the image plane and the target plane;
step three, calculate the coordinates of the target feature points in the left and right camera coordinate systems;
step four, determine the optimal transformation relation between the left and right camera coordinate systems with the quantum-behaved particle swarm algorithm, according to the cost function and the parameters of steps one to three, completing the calibration;
in the fourth step, the particle swarm algorithm based on the quantum behaviors is a global convergence algorithm;
the method uses two or more cameras, arranged left and right, for shooting the same object, and a checkerboard target on which the object is located;
the method comprises the following steps:
step one, establish the left and right camera coordinate systems O_L^C-X_L^C Y_L^C Z_L^C and O_R^C-X_R^C Y_R^C Z_R^C, respectively;
step two, define the left camera coordinate system as the global coordinate system O-XYZ of the binocular vision measurement system, and define the coordinate system of the image plane π as o-uv;
step three, first establish the target world coordinate system O_W-X_W Y_W Z_W on the target plane; then decompose the object into a number of spatial feature points K whose coordinates in O_W-X_W Y_W Z_W are M = (X_W, Y_W, Z_W);
step four, project the spatial feature point K onto the image plane π; its projection in the image coordinate system o-uv is p = (u, v);
step five, according to the camera projection model:
s[u, v, 1]^T = [[f_x, 0, u_0], [0, f_y, v_0], [0, 0, 1]] [R | t'] [X_W, Y_W, Z_W, 1]^T,  (1)
where f_x is the normalized focal length along the image x-axis, f_y is the normalized focal length along the image y-axis, (u_0, v_0) are the image coordinates of the camera principal point, R is the rotation matrix from the target world coordinate system to the camera coordinate system, and t' is the translation vector from the target world coordinate system to the camera coordinate system;
step six, with the help of a target with a known size, taking the checkerboard angular points in the target as a plurality of characteristic points K;
seventhly, obtaining image point coordinates of the corner points by using a Harris corner point extraction method;
step eight, calibrate each camera according to formula (1) with Zhang Zhengyou's camera calibration method based on a 2D planar target, obtaining the camera intrinsic parameters, namely the principal point (u_0, v_0) and the normalized focal lengths f_x and f_y, together with the extrinsic parameters R and t' relative to the target world coordinate system, and from these obtain the three-dimensional coordinates of the checkerboard corner points in the left and right camera coordinate systems;
step nine, establish the objective function f = ||RT × P_l − P_r||, where RT is the rotation-translation matrix from the left camera to the right camera, P_l is the representation of the feature point K in the left camera coordinate system, and P_r is its representation in the right camera coordinate system; the rotation part of RT is over-parameterized, i.e. all 9 matrix entries are used as variables instead of a minimal parameterization of the rotation's 3 degrees of freedom, so that the transformation matrix, comprising the rotation matrix and the translation matrix, is searched over 12 variables;
and step ten, in the 12-dimensional target search space of the conversion matrix in the step nine, performing global search based on a quantum-behavior particle swarm algorithm to obtain a globally optimal solution of the conversion matrix.
2. The binocular system calibration method based on the quantum-behaved particle swarm of claim 1, characterized in that: in step ten, the global search with the quantum-behaved particle swarm algorithm treats each candidate solution as a particle in the sense of quantum physics and describes its motion by a wave function; in the quantum-behaved particle swarm model, the probability that a particle appears at a given position is determined by the probability density function given by the wave function;
the search specifically comprises the following steps:
step A, randomly initialize the current position of each particle and take it as that particle's best position; each position is a candidate solution of the transformation matrix;
step B, for the objective function f(X), the M candidate solutions in the N = 12-dimensional target search space form a population X(t) composed of M particles, the position of particle i being the N-dimensional vector X_i(t), i.e. X(t) = (X_1(t), X_2(t), ..., X_M(t)); compute the mean best position of the swarm according to formula (2):
C(t) = (1/M) Σ_{i=1}^{M} P_i(t),  (2)
where P_i(t) is the individual best position of particle i;
step C, update each particle position according to the evolution equation (3):
X_{i,j}(t+1) = p_{i,j}(t) ± α·|C_j(t) − X_{i,j}(t)|·ln[1/u_{i,j}(t)],  u_{i,j}(t) ~ U(0, 1),  (3)
where α is the contraction-expansion coefficient, j = 1, 2, ..., N, u_{i,j}(t) obeys the uniform distribution U(0, 1) on (0, 1), and p_{i,j}(t) = φ·P_{i,j}(t) + (1 − φ)·G_j(t), φ ~ U(0, 1), is the local attractor of particle i in dimension j; then compute the fitness value of each new position;
step D, compare each particle's fitness value with that of its individual best position from step B and update the individual best position according to formula (4):
P_i(t+1) = X_i(t+1) if f(X_i(t+1)) < f(P_i(t)), otherwise P_i(t),  (4)
where f(x) is the objective function;
step E, update the global best position of the population according to formula (5):
G(t+1) = P_g(t+1),  g = arg min_{1≤i≤M} f(P_i(t+1));  (5)
step F, iterate steps B to E until convergence; the global best position G(t) obtained from formula (5) yields the optimal rotation matrix and translation matrix.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910223867.6A CN109978957B (en) | 2019-03-22 | 2019-03-22 | Binocular system calibration method based on quantum behavior particle swarm |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109978957A CN109978957A (en) | 2019-07-05 |
CN109978957B true CN109978957B (en) | 2023-01-31 |
Family
ID=67080149
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910223867.6A Active CN109978957B (en) | 2019-03-22 | 2019-03-22 | Binocular system calibration method based on quantum behavior particle swarm |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109978957B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110598834A (en) * | 2019-09-19 | 2019-12-20 | 吉林大学 | Binocular vision detection system structure optimization method |
CN112381874B (en) * | 2020-11-04 | 2023-12-12 | 北京大华旺达科技有限公司 | Calibration method and device based on machine vision |
CN115100365B (en) * | 2022-08-25 | 2023-01-20 | 国网天津市电力公司高压分公司 | Camera optimal baseline acquisition method based on particle swarm optimization |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102008042356A1 (en) * | 2008-09-25 | 2010-04-08 | Carl Zeiss Smt Ag | Projection exposure system with optimized adjustment option |
-
2019
- 2019-03-22 CN CN201910223867.6A patent/CN109978957B/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101436073A (en) * | 2008-12-03 | 2009-05-20 | 江南大学 | Wheeled mobile robot trace tracking method based on quantum behavior particle cluster algorithm |
CN102509304A (en) * | 2011-11-24 | 2012-06-20 | 江南大学 | Intelligent optimization-based camera calibration method |
CN104182982A (en) * | 2014-08-27 | 2014-12-03 | 大连理工大学 | Overall optimizing method of calibration parameter of binocular stereo vision camera |
CN107662211A (en) * | 2017-10-16 | 2018-02-06 | 西北工业大学 | A kind of robot for space forecast Control Algorithm based on quanta particle swarm optimization |
CN109343345A (en) * | 2018-09-28 | 2019-02-15 | 江南大学 | Mechanical arm polynomial interopolation method for planning track based on QPSO algorithm |
Non-Patent Citations (1)
Title |
---|
"Calibration of a binocular stereo vision system based on the particle swarm algorithm"; Lian Xiaolei et al.; Computer Engineering and Applications; 2011-12-31; Vol. 47, No. 24; pp. 1-4 *
Also Published As
Publication number | Publication date |
---|---|
CN109978957A (en) | 2019-07-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107301654B (en) | Multi-sensor high-precision instant positioning and mapping method | |
CN109360240B (en) | Small unmanned aerial vehicle positioning method based on binocular vision | |
Lv et al. | LCCNet: LiDAR and camera self-calibration using cost volume network | |
CN110969668B (en) | Stereo calibration algorithm of long-focus binocular camera | |
CN105300316B (en) | Optical losses rapid extracting method based on grey scale centre of gravity method | |
CN109978957B (en) | Binocular system calibration method based on quantum behavior particle swarm | |
CN109323650B (en) | Unified method for measuring coordinate system by visual image sensor and light spot distance measuring sensor in measuring system | |
CN109146935B (en) | Point cloud registration method and device, electronic equipment and readable storage medium | |
CN109579695B (en) | Part measuring method based on heterogeneous stereoscopic vision | |
CN109341668B (en) | Multi-camera measuring method based on refraction projection model and light beam tracking method | |
CN109272537A (en) | A kind of panorama point cloud registration method based on structure light | |
CN111415379A (en) | Three-dimensional point cloud data registration method based on cuckoo optimization | |
CN113091608A (en) | Digital speckle correlation rapid implementation method based on grid extraction seed points | |
CN111998862A (en) | Dense binocular SLAM method based on BNN | |
Zheng et al. | Registration of optical images with LiDAR data and its accuracy assessment | |
Wei et al. | Optimization of 3-D pose measurement method based on binocular vision | |
CN108898629B (en) | Projection coding method for enhancing aerial luggage surface texture in three-dimensional modeling | |
CN112525106B (en) | Three-phase machine cooperative laser-based 3D detection method and device | |
CN117197333A (en) | Space target reconstruction and pose estimation method and system based on multi-view vision | |
CN111429571B (en) | Rapid stereo matching method based on spatio-temporal image information joint correlation | |
WO2018214179A1 (en) | Low-dimensional bundle adjustment calculation method and system | |
Xiong et al. | Automatic three-dimensional reconstruction based on four-view stereo vision using checkerboard pattern | |
CN112630469B (en) | Three-dimensional detection method based on structured light and multiple light field cameras | |
CN114299477A (en) | Vehicle vision positioning method, system, equipment and readable storage medium | |
Yang et al. | Improved calibration method of binocular vision measurement system for large hot forging |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |