CN113112545B - Handheld mobile printing device positioning method based on computer vision - Google Patents
- Publication number: CN113112545B
- Application number: CN202110403051.9A
- Authority
- CN
- China
- Prior art keywords
- coordinate system
- camera
- printing device
- printing
- pose
- Legal status: Active
Classifications
- G06T7/73 — Image analysis; determining position or orientation of objects or cameras using feature-based methods
- G01B11/002 — Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
- G06T2207/30204 — Indexing scheme for image analysis; subject of image: marker
- G06T2207/30208 — Indexing scheme for image analysis; subject of image: marker matrix
Abstract
The invention discloses a computer-vision-based positioning method for a handheld mobile printing device, comprising the following steps: (1) constructing a positioning scene; (2) establishing a world coordinate system; (3) determining a printing plane; (4) establishing a printing coordinate system; (5) calculating a coordinate-system transformation matrix; (6) calculating a pose transformation matrix; and (7) calculating the position and attitude of the printing device in real time. The invention achieves real-time positioning of the handheld mobile printing device with a single camera and a simple algorithm, at low cost.
Description
Technical Field
The invention belongs to the field of spatial positioning, and more specifically relates to a computer-vision-based positioning method for a handheld mobile printing device. A camera mounted on the printing device continuously acquires images, and real-time positioning of the printing device is achieved by processing those images.
Background
Unlike a traditional printer, a handheld mobile printer is small and highly portable but can print only a narrow strip-shaped pattern in a single pass. Because it has no positioning capability, printing a larger pattern requires stitching passes together by hand and experience, and the stitched result is often poor. If the handheld mobile printer could obtain high-frequency, high-precision positioning information, it could print the required image according to the position of the print head, enabling high-precision handheld printing without limits on print width or the need for a frame.
Existing positioning technologies fall mainly into wireless positioning and optical positioning. Wireless positioning includes Bluetooth, Wi-Fi, and ultra-wideband positioning; optical positioning includes infrared positioning and computer-vision positioning. Wireless positioning penetrates obstacles well but has low accuracy, whereas optical positioning is accurate but penetrates poorly. Computer-vision positioning can achieve high-precision positioning over a large range within the camera's field of view and is therefore well suited to positioning a handheld mobile printing device.
The existing computer-vision positioning method for a handheld mobile printing device performs three-dimensional reconstruction of markers using an external binocular or multi-view camera rig. Marker points are placed on the handheld device, several cameras photograph them from different viewing angles, the marker points are detected in the images, their depth is recovered by triangulation, and a three-dimensional model of the marker points is reconstructed, yielding the three-dimensional position and pose of the device. This multi-view reconstruction approach requires heavy computation for each image frame, so few frames can be processed in a fixed time and the positioning frequency is low; it also needs at least two cameras, so the positioning cost is high.
Disclosure of Invention
The aim of the invention is to provide a computer-vision-based positioning method for a handheld mobile printing device that overcomes the defects of the prior art, namely the low positioning frequency caused by heavy computation and the high cost caused by the number of cameras required.
The idea of the invention is as follows: using image information acquired by a monocular camera on the handheld mobile printing device, the pose of the camera in the world coordinate system is calculated from the projective relation between pixel points and points in the world coordinate system; the pose of the print head in the printing coordinate system is then obtained through a series of coordinate-system transformations. These steps are carried out for every acquired image frame, so the pose of the printing device is updated continuously and real-time positioning of the handheld mobile printing device is achieved.
The specific steps for realizing the purpose of the invention are as follows:
(1) Constructing a positioning scene:
placing the marking pattern in front of a camera on the handheld mobile printing device so that the marking pattern is within the field angle range of the camera;
(2) Establishing a world coordinate system:
taking any point on the marked pattern as an origin, taking a normal vector of a plane where the marked pattern is located as a Z axis, and establishing a right-hand three-dimensional rectangular coordinate system as a world coordinate system;
(3) Determining a printing plane:
(3a) If the printing area is rectangular, photographing the marking pattern with the camera of the printing device at three corner points of the rectangular printing area;
(3b) If the printing area is non-rectangular, photographing the marking pattern with the camera of the printing device at any three non-collinear points within the non-rectangular printing area;
(4) Establishing a printing coordinate system:
(4a) If the printing area is rectangular, taking as the origin O1 the one of the three corner points selected in step (3a) that is adjacent to the other two, and denoting the other two points A1 and B1, their positions being such that B1 lies to the left of the advancing direction of the directed line segment O1A1; taking the directed line segments O1A1 and O1B1 as the X axis and the Y axis respectively and the normal vector of the printing plane as the Z axis, a right-handed three-dimensional rectangular coordinate system is established as the printing coordinate system;
(4b) If the printing area is non-rectangular, taking any one of the three non-collinear points selected in step (3b) as the origin O2 and denoting the other two points A2 and B2, their positions being such that B2 lies to the left of the advancing direction of the directed line segment O2A2; taking the directed line segment O2A2 as the X axis, the normal vector of the printing plane as the Z axis, and the cross product of the Z axis and the X axis as the Y axis, a right-handed three-dimensional rectangular coordinate system is established as the printing coordinate system;
(5) Calculating a coordinate system transformation matrix;
(5a) Calculating the position of the optical center of the camera of the printing device in the world coordinate system at each shooting;
(5b) Calculating the Euclidean distance between each pair of camera optical-centre positions in the world coordinate system, and calculating from these distances and the height H from the camera optical centre to the printing plane the position of the camera optical centre of the printing device in the printing coordinate system at each shot; if the printing area is rectangular, denoting by W11, W12, W13 the 3×1 non-homogeneous coordinates of the camera optical centre in the world coordinate system when shooting at O1, A1, B1 respectively, the non-homogeneous coordinates of the optical centre in the printing coordinate system are correspondingly (0, 0, H), (||W12 − W11||, 0, H), (0, ||W13 − W11||, H); if the printing area is non-rectangular, denoting by W21, W22, W23 the 3×1 non-homogeneous coordinates of the camera optical centre in the world coordinate system when shooting at O2, A2, B2 respectively, the non-homogeneous coordinates of the optical centre in the printing coordinate system are correspondingly (0, 0, H), (||W22 − W21||, 0, H), (||W23 − W21||·cos θ, ||W23 − W21||·sin θ, H); where ||·|| denotes the Euclidean distance operation and θ denotes the angle between the vectors W22 − W21 and W23 − W21;
(5c) Calculating a coordinate system transformation matrix from a world coordinate system to a printing coordinate system according to the positions of the optical centers of the cameras under the two coordinate systems during each shooting;
(6) Calculating a pose transformation matrix:
(6a) Taking the end point of the nozzle of the printing device as the origin, the ink-ejection direction of the nozzle as the X axis, and the nozzle arrangement direction as the Y axis, establishing a right-handed three-dimensional rectangular coordinate system as the printing-device coordinate system;
(6b) Constructing a pose transformation matrix from the pose of the camera on the printing device to the pose of the printing device according to the Euler angle and the offset between the coordinate system of the camera and the coordinate system of the printing device;
(7) Calculating the pose of the printing device in real time:
(7a) Calculating the pose of a camera on the printing device in the world coordinate system by adopting the same method as the step (5 a);
(7b) Calculating the pose of the camera on the printing device in the printing coordinate system according to the coordinate system transformation matrix obtained in the step (5 c);
(7c) Using the same method as step (7b), calculating the position and attitude of the printing device in the printing coordinate system according to the pose transformation matrix obtained in step (6b).
Compared with the prior art, the invention has the following advantages:
First, the invention can position the handheld mobile printing device with only a single camera, overcoming the prior art's need for multiple cameras and giving the advantages of fewer cameras and lower camera cost.
Second, because the invention positions the device from the camera pose without running a complex reconstruction algorithm, it overcomes the low positioning frequency caused by the heavy computation of the prior art. Since processing each image frame requires less computation than in the prior art, the same chip can process more frames in a fixed time and thus deliver higher-frequency positioning information, giving the invention the advantage of a high positioning frequency.
Drawings
FIG. 1 is a flow chart of the present invention;
FIG. 2 is a schematic diagram of an embodiment of the present invention.
Detailed Description
The invention will now be further described with reference to the accompanying drawings and examples.
The specific implementation steps of the present invention are further described with reference to fig. 1.
Step 1, constructing a positioning scene.
The marking pattern is placed in front of the camera on the handheld mobile printing device so that the pattern lies within the camera's field of view. The marking pattern is any one of a checkerboard, a two-dimensional code, concentric rings, or a grey-scale pattern. The camera is one whose intrinsic parameter matrix is known and whose captured images have all undergone distortion removal; the intrinsic parameter matrix is

K = | f 0 m |
    | 0 f n |
    | 0 0 1 |

where K represents the intrinsic parameter matrix of the camera, f represents the lens focal length value of the camera, and m and n respectively represent the offsets of the camera principal point along the x axis and the y axis of the pixel coordinate system.
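As an illustrative aside (not part of the patent text), the pinhole projection behind this intrinsic matrix can be sketched in a few lines of NumPy; the focal-length and principal-point values below are assumed example numbers, not values from the patent:

```python
import numpy as np

# Example intrinsic matrix K = [[f, 0, m], [0, f, n], [0, 0, 1]];
# f (focal length in pixels) and principal point (m, n) are assumed values.
f, m, n = 800.0, 320.0, 240.0
K = np.array([[f, 0.0, m],
              [0.0, f, n],
              [0.0, 0.0, 1.0]])

def project(K, point_cam):
    """Project a 3-D point in camera coordinates to pixel coordinates (u, v)."""
    x = K @ point_cam        # 3x1 homogeneous pixel coordinate
    return x[:2] / x[2]      # perspective division by depth

uv = project(K, np.array([0.1, -0.05, 2.0]))   # a point 2 m in front of the camera
```

The perspective division by the depth component is the same relation exploited by the pose equation of step 5.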
And 2, establishing a world coordinate system.
Taking any point on the marking pattern as the origin and the normal vector of the plane containing the pattern as the Z axis, a right-handed three-dimensional rectangular coordinate system is established as the world coordinate system.
And step 3, determining a printing plane.
3.1) If the printing area is rectangular, the marking pattern is photographed with the camera of the printing device at three corner points of the rectangular printing area.
3.2) If the printing area is non-rectangular, the marking pattern is photographed with the camera of the printing device at any three non-collinear points within the non-rectangular printing area.
And 4, establishing a printing coordinate system.
If the printing area is rectangular, the corner points adjacent to other two points in the three corner points selected in the step 3.1) are taken as the original points O 1 The other two points are respectively marked as A 1 And B 1 ,A 1 And B 1 Is located at a position of B 1 In a directed line segmentTo the left of the advancing direction, willAndrespectively serving as an X axis and a Y axis, taking a normal vector of a printing plane as a Z axis, and establishing a right-hand three-dimensional rectangular coordinate system as a printing coordinate system;
if the printing area is non-rectangular, any one of the three non-collinear points selected in step 3.2) is taken as the origin O2, and the other two points are denoted A2 and B2, their positions being such that B2 lies to the left of the advancing direction of the directed line segment O2A2; the directed line segment O2A2 is taken as the X axis, the normal vector of the printing plane as the Z axis, and the cross product of the Z axis and the X axis as the Y axis, and a right-handed three-dimensional rectangular coordinate system is established as the printing coordinate system.
and 5, calculating a coordinate system transformation matrix.
5.1) The position of the camera optical centre of the printing device in the world coordinate system at each shot is calculated using

x_ij = K R_i [I | −C_i] X_ij

where x_ij denotes the 3×1 homogeneous pixel coordinate of the j-th marker point in the i-th shot, i = 1, 2, 3, 0 < j ≤ n, n ≥ 4; K denotes the 3×3 camera intrinsic parameter matrix; R_i denotes the 3×3 rotation matrix of the camera in the world coordinate system at the i-th shot; I denotes the 3×3 identity matrix; | denotes the matrix block-concatenation operation; C_i denotes the 3×1 non-homogeneous world coordinate of the camera optical centre at the i-th shot; and X_ij denotes the 4×1 homogeneous world coordinate of the j-th marker point in the i-th shot, whose Z component is 0 by the construction of the world coordinate system. The x_ij and X_ij correspond one to one; using no fewer than 4 such marker-point pairs obtained in each shot together with the orthogonality of the rotation matrix R_i, the camera optical-centre position C_i and the camera attitude R_i can be solved.
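A generic sketch of the per-shot solve in step 5.1: because the marker points lie on the Z = 0 plane of the world coordinate system, the camera pose can be recovered from a plane-to-image homography. This is a standard planar-pose recovery run on synthetic data, not necessarily the exact solver the patent intends; all numeric values are assumed.

```python
import numpy as np

def homography_dlt(world_xy, pix):
    """DLT estimate of the homography mapping Z=0 world points to pixels."""
    A = []
    for (X, Y), (u, v) in zip(world_xy, pix):
        A.append([-X, -Y, -1, 0, 0, 0, u * X, u * Y, u])
        A.append([0, 0, 0, -X, -Y, -1, v * X, v * Y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    return Vt[-1].reshape(3, 3)

def pose_from_homography(H, K):
    """Recover camera rotation R and optical centre C from H ~ K [r1 r2 t]."""
    M = np.linalg.inv(K) @ H
    lam = 1.0 / np.linalg.norm(M[:, 0])
    r1, r2, t = lam * M[:, 0], lam * M[:, 1], lam * M[:, 2]
    if t[2] < 0:                       # the marker must lie in front of the camera
        r1, r2, t = -r1, -r2, -t
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    U, _, Vt = np.linalg.svd(R)        # re-orthonormalise the rotation
    R = U @ Vt
    return R, -R.T @ t                 # optical centre C = -R^T t

# Synthetic check: camera at height 1.5 above the Z=0 marker plane, looking down.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
a = 0.1
Rz = np.array([[np.cos(a), -np.sin(a), 0.0],
               [np.sin(a),  np.cos(a), 0.0],
               [0.0, 0.0, 1.0]])
R_true = Rz @ np.diag([1.0, -1.0, -1.0])   # flip so the camera faces the plane
C_true = np.array([0.2, 0.3, 1.5])
t_true = -R_true @ C_true

world_xy = np.array([[x, y] for x in (0.0, 0.1, 0.2, 0.3)
                            for y in (0.0, 0.1, 0.2, 0.3)])
pts_w = np.hstack([world_xy, np.zeros((len(world_xy), 1))])
cam = (R_true @ pts_w.T).T + t_true        # world -> camera coordinates
pix = cam[:, :2] * K[0, 0] / cam[:, 2:3] + K[:2, 2]

H = homography_dlt(world_xy, pix)
R_est, C_est = pose_from_homography(H, K)
```

With noiseless correspondences the estimated pose matches the ground truth to numerical precision; with real detections, more marker points and a nonlinear refinement would normally follow.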
5.2) The Euclidean distance between each pair of camera optical-centre positions in the world coordinate system is calculated, and from these distances and the height H from the camera optical centre to the printing plane, the position of the camera optical centre of the printing device in the printing coordinate system at each shot is calculated. If the printing area is rectangular, denote by W11, W12, W13 the 3×1 non-homogeneous coordinates of the camera optical centre in the world coordinate system when shooting at O1, A1, B1 respectively; the non-homogeneous coordinates of the optical centre in the printing coordinate system are then correspondingly (0, 0, H), (||W12 − W11||, 0, H), (0, ||W13 − W11||, H). If the printing area is non-rectangular, denote by W21, W22, W23 the 3×1 non-homogeneous coordinates of the camera optical centre in the world coordinate system when shooting at O2, A2, B2 respectively; the non-homogeneous coordinates of the optical centre in the printing coordinate system are then correspondingly (0, 0, H), (||W22 − W21||, 0, H), (||W23 − W21||·cos θ, ||W23 − W21||·sin θ, H), where ||·|| denotes the Euclidean distance operation and θ denotes the angle between the vectors W22 − W21 and W23 − W21.
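Under the reconstruction above (an assumption where the original formulas were lost from the text), the print-frame placement of the three optical centres reduces to a planar triangle layout; a short sketch:

```python
import numpy as np

# Assumed reconstruction of step 5.2: all optical centres sit at height H above
# the printing plane, so their in-plane layout follows from pairwise distances.
def print_frame_positions(W1, W2, W3, H):
    """W1, W2, W3: world coordinates of the optical centre at the O, A, B shots."""
    a = np.linalg.norm(W2 - W1)                     # distance O-shot -> A-shot
    b = np.linalg.norm(W3 - W1)                     # distance O-shot -> B-shot
    cos_t = np.dot(W2 - W1, W3 - W1) / (a * b)      # angle theta at the O shot
    sin_t = np.sqrt(max(0.0, 1.0 - cos_t ** 2))
    P_O = np.array([0.0, 0.0, H])
    P_A = np.array([a, 0.0, H])
    P_B = np.array([b * cos_t, b * sin_t, H])       # (0, b, H) if axes orthogonal
    return P_O, P_A, P_B

P_O, P_A, P_B = print_frame_positions(np.array([0.0, 0.0, 1.0]),
                                      np.array([3.0, 0.0, 1.0]),
                                      np.array([0.0, 4.0, 1.0]), 0.5)
```

For the rectangular case the angle is 90 degrees, so the third position collapses to (0, ||W13 − W11||, H) as in the text.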
5.3) The coordinate-system transformation matrix from the world coordinate system to the printing coordinate system is calculated from the optical-centre positions in the two coordinate systems at each shot using

X_p = T_wp X_w

where X_p denotes the 4×1 homogeneous coordinate vector of the camera optical centre in the printing coordinate system; T_wp denotes the 4×4 Euclidean transformation matrix from the world coordinate system to the printing coordinate system, which can be solved using at least three point pairs; and X_w denotes the 4×1 homogeneous coordinate vector of the camera optical centre in the world coordinate system.
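Solving T_wp from three point pairs in step 5.3 is a standard least-squares rigid-transform estimation; a Kabsch-style sketch with synthetic data (illustrative, not the patent's verbatim procedure):

```python
import numpy as np

def euclidean_transform(P_w, P_p):
    """Least-squares rigid transform T (4x4) with P_p ~ R @ P_w + t (Kabsch)."""
    cw, cp = P_w.mean(axis=0), P_p.mean(axis=0)
    Hm = (P_w - cw).T @ (P_p - cp)                      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(Hm)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # guard against reflection
    R = Vt.T @ D @ U.T
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = cp - R @ cw
    return T

# Synthetic check with three non-collinear point pairs, as in step 5.3.
th = 0.3
R_true = np.array([[np.cos(th), -np.sin(th), 0.0],
                   [np.sin(th),  np.cos(th), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([1.0, 2.0, 3.0])
P_w = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
P_p = (R_true @ P_w.T).T + t_true
T_wp = euclidean_transform(P_w, P_p)
```

Three non-collinear pairs determine the Euclidean transform uniquely; the determinant guard keeps the rotation proper when the points are coplanar.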
And 6, calculating a pose transformation matrix.
A three-dimensional rectangular coordinate system is established as the printing-device coordinate system by taking the end point of the nozzle of the printing device as the origin, the ink-ejection direction of the nozzle as the X axis, and the nozzle arrangement direction as the Y axis.
From the Euler angles and the offset between the camera coordinate system and the printing-device coordinate system, the pose transformation matrix from the pose of the camera on the printing device to the pose of the printing device is constructed using the formula

T_4×4 = | R_3×3  t_3×1 |
        | 0_1×3    1   |

where T_4×4 denotes the 4×4 Euclidean transformation matrix; R_3×3 denotes the 3×3 rotation matrix, which can be derived from the Euler angles between the camera coordinate system and the printing-device coordinate system; and t_3×1 denotes the 3×1 offset (translation) vector between the camera coordinate system and the printing-device coordinate system.
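The construction of step 6 can be sketched as follows; the Z-Y-X Euler convention and all numeric values are assumptions, since the patent fixes neither:

```python
import numpy as np

def pose_transform(yaw, pitch, roll, offset):
    """4x4 Euclidean transform T = [[R, t], [0, 1]] from Z-Y-X Euler angles
    (an assumed convention) and a 3-vector offset between the two frames."""
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cz, -sz, 0.0], [sz, cz, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cx, -sx], [0.0, sx, cx]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx          # R_3x3 from the Euler angles
    T[:3, 3] = offset                 # t_3x1, the camera-to-device offset
    return T

# Example: a 90-degree yaw plus a small assumed offset between the two frames.
T_cp = pose_transform(np.pi / 2, 0.0, 0.0, np.array([0.01, 0.02, 0.0]))
p = T_cp @ np.array([1.0, 0.0, 0.0, 1.0])   # transform a homogeneous point
```

Because the camera-to-print-head relationship is fixed at manufacture, this matrix is computed once and reused for every frame.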
And 7, calculating the pose of the printing device in real time.
7.1 Adopting the same method as the step 5.1), calculating the pose of the camera on the printing device in a world coordinate system;
7.2) Using the coordinate-system transformation matrix, the pose of the camera on the printing device in the printing coordinate system is calculated with the position formula X_b = T X_a and the attitude formula R_b = R R_a, where X_b denotes the 4×1 homogeneous coordinate vector of the position after transformation; T denotes the 4×4 Euclidean transformation matrix; X_a denotes the 4×1 homogeneous coordinate vector of the position before transformation; R_b denotes the 3×3 matrix representing the attitude after transformation; R denotes the 3×3 rotation transformation matrix, namely the upper-left 3×3 block of T; and R_a denotes the 3×3 matrix representing the attitude before transformation. The pose is thus transformed using the transformation matrix T and its rotation block R.
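Step 7.2 then reduces to two matrix products; a minimal numeric illustration (all values assumed):

```python
import numpy as np

# Apply the 4x4 transform T to a homogeneous camera position X_a and apply its
# rotation block R to the camera attitude R_a, as in step 7.2.
T = np.eye(4)
T[:3, 3] = [0.5, -0.2, 0.0]             # a pure translation, for simplicity
X_a = np.array([1.0, 2.0, 0.3, 1.0])    # camera position before transformation
R_a = np.eye(3)                         # camera attitude before transformation
X_b = T @ X_a                           # position formula X_b = T X_a
R_b = T[:3, :3] @ R_a                   # attitude formula R_b = R R_a
```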
7.3) Using the same method as step 7.2), the position and attitude of the printing device in the printing coordinate system are calculated from the pose transformation matrix.
The implementation of the present invention is further described with reference to fig. 2 in conjunction with the embodiment of the present invention.
Fig. 2 (a) is a schematic diagram of a world coordinate system including a marker pattern constructed in step 2 of the present invention, wherein 1 denotes the marker pattern, 2 denotes the world coordinate system, and 3 denotes a marker point on the marker pattern.
Fig. 2 (b) is a schematic diagram of a printing coordinate system including a handheld mobile printing device constructed in step 4 of the present invention, where 4 denotes a printing area, 5 denotes a printing coordinate system, 6 denotes the handheld mobile printing device, 7 denotes a camera on the handheld mobile printing device, and 8 denotes a field angle of the camera on the handheld mobile printing device.
The marker is placed in front of the camera on the handheld mobile printing device with the marker pattern within the camera's field of view. The marker need not be rectangular; a rectangular marker simply makes it easier to compute the marker-point positions and is used here as an example. Taking the lower-left corner of the rectangular marking pattern as the origin, its two sides as the X axis and Y axis respectively, and the normal vector of the marker as the Z axis, a right-handed three-dimensional rectangular coordinate system is established as the world coordinate system. Since the positions of the marker points on the marking pattern are known, their positions in the world coordinate system can be calculated.
After the camera photographs the marker, the position of each marker point in the image and its position in the world coordinate system form a point pair, and the position of the camera optical centre in the world coordinate system can be calculated from no fewer than 4 such pairs. If the printing area is rectangular, and thus has the mutually perpendicular sides needed for coordinate axes, images of the marker are taken at the rectangle's corner points O1, A1, B1; taking O1 as the origin, the directed line segments O1A1 and O1B1 as the X axis and Y axis respectively, and the normal vector of the printing plane as the Z axis, the printing coordinate system is established. Then the Euclidean distances between the optical-centre positions of the shots are calculated, and from these distances and the height of the optical centre above the printing plane, the position of the optical centre in the printing coordinate system at each shot is obtained. If the printing area is non-rectangular, images of the marker are taken at O2, A2, B2; taking the normal vector of the printing plane as the Z axis, the directed line segment O2A2 as the X axis, and the cross product of the Z axis and the X axis as the Y axis, the printing coordinate system is established, and the optical-centre positions in the printing coordinate system are obtained in the same way. After the three shots, the optical-centre positions in the two coordinate systems form three point pairs, from which the coordinate-system transformation matrix can be calculated.
Applying the coordinate-system transformation matrix to the camera pose in the world coordinate system yields the camera pose in the printing coordinate system. Since the positional relationship between the camera and the printing device is fixed at manufacture, the pose transformation matrix can be determined from the Euler angles and the offset between the camera and the printing device, and applying it to the camera pose yields the pose of the printing device.
Claims (7)
1. A computer-vision-based positioning method for a handheld mobile printing device, characterised in that three coordinate systems are established respectively, a camera on the device photographs a marking pattern, the camera pose is calculated, and the pose of the printing device is calculated using a coordinate-system transformation matrix and a pose transformation matrix, the method comprising the following steps:
(1) Constructing a positioning scene:
placing the marking pattern in front of a camera on the handheld mobile printing device so that the marking pattern is within the field angle range of the camera;
(2) Establishing a world coordinate system:
taking any point on the marked pattern as an origin, taking a normal vector of a plane where the marked pattern is located as a Z axis, and establishing a right-hand three-dimensional rectangular coordinate system as a world coordinate system;
(3) Determining a printing plane:
(3a) If the printing area is rectangular, photographing the marking pattern with the camera of the printing device at three corner points of the rectangular printing area;
(3b) If the printing area is non-rectangular, photographing the marking pattern with the camera of the printing device at any three non-collinear points within the non-rectangular printing area;
(4) Establishing a printing coordinate system:
(4a) If the printing area is rectangular, taking as the origin O1 the one of the three corner points selected in step (3a) that is adjacent to the other two, and denoting the other two points A1 and B1, their positions being such that B1 lies to the left of the advancing direction of the directed line segment O1A1; taking the directed line segments O1A1 and O1B1 as the X axis and the Y axis respectively and the normal vector of the printing plane as the Z axis, a right-handed three-dimensional rectangular coordinate system is established as the printing coordinate system;
(4b) If the printing area is non-rectangular, taking any one of the three non-collinear points selected in step (3b) as the origin O2 and denoting the other two points A2 and B2, their positions being such that B2 lies to the left of the advancing direction of the directed line segment O2A2; taking the directed line segment O2A2 as the X axis, the normal vector of the printing plane as the Z axis, and the cross product of the Z axis and the X axis as the Y axis, a right-handed three-dimensional rectangular coordinate system is established as the printing coordinate system;
(5) Calculating a coordinate system transformation matrix;
(5a) Calculating the position of the camera optical center of the printing device in a world coordinate system during each shooting;
(5b) Calculating the Euclidean distance between each pair of camera optical-centre positions in the world coordinate system, and calculating from these distances and the height H from the camera optical centre to the printing plane the position of the camera optical centre of the printing device in the printing coordinate system at each shot; if the printing area is rectangular, denoting by W11, W12, W13 the 3×1 non-homogeneous coordinates of the camera optical centre in the world coordinate system when shooting at O1, A1, B1 respectively, the non-homogeneous coordinates of the optical centre in the printing coordinate system are correspondingly (0, 0, H), (||W12 − W11||, 0, H), (0, ||W13 − W11||, H); if the printing area is non-rectangular, denoting by W21, W22, W23 the 3×1 non-homogeneous coordinates of the camera optical centre in the world coordinate system when shooting at O2, A2, B2 respectively, the non-homogeneous coordinates of the optical centre in the printing coordinate system are correspondingly (0, 0, H), (||W22 − W21||, 0, H), (||W23 − W21||·cos θ, ||W23 − W21||·sin θ, H); where ||·|| denotes the Euclidean distance operation and θ denotes the angle between the vectors W22 − W21 and W23 − W21;
(5c) Calculating a coordinate system transformation matrix from a world coordinate system to a printing coordinate system according to the positions of the optical centers of the cameras under the two coordinate systems during each shooting;
(6) Calculating a pose transformation matrix:
(6a) Taking the end point of a nozzle of the printing device as an original point, taking the ink outlet direction of the nozzle as an X axis, taking the nozzle arrangement direction as a Y axis, and establishing a right-hand three-dimensional rectangular coordinate system as a coordinate system of the printing device;
(6b) Constructing a pose transformation matrix from the pose of the camera on the printing device to the pose of the printing device according to the Euler angle and the offset between the coordinate system of the camera and the coordinate system of the printing device;
(7) Calculating the pose of the printing device in real time:
(7a) Calculating the pose of a camera on the printing device in the world coordinate system by adopting the same method as the step (5 a);
(7b) Calculating the pose of the camera on the printing device in the printing coordinate system according to the coordinate system transformation matrix obtained in the step (5 c);
(7c) Using the same method as step (7b), calculating the position and attitude of the printing device in the printing coordinate system according to the pose transformation matrix obtained in step (6b).
2. The computer vision based handheld mobile printing device positioning method of claim 1, wherein: the marking pattern in the step (1) refers to any one of a checkerboard, a two-dimensional code, concentric rings and a gray pattern.
3. The computer vision based handheld mobile printing device positioning method of claim 1, wherein: the camera in step (1) is a camera whose intrinsic parameter matrix is known and whose captured images have all undergone distortion removal, the intrinsic parameter matrix being

K = | f 0 m |
    | 0 f n |
    | 0 0 1 |
wherein K represents the intrinsic parameter matrix of the camera, f represents the lens focal length value of the camera, and m and n respectively represent the offsets of the camera principal point along the X axis and the Y axis of the pixel coordinate system.
4. The computer vision based handheld mobile printing device positioning method of claim 1, wherein: the position of the camera optical centre of the printing device in the world coordinate system at each shot in step (5a) is obtained from the formula x_ij = K R_i [I | −C_i] X_ij, where x_ij denotes the 3×1 homogeneous pixel coordinate of the j-th marker point in the i-th shot, i = 1, 2, 3, 0 < j ≤ n, n ≥ 4; K denotes the 3×3 camera intrinsic parameter matrix; R_i denotes the 3×3 rotation matrix of the camera in the world coordinate system at the i-th shot; I denotes the 3×3 identity matrix; | denotes the matrix block-concatenation operation; C_i denotes the 3×1 non-homogeneous world coordinate of the camera optical centre at the i-th shot; X_ij denotes the 4×1 homogeneous world coordinate of the j-th marker point in the i-th shot, whose Z component is 0; the x_ij and X_ij correspond one to one, and using no fewer than 4 such marker-point pairs obtained in each shot together with the orthogonality of the rotation matrix R_i, the camera optical-centre position C_i and the camera attitude R_i can be solved.
5. The computer vision based handheld mobile printing device positioning method of claim 1, wherein: calculating the coordinate system transformation matrix from the world coordinate system to the printing coordinate system in step (5c) uses the formula X_p = T_wp X_w, wherein X_p represents the 4×1-dimensional homogeneous coordinate vector of the camera optical center in the printing coordinate system; T_wp represents the 4×4-dimensional Euclidean transformation matrix from the world coordinate system to the printing coordinate system, which can be solved using at least three point pairs; and X_w represents the 4×1-dimensional homogeneous coordinate vector of the camera optical center in the world coordinate system.
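Solving T_wp from point pairs, as claim 5 states, is a rigid (Euclidean) alignment problem. A sketch using the standard Kabsch least-squares method; the patent does not prescribe a solver, and the four synthetic point pairs here (one more than the minimum three) are purely illustrative:

```python
import numpy as np

def rigid_transform(src, dst):
    # Least-squares Euclidean fit (Kabsch): dst_i ~ R @ src_i + t.
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Synthetic point pairs: rotate 90 degrees about Z and translate.
R_true = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
t_true = np.array([1.0, 2.0, 3.0])
src = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
dst = src @ R_true.T + t_true
T_wp = rigid_transform(src, dst)
```

The rotation block and translation column of the returned T match the ground-truth motion, so applying T to a homogeneous world-frame point yields its printing-frame coordinates.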
6. The computer vision based handheld mobile printing device positioning method of claim 1, wherein: constructing the pose transformation matrix from the pose of the camera on the printing device to the pose of the printing device in step (6b) uses the formula

T_4×4 = | R_3×3  t_3×1 |
        | 0      1     |

wherein T_4×4 represents the 4×4-dimensional Euclidean transformation matrix; R_3×3 represents the 3×3-dimensional rotation matrix, which can be obtained from the Euler angles between the camera coordinate system and the printing device coordinate system; and t_3×1 represents the 3×1-dimensional offset vector between the camera coordinate system and the printing device coordinate system.
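The block matrix of claim 6 can be assembled from the Euler angles and the offset vector. A sketch assuming a Z-Y-X Euler convention, which the claim does not fix; the 5 cm offset is an invented example of a camera-to-print-head displacement:

```python
import numpy as np

def pose_transform(roll, pitch, yaw, offset):
    # T = [[R, t], [0, 1]] per claim 6; R from Euler angles (Z-Y-X assumed).
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx
    T[:3, 3] = offset
    return T

# Illustrative camera-to-device offset of 5 cm along x, no rotation.
T_cd = pose_transform(0.0, 0.0, 0.0, [0.05, 0.0, 0.0])
```

Because the camera is rigidly mounted on the printing device, this matrix is computed once and reused at every shot.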
7. The computer vision based handheld mobile printing device positioning method of claim 1, wherein: calculating the pose of the camera on the printing device in the printing coordinate system in step (7b) uses the position formula X_b = T X_a and the attitude formula R_b = R R_a, wherein X_b represents the 4×1-dimensional homogeneous coordinate vector of the position after transformation; T represents the 4×4-dimensional Euclidean transformation matrix; X_a represents the 4×1-dimensional homogeneous coordinate vector of the position before transformation; R_b represents the 3×3-dimensional matrix of the attitude after transformation; R represents the 3×3-dimensional rotation transformation matrix, which is the upper-left 3×3 block of the T matrix; and R_a represents the 3×3-dimensional matrix of the attitude before transformation, which is transformed according to the transformation matrix T and the R matrix therein.
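Claim 7's two update formulas amount to one homogeneous matrix product and one rotation composition. A minimal sketch with an illustrative translation-only transform; the input pose values are invented:

```python
import numpy as np

def apply_pose(T, X_a, R_a):
    # Position: X_b = T X_a (4x1 homogeneous); attitude: R_b = R R_a,
    # where R is the upper-left 3x3 block of T (claim 7).
    R = T[:3, :3]
    return T @ X_a, R @ R_a

# Illustrative transform: pure translation by (1, 2, 3).
T = np.eye(4)
T[:3, 3] = [1.0, 2.0, 3.0]
X_b, R_b = apply_pose(T, np.array([0.0, 0.0, 0.0, 1.0]), np.eye(3))
```

Running this at each frame converts every newly measured camera pose into the printing coordinate system in real time.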
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110403051.9A CN113112545B (en) | 2021-04-15 | 2021-04-15 | Handheld mobile printing device positioning method based on computer vision |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113112545A CN113112545A (en) | 2021-07-13 |
CN113112545B true CN113112545B (en) | 2023-03-21 |
Family
ID=76716953
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110403051.9A Active CN113112545B (en) | 2021-04-15 | 2021-04-15 | Handheld mobile printing device positioning method based on computer vision |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113112545B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114606541B (en) * | 2022-03-15 | 2023-03-24 | 南通大学 | Two-dimensional structure micro-nano scale rapid printing system and method based on glass microprobe |
CN116080290A (en) * | 2022-12-29 | 2023-05-09 | 上海魅奈儿科技有限公司 | Three-dimensional high-precision fixed-point printing method and device |
Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015096806A1 (en) * | 2013-12-29 | 2015-07-02 | 刘进 | Attitude determination, panoramic image generation and target recognition methods for intelligent machine |
CN106101689A (en) * | 2016-06-13 | 2016-11-09 | 西安电子科技大学 | Utilize the method that mobile phone monocular cam carries out augmented reality to virtual reality glasses |
CN107449402A (en) * | 2017-07-31 | 2017-12-08 | 清华大学深圳研究生院 | A kind of measuring method of the relative pose of noncooperative target |
CN107977996A (en) * | 2017-10-20 | 2018-05-01 | 西安电子科技大学 | Space target positioning method based on target calibrating and positioning model |
CN109685913A (en) * | 2018-12-21 | 2019-04-26 | 西安电子科技大学 | Augmented reality implementation method based on computer vision positioning |
CN109895392A (en) * | 2019-02-15 | 2019-06-18 | 上海幂方电子科技有限公司 | A method of operating coordinates are marked and are accurately positioned in equipment working region |
CN110281664A (en) * | 2019-07-11 | 2019-09-27 | 森大(深圳)技术有限公司 | Print media positioning printing method, device, equipment, medium and flat-panel printer |
CN110686650A (en) * | 2019-10-29 | 2020-01-14 | 北京航空航天大学 | Monocular vision pose measuring method based on point characteristics |
CN111590899A (en) * | 2020-04-27 | 2020-08-28 | 蒋青 | Vision auxiliary positioning device for mechanical arm 3D printing and positioning method thereof |
CN111673735A (en) * | 2020-04-28 | 2020-09-18 | 平安科技(深圳)有限公司 | Mechanical arm control method and device based on monocular vision positioning |
CN111791589A (en) * | 2020-09-10 | 2020-10-20 | 季华实验室 | Positioning detection method and device based on ink-jet printer, electronic equipment and medium |
CN111862238A (en) * | 2020-07-23 | 2020-10-30 | 中国民航大学 | Full-space monocular light pen type vision measurement method |
CN111968128A (en) * | 2020-07-10 | 2020-11-20 | 北京航空航天大学 | Unmanned aerial vehicle visual attitude and position resolving method based on image markers |
CN112066879A (en) * | 2020-09-11 | 2020-12-11 | 哈尔滨工业大学 | Air floatation motion simulator pose measuring device and method based on computer vision |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130259403A1 (en) * | 2012-04-03 | 2013-10-03 | Oluwatosin Osinusi | Flexible easy-to-use system and method of automatically inserting a photorealistic view of a two or three dimensional object into an image using a cd,dvd or blu-ray disc |
US9508146B2 (en) * | 2012-10-31 | 2016-11-29 | The Boeing Company | Automated frame of reference calibration for augmented reality |
WO2019090487A1 (en) * | 2017-11-07 | 2019-05-16 | 大连理工大学 | Highly dynamic wide-range any-contour-error monocular six-dimensional measurement method for numerical control machine tool |
2021-04-15: application CN202110403051.9A filed in China; granted as patent CN113112545B (status: Active)
Non-Patent Citations (4)
Title |
---|
Accuracy Analysis of Alignment Methods based on Reference Features for Robot-Based Optical Inspection Systems;Philipp Bauer等;《Procedia CIRP》;20201231;第93卷;1115-1120 * |
Inertial measurement unit aided extrinsic parameters calibration for stereo vision systems;Weiwu Feng等;《Optics and Lasers in Engineering》;20201130;第134卷;1-12 * |
Research on Precise Preoperative Planning Technology Based on Augmented Reality; Zhang Pin; China Master's Theses Full-text Database, Medicine and Health Sciences; 2020-08-15; vol. 2020, no. 8; E066-200 *
Research on High-precision Target Tracking and Registration for Natural Features in Augmented Reality; Zhang Hua et al.; Journal of Changchun Institute of Technology (Natural Science Edition); 2021-03-31; vol. 22, no. 1; 69-73 *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11039121B2 (en) | Calibration apparatus, chart for calibration, chart pattern generation apparatus, and calibration method | |
KR101666959B1 (en) | Image processing apparatus having a function for automatically correcting image acquired from the camera and method therefor | |
CN107886547B (en) | Fisheye camera calibration method and system | |
CN113112545B (en) | Handheld mobile printing device positioning method based on computer vision | |
CN110163912B (en) | Two-dimensional code pose calibration method, device and system | |
CN110956660B (en) | Positioning method, robot, and computer storage medium | |
CN106485753B (en) | The method and apparatus of camera calibration for pilotless automobile | |
CN107578450B (en) | Method and system for calibrating assembly error of panoramic camera | |
JP7218435B2 (en) | CALIBRATION DEVICE, CALIBRATION CHART AND CALIBRATION METHOD | |
CN113329179B (en) | Shooting alignment method, device, equipment and storage medium | |
CN111524195B (en) | Camera calibration method in positioning of cutting head of heading machine | |
CN112541973B (en) | Virtual-real superposition method and system | |
CN107977996A (en) | Space target positioning method based on target calibrating and positioning model | |
CN110838164A (en) | Monocular image three-dimensional reconstruction method, system and device based on object point depth | |
WO2022040983A1 (en) | Real-time registration method based on projection marking of cad model and machine vision | |
CN102914295A (en) | Computer vision cube calibration based three-dimensional measurement method | |
CN111768449A (en) | Object grabbing method combining binocular vision with deep learning | |
CA3233222A1 (en) | Method, apparatus and device for photogrammetry, and storage medium | |
CN114998447A (en) | Multi-view vision calibration method and system | |
CN112017302A (en) | Real-time registration method of projection mark and machine vision based on CAD model | |
CN109949249B (en) | Cylindrical image correction method and system | |
CN112950528A (en) | Certificate posture determining method, model training method, device, server and medium | |
CN111741223B (en) | Panoramic image shooting method, device and system | |
CN112288824B (en) | Device and method for calibrating tele camera based on real scene | |
KR101673144B1 (en) | Stereoscopic image registration method based on a partial linear method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||