CN113112545A - Handheld mobile printing device positioning method based on computer vision - Google Patents

Handheld mobile printing device positioning method based on computer vision

Info

Publication number
CN113112545A
Authority
CN
China
Prior art keywords
coordinate system
camera
printing device
printing
matrix
Prior art date
Legal status
Granted
Application number
CN202110403051.9A
Other languages
Chinese (zh)
Other versions
CN113112545B (en)
Inventor
姜光
朱家辉
陈浩
Current Assignee
Xidian University
Original Assignee
Xidian University
Priority date
Filing date
Publication date
Application filed by Xidian University filed Critical Xidian University
Priority to CN202110403051.9A
Publication of CN113112545A
Application granted
Publication of CN113112545B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/002 Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30204 Marker
    • G06T2207/30208 Marker matrix

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a computer-vision-based positioning method for a handheld mobile printing device, implemented as follows: (1) construct the positioning scene; (2) establish a world coordinate system; (3) determine the printing plane; (4) establish a printing coordinate system; (5) calculate the coordinate system transformation matrix; (6) calculate the pose transformation matrix; (7) calculate the position and posture of the printing device in real time. The invention achieves real-time positioning of the handheld mobile printing device with a single camera and a simple algorithm, at low cost.

Description

Handheld mobile printing device positioning method based on computer vision
Technical Field
The invention belongs to the field of spatial positioning, and more specifically relates to a computer-vision-based positioning method for a handheld mobile printing device within that field. The invention is used with a handheld mobile printing device: a camera on the printing device continuously acquires images, and real-time positioning of the printing device is achieved by processing those images.
Background
Unlike a traditional printer, a handheld mobile printer is small and highly portable and can print long, strip-shaped patterns. Because it has no positioning function, printing a larger-format pattern can only be done by manually stitching passes from experience, and the stitched result is often poor. If the handheld mobile printer could obtain high-frequency, high-precision positioning information, the required image could be printed according to the position of the print head, enabling high-precision handheld mobile printing without being limited by format size or by a frame.
Existing positioning technologies fall mainly into wireless positioning and optical positioning. Wireless positioning technologies include Bluetooth positioning, WiFi positioning, ultra-wideband positioning, and the like. Optical positioning technologies include infrared positioning and computer vision positioning. Wireless positioning has better penetration but lower accuracy, while optical positioning has higher accuracy but poorer penetration. Computer vision positioning can achieve high-precision positioning over a large range within the camera's field of view and is suitable for positioning a handheld mobile printing device.
The existing computer vision positioning method for a handheld mobile printing device performs three-dimensional reconstruction of markers with an external binocular or multi-view camera setup. Marker points are placed on the handheld mobile printing device, several cameras shoot the marker points from different viewing angles, the marker points are detected in the images, their depth is then obtained by the triangulation principle, and a three-dimensional model of the marker points is reconstructed from the depth information, thereby achieving three-dimensional positioning of the handheld mobile printing device and obtaining its pose. This multi-view reconstruction approach requires a large amount of computation per image frame, so fewer frames can be processed in a given time and the positioning frequency is often low; moreover, at least two cameras are needed, so the positioning cost is high.
Disclosure of Invention
The aim of the invention is to provide a computer-vision-based positioning method for a handheld mobile printing device that overcomes the defects of the prior art, solving the problems of low positioning frequency and high cost caused by the heavy computation and the number of cameras required by existing positioning techniques.
The idea of the invention is as follows: using the image information acquired by the monocular camera on the handheld mobile printing device, the pose of the camera in the world coordinate system is calculated from the projection geometry relating pixel points to points in the world coordinate system; through a series of coordinate system transformations, the pose of the print head of the handheld mobile printing device in the printing coordinate system is obtained. These steps are carried out for every acquired image frame, so the pose of the printing device is computed continuously and real-time positioning of the handheld mobile printing device is achieved.
The specific steps for realizing the purpose of the invention are as follows:
(1) constructing a positioning scene:
placing the marking pattern in front of a camera on the handheld mobile printing device so that the marking pattern is within the field angle range of the camera;
(2) establishing a world coordinate system:
taking any point on the marked pattern as an origin, taking a normal vector of a plane where the marked pattern is located as a Z axis, and establishing a right-hand three-dimensional rectangular coordinate system as a world coordinate system;
(3) determining a printing plane:
(3a) if the printing area is rectangular, shooting the marking patterns at three corner points of the rectangular printing area by using a camera of the printing device;
(3b) if the printing area is non-rectangular, shooting the marking patterns at any non-collinear three points in the non-rectangular printing area by using a camera of the printing device;
(4) establishing a printing coordinate system:
(4a) if the printing area is rectangular, take the corner point adjacent to the other two of the three corner points selected in step (3a) as the origin O1, and denote the other two points A1 and B1, where B1 lies to the left of the advancing direction of the directed line segment O1A1; take the directions of O1A1 and O1B1 as the X axis and the Y axis respectively and the normal vector of the printing plane as the Z axis, and establish a right-handed three-dimensional rectangular coordinate system as the printing coordinate system;
(4b) if the printing area is non-rectangular, take any one of the three non-collinear points selected in step (3b) as the origin O2, and denote the other two points A2 and B2, where B2 lies to the left of the advancing direction of the directed line segment O2A2; take the normal vector of the printing plane as the Z axis, the direction of O2A2 as the X axis, and the cross product of the X axis and the Z axis as the Y axis, and establish a right-handed three-dimensional rectangular coordinate system as the printing coordinate system;
(5) calculating a coordinate system transformation matrix;
(5a) calculating the position of the optical center of the camera of the printing device in the world coordinate system at each shooting;
(5b) calculate the Euclidean distance between each pair of positions of the camera optical center in the world coordinate system, and from these distances and the height H from the camera optical center to the printing plane, calculate the position of the camera optical center of the printing device in the printing coordinate system at each shot. If the printing area is rectangular, let the 3×1 non-homogeneous coordinates of the camera optical center in the world coordinate system when shooting at O1, A1, B1 be W11, W12, W13 respectively; the non-homogeneous coordinates of the camera optical center in the printing coordinate system are then

$$(0,\ 0,\ H)^{T},\qquad (\|W_{12}-W_{11}\|,\ 0,\ H)^{T},\qquad (0,\ \|W_{13}-W_{11}\|,\ H)^{T}.$$

If the printing area is non-rectangular, let the 3×1 non-homogeneous coordinates of the camera optical center in the world coordinate system when shooting at O2, A2, B2 be W21, W22, W23 respectively; the non-homogeneous coordinates of the camera optical center in the printing coordinate system are then

$$(0,\ 0,\ H)^{T},\qquad (\|W_{22}-W_{21}\|,\ 0,\ H)^{T},\qquad (\|W_{23}-W_{21}\|\cos\theta,\ \|W_{23}-W_{21}\|\sin\theta,\ H)^{T},$$

where ‖·‖ denotes the Euclidean distance operation and θ denotes the angle between the directed line segments O2A2 and O2B2;
(5c) calculating a coordinate system transformation matrix from a world coordinate system to a printing coordinate system according to the positions of the optical centers of the cameras under the two coordinate systems during each shooting;
(6) calculating a pose transformation matrix:
(6a) taking the end point of a nozzle of the printing device as the origin, the ink ejection direction of the nozzle as the X axis, and the nozzle arrangement direction as the Y axis, and establishing a three-dimensional rectangular coordinate system as the printing device coordinate system;
(6b) constructing a pose transformation matrix from the pose of the camera on the printing device to the pose of the printing device according to the Euler angle and the offset between the coordinate system of the camera and the coordinate system of the printing device;
(7) calculating the pose of the printing device in real time:
(7a) calculating the pose of a camera on the printing device in the world coordinate system by adopting the same method as the step (5 a);
(7b) calculating the pose of the camera on the printing device in the printing coordinate system according to the coordinate system transformation matrix obtained in the step (5 c);
(7c) calculating the position and posture of the printing device in the printing coordinate system from the pose transformation matrix obtained in step (6b), using the same method as step (7b).
Compared with the prior art, the invention has the following advantages:
First, the invention positions the handheld mobile printing device with only a single camera, overcoming the prior art's need for multiple cameras and giving the advantages of fewer cameras and lower camera cost.
Second, because the invention positions the device from the camera pose without running a complex reconstruction algorithm, it overcomes the low positioning frequency caused by the heavy computation of the prior art: the computation required per image frame is smaller, so the same chip can process more frames in a given time and deliver higher-frequency positioning information, giving the invention the advantage of high positioning frequency.
Drawings
FIG. 1 is a flow chart of the present invention;
FIG. 2 is a schematic diagram of an embodiment of the present invention.
Detailed Description
The invention will now be further described with reference to the accompanying drawings and examples.
The specific implementation steps of the present invention are further described with reference to fig. 1.
Step 1, constructing a positioning scene.
The marking pattern is placed in front of the camera on the handheld mobile printing device so that it lies within the camera's field of view. The marking pattern is any one of a checkerboard, a two-dimensional code, concentric rings, or a gray pattern. The camera is one whose intrinsic parameter matrix is known and whose captured images have all undergone distortion removal; the intrinsic parameter matrix is as follows:
$$K=\begin{bmatrix} f & 0 & m \\ 0 & f & n \\ 0 & 0 & 1 \end{bmatrix}$$

where K denotes the intrinsic parameter matrix of the camera, f denotes the focal length of the camera lens, and m and n denote the offsets of the camera principal point along the x axis and the y axis of the pixel coordinate system, respectively.
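As an illustration only (not part of the claimed method), the following minimal Python sketch shows how such an intrinsic matrix could be assembled and applied for distortion removal with OpenCV; the numeric values, file name, and distortion-coefficient vector are assumed placeholders from a prior calibration.

```python
import cv2
import numpy as np

# Intrinsic matrix built from the focal length f and principal-point offsets
# (m, n), matching the K defined above (square pixels, zero skew).
f, m, n = 800.0, 320.0, 240.0                  # assumed calibration values
K = np.array([[f, 0.0, m],
              [0.0, f, n],
              [0.0, 0.0, 1.0]])

# Assumed radial/tangential distortion coefficients (k1, k2, p1, p2, k3).
dist = np.array([-0.12, 0.05, 0.0, 0.0, 0.0])

frame = cv2.imread("frame.png")                # one captured image
undistorted = cv2.undistort(frame, K, dist)    # all later steps use this image
```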
Step 2, establishing the world coordinate system.
Take any point on the marking pattern as the origin and the normal vector of the plane containing the marking pattern as the Z axis, and establish a three-dimensional rectangular coordinate system as the world coordinate system.
Step 3, determining the printing plane.
3.1) If the printing area is rectangular, shoot the marking pattern with the camera of the printing device at three corner points of the rectangular printing area.
3.2) If the printing area is non-rectangular, shoot the marking pattern with the camera of the printing device at any three non-collinear points in the non-rectangular printing area.
Step 4, establishing the printing coordinate system.
If the printing area is rectangular, take the corner point adjacent to the other two of the three corner points selected in step 3.1) as the origin O1, and denote the other two points A1 and B1, where B1 lies to the left of the advancing direction of the directed line segment O1A1; take the directions of O1A1 and O1B1 as the X axis and the Y axis respectively and the normal vector of the printing plane as the Z axis, and establish a right-handed three-dimensional rectangular coordinate system as the printing coordinate system.
If the printing area is non-rectangular, take any one of the three non-collinear points selected in step 3.2) as the origin O2, and denote the other two points A2 and B2, where B2 lies to the left of the advancing direction of the directed line segment O2A2; take the normal vector of the printing plane as the Z axis, the direction of O2A2 as the X axis, and the cross product of the X axis and the Z axis as the Y axis, and establish a right-handed three-dimensional rectangular coordinate system as the printing coordinate system.
and 5, calculating a coordinate system transformation matrix.
5.1) Calculate the position of the optical center of the camera of the printing device in the world coordinate system at each shot using the formula x_ij = K R_i [I | −C_i] X_ij, where x_ij denotes the 3×1 homogeneous pixel coordinate of the j-th marker point in the i-th shot, i = 1, 2, 3, 0 ≤ j ≤ n, n ≥ 4; K denotes the 3×3 camera intrinsic parameter matrix; R_i denotes the 3×3 rotation matrix of the camera in the world coordinate system at the i-th shot; I denotes the 3×3 identity matrix; | denotes the matrix blocking operation; C_i denotes the 3×1 non-homogeneous world coordinate of the camera optical center at the i-th shot; X_ij denotes the 4×1 homogeneous world coordinate of the j-th marker point at the i-th shot, whose Z component is 0 by the construction of the world coordinate system; x_ij and X_ij correspond one to one. Using no fewer than 4 pairs of marker points obtained in each shot and the orthogonality of the rotation matrix R_i, the camera optical-center position C_i and the camera attitude R_i can be solved.
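As a sketch of how step 5.1 could be carried out in practice (the patent does not prescribe a particular solver), R_i and C_i can be estimated from at least 4 marker correspondences with a standard PnP routine; since x = K R [I | −C] X implies t = −R C, the optical center follows as C = −Rᵀt. Function and variable names below are illustrative.

```python
import cv2
import numpy as np

def camera_pose_from_markers(world_pts, pixel_pts, K):
    """world_pts: (n, 3) marker coordinates in the world frame (Z = 0),
    pixel_pts: (n, 2) detected marker pixels, n >= 4; K: 3x3 intrinsics."""
    ok, rvec, tvec = cv2.solvePnP(world_pts.astype(np.float64),
                                  pixel_pts.astype(np.float64),
                                  K, None)      # images already undistorted
    if not ok:
        raise RuntimeError("pose estimation failed")
    R_i, _ = cv2.Rodrigues(rvec)        # 3x3 rotation, world -> camera
    C_i = (-R_i.T @ tvec).ravel()       # optical center in world coordinates
    return R_i, C_i
```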
5.2) Calculate the Euclidean distance between each pair of positions of the camera optical center in the world coordinate system, and from these distances and the height H from the camera optical center to the printing plane, calculate the position of the camera optical center of the printing device in the printing coordinate system at each shot. If the printing area is rectangular, let the 3×1 non-homogeneous coordinates of the camera optical center in the world coordinate system when shooting at O1, A1, B1 be W11, W12, W13 respectively; the non-homogeneous coordinates of the camera optical center in the printing coordinate system are then

$$(0,\ 0,\ H)^{T},\qquad (\|W_{12}-W_{11}\|,\ 0,\ H)^{T},\qquad (0,\ \|W_{13}-W_{11}\|,\ H)^{T}.$$

If the printing area is non-rectangular, let the 3×1 non-homogeneous coordinates of the camera optical center in the world coordinate system when shooting at O2, A2, B2 be W21, W22, W23 respectively; the non-homogeneous coordinates of the camera optical center in the printing coordinate system are then

$$(0,\ 0,\ H)^{T},\qquad (\|W_{22}-W_{21}\|,\ 0,\ H)^{T},\qquad (\|W_{23}-W_{21}\|\cos\theta,\ \|W_{23}-W_{21}\|\sin\theta,\ H)^{T},$$

where ‖·‖ denotes the Euclidean distance operation and θ denotes the angle between the directed line segments O2A2 and O2B2.
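A minimal sketch of step 5.2 follows, under the assumption stated above that the optical center stays at height H over the printing plane during the three shots, so the 3D distance between optical centers equals their in-plane displacement; names are illustrative.

```python
import numpy as np

def optical_center_in_print_frame(W1, W2, W3, H, rectangular=True):
    """W1, W2, W3: optical-center world coordinates at the shots taken at
    O, A and B; returns the three positions in the printing coordinate system."""
    d12 = np.linalg.norm(W2 - W1)
    d13 = np.linalg.norm(W3 - W1)
    if rectangular:
        return (np.array([0.0, 0.0, H]),
                np.array([d12, 0.0, H]),
                np.array([0.0, d13, H]))
    # non-rectangular case: decompose the third position with the angle theta
    # between the directions O->A and O->B
    cos_t = float(np.dot(W2 - W1, W3 - W1)) / (d12 * d13)
    sin_t = np.sqrt(max(0.0, 1.0 - cos_t ** 2))
    return (np.array([0.0, 0.0, H]),
            np.array([d12, 0.0, H]),
            np.array([d13 * cos_t, d13 * sin_t, H]))
```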
5.3) Calculate the coordinate system transformation matrix from the world coordinate system to the printing coordinate system from the positions of the camera optical center in the two coordinate systems at each shot, using the formula X_p = T_wp X_w, where X_p denotes the 4×1 homogeneous coordinate vector of the camera optical center in the printing coordinate system, T_wp denotes the 4×4 Euclidean transformation matrix from the world coordinate system to the printing coordinate system, which can be solved using at least three point pairs, and X_w denotes the 4×1 homogeneous coordinate vector of the camera optical center in the world coordinate system.
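One way to solve T_wp from the three optical-center point pairs is a least-squares rigid fit (Kabsch/Umeyama without scale); the patent only states that three pairs suffice, so the solver below is an assumed choice and the names are illustrative.

```python
import numpy as np

def rigid_transform(world_pts, print_pts):
    """world_pts, print_pts: (n, 3) corresponding points, n >= 3.
    Returns the 4x4 matrix T_wp such that print = T_wp @ world (homogeneous)."""
    cw, cp = world_pts.mean(axis=0), print_pts.mean(axis=0)
    Hcov = (world_pts - cw).T @ (print_pts - cp)               # 3x3 covariance
    U, _, Vt = np.linalg.svd(Hcov)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    R = Vt.T @ D @ U.T
    t = cp - R @ cw
    T_wp = np.eye(4)
    T_wp[:3, :3], T_wp[:3, 3] = R, t
    return T_wp
```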
Step 6, calculating the pose transformation matrix.
Take the end point of the nozzle of the printing device as the origin, the ink ejection direction of the nozzle as the X axis, and the nozzle arrangement direction as the Y axis, and establish a three-dimensional rectangular coordinate system as the printing device coordinate system.
Based on the Euler angles and the offset between the camera coordinate system and the printing device coordinate system, the pose transformation matrix from the pose of the camera on the printing device to the pose of the printing device is constructed using the formula

$$T_{4\times 4}=\begin{bmatrix} R_{3\times 3} & t_{3\times 1} \\ 0 & 1 \end{bmatrix}$$

where T_{4×4} denotes the 4×4 Euclidean transformation matrix, R_{3×3} denotes the 3×3 rotation matrix, which can be derived from the Euler angles between the camera coordinate system and the printing device coordinate system, and t_{3×1} denotes the 3×1 offset vector between the camera coordinate system and the printing device coordinate system.
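A sketch of how this pose transformation matrix might be assembled; the Euler-angle convention (here intrinsic ZYX) and the direction of the offset are calibration choices that the text does not fix, so they are assumptions.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def pose_transform_matrix(euler_zyx_deg, offset_xyz):
    """euler_zyx_deg: Euler angles between the camera and printing device
    coordinate systems; offset_xyz: offset vector between the two frames."""
    T = np.eye(4)
    T[:3, :3] = Rotation.from_euler("ZYX", euler_zyx_deg, degrees=True).as_matrix()
    T[:3, 3] = offset_xyz
    return T
```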
Step 7, calculating the pose of the printing device in real time.
7.1) calculating the pose of the camera on the printing device in the world coordinate system by adopting the same method as the step 5.1);
7.2) Using the coordinate system transformation matrix, calculate the pose of the camera on the printing device in the printing coordinate system with the position calculation formula X_b = T X_a and the attitude calculation formula R_b = R R_a, where X_b denotes the 4×1 homogeneous coordinate vector of the position after transformation, T denotes the 4×4 Euclidean transformation matrix, X_a denotes the 4×1 homogeneous coordinate vector of the position before transformation, R_b denotes the 3×3 matrix of the attitude after transformation, R denotes the 3×3 rotation transformation matrix, which is the upper-left 3×3 block of T, and R_a denotes the 3×3 matrix of the attitude before transformation, which is transformed according to the transformation matrix T and the R contained in it.
7.3) calculating the position and the posture of the printing device in the printing coordinate system according to the posture transformation matrix by adopting the same method as the step 7.2).
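A sketch of steps 7.2 and 7.3: one helper applies the position formula X_b = T X_a and the attitude formula R_b = R R_a, first with the world-to-print matrix T_wp from step 5.3 and then with the pose transformation matrix from step 6. Whether R_a denotes the world-to-camera or camera-to-world attitude is a convention the text leaves open, so the helper is written generically and the names are illustrative.

```python
import numpy as np

def transform_pose(T, X_a, R_a):
    """T: 4x4 Euclidean transform; X_a: 4x1 homogeneous position before the
    transform; R_a: 3x3 attitude matrix before the transform."""
    X_b = T @ X_a                 # position formula X_b = T X_a
    R_b = T[:3, :3] @ R_a         # attitude formula R_b = R R_a
    return X_b, R_b

# Per-frame usage (assumed names): camera pose in the printing frame first,
# then the printing device pose via the pose transformation matrix T_pose.
# X_cam_p, R_cam_p = transform_pose(T_wp, np.append(C_i, 1.0), R_i)
# X_dev_p, R_dev_p = transform_pose(T_pose, X_cam_p, R_cam_p)
```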
The implementation of the present invention is further described with reference to fig. 2 in conjunction with the embodiment of the present invention.
Fig. 2(a) is a schematic diagram of a world coordinate system including a marker pattern constructed in step 2 of the present invention, wherein 1 denotes the marker pattern, 2 denotes the world coordinate system, and 3 denotes a marker point on the marker pattern.
Fig. 2(b) is a schematic diagram of a printing coordinate system including a handheld mobile printing device constructed in step 4 of the present invention, where 4 denotes a printing area, 5 denotes a printing coordinate system, 6 denotes the handheld mobile printing device, 7 denotes a camera on the handheld mobile printing device, and 8 denotes a field angle of the camera on the handheld mobile printing device.
The marker is placed in front of the camera on the handheld mobile printing device so that the marker pattern is within the camera's field of view. The marker need not be rectangular, but a rectangular marker makes it easy to compute the positions of the marker points, so a rectangular marker is taken as the example here. Taking the lower-left corner of the rectangular marking pattern as the origin, the two sides of the rectangle as the X axis and the Y axis respectively, and the normal vector of the marker as the Z axis, a right-handed three-dimensional rectangular coordinate system is established as the world coordinate system. Since the positions of the marker points on the marker pattern are known, their positions in the world coordinate system can be calculated.
After the camera shoots the marker, the position of each marker point in the image and its position in the world coordinate system form a point pair, and the position of the camera optical center in the world coordinate system can be calculated from no fewer than 4 such pairs. If the printing area is rectangular, it satisfies the property that the coordinate axes are mutually perpendicular: images of the marker are taken at the rectangle's corner points O1, A1, B1; O1 is taken as the origin, the directions of O1A1 and O1B1 are taken as the X axis and the Y axis respectively, and the normal vector of the printing plane is taken as the Z axis to establish the printing coordinate system. The Euclidean distance between the optical-center positions of each pair of shots is then calculated, and from these distances and the height from the optical center to the printing plane, the position of the optical center in the printing coordinate system at each shot is obtained. If the printing area is non-rectangular, images of the marker are taken at O2, A2, B2; the normal vector of the printing plane is taken as the Z axis, the direction of O2A2 as the X axis, and the cross product of the X axis and the Z axis as the Y axis to establish the printing coordinate system. The Euclidean distances between the optical-center positions and the height from the optical center to the printing plane again give the position of the optical center in the printing coordinate system at each shot. After the three shots, the positions of the camera optical center in the two coordinate systems form three point pairs, from which the coordinate system transformation matrix is calculated. Applying this transformation matrix to the camera pose in the world coordinate system yields the camera pose in the printing coordinate system. Since the positional relationship between the camera and the printing device is fixed at manufacture, the pose transformation matrix can be determined from the Euler angles and the offset between the camera and the printing device, and applying it to the camera pose yields the pose of the printing device.

Claims (7)

1. A handheld mobile printing device positioning method based on computer vision is characterized in that three coordinate systems are respectively established, a camera on the device is used for shooting a marking pattern, the camera pose is calculated, and a coordinate system transformation matrix and a pose transformation matrix are used for calculating the pose of a printing device, and the method comprises the following steps:
(1) constructing a positioning scene:
placing the marking pattern in front of a camera on the handheld mobile printing device so that the marking pattern is within the field angle range of the camera;
(2) establishing a world coordinate system:
taking any point on the marked pattern as an origin, taking a normal vector of a plane where the marked pattern is located as a Z axis, and establishing a right-hand three-dimensional rectangular coordinate system as a world coordinate system;
(3) determining a printing plane:
(3a) if the printing area is rectangular, shooting the marking patterns at three corner points of the rectangular printing area by using a camera of the printing device;
(3b) if the printing area is non-rectangular, shooting the marking patterns at any non-collinear three points in the non-rectangular printing area by using a camera of the printing device;
(4) establishing a printing coordinate system:
(4a) if the printing area is rectangular, take the corner point adjacent to the other two of the three corner points selected in step (3a) as the origin O1, and denote the other two points A1 and B1, where B1 lies to the left of the advancing direction of the directed line segment O1A1; take the directions of O1A1 and O1B1 as the X axis and the Y axis respectively and the normal vector of the printing plane as the Z axis, and establish a right-handed three-dimensional rectangular coordinate system as the printing coordinate system;
(4b) if the printing area is non-rectangular, take any one of the three non-collinear points selected in step (3b) as the origin O2, and denote the other two points A2 and B2, where B2 lies to the left of the advancing direction of the directed line segment O2A2; take the normal vector of the printing plane as the Z axis, the direction of O2A2 as the X axis, and the cross product of the X axis and the Z axis as the Y axis, and establish a right-handed three-dimensional rectangular coordinate system as the printing coordinate system;
(5) calculating a coordinate system transformation matrix;
(5a) calculating the position of the optical center of the camera of the printing device in the world coordinate system at each shooting;
(5b) calculate the Euclidean distance between each pair of positions of the camera optical center in the world coordinate system, and from these distances and the height H from the camera optical center to the printing plane, calculate the position of the camera optical center of the printing device in the printing coordinate system at each shot. If the printing area is rectangular, let the 3×1 non-homogeneous coordinates of the camera optical center in the world coordinate system when shooting at O1, A1, B1 be W11, W12, W13 respectively; the non-homogeneous coordinates of the camera optical center in the printing coordinate system are then

$$(0,\ 0,\ H)^{T},\qquad (\|W_{12}-W_{11}\|,\ 0,\ H)^{T},\qquad (0,\ \|W_{13}-W_{11}\|,\ H)^{T}.$$

If the printing area is non-rectangular, let the 3×1 non-homogeneous coordinates of the camera optical center in the world coordinate system when shooting at O2, A2, B2 be W21, W22, W23 respectively; the non-homogeneous coordinates of the camera optical center in the printing coordinate system are then

$$(0,\ 0,\ H)^{T},\qquad (\|W_{22}-W_{21}\|,\ 0,\ H)^{T},\qquad (\|W_{23}-W_{21}\|\cos\theta,\ \|W_{23}-W_{21}\|\sin\theta,\ H)^{T},$$

where ‖·‖ denotes the Euclidean distance operation and θ denotes the angle between the directed line segments O2A2 and O2B2;
(5c) calculating a coordinate system transformation matrix from a world coordinate system to a printing coordinate system according to the positions of the optical centers of the cameras under the two coordinate systems during each shooting;
(6) calculating a pose transformation matrix:
(6a) taking the end point of a nozzle of the printing device as the origin, the ink ejection direction of the nozzle as the X axis, and the nozzle arrangement direction as the Y axis, and establishing a right-handed three-dimensional rectangular coordinate system as the printing device coordinate system;
(6b) constructing a pose transformation matrix from the pose of the camera on the printing device to the pose of the printing device according to the Euler angle and the offset between the coordinate system of the camera and the coordinate system of the printing device;
(7) calculating the pose of the printing device in real time:
(7a) calculating the pose of a camera on the printing device in the world coordinate system by adopting the same method as the step (5 a);
(7b) calculating the pose of the camera on the printing device in the printing coordinate system according to the coordinate system transformation matrix obtained in the step (5 c);
(7c) and (4) calculating the position and the posture of the printing device in the printing coordinate system according to the posture transformation matrix obtained in the step (6b) by adopting the same method as the step (7 b).
2. The computer vision based handheld mobile printing device positioning method of claim 1, wherein: the marking pattern in the step (1) refers to any one of a checkerboard, a two-dimensional code, concentric rings and a gray pattern.
3. The computer vision based handheld mobile printing device positioning method of claim 1, wherein: the camera in the step (1) is a camera with a known intrinsic parameter matrix and the taken images are all subjected to distortion removal treatment, wherein the intrinsic parameter matrix is as follows
$$K=\begin{bmatrix} f & 0 & m \\ 0 & f & n \\ 0 & 0 & 1 \end{bmatrix}$$
Wherein, K represents an internal parameter matrix of the camera, f represents a lens focal length value of the camera, and m and n respectively represent the offset of a camera principal point on an x axis and a y axis in a pixel coordinate system.
4. The computer vision based handheld mobile printing device positioning method of claim 1, wherein: the position of the optical center of the camera of the printing device in the world coordinate system at each shot in step (5a) is obtained from the formula x_ij = K R_i [I | −C_i] X_ij, where x_ij denotes the 3×1 homogeneous pixel coordinate of the j-th marker point in the i-th shot, i = 1, 2, 3, 0 ≤ j ≤ n, n ≥ 4; K denotes the 3×3 camera intrinsic parameter matrix; R_i denotes the 3×3 rotation matrix of the camera in the world coordinate system at the i-th shot; I denotes the 3×3 identity matrix; | denotes the matrix blocking operation; C_i denotes the 3×1 non-homogeneous world coordinate of the camera optical center at the i-th shot; X_ij denotes the 4×1 homogeneous world coordinate of the j-th marker point at the i-th shot, whose Z component is 0; x_ij and X_ij correspond one to one; using no fewer than 4 pairs of marker points obtained in each shot and the orthogonality of the rotation matrix R_i, the camera optical-center position C_i and the camera attitude R_i can be solved.
5. The computer vision based handheld mobile printing device positioning method of claim 1, wherein: the coordinate system transformation matrix from the world coordinate system to the printing coordinate system in step (5c) is obtained from the formula X_p = T_wp X_w, where X_p denotes the 4×1 homogeneous coordinate vector of the camera optical center in the printing coordinate system, T_wp denotes the 4×4 Euclidean transformation matrix from the world coordinate system to the printing coordinate system, which can be solved using at least three point pairs, and X_w denotes the 4×1 homogeneous coordinate vector of the camera optical center in the world coordinate system.
6. The computer vision based handheld mobile printing device positioning method of claim 1, wherein: the pose transformation matrix from the pose of the camera on the printing device to the pose of the printing device in step (6b) is constructed using the formula

$$T_{4\times 4}=\begin{bmatrix} R_{3\times 3} & t_{3\times 1} \\ 0 & 1 \end{bmatrix}$$

where T_{4×4} denotes the 4×4 Euclidean transformation matrix, R_{3×3} denotes the 3×3 rotation matrix, which can be derived from the Euler angles between the camera coordinate system and the printing device coordinate system, and t_{3×1} denotes the 3×1 offset vector between the camera coordinate system and the printing device coordinate system.
7. The computer vision based handheld mobile printing device positioning method of claim 1, wherein: the pose of the camera on the printing device in the printing coordinate system in step (7b) is obtained using the position calculation formula X_b = T X_a and the attitude calculation formula R_b = R R_a, where X_b denotes the 4×1 homogeneous coordinate vector of the position after transformation, T denotes the 4×4 Euclidean transformation matrix, X_a denotes the 4×1 homogeneous coordinate vector of the position before transformation, R_b denotes the 3×3 matrix of the attitude after transformation, R denotes the 3×3 rotation transformation matrix, which is the upper-left 3×3 block of T, and R_a denotes the 3×3 matrix of the attitude before transformation, which is transformed according to the transformation matrix T and the R contained in it.
CN202110403051.9A 2021-04-15 2021-04-15 Handheld mobile printing device positioning method based on computer vision Active CN113112545B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110403051.9A CN113112545B (en) 2021-04-15 2021-04-15 Handheld mobile printing device positioning method based on computer vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110403051.9A CN113112545B (en) 2021-04-15 2021-04-15 Handheld mobile printing device positioning method based on computer vision

Publications (2)

Publication Number Publication Date
CN113112545A true CN113112545A (en) 2021-07-13
CN113112545B CN113112545B (en) 2023-03-21

Family

ID=76716953

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110403051.9A Active CN113112545B (en) 2021-04-15 2021-04-15 Handheld mobile printing device positioning method based on computer vision

Country Status (1)

Country Link
CN (1) CN113112545B (en)

Patent Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130259403A1 (en) * 2012-04-03 2013-10-03 Oluwatosin Osinusi Flexible easy-to-use system and method of automatically inserting a photorealistic view of a two or three dimensional object into an image using a cd,dvd or blu-ray disc
US20140118339A1 (en) * 2012-10-31 2014-05-01 The Boeing Company Automated frame of reference calibration for augmented reality
WO2015096806A1 (en) * 2013-12-29 2015-07-02 刘进 Attitude determination, panoramic image generation and target recognition methods for intelligent machine
CN106101689A (en) * 2016-06-13 2016-11-09 西安电子科技大学 Utilize the method that mobile phone monocular cam carries out augmented reality to virtual reality glasses
CN107449402A (en) * 2017-07-31 2017-12-08 清华大学深圳研究生院 A kind of measuring method of the relative pose of noncooperative target
CN107977996A (en) * 2017-10-20 2018-05-01 西安电子科技大学 Space target positioning method based on target calibrating and positioning model
US20200061769A1 (en) * 2017-11-07 2020-02-27 Dalian University Of Technology Monocular vision six-dimensional measurement method for high-dynamic large-range arbitrary contouring error of cnc machine tool
CN109685913A (en) * 2018-12-21 2019-04-26 西安电子科技大学 Augmented reality implementation method based on computer vision positioning
CN109895392A (en) * 2019-02-15 2019-06-18 上海幂方电子科技有限公司 A method of operating coordinates are marked and are accurately positioned in equipment working region
CN110281664A (en) * 2019-07-11 2019-09-27 森大(深圳)技术有限公司 Print media positioning printing method, device, equipment, medium and flat-panel printer
CN110686650A (en) * 2019-10-29 2020-01-14 北京航空航天大学 Monocular vision pose measuring method based on point characteristics
CN111590899A (en) * 2020-04-27 2020-08-28 蒋青 Vision auxiliary positioning device for mechanical arm 3D printing and positioning method thereof
CN111673735A (en) * 2020-04-28 2020-09-18 平安科技(深圳)有限公司 Mechanical arm control method and device based on monocular vision positioning
CN111968128A (en) * 2020-07-10 2020-11-20 北京航空航天大学 Unmanned aerial vehicle visual attitude and position resolving method based on image markers
CN111862238A (en) * 2020-07-23 2020-10-30 中国民航大学 Full-space monocular light pen type vision measurement method
CN111791589A (en) * 2020-09-10 2020-10-20 季华实验室 Positioning detection method and device based on ink-jet printer, electronic equipment and medium
CN112066879A (en) * 2020-09-11 2020-12-11 哈尔滨工业大学 Air floatation motion simulator pose measuring device and method based on computer vision

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
PHILIPP BAUER等: "Accuracy Analysis of Alignment Methods based on Reference Features for Robot-Based Optical Inspection Systems", 《PROCEDIA CIRP》 *
WEIWU FENG等: "Inertial measurement unit aided extrinsic parameters calibration for stereo vision systems", 《OPTICS AND LASERS IN ENGINEERING》 *
张华等: "面向自然特征的增强现实中高精度目标跟踪注册研究", 《长春工程学院学报(自然科学版)》 *
张品: "基于增强现实的精准术前规划技术研究", 《中国优秀硕士学位论文全文数据库 医药卫生科技辑》 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114606541A (en) * 2022-03-15 2022-06-10 南通大学 Two-dimensional structure micro-nano scale rapid printing system and method based on glass microprobe
CN114606541B (en) * 2022-03-15 2023-03-24 南通大学 Two-dimensional structure micro-nano scale rapid printing system and method based on glass microprobe
CN116080290A (en) * 2022-12-29 2023-05-09 上海魅奈儿科技有限公司 Three-dimensional high-precision fixed-point printing method and device

Also Published As

Publication number Publication date
CN113112545B (en) 2023-03-21

Similar Documents

Publication Publication Date Title
US11039121B2 (en) Calibration apparatus, chart for calibration, chart pattern generation apparatus, and calibration method
KR101666959B1 (en) Image processing apparatus having a function for automatically correcting image acquired from the camera and method therefor
CN107886547B (en) Fisheye camera calibration method and system
CN110660107A (en) Plane calibration plate, calibration data acquisition method and system
CN113112545B (en) Handheld mobile printing device positioning method based on computer vision
EP3067861A2 (en) Determination of a coordinate conversion parameter
JP7218435B2 (en) CALIBRATION DEVICE, CALIBRATION CHART AND CALIBRATION METHOD
CN107578450B (en) Method and system for calibrating assembly error of panoramic camera
CN109712232B (en) Object surface contour three-dimensional imaging method based on light field
CN113329179B (en) Shooting alignment method, device, equipment and storage medium
CN110490943B (en) Rapid and accurate calibration method and system of 4D holographic capture system and storage medium
CN102914295A (en) Computer vision cube calibration based three-dimensional measurement method
CA3233222A1 (en) Method, apparatus and device for photogrammetry, and storage medium
EP3944194A1 (en) Fisheye camera calibration system, method and apparatus, and electronic device and storage medium
CN111768449A (en) Object grabbing method combining binocular vision with deep learning
CN112017302A (en) Real-time registration method of projection mark and machine vision based on CAD model
CN111862193A (en) Binocular vision positioning method and device for electric welding spots based on shape descriptors
CN114998447A (en) Multi-view vision calibration method and system
CN101320483A (en) Three-dimensional reconstruction method of rotating stereovision
CN109949249B (en) Cylindrical image correction method and system
CN111741223B (en) Panoramic image shooting method, device and system
CN112446926A (en) Method and device for calibrating relative position of laser radar and multi-eye fisheye camera
CN111724432B (en) Object three-dimensional detection method and device
CN116188594B (en) Calibration method, calibration system, calibration device and electronic equipment of camera
CN112288824B (en) Device and method for calibrating tele camera based on real scene

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant