CN113329179B - Shooting alignment method, device, equipment and storage medium - Google Patents
- Publication number
- CN113329179B (application CN202110599186.7A)
- Authority
- CN
- China
- Prior art keywords
- picture
- camera external
- reference information
- external reference
- coordinate
- Prior art date
- Legal status
- Active
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/695—Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N17/002—Diagnosis, testing or measuring for television systems or their details for television cameras
Abstract
The application discloses a shooting alignment method, apparatus, device, and storage medium, belonging to the field of photography. The shooting alignment method comprises the following steps: acquiring a first picture and a pre-stored reference picture, wherein the first picture is taken by a first electronic device at a first pose under the control of a mechanical arm; acquiring first camera external reference information for the moment the reference picture was taken and second camera external reference information for the moment the first picture was taken; determining motion control parameters of the mechanical arm according to the first camera external reference information and the second camera external reference information; and controlling the mechanical arm to move according to the motion control parameters, wherein the first electronic device moves with the mechanical arm from the first pose to a second pose, so that at the second pose the first electronic device takes a second picture whose composition matches that of the reference picture.
Description
Technical Field
The application belongs to the field of photography, and particularly relates to a shooting alignment method, apparatus, device, and storage medium.
Background
When the shooting effect of an electronic device is tested, the electronic device often needs to be aligned, that is, moved to a specified position for shooting.
Generally, the electronic device is fixed to the clamp at the end of a robot arm and is moved to the specified position by controlling the motion of the robot arm, so as to realize an automatic photographing test. Since the electronic device has certain composition requirements when shooting a subject, a better shooting alignment method is needed to ensure that these composition requirements are met during the automatic shooting test.
However, there is currently no effective shooting alignment scheme for multi-scene, multi-camera cases.
Disclosure of Invention
The embodiment of the application aims to provide a shooting alignment method, apparatus, device, and storage medium, which can solve the problem that no shooting alignment scheme suitable for multiple scenes and multiple cameras currently exists.
In a first aspect, an embodiment of the present application provides a shooting alignment method, which is applied to a control device, and the method includes:
acquiring a first picture and a pre-stored reference picture, wherein the first picture is taken by the first electronic device at a first pose under the control of a mechanical arm, and the reference picture is a picture obtained by shooting a target scene;
determining first camera external reference information when the reference picture is taken and determining second camera external reference information when the first picture is taken;
determining motion control parameters of the mechanical arm according to the first camera external reference information and the second camera external reference information;
and controlling the mechanical arm to move according to the motion control parameters, wherein the first electronic device moves with the mechanical arm from the first pose to a second pose, so that at the second pose the first electronic device takes a second picture whose composition matches that of the reference picture.
In a second aspect, an embodiment of the present application provides a shooting alignment apparatus, which is applied to a control device, and includes:
an acquisition module, used for acquiring a first picture and a pre-stored reference picture, wherein the first picture is taken by the first electronic device at a first pose under the control of a mechanical arm, and the reference picture is a picture obtained by shooting a target scene;
a first determining module for determining first camera external reference information when the reference picture is taken and determining second camera external reference information when the first picture is taken;
a second determining module, used for determining motion control parameters of the mechanical arm according to the first camera external reference information and the second camera external reference information;
and a control module, used for controlling the mechanical arm to move according to the motion control parameters, wherein the first electronic device moves with the mechanical arm from the first pose to the second pose, so that at the second pose the first electronic device takes a second picture whose composition matches that of the reference picture.
In a third aspect, an embodiment of the present application provides a control device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, and when executed by the processor, the program or instructions implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In the embodiment of the application, in the process of testing the shooting effect of the first electronic device, a first picture taken by the first electronic device at a first pose under the control of a mechanical arm and a pre-stored reference picture are obtained, wherein the reference picture is a picture obtained by shooting a target scene; then, first camera external reference information for the moment the reference picture was taken and second camera external reference information for the moment the first picture was taken are acquired, and a motion control parameter is determined from the two; and then the mechanical arm is controlled to move according to the motion control parameter. Once the mechanical arm completes the movement according to the motion control parameter, the first electronic device has moved with it from the first pose to the second pose, at which it can take a picture whose composition matches that of the reference picture. Therefore, if the shooting effects of the electronic device in a plurality of scenes need to be tested, providing a reference picture taken in each scene is sufficient to complete the shooting effect test of the corresponding scene. In addition, the embodiment of the application is applicable to shooting effect tests of different cameras, and in particular to testing various camera lenses such as wide-angle and 1x, 2x, and 5x lenses. Thus, the requirements of multiple scenes and multiple cameras are met.
Drawings
Fig. 1 is a schematic diagram of an embodiment of a shooting alignment system provided in the present application.
Fig. 2 is a schematic flowchart of an embodiment of a shooting alignment method provided in the present application.
FIG. 3 is a schematic diagram illustrating an embodiment of a locator provided herein.
Fig. 4 is a schematic diagram of an embodiment of a reference picture provided in the present application.
FIG. 5 is a schematic view of one embodiment of a checkerboard provided herein.
Fig. 6 shows a schematic diagram of a mapping relationship of world coordinates to image coordinates provided by the present application.
FIG. 7 is a schematic diagram illustrating the relationship between pixel coordinates, the internal reference matrix, the scale factor, the external reference matrix, and world coordinates provided herein.
Fig. 8 shows a schematic diagram of the build scale space provided by the present application.
Fig. 9 shows a schematic diagram of feature point positioning provided by the present application.
Fig. 10 shows a schematic diagram of the feature point principal direction assignment provided by the present application.
Fig. 11 is a schematic flowchart of an embodiment of a shooting alignment method provided in the present application.
Fig. 12 is a schematic structural diagram of an embodiment of a shooting alignment apparatus provided in the present application.
Fig. 13 is a schematic structural diagram of an embodiment of a control device provided in the present application.
Fig. 14 is a schematic structural diagram of another embodiment of a control device provided in the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below clearly with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived from the embodiments in the present application by a person skilled in the art, are within the scope of protection of the present application.
The terms first, second and the like in the description and in the claims of the present application are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It will be appreciated that the data so used may be interchanged under appropriate circumstances such that embodiments of the application may be practiced in sequences other than those illustrated or described herein, and that the terms "first," "second," and the like are generally used herein in a generic sense and do not limit the number of terms, e.g., the first term can be one or more than one. In addition, "and/or" in the specification and claims means at least one of connected objects, a character "/" generally means that a preceding and succeeding related objects are in an "or" relationship.
Before describing the shooting alignment method provided in the present application, terms related to the present application will be described.
World coordinate system (world coordinate system): a user-defined three-dimensional world coordinate system is introduced to describe the position of an object in the real world.
Pixel coordinate system (pixel coordinate system): introduced to describe the coordinates of an image point on the digital image after the object is imaged; it is the coordinate system in which information is actually read out from the camera. Its unit is the pixel (number of pixels).
Image coordinate system (image coordinate system): introduced to describe the perspective projection relationship of an object from the camera coordinate system during imaging, making it convenient to further obtain coordinates in the pixel coordinate system.
Camera coordinate system (camera coordinate system): a coordinate system established on the camera, defined to describe object positions from the camera's perspective; it is the intermediate link connecting the world coordinate system with the image/pixel coordinate systems.
The embodiments of the present application are described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
Fig. 1 is a schematic diagram of an embodiment of a shooting alignment system provided in the present application. As shown in fig. 1, the shooting alignment system includes a control device 01, a robot arm 02, and an electronic device 03. The control device 01 may be a computer device, such as a mobile phone, a computer, or a console. The electronic device 03 may be any device having a shooting function, such as a mobile phone, a video camera, a camera, and the like.
In order to realize the alignment shooting of the electronic device 03, the electronic device 03 is firstly installed at the tail end of the mechanical arm 02, and the communication connection between the control device 01 and the mechanical arm 02 as well as the electronic device 03 is established.
Then, the electronic device 03 takes a first picture and transmits the taken first picture to the control device 01. The electronic device 03 may take the first picture under the control of the control device 01, or may take the first picture under the control of a worker. After the control device 01 receives the first picture sent by the electronic device 03, the motion control parameters of the mechanical arm 02 are determined according to the first picture, and the mechanical arm 02 is controlled to move according to the motion control parameters, so that the electronic device 03 can be aligned and shot.
Control software may be installed on the control device 01, and the robot arm 02 may be controlled to move by the software.
Based on the shooting alignment system described above, the present application provides a shooting alignment method, which can be applied to the control device 01. Fig. 2 is a schematic flowchart of an embodiment of the shooting alignment method provided in the present application.
As shown in fig. 2, the photographing alignment method includes:
s102, a first picture and a pre-stored reference picture are obtained, the first picture is a picture shot by the first electronic device in the first position controlled by the mechanical arm, and the reference picture is a picture obtained by shooting a target scene. For example, when the target scene is a scene that the first electronic device needs to shoot when the shooting effect of the first electronic device is to be tested.
The first pose is the current pose of the first electronic device, and accordingly the first picture is the picture taken by the first electronic device at the current pose. The reference picture may be a picture with the ideal composition for the scene in which the first electronic device is to be tested. To align the first electronic device so that it reproduces the composition of the reference picture, S104 is performed.
As an example, S102 may specifically include: in the case that the first electronic device has established a communication connection with the control device, receiving the first picture sent by the first electronic device; and acquiring the reference picture stored at a preset storage location.
The shooting alignment method further comprises the following steps:
s104, acquiring first camera external reference information when the reference picture is shot, and acquiring second camera external reference information when the first picture is shot.
The first camera external reference information represents the conversion relationship from world coordinates to the first coordinates, and the second camera external reference information represents the conversion relationship from world coordinates to the second coordinates, wherein the world coordinates are the coordinates of a plurality of target points of the reference picture in the world coordinate system, the first coordinates are the coordinates of the plurality of target points on the reference picture, and the second coordinates are the coordinates of the plurality of target points on the first picture.
As an example, the plurality of target points on the reference picture may be corner points on the reference picture, for example, the plurality of target points are 4 corner points on the reference picture. The first and second coordinates may be pixel coordinates. Taking the first coordinates as an example, the first coordinates include the number of pixels from the target point on the reference picture to the origin of coordinates in the horizontal axis direction and the number of pixels from the target point on the reference picture to the origin of coordinates in the vertical axis direction.
The shooting alignment method further comprises the following steps:
s106, determining motion control parameters of the mechanical arm according to the first camera external reference information and the second camera external reference information;
and S108, controlling the mechanical arm to move according to the motion control parameters, wherein the first electronic device moves with the mechanical arm from the first pose to the second pose, so that at the second pose the first electronic device takes a second picture whose composition matches that of the reference picture.
The shooting alignment in the embodiment of the application is completed according to the pre-stored reference picture and the first picture actually taken; the object and the scene in the reference picture can be replaced, and there is no requirement for a fixed scene. Therefore, if the shooting effects of the first electronic device in multiple scenes (such as close-range and long-range shooting) need to be tested, providing a reference picture taken in each scene is sufficient to complete the shooting effect test of the corresponding scene, making rapid switching between multiple test scenes possible. In addition, the embodiment of the application is applicable to shooting effect tests of different cameras; for example, it is applicable to complicated test scenarios covering wide-angle and 1x, 2x, and 5x camera lenses, using only the monocular camera under test and no external auxiliary shooting and positioning equipment. Thus, the requirements of multiple scenes and multiple cameras are met. Moreover, the embodiment of the application does not require a fixed subject: it is applicable to shooting-pose alignment for any object, and the resulting picture can match the composition of the expected reference picture to a great extent.
In one or more embodiments of the present application, at least two locators are included in the reference picture. As an example, a locator may be the circular black-and-white mark shown in fig. 3.
In the process of shooting the reference picture, the locators can be placed within the shooting range of the subject, so that the resulting reference picture contains not only the subject but also the locators. For example, in the reference picture shown in fig. 4, two locators are placed on either side of the optical disc, so that when the optical disc is photographed, both the optical disc and the locators placed on its two sides are captured.
Before S104, the shooting alignment method may further include:
acquiring a first distance between two locators on a reference picture, wherein the first distance can be the distance between the central points of the two locators;
determining world coordinates of the target points in the world coordinate system according to the first distance and the actual spatial distance between the two locators;
and calculating, according to a preset camera external reference calculation algorithm, the world coordinates, the first coordinates, the internal reference information of the second electronic device that took the reference picture, and the camera distortion coefficients of the second electronic device, to obtain the first camera external reference information. The second electronic device may be a camera or a terminal with a photographing function (such as a mobile phone or a computer).
As an example, acquiring the first distance between the two locators on the reference picture may specifically include: detecting the two locators on the reference picture; determining the center point of each locator on the reference picture through image recognition; and calculating the first distance between the center points of the two locators on the reference picture, where the first distance may be a Euclidean distance.
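The Euclidean distance mentioned above is a one-line computation once the two center points are known. A minimal sketch, assuming the locator centers have already been located in pixel coordinates (the values below are hypothetical):

```python
import math

def locator_distance(center_a, center_b):
    """Euclidean (pixel) distance between two locator center points."""
    return math.dist(center_a, center_b)

# Hypothetical centers detected on the reference picture, in pixels.
d = locator_distance((120.0, 240.0), (170.0, 240.0))
print(d)  # 50.0
```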
The two locators on the reference picture may be detected using a target detection model, where the target detection model may be a YOLO (You Only Look Once) neural network model; specifically, the target detection model may be the convolutional neural network model YOLOv4-Tiny. The target detection model can be deployed on the device side, with the aim of detecting and recognizing the locators in various complex scenes.
In addition to the first distance between the two locators on the reference picture, the actual spatial distance between the two locators needs to be obtained; it can be measured manually by a tester or measured by a sensor.
After the first distance and the actual spatial distance between the two locators are obtained, the world coordinates of the target points in the world coordinate system can be determined from them.
As an example, determining world coordinates of a plurality of target points on a reference picture in a world coordinate system according to a first distance and an actual spatial distance of two locators may include:
acquiring first relative position information of a plurality of target points on a reference picture;
determining second relative position information of the plurality of target points in the actual space according to a first ratio between the first distance and the actual space distance and the first relative position information, wherein the ratio between the first relative position information and the second relative position information is consistent with the first ratio;
and determining the world coordinates of the target points on the reference picture in the world coordinate system according to the second relative position information.
The following takes a plurality of target points as four corner points on a reference picture as an example to illustrate how to determine world coordinates.
With reference to the reference picture shown in fig. 4, first relative position information between the four corner points (i.e., corner A, corner B, corner C, and corner D) of the reference picture is obtained; the first relative position information is the position of each corner relative to the other corners. For example, the first relative position information includes: corner B is 100 pixels to the right of corner A; corner C is 70 pixels above corner B; corner D is 100 pixels to the left of corner C; and corner A is 70 pixels below corner D.
Assume that the first distance of two locators on the reference picture is 50 pixels, the actual spatial distance of two locators is 50 centimeters, and the corner B is located at the position of the 100 th pixel to the right of the corner a on the reference picture, and thus, in actual space, the corner B is located at the position of 100 centimeters to the right of the corner a. Then, a world coordinate system is established by taking the central point of the reference picture as a coordinate origin, and world coordinates of the corner point B and the corner point A in the world coordinate system can be determined according to the relative positions of the corner point B and the corner point A in the actual space. Similarly, the relative positions of the corner point C and the corner point D in the reference picture in the real space and the world coordinates of the corner point C and the corner point D in the world coordinate system can be determined.
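The scale reasoning in this worked example can be sketched directly; the numbers are the hypothetical values used above (50 px between locators corresponding to 50 cm in space):

```python
def pixel_to_world_scale(locator_pixel_dist, locator_actual_dist):
    """Real-world length represented by one pixel of the reference picture."""
    return locator_actual_dist / locator_pixel_dist

def corner_world_offset(pixel_offset, scale):
    """Convert a pixel offset between corners into a real-space offset."""
    return (pixel_offset[0] * scale, pixel_offset[1] * scale)

# 50 px between locators = 50 cm in actual space -> 1 cm per pixel.
scale = pixel_to_world_scale(50, 50)
# Corner B is 100 px to the right of corner A on the reference picture,
# so in actual space B is 100 cm to the right of A.
offset_ab = corner_world_offset((100, 0), scale)
print(scale, offset_ab)  # 1.0 (100.0, 0.0)
```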
The above is an example of determining world coordinates, taking the plurality of target points to be four corner points of the reference picture. When the reference picture is taken, the locators can be placed at any position in the scene to be photographed, and the world coordinates of the target points on the resulting reference picture can still be accurately calculated.
The world coordinates of the plurality of target points are used for determining the first camera external reference information, and the determination of the first camera external reference information not only needs to use the world coordinates of the plurality of target points, but also needs to use the internal reference information of the second electronic device for taking the reference picture. The internal reference information can be obtained by calibrating a camera for shooting a reference picture.
As an example, the camera taking the reference picture may be calibrated using Zhang's camera calibration method. The calibration of the camera internal parameters can be completed by calling the OpenCV camera calibration interface.
Specifically, the camera photographs the checkerboard to obtain photos including the checkerboard. The number of photos depends on whether the internal reference information converges within a specified error range; generally at least four photos are taken, one frontal photo and three oblique photos. The checkerboard pattern is shown in fig. 5. Moreover, the photos are not required to capture the entire checkerboard, and sub-pixel corner detection can be performed on the captured pictures to increase robustness.
After a photo including the checkerboard is taken, the marked two-dimensional codes in the white squares of the photo are recognized; the two-dimensional code in each white square stores the position information of that white square on the checkerboard. Specifically, the two-dimensional code in a white square stores a serial number corresponding to the row and column of that square on the board.
After the position information of the white squares on the board is determined, the position of the board can be determined based on it; then the black-and-white corner points on the board are detected, and the pixel coordinates of each corner point on the photo and the world coordinates of each corner point are determined from the detection result.
Based on the pixel coordinates of each corner point on the photo and the world coordinates of each corner point, the internal reference information of the second electronic device can be determined.
How to determine the internal reference information of the second electronic device is described below with reference to fig. 6.
Fig. 6 shows a schematic diagram of the mapping relationship from world coordinates to image coordinates provided by the present application. As shown in fig. 6, in 3D space, the world coordinates of a point in the world coordinate system are $M = [X, Y, Z]^T$, with homogeneous coordinate $\tilde{M} = [X, Y, Z, 1]^T$; in 2D space, the coordinates of a point in the pixel coordinate system are $m = [u, v]^T$, with homogeneous coordinate $\tilde{m} = [u, v, 1]^T$.
Assuming the checkerboard lies in the plane $Z = 0$ and defining $r_i$ as the $i$-th column of the rotation matrix $R$, then:

$$s\,\tilde{m} = A\,[\,r_1 \ \ r_2 \ \ r_3 \ \ t\,]\begin{bmatrix}X\\Y\\0\\1\end{bmatrix} = A\,[\,r_1 \ \ r_2 \ \ t\,]\begin{bmatrix}X\\Y\\1\end{bmatrix}$$

Thus, the mapping from the world coordinate system to the image coordinate system is the homography:

$$H = A\,[\,r_1 \ \ r_2 \ \ t\,]$$

where $A$ is the internal reference matrix and $s$ is a scale factor.
based on this, the relationship among the pixel coordinates, the internal reference matrix, the scale system s, the external reference matrix, and the world coordinates is as shown in fig. 7.
In the process of determining the internal reference information of the second electronic device, points in one picture need to be mapped to corresponding points in another picture. A homography is a projective transformation of points expressed in homogeneous coordinates; the homography matrix H is strongly constrained and gives a one-to-one correspondence between points. H is a 3 x 3 matrix defined only up to scale in homogeneous coordinates, so H has 8 degrees of freedom. Since 8 degrees of freedom need to be solved and each point correspondence gives two constraints, four corresponding points are needed; that is, the homography matrix H from the image coordinate system to the world coordinate system can be solved from four points, and H comprises the internal reference matrix and the external reference matrix. The internal reference matrix contained in the homography matrix H is the internal reference information of the second electronic device.
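Solving H from four corresponding points can be sketched with the standard direct linear transform (DLT); the point values below are hypothetical (four corners of a unit square mapped to a scaled, shifted square):

```python
import numpy as np

def homography_from_4_points(src, dst):
    """Solve H (8 DOF, defined up to scale) from 4 correspondences via DLT."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear constraints on H.
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, Vt = np.linalg.svd(np.array(A, dtype=float))
    H = Vt[-1].reshape(3, 3)  # null vector of A
    return H / H[2, 2]        # normalize the arbitrary scale

src = [(0, 0), (1, 0), (1, 1), (0, 1)]
dst = [(10, 10), (30, 10), (30, 30), (10, 30)]
H = homography_from_4_points(src, dst)
print(np.round(H, 3))  # a scale of 20 plus a translation of (10, 10)
```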
The internal reference matrix specifically comprises the physical size of a pixel and the focal length, the skew factor of the image physical coordinates, and the horizontal and vertical offsets u0 and v0 of the principal point (the imaging point of the optical center) relative to the image origin. The external reference matrix comprises the rotation matrix R and translation vector T that transform the world coordinate system to the camera coordinate system.
In addition, real lenses are not ideal perspective imagers but exhibit different degrees of distortion. Lens distortion theoretically includes radial distortion and tangential distortion. Therefore, during camera calibration, the camera distortion coefficients of the second electronic device can also be determined, including the radial distortion coefficients k1, k2, k3, ... and the tangential distortion coefficients p1, p2, ... of the camera. Since the tangential distortion of the camera has only a small influence, usually only radial distortion is considered, and in solving for the radial distortion, mainly the first two coefficients k1 and k2 of the dominant Taylor series expansion are considered.
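The radial distortion model with the two dominant coefficients k1 and k2 can be sketched as follows; the coefficient values are hypothetical:

```python
def radial_distort(x, y, k1, k2):
    """Apply the first two radial distortion terms to a normalized
    image point: (x, y) -> (x, y) * (1 + k1*r^2 + k2*r^4)."""
    r2 = x * x + y * y
    factor = 1 + k1 * r2 + k2 * r2 * r2
    return x * factor, y * factor

# Hypothetical coefficients; points farther from the optical axis
# (larger r) are displaced more.
xd, yd = radial_distort(0.5, 0.0, k1=-0.2, k2=0.05)
print(xd, yd)  # 0.4765625 0.0 (barrel distortion pulls the point inward)
```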
It has been explained above how to determine the world coordinates of the plurality of target points, the internal reference information of the second electronic device that took the reference picture, and the camera distortion coefficients of the second electronic device; meanwhile, the first coordinates of the plurality of target points can be read directly from the reference picture. Then, the world coordinates, the first coordinates, the internal reference information, and the camera distortion coefficients can be fed to a preset camera external reference calculation algorithm to obtain the first camera external reference information. The camera external reference calculation algorithm may be Perspective-n-Point pose solving (solvePnP).
When solvePnP is used to solve for the first camera external reference information, the camera model on which solvePnP relies may include the relationship shown in fig. 7. Based on that relationship, the pixel coordinates (i.e., the first coordinates) of the plurality of target points on the reference picture, the internal reference matrix of the camera that took the reference picture, the scale coefficient, and the world coordinates of the plurality of target points enter the calculation, and the resulting external reference matrix is the first camera external reference information.
In one or more embodiments of the present application, before S104, the photographing alignment method may further include:
performing feature detection processing on the reference picture and the first picture to obtain a conversion matrix from the reference picture to the first picture;
performing perspective transformation according to the transformation matrix and the first coordinate to obtain a second coordinate;
and calculating the world coordinate, the second coordinate, the internal reference information of the first electronic equipment and the camera distortion coefficient of the first electronic equipment according to a preset camera external reference calculation algorithm to obtain second camera external reference information.
As an example, the conversion matrix from the reference picture to the first picture may be obtained by performing feature matching between the reference picture and the first picture with the Speeded-Up Robust Features (SURF) algorithm.
An exemplary description of how the transformation matrix H is obtained follows.
1. Construct a Hessian matrix for each of the reference picture and the first picture, and generate the key points on the pictures that are used for feature extraction.
The Hessian matrix of a picture f(x, y) is:

H(f(x, y)) = | ∂²f/∂x²   ∂²f/∂x∂y |
             | ∂²f/∂x∂y  ∂²f/∂y²  |

Before the Hessian matrix is constructed, the picture must first be Gaussian-filtered; the filtered Hessian matrix at point x and scale σ is expressed as:

H(x, σ) = | Lxx(x, σ)  Lxy(x, σ) |
          | Lxy(x, σ)  Lyy(x, σ) |

where Lxx, Lxy and Lyy are the second-order Gaussian derivatives of the picture.
when the discriminant of the blackplug matrix takes a local maximum value, the current point is judged to be a point brighter or darker than other points in the surrounding area, and the position of the key point is determined.
2. Constructing a scale space
In SURF, the image size stays the same between different octaves; instead, the template size of the box filters used grows gradually from octave to octave. Between the layers of a single octave, filters of the same size are used while the blur coefficient of the filter increases gradually, as shown in fig. 8.
3. Feature point localization
Referring to fig. 9, each pixel point processed by the Hessian matrix is compared with the 26 points in its neighborhood in the two-dimensional image space and the scale space to preliminarily locate the key points; key points with weak energy and wrongly located key points are then filtered out, leaving the final stable feature points.
4. Feature point principal direction assignment
In SURF, the Haar wavelet features within a circular neighborhood of each feature point are counted. That is, within the circular neighborhood of the feature point, the sum of the horizontal and vertical Haar wavelet responses of all points inside a 60-degree sector is computed; the sector is then rotated in steps of 0.2 radians and the Haar wavelet response inside it is computed again. Finally, the direction of the sector with the largest value is taken as the main direction of the feature point. A schematic diagram of this process is shown in fig. 10.
5. Generating feature point descriptors
A block of 4 × 4 rectangular subregions is taken around the feature point, with the block oriented along the main direction of the feature point. Each subregion accumulates the horizontal and vertical Haar wavelet responses of 25 pixels.
6. Feature matching
The degree of matching is determined by calculating the Euclidean distance between two feature points: the shorter the Euclidean distance, the better the two feature points match. SURF additionally checks the trace of the Hessian matrix: if the signs of the traces of the two feature points are the same, the two features have contrast changes in the same direction; if the signs differ, their contrast changes in opposite directions, and the match is excluded outright even if the Euclidean distance is 0. In this way, the conversion matrix from the reference picture to the first picture is generated from the Euclidean distances between the feature points in the reference picture and those in the first picture.
After the conversion matrix from the reference picture to the first picture is obtained, a perspective transformation is applied using the conversion matrix and the first coordinates to obtain the second coordinates of the plurality of target points on the first picture. Thus, the perspective transformation determines where each target point of the reference picture lies on the first picture.
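The perspective transformation of the first coordinates by the conversion matrix amounts to a homogeneous mapping, sketched below in plain NumPy (OpenCV's cv2.perspectiveTransform performs the same operation); the translation-only matrix is illustrative:

```python
import numpy as np

def perspective_transform(points, H):
    """Map (N, 2) pixel coordinates through a 3x3 conversion matrix H."""
    pts = np.hstack([np.asarray(points, float), np.ones((len(points), 1))])
    mapped = pts @ H.T
    return mapped[:, :2] / mapped[:, 2:3]  # divide out the homogeneous scale

# With a pure-translation conversion matrix, every first coordinate
# simply shifts by (10, 5) to give the second coordinate.
H = np.array([[1.0, 0.0, 10.0], [0.0, 1.0, 5.0], [0.0, 0.0, 1.0]])
second_coords = perspective_transform([[0.0, 0.0], [100.0, 50.0]], H)
```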
After the world coordinates of the plurality of target points, the second coordinates of the plurality of target points on the first picture, the internal reference information of the first electronic device, and its camera distortion coefficients are obtained, these data can be processed to obtain the second camera external reference information. The calculation is similar to that of the first camera external reference information, which has already been described, so it is not repeated here.
The internal reference information and the camera distortion coefficient of the first electronic device can be obtained by calibrating the first electronic device. The specific calibration method is similar to the calibration method for the camera for taking the reference picture, and similar descriptions are omitted here.
In one or more embodiments of the present application, the first camera external reference information and the second camera external reference information are each represented in the form of a rotation vector. To obtain the motion control parameters of the mechanical arm, the first camera external reference information and the second camera external reference information need to be converted into rotation matrices.
Based on this, S106 may specifically include:
respectively converting the first camera external reference information and the second camera external reference information into rotation matrixes to obtain a first rotation matrix corresponding to the first camera external reference information and a second rotation matrix corresponding to the second camera external reference information;
and determining the motion control parameters of the mechanical arm according to the first rotation matrix and the second rotation matrix.
Wherein a rotation vector can be converted into a rotation matrix according to the Rodrigues formula:
R = cosθ·I + (1 − cosθ)·nnᵀ + sinθ·n^    (5)
where I is the identity matrix, n is the unit vector along the rotation vector, θ is the modulus of the rotation vector, and n^ denotes the skew-symmetric (cross-product) matrix of n.
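Equation (5) can be implemented directly; the sketch below is a plain-NumPy version (OpenCV's cv2.Rodrigues provides an equivalent conversion):

```python
import numpy as np

def rodrigues(rvec):
    """Rotation vector -> rotation matrix via equation (5)."""
    rvec = np.asarray(rvec, float)
    theta = np.linalg.norm(rvec)       # modulus of the rotation vector
    if theta < 1e-12:
        return np.eye(3)               # zero rotation
    n = rvec / theta                   # unit rotation axis
    n_hat = np.array([[0, -n[2], n[1]],
                      [n[2], 0, -n[0]],
                      [-n[1], n[0], 0]])  # skew-symmetric matrix n^
    return (np.cos(theta) * np.eye(3)
            + (1 - np.cos(theta)) * np.outer(n, n)
            + np.sin(theta) * n_hat)

# A rotation of 90 degrees about the z axis.
R = rodrigues([0.0, 0.0, np.pi / 2])
```

For this input the result is the familiar 90-degree z-axis rotation matrix, and R is orthogonal as every rotation matrix must be.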
As one example, the motion control parameters include offset information of the robot arm and a pose adjustment angle; determining motion control parameters of the mechanical arm according to the first rotation matrix and the second rotation matrix, which may specifically include:
and multiplying the first rotation matrix by the transposed matrix of the second rotation matrix to obtain an attitude adjustment angle, and subtracting the first rotation matrix from the second rotation matrix to obtain offset information.
Multiplying the first rotation matrix by the transpose of the second rotation matrix yields the rotation matrix taking the camera that took the first picture to the pose of the reference picture; this rotation matrix is determined as the pose adjustment Euler angles (i.e., the attitude adjustment angle). Subtracting the first rotation matrix from the second rotation matrix yields the position deviation value of the mechanical arm.
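As a sketch of this parameter computation (function and variable names are hypothetical; the matrices are assumed to be expressed in a common world frame):

```python
import numpy as np

def motion_control_parameters(R1, R2):
    """R1: first rotation matrix (reference picture), R2: second (first picture)."""
    attitude = R1 @ R2.T   # rotation taking the first-picture pose to the reference pose
    offset = R2 - R1       # offset information, as described in the text
    return attitude, offset

# If both camera external references agree, no adjustment is needed:
# the attitude adjustment is the identity and the offset is zero.
R = np.eye(3)
attitude, offset = motion_control_parameters(R, R)
```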
After the motion control parameters of the robot arm are obtained, if the gripper at the end of the robot arm holds the first electronic device, the first electronic device moves together with the gripper when the gripper is controlled to move according to the motion control parameters. Once the gripper of the mechanical arm completes the movement, the first electronic device has moved to the second pose, in which it can capture a second picture that matches the composition of the reference picture.
It should be noted that what the robot arm controls is the gripper. In the embodiment of the present application, the gripper end is calibrated as a user coordinate system, so the motion can be controlled directly by giving offset values in the user coordinate system of the robot arm.
In the embodiment of the application, the first camera external reference information and the second camera external reference information in the form of rotation vectors are respectively converted into corresponding rotation matrixes, so that the motion control parameters of the mechanical arm can be conveniently obtained, and the accurate control of shooting alignment is realized.
The embodiments of the present application will be further described with reference to fig. 11.
Fig. 11 is a schematic flowchart of an embodiment of a capture alignment method provided in the present application. As shown in fig. 11, the imaging alignment method includes:
s202, calibrating internal reference information of second electronic equipment for shooting a reference picture and a camera distortion coefficient of the second electronic equipment by using a checkerboard;
s204, acquiring a first distance between two positioning marks on a reference picture;
s206, determining world coordinates of four corner points on the reference picture according to the first distance and the actual space distance of the two positioning targets;
s208, determining first camera external reference information of a camera for shooting the reference picture according to the internal reference information of the second electronic equipment for shooting the reference picture, the camera distortion coefficient of the second electronic equipment, world coordinates of four corner points and first coordinates of the four corner points on the reference picture;
s210, under the condition that a first picture is shot by first electronic equipment on the mechanical arm, carrying out feature detection processing on the reference picture and the first picture to obtain a conversion matrix from the reference picture to the first picture;
s212, performing perspective transformation according to the transformation matrix and first coordinates of the four corner points on the reference picture to obtain second coordinates of the four corner points on the first picture;
s214, determining second camera external reference information of the first electronic device according to the internal reference information of the first electronic device, the camera distortion coefficient of the first electronic device, world coordinates of four corner points and second coordinates of the four corner points on the first picture;
s216, converting the first camera external reference information represented by the rotation amount into a first rotation matrix based on the world coordinate system, and converting the second camera external reference information represented by the rotation amount into a second rotation matrix based on the world coordinate system;
s218, multiplying the first rotation matrix by the transpose matrix of the second rotation matrix to obtain a target rotation matrix from the first picture to the reference picture;
s220, determining the target rotation matrix as a position deviation value of the tail end of the clamp of the mechanical arm, and subtracting the first rotation matrix from the second rotation matrix to obtain the attitude adjustment Euler angle of the tail end of the mechanical arm.
Corresponding to the photographing alignment method provided by the present application, the present application further provides a photographing alignment device, and fig. 12 is a schematic structural diagram of an embodiment of the photographing alignment device provided by the present application. As shown in fig. 12, the imaging alignment apparatus 300 includes:
the first obtaining module 302 is configured to obtain a first picture and a pre-stored reference picture, where the first picture is a picture taken by the first electronic device in the first position controlled by the mechanical arm, and the reference picture is a picture taken of a target scene;
a second obtaining module 304, configured to obtain first camera external reference information when the reference picture is taken, and obtain second camera external reference information when the first picture is taken;
a first determining module 306, configured to determine a motion control parameter of the mechanical arm according to the first camera external reference information and the second camera external reference information;
and the control module 308 is configured to control the mechanical arm to move according to the motion control parameter, wherein the first electronic device moves from the first position to the second position along with the mechanical arm, so that the first electronic device captures a second picture matching with the composition of the reference picture in the second position.
The shooting alignment in the embodiment of the application is performed according to a pre-stored reference picture and an actually shot first picture; the objects and scene in the reference picture can be replaced, and there is no requirement for a fixed scene. Therefore, if the shooting effect of the first electronic device needs to be tested in multiple scenes (such as close-up shooting and long-distance shooting), providing a reference picture taken in each scene is enough to complete the shooting effect test for that scene, which makes rapid switching tests across multiple scenes possible. In addition, the embodiment of the application is applicable to shooting effect tests of different cameras, thereby meeting the requirements of multiple scenes and multiple cameras. Moreover, since the embodiment of the application does not assume a fixed subject, it is applicable to shooting pose alignment for any object, and the captured picture can match the composition of the intended reference picture to a greater extent.
In one or more embodiments of the present application, the first camera external reference information represents a conversion relationship from world coordinates to first coordinates, and the second camera external reference information represents a conversion relationship from world coordinates to second coordinates, where the world coordinates are coordinates of a plurality of target points on a reference picture in a world coordinate system, the first coordinates are coordinates of the plurality of target points on the reference picture, and the second coordinates are coordinates of the plurality of target points on the first picture.
In one or more embodiments of the present application, at least two locators are included in a reference picture; the imaging alignment device 300 may further include:
the third acquisition module is used for acquiring a first distance between two positioning marks on a reference picture;
the second determining module is used for determining the world coordinates of the target points in the world coordinate system according to the first distance and the actual space distance of the two positioning targets;
the first calculation module is used for calculating the world coordinate, the first coordinate, the internal reference information of the second electronic equipment for shooting the reference picture and the camera distortion coefficient of the second electronic equipment according to a preset camera external reference calculation algorithm to obtain first camera external reference information.
In one or more embodiments of the present application, the second determining module may include:
a first acquisition unit, configured to acquire first relative position information of a plurality of target points on a reference picture;
the first determining unit is used for determining second relative position information of the plurality of target points in the actual space according to a first ratio between the first distance and the actual space distance and the first relative position information, wherein the ratio between the first relative position information and the second relative position information is consistent with the first ratio;
and the second determining unit is used for determining the world coordinates according to the second relative position information.
In one or more embodiments of the present application, the photographing alignment apparatus 300 may further include:
the processing module is used for carrying out feature detection processing on the reference picture and the first picture to obtain a conversion matrix from the reference picture to the first picture;
the perspective transformation module is used for carrying out perspective transformation according to the transformation matrix and the first coordinate to obtain a second coordinate;
and the second calculation module is used for calculating the world coordinate, the second coordinate, the internal reference information of the first electronic equipment and the camera distortion coefficient of the first electronic equipment according to a preset camera external reference calculation algorithm to obtain second camera external reference information.
In one or more embodiments of the present application, the first camera external reference information and the second camera external reference information are both represented by the form of a rotation vector; the first determination module 306 includes:
the conversion unit is used for respectively converting the first camera external reference information and the second camera external reference information into rotation matrixes to obtain a first rotation matrix corresponding to the first camera external reference information and a second rotation matrix corresponding to the second camera external reference information;
and the third determining unit is used for determining the motion control parameters of the mechanical arm according to the first rotation matrix and the second rotation matrix.
In one or more embodiments of the present application, the motion control parameters include offset information of the robot arm and a pose adjustment angle; the third determination unit is configured to:
and multiplying the first rotation matrix by the transposed matrix of the second rotation matrix to obtain an attitude adjustment angle, and subtracting the first rotation matrix from the second rotation matrix to obtain offset information.
In the photographing alignment method provided in the embodiment of the present application, the executing body may be a photographing alignment device, or a control module in the photographing alignment device for executing the photographing alignment method. In the embodiment of the present application, a photographing alignment device is taken as an example to execute a photographing alignment method, and the photographing alignment device provided in the embodiment of the present application is described.
The shooting alignment device in the embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The device can be mobile electronic equipment or non-mobile electronic equipment. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine or a self-service machine, and the like, and the embodiments of the present application are not particularly limited.
The shooting alignment device in the embodiment of the present application may be a device having an operating system. The operating system may be an Android (Android) operating system, an ios operating system, or other possible operating systems, and embodiments of the present application are not limited specifically.
The shooting alignment device provided in the embodiment of the present application can implement each process implemented by the method embodiment of fig. 2 or fig. 11, and is not described here again to avoid repetition.
The present application further provides an electronic device, which includes a processor, a memory, and a program or an instruction stored in the memory and executable on the processor, where the program or the instruction when executed by the processor implements the steps of the shooting alignment method according to any one of the above embodiments.
As shown in fig. 13, an embodiment of the present application further provides a control device 400, which includes a processor 401, a memory 402, and a program or an instruction stored in the memory 402 and capable of running on the processor 401, where the program or the instruction is executed by the processor 401 to implement each process of the foregoing shooting alignment method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
Fig. 13 is a schematic structural diagram of another embodiment of a control device provided in the present application.
The control device 500 includes, but is not limited to: radio frequency unit 501, network module 502, audio output unit 503, input unit 504, sensor 505, display unit 506, user input unit 507, interface unit 508, memory 509, processor 510, and the like.
Those skilled in the art will appreciate that the control device 500 may further include a power supply (e.g., a battery) for supplying power to various components, and the power supply may be logically connected to the processor 510 via a power management system, so as to implement functions of managing charging, discharging, and power consumption via the power management system. The device structure shown in fig. 14 does not constitute a limitation of the control device, which may include more or less components than those shown, or combine some components, or a different arrangement of components, and will not be described again here.
Wherein processor 510 is configured to:
acquiring a first picture and a pre-stored reference picture, wherein the first picture is a picture shot by first electronic equipment in a first position controlled by a mechanical arm;
acquiring first camera external reference information when a reference picture is shot and acquiring second camera external reference information when the first picture is shot;
determining motion control parameters of the mechanical arm according to the first camera external reference information and the second camera external reference information;
and controlling the mechanical arm to move according to the motion control parameters, wherein the first electronic equipment moves from the first position posture to the second position posture along with the mechanical arm, so that the first electronic equipment shoots a second picture matched with the composition of the reference picture in the second position posture.
The shooting alignment in the embodiment of the application is performed according to a pre-stored reference picture and an actually shot first picture; the objects and scene in the reference picture can be replaced, and there is no requirement for a fixed scene. Therefore, if the shooting effect of the first electronic device needs to be tested in multiple scenes (such as close-up shooting and long-distance shooting), providing a reference picture taken in each scene is enough to complete the shooting effect test for that scene, which makes rapid switching tests across multiple scenes possible. In addition, the embodiment of the application is applicable to shooting effect tests of different cameras, thereby meeting the requirements of multiple scenes and multiple cameras. Moreover, since the embodiment of the application does not assume a fixed subject, it is applicable to shooting pose alignment for any object, and the captured picture can match the composition of the intended reference picture to a greater extent.
In one or more embodiments of the present application, the first camera external reference information represents a conversion relationship from world coordinates to first coordinates, and the second camera external reference information represents a conversion relationship from world coordinates to second coordinates, where the world coordinates are coordinates of a plurality of target points on a reference picture in a world coordinate system, the first coordinates are coordinates of the plurality of target points on the reference picture, and the second coordinates are coordinates of the plurality of target points on the first picture.
In one or more embodiments of the present application, processor 510 is specifically configured to:
acquiring a first distance between two positioning marks on a reference picture;
determining world coordinates of the target points in a world coordinate system according to the first distance and the actual space distance of the two positioning marks;
and calculating the world coordinate, the first coordinate, the internal reference information of the second electronic equipment for shooting the reference picture and the camera distortion coefficient of the second electronic equipment according to a preset camera external reference calculation algorithm to obtain the first camera external reference information.
In one or more embodiments of the present application, processor 510 is specifically configured to:
acquiring first relative position information of a plurality of target points on a reference picture;
determining second relative position information of the plurality of target points in the actual space according to a first ratio between the first distance and the actual space distance and the first relative position information, wherein the ratio between the first relative position information and the second relative position information is consistent with the first ratio;
world coordinates are determined from the second relative position information.
In one or more embodiments of the present application, processor 510 is specifically configured to:
performing feature detection processing on the reference picture and the first picture to obtain a conversion matrix from the reference picture to the first picture;
performing perspective transformation according to the transformation matrix and the first coordinate to obtain a second coordinate;
and calculating the world coordinate, the second coordinate, the internal reference information of the first electronic equipment and the camera distortion coefficient of the first electronic equipment according to a preset camera external reference calculation algorithm to obtain second camera external reference information.
In one or more embodiments of the present application, the first camera external reference information and the second camera external reference information are both represented by the form of a rotation vector; processor 510 is specifically configured to:
respectively converting the first camera external reference information and the second camera external reference information into rotation matrixes to obtain a first rotation matrix corresponding to the first camera external reference information and a second rotation matrix corresponding to the second camera external reference information;
and determining the motion control parameters of the mechanical arm according to the first rotation matrix and the second rotation matrix.
In one or more embodiments of the present application, processor 510 is specifically configured to:
and multiplying the first rotation matrix by the transposed matrix of the second rotation matrix to obtain an attitude adjustment angle, and subtracting the first rotation matrix from the second rotation matrix to obtain offset information.
It should be understood that in the embodiment of the present application, the input Unit 504 may include a Graphics Processing Unit (GPU) 5041 and a microphone 5042, and the Graphics processor 5041 processes image data of still pictures or videos obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 506 may include a display panel 5061, and the display panel 5061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 507 includes a touch panel 5071 and other input devices 5072. A touch panel 5071, also referred to as a touch screen. The touch panel 5071 may include two parts of a touch detection device and a touch controller. Other input devices 5072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in further detail herein. The memory 509 may be used to store software programs as well as various data including, but not limited to, application programs and operating systems. The processor 510 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communication. It will be appreciated that the modem processor described above may not be integrated into processor 510.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the above-mentioned shooting alignment method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
Wherein the processor is the processor in the control device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each process of the foregoing shooting alignment method embodiment, and can achieve the same technical effect, and is not described here again to avoid repetition.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of another identical element in the process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatuses of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed; the functions may also be performed in a substantially simultaneous manner or in the reverse order, e.g., the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a computer software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.
Claims (11)
1. A shooting alignment method, applied to a control device, characterized in that the method comprises:
acquiring a first picture and a pre-stored reference picture, wherein the first picture is a picture shot by first electronic equipment under the control of a mechanical arm in a first position, and the reference picture is a picture obtained by shooting a target scene;
acquiring first camera external reference information when the reference picture is shot and acquiring second camera external reference information when the first picture is shot;
determining motion control parameters of the mechanical arm according to the first camera external reference information and the second camera external reference information;
controlling the mechanical arm to move according to the motion control parameters, wherein the first electronic equipment moves from the first position to a second position along with the mechanical arm, so that the first electronic equipment shoots a second picture matched with the composition of the reference picture in the second position;
the first camera external reference information represents a conversion relation from a world coordinate to a first coordinate, and the second camera external reference information represents a conversion relation from the world coordinate to a second coordinate, wherein the world coordinate is a coordinate of a plurality of target points on the reference picture in a world coordinate system, the first coordinate is a coordinate of the plurality of target points on the reference picture, and the second coordinate is a coordinate of the plurality of target points on the first picture;
the reference picture comprises at least two positioning marks; before the obtaining of the first camera external reference information when the reference picture is taken, the method further includes:
acquiring a first distance between the two positioning marks on the reference picture;
determining world coordinates of the plurality of target points in the world coordinate system according to the first distance and the actual spatial distance between the two positioning marks;
and calculating the world coordinate, the first coordinate, the internal reference information of the second electronic equipment for shooting the reference picture and the camera distortion coefficient of the second electronic equipment according to a preset camera external reference calculation algorithm to obtain the first camera external reference information.
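For illustration, the final step of claim 1 — solving for the first camera external reference information (the camera extrinsics) from the world coordinates, the first coordinates, and the camera parameters — can be sketched as follows. The claim does not name the "preset camera external reference calculation algorithm"; since the target points lie on a picture and are therefore coplanar, one well-known option is homography decomposition, shown here for the distortion-free case. Function and variable names are illustrative, not part of the claims.

```python
import numpy as np

def extrinsics_from_planar_points(world_xy, image_xy, K):
    """Recover camera extrinsics (R, t) from >= 4 coplanar world points
    (Z = 0 plane) and their pixel coordinates, via homography decomposition.
    Distortion is assumed to have been corrected beforehand."""
    # Direct Linear Transform: two equations per point correspondence.
    A = []
    for (X, Y), (u, v) in zip(world_xy, image_xy):
        A.append([X, Y, 1, 0, 0, 0, -u * X, -u * Y, -u])
        A.append([0, 0, 0, X, Y, 1, -v * X, -v * Y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)          # world-plane-to-image homography, up to scale

    M = np.linalg.inv(K) @ H          # proportional to [r1 r2 t]
    if M[2, 2] < 0:                   # keep the scene in front of the camera (t_z > 0)
        M = -M
    M /= np.linalg.norm(M[:, 0])      # fix the scale with ||r1|| = 1
    r1, r2, t = M[:, 0], M[:, 1], M[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    return R, t
```

In practice a PnP solver (for example OpenCV's `solvePnP`, which also accepts the distortion coefficients mentioned in the claim) would typically replace this hand-rolled decomposition.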
2. The method of claim 1, wherein the determining the world coordinates of the plurality of target points in the world coordinate system according to the first distance and the actual spatial distance between the two positioning marks comprises:
acquiring first relative position information of a plurality of target points on the reference picture;
determining second relative position information of the target points in the actual space according to a first ratio between the first distance and the actual space distance and the first relative position information, wherein the ratio between the first relative position information and the second relative position information is consistent with the first ratio;
and determining the world coordinates according to the second relative position information.
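The scaling in claim 2 reduces to a single ratio: metric units per pixel, fixed by the two positioning marks whose real-world separation is known. A minimal sketch, with illustrative names, assuming the marks and target points are given as pixel coordinates:

```python
import numpy as np

def pixel_points_to_world_xy(target_px, mark_a_px, mark_b_px, real_distance):
    """Map target points from pixel coordinates to metric coordinates on the
    picture plane. `real_distance` is the actual spatial distance between the
    two positioning marks; the "first ratio" of claim 2 is real/pixel."""
    target_px = np.asarray(target_px, dtype=float)
    first_distance = np.linalg.norm(np.subtract(mark_b_px, mark_a_px))  # in pixels
    scale = real_distance / first_distance      # metres per pixel
    # Relative position of each target w.r.t. mark A, scaled into metric units.
    return (target_px - np.asarray(mark_a_px, dtype=float)) * scale
```

For example, if the marks are 100 px apart on the picture and 0.5 m apart in space, the ratio is 0.005 m/px, so a target point 40 px to the right of mark A lies 0.2 m from it.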
3. The method of claim 1, wherein prior to obtaining the second camera external reference information when taking the first picture, the method further comprises:
performing feature detection processing on the reference picture and the first picture to obtain a conversion matrix from the reference picture to the first picture;
performing perspective transformation according to the transformation matrix and the first coordinate to obtain the second coordinate;
and calculating the world coordinate, the second coordinate, the internal reference information of the first electronic equipment and the camera distortion coefficient of the first electronic equipment according to a preset camera external reference calculation algorithm to obtain the second camera external reference information.
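Once the feature-detection stage of claim 3 has produced the conversion matrix from the reference picture to the first picture (in practice a 3x3 homography, e.g. estimated from ORB or SIFT matches with RANSAC), the perspective transformation of the first coordinates is a matrix product in homogeneous coordinates. A minimal sketch with illustrative names:

```python
import numpy as np

def perspective_transform(points, H):
    """Apply a 3x3 conversion matrix H (reference picture -> first picture)
    to an array of 2D points, as in claim 3."""
    pts = np.asarray(points, dtype=float)
    homo = np.hstack([pts, np.ones((len(pts), 1))])   # lift to homogeneous coords
    mapped = homo @ H.T                               # apply H to each point
    return mapped[:, :2] / mapped[:, 2:3]             # divide out the scale
```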
4. The method of claim 1, wherein the first camera external reference information and the second camera external reference information are both represented in the form of rotation vectors;
determining motion control parameters of the mechanical arm according to the first camera external reference information and the second camera external reference information, including:
respectively converting the first camera external reference information and the second camera external reference information into rotation matrixes to obtain a first rotation matrix corresponding to the first camera external reference information and a second rotation matrix corresponding to the second camera external reference information;
and determining the motion control parameters of the mechanical arm according to the first rotation matrix and the second rotation matrix.
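The conversion in claim 4 from a rotation vector (axis-angle form) to a rotation matrix is Rodrigues' formula. A minimal NumPy sketch with illustrative names; in practice `cv2.Rodrigues` performs the same conversion:

```python
import numpy as np

def rotvec_to_matrix(rvec):
    """Convert an axis-angle rotation vector (the representation the claims
    use for camera external reference information) to a 3x3 rotation matrix."""
    rvec = np.asarray(rvec, dtype=float)
    theta = np.linalg.norm(rvec)       # rotation angle is the vector's length
    if theta < 1e-12:
        return np.eye(3)               # zero rotation
    k = rvec / theta                   # unit rotation axis
    K = np.array([[0, -k[2], k[1]],
                  [k[2], 0, -k[0]],
                  [-k[1], k[0], 0]])   # cross-product (skew-symmetric) matrix
    # Rodrigues: R = I + sin(theta) K + (1 - cos(theta)) K^2
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)
```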
5. The method of claim 4, wherein the motion control parameters include offset information and a pose adjustment angle of the robotic arm; the determining motion control parameters of the mechanical arm according to the first rotation matrix and the second rotation matrix comprises:
multiplying the first rotation matrix by the transposed matrix of the second rotation matrix to obtain the attitude adjustment angle, and subtracting the first rotation matrix from the second rotation matrix to obtain the offset information.
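The arithmetic of claim 5 can be sketched directly. Note that reducing the relative rotation to a single scalar angle via its trace is our illustration of one way to read "attitude adjustment angle"; the claim itself specifies only the matrix product and the subtraction. Names are illustrative.

```python
import numpy as np

def motion_control_parameters(R1, R2):
    """Claim 5: attitude adjustment from R1 @ R2.T (the rotation relating the
    two camera poses), offset information from R2 - R1 (element-wise)."""
    R_rel = R1 @ R2.T                            # relative rotation, pose 2 -> pose 1
    offset = R2 - R1                             # element-wise subtraction
    # Angle of the relative rotation, from trace(R) = 1 + 2*cos(theta).
    cos_theta = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    angle_deg = np.degrees(np.arccos(cos_theta))
    return R_rel, angle_deg, offset
```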
6. A shooting alignment apparatus, applied to a control device, characterized in that the apparatus comprises:
a first acquisition module, configured to acquire a first picture and a pre-stored reference picture, wherein the first picture is a picture shot by first electronic equipment in a first position controlled by a mechanical arm, and the reference picture is a picture obtained by shooting a target scene;
a second acquisition module, configured to acquire first camera external reference information when the reference picture is shot and acquire second camera external reference information when the first picture is shot;
a first determining module, configured to determine motion control parameters of the mechanical arm according to the first camera external reference information and the second camera external reference information;
a control module, configured to control the mechanical arm to move according to the motion control parameters, wherein the first electronic equipment moves from the first position to a second position along with the mechanical arm, so that the first electronic equipment shoots, at the second position, a second picture matched with the composition of the reference picture;
the first camera external reference information represents a conversion relation from a world coordinate to a first coordinate, and the second camera external reference information represents a conversion relation from the world coordinate to a second coordinate, wherein the world coordinate is a coordinate of a plurality of target points on the reference picture in a world coordinate system, the first coordinate is a coordinate of the plurality of target points on the reference picture, and the second coordinate is a coordinate of the plurality of target points on the first picture;
the reference picture comprises at least two positioning marks; the device further comprises:
a third acquisition module, configured to acquire a first distance between the two positioning marks on the reference picture;
a second determining module, configured to determine world coordinates of the plurality of target points in the world coordinate system according to the first distance and the actual spatial distance between the two positioning marks;
a first calculation module, configured to calculate the world coordinate, the first coordinate, the internal reference information of the second electronic equipment that shoots the reference picture, and the camera distortion coefficient of the second electronic equipment according to a preset camera external reference calculation algorithm, to obtain the first camera external reference information.
7. The apparatus of claim 6, wherein the second determining module comprises:
a first obtaining unit, configured to obtain first relative position information of a plurality of target points on the reference picture;
a first determining unit, configured to determine second relative position information of the plurality of target points in the actual space according to a first ratio between the first distance and the actual space distance and the first relative position information, where a ratio between the first relative position information and the second relative position information is consistent with the first ratio;
and the second determining unit is used for determining the world coordinates according to the second relative position information.
8. The apparatus of claim 6, further comprising:
the processing module is used for carrying out feature detection processing on the reference picture and the first picture to obtain a conversion matrix from the reference picture to the first picture;
the perspective transformation module is used for carrying out perspective transformation according to the transformation matrix and the first coordinate to obtain the second coordinate;
and the second calculation module is used for calculating the world coordinate, the second coordinate, the internal reference information of the first electronic equipment and the camera distortion coefficient of the first electronic equipment according to a preset camera external reference calculation algorithm to obtain the second camera external reference information.
9. The apparatus of claim 6, wherein the first camera extrinsic reference information and the second camera extrinsic reference information are each represented in the form of a rotation vector;
the first determining module includes:
a conversion unit, configured to convert the first camera external reference information and the second camera external reference information into rotation matrices, respectively, to obtain a first rotation matrix corresponding to the first camera external reference information and a second rotation matrix corresponding to the second camera external reference information;
and the third determining unit is used for determining the motion control parameters of the mechanical arm according to the first rotation matrix and the second rotation matrix.
10. The apparatus of claim 9, wherein the motion control parameters include offset information and a pose adjustment angle of the robotic arm; the third determination unit is configured to:
multiplying the first rotation matrix by a transposed matrix of the second rotation matrix to obtain the attitude adjustment angle, and subtracting the first rotation matrix from the second rotation matrix to obtain the offset information.
11. A control device comprising a processor, a memory, and a program or instructions stored on the memory and executable on the processor, wherein the program or instructions, when executed by the processor, implement the steps of the shooting alignment method according to any one of claims 1 to 5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110599186.7A CN113329179B (en) | 2021-05-31 | 2021-05-31 | Shooting alignment method, device, equipment and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110599186.7A CN113329179B (en) | 2021-05-31 | 2021-05-31 | Shooting alignment method, device, equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113329179A CN113329179A (en) | 2021-08-31 |
CN113329179B true CN113329179B (en) | 2023-04-07 |
Family
ID=77422685
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110599186.7A Active CN113329179B (en) | 2021-05-31 | 2021-05-31 | Shooting alignment method, device, equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113329179B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113506299B (en) * | 2021-09-13 | 2021-12-10 | 武汉逸飞激光股份有限公司 | Soft-package battery cell feeding control method and device, electronic equipment and storage medium |
CN114004890B (en) * | 2021-11-04 | 2023-03-24 | 如你所视(北京)科技有限公司 | Attitude determination method and apparatus, electronic device, and storage medium |
CN114449165B (en) * | 2021-12-27 | 2023-07-18 | 广州极飞科技股份有限公司 | Photographing control method and device, unmanned equipment and storage medium |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112837381A (en) * | 2021-02-09 | 2021-05-25 | 上海振华重工(集团)股份有限公司 | Camera calibration method, system and equipment suitable for driving equipment |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108347559A (en) * | 2018-01-05 | 2018-07-31 | 深圳市金立通信设备有限公司 | A kind of image pickup method, terminal and computer readable storage medium |
CN110933297B (en) * | 2019-11-12 | 2021-11-23 | 武汉联一合立技术有限公司 | Photographing control method and device of intelligent photographing system, storage medium and system |
WO2021102914A1 (en) * | 2019-11-29 | 2021-06-03 | 深圳市大疆创新科技有限公司 | Trajectory repeating method and system, movable platform and storage medium |
CN111654624B (en) * | 2020-05-29 | 2021-12-24 | 维沃移动通信有限公司 | Shooting prompting method and device and electronic equipment |
CN112489113B (en) * | 2020-11-25 | 2024-06-11 | 深圳地平线机器人科技有限公司 | Camera external parameter calibration method and device and camera external parameter calibration system |
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112837381A (en) * | 2021-02-09 | 2021-05-25 | 上海振华重工(集团)股份有限公司 | Camera calibration method, system and equipment suitable for driving equipment |
Also Published As
Publication number | Publication date |
---|---|
CN113329179A (en) | 2021-08-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113329179B (en) | Shooting alignment method, device, equipment and storage medium | |
CN108765498B (en) | Monocular vision tracking, device and storage medium | |
US11039121B2 (en) | Calibration apparatus, chart for calibration, chart pattern generation apparatus, and calibration method | |
CN111750820B (en) | Image positioning method and system | |
CN113841384B (en) | Calibration device, chart for calibration and calibration method | |
CN108288294A (en) | A kind of outer ginseng scaling method of a 3D phases group of planes | |
US20120148145A1 (en) | System and method for finding correspondence between cameras in a three-dimensional vision system | |
CN112132874B (en) | Calibration-plate-free heterogeneous image registration method and device, electronic equipment and storage medium | |
CN107578450B (en) | Method and system for calibrating assembly error of panoramic camera | |
KR20180105875A (en) | Camera calibration method using single image and apparatus therefor | |
CN114022560A (en) | Calibration method and related device and equipment | |
CN112270719A (en) | Camera calibration method, device and system | |
CN113112545B (en) | Handheld mobile printing device positioning method based on computer vision | |
CN104504691B (en) | Camera position and posture measuring method on basis of low-rank textures | |
CN112307912A (en) | Method and system for determining personnel track based on camera | |
CN114998447A (en) | Multi-view vision calibration method and system | |
CN117173254A (en) | Camera calibration method, system, device and electronic equipment | |
CN111383264A (en) | Positioning method, positioning device, terminal and computer storage medium | |
CN110991306B (en) | Self-adaptive wide-field high-resolution intelligent sensing method and system | |
CN112950528A (en) | Certificate posture determining method, model training method, device, server and medium | |
CN117788686A (en) | Three-dimensional scene reconstruction method and device based on 2D image and electronic equipment | |
CN112288824B (en) | Device and method for calibrating tele camera based on real scene | |
CN102110292B (en) | Zoom lens calibration method and device in virtual sports | |
CN111445453A (en) | Method, system, medium, and apparatus for determining deviation of key image acquired by camera | |
CN113538588A (en) | Calibration method, calibration device and electronic equipment applying calibration device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||
GR01 | Patent grant ||