CN114500842A - Visual inertia calibration method and device - Google Patents

Visual inertia calibration method and device

Info

Publication number
CN114500842A
Authority
CN
China
Prior art keywords
parameter
image data
target motion
calibration
parameters
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210096751.2A
Other languages
Chinese (zh)
Inventor
叶伟文
周强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202210096751.2A priority Critical patent/CN114500842A/en
Publication of CN114500842A publication Critical patent/CN114500842A/en
Pending legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/68 Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N 23/681 Motion detection
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/10 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 by using measurements of speed or acceleration
    • G01C 21/12 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C 21/16 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/68 Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N 23/682 Vibration or motion blur correction
    • H04N 23/683 Vibration or motion blur correction performed by a processor, e.g. controlling the readout of an image memory
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30108 Industrial image inspection
    • G06T 2207/30128 Food products
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30241 Trajectory

Abstract

The application discloses a visual inertia calibration method and a device thereof, belonging to the technical field of vision. The method comprises the following steps: in the process of driving the electronic equipment to rotate according to the target motion parameters through the displacement table, acquiring image data acquired by an image sensor of the electronic equipment and attitude data acquired by an inertial measurement unit of the electronic equipment; determining parameter values of calibration parameters according to the image data, the attitude data and the target motion parameters based on set constraint conditions reflecting the relationship among the image data, the attitude data and the target motion parameters; the calibration parameters are parameters for performing anti-shake compensation on the image data acquired by the image sensor through the attitude data acquired by the inertial measurement unit; saving the determined parameter values in the electronic device for the anti-shake compensation.

Description

Visual inertia calibration method and device
Technical Field
The application belongs to the technical field of vision, and particularly relates to a visual inertia calibration method and a device thereof.
Background
With the high popularity of the mobile internet and smart devices, shooting video has become a new way for people to record and share their lives.
In the process of shooting video, mobile equipment inevitably shakes, which affects the shooting effect. For this reason, an Inertial Measurement Unit (IMU) is usually used to perform motion estimation, and anti-shake compensation is applied according to the estimation result, so as to address the shake problem of the mobile device.
In order to obtain a better anti-shake effect, the mobile device needs to undergo visual inertial calibration before leaving the factory. In the related art, visual inertial calibration of the mobile device is generally implemented by shaking the device by hand and matching the feature points of the previous and subsequent frames obtained by the image sensor against the motion trajectory of the inertial measurement unit. However, the calibration accuracy of this approach is poor, which affects the anti-shake effect.
Disclosure of Invention
The embodiment of the application aims to provide a visual inertia calibration method and a device thereof, which can solve the problem of poor calibration precision of a visual inertia calibration method in the prior art.
In a first aspect, an embodiment of the present application provides a visual inertia calibration method, where the method includes:
in the process of driving the electronic equipment to rotate according to target motion parameters through the displacement table, acquiring image data acquired by an image sensor of the electronic equipment and attitude data acquired by an inertial measurement unit of the electronic equipment;
determining parameter values of calibration parameters according to the image data, the attitude data and the target motion parameters based on set constraint conditions reflecting the relationship among the image data, the attitude data and the target motion parameters; the calibration parameters are parameters for performing anti-shake compensation on the image data acquired by the image sensor through the attitude data acquired by the inertial measurement unit;
saving the determined parameter values in the electronic device for the anti-shake compensation.
In a second aspect, an embodiment of the present application provides a visual inertial calibration apparatus, including:
the acquisition module is used for acquiring image data acquired by an image sensor of the electronic equipment and attitude data acquired by an inertial measurement unit of the electronic equipment in the process of driving the electronic equipment to rotate according to target motion parameters through the displacement table;
a determining module, configured to determine a parameter value of a calibration parameter according to the image data, the attitude data, and the target motion parameter based on a set constraint condition that reflects a relationship among the image data, the attitude data, and the target motion parameter; the calibration parameters are parameters for performing anti-shake compensation on the image data acquired by the image sensor through the attitude data acquired by the inertial measurement unit;
and the storage module is used for storing the determined parameter value in the electronic equipment for the anti-shake compensation.
In a third aspect, embodiments of the present application provide an electronic device, which includes a processor and a memory, where the memory stores a program or instructions executable on the processor, and the program or instructions, when executed by the processor, implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In a sixth aspect, embodiments of the present application provide a computer program product, stored on a storage medium, for execution by at least one processor to implement the method according to the first aspect.
In the embodiment of the application, the electronic equipment is driven by the displacement table to rotate according to the target motion parameters, and the parameter value of the calibration parameter is determined, based on the set constraint condition reflecting the relationship among the image data, the attitude data and the target motion parameters, from the image data collected by the image sensor of the electronic equipment, the attitude data collected by the inertial measurement unit of the electronic equipment, and the target motion parameters of the displacement table. In this way, the attitude data of the electronic equipment is matched against the known target motion parameters of the displacement table, and the motion trajectories of the feature points in the image data are matched against the same known target motion parameters, so that visual inertial calibration of the electronic equipment can be achieved and a high-precision calibration result obtained. Performing anti-shake compensation on the image data collected by the image sensor according to the obtained calibration parameters then improves the anti-shake effect of the electronic equipment.
Drawings
Fig. 1 is a schematic flow chart of a visual inertia calibration method according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a relative relationship between an image sensor coordinate system, an inertial measurement unit coordinate system, and a displacement table coordinate system provided in an embodiment of the present application;
FIG. 3 is a schematic structural diagram of a visual inertial calibration apparatus provided in an embodiment of the present application;
fig. 4 is a schematic structural diagram of an electronic device provided in an embodiment of the present application;
fig. 5 is a schematic structural diagram of another electronic device provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments that can be derived by one of ordinary skill in the art from the embodiments given herein are intended to be within the scope of the present disclosure.
The terms "first", "second" and the like in the description and claims of the present application are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It should be understood that the data so used may be interchanged under appropriate circumstances, so that the embodiments of the application may be practiced in sequences other than those illustrated or described herein. Moreover, the terms "first", "second" and the like are generally used herein in a generic sense and do not limit the number of the objects; for example, the first object may be one or more than one. In addition, "and/or" in the description and claims means at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the preceding and succeeding objects.
In the process of shooting video, electronic equipment inevitably shakes, which affects the shooting effect. For this reason, an Inertial Measurement Unit (IMU) is generally used to perform motion estimation, and anti-shake compensation is applied according to the estimation result to address the shake problem of the mobile device.
However, there is an included angle between the coordinate system of the inertial measurement unit and the coordinate system of the image sensor, and a certain time delay exists between the time at which the inertial measurement unit acquires attitude data and the time at which the image sensor acquires image data. Therefore, in order to improve the anti-shake effect of the electronic device, a visual inertial calibration process needs to be performed when the electronic device leaves the factory, i.e., calibration of the alignment between the inertial measurement unit coordinate system and the image sensor coordinate system, and calibration of the synchronization between the time stamps of the attitude data acquired by the inertial measurement unit and the time stamps of the image data acquired by the image sensor.
In the related art, visual inertial calibration of the electronic device is usually performed outdoors: the device is shaken by hand, and the feature points of the previous and subsequent frames obtained by the image sensor are matched against the motion trajectory of the inertial measurement unit. However, the calibration accuracy of this approach is poor, which affects the anti-shake effect.
In order to solve the above problem, an embodiment of the present application provides a visual inertial calibration method, in which the electronic device is driven by a displacement stage to rotate according to a target motion parameter, and the parameter value of a calibration parameter is determined from the image data, the attitude data and the target motion parameter based on a set constraint condition reflecting the relationship among the three. This completes the visual inertial calibration with a high-precision calibration result, thereby improving the anti-shake effect of the electronic device.
The visual inertia calibration method provided by the embodiment of the present application is described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
Referring to fig. 1, a method for calibrating visual inertia according to an embodiment of the present application may include steps 1100-1300, which are described in detail below.
Step 1100, in the process of driving the electronic device to rotate according to the target motion parameters through the displacement table, acquiring image data acquired by an image sensor of the electronic device and attitude data acquired by an inertial measurement unit of the electronic device.
In this embodiment, the displacement table may be configured to drive the electronic device to rotate according to the target motion parameter. The displacement stage may be, for example, a six-axis hexapod displacement stage, and can drive the electronic device to rotate at different frequencies and through different angles. The target motion parameters may be set according to the jitter data of the electronic device, and may be, for example, the rotation angle of the displacement table.
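For illustration, the following is a minimal sketch of how such a target motion profile might be generated. The sinusoidal form, the function name and all numeric values are assumptions; the text only states that the parameters are set according to the jitter data.

```python
import numpy as np

def make_target_motion(freq_hz=2.0, amp_deg=1.5, duration_s=5.0, rate_hz=200.0):
    # Hypothetical hand-shake-like profile: a single-axis sinusoidal rotation
    # that a displacement table could replay at a fixed command rate.
    t = np.arange(0.0, duration_s, 1.0 / rate_hz)
    angle_deg = amp_deg * np.sin(2.0 * np.pi * freq_hz * t)
    return t, angle_deg  # command timestamps and target rotation angles A_DOF
```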
The image data captured by the image sensor of the electronic device may be a video, i.e., a sequence of video frames, captured by the image sensor of the electronic device. The image data may include N frames of images, where N is a positive integer greater than 1.
The attitude data collected by the inertial measurement unit of the electronic device may be used to indicate an attitude at which the image sensor of the electronic device collects the image data.
Step 1200, determining parameter values of calibration parameters according to the image data, the attitude data and the target motion parameters based on set constraint conditions reflecting the relationship among the image data, the attitude data and the target motion parameters; the calibration parameters are parameters for performing anti-shake compensation on the image data acquired by the image sensor through the attitude data acquired by the inertial measurement unit.
In this embodiment, the set constraint condition may reflect a conversion relationship between image data acquired by an image sensor of the electronic device, attitude data acquired by an inertial measurement unit of the electronic device, and a target motion parameter of the displacement table.
The calibration parameter may be a parameter for performing anti-shake compensation on image data acquired by the image sensor through attitude data acquired by the inertial measurement unit. For example, the calibration parameters may include a coordinate transformation parameter between the image data and the pose data, and a time delay parameter between the image data and the pose data.
In this embodiment, the coordinate conversion parameter between the image data and the attitude data may be determined from the coordinate conversion parameter between the target motion parameter of the displacement table and the attitude data collected by the inertial measurement unit, and the coordinate conversion parameter between the image data collected by the image sensor and the target motion parameter of the displacement table. Please refer to fig. 2, which is a schematic diagram of the relative relationship between the image sensor coordinate system, the inertial measurement unit coordinate system, and the displacement table coordinate system provided in an embodiment of the present application.
For example, the coordinate conversion parameter between the target motion parameter of the displacement table and the attitude data acquired by the inertial measurement unit may be a rotation transformation matrix $R_{\text{DOF-IMU}}$; the coordinate conversion parameter between the image data collected by the image sensor and the target motion parameter of the displacement table may be a rotation transformation matrix $R_{\text{Cam-DOF}}$; the coordinate conversion parameter between the image data and the attitude data is then $R_{\text{Cam-DOF}} \cdot R_{\text{DOF-IMU}}$.
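As a sketch of the composition just described, with placeholder identity matrices standing in for the two rotation transformation matrices:

```python
import numpy as np

R_dof_imu = np.eye(3)  # placeholder for R_DOF-IMU (IMU frame vs. displacement table frame)
R_cam_dof = np.eye(3)  # placeholder for R_Cam-DOF (displacement table frame vs. camera frame)

# Coordinate conversion parameter between the image data and the attitude data:
R_cam_imu = R_cam_dof @ R_dof_imu
```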
In some embodiments of the present application, the set constraint includes a first constraint reflecting a first distance between the pose data and the target motion parameter; and a second constraint reflecting a second distance between the target motion parameter and the image data.
In this embodiment, the first constraint condition may reflect a first distance between the attitude data acquired by the inertial measurement unit and the target motion parameter of the displacement table.
In this embodiment, a first loss function may be constructed according to a first constraint. The first loss function is shown in equation (1):
$$S = \sum \left\| R_{\text{DOF-IMU}} \cdot A_{\text{IMU}} - A_{\text{DOF}} \right\|^2 \quad (1)$$
where $S$ is the first loss value of the first loss function, $R_{\text{DOF-IMU}}$ is the rotation transformation matrix of the inertial measurement unit coordinate system relative to the displacement table coordinate system, $A_{\text{IMU}}$ is the attitude data collected by the inertial measurement unit, and $A_{\text{DOF}}$ is the target motion parameter of the displacement table.
In this embodiment, the attitude data acquired by the inertial measurement unit may be rotational attitude information obtained by integral calculation. For example, the calculation formula (2) of the attitude data is as follows:
$$A_{\text{IMU}} = \sum \omega_i \cdot \Delta t_1 \quad (2)$$
where $A_{\text{IMU}}$ is the attitude data collected by the inertial measurement unit, $\omega_i$ is the instantaneous angular velocity of rotation of the electronic device, and $\Delta t_1$ is the time interval between two adjacent acquisitions of the angular velocity of rotation of the electronic device.
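A minimal sketch of the integral calculation of equation (2), assuming the angular velocity samples arrive at a fixed interval as an (N, 3) array:

```python
import numpy as np

def integrate_gyro(omega, dt):
    # A_IMU = sum(omega_i * dt) per axis; cumsum gives the running attitude
    # at every sample, matching the summed form of equation (2).
    return np.cumsum(omega * dt, axis=0)

# Example: constant 0.1 rad/s rotation about x, sampled at 200 Hz for 100 samples
omega = np.tile([0.1, 0.0, 0.0], (100, 1))
a_imu = integrate_gyro(omega, 1.0 / 200.0)
```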
In this embodiment, the rotation transformation matrix of the coordinate system of the inertial measurement unit relative to the coordinate system of the displacement table may be obtained from the three-axis included angles of the inertial measurement unit coordinate system relative to the displacement table coordinate system, $\theta_{\text{DOF-IMU}} = (\theta_x, \theta_y, \theta_z)$. For example, this rotation transformation matrix is given by formulas (3) through (6):
$$R_{\text{DOF-IMU}} = R_z R_y R_x \quad (3)$$
$$R_x = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\theta_x & -\sin\theta_x \\ 0 & \sin\theta_x & \cos\theta_x \end{bmatrix} \quad (4)$$
$$R_y = \begin{bmatrix} \cos\theta_y & 0 & \sin\theta_y \\ 0 & 1 & 0 \\ -\sin\theta_y & 0 & \cos\theta_y \end{bmatrix} \quad (5)$$
$$R_z = \begin{bmatrix} \cos\theta_z & -\sin\theta_z & 0 \\ \sin\theta_z & \cos\theta_z & 0 \\ 0 & 0 & 1 \end{bmatrix} \quad (6)$$
where $R_{\text{DOF-IMU}}$ is the rotation transformation matrix of the inertial measurement unit coordinate system relative to the displacement table coordinate system; $R_x$, $R_y$ and $R_z$ are the rotation transformation matrices about the x-, y- and z-axes of the displacement table coordinate system; and $\theta_x$, $\theta_y$ and $\theta_z$ are the included angles of the inertial measurement unit coordinate system relative to the x-, y- and z-axes of the displacement table coordinate system, respectively.
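Formulas (3) through (6) translate directly into code; a minimal sketch:

```python
import numpy as np

def euler_to_R(theta_x, theta_y, theta_z):
    # R = Rz @ Ry @ Rx, composed from the standard axis rotation matrices
    # of equations (4)-(6); angles in radians.
    cx, sx = np.cos(theta_x), np.sin(theta_x)
    cy, sy = np.cos(theta_y), np.sin(theta_y)
    cz, sz = np.cos(theta_z), np.sin(theta_z)
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cx, -sx], [0.0, sx, cx]])
    Ry = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
    Rz = np.array([[cz, -sz, 0.0], [sz, cz, 0.0], [0.0, 0.0, 1.0]])
    return Rz @ Ry @ Rx
```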
For example, the first constraint condition may be that the first distance between the attitude data acquired by the inertial measurement unit and the target motion parameter of the displacement stage is less than or equal to a first threshold; referring to equation (1), this means the first loss value $S$ of the first loss function is less than or equal to the first threshold. Alternatively, the first constraint condition may be that this first distance takes its minimum value.
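Given a candidate $R_{\text{DOF-IMU}}$, the first loss of equation (1) can then be computed as follows; the sketch assumes both the integrated IMU attitude and the target motion are held as (N, 3) per-axis angle arrays, matching the summed forms of equations (1) and (2):

```python
import numpy as np

def first_loss(R_dof_imu, a_imu, a_dof):
    # S = sum || R_DOF-IMU @ A_IMU - A_DOF ||^2 over all samples, equation (1)
    residual = a_imu @ R_dof_imu.T - a_dof  # rotate each attitude sample
    return float(np.sum(residual ** 2))
```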
In this embodiment, the second constraint condition may reflect a second distance between the target motion parameter and the image data. The image data may include a current frame image and a next frame image; that is, the second constraint condition may reflect a second distance between the motion trajectory of the target feature points across the current frame image and the next frame image and the motion trajectory of the displacement stage. The target feature points are the feature points successfully matched between the current frame image and the next frame image; they are determined by performing feature point matching on the two frames, as sketched below.
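A sketch of the feature point matching step; the patent does not fix a detector or matcher, so ORB with cross-checked Hamming matching (OpenCV) is an assumption here:

```python
import cv2

def match_target_features(img_cur, img_next, max_matches=200):
    # Detect and match feature points between the current and next frame;
    # the surviving cross-checked matches serve as the target feature points.
    orb = cv2.ORB_create()
    kp1, des1 = orb.detectAndCompute(img_cur, None)
    kp2, des2 = orb.detectAndCompute(img_next, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    pts_i = [kp1[m.queryIdx].pt for m in matches[:max_matches]]  # in current frame
    pts_j = [kp2[m.trainIdx].pt for m in matches[:max_matches]]  # in next frame
    return pts_i, pts_j
```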
Illustratively, a second loss function may be constructed in accordance with a second constraint. The second loss function is shown in equation (7):
$$M = \sum \left\| x_{j\text{-}ro} - x_i \right\|^2 \quad (7)$$
where $M$ is the second loss value of the second loss function; $x_i = (x_i, y_i)$ is the image coordinates of the target feature point in the current frame image; and $x_{j\text{-}ro}$ is the positional relationship between the target feature point in the next frame image and the target feature point in the current frame image.
In this embodiment, the calculation formula (8) of the positional relationship between the target feature point in the next frame image and the target feature point in the current frame image is as follows:
$$x_{j\text{-}ro} = K \left( R_j R_{\text{Cam-DOF}} \right) \left( R_i R_{\text{Cam-DOF}} \right)^T K^{-1} x_i \quad (8)$$
where $x_{j\text{-}ro}$ is the positional relationship between the target feature point in the next frame image and the target feature point in the current frame image; $R_{\text{Cam-DOF}}$ is the rotation transformation matrix of the displacement table coordinate system relative to the image sensor coordinate system; $R_i$ is the rotation angle output by the displacement table at the acquisition time of the current frame image, which can be looked up according to the time delay parameter $\Delta t$ between the image data and the attitude data; $R_j$ is, likewise, the rotation angle output by the displacement table at the acquisition time of the next frame image, also looked up according to the time delay parameter $\Delta t$; and $K$ is the intrinsic parameter matrix of the camera of the electronic device.
In this embodiment, the rotation transformation matrix of the displacement table coordinate system relative to the image sensor coordinate system may be obtained from the three-axis included angles of the displacement table coordinate system relative to the image sensor coordinate system, $\theta_{\text{Cam-DOF}} = (\theta_x', \theta_y', \theta_z')$, where $\theta_x'$, $\theta_y'$ and $\theta_z'$ are the included angles of the displacement table coordinate system relative to the x-, y- and z-axes of the image sensor coordinate system, respectively.
For example, the rotation transformation matrix of the displacement table coordinate system relative to the image sensor coordinate system is given by formula (9):
$$R_{\text{Cam-DOF}} = R_z' R_y' R_x' \quad (9)$$
where $R_{\text{Cam-DOF}}$ is the rotation transformation matrix of the displacement table coordinate system relative to the image sensor coordinate system, and $R_x'$, $R_y'$ and $R_z'$ are the rotation transformation matrices of the displacement table coordinate system with respect to the x-, y- and z-axes of the image sensor coordinate system, respectively.
For example, the second constraint condition may be that the second distance between the target motion parameter and the image data is less than or equal to a second threshold; referring to equation (7), this means the second loss value $M$ of the second loss function is less than or equal to the second threshold. Alternatively, the second constraint condition may be that the second distance takes its minimum value.
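Putting equations (7) through (9) together, a sketch of the second loss follows. Note one interpretive assumption: equation (7) as printed writes the residual against $x_i$, while $x_{j\text{-}ro}$ in equation (8) is the rotation-predicted position of $x_i$ in the next frame; the sketch therefore takes the residual against the matched next-frame observation, the reading under which the loss vanishes for a perfect calibration. The list-based data layout is also an assumption.

```python
import numpy as np

def second_loss(pts_i, pts_j, K, R_cam_dof, R_i_list, R_j_list):
    # M accumulates || x_j-ro - x_j ||^2 per matched feature, where
    # x_j-ro = K (R_j R_Cam-DOF)(R_i R_Cam-DOF)^T K^-1 x_i  (equation (8)),
    # and R_i, R_j are the displacement table rotations at the two
    # (delay-corrected) frame acquisition times.
    K_inv = np.linalg.inv(K)
    M = 0.0
    for x_i, x_j, R_i, R_j in zip(pts_i, pts_j, R_i_list, R_j_list):
        p = K_inv @ np.array([x_i[0], x_i[1], 1.0])      # back-project x_i
        warp = (R_j @ R_cam_dof) @ (R_i @ R_cam_dof).T   # relative rotation
        q = K @ (warp @ p)
        x_j_ro = q[:2] / q[2]                            # project to pixels
        M += float(np.sum((np.asarray(x_j) - x_j_ro) ** 2))
    return M
```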
In this embodiment, setting the first constraint condition makes the attitude data acquired by the inertial measurement unit of the electronic device as close as possible to the target motion parameter of the displacement table, and setting the second constraint condition makes the motion trajectory of the feature points in the image data as close as possible to the motion trajectory of the displacement table, so that calibration of the image sensor and the inertial measurement unit of the electronic device can be realized with high accuracy.
In some embodiments of the present application, the set constraint is that a weighted sum of the first constraint and the second constraint satisfies a set convergence condition.
In this embodiment, the convergence condition may be that the weighted sum of the first distance (between the attitude data acquired by the inertial measurement unit and the target motion parameter of the displacement table) and the second distance (between the target motion parameter and the image data) is less than or equal to a third threshold.
Illustratively, the convergence condition may be expressed by the following equation (10):
$$w_1 S + w_2 M \quad (10)$$
where $S$ is the first loss function reflecting the first distance between the attitude data acquired by the inertial measurement unit and the target motion parameter of the displacement table, $M$ is the second loss function reflecting the second distance between the target motion parameter and the image data, and $w_1$, $w_2$ are weight coefficients. For example, the weight coefficient $w_1$ of the first loss function may be 1 and the weight coefficient $w_2$ of the second loss function may be 2. Note that $w_1$ and $w_2$ may be set according to actual needs, and this embodiment does not specifically limit them.
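A one-function sketch of formula (10) used as the set convergence test; the numeric value of the third threshold is an assumption:

```python
def converged(S, M, w1=1.0, w2=2.0, third_threshold=1e-3):
    # Set constraint: the weighted sum w1*S + w2*M of formula (10)
    # is less than or equal to a third threshold (value assumed).
    return w1 * S + w2 * M <= third_threshold
```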
In some embodiments of the present application, the determining, based on the set constraint condition reflecting the relationship between the image data, the pose data, and the target motion parameter, a parameter value of a calibration parameter according to the image data, the pose data, and the target motion parameter may further include: determining a parameter value of a first parameter corresponding to the first distance and a parameter value of a second parameter corresponding to the second distance according to the image data, the attitude data and the target motion parameter; and obtaining the parameter value of the calibration parameter according to the parameter value of the first parameter and the parameter value of the second parameter.
In this embodiment, the first parameter corresponding to the first distance may be a three-axis included angle of the coordinate system of the inertial measurement unit relative to the coordinate system of the displacement table. The second parameter corresponding to the second distance may be a three-axis angle of the displacement table coordinate system with respect to the image sensor coordinate system.
The calibration parameters comprise coordinate conversion parameters between the image data and the attitude data and time delay parameters between the image data and the attitude data. The parameter value of the coordinate conversion parameter between the image data and the attitude data may be a product of a rotation transformation matrix of the inertial measurement unit coordinate system with respect to the displacement table coordinate system and a rotation transformation matrix of the displacement table coordinate system with respect to the image sensor coordinate system.
In an embodiment, obtaining the parameter value of the calibration parameter according to the parameter value of the first parameter and the parameter value of the second parameter may include: determining a rotation transformation matrix of the inertia measurement unit coordinate system relative to the displacement table coordinate system according to the three-axis included angle of the inertia measurement unit coordinate system relative to the displacement table coordinate system; determining a rotation transformation matrix of the displacement table coordinate system relative to the image sensor coordinate system according to the three-axis included angle of the displacement table coordinate system relative to the image sensor coordinate system; and determining parameter values of the calibration parameters according to the rotation transformation matrix of the inertial measurement unit coordinate system relative to the displacement table coordinate system and the rotation transformation matrix of the displacement table coordinate system relative to the image sensor coordinate system.
Taking formula (10) as an example: the first loss function and the second loss function are optimized so that the weighted sum of the first loss value and the second loss value reaches its minimum value; the three-axis included angles of the inertial measurement unit coordinate system relative to the displacement table coordinate system and of the displacement table coordinate system relative to the image sensor coordinate system corresponding to this minimum are obtained; the rotation transformation matrix of the inertial measurement unit coordinate system relative to the displacement table coordinate system is determined from the former set of angles, and the rotation transformation matrix of the displacement table coordinate system relative to the image sensor coordinate system from the latter; and the product of the two rotation transformation matrices is taken as the parameter value of the calibration parameter, as sketched below.
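A sketch of that procedure; the Nelder-Mead solver, the loss_S/loss_M closures over the captured data, and the omission of the time delay parameter (which a fuller implementation would optimize jointly, e.g. as a seventh variable) are assumptions, not details fixed by the text:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.transform import Rotation

def euler_to_R(tx, ty, tz):
    # R = Rz @ Ry @ Rx (intrinsic Z-Y-X order), matching equations (3) and (9)
    return Rotation.from_euler("ZYX", [tz, ty, tx]).as_matrix()

def calibrate(loss_S, loss_M, theta0=np.zeros(6), w1=1.0, w2=2.0):
    # theta[0:3]: included angles of the IMU frame vs. the displacement table
    # frame; theta[3:6]: included angles of the displacement table frame vs.
    # the camera frame. loss_S / loss_M are hypothetical closures over the
    # captured data that take a rotation matrix and return S and M.
    def objective(theta):
        return (w1 * loss_S(euler_to_R(*theta[:3]))
                + w2 * loss_M(euler_to_R(*theta[3:6])))
    res = minimize(objective, theta0, method="Nelder-Mead")
    R_dof_imu = euler_to_R(*res.x[:3])
    R_cam_dof = euler_to_R(*res.x[3:6])
    # Product of the two rotation transformation matrices: the parameter
    # value of the coordinate conversion calibration parameter.
    return R_cam_dof @ R_dof_imu
```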
After step 1200, step 1300 is executed to store the determined parameter value in the electronic device for the anti-shake compensation.
In the embodiment of the application, the electronic equipment is driven by the displacement table to rotate according to the preset target motion parameters, and the parameter value of the calibration parameter is determined, based on the set constraint condition reflecting the relationship among the image data, the attitude data and the target motion parameters of the displacement table, from the image data collected by the image sensor of the electronic equipment, the attitude data collected by the inertial measurement unit of the electronic equipment, and the target motion parameters of the displacement table. In this way, the attitude data of the electronic equipment is matched against the known target motion parameters of the displacement table, and the motion trajectories of the feature points in the image data are matched against the same known target motion parameters, so that visual inertial calibration of the electronic equipment can be achieved and a high-precision calibration result obtained. Performing anti-shake compensation on the image data collected by the image sensor according to the obtained calibration parameters then improves the anti-shake effect of the electronic equipment.
The visual inertia calibration method provided by the embodiment of the application may be executed by a visual inertia calibration apparatus. In the embodiment of the present application, the visual inertia calibration apparatus provided herein is described by taking as an example the case where the visual inertia calibration apparatus performs the visual inertia calibration method.
Referring to fig. 3, an embodiment of the present application further provides a visual inertial calibration apparatus 300, where the visual inertial calibration apparatus 300 includes an obtaining module 301, a determining module 302, and a storing module 303.
The obtaining module 301 is configured to obtain image data collected by an image sensor of the electronic device and attitude data collected by an inertial measurement unit of the electronic device in a process of driving the electronic device to rotate according to a target motion parameter through a displacement table;
the determining module 302 is configured to determine a parameter value of a calibration parameter according to the image data, the attitude data, and the target motion parameter based on a set constraint condition reflecting a relationship among the image data, the attitude data, and the target motion parameter; the calibration parameters are parameters for performing anti-shake compensation on the image data acquired by the image sensor through the attitude data acquired by the inertial measurement unit;
the storage module 303 is configured to store the determined parameter value in the electronic device for the anti-shake compensation.
Optionally, the calibration parameter includes a coordinate conversion parameter between the image data and the pose data, and a time delay parameter between the image data and the pose data.
Optionally, the set constraint condition includes a first constraint condition reflecting a first distance between the posture data and the target motion parameter; and a second constraint reflecting a second distance between the target motion parameter and the image data.
Optionally, the set constraint condition is that a weighted sum of the first constraint condition and the second constraint condition satisfies a set convergence condition.
Optionally, the determining module includes: a first determining unit, configured to determine, according to the image data, the posture data, and the target motion parameter, a parameter value of a first parameter corresponding to the first distance and a parameter value of a second parameter corresponding to the second distance; and the parameter acquisition unit is used for acquiring the parameter value of the calibration parameter according to the parameter value of the first parameter and the parameter value of the second parameter.
In the embodiment of the application, the electronic equipment is driven by the displacement table to rotate according to the target motion parameters, and the parameter value of the calibration parameter is determined, based on the set constraint condition reflecting the relationship among the image data, the attitude data and the target motion parameters, from the image data collected by the image sensor of the electronic equipment, the attitude data collected by the inertial measurement unit of the electronic equipment, and the target motion parameters of the displacement table. In this way, the attitude data of the electronic equipment is matched against the known target motion parameters of the displacement table, and the motion trajectories of the feature points in the image data are matched against the same known target motion parameters, so that visual inertial calibration of the electronic equipment can be achieved and a high-precision calibration result obtained. Performing anti-shake compensation on the image data collected by the image sensor according to the obtained calibration parameters then improves the anti-shake effect of the electronic equipment.
The visual inertia calibration device in the embodiment of the present application may be an electronic device, or may be a component in an electronic device, such as an integrated circuit or a chip. The electronic device may be a terminal, or may be a device other than a terminal. The electronic device may be, for example, a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a Mobile Internet Device (MID), an Augmented Reality (AR)/Virtual Reality (VR) device, a robot, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and may also be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine, a self-service machine, and the like, which is not specifically limited in the embodiments of the present application.
The visual inertia calibration device in the embodiment of the application may be a device with an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
The visual inertia calibration device provided in the embodiment of the present application can implement each process implemented in the method embodiment of fig. 1, and is not described here again to avoid repetition.
Optionally, as shown in fig. 4, an electronic device 400 is further provided in the embodiment of the present application, and includes a processor 401 and a memory 402, where the memory 402 stores a program or an instruction that can be executed on the processor 401, and when the program or the instruction is executed by the processor 401, the steps of the embodiment of the visual inertia calibration method are implemented, and the same technical effects can be achieved, and are not described again here to avoid repetition.
Fig. 5 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 500 includes, but is not limited to: a radio frequency unit 501, a network module 502, an audio output unit 503, an input unit 504, a sensor 505, a display unit 506, a user input unit 507, an interface unit 508, a memory 509, a processor 510, and the like.
Those skilled in the art will appreciate that the electronic device 500 may further include a power supply (e.g., a battery) for supplying power to the various components; the power supply may be logically connected to the processor 510 via a power management system, so that charging, discharging, and power consumption are managed through the power management system. The electronic device structure shown in fig. 5 does not constitute a limitation of the electronic device, and the electronic device may include more or fewer components than those shown, combine some components, or arrange components differently, which will not be described again here.
The processor 510 is configured to acquire image data acquired by an image sensor of the electronic device and attitude data acquired by an inertial measurement unit of the electronic device in a process of driving the electronic device to rotate according to target motion parameters through a displacement table; determining parameter values of calibration parameters according to the image data, the attitude data and the target motion parameters based on set constraint conditions reflecting the relationship among the image data, the attitude data and the target motion parameters; the calibration parameters are parameters for performing anti-shake compensation on the image data acquired by the image sensor through the attitude data acquired by the inertial measurement unit; a memory 509 for storing the determined parameter values in the electronic device for the anti-shake compensation.
Optionally, the calibration parameter includes a coordinate conversion parameter between the image data and the pose data, and a time delay parameter between the image data and the pose data.
Optionally, the set constraint condition includes a first constraint condition reflecting a first distance between the attitude data and the target motion parameter; and a second constraint reflecting a second distance between the target motion parameter and the image data.
Optionally, the set constraint condition is that a weighted sum of the first constraint condition and the second constraint condition satisfies a set convergence condition.
Optionally, the processor 510, when determining a parameter value of a calibration parameter according to the image data, the pose data and the target motion parameter, is configured to: determining a parameter value of a first parameter corresponding to the first distance and a parameter value of a second parameter corresponding to the second distance according to the image data, the attitude data and the target motion parameter; and obtaining the parameter value of the calibration parameter according to the parameter value of the first parameter and the parameter value of the second parameter.
In the embodiment of the application, the electronic equipment is driven by the displacement table to rotate according to the target motion parameters, and the parameter value of the calibration parameter is determined, based on the set constraint condition reflecting the relationship among the image data, the attitude data and the target motion parameters, from the image data collected by the image sensor of the electronic equipment, the attitude data collected by the inertial measurement unit of the electronic equipment, and the target motion parameters of the displacement table. In this way, the attitude data of the electronic equipment is matched against the known target motion parameters of the displacement table, and the motion trajectories of the feature points in the image data are matched against the same known target motion parameters, so that visual inertial calibration of the electronic equipment can be achieved and a high-precision calibration result obtained. Performing anti-shake compensation on the image data collected by the image sensor according to the obtained calibration parameters then improves the anti-shake effect of the electronic equipment.
It should be understood that, in the embodiment of the present application, the input unit 504 may include a Graphics Processing Unit (GPU) 5041 and a microphone 5042; the graphics processor 5041 processes image data of still pictures or video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 506 may include a display panel 5061, which may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 507 includes at least one of a touch panel 5071 and other input devices 5072. The touch panel 5071, also referred to as a touch screen, may include two parts: a touch detection device and a touch controller. Other input devices 5072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in further detail here.
The memory 509 may be used to store software programs as well as various data. The memory 509 may mainly include a first storage area storing programs or instructions and a second storage area storing data, wherein the first storage area may store an operating system, and application programs or instructions (such as a sound playing function, an image playing function, etc.) required by at least one function. Further, the memory 509 may include volatile memory or non-volatile memory, or both. The non-volatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory. The volatile memory may be a Random Access Memory (RAM), a Static RAM (SRAM), a Dynamic RAM (DRAM), a Synchronous DRAM (SDRAM), a Double Data Rate SDRAM (DDR SDRAM), an Enhanced SDRAM (ESDRAM), a Synchlink DRAM (SLDRAM), or a Direct Rambus RAM (DRRAM). The memory 509 in the embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
Processor 510 may include one or more processing units; optionally, the processor 510 integrates an application processor, which mainly handles operations related to the operating system, user interface, and applications, and a modem processor, which mainly handles wireless communication signals, such as a baseband processor. It will be appreciated that the modem processor described above may not be integrated into processor 510.
The embodiments of the present application further provide a readable storage medium, where a program or an instruction is stored, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the embodiment of the visual inertia calibration method, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a computer read only memory ROM, a random access memory RAM, a magnetic or optical disk, and the like.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each process of the above embodiment of the visual inertia calibration method, and can achieve the same technical effect, and the details are not repeated here to avoid repetition.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
Embodiments of the present application provide a computer program product, where the program product is stored in a storage medium, and the program product is executed by at least one processor to implement the processes of the foregoing embodiments of the visual inertia calibration method, and achieve the same technical effects, and in order to avoid repetition, details are not repeated here.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved, e.g., the methods described may be performed in an order different than that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a computer software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the present embodiments are not limited to those precise embodiments, which are intended to be illustrative rather than restrictive, and that various changes and modifications may be effected therein by one skilled in the art without departing from the scope of the appended claims.

Claims (10)

1. A visual inertial calibration method, the method comprising:
in the process of driving the electronic equipment to rotate according to target motion parameters through the displacement table, acquiring image data acquired by an image sensor of the electronic equipment and attitude data acquired by an inertial measurement unit of the electronic equipment;
determining parameter values of calibration parameters according to the image data, the attitude data and the target motion parameters based on set constraint conditions reflecting the relationship among the image data, the attitude data and the target motion parameters; the calibration parameters are parameters for performing anti-shake compensation on the image data acquired by the image sensor through the attitude data acquired by the inertial measurement unit;
saving the determined parameter values in the electronic device for the anti-shake compensation.
2. The method of claim 1, wherein the calibration parameters comprise a coordinate transformation parameter between the image data and the pose data, and a time delay parameter between the image data and the pose data.
3. The method of claim 1, wherein the set constraint comprises a first constraint reflecting a first distance between the pose data and the target motion parameter; and a second constraint reflecting a second distance between the target motion parameter and the image data.
4. The method according to claim 3, wherein the set constraint condition is that a weighted sum of the first constraint condition and the second constraint condition satisfies a set convergence condition.
5. The method of claim 3, wherein determining parameter values for calibration parameters from the image data, the pose data, and the target motion parameters comprises:
determining a parameter value of a first parameter corresponding to the first distance and a parameter value of a second parameter corresponding to the second distance according to the image data, the attitude data and the target motion parameter;
and obtaining the parameter value of the calibration parameter according to the parameter value of the first parameter and the parameter value of the second parameter.
6. A visual-inertial calibration apparatus, the apparatus comprising:
an acquisition module, configured to acquire, while a displacement stage drives an electronic device to rotate according to target motion parameters, image data collected by an image sensor of the electronic device and attitude data collected by an inertial measurement unit of the electronic device;
a determining module, configured to determine parameter values of calibration parameters according to the image data, the attitude data, and the target motion parameters, based on a set constraint condition reflecting the relationship among the image data, the attitude data, and the target motion parameters, wherein the calibration parameters are parameters used for performing anti-shake compensation on the image data collected by the image sensor by means of the attitude data collected by the inertial measurement unit; and
a storage module, configured to save the determined parameter values in the electronic device for the anti-shake compensation.
7. The apparatus according to claim 6, wherein the calibration parameters comprise a coordinate transformation parameter between the image data and the attitude data, and a time delay parameter between the image data and the attitude data.
8. The apparatus according to claim 6, wherein the set constraint condition comprises a first constraint condition reflecting a first distance between the attitude data and the target motion parameters, and a second constraint condition reflecting a second distance between the target motion parameters and the image data.
9. The apparatus according to claim 8, wherein the set constraint condition is that a weighted sum of the first constraint condition and the second constraint condition satisfies a set convergence condition.
10. The apparatus according to claim 8, wherein the determining module comprises:
a first determining unit, configured to determine, according to the image data, the attitude data, and the target motion parameters, a parameter value of a first parameter corresponding to the first distance and a parameter value of a second parameter corresponding to the second distance; and
a parameter obtaining unit, configured to obtain the parameter values of the calibration parameters according to the parameter value of the first parameter and the parameter value of the second parameter.
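
For illustration only, and not forming part of the claims: claims 1-5 can be read as a joint optimization in which, while the displacement stage rotates the device according to known target motion parameters, the coordinate transformation parameter R and the time delay parameter τ of claim 2 are chosen so that a weighted sum J(R, τ) = α·Σ_t d(R·q_imu(t+τ), q_stage(t))² + β·Σ_t d(q_stage(t), q_img(t))² satisfies a convergence condition (claim 4), where d is some distance between attitudes. The minimal sketch below (Python with NumPy/SciPy, synthetic data) is one plausible reading of that procedure; the distance metric (relative rotation angle), the Nelder-Mead solver, and every variable name are assumptions of this sketch, not taken from the patent.

    import numpy as np
    from scipy.optimize import minimize
    from scipy.spatial.transform import Rotation, Slerp

    # Synthetic stand-ins for the three data streams of claim 1; every name
    # here is illustrative, not taken from the patent.
    t_cmd = np.linspace(0.0, 2.0, 201)                 # displacement-stage timestamps (s)
    q_cmd = Rotation.from_euler("z", 20 * np.sin(2 * np.pi * t_cmd), degrees=True)
    slerp_cmd = Slerp(t_cmd, q_cmd)                    # target motion parameters

    true_r = Rotation.from_euler("xyz", [1.0, -0.5, 0.3], degrees=True)  # ground-truth extrinsic
    true_tau = 0.004                                   # ground-truth time delay (s)
    q_imu = true_r.inv() * slerp_cmd(np.clip(t_cmd - true_tau, 0.0, 2.0))  # delayed, rotated IMU stream
    slerp_imu = Slerp(t_cmd, q_imu)                    # attitude data
    slerp_img = Slerp(t_cmd, slerp_cmd(t_cmd))         # image-derived attitude (noise-free here)

    t_eval = np.linspace(0.1, 1.9, 50)                 # keep t + tau inside the sampled range

    def cost(params, alpha=1.0, beta=1.0):
        # params = [rx, ry, rz, tau]: rotation vector for the coordinate
        # transformation parameter plus the time delay parameter (claim 2).
        r_cal = Rotation.from_rotvec(params[:3])
        tau = params[3]
        r_target = slerp_cmd(t_eval)
        r_imu = r_cal * slerp_imu(np.clip(t_eval + tau, 0.0, 2.0))
        r_img = slerp_img(t_eval)
        d1 = np.sum((r_imu.inv() * r_target).magnitude() ** 2)  # first distance: attitude vs. target
        d2 = np.sum((r_target.inv() * r_img).magnitude() ** 2)  # second distance: target vs. image
        return alpha * d1 + beta * d2                  # weighted sum of the two constraints (claim 4)

    res = minimize(cost, x0=np.zeros(4), method="Nelder-Mead")  # iterate until convergence
    r_cal, tau = Rotation.from_rotvec(res.x[:3]), res.x[3]      # parameter values to save on the device
    print(np.degrees(r_cal.as_rotvec()), tau)

Under these assumptions the optimizer's stopping test plays the role of the "set convergence condition", and the recovered (R, τ) pair corresponds to the parameter values that the saving step of claim 1 would persist on the device for anti-shake compensation.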
CN202210096751.2A 2022-01-25 2022-01-25 Visual inertia calibration method and device Pending CN114500842A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210096751.2A CN114500842A (en) 2022-01-25 2022-01-25 Visual inertia calibration method and device

Publications (1)

Publication Number Publication Date
CN114500842A (en) 2022-05-13

Family

ID=81477175

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210096751.2A Pending CN114500842A (en) 2022-01-25 2022-01-25 Visual inertia calibration method and device

Country Status (1)

Country Link
CN (1) CN114500842A (en)

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9805512B1 (en) * 2015-11-13 2017-10-31 Oculus Vr, Llc Stereo-based calibration apparatus
CN106595640A (en) * 2016-12-27 2017-04-26 天津大学 Moving-base-object relative attitude measuring method based on dual-IMU-and-visual fusion and system
WO2020124517A1 (en) * 2018-12-21 2020-06-25 深圳市大疆创新科技有限公司 Photographing equipment control method, photographing equipment control device and photographing equipment
CN110118572A (en) * 2019-05-08 2019-08-13 北京建筑大学 Multi-view stereo vision and inertial navigation system and relative pose parameter determination method
CN110378968A (en) * 2019-06-24 2019-10-25 深圳奥比中光科技有限公司 The scaling method and device of camera and Inertial Measurement Unit relative attitude
CN110430365A (en) * 2019-08-26 2019-11-08 Oppo广东移动通信有限公司 Anti-fluttering method, device, computer equipment and storage medium
CN112204946A (en) * 2019-10-28 2021-01-08 深圳市大疆创新科技有限公司 Data processing method, device, movable platform and computer readable storage medium
US20210182633A1 (en) * 2019-12-14 2021-06-17 Ubtech Robotics Corp Ltd Localization method and helmet and computer readable storage medium using the same
CN111147741A (en) * 2019-12-27 2020-05-12 Oppo广东移动通信有限公司 Focusing processing-based anti-shake method and device, electronic equipment and storage medium
CN111246100A (en) * 2020-01-20 2020-06-05 Oppo广东移动通信有限公司 Anti-shake parameter calibration method and device and electronic equipment
CN111551191A (en) * 2020-04-28 2020-08-18 浙江商汤科技开发有限公司 Sensor external parameter calibration method and device, electronic equipment and storage medium
CN111595362A (en) * 2020-06-05 2020-08-28 联想(北京)有限公司 Parameter calibration method and device for inertial measurement unit and electronic equipment
CN111750853A (en) * 2020-06-24 2020-10-09 国汽(北京)智能网联汽车研究院有限公司 Map establishing method, device and storage medium
CN111928847A (en) * 2020-09-22 2020-11-13 蘑菇车联信息科技有限公司 Inertial measurement unit pose data optimization method and device and electronic equipment
US11019265B1 (en) * 2020-11-04 2021-05-25 Bae Systems Information And Electronic Systems Integration Inc. Optimized motion compensation via fast steering mirror and roll axis gimbal
CN113551665A (en) * 2021-06-25 2021-10-26 中国科学院国家空间科学中心 High dynamic motion state sensing system and sensing method for motion carrier

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
FANG Ming; TIAN Ying: "Robust electronic image stabilization method based on IMU-Camera calibration", Information and Control, no. 02, 15 April 2018 (2018-04-15) *

Similar Documents

Publication Publication Date Title
CN106525074B Gimbal drift compensation method and device, gimbal, and unmanned aerial vehicle
CN109644231A Improved video stabilization for mobile devices
US20230037922A1 (en) Image display method and apparatus, computer device, and storage medium
CN110956666B (en) Motion data calibration method and device, terminal equipment and storage medium
CN110954134B (en) Gyro offset correction method, correction system, electronic device, and storage medium
CN113556464B (en) Shooting method and device and electronic equipment
CN112907652B (en) Camera pose acquisition method, video processing method, display device, and storage medium
CN110049246A Video anti-shake method and device for electronic device, and electronic device
JP2023502635A (en) CALIBRATION METHOD AND APPARATUS, PROCESSOR, ELECTRONICS, STORAGE MEDIUM
WO2021212278A1 (en) Data processing method and apparatus, and mobile platform and wearable device
JP2022531186A (en) Information processing methods, devices, electronic devices, storage media and programs
CN110096134B (en) VR handle ray jitter correction method, device, terminal and medium
CN114040113A (en) Image processing method and device
CN107145706B (en) Evaluation method and device for performance parameters of virtual reality VR equipment fusion algorithm
WO2024002065A1 (en) Video encoding method and apparatus, electronic device, and medium
WO2023241495A1 (en) Photographic method and apparatus
CN115379118B (en) Camera switching method and device, electronic equipment and readable storage medium
CN114500842A (en) Visual inertia calibration method and device
CN111275769A (en) Monocular vision parameter correction method and device
CN116079697A (en) Monocular vision servo method, device, equipment and medium based on image
CN115205419A (en) Instant positioning and map construction method and device, electronic equipment and readable storage medium
CN114785957A Shooting method and device
CN110180185B (en) Time delay measurement method, device, system and storage medium
CN114745499A (en) Control method and control device for shooting device, shooting device and electronic equipment
CN114339051A (en) Shooting method, shooting device, electronic equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination