CN118055318A - Data processing method, data processing device, terminal and readable storage medium
- Publication number: CN118055318A
- Application number: CN202211436777.3A
- Authority
- CN
- China
- Prior art keywords
- imu
- camera
- curve
- acquiring
- angular velocity
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04J—MULTIPLEX COMMUNICATION
- H04J3/00—Time-division multiplex systems
- H04J3/02—Details
- H04J3/06—Synchronising arrangements
- H04J3/0602—Systems characterised by the synchronising information used
- H04J3/0614—Systems characterised by the synchronising information used the synchronising signal being characterised by the amplitude, duration or polarity
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C25/00—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
- G01C25/005—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass initial alignment, calibration or starting-up of inertial devices
Abstract
The application discloses a data processing method, a data processing device, a terminal, and a non-volatile computer-readable storage medium. The data processing method comprises: acquiring a first camera angular velocity and a first IMU angular velocity of a photographing device at a plurality of different temperatures; acquiring a first mapping relationship according to the first camera angular velocity and the first IMU angular velocity, the first mapping relationship characterizing a mapping relationship between temperature and time offset; acquiring a second camera angular velocity and a second IMU angular velocity of the photographing device at a plurality of different exposure durations; acquiring a second mapping relationship according to the second camera angular velocity and the second IMU angular velocity, the second mapping relationship characterizing a mapping relationship between exposure duration and time offset; and acquiring a calibration time offset according to the first mapping relationship, the second mapping relationship and a preset relationship, the calibration time offset representing the difference between the image timestamp of the photographing device and the timing of the IMU of the photographing device.
Description
Technical Field
The present application relates to the field of image technologies, and in particular, to a data processing method, a data processing device, a terminal, and a non-volatile computer readable storage medium.
Background
When a handheld device such as a mobile phone records video, different shooting styles easily introduce varying degrees of external shake, so the resulting video footage is unstable. To solve this problem, the industry typically applies anti-shake processing to video. For mobile phones, there are OIS, which fine-tunes the imaging light path; DIS, which aligns consecutive image frames; and EIS, which compensates for shake by filtering IMU (Inertial Measurement Unit) pose data. The EIS method fuses IMU data and camera data for pose compensation, which typically requires calibrating the time offset between the IMU and the camera. However, this time offset is not fixed, making the time-offset relationship between the IMU and the camera difficult to obtain accurately.
Disclosure of Invention
The embodiment of the application provides a data processing method, a data processing device, a terminal and a nonvolatile computer readable storage medium.
The data processing method of the embodiment of the application comprises: acquiring a first camera angular velocity and a first IMU angular velocity of a photographing device at a plurality of different temperatures; acquiring a first mapping relationship according to the first camera angular velocity and the first IMU angular velocity, the first mapping relationship characterizing a mapping relationship between temperature and time offset; acquiring a second camera angular velocity and a second IMU angular velocity of the photographing device at a plurality of different exposure durations; acquiring a second mapping relationship according to the second camera angular velocity and the second IMU angular velocity, the second mapping relationship characterizing a mapping relationship between exposure duration and time offset; and acquiring a calibration time offset according to the first mapping relationship, the second mapping relationship and a preset relationship, the calibration time offset representing the difference between the image timestamp of the photographing device and the timing of the IMU of the photographing device.
The data processing device of the embodiment of the application comprises a first acquisition module, a first mapping module, a second acquisition module, a second mapping module and a calibration module. The first acquisition module is configured to acquire a first camera angular velocity and a first IMU angular velocity of a photographing device at a plurality of different temperatures. The first mapping module is configured to obtain a first mapping relationship according to the first camera angular velocity and the first IMU angular velocity, the first mapping relationship characterizing a mapping relationship between temperature and time offset. The second acquisition module is configured to acquire a second camera angular velocity and a second IMU angular velocity of the photographing device at a plurality of different exposure durations. The second mapping module is configured to obtain a second mapping relationship according to the second camera angular velocity and the second IMU angular velocity, the second mapping relationship characterizing a mapping relationship between exposure duration and time offset. The calibration module is configured to obtain a calibration time offset according to the first mapping relationship, the second mapping relationship and a preset relationship, the calibration time offset representing the difference between the image timestamp of the photographing device and the timing of the IMU of the photographing device.
The terminal of the embodiment of the application comprises one or more processors, a memory, and one or more programs, wherein the one or more programs are stored in the memory and executed by the one or more processors, the programs including instructions for performing the data processing method. The data processing method comprises: acquiring a first camera angular velocity and a first IMU angular velocity of a photographing device at a plurality of different temperatures; acquiring a first mapping relationship according to the first camera angular velocity and the first IMU angular velocity, the first mapping relationship characterizing a mapping relationship between temperature and time offset; acquiring a second camera angular velocity and a second IMU angular velocity of the photographing device at a plurality of different exposure durations; acquiring a second mapping relationship according to the second camera angular velocity and the second IMU angular velocity, the second mapping relationship characterizing a mapping relationship between exposure duration and time offset; and acquiring a calibration time offset according to the first mapping relationship, the second mapping relationship and a preset relationship, the calibration time offset representing the difference between the image timestamp of the photographing device and the timing of the IMU of the photographing device.
The non-transitory computer-readable storage medium of the embodiment of the application contains a computer program which, when executed by one or more processors, causes the processors to carry out the data processing method. The data processing method comprises: acquiring a first camera angular velocity and a first IMU angular velocity of a photographing device at a plurality of different temperatures; acquiring a first mapping relationship according to the first camera angular velocity and the first IMU angular velocity, the first mapping relationship characterizing a mapping relationship between temperature and time offset; acquiring a second camera angular velocity and a second IMU angular velocity of the photographing device at a plurality of different exposure durations; acquiring a second mapping relationship according to the second camera angular velocity and the second IMU angular velocity, the second mapping relationship characterizing a mapping relationship between exposure duration and time offset; and acquiring a calibration time offset according to the first mapping relationship, the second mapping relationship and a preset relationship, the calibration time offset representing the difference between the image timestamp of the photographing device and the timing of the IMU of the photographing device.
According to the data processing method, the data processing device, the terminal and the non-volatile computer-readable storage medium, the mapping relationship between temperature and time offset and the mapping relationship between exposure duration and time offset, that is, the first mapping relationship and the second mapping relationship, can be obtained from the camera angular velocities and IMU angular velocities acquired at different temperatures and different exposure durations. By fusing, through the first and second mapping relationships, the influence of both temperature and exposure duration on the time offset, the time offset between the camera and the inertial measurement unit can be obtained more accurately.
Additional aspects and advantages of embodiments of the application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the application.
Drawings
The foregoing and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a flow diagram of a data processing method according to some embodiments of the application;
FIG. 2 is a schematic diagram of a photographing apparatus according to some embodiments of the present application;
FIG. 3 is a schematic diagram of a data processing apparatus according to some embodiments of the present application;
FIG. 4 is a schematic diagram of a terminal according to some embodiments of the present application;
FIG. 5 is a flow chart of a data processing method of some embodiments of the present application;
FIG. 6 is a flow chart of a data processing method of some embodiments of the present application;
FIG. 7 is a flow chart of a data processing method of some embodiments of the present application;
FIG. 8 is a schematic diagram of a first camera curve and a first IMU curve according to some embodiments of the present application;
FIG. 9 is a schematic diagram of an application scenario in which a first mapping relationship is obtained according to some embodiments of the present application;
FIG. 10 is a flow chart of a data processing method of some embodiments of the present application;
FIG. 11 is a schematic illustration of an application scenario in which a first time offset is obtained according to some embodiments of the present application;
FIG. 12 is a flow chart of a data processing method of some embodiments of the present application;
FIG. 13 is a schematic illustration of an application scenario in which a first mapping relationship is obtained according to some embodiments of the present application;
FIG. 14 is a flow chart of a data processing method of some embodiments of the present application;
FIG. 15 is a schematic view of a second camera curve and a second IMU curve of certain embodiments of the present application;
FIG. 16 is a schematic illustration of an application scenario in which a second mapping relationship is obtained according to some embodiments of the present application;
FIG. 17 is a flow chart of a data processing method of some embodiments of the present application;
FIG. 18 is a flow chart of a data processing method of some embodiments of the present application;
FIG. 19 is a schematic diagram of a connection of a computer readable storage medium and a processor according to some embodiments of the application.
Detailed Description
Embodiments of the present application are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout. The embodiments described below by referring to the drawings are exemplary only for explaining the embodiments of the present application and are not to be construed as limiting the embodiments of the present application.
When a handheld device such as a mobile phone records video, different shooting styles easily introduce varying degrees of external shake, so the resulting video footage is unstable. To solve this problem, the industry typically applies anti-shake processing to video. For mobile phones, there are OIS (Optical Image Stabilization), which fine-tunes the imaging light path; DIS (Digital Image Stabilization), which aligns consecutive image frames; and EIS (Electronic Image Stabilization), which compensates for shake based on filtered IMU (Inertial Measurement Unit) pose data.
The IMU is a sensor that detects the device attitude in real time: it outputs the current angular velocities of the device about the x, y and z axes, and the current attitude of the device is obtained by integrating these angular velocities. The EIS method performs attitude compensation using the pose data acquired by the IMU together with the pose data acquired by the camera. To fuse the data acquired by the IMU and the camera accurately, the timestamp offset between the IMU and the camera must be taken into account, so the time offset between the IMU and the camera usually needs to be calibrated. However, this time offset is not fixed, making the time-offset relationship between the IMU and the camera difficult to obtain accurately.
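As a minimal sketch of this integration step (not taken from the patent; the function name and data layout are illustrative assumptions), the following fragment accumulates gyroscope angular velocities into a rotation matrix with a first-order update:

```python
import numpy as np

def integrate_gyro(timestamps, omegas):
    """Integrate per-sample angular velocities (rad/s, body frame) into an attitude.

    timestamps: (N,) sample times in seconds; omegas: (N, 3) x/y/z angular velocities.
    """
    R = np.eye(3)  # start from the identity attitude
    for i in range(1, len(timestamps)):
        dt = timestamps[i] - timestamps[i - 1]
        wx, wy, wz = omegas[i]
        # Skew-symmetric matrix of the angular-velocity vector
        W = np.array([[0.0, -wz,  wy],
                      [ wz, 0.0, -wx],
                      [-wy,  wx, 0.0]])
        # First-order update: exp(W*dt) is approximately I + W*dt for small dt
        R = R @ (np.eye(3) + W * dt)
    return R
```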
The application provides a data processing method for processing data of a photographing device so as to accurately obtain the time-offset relationship between the IMU and the camera. This allows the data acquired by the IMU and the camera to be fused accurately in subsequent processing, improving the accuracy of attitude compensation and the shake-suppression effect of pose-data filtering.
Referring to fig. 1, the data processing method according to an embodiment of the present application includes the following steps:
01: acquiring a first camera angular velocity and a first IMU angular velocity of the photographing device 200 at a plurality of different temperatures;
02: acquiring a first mapping relationship according to the first camera angular velocity and the first IMU angular velocity, wherein the first mapping relationship characterizes a mapping relationship between temperature and time offset;
03: acquiring a second camera angular velocity and a second IMU angular velocity of the photographing device 200 at a plurality of different exposure durations;
04: acquiring a second mapping relationship according to the second camera angular velocity and the second IMU angular velocity, wherein the second mapping relationship characterizes a mapping relationship between exposure duration and time offset; and
05: acquiring a calibration time offset according to the first mapping relationship, the second mapping relationship and a preset relationship, wherein the calibration time offset represents the difference between the image timestamp of the photographing device 200 and the timing of the IMU of the photographing device 200.
Please refer to fig. 2, wherein the photographing device 200 includes a camera 201 and an inertial measurement unit (IMU) 202. The camera 201 is an image sensor for acquiring images. In one embodiment, the camera 201 includes a light-sensing assembly for receiving light and an imaging assembly that generates an image from the light irradiating it. The inertial measurement unit 202 is an inertial sensor for measuring inertial parameters of the photographing device 200, such as its angular velocity, acceleration and gravitational acceleration.
The camera angular velocity is the angular velocity derived from the camera 201. For example, images captured before and after the camera 201 rotates differ in angle, and the angular velocity of the rotation can be calculated from that angular difference and the time elapsed between the two captures. The IMU angular velocity is the angular velocity measured by the inertial measurement unit 202. That is, the camera angular velocity and the IMU angular velocity are obtained from two different sensors. If the camera angular velocity and the IMU angular velocity of the photographing device 200 agree closely at the same instant, the attitude compensation based on IMU data and camera data is highly accurate.
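For illustration only (the names and data layout are assumptions, not the patent's implementation), the camera angular velocity described above reduces to a difference quotient over the inter-frame interval:

```python
def camera_angular_velocity(angle_prev, angle_curr, t_prev, t_curr):
    """Camera angular velocity (rad/s) from the orientations recovered from
    two consecutive frames and their capture timestamps."""
    return (angle_curr - angle_prev) / (t_curr - t_prev)
```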
The image timestamp of the photographing device 200 is a digital mark attached to each frame when the camera 201 of the photographing device 200 acquires one or more frames of images; it records the time at which the camera 201 captured each frame.
The timing of the IMU of the photographing device 200 is the timestamp of the inertial measurement unit 202; it records the time at which the inertial measurement unit 202 obtained the inertial parameters (i.e., the IMU data) of the photographing device 200.
However, there is often a time offset between the image timestamp of the photographing device 200 and the timestamp of the inertial measurement unit 202. If this offset cannot be obtained accurately, it is difficult to pair the camera angular velocity and the IMU angular velocity at the same instant, which degrades the accuracy of the attitude compensation based on IMU data and camera data. Accordingly, the time offset between the camera 201 and the inertial measurement unit 202 needs to be calibrated so that the image timestamp of the photographing device 200 and the timestamp of the inertial measurement unit 202 can be aligned, improving the accuracy of the attitude compensation based on IMU data and camera data.
According to the data processing method, the mapping relationship between temperature and time offset and the mapping relationship between exposure duration and time offset, that is, the first mapping relationship and the second mapping relationship, are obtained from the camera angular velocities and IMU angular velocities acquired at different temperatures and different exposure durations. By fusing the influence of both temperature and exposure duration on the time offset through these two mapping relationships, the time offset between the camera 201 and the inertial measurement unit 202 is obtained more accurately.
Further description is provided below with reference to the accompanying drawings.
Referring to fig. 3, the embodiment of the application further provides a data processing apparatus 10 for processing data of the photographing device 200. Referring to fig. 1 and 2, the data processing apparatus 10 may execute the data processing method of the embodiment of the application. The data processing apparatus 10 includes a first acquisition module 11, a first mapping module 12, a second acquisition module 13, a second mapping module 14, and a calibration module 15. The first acquisition module 11 performs the method in step 01, the first mapping module 12 performs the method in step 02, the second acquisition module 13 performs the method in step 03, the second mapping module 14 performs the method in step 04, and the calibration module 15 performs the method in step 05. That is, the first acquisition module 11 is configured to acquire the first camera angular velocity and the first IMU angular velocity of the photographing device 200 at a plurality of different temperatures. The first mapping module 12 is configured to obtain a first mapping relationship according to the first camera angular velocity and the first IMU angular velocity, where the first mapping relationship characterizes a mapping relationship between temperature and time offset. The second acquisition module 13 is configured to acquire the second camera angular velocity and the second IMU angular velocity of the photographing device 200 at a plurality of different exposure durations. The second mapping module 14 is configured to obtain a second mapping relationship according to the second camera angular velocity and the second IMU angular velocity, where the second mapping relationship characterizes a mapping relationship between exposure duration and time offset. The calibration module 15 is configured to obtain a calibration time offset according to the first mapping relationship, the second mapping relationship and a preset relationship, where the calibration time offset represents the difference between the image timestamp of the photographing device 200 and the timing of the IMU of the photographing device 200.
Referring to fig. 4, the embodiment of the application further provides a terminal 100. The terminal 100 includes: one or more processors 30, memory 20, and one or more programs. Wherein one or more programs are stored in memory 20 and executed by one or more processors 30. Referring to fig. 1, the program includes instructions for executing the data processing methods in steps 01, 02, 03, 04, and 05. That is, the processor 30 is configured to execute the data processing methods in steps 01, 02, 03, 04, and 05.
In some embodiments, the terminal 100 may be an electronic device with an image capturing function, such as a mobile phone, a desktop computer, a notebook computer, a camera, a video camera, a smart watch, and the like, which is not limited herein. Referring to fig. 2, the processor 30 of the terminal 100 can obtain data from the photographing device 200, such as the first camera angular velocity, the first IMU angular velocity, the second camera angular velocity, the second IMU angular velocity, and so on.
In one embodiment, the terminal 100 and the photographing device 200 are two independent devices, and data can be exchanged between them; in another embodiment, the terminal 100 itself includes the photographing device 200, which in turn includes the camera 201 and the inertial measurement unit 202. Neither case is limiting here.
Referring to fig. 5, in some embodiments, 01: acquiring a first camera angular velocity and a first IMU angular velocity of the photographing device 200 at a plurality of different temperatures includes:
011: at each temperature, a first camera angular velocity is obtained according to the angular change of the current frame image relative to the adjacent frame image, and the angular velocity measured by the IMU of the photographing device 200 is obtained as the first IMU angular velocity.
Referring to fig. 3, in some embodiments, the first acquisition module 11 is further configured to perform the method in step 011. That is, the first acquisition module 11 is further configured to obtain, at each temperature, the first camera angular velocity according to the angular change of the current frame image relative to the adjacent frame image, and to obtain, as the first IMU angular velocity, the angular velocity measured by the IMU of the photographing device 200.
Referring to fig. 4, in some embodiments, the processor 30 is further configured to perform the data processing method in step 011.
In some embodiments, the adjacent frame image includes one or more of the n frame images preceding the current frame image, where n is a natural number greater than or equal to 1; for example, n may be 1, 2, 3, or more. In one embodiment, the adjacent frame image is the frame immediately preceding the current frame image. The current frame image and the adjacent frame image are images captured by the camera 201 at different angles at a certain temperature, and the angular change of the current frame image relative to the adjacent frame image reflects the angular change of the camera 201. The first camera angular velocity can therefore be obtained from the angular change of the current frame image relative to the adjacent frame image and the time interval between the two frames, while the angular velocity measured by the inertial measurement unit 202 for the current frame is the first IMU angular velocity. In this way, a set of first camera angular velocities and corresponding first IMU angular velocities at a certain temperature is obtained. Similarly, multiple sets of first camera angular velocities and corresponding first IMU angular velocities can be obtained at multiple different temperatures, and within each set the first camera angular velocities and the corresponding first IMU angular velocities share the same temperature.
In some embodiments, the current frame image and the adjacent frame image are images acquired by pointing the camera 201 at a calibration plate. The calibration plate carries a calibration pattern, and the angular change of the calibration pattern in the current frame image relative to the calibration pattern in the adjacent frame image is taken as the angular change of the current frame image relative to the adjacent frame image. For example, the calibration pattern includes a plurality of calibration points. If the camera 201 rotates between capturing the adjacent frame image and capturing the current frame image, the positions of the calibration points differ between the two frames, and the angle by which the current frame image has changed relative to the adjacent frame image can be obtained from the positional (coordinate) transformation between the calibration points in the two frames.
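One possible way to recover the in-plane rotation angle from matched calibration-point coordinates is a least-squares Kabsch-style fit; this estimator is an assumption for illustration, as the patent does not prescribe one:

```python
import numpy as np

def rotation_angle_2d(pts_prev, pts_curr):
    """Best-fit in-plane rotation (rad) between matched (N, 2) point sets,
    assuming a pure rotation plus translation (no reflection or scale)."""
    p = pts_prev - pts_prev.mean(axis=0)  # remove the translation component
    q = pts_curr - pts_curr.mean(axis=0)
    h = p.T @ q                           # 2x2 cross-covariance matrix
    u, _, vt = np.linalg.svd(h)
    r = vt.T @ u.T                        # Kabsch: optimal rotation matrix
    return np.arctan2(r[1, 0], r[0, 0])
```

Dividing this angle by the inter-frame interval then gives the camera angular velocity for that frame pair.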
Referring to fig. 6, in some embodiments, 03: acquiring the second camera angular velocity and the second IMU angular velocity of the photographing device 200 at a plurality of different exposure durations includes:
031: at each exposure duration, acquiring the second camera angular velocity according to the angular change of the current frame image relative to the adjacent frame image, and acquiring the angular velocity measured by the IMU of the photographing device 200 as the second IMU angular velocity.
Referring to fig. 3, in some embodiments, the second acquisition module 13 is further configured to perform the method in step 031. That is, the second acquisition module 13 is further configured to obtain, at each exposure duration, the second camera angular velocity according to the angular change of the current frame image relative to the adjacent frame image, and to obtain, as the second IMU angular velocity, the angular velocity measured by the IMU of the photographing device 200.
Referring to fig. 4, in some embodiments, the processor 30 is further configured to perform the data processing method in step 031.
Referring to fig. 5 and 6, the second camera angular velocity, like the first camera angular velocity, is obtained from the angular change of the current frame image relative to the adjacent frame image; here the exposure duration of the current frame image is identical to that of the adjacent frame image. At a plurality of different exposure durations, multiple sets of second camera angular velocities and corresponding second IMU angular velocities can be obtained, and within each set the second camera angular velocities and the corresponding second IMU angular velocities share the same exposure duration.
Referring to fig. 7, in some embodiments, 02: acquiring a first mapping relationship according to the first camera angular velocity and the first IMU angular velocity includes:
021: acquiring a first camera curve and a first IMU curve corresponding to each temperature, wherein the first camera curve plots the first camera angular velocity over time and the first IMU curve plots the first IMU angular velocity over time;
022: acquiring a first time offset corresponding to each temperature according to the first camera curve and the first IMU curve corresponding to that temperature; and
023: acquiring the first mapping relationship according to the plurality of temperatures and the first time offsets corresponding to the plurality of temperatures.
Referring to fig. 3, in some embodiments, the first mapping module 12 is further configured to perform the methods in steps 021, 022 and 023. That is, the first mapping module 12 is further configured to: acquire a first camera curve and a first IMU curve corresponding to each temperature, where the first camera curve plots the first camera angular velocity over time and the first IMU curve plots the first IMU angular velocity over time; acquire a first time offset corresponding to each temperature according to the first camera curve and the first IMU curve corresponding to that temperature; and acquire the first mapping relationship according to the plurality of temperatures and the first time offsets corresponding to the plurality of temperatures.
Referring to fig. 4, in some embodiments, the processor 30 is further configured to perform the data processing methods in steps 021, 022 and 023.
Referring to fig. 8, a first camera curve S1-c1 and a first IMU curve S1-d1 can be generated from the first camera angular velocity and the first IMU angular velocity acquired at a certain temperature T1, respectively. Because the camera and the inertial measurement unit 202 rotate synchronously, the two curves follow a similar trend; but because the timestamps of the camera and the inertial measurement unit 202 are not aligned, the times at which the two curves reach the same angular velocity differ. For example, in the embodiment illustrated in fig. 8, the angular velocity w1 occurs at time t1 on the first camera curve S1-c1 and at time t2 on the first IMU curve S1-d1, and the difference between t1 and t2 represents the time offset between the first camera curve S1-c1 and the first IMU curve S1-d1. Likewise, at temperature T1 each angular velocity value determines a time difference, and a first time offset ΔS1-T1 corresponding to temperature T1 can be obtained by combining the multiple time differences determined from the first camera curve S1-c1 and the first IMU curve S1-d1; the first time offset ΔS1-T1 represents the time offset between the first camera angular velocity and the first IMU angular velocity at temperature T1. In one embodiment, the average of these time differences may be taken as the first time offset ΔS1-T1 for temperature T1.
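A sketch of this averaging, under the simplifying assumption that both curves are sampled over a monotonically increasing stretch of angular velocity (so each curve can be inverted by interpolation); all names are illustrative:

```python
import numpy as np

def mean_time_offset(t_cam, w_cam, t_imu, w_imu, probe_velocities):
    """Average, over several probe angular velocities, the gap between the times
    at which the camera curve and the IMU curve reach the same velocity value."""
    diffs = []
    for w in probe_velocities:
        t1 = np.interp(w, w_cam, t_cam)  # time at which the camera curve hits w
        t2 = np.interp(w, w_imu, t_imu)  # time at which the IMU curve hits w
        diffs.append(t1 - t2)
    return float(np.mean(diffs))         # e.g. the first time offset at T1
```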
Referring to FIG. 9, similarly, for a plurality of different temperatures T2, T3, ..., Tm, the first time offsets ΔS1-T2, ΔS1-T3, ..., ΔS1-Tm corresponding to those temperatures can be obtained by the same method used to obtain the first time offset ΔS1-T1 for temperature T1. The mapping between the temperatures T1, T2, T3, ..., Tm and their first time offsets ΔS1-T1, ΔS1-T2, ΔS1-T3, ..., ΔS1-Tm is the first mapping relationship.
Referring to fig. 10, in some embodiments, 022: acquiring a first time offset corresponding to each temperature according to the first camera curve and the first IMU curve corresponding to each temperature includes:
0221: offsetting the first camera curve or the first IMU curve by preset time offset amounts, and after offsetting, taking the time offset amount that minimizes the deviation between the first camera curve and the first IMU curve at each temperature as the first time offset corresponding to that temperature.
Referring to fig. 3, in some embodiments, the first mapping module 12 is further configured to perform the method of step 0221. That is, the first mapping module 12 is further configured to offset the first camera curve or the first IMU curve by a preset time offset, and after the offset, take, as the first time offset corresponding to the temperature, the time offset that minimizes the deviation between the first camera curve and the first IMU curve at each temperature.
Referring to fig. 4, in some embodiments, the processor 30 is further configured to perform the data processing method in step 0221.
Referring to fig. 11, take the first camera curve S1-c1 and the first IMU curve S1-d1 corresponding to temperature T1 as an example. Set a time offset ΔS1-1 and translate the first camera curve S1-c1 by ΔS1-1 along the time axis to obtain a curve S1-c1-1, then measure the degree of coincidence between the curve S1-c1-1 and the first IMU curve S1-d1. Next, change the offset to ΔS1-2, translate the first camera curve S1-c1 by ΔS1-2 along the time axis to obtain a curve S1-c1-2, and measure the degree of coincidence between the curve S1-c1-2 and the first IMU curve S1-d1. Proceeding in this way yields a set of coincidence degrees; the higher the coincidence, the smaller the deviation between the first camera curve and the first IMU curve under the corresponding time offset. The time offset corresponding to the highest coincidence can then be selected from the set and taken as the first time offset ΔS1-T1 for temperature T1.
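The shift-and-compare search above can be sketched as follows; the candidate range, grid, and mean-squared-error criterion are assumptions, since the patent only requires picking the shift with the highest coincidence:

```python
import numpy as np

def best_time_offset(t_cam, w_cam, t_imu, w_imu,
                     candidates=np.linspace(-0.05, 0.05, 1001)):
    """Translate the camera curve by each candidate offset and keep the one
    that best overlaps the IMU curve (smallest mean squared deviation)."""
    best_dt, best_err = 0.0, np.inf
    for dt in candidates:
        # Resample the shifted camera curve onto the IMU timestamps.
        w_shifted = np.interp(t_imu, t_cam + dt, w_cam)
        err = np.mean((w_shifted - w_imu) ** 2)
        if err < best_err:
            best_dt, best_err = dt, err
    return best_dt
```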
Referring to fig. 12, in some embodiments, 023: acquiring the first mapping relationship according to the plurality of temperatures and the first time offsets corresponding to the plurality of temperatures includes:
0231: taking a curve fitted to the plurality of temperatures and the first time offsets corresponding to the plurality of temperatures as the first mapping relationship.
Referring to fig. 3, in some embodiments, the first mapping module 12 is further configured to perform the method in step 0231. That is, the first mapping module 12 is further configured to take, as the first mapping relationship, a curve fitted to the plurality of temperatures and the first time offsets corresponding to the plurality of temperatures.
Referring to fig. 4, in some embodiments, the processor 30 is further configured to perform the data processing method in step 0231.
Referring to fig. 13, in some embodiments, a B-spline may be used to fit the plurality of temperatures and their corresponding first time offsets, yielding a fitted curve BSpline(T), a function mapping temperature T to time offset Δt; this fitted curve BSpline(T) serves as the first mapping relationship. The first time offset for any temperature within the interval covered by BSpline(T) can then be obtained by evaluating the fitted curve.
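A sketch of this fit using SciPy; the sample values are illustrative, not calibration data from the patent:

```python
import numpy as np
from scipy.interpolate import splrep, BSpline

# Temperatures and their first time offsets from the preceding steps
# (example values only).
temps = np.array([10.0, 20.0, 30.0, 40.0, 50.0])               # °C
offsets = np.array([1.2e-3, 1.0e-3, 0.9e-3, 1.1e-3, 1.4e-3])   # seconds

tck = splrep(temps, offsets, k=3, s=0)  # interpolating cubic B-spline
bspline_T = BSpline(*tck)

# First time offset at any temperature inside the fitted interval:
dt_at_25C = float(bspline_T(25.0))
```

The second mapping relationship BSpline(F) over exposure durations can be fitted the same way.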
Referring to fig. 14, in some embodiments, 04: obtaining the second mapping relationship according to the second camera angular velocity and the second IMU angular velocity includes:
041: acquiring a second camera curve and a second IMU curve corresponding to each exposure duration, wherein the second camera curve plots the second camera angular velocity over time and the second IMU curve plots the second IMU angular velocity over time;
042: acquiring a second time offset corresponding to each exposure duration according to the second camera curve and the second IMU curve corresponding to that exposure duration; and
043: acquiring the second mapping relationship according to the plurality of exposure durations and the second time offsets corresponding to the plurality of exposure durations.
Referring to fig. 3, in some embodiments, the second mapping module 14 is further configured to perform the methods in steps 041, 042 and 043. That is, the second mapping module 14 is further configured to: acquire a second camera curve and a second IMU curve corresponding to each exposure duration, where the second camera curve plots the second camera angular velocity over time and the second IMU curve plots the second IMU angular velocity over time; acquire a second time offset corresponding to each exposure duration according to the second camera curve and the second IMU curve corresponding to that exposure duration; and acquire the second mapping relationship according to the plurality of exposure durations and the second time offsets corresponding to the plurality of exposure durations.
Referring to fig. 4, in some embodiments, the processor 30 is further configured to perform the data processing methods in steps 041, 042 and 043.
Referring to fig. 7 to 14, the method of obtaining the second mapping relationship is similar to that of the first mapping relationship, except that the variable is the exposure duration instead of the temperature. Referring to fig. 15 and 16, for example, the second camera angular velocity and the second IMU angular velocity obtained at exposure duration F1 generate a second camera curve S2-c1 and a second IMU curve S2-d1, respectively, and a second time offset ΔS2-t1 corresponding to exposure duration F1 can be obtained from these two curves. Similarly, the second time offsets ΔS2-t1, ΔS2-t2, ΔS2-t3, ..., ΔS2-tm corresponding to the exposure durations F1, F2, F3, ..., Fm can be obtained, and the mapping between the exposure durations and their second time offsets is the second mapping relationship.
Referring to fig. 17, in some embodiments, 042: acquiring a second time offset corresponding to each exposure duration according to the second camera curve and the second IMU curve corresponding to each exposure duration includes:
0421: offsetting the second camera curve or the second IMU curve by preset time offset amounts, and after offsetting, taking the time offset amount that minimizes the deviation between the second camera curve and the second IMU curve at each exposure duration as the second time offset corresponding to that exposure duration.
Referring to fig. 3, in some embodiments, the second mapping module 14 is further configured to perform the method of step 0421. That is, the second mapping module 14 is further configured to offset the second camera curve or the second IMU curve by a preset time offset, and after the offset, take, as a second time offset corresponding to the exposure time period, a time offset that minimizes a deviation between the second camera curve and the second IMU curve for each exposure time period.
Referring to fig. 4, in some embodiments, the processor 30 is further configured to perform the data processing method in step 0421.
Referring to fig. 10 and 15, the method of obtaining the second time offset is similar to that of the first time offset, except that the variable is the exposure duration instead of the temperature and the second camera curve and second IMU curve replace the first camera curve and first IMU curve. After offsetting the second camera curve or the second IMU curve by preset time offset amounts, the coincidence of the shifted curves is measured; the time offset corresponding to the highest coincidence among the multiple trials is selected and taken as the second time offset for that exposure duration.
Referring to fig. 16, in some embodiments, 043: acquiring the second mapping relationship according to the plurality of exposure durations and the second time offsets corresponding to the plurality of exposure durations includes:
0431: taking a curve fitted to the plurality of exposure durations and the second time offsets corresponding to the plurality of exposure durations as the second mapping relationship.
Referring to fig. 3, in some embodiments, the second mapping module 14 is further configured to perform the method in step 0431. That is, the second mapping module 14 is further configured to take, as the second mapping relationship, a curve fitted to the plurality of exposure durations and the second time offsets corresponding to the plurality of exposure durations.
Referring to fig. 4, in some embodiments, the processor 30 is further configured to perform the data processing method in step 0431.
Referring to fig. 13 and 16, similarly to the first mapping relationship, in some embodiments a B-spline may be used to fit the plurality of exposure durations and their corresponding second time offsets, yielding a fitted curve BSpline(F), a function mapping exposure duration F to time offset Δt; this fitted curve BSpline(F) is the second mapping relationship. The second time offset for any exposure duration within the interval covered by BSpline(F) can then be obtained by evaluating the fitted curve.
Referring to fig. 1, 13 and 16, in some embodiments the preset relationship is Δt = a·BSpline(T) + b·BSpline(F), where Δt is the time offset, a and b are preset weights, BSpline(T) is the first mapping relationship, and BSpline(F) is the second mapping relationship. When the photographing device 200 captures an image at a certain temperature and with a certain exposure duration, inputting that temperature and exposure duration into the preset relationship yields the calibration time offset corresponding to that temperature and exposure duration, and applying this calibration time offset aligns the timestamp of the camera with the timestamp of the inertial measurement unit 202 under those conditions. If the temperature or the exposure duration changes, inputting the new temperature and/or exposure duration into the preset relationship yields a new calibration time offset, so the timestamp of the camera and the timestamp of the inertial measurement unit 202 can be synchronized according to the new calibration time offset. In this way, the data processing method of the embodiment of the application can accurately obtain the calibration time offset for any combination of temperature and exposure duration, and thus accurately synchronize the timestamp of the camera and the timestamp of the inertial measurement unit 202 under all such conditions.
In some embodiments, the weight a in the preset relationship is smaller than the weight b; that is, the exposure duration influences the time offset more strongly than the temperature does. In one embodiment, the weight a = 0.2 and the weight b = 0.8. In other embodiments, the weight a may take any value in [0.1, 0.5), such as 0.1, 0.2, 0.3 or 0.4, and the weight b may take any value in (0.5, 1.0), such as 0.6, 0.7, 0.8 or 0.9; these values are not exhaustively listed here.
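Combining the two fitted curves under the preset relationship is then a one-liner; bspline_T and bspline_F are assumed to be the fitted curves from the previous steps, and the default weights follow the a = 0.2, b = 0.8 example above:

```python
def calibration_time_offset(temperature, exposure, bspline_T, bspline_F,
                            a=0.2, b=0.8):
    """Preset relationship: Δt = a·BSpline(T) + b·BSpline(F)."""
    return a * float(bspline_T(temperature)) + b * float(bspline_F(exposure))
```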
Referring to fig. 18, in some embodiments, the data processing method further includes:
06: performing attitude compensation on the pictures captured by the photographing device 200 according to the calibration time offset, the camera data of the photographing device 200 and the IMU data of the photographing device 200; and
07: generating an image or video according to the attitude-compensated pictures.
Referring to fig. 3, in some embodiments, the data processing apparatus further includes a compensation module 16 and a generation module 17. The compensation module 16 performs the method in step 06 and the generation module 17 performs the method in step 07. That is, the compensation module 16 is configured to perform attitude compensation on the pictures captured by the photographing device 200 according to the calibration time offset, the camera data of the photographing device 200 and the IMU data of the photographing device 200. The generation module 17 is configured to generate an image or video according to the attitude-compensated pictures.
Referring to fig. 4, in some embodiments, the processor 30 is further configured to perform the data processing methods in steps 06 and 07.
The camera data includes the pose parameters acquired via the camera 201, and the IMU data includes the pose parameters acquired by the inertial measurement unit 202. The timestamp of the camera and the timestamp of the inertial measurement unit 202 can be accurately synchronized according to the calibration time offset. Following the EIS method, once the timestamps of the camera 201 and the inertial measurement unit 202 are synchronized, performing attitude compensation on the pictures captured by the photographing device 200 using the camera data and the IMU data mitigates the shake-induced blurring of the picture to some extent. Generating an image from an attitude-compensated picture yields a sharp, shake-free image, and generating a video from multiple attitude-compensated frames yields a sharp, shake-free video.
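A hedged sketch of how the calibration offset might be applied when pairing each frame with IMU data; the sign convention and interpolation scheme are assumptions, not details from the patent:

```python
import numpy as np

def imu_value_for_frame(frame_ts, calib_offset, imu_ts, imu_vals):
    """Shift the image timestamp into the IMU clock, then interpolate the
    IMU samples (e.g. integrated angles) at the corrected time."""
    corrected_ts = frame_ts - calib_offset  # align the two clocks
    return np.interp(corrected_ts, imu_ts, imu_vals)
```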
In summary, the data processing method of the embodiment of the present application obtains the mapping relationship between temperature and time offset and the mapping relationship between exposure duration and time offset, that is, the first mapping relationship and the second mapping relationship, from the camera angular velocities and IMU angular velocities acquired at different temperatures and different exposure durations. By fusing the influence of temperature and exposure duration on the time offset through these two mapping relationships, the time offset between the camera 201 and the inertial measurement unit 202 is obtained more accurately.
Referring to fig. 19, an embodiment of the present application also provides a non-transitory computer readable storage medium 300 containing a computer program 301. One or more non-transitory computer-readable storage media 300 embodying a computer program 301 that, when executed by one or more processors 30, causes the processors 30 to perform the data processing method of any of the embodiments described above.
In the description of the present specification, reference is made to the terms "one embodiment," "some embodiments," "illustrative embodiments," "examples," "specific examples," or "some examples," etc., meaning that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiments or examples. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, the various embodiments or examples described in this specification and the features of the various embodiments or examples may be combined and combined by those skilled in the art without contradiction.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and further implementations are included within the scope of the preferred embodiment of the present application in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
While embodiments of the present application have been shown and described above, it will be understood that the above embodiments are illustrative and not to be construed as limiting the application, and that variations, modifications, alternatives, and variations may be made to the above embodiments by one of ordinary skill in the art within the scope of the application.
Claims (11)
1. A data processing method for processing data of a photographing device, the data processing method comprising:
acquiring a first camera angular velocity and a first IMU angular velocity of the photographing device at a plurality of different temperatures;
acquiring a first mapping relationship according to the first camera angular velocity and the first IMU angular velocity, wherein the first mapping relationship characterizes a mapping relationship between temperature and time offset;
acquiring a second camera angular velocity and a second IMU angular velocity of the photographing device at a plurality of different exposure durations;
acquiring a second mapping relationship according to the second camera angular velocity and the second IMU angular velocity, wherein the second mapping relationship characterizes a mapping relationship between exposure duration and time offset; and
acquiring a calibration time offset according to the first mapping relationship, the second mapping relationship and a preset relationship, wherein the calibration time offset represents a difference between an image timestamp of the photographing device and the timing of an IMU of the photographing device.
2. The data processing method of claim 1, wherein the acquiring the first camera angular velocity and the first IMU angular velocity of the photographing device at a plurality of different temperatures comprises:
at each temperature, acquiring the first camera angular velocity according to the angular change of the current frame image relative to the adjacent frame image, and acquiring the angular velocity measured by the IMU of the photographing device as the first IMU angular velocity.
3. The data processing method according to claim 1, wherein the acquiring a first mapping relationship according to the first camera angular velocity and the first IMU angular velocity, the first mapping relationship characterizing a mapping relationship between temperature and time offset, includes:
acquiring a first camera curve and a first IMU curve corresponding to each temperature, wherein the first camera curve plots the first camera angular velocity over time and the first IMU curve plots the first IMU angular velocity over time;
acquiring a first time offset corresponding to each temperature according to the first camera curve and the first IMU curve corresponding to that temperature; and
acquiring the first mapping relationship according to the plurality of temperatures and the first time offsets corresponding to the plurality of temperatures.
4. The data processing method according to claim 3, wherein the acquiring a first time offset corresponding to each temperature according to the first camera curve and the first IMU curve corresponding to each temperature comprises:
offsetting the first camera curve or the first IMU curve by preset time offset amounts, and after offsetting, taking the time offset amount that minimizes the deviation between the first camera curve and the first IMU curve at each temperature as the first time offset corresponding to the temperature.
5. The data processing method according to claim 1, wherein the acquiring the second camera angular velocity and the second IMU angular velocity of the photographing device at a plurality of different exposure durations comprises:
at each exposure duration, acquiring the second camera angular velocity according to the angular change of the current frame image relative to the adjacent frame image, and acquiring the angular velocity measured by the IMU of the photographing device as the second IMU angular velocity.
6. The data processing method according to claim 1, wherein the obtaining a second mapping relationship according to the second camera angular velocity and the second IMU angular velocity includes:
acquiring a second camera curve and a second IMU curve corresponding to each exposure duration, wherein the second camera curve plots the second camera angular velocity over time and the second IMU curve plots the second IMU angular velocity over time;
acquiring a second time offset corresponding to each exposure duration according to the second camera curve and the second IMU curve corresponding to that exposure duration; and
acquiring the second mapping relationship according to the plurality of exposure durations and the second time offsets corresponding to the plurality of exposure durations.
7. The data processing method of claim 6, wherein the acquiring of the second time offset corresponding to each exposure duration according to the second camera curve and the second IMU curve corresponding to each exposure duration comprises:
shifting the second camera curve or the second IMU curve by preset time offset amounts, and taking, at each exposure duration, the time offset amount that minimizes the deviation between the shifted curves as the second time offset corresponding to that exposure duration.
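Claims 5 to 7 mirror the temperature branch: the curve-alignment routine sketched after claim 4 is simply rerun once per exposure duration, and the resulting (exposure, offset) pairs are fitted to obtain the second mapping. The linear form and every number below are assumptions; a slope near 1/2 is plausible when frames are timestamped at the start of exposure but the observed motion is centered mid-exposure.

```python
import numpy as np

# Hypothetical per-exposure measurements: (exposure duration in seconds,
# second time offset in seconds), each from one curve-alignment run.
exposures = np.array([1 / 500, 1 / 250, 1 / 120, 1 / 60, 1 / 30])
offsets = np.array([2.1e-3, 3.0e-3, 5.2e-3, 9.4e-3, 17.8e-3])

# A straight line is one natural candidate for the second mapping.
slope, intercept = np.polyfit(exposures, offsets, deg=1)
f_exp = np.poly1d([slope, intercept])  # offset ~= slope * exposure + intercept
```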
8. The data processing method of claim 1, further comprising:
performing attitude compensation on frames captured by the photographing device according to the calibration time offset, the camera data of the photographing device, and the IMU data of the photographing device; and
generating an image or a video from the attitude-compensated frames.
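The compensation in claim 8 hinges on evaluating the device attitude at the corrected image time. A minimal sketch of that lookup, assuming the IMU track is stored as unit quaternions and interpolated with scipy's Slerp; the sign applied to the offset is an assumption:

```python
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def attitude_at_frame(img_ts, imu_ts, imu_quats, calib_offset):
    """Device attitude at a frame's true exposure time.

    imu_ts: increasing IMU sample times (s); imu_quats: Nx4 quaternions.
    The calibrated offset moves the image timestamp onto the IMU clock
    before interpolation; the resulting rotation is what a stabilization
    pipeline would use to warp (attitude-compensate) the frame.
    """
    slerp = Slerp(imu_ts, Rotation.from_quat(imu_quats))
    return slerp(img_ts + calib_offset)
```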
9. A data processing apparatus for processing data of a photographing device, comprising:
a first acquisition module configured to acquire a first camera angular velocity and a first IMU angular velocity of the photographing device at a plurality of different temperatures;
a first mapping module configured to acquire a first mapping relation according to the first camera angular velocity and the first IMU angular velocity, the first mapping relation representing a mapping between temperature and time offset;
a second acquisition module configured to acquire a second camera angular velocity and a second IMU angular velocity of the photographing device at a plurality of different exposure durations;
a second mapping module configured to acquire a second mapping relation according to the second camera angular velocity and the second IMU angular velocity, the second mapping relation representing a mapping between exposure duration and time offset; and
a calibration module configured to acquire a calibration time offset according to the first mapping relation, the second mapping relation, and a preset relation, the calibration time offset representing the difference between the image timestamp of the photographing device and the timing of the IMU of the photographing device.
10. A terminal, comprising:
one or more processors and a memory; and
one or more programs, wherein the one or more programs are stored in the memory and executed by the one or more processors, the one or more programs comprising instructions for performing the data processing method of any one of claims 1 to 8.
11. A non-transitory computer-readable storage medium storing a computer program which, when executed by one or more processors, causes the one or more processors to perform the data processing method of any one of claims 1 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN202211436777.3A | 2022-11-16 | 2022-11-16 | Data processing method, data processing device, terminal and readable storage medium
Publications (1)
Publication Number | Publication Date
---|---
CN118055318A (en) | 2024-05-17
Family
ID=91052650
Family Applications (1)
Application Number | Title | Priority Date | Filing Date | Status
---|---|---|---|---
CN202211436777.3A | Data processing method, data processing device, terminal and readable storage medium | 2022-11-16 | 2022-11-16 | Pending
Country Status (1)
Country | Link
---|---
CN (1) | CN118055318A (en)
Similar Documents
Publication | Title
---|---
CN110567453B (en) | Bionic eye multi-channel IMU and camera hardware time synchronization method and device
US7564482B2 | Image capturing device, correction device, mobile phone, and correcting method
CN107223330B (en) | Depth information acquisition method and device and image acquisition equipment
JP6263623B2 (en) | Image generation method and dual lens apparatus
CN101594464B (en) | Imaging apparatus and imaging method
CN101572777B (en) | Filming device and filming method
CN108603752B (en) | Deflection angle detection method and device and jitter compensation method and device for camera module of terminal
CN111934843A (en) | Multi-sensor data synchronous acquisition method for intelligent unmanned system
US11032477B2 | Motion stabilized image sensor, camera module and apparatus comprising same
CN110300263B (en) | Gyroscope processing method and device, electronic equipment and computer readable storage medium
CN109598764A (en) | Camera calibration method and device, electronic equipment, computer readable storage medium
CN109600548A (en) | Image processing method and device, electronic equipment, computer readable storage medium
CN107135349A (en) | Picture pick-up device, lens unit, camera system and its control method
CN109073407A (en) | Drift scaling method, equipment and the unmanned vehicle of Inertial Measurement Unit
CN108260360B (en) | Scene depth calculation method and device and terminal
CN110456602A (en) | A kind of projection pattern means for correcting of optical projection system, method and system
CN111556226A (en) | Camera system
TW200817809A (en) | Image stabilizing in cameras
CN110113542B (en) | Anti-shake method and apparatus, electronic device, computer-readable storage medium
CN105865423A (en) | A binocular range finding method, a binocular range finding device, a panoramic image mosaicking method and a system thereof
CN109147059B (en) | Method and equipment for determining delay value
CN113436267B (en) | Visual inertial navigation calibration method, device, computer equipment and storage medium
CN108955642B (en) | Large-breadth equivalent center projection image seamless splicing method
US10362303B2 | Sensor-assisted autofocus calibration
CN118055318A (en) | Data processing method, data processing device, terminal and readable storage medium
Legal Events
Code | Title
---|---
PB01 | Publication
SE01 | Entry into force of request for substantive examination