CN114025158B - Delay time determination method and device, image acquisition equipment and storage medium - Google Patents

Delay time determination method and device, image acquisition equipment and storage medium

Info

Publication number
CN114025158B
CN114025158B CN202210014365.4A
Authority
CN
China
Prior art keywords
sequence
transformation
gyroscope
video
delay time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210014365.4A
Other languages
Chinese (zh)
Other versions
CN114025158A (en)
Inventor
杨熙丞
雷春霞
王廷鸟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Dahua Technology Co Ltd
Original Assignee
Zhejiang Dahua Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Dahua Technology Co Ltd filed Critical Zhejiang Dahua Technology Co Ltd
Priority to CN202210014365.4A priority Critical patent/CN114025158B/en
Publication of CN114025158A publication Critical patent/CN114025158A/en
Application granted granted Critical
Publication of CN114025158B publication Critical patent/CN114025158B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00 Diagnosis, testing or measuring for television systems or their details
    • H04N17/002 Diagnosis, testing or measuring for television systems or their details for television cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/68 Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/681 Motion detection
    • H04N23/6811 Motion detection based on the image signal
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/68 Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/681 Motion detection
    • H04N23/6812 Motion detection based on additional sensors, e.g. acceleration sensors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/68 Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/682 Vibration or motion blur correction

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Studio Devices (AREA)

Abstract

The application relates to a delay time determination method, a delay time determination device, an image acquisition device and a storage medium, wherein the delay time determination method comprises the following steps: acquiring a first transformation sequence and a second transformation sequence, wherein the first transformation sequence is used for representing the change condition of adjacent video frames in a preset period, and the second transformation sequence is used for representing the change condition of adjacent gyroscope frames in the preset period; carrying out normalized cross-correlation operation on the first transformation sequence and the second transformation sequence to obtain a cross-correlation sequence; determining the number of interval frames of a target video frame and a target gyroscope frame according to the cross-correlation sequence, wherein the time stamps of the target video frame and the target gyroscope frame are the same; and determining the delay time according to the interval frame number. By the method and the device, the problem that the accuracy of the delay time obtained in the prior art is low is solved, and the calculation accuracy of the delay time is improved.

Description

Delay time determination method and device, image acquisition equipment and storage medium
Technical Field
The present application relates to the field of image processing, and in particular, to a method and an apparatus for determining a delay time, an image capturing device, and a storage medium.
Background
In mobile video services, an anti-shake function can be adopted to improve the stability of video recording and reduce video shake caused by factors such as hand-held shaking. The gyroscope is an important component of the anti-shake function, and the attitude information it produces needs to be kept synchronized in time with the image information in order to obtain a stable anti-shake effect. If the two signals cannot be kept synchronized, that is, there is a delay time between the acquired attitude information and the image information, the image information is shake-compensated using attitude information from another moment. This reduces the stability of the anti-shake result and may even lead to an anti-shake effect inferior to that of the original video.
In the prior art, corresponding feature points in adjacent frame images are determined, the motion direction is obtained from the gyroscope, and different motion directions are obtained by adjusting the timestamp delay between the gyroscope and the images. When the feature points of the previous frame are corrected by projection transformation according to the current motion direction, predicted coordinate points for the next frame are obtained and differenced with the actual feature points of the next frame; the result is a cost function. When the value of the cost function is minimal, the corresponding timestamp delay is considered correct. However, the prior art directly uses image feature points to calculate the delay time, is limited by the influence of lighting and video quality, and the accuracy of the resulting delay time is poor.
For the problem of low delay-time calculation accuracy in the related art, no effective solution has been proposed so far.
Disclosure of Invention
The embodiment provides a delay time determination method and device, an image acquisition device and a storage medium, so as to solve the problem of low calculation accuracy of delay time in the related art.
In a first aspect, a delay time determination method is provided in this embodiment, including:
acquiring a first transformation sequence and a second transformation sequence, wherein the first transformation sequence is used for representing the change condition of adjacent video frames in a preset period, and the second transformation sequence is used for representing the change condition of adjacent gyroscope frames in the preset period;
carrying out normalized cross-correlation operation on the first transformation sequence and the second transformation sequence to obtain a cross-correlation sequence;
determining the number of interval frames of a target video frame and a target gyroscope frame according to the cross-correlation sequence, wherein the time stamps of the target video frame and the target gyroscope frame are the same;
and determining the delay time according to the interval frame number.
In one embodiment, the obtaining the first transform sequence and the second transform sequence comprises: acquiring original video data in the preset period; performing feature point matching according to feature points of adjacent frames of the original video data to obtain a first transformation matrix; acquiring original gyroscope data in the preset period; converting the original gyroscope data under a camera coordinate system into gyroscope data under an image coordinate system to obtain a second transformation matrix, wherein the image coordinate system is established based on a video frame image shot by the camera; and performing projection transformation on a preset point set by using the first transformation matrix and the second transformation matrix, and determining the first transformation sequence and a corresponding second transformation sequence.
In one embodiment, performing projection transformation on the preset point set by using the first transformation matrix and the second transformation matrix, and determining the first transformation sequence and the corresponding second transformation sequence, includes: performing projection transformation on a preset image point set by using the first transformation matrix and the second transformation matrix to obtain a video transformation point sequence set and a gyroscope transformation point sequence set; calculating discrete values of each group of the gyroscope transformation point sequences in the gyroscope transformation point sequence set, and taking the gyroscope transformation point sequence with the maximum discrete value as the second transformation sequence; and taking the video transformation point sequence corresponding to the second transformation sequence as the first transformation sequence.
In one embodiment, before performing a normalized cross-correlation operation on the first transform sequence and the second transform sequence to obtain a cross-correlation sequence, the method further includes: and detecting whether the number of the video frames of the first transformation sequence is the same as the number of the gyroscope frames of the second transformation sequence, and if the number of the video frames is less than the number of the gyroscope frames, performing interpolation operation on the first transformation sequence to ensure that the number of the video frames of the first transformation sequence after interpolation is the same as the number of the gyroscope frames of the second transformation sequence.
In one embodiment, obtaining the first transformation sequence and the second transformation sequence within the preset period includes: filtering the first transformation sequence and the second transformation sequence to obtain second video data and second gyroscope data; subtracting the second video data from the first transformation sequence to obtain third video data; subtracting the second gyroscope data from the second transformation sequence to obtain third gyroscope data; and performing a normalized cross-correlation operation on the third video data and the third gyroscope data to determine the delay time.
In one embodiment, the determining the delay time according to the number of interval frames comprises: correcting the video sequence timestamp and/or the gyroscope sequence timestamp based on the delay time to obtain correction data; and correcting the current video based on the correction data to obtain a target video.
In one embodiment, the obtaining the first transform sequence and the second transform sequence in the preset period includes: acquiring original gyroscope data; and calculating the variance of the original gyroscope data of multiple frames, if the variance is greater than a preset threshold value, determining that the camera shakes, and acquiring a first transformation sequence and a corresponding second transformation sequence.
In a second aspect, there is provided in this embodiment a delay time determination apparatus, including:
the device comprises an acquisition module, a processing module and a display module, wherein the acquisition module is used for acquiring a first transformation sequence and a second transformation sequence, the first transformation sequence is used for representing the change condition of adjacent video frames in a preset period, and the second transformation sequence is used for representing the change condition of adjacent gyroscope frames in the preset period;
the first processing module is used for carrying out normalized cross-correlation operation on the first transformation sequence and the second transformation sequence to obtain a cross-correlation sequence;
the second processing module is used for determining the number of interval frames of a target video frame and a target gyroscope frame according to the cross-correlation sequence, and the time stamps of the target video frame and the target gyroscope frame are the same;
and the determining module is used for determining the delay time according to the interval frame number.
In a third aspect, there is provided an image capturing apparatus comprising a camera, a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the delay time determining method of the first aspect when executing the computer program.
In a fourth aspect, in the present embodiment, there is provided a storage medium having stored thereon a computer program which, when executed by a processor, implements the delay time determination method described in the first aspect above.
Compared with the related art, the delay time determination method and device, image acquisition equipment, and storage medium provided in this embodiment acquire a first transformation sequence and a second transformation sequence, where the first transformation sequence is used to represent the change condition of adjacent video frames in a preset period, and the second transformation sequence is used to represent the change condition of adjacent gyroscope frames in the preset period; perform a normalized cross-correlation operation on the first transformation sequence and the second transformation sequence to obtain a cross-correlation sequence; determine the number of interval frames between a target video frame and a target gyroscope frame according to the cross-correlation sequence, where the timestamps of the target video frame and the target gyroscope frame are the same; and determine the delay time according to the number of interval frames. The delay time is not calculated directly from image feature points but from the correlation between the video frames and the gyroscope frames, which is less affected by lighting and video quality; this solves the problem of the low accuracy of the delay time obtained in the prior art and improves the calculation accuracy of the delay time.
The details of one or more embodiments of the application are set forth in the accompanying drawings and the description below to provide a more thorough understanding of the application.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a block diagram of a hardware configuration of a terminal of a delay time determination method according to an embodiment of the present application;
FIG. 2 is a flow chart of a delay time determination method according to an embodiment of the present application;
FIG. 3 is a diagram illustrating a preset point set of a delay time determination method according to an embodiment of the present application;
FIG. 4 is a flow chart of a delay time determination method according to another embodiment of the present application;
fig. 5 is a block diagram of a delay time determination apparatus according to an embodiment of the present application.
Detailed Description
For a clearer understanding of the objects, aspects and advantages of the present application, reference is made to the following description and accompanying drawings.
Unless defined otherwise, technical or scientific terms used herein shall have the same general meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The use of the terms "a" and "an" and "the" and similar referents in the context of this application do not denote a limitation of quantity, either in the singular or the plural. The terms "comprises," "comprising," "has," "having," and any variations thereof, as referred to in this application, are intended to cover non-exclusive inclusions; for example, a process, method, and system, article, or apparatus that comprises a list of steps or modules (elements) is not limited to the listed steps or modules, but may include other steps or modules (elements) not listed or inherent to such process, method, article, or apparatus. Reference throughout this application to "connected," "coupled," and the like is not limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. Reference to "a plurality" in this application means two or more. "and/or" describes an association relationship of associated objects, meaning that three relationships may exist, for example, "A and/or B" may mean: a exists alone, A and B exist simultaneously, and B exists alone. In general, the character "/" indicates a relationship in which the objects associated before and after are an "or". The terms "first," "second," "third," and the like in this application are used for distinguishing between similar items and not necessarily for describing a particular sequential or chronological order.
The method embodiments provided in the present embodiment may be executed in a terminal, a computer, or a similar computing device. For example, the method is executed on a terminal, and fig. 1 is a block diagram of a hardware structure of the terminal according to the delay time determination method in the embodiment of the present application. As shown in fig. 1, the terminal may include one or more processors 102 (only one shown in fig. 1) and a memory 104 for storing data, wherein the processor 102 may include, but is not limited to, a processing device such as a microprocessor MCU or a programmable logic device FPGA. The terminal may also include a transmission device 106 for communication functions and an input-output device 108. It will be understood by those of ordinary skill in the art that the structure shown in fig. 1 is merely an illustration and is not intended to limit the structure of the terminal described above. For example, the terminal may also include more or fewer components than shown in FIG. 1, or have a different configuration than shown in FIG. 1.
The memory 104 may be used to store a computer program, for example, a software program and a module of application software, such as a computer program corresponding to the delay time determination method in the present embodiment, and the processor 102 executes various functional applications and data processing by running the computer program stored in the memory 104, so as to implement the method described above. The memory 104 may include high speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory located remotely from the processor 102, which may be connected to the terminal over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device 106 is used to receive or transmit data via a network. The network described above includes a wireless network provided by a communication provider of the terminal. In one example, the transmission device 106 includes a Network adapter (NIC) that can be connected to other Network devices through a base station to communicate with the internet. In one example, the transmission device 106 may be a Radio Frequency (RF) module, which is used to communicate with the internet in a wireless manner.
With the development of shooting devices such as smart phones and cameras, the demand for mobile video services has also increased greatly. When shooting a video, anti-shake processing generally needs to be performed on the shooting device in order to obtain a better-looking picture or a more stable video. Existing anti-shake processing generally compensates the image in an electronic anti-shake and/or optical anti-shake manner. The electronic anti-shake manner mainly synchronizes the image information with the attitude information of the electronic device; however, because the delay time is calculated inaccurately, the synchronization between the image information and the attitude information is inaccurate, which finally results in a poor anti-shake effect.
In the present embodiment, a method for determining a delay time is provided, and fig. 2 is a flowchart of a method for determining a delay time according to an embodiment of the present application, and as shown in fig. 2, the flowchart includes the following steps:
step S201, a first transformation sequence and a second transformation sequence are obtained, where the first transformation sequence is used to represent a change condition of an adjacent video frame in a preset period, and the second transformation sequence is used to represent a change condition of an adjacent gyroscope frame in the preset period.
Specifically, an original video sequence and the corresponding gyroscope data sequence in the same time period are selected; the first transformation sequence is determined by calculating the change condition of adjacent video frames of the original video sequence, and the second transformation sequence is determined by calculating the change condition of adjacent gyroscope frames of the original gyroscope data sequence.
Step S202, carrying out normalized cross-correlation operation on the first transformation sequence and the second transformation sequence to obtain a cross-correlation sequence.
Specifically, the normalized cross-correlation operation is mainly used to describe the degree of similarity between two signals; in image matching, the degree of similarity between images of a sequence or images from different perspectives is usually described by a correlation coefficient. In this embodiment, the correlation between the first transformation sequence and the second transformation sequence can be determined through the normalized cross-correlation operation, and the cross-correlation sequence obtained by this operation reflects both the correlation coefficient of the first transformation sequence and the second transformation sequence and the number of interval frames between the related video frames and gyroscope frames.
Step S203, determining the number of interval frames of a target video frame and a target gyroscope frame according to the cross-correlation sequence, wherein the time stamps of the target video frame and the target gyroscope frame are the same.
Specifically, the cross-correlation sequence gives the number of frames by which the corresponding video frame and gyroscope frame differ under the same timestamp. For example, without any delay time the video frame tn1 should correspond to the gyroscope frame tg12, whereas due to the delay time the video frame tn1 now corresponds to the gyroscope frame tg15. The interval between the corresponding target video frame and target gyroscope frame, determined according to the cross-correlation sequence, is then 3 gyroscope frames.
And step S204, determining the delay time according to the interval frame number.
Specifically, the delay time can be determined from the number of interval frames and the time interval between adjacent frames.
Through the steps, the delay time determining method of the embodiment of the application combines the change condition of the video frame and the change condition of the gyroscope frame through the normalized cross-correlation operation, does not adopt the characteristic points to directly calculate the delay time, but utilizes the relation between the video frame and the gyroscope frame in the time dimension to calculate the delay time, and compared with the prior art, the method is less influenced by light and video quality and has better robustness.
In one embodiment, the obtaining the first transform sequence and the second transform sequence comprises: acquiring original video data in the preset period; performing feature point matching according to feature points of adjacent frames of the original video data to obtain a first transformation matrix; acquiring original gyroscope data in the preset period; converting the original gyroscope data under a camera coordinate system into gyroscope data under an image coordinate system to obtain a second transformation matrix, wherein the image coordinate system is established based on a video frame image shot by the camera; and performing projection transformation on a preset point set by using the first transformation matrix and the second transformation matrix, and determining the first transformation sequence and a corresponding second transformation sequence.
Specifically, the preset period is a preset time period, and continuous multi-frame original video data and original gyroscope data within this time period are acquired. In this embodiment, the raw video data is a video sequence (n1, n2, ..., ni, ...) of the mobile device. The format of the video sequence may be yuv images, rgb images, etc. The raw gyroscope data is a gyroscope data sequence (g1, g2, ..., gj, ...) corresponding to the video sequence, and each gyroscope frame consists of three-axis angular velocity data (gxj, gyj, gzj). Since the original video data is generally 25 or 30 frames per second while the original gyroscope data is generally sampled at 200 Hz, multiple frames of original gyroscope data actually correspond to one frame of original video data; therefore, the timestamp data of the two, (tn1, tn2, ..., tni, ...) and (tg1, tg2, ..., tgj, ...), need to be acquired respectively. Adjacent frames of the original video data have a time interval t1, and adjacent frames of the raw gyroscope data have a time interval t2; t1 and t2 are typically fixed values. The delay time td is unknown at this point, which means that the timestamp data of the video sequence and of the gyroscope sequence may not be synchronized. If the two are synchronized, the delay time td is 0. For example, assume tn1 = tg12 and tn2 = tg18; then the raw gyroscope frames g12 to g17 correspond to video frame n1 of the original video data. If there is a delay time, td is not 0; suppose tn1 = tg7 + td and tn2 = tg13 + td; then the raw gyroscope frames g7 to g12 correspond to video frame n1 of the original video data.
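The timestamp bookkeeping described above can be expressed compactly. The following is a minimal sketch in Python; the sampling rates and variable names (mirroring tn, tg and td from the text) are illustrative assumptions, not the patented implementation:

import numpy as np

def gyro_indices_for_frame(tn, tg, i, td=0.0):
    # gyroscope samples j whose delayed timestamp tg[j] + td falls within
    # the interval of video frame i, i.e. tn[i] <= tg[j] + td < tn[i + 1]
    shifted = tg + td
    return np.where((shifted >= tn[i]) & (shifted < tn[i + 1]))[0]

# Example: 30 fps video (t1 ~ 0.033 s) and 200 Hz gyroscope (t2 = 0.005 s)
tn = np.arange(0.0, 1.0, 1.0 / 30.0)
tg = np.arange(0.0, 1.0, 1.0 / 200.0)
print(gyro_indices_for_frame(tn, tg, 0))           # synchronized case, td = 0
print(gyro_indices_for_frame(tn, tg, 0, td=0.02))  # with a non-zero delay time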
The first transformation matrix can be calculated by adopting algorithms such as Scale-Invariant Feature Transform (SIFT) or Speeded-Up Robust Features (SURF) to acquire the feature points (Fn1, Fn2, ..., Fni, ...) of each frame, performing feature point matching on the feature points of adjacent video frames, and calculating the transformation matrices (Tn1, Tn2, Tn3, ..., Tni, ...) of the adjacent frames from the successfully matched feature point pairs. The transformation matrix is typically a projective transformation matrix. In order to reduce computation time and adapt to mobile devices, the video sequence can be down-sampled to obtain reduced video frame images before the transformation matrices are calculated.
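As an illustration of this feature-point-based estimation, the following sketch uses standard OpenCV calls (SIFT detection, brute-force matching with a ratio test, RANSAC homography); the downscale factor and thresholds are illustrative assumptions rather than values taken from the patent:

import cv2
import numpy as np

def adjacent_frame_transform(prev_gray, curr_gray, scale=0.5):
    # Downscale to reduce computation time on mobile devices
    prev_small = cv2.resize(prev_gray, None, fx=scale, fy=scale)
    curr_small = cv2.resize(curr_gray, None, fx=scale, fy=scale)

    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(prev_small, None)
    kp2, des2 = sift.detectAndCompute(curr_small, None)

    # Match feature points of the adjacent frames; the ratio test keeps good pairs
    matcher = cv2.BFMatcher()
    matches = matcher.knnMatch(des1, des2, k=2)
    good = [m for m, n in matches if m.distance < 0.75 * n.distance]

    src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)

    # Projective (homography) transformation matrix Tni for this frame pair
    Tn, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    return Tn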
The process of computing the second transformation matrix includes: using the three-axis angular velocity data (gxj, gyj, gzj), a rotation matrix Rgj from the camera coordinate system to the world coordinate system at that moment is obtained through three-dimensional coordinate transformation, and the world coordinate system is further converted to the image coordinate system of the camera through an internal reference matrix K, thereby determining the second transformation matrix Tgj. The calculation formula of Tgj is:
Tgj = K · Rgj · K^(-1)
where K is the internal reference matrix, which can be obtained by pre-calibration.
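A minimal sketch of this computation, assuming the angular velocities are integrated over one gyroscope interval and converted to a rotation matrix with the Rodrigues formula; the intrinsic matrix values below are placeholders, not calibration results:

import cv2
import numpy as np

# Pre-calibrated internal reference (intrinsic) matrix K; values are placeholders.
K = np.array([[1000.0,    0.0, 960.0],
              [   0.0, 1000.0, 540.0],
              [   0.0,    0.0,   1.0]])

def gyro_transform(gx, gy, gz, t2):
    # Integrate the three-axis angular velocity over one gyroscope interval t2
    rot_vec = np.array([gx, gy, gz], dtype=np.float64) * t2
    Rg, _ = cv2.Rodrigues(rot_vec)        # rotation matrix for this interval
    # Map the rotation into the image coordinate system through K
    return K @ Rg @ np.linalg.inv(K)      # second transformation matrix Tg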
After the first transformation matrix and the second transformation matrix are obtained, projection transformation is performed on the preset point set by using the first transformation matrix and the second transformation matrix to obtain the first transformation sequence and the second transformation sequence. The preset point set is a set of points at specified positions of the video image in the image coordinate system.
In one embodiment, performing projection transformation on the preset point set by using the first transformation matrix and the second transformation matrix, and determining the first transformation sequence and the corresponding second transformation sequence, includes: performing projection transformation on a preset image point set by using the first transformation matrix and the second transformation matrix to obtain a video transformation point sequence set and a gyroscope transformation point sequence set; calculating discrete values of each group of the gyroscope transformation point sequences in the gyroscope transformation point sequence set, and taking the gyroscope transformation point sequence with the maximum discrete value as the second transformation sequence; and taking the video transformation point sequence corresponding to the second transformation sequence as the first transformation sequence.
Specifically, the first transformation matrix and the second transformation matrix are used to respectively perform projection transformation on a fixed point set P, obtaining a video transformation point sequence set and a gyroscope transformation point sequence set. The preset point set in the embodiment of the present application takes the form of a fixed point set, and each point in the fixed point set may be characterized by its coordinates or slope. The fixed point set is preset and may be a single point, such as the image point (h/2, w/2), or multiple points. Fig. 3 is a diagram illustrating a preset point set of a delay time determination method according to an embodiment of the present application. As shown in fig. 3, the fixed point set consists of the center point of the image and 4 regularly placed surrounding points, and for each point the abscissa value, the ordinate value and the slope (ordinate over abscissa) can be recorded. For the fixed point set P = {P1, P2, P3, P4, P5} in fig. 3, each frame therefore yields a set of 15 video transformation point sequences Sni = {Pni1x, Pni1y, Pni1y/Pni1x, ..., Pni5x, Pni5y, Pni5y/Pni5x}; similarly, the second transformation matrix performs projection transformation on the fixed point set to obtain the set of gyroscope transformation point sequences Sgj = {Pgj1x, Pgj1y, Pgj1y/Pgj1x, ..., Pgj5x, Pgj5y, Pgj5y/Pgj5x}. The gyroscope transformation point sequence with the largest degree of dispersion in Sgj, together with the corresponding video transformation point sequence, is then determined, where the degree of dispersion is measured by the variance. For example, if the variance of the gyroscope transformation point sequence {Pgj2x} is found to be the maximum, the gyroscope transformation point sequence P1gj = {Pgj2x} and the corresponding video transformation point sequence P1ni = {Pni2x} are selected for subsequent calculation. Selecting the sequence with a large degree of dispersion helps to increase the accuracy of the delay time calculation.
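The projection of the fixed point set and the selection of the most dispersed sequence could look like the following sketch; the use of variance as the dispersion measure follows the text, while the data structures and point handling are illustrative:

import numpy as np

def project(T, pt):
    # homogeneous projection of a single image point through matrix T
    x, y, w = T @ np.array([pt[0], pt[1], 1.0])
    return x / w, y / w

def point_sequences(transforms, points):
    # one scalar sequence per (point, component); components are x, y and slope y/x
    seqs = {}
    for k, pt in enumerate(points):
        xs, ys = zip(*(project(T, pt) for T in transforms))
        xs, ys = np.array(xs), np.array(ys)
        seqs[(k, 'x')], seqs[(k, 'y')], seqs[(k, 'slope')] = xs, ys, ys / xs
    return seqs

def select_most_dispersed(video_transforms, gyro_transforms, points):
    v_seqs = point_sequences(video_transforms, points)
    g_seqs = point_sequences(gyro_transforms, points)
    key = max(g_seqs, key=lambda k: np.var(g_seqs[k]))   # largest discrete value
    return v_seqs[key], g_seqs[key]                      # P1ni, P1gj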
In one embodiment, before performing a normalized cross-correlation operation on the first transform sequence and the second transform sequence to obtain a cross-correlation sequence, the method further includes: and detecting whether the number of the video frames of the first transformation sequence is the same as the number of the gyroscope frames of the second transformation sequence, and if the number of the video frames is less than the number of the gyroscope frames, performing interpolation operation on the first transformation sequence to ensure that the number of the video frames of the first transformation sequence after interpolation is the same as the number of the gyroscope frames of the second transformation sequence.
Specifically, the video transformation point sequence P1ni is interpolated to obtain a video second transformation point sequence P2nj, whose number of elements is consistent with the gyroscope transformation point sequence P1gj. For a point P2nj to be solved in the video second transformation point sequence, whose timestamp tgj comes from the gyroscope transformation point sequence, with the video transformation points before and after it being (tnx, P1nx) and (tny, P1ny), the interpolation is P2nj = P1nx + (tgj - tnx) × (P1ny - P1nx) / (tny - tnx).
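A one-line linear interpolation, for example with numpy, is enough to realize this alignment; the variable names below are illustrative:

import numpy as np

def align_video_to_gyro(video_ts, video_vals, gyro_ts):
    # video_ts / video_vals: timestamps and values of the video sequence P1ni
    # gyro_ts: timestamps of the gyroscope sequence P1gj
    return np.interp(gyro_ts, video_ts, video_vals)   # interpolated sequence P2nj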
In one embodiment, obtaining the first transformation sequence and the second transformation sequence within the preset period includes: filtering the first transformation sequence and the second transformation sequence to obtain second video data and second gyroscope data; subtracting the second video data from the first transformation sequence to obtain third video data; subtracting the second gyroscope data from the second transformation sequence to obtain third gyroscope data; and performing a normalized cross-correlation operation on the third video data and the third gyroscope data to determine the delay time.
Specifically, the first transformation sequence P2nj and the second transformation sequence P1gj are each filtered and then differenced from the original sequence itself, obtaining a video third transformation point sequence P3nj and a gyroscope second transformation point sequence P2gj. The calculation is:
P3nj = P2nj - filter(P2nj);
P2gj = P1gj - filter(P1gj);
The filtering operation may be one-dimensional mean filtering, one-dimensional Gaussian filtering, or the like. The result better reflects the discrete characteristic of the transformation point sequences.
The second gyroscope transformation point sequence P2gj is then modified into a third gyroscope transformation point sequence P3gj by setting the values of q points at the left and right ends of the sequence to 0; the larger q is, the larger the range of detectable delay time, but the lower the precision of the delay time. A normalized cross-correlation operation is then performed on the video third transformation point sequence P3nj and the gyroscope third transformation point sequence P3gj to obtain a fourth transformation point sequence P4l. The normalized cross-correlation operation formula is:
P4(l) = Σj P3nj(j) · P3gj(j + l) / sqrt( Σj P3nj(j)^2 · Σj P3gj(j)^2 ), l = -(n-1), ..., n-1
where n is the total number of elements in the video third transformation point sequence. The domain of the fourth transformation point sequence is therefore [-n+1, n-1], and the fourth transformation point sequence is the cross-correlation sequence. The delay time td is calculated using the fourth transformation point sequence P4l, and its reliability is judged using the maximum value of the sequence: the maximum value P4max of the sequence P4l and its index maxl are found, and the delay time is obtained as td = maxl × t2, where t2 is the time interval between two consecutive gyroscope samples. For example, when P4(-2) is the maximum of the sequence and the gyroscope interval time is 0.005 s, the delay time is td = -2 × 0.005 = -0.01 s. A delay time reliability threshold tr is set; when P4max is less than tr, the delay time is considered unreliable, possibly because the device moved too little, and the above steps need to be re-executed to recalculate the delay time.
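Putting these last steps together, a sketch of the delay estimation could look as follows; the reliability threshold value and the lag sign convention of np.correlate are illustrative assumptions:

import numpy as np

def estimate_delay(p3n, p2g, q, t2, reliability_threshold=0.3):
    p3g = p2g.copy()
    if q > 0:
        p3g[:q] = 0.0                      # zero q points at both ends; larger q
        p3g[-q:] = 0.0                     # widens the detectable delay range
    # normalized cross-correlation over all lags from -(n-1) to n-1
    denom = np.sqrt(np.sum(p3n ** 2) * np.sum(p3g ** 2))
    p4 = np.correlate(p3n, p3g, mode='full') / denom
    n = len(p3n)
    lags = np.arange(-(n - 1), n)
    max_idx = int(np.argmax(p4))
    if p4[max_idx] < reliability_threshold:
        return None                        # movement too small; recollect data
    return lags[max_idx] * t2              # delay time td = maxl * t2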
In one embodiment, the determining the delay time according to the number of interval frames comprises: correcting the video sequence timestamp and/or the gyroscope sequence timestamp based on the delay time to obtain correction data; and correcting the current video based on the correction data to obtain a target video.
Specifically, according to the delay time td, the current video sequence timestamp or the current gyroscope timestamp is corrected, and the jittered video is then corrected using the corrected video sequence data and gyroscope data to obtain a stabilized target video.
In one embodiment, the delay time is recalculated and updated at intervals, to account for the fact that the delay time may change with factors such as the start-up time of the device and the temperature of the device.
In one embodiment, the obtaining the first transform sequence and the second transform sequence in the preset period includes: acquiring original gyroscope data; and calculating the variance of the original gyroscope data of multiple frames, if the variance is greater than a preset threshold value, determining that the camera shakes, and acquiring a first transformation sequence and a corresponding second transformation sequence.
Specifically, whether the mobile device starts to move can be determined from the raw gyroscope data, and the calculation of the delay time is started when it is determined that the device starts to move. Whether the device starts to move can be determined in various ways; for example, the three mean values or three variances of the triaxial angular velocity data over a period of time are obtained, and the maximum of them is compared with a threshold tmotion; when it is greater than the threshold, the device is considered to be in motion. The user may also be involved through human-computer interaction, for example by providing a gyroscope calibration button which, when pressed, prompts the user to shake the mobile device, and the device is considered to start moving at that moment.
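A sketch of this motion check, with an assumed threshold value t_motion:

import numpy as np

def device_is_moving(gyro_window, t_motion=0.01):
    # gyro_window: array of shape (num_samples, 3) holding gx, gy, gz columns
    variances = np.var(gyro_window, axis=0)
    return variances.max() > t_motion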
The present embodiment is described and illustrated below by means of preferred embodiments.
Fig. 4 is a flowchart of a delay time determination method according to another embodiment of the present application. As shown in fig. 4, the delay time determining method includes: the method comprises the steps of obtaining a video sequence of the mobile device and a corresponding gyroscope data sequence, judging whether the device starts to move according to the gyroscope data sequence, and if the device starts to move, calculating a first transformation matrix of the video sequence and a second transformation matrix of the gyroscope sequence within preset time. The delay time is then calculated using the first transformation matrix and the second transformation matrix. And correcting the current video sequence time stamp or the current gyroscope time stamp according to the delay time result, and correcting the jittered video. In addition, the delay time is updated regularly, so that the delay time is prevented from changing along with the starting time of the equipment, the temperature of the equipment and other factors to influence the anti-shake effect.
It should be noted that the steps illustrated in the above-described flow diagrams or in the flow diagrams of the figures may be performed in a computer system, such as a set of computer-executable instructions, and that, although a logical order is illustrated in the flow diagrams, in some cases, the steps illustrated or described may be performed in an order different than here.
In this embodiment, a delay time determination apparatus is further provided, and the apparatus is used to implement the foregoing embodiments and preferred embodiments, and details of which have been already described are omitted. The terms "module," "unit," "subunit," and the like as used below may implement a combination of software and/or hardware for a predetermined function. Although the means described in the embodiments below are preferably implemented in software, an implementation in hardware, or a combination of software and hardware is also possible and contemplated.
Fig. 5 is a block diagram of a delay time determination apparatus according to an embodiment of the present application, and as shown in fig. 5, the apparatus includes:
an obtaining module 10, configured to obtain a first transformation sequence and a second transformation sequence, where the first transformation sequence is used to represent a change condition of an adjacent video frame in a preset period, and the second transformation sequence is used to represent a change condition of an adjacent gyroscope frame in the preset period;
a first processing module 20, configured to perform normalized cross-correlation operation on the first transform sequence and the second transform sequence to obtain a cross-correlation sequence;
a second processing module 30, configured to determine, according to the cross-correlation sequence, the number of interval frames between a target video frame and a target gyroscope frame, where timestamps of the target video frame and the target gyroscope frame are the same;
and the determining module 40 is configured to determine the delay time according to the number of interval frames.
The obtaining module 10 is further configured to obtain original video data in the preset period; performing feature point matching according to feature points of adjacent frames of the original video data to obtain a first transformation matrix; acquiring original gyroscope data in the preset period; converting the original gyroscope data under a camera coordinate system into gyroscope data under an image coordinate system to obtain a second transformation matrix, wherein the image coordinate system is established based on a video frame image shot by the camera; and performing projection transformation on a preset point set by using the first transformation matrix and the second transformation matrix, and determining the first transformation sequence and a corresponding second transformation sequence.
The obtaining module 10 is further configured to perform projection transformation on a preset image point set by using the first transformation matrix and the second transformation matrix to obtain a video transformation point sequence set and a gyroscope transformation point sequence set; calculate discrete values of each group of the gyroscope transformation point sequences in the gyroscope transformation point sequence set, and take the gyroscope transformation point sequence with the maximum discrete value as the second transformation sequence; and take the video transformation point sequence corresponding to the second transformation sequence as the first transformation sequence.
The first processing module 20 is further configured to detect whether the number of video frames of the first transform sequence is the same as the number of gyroscope frames of the second transform sequence, and perform an interpolation operation on the first transform sequence if the number of video frames is smaller than the number of gyroscope frames, so that the number of video frames of the first transform sequence after interpolation is the same as the number of gyroscope frames of the second transform sequence.
The first processing module 20 is further configured to filter the first transformation sequence and the second transformation sequence to obtain second video data and second gyroscope data; subtract the second video data from the first transformation sequence to obtain third video data; subtract the second gyroscope data from the second transformation sequence to obtain third gyroscope data; and perform a normalized cross-correlation operation on the third video data and the third gyroscope data to determine the delay time.
The determining module 40 is further configured to correct the video sequence timestamp and/or the gyroscope sequence timestamp based on the delay time to obtain corrected data; and correcting the current video based on the correction data to obtain a target video.
The determining module 40 is further configured to obtain original gyroscope data; and calculating the variance of the original gyroscope data of multiple frames, if the variance is greater than a preset threshold value, determining that the camera shakes, and acquiring a first transformation sequence and a corresponding second transformation sequence.
The above modules may be functional modules or program modules, and may be implemented by software or hardware. For a module implemented by hardware, the modules may be located in the same processor; or the modules can be respectively positioned in different processors in any combination.
There is also provided in this embodiment an electronic device comprising a memory having a computer program stored therein and a processor arranged to run the computer program to perform the steps of any of the above method embodiments.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor, and the input/output device is connected to the processor.
Optionally, in this embodiment, the processor may be configured to execute the following steps by a computer program:
and S1, acquiring a first transformation sequence and a second transformation sequence, wherein the first transformation sequence is used for representing the change situation of adjacent video frames in a preset period, and the second transformation sequence is used for representing the change situation of adjacent gyroscope frames in the preset period.
S2, carrying out normalized cross-correlation operation on the first transformation sequence and the second transformation sequence to obtain a cross-correlation sequence.
And S3, determining the number of interval frames of a target video frame and a target gyroscope frame according to the cross-correlation sequence, wherein the time stamps of the target video frame and the target gyroscope frame are the same.
And S4, determining the delay time according to the interval frame number.
It should be noted that, for specific examples in this embodiment, reference may be made to the examples described in the foregoing embodiments and optional implementations, and details are not described again in this embodiment.
In addition, in combination with the delay time determination method provided in the foregoing embodiment, a storage medium may also be provided to implement in this embodiment. The storage medium having stored thereon a computer program; the computer program, when executed by a processor, implements any of the delay time determination methods in the above embodiments.
It should be understood that the specific embodiments described herein are merely illustrative of this application and are not intended to be limiting. All other embodiments, which can be derived by a person skilled in the art from the examples provided herein without any inventive step, shall fall within the scope of protection of the present application.
It is obvious that the drawings are only examples or embodiments of the present application, and it is obvious to those skilled in the art that the present application can be applied to other similar cases according to the drawings without creative efforts. Moreover, it should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another.
The term "embodiment" is used herein to mean that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the present application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is to be expressly or implicitly understood by one of ordinary skill in the art that the embodiments described in this application may be combined with other embodiments without conflict.
The above-mentioned embodiments only express several embodiments of the present application, and the description thereof is more specific and detailed, but not construed as limiting the scope of the patent protection. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which falls within the scope of protection of the present application. Therefore, the protection scope of the present application shall be subject to the appended claims.

Claims (9)

1. A method for determining a delay time, comprising:
acquiring a first transformation sequence and a second transformation sequence, wherein the first transformation sequence is used for representing the change condition of adjacent video frames in a preset period, and the second transformation sequence is used for representing the change condition of adjacent gyroscope frames in the preset period;
carrying out normalized cross-correlation operation on the first transformation sequence and the second transformation sequence to obtain a cross-correlation sequence;
determining the number of interval frames of a target video frame and a target gyroscope frame according to the cross-correlation sequence, wherein the time stamps of the target video frame and the target gyroscope frame are the same;
determining delay time according to the interval frame number;
the obtaining the first transform sequence and the second transform sequence comprises: acquiring original video data in the preset period; performing feature point matching according to feature points of adjacent frames of the original video data to obtain a first transformation matrix; acquiring original gyroscope data in the preset period; converting the original gyroscope data under a camera coordinate system into gyroscope data under an image coordinate system to obtain a second transformation matrix, wherein the image coordinate system is established based on a video frame image shot by the camera; and performing projection transformation on a preset point set by using the first transformation matrix and the second transformation matrix, and determining the first transformation sequence and a corresponding second transformation sequence.
2. The method of claim 1, wherein performing projection transformation on the preset point set by using the first transformation matrix and the second transformation matrix, and determining the first transformation sequence and the corresponding second transformation sequence, comprises:
performing projection transformation on a preset image point set by using the first transformation matrix and the second transformation matrix to obtain a video transformation point sequence set and a gyroscope transformation point sequence set;
calculating discrete values of each group of the gyroscope transformation point sequences in the gyroscope transformation point sequence set, and taking the gyroscope transformation point sequence with the maximum discrete value as the second transformation sequence;
and taking the video transformation point sequence corresponding to the second transformation sequence as the first transformation sequence.
3. The method of claim 1, wherein before performing the normalized cross-correlation operation on the first transformation sequence and the second transformation sequence to obtain the cross-correlation sequence, the method further comprises:
and detecting whether the number of the video frames of the first transformation sequence is the same as the number of the gyroscope frames of the second transformation sequence, and if the number of the video frames is less than the number of the gyroscope frames, performing interpolation operation on the first transformation sequence to ensure that the number of the video frames of the first transformation sequence after interpolation is the same as the number of the gyroscope frames of the second transformation sequence.
4. The method of claim 1, wherein obtaining the first transformed sequence and the second transformed sequence within the predetermined period comprises:
filtering the first transformation sequence and the second transformation sequence to obtain second video data and second gyroscope data;
subtracting the second video data from the first transformation sequence to obtain third video data;
subtracting the second gyroscope data from the second transformation sequence to obtain third gyroscope data;
and performing normalized cross-correlation operation on the third video data and the third gyroscope data to determine delay time.
5. The delay time determination method of claim 1, wherein determining the delay time according to the number of interval frames comprises:
correcting the video sequence timestamp and/or the gyroscope sequence timestamp based on the delay time to obtain correction data;
and correcting the current video based on the correction data to obtain a target video.
6. The method of claim 1, wherein the obtaining the first transform sequence and the second transform sequence within the preset period comprises:
acquiring original gyroscope data;
and calculating the variance of the original gyroscope data of multiple frames, if the variance is greater than a preset threshold value, determining that the camera shakes, and acquiring a first transformation sequence and a corresponding second transformation sequence.
7. A delay time determination apparatus, comprising:
the device comprises an acquisition module, a processing module and a display module, wherein the acquisition module is used for acquiring a first transformation sequence and a second transformation sequence, the first transformation sequence is used for representing the change condition of adjacent video frames in a preset period, and the second transformation sequence is used for representing the change condition of adjacent gyroscope frames in the preset period;
the first processing module is used for carrying out normalized cross-correlation operation on the first transformation sequence and the second transformation sequence to obtain a cross-correlation sequence;
the second processing module is used for determining the number of interval frames of a target video frame and a target gyroscope frame according to the cross-correlation sequence, and the time stamps of the target video frame and the target gyroscope frame are the same;
the determining module is used for determining delay time according to the interval frame number;
the acquisition module is further configured to acquire original video data in the preset period; performing feature point matching according to feature points of adjacent frames of the original video data to obtain a first transformation matrix; acquiring original gyroscope data in the preset period; converting the original gyroscope data under a camera coordinate system into gyroscope data under an image coordinate system to obtain a second transformation matrix, wherein the image coordinate system is established based on a video frame image shot by the camera; and performing projection transformation on a preset point set by using the first transformation matrix and the second transformation matrix, and determining the first transformation sequence and a corresponding second transformation sequence.
8. An image acquisition device comprising a camera, a memory and a processor, characterized in that the memory has stored therein a computer program, the processor being arranged to run the computer program to perform the delay time determination method of any of claims 1 to 6.
9. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the delay time determination method according to any one of claims 1 to 6.
CN202210014365.4A 2022-01-07 2022-01-07 Delay time determination method and device, image acquisition equipment and storage medium Active CN114025158B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210014365.4A CN114025158B (en) 2022-01-07 2022-01-07 Delay time determination method and device, image acquisition equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210014365.4A CN114025158B (en) 2022-01-07 2022-01-07 Delay time determination method and device, image acquisition equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114025158A CN114025158A (en) 2022-02-08
CN114025158B true CN114025158B (en) 2022-04-08

Family

ID=80069717

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210014365.4A Active CN114025158B (en) 2022-01-07 2022-01-07 Delay time determination method and device, image acquisition equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114025158B (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9116001B2 (en) * 2012-06-14 2015-08-25 Qualcomm Incorporated Adaptive estimation of frame time stamp latency
CN105794192B (en) * 2013-11-06 2019-03-08 统雷有限公司 The corrected method of image for being arrived to asynchronous triggering collection
AU2017202584A1 (en) * 2017-04-19 2018-11-08 Canon Kabushiki Kaisha Method, system and apparatus for detecting a change in angular position of a camera
CN110740247B (en) * 2018-07-18 2021-10-08 腾讯科技(深圳)有限公司 Video stability augmentation method and device, computer equipment and storage medium
CN110880189B (en) * 2018-09-06 2022-09-09 舜宇光学(浙江)研究院有限公司 Combined calibration method and combined calibration device thereof and electronic equipment

Also Published As

Publication number Publication date
CN114025158A (en) 2022-02-08

Similar Documents

Publication Publication Date Title
EP2442562B1 (en) Method and apparatus for image orientation indication and correction
WO2021233032A1 (en) Video processing method, video processing apparatus, and electronic device
CN110800282B (en) Holder adjusting method, holder adjusting device, mobile platform and medium
US20130300933A1 (en) Method of visually synchronizing differing camera feeds with common subject
US7830565B2 (en) Image capture device with rolling band shutter
US20070153090A1 (en) System for operating a plurality of mobile image capturing devices
US10911675B2 (en) Method for providing shake correction, signal processing device performing the method, and imaging device including the signal processing device
US11019262B2 (en) Omnidirectional moving image processing apparatus, system, method, and recording medium
WO2022089341A1 (en) Image processing method and related apparatus
CN114025158B (en) Delay time determination method and device, image acquisition equipment and storage medium
CN111047622A (en) Method and device for matching objects in video, storage medium and electronic device
CN113364978B (en) Image processing method and device, electronic equipment and readable storage medium
EP3644600B1 (en) Imaging device, information processing method, system, and carrier means
CN110769146B (en) Shooting method and electronic equipment
CN115546043B (en) Video processing method and related equipment thereof
WO2021138768A1 (en) Method and device for image processing, movable platform, imaging apparatus and storage medium
CN113766124A (en) Photographing processing method, device and equipment and storage medium
US20130235229A1 (en) Imaging apparatus capable of specifying shooting posture, method for specifying shooting posture, and storage medium storing program
US20130044994A1 (en) Method and Arrangement for Transferring Multimedia Data
KR102367165B1 (en) The syncronizing method for the filming time and the apparatus
EP4280154A1 (en) Image blurriness determination method and device related thereto
JP2003283819A (en) Image correction method and apparatus, and program
JP3944115B2 (en) Signal processing apparatus and signal processing method
WO2019183769A1 (en) Remaining storage capacity processing method, photographic device, and computer readable storage medium
CN116980762A (en) Control method and device of virtual camera and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant