CN110381264B - Method and apparatus for generating time information - Google Patents

Method and apparatus for generating time information

Info

Publication number
CN110381264B
Authority
CN
China
Prior art keywords
relative position
sequence
frame
position transformation
vector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810331501.6A
Other languages
Chinese (zh)
Other versions
CN110381264A (en)
Inventor
吕晓磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jingdong Century Trading Co Ltd
Beijing Jingdong Shangke Information Technology Co Ltd
Original Assignee
Beijing Jingdong Century Trading Co Ltd
Beijing Jingdong Shangke Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jingdong Century Trading Co Ltd and Beijing Jingdong Shangke Information Technology Co Ltd
Priority to CN201810331501.6A
Publication of CN110381264A
Application granted
Publication of CN110381264B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 - Details of television systems
    • H04N5/222 - Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 - Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Image Analysis (AREA)

Abstract

Embodiments of the present application disclose a method and apparatus for generating time information. One embodiment of the method comprises: acquiring the respective frame sequences of two videos of a target object captured during the same time period; for each frame sequence, sequentially selecting frame groups from the frame sequence to generate a frame group sequence; for each frame group sequence, determining the relative position transformation matrix corresponding to each frame group in the frame group sequence to generate the relative position transformation matrix sequence corresponding to the frame sequence; for each frame sequence, processing each relative position transformation matrix in the corresponding relative position transformation matrix sequence to obtain the corresponding relative position transformation vector, thereby generating the relative position transformation vector sequence corresponding to the frame sequence; and generating a time difference value between the two frame sequences based on the relative position transformation vector sequences respectively corresponding to the two frame sequences. This embodiment enables the time difference between two frame sequences to be determined.

Description

Method and apparatus for generating time information
Technical Field
The embodiment of the application relates to the technical field of computers, in particular to a method and a device for generating time information.
Background
Currently, when shooting simultaneously with multiple shooting devices such as cameras or video cameras, it is necessary to synchronize the devices in both time and space. In the prior art, time synchronization among multiple shooting devices is generally achieved with synchronizer hardware driven by a synchronization signal, while spatial synchronization is typically achieved with a static calibration board (e.g., a checkerboard).
Disclosure of Invention
The embodiment of the application provides a method and a device for generating time information.
In a first aspect, an embodiment of the present application provides a method for generating time information, where the method includes: acquiring the respective frame sequences of two videos of a target object captured during the same time period; for each frame sequence, sequentially selecting frame groups from the frame sequence to generate a frame group sequence, where each frame group comprises two consecutive frames in the frame sequence; for each frame group sequence, determining the relative position transformation matrix corresponding to each frame group in the frame group sequence to generate the relative position transformation matrix sequence corresponding to the frame sequence, where the relative position transformation matrix represents the transformation relationship between the positions of the target object corresponding to the two frames in the frame group; for each frame sequence, processing each relative position transformation matrix in the corresponding relative position transformation matrix sequence to obtain the corresponding relative position transformation vector, thereby generating the relative position transformation vector sequence corresponding to the frame sequence, where the relative position transformation vector is a vector representation of the relative position transformation matrix; and generating a time difference value between the two frame sequences based on the relative position transformation vector sequences respectively corresponding to the two frame sequences.
In some embodiments, determining a relative position transform matrix corresponding to a frame group in the sequence of frame groups comprises: for each frame in each frame group in the frame group sequence, determining a position transformation matrix between the position of a target object corresponding to the frame and the position of corresponding shooting equipment, wherein the position transformation matrix is used for representing the transformation relation between the position of the target object corresponding to the frame and the position of the shooting equipment; and for each frame group in the frame group sequence, multiplying the position transformation matrix corresponding to the next frame in the frame group by the inverse matrix of the position transformation matrix corresponding to the previous frame to obtain the relative position transformation matrix corresponding to the frame group.
In some embodiments, generating a time difference value between two frame sequences based on their corresponding relative position transformation vector sequences comprises: taking the number of relative position transformation vectors contained in the shorter of the two relative position transformation vector sequences as a reference number; sequentially selecting synchronization units from the longer relative position transformation vector sequence to generate a synchronization unit sequence, where each synchronization unit comprises a reference number of consecutive relative position transformation vectors from that sequence; and generating the time difference between the two frame sequences based on the synchronization unit sequence.
In some embodiments, generating a time difference between two sequences of frames based on a sequence of synchronization units comprises: determining an average distance corresponding to each synchronization unit in the synchronization unit sequence; the time difference between two frame sequences is generated based on the synchronization unit with the smallest corresponding average distance.
In some embodiments, determining the average distance corresponding to each synchronization unit in the synchronization unit sequence comprises: for each synchronization unit in the synchronization unit sequence, performing the following steps: pairing each relative position transformation vector in the synchronization unit with the relative position transformation vector at the same position in the shorter relative position transformation vector sequence to form a relative position transformation vector group, thereby generating a relative position transformation vector group sequence; determining the distance corresponding to each relative position transformation vector group in the sequence to generate a corresponding distance sequence; and generating the average distance corresponding to the synchronization unit based on the distance sequence.
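The sliding-window matching described in these embodiments can be sketched as follows. This is an illustrative sketch only, not the patent's implementation: it assumes each relative position transformation vector is a NumPy array, uses Euclidean distance (one of several options the description later mentions), and all names are invented for illustration.

```python
import numpy as np

def time_offset(short_seq, long_seq):
    """Slide the shorter relative-position-transform vector sequence along
    the longer one; each window of the longer sequence is a synchronization
    unit, and the unit with the smallest average pairwise distance gives
    the estimated offset (in frames) between the two sequences."""
    n = len(short_seq)                       # the reference number
    best_offset, best_dist = 0, float("inf")
    for offset in range(len(long_seq) - n + 1):
        unit = long_seq[offset:offset + n]   # one synchronization unit
        # pair vectors at the same position and average their distances
        dists = [np.linalg.norm(a - b) for a, b in zip(unit, short_seq)]
        avg = sum(dists) / n
        if avg < best_dist:
            best_offset, best_dist = offset, avg
    return best_offset

# toy check: the short sequence matches the long one starting at index 2
long_seq = [np.array([float(i), 0.0]) for i in range(10)]
short_seq = long_seq[2:7]
print(time_offset(short_seq, long_seq))  # 2
```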
In a second aspect, an embodiment of the present application provides an apparatus for generating time information, where the apparatus includes: an acquisition unit configured to acquire respective frame sequences of two captured videos of a target object captured within the same period of time; the frame group sequence generating unit is configured to sequentially select a frame group from each frame sequence and generate a frame group sequence, wherein the frame group comprises two connected frames in the frame sequence; a relative position transformation matrix sequence generating unit configured to determine, for each frame group sequence, a relative position transformation matrix corresponding to a frame group in the frame group sequence, and generate a relative position transformation matrix sequence corresponding to the frame sequence, where the relative position transformation matrix is used to represent a transformation relationship between positions of target objects respectively corresponding to two frames in the frame group; a relative position transformation vector sequence generation unit, configured to perform data processing on a relative position transformation matrix in a relative position transformation matrix sequence corresponding to each frame sequence to obtain a relative position transformation vector corresponding to the relative position transformation matrix for each frame sequence, so as to generate a relative position transformation vector sequence corresponding to the frame sequence, where the relative position transformation vector is a vector representation of the relative position transformation matrix; and a time difference value generating unit configured to generate a time difference value between the two frame sequences based on the relative position transformation vector sequences respectively corresponding to the two frame sequences.
In some embodiments, the relative position transform matrix sequence generation unit is further configured to: for each frame in each frame group in the frame group sequence, determining a position transformation matrix between the position of a target object corresponding to the frame and the position of corresponding shooting equipment, wherein the position transformation matrix is used for representing the transformation relation between the position of the target object corresponding to the frame and the position of the shooting equipment; and for each frame group in the frame group sequence, multiplying the position transformation matrix corresponding to the next frame in the frame group by the inverse matrix of the position transformation matrix corresponding to the previous frame to obtain the relative position transformation matrix corresponding to the frame group.
In some embodiments, the time difference value generation unit is further configured to: take the number of relative position transformation vectors contained in the shorter of the two relative position transformation vector sequences as a reference number; sequentially select synchronization units from the longer relative position transformation vector sequence to generate a synchronization unit sequence, where each synchronization unit comprises a reference number of consecutive relative position transformation vectors from that sequence; and generate the time difference between the two frame sequences based on the synchronization unit sequence.
In some embodiments, the time difference value generation unit is further configured to: determining an average distance corresponding to each synchronization unit in the synchronization unit sequence; the time difference between two frame sequences is generated based on the synchronization unit with the smallest corresponding average distance.
In some embodiments, the time difference value generation unit is further configured to: for each synchronization unit in the synchronization unit sequence, perform the following steps: pair each relative position transformation vector in the synchronization unit with the relative position transformation vector at the same position in the shorter relative position transformation vector sequence to form a relative position transformation vector group, thereby generating a relative position transformation vector group sequence; determine the distance corresponding to each relative position transformation vector group in the sequence to generate a corresponding distance sequence; and generate the average distance corresponding to the synchronization unit based on the distance sequence.
In a third aspect, an embodiment of the present application provides an electronic device, including: one or more processors; and a storage device having one or more programs stored thereon, which, when executed by the one or more processors, cause the one or more processors to implement the method described in any implementation of the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer-readable medium on which a computer program is stored, which, when executed by a processor, implements the method described in any implementation of the first aspect.
According to the method and apparatus for generating time information provided by the embodiments of the application, two frame group sequences are obtained by selecting pairs of consecutive frames from the respective frame sequences of two captured videos. A relative position transformation matrix sequence is then generated for each frame group sequence and processed to obtain the corresponding relative position transformation vector sequences. Finally, the time difference value between the two frame sequences is obtained from the two relative position transformation vector sequences, thereby providing a way to determine the time difference between two frame sequences.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the detailed description of non-limiting embodiments made with reference to the following drawings:
FIG. 1 is an exemplary system architecture diagram in which the present application may be applied;
FIG. 2 is a flow diagram of one embodiment of a method for generating time information of the present application;
FIG. 3 is a schematic diagram of an application scenario of a method for generating time information according to the present application;
FIG. 4 is a flow diagram of yet another embodiment of a method for generating time information according to the present application;
FIG. 5 is a schematic structural diagram of yet another embodiment of a method for generating time information according to the present application;
FIG. 6 is a block diagram of one embodiment of an apparatus for generating time information according to the present application;
FIG. 7 is a block diagram of a computer system suitable for use in implementing a server according to embodiments of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 shows an exemplary architecture 100 to which the method for generating time information or the apparatus for generating time information of the present application may be applied.
As shown in fig. 1, the system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others.
The terminal devices 101, 102, 103 interact with a server 105 via a network 104 to receive or send messages or the like. Various client applications, such as a video processing type application, a camera type application, a search type application, a browser type application, and the like, may be installed on the terminal apparatuses 101, 102, 103. The terminal apparatuses 101, 102, 103 can be connected to a device having an image pickup function (e.g., a video camera, a mobile phone, a video recorder, a tablet computer, etc.) in communication, and can store a captured video captured by one or more devices having an image pickup function connected thereto. In addition, the terminal device itself may have a camera for shooting the target object and storing the shot video.
The terminal devices 101, 102, 103 may be various electronic devices that support video processing including, but not limited to, cameras, smart phones, tablets, laptop and desktop computers, and the like.
The server 105 may be a server that provides various services, such as a video processing server that processes video uploaded by the terminal apparatuses 101, 102, 103. The video processing server may perform processing such as extraction on the received video, and feed back a processing result (e.g., a sequence of frames of the video) to the terminal device.
Note that the above-described videos or video frame sequences may instead be stored locally on the server 105, in which case the server 105 may directly extract and process them, and the terminal apparatuses 101, 102, and 103 and the network 104 may be absent.
It should be noted that the method for generating time information provided in the embodiment of the present application is generally performed by the server 105, and accordingly, the apparatus for generating time information is generally disposed in the server 105.
It should be further noted that the terminal devices 101, 102, and 103 may also have a video processing application installed therein, and the terminal devices 101, 102, and 103 may also process the video or the sequence of frames of the video based on the video processing application. At this time, the method for generating time information may be executed by the terminal apparatuses 101, 102, 103, and accordingly, the apparatus for generating time information may be provided in the terminal apparatuses 101, 102, 103. At this point, the exemplary system architecture 100 may not have the server 105 and the network 104.
It should be noted that the server 105 may be a single server, or may be composed of a plurality of servers or a plurality of server clusters.
The server may be hardware or software. When the server is hardware, it may be implemented as a distributed server cluster composed of multiple servers, or may be implemented as a single server. When the server is software, it may be implemented as a plurality of software or software modules (for example, to provide distributed services), or as a single software or software module. And is not particularly limited herein.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for an implementation.
With continued reference to FIG. 2, a flow 200 of one embodiment of a method for generating time information in accordance with the present application is shown. The method comprises the following steps:
step 201, acquiring respective frame sequences of two captured videos of a target object captured in the same time period.
In the present embodiment, the execution subject of the method for generating time information (such as the server 105 shown in fig. 1) may first acquire, from a terminal device through a wired or wireless connection, the respective frame sequences of two videos of a target object captured during the same time period. The target object may include a person, an animal, or a plant, may also include a commodity (such as furniture or a car), a mountain, a river, etc., and may also include a calibration board (such as a checkerboard). It should be noted that these target objects are merely illustrative; the present application is not limited thereto, and any entity appearing in a picture falls within the scope of the present application.
Generally, a frame is the smallest unit of a video or animation: a single picture or image. A frame of a captured video therefore corresponds to one shot, or one captured image, in the video, and the frame sequence is the sequence of frames of the captured video. The number of frames is determined by the frame rate of the captured video; frame rate generally refers to the number of frames displayed per second. For example, assuming the frame rate of a captured video is 20 FPS (Frames Per Second) and its length is 2 minutes, the captured video has 2 × 60 × 20 = 2400 frames in total; that is, its frame sequence consists of 2400 frames.
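The frame-count arithmetic in the example above can be checked directly (the values are the ones used in the text):

```python
frame_rate = 20          # frames per second (FPS)
duration_s = 2 * 60      # a 2-minute captured video, in seconds
total_frames = frame_rate * duration_s
print(total_frames)      # 2400
```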
In practice, assuming the target object to be photographed is a checkerboard, the checkerboard can be held by hand or carried by a mobile device so that it moves through space, and videos of the moving checkerboard are recorded by two shooting devices simultaneously. The captured videos can then be processed by existing image or video processing software to obtain their frames, and thus the frame sequence of each video. The frame rates of the two shooting devices may be the same. It should be noted that the frames in a frame sequence may be stored in temporal order; for example, the first captured frame is the first frame in the frame sequence, the second captured frame is the second frame, and so on.
Step 202, for each frame sequence, sequentially selecting frame groups from the frame sequence, and generating a frame group sequence.
In this embodiment, for each frame sequence, the execution subject may sequentially select two consecutive frames from the frame sequence as a frame group, so as to obtain a frame group sequence corresponding to the frame sequence. Wherein each frame group may refer to any two consecutive frames in the sequence of frames. For example, assuming that the frame sequence consists of 3000 frames, first, the first frame and the second frame are selected as a first frame group, then the second frame and the third frame are selected as a second frame group, then the third frame and the fourth frame are selected as a third frame group, and so on, and finally, the 2999 th frame and the 3000 th frame are selected as a 2999 th frame group, so as to obtain a frame group sequence consisting of 2999 frame groups. It should be noted that each frame group may be stored in the frame group sequence according to a corresponding relationship with the frame, for example, a first frame group composed of the first frame and the second frame may be used as a first frame group in the frame group sequence, a second frame group composed of the second frame and the third frame may be used as a second frame group in the frame group sequence, and so on.
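The grouping step above amounts to pairing each frame with its successor, so N frames yield N - 1 frame groups. A minimal sketch (the function name and the use of Python tuples are illustrative, not from the patent):

```python
def frame_groups(frames):
    """Pair each frame with the next one: frames[i] and frames[i + 1]
    form the (i + 1)-th frame group, preserving temporal order."""
    return [(frames[i], frames[i + 1]) for i in range(len(frames) - 1)]

frames = ["f1", "f2", "f3", "f4"]
print(frame_groups(frames))
# [('f1', 'f2'), ('f2', 'f3'), ('f3', 'f4')]
```

With a 3000-frame sequence, as in the example above, this yields 2999 frame groups.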
Step 203, for each frame group sequence, determining a relative position transformation matrix corresponding to the frame group in the frame group sequence, and generating a relative position transformation matrix sequence corresponding to the frame sequence.
In this embodiment, the executing body may determine the relative position transformation matrix corresponding to each frame group based on the frame group sequence obtained in step 202, so as to generate the relative position transformation matrix sequence corresponding to the frame group sequence. The relative position transformation matrix represents the transformation relationship between the positions of the target object corresponding to the two frames of each frame group in the frame group sequence. For example, the first frame group in the frame group sequence includes the first frame and the second frame, which are respectively the first and second pictures captured of the target object. Each shot has a corresponding shooting time, and at each shooting time the target object has a position. For example, assume that at the shooting time corresponding to the first frame the target object is at position A, and at the shooting time corresponding to the second frame it is at position B; the relative position transformation matrix then represents the transformation relationship between position A and position B of the target object.
In the present embodiment, the execution subject described above may determine the relative position transformation matrix corresponding to each frame group based on the existing affine transformation, and may also determine the relative position transformation matrix by measuring or calculating the positions of the target object at different shooting times. For example, for target objects corresponding to a first frame and a second frame included in the frame group, a reference coordinate system may be constructed for a certain frame (e.g., the first frame) (e.g., a three-dimensional coordinate system may be constructed by specifying a center of mass of the target object at a shooting time corresponding to the first frame as an origin), and then position coordinates of the target object at a shooting time corresponding to another frame (e.g., the second frame) in the reference coordinate system may be determined, and the position coordinates of the target object may be represented in a matrix form for subsequent calculation. Then, the measured position coordinates are used as a relative position transformation matrix between the two frames. Thus, a relative position conversion matrix corresponding to one frame group can be obtained. Then, the relative position transformation matrix corresponding to other frame groups is calculated in the same way, so as to obtain the relative position transformation matrix sequence.
In practice, for the two frames in a frame group, one or both frames may fail to capture the target object; in that case, a zero matrix may be used as the relative position transformation matrix corresponding to the frame group. It should be noted that the relative position transformation matrices are stored according to their correspondence with the frame groups; for example, the relative position transformation matrix corresponding to the first frame group is the first matrix of the relative position transformation matrix sequence, the matrix corresponding to the second frame group is the second matrix, and so on.
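The optional implementation described in the summary computes each frame group's relative transform by multiplying the next frame's position transformation matrix by the inverse of the previous frame's. A minimal sketch of that product, assuming (for illustration only) 4x4 homogeneous object-to-camera matrices:

```python
import numpy as np

def relative_transform(t_prev, t_next):
    """Relative position transformation matrix for a frame group:
    the next frame's position transform times the inverse of the
    previous frame's, as in the optional implementation above."""
    return t_next @ np.linalg.inv(t_prev)

# toy example: pure translations along x (hypothetical values)
t_prev = np.eye(4); t_prev[0, 3] = 1.0   # object at x = 1 in the previous frame
t_next = np.eye(4); t_next[0, 3] = 3.0   # object at x = 3 in the next frame
rel = relative_transform(t_prev, t_next)
print(rel[0, 3])   # 2.0: the object moved 2 units between the two frames
```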
Step 204, for each frame sequence, performing data processing on the relative position transformation matrix in the relative position transformation matrix sequence corresponding to the frame sequence to obtain a relative position transformation vector corresponding to the relative position transformation matrix, so as to generate a relative position transformation vector sequence corresponding to the frame sequence.
In this embodiment, for each frame sequence, the executing entity may perform data processing on a relative position transformation matrix in a relative position transformation matrix sequence corresponding to the frame sequence to obtain a relative position transformation vector corresponding to the relative position transformation matrix, so as to obtain a relative position transformation vector sequence. Wherein the relative position transformation vector is a vector representation of the relative position transformation matrix. In practice, different data processing methods can be selected to obtain the relative position transformation vector sequence according to specific application requirements.
For example, the relative position transformation matrix may be processed with existing mathematical software; for instance, a reshape function may be applied to the matrix to obtain the corresponding relative position transformation vector. The reshape function is a function that adjusts the number of rows, columns, and dimensions of a matrix.
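As an illustration of such a reshape operation, NumPy (one example of mathematical software; the patent does not name a specific package) can flatten a matrix into a vector:

```python
import numpy as np

m = np.array([[1.0, 2.0],
              [3.0, 4.0]])   # a toy 2x2 relative position transformation matrix
v = m.reshape(-1)            # flatten row by row into a 4-dimensional vector
print(v)                     # [1. 2. 3. 4.]
```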
For example, if the obtained relative position transformation matrix is a rotation matrix, the vector representation in the Lie algebra corresponding to any rotation matrix can be obtained by using the existing mapping from the Lie group to the Lie algebra; that is, the relative position transformation vector of the relative position transformation matrix can be obtained. In practice, for a zero matrix in the relative position transformation matrix sequence, a zero vector can be adopted as its corresponding relative position transformation vector.
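A minimal sketch of that mapping for rotation matrices (the log map from SO(3) to so(3), giving the rotation vector axis x angle). The closed form below is a simplification that assumes the rotation angle lies strictly between 0 and pi, so the skew-symmetric part is nonzero:

```python
import numpy as np

def rotation_to_vector(r):
    """Map a rotation matrix in SO(3) to its rotation vector in the
    Lie algebra so(3): unit axis scaled by the rotation angle."""
    angle = np.arccos(np.clip((np.trace(r) - 1.0) / 2.0, -1.0, 1.0))
    # the axis comes from the skew-symmetric part of r
    axis = np.array([r[2, 1] - r[1, 2],
                     r[0, 2] - r[2, 0],
                     r[1, 0] - r[0, 1]]) / (2.0 * np.sin(angle))
    return angle * axis

# a 90-degree rotation about the z axis
rz = np.array([[0.0, -1.0, 0.0],
               [1.0,  0.0, 0.0],
               [0.0,  0.0, 1.0]])
print(rotation_to_vector(rz))   # approximately [0, 0, pi/2]
```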
For example, the relative position transformation vector corresponding to the first relative position transformation matrix may be a first relative position transformation vector of the relative position transformation vector sequence, the relative position transformation vector corresponding to the second relative position transformation matrix may be a second relative position transformation vector of the relative position transformation vector sequence, and so on.
Step 205, a time difference value between the two frame sequences is generated based on the relative position transformation vector sequences corresponding to the two frame sequences respectively.
In this embodiment, the executing entity may determine a time difference between two frame sequences respectively corresponding to two relative position transformation vector sequences based on the relative position transformation vector sequence obtained in step 204. The time difference value may represent a frame difference between two frame sequences, that is, a frame difference between the shot videos corresponding to the two frame sequences.
In practice, assume that the two frame sequences are referred to as a first frame sequence and a second frame sequence, and that the relative position transformation vector sequences corresponding to them are referred to as a first relative position transformation vector sequence and a second relative position transformation vector sequence. Then the distance between the first relative position transformation vector in the first relative position transformation vector sequence and each relative position transformation vector in the second relative position transformation vector sequence may first be determined (for example, the existing Euclidean distance, Minkowski distance, Chebyshev distance, etc.; alternatively, the cosine of the angle between two relative position transformation vectors may be calculated as their distance).
Then, the relative position transformation vector having the smallest distance from the first relative position transformation vector of the first relative position transformation vector sequence is selected from the second relative position transformation vector sequence. Assuming that the Mth relative position transformation vector in the second relative position transformation vector sequence is selected, |M-1| may be recorded as the first difference value, and the distance between the selected Mth relative position transformation vector and the first relative position transformation vector in the first relative position transformation vector sequence may be recorded as the first distance.
Similarly, the distances between the second relative position transformation vector in the first relative position transformation vector sequence and each relative position transformation vector in the second relative position transformation vector sequence are determined, and the relative position transformation vector with the smallest such distance is selected from the second relative position transformation vector sequence. Assuming that the Nth relative position transformation vector is selected, |N-2| may be recorded as the second difference value, and the distance between the selected Nth relative position transformation vector and the second relative position transformation vector in the first relative position transformation vector sequence may be recorded as the second distance.
By analogy, a third difference value, a fourth difference value, ..., and a Qth difference value (where Q denotes the number of relative position transformation vectors contained in the first relative position transformation vector sequence and is a natural number), together with the corresponding third distance, fourth distance, ..., and Qth distance, are calculated.
Then, the distances are compared and the one with the smallest value is selected; assuming the smallest is the Kth distance, the corresponding Kth difference value is taken as the time difference value between the first frame sequence and the second frame sequence.
In addition, the sum of the distances between the first relative position transformation vector in the first relative position transformation vector sequence and each relative position transformation vector in the second relative position transformation vector sequence may be recorded as the first total distance, and the sum of the distances between the second relative position transformation vector in the first relative position transformation vector sequence and each relative position transformation vector in the second relative position transformation vector sequence may be recorded as the second total distance; accordingly, a third total distance, a fourth total distance, ..., and a Qth total distance can be determined. Naturally, the first total distance corresponds to the first difference value, the second total distance corresponds to the second difference value, and so on up to the Qth total distance and the Qth difference value. The total distances may then be compared and the one with the smallest value selected. Assuming the smallest is the Lth total distance, the corresponding Lth difference value may be taken as the time difference value between the first frame sequence and the second frame sequence.
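The two matching criteria described above can be sketched in pure Python as follows. Function names are illustrative, and the squared Euclidean distance is an assumption chosen to be consistent with the numbers in the fig. 3 walkthrough below; any of the distances listed above could be substituted.

```python
def sq_dist(a, b):
    """Squared Euclidean distance between two equal-length vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def offset_by_min_distance(seq1, seq2):
    """Kth-difference criterion: for each vector in seq1, record the
    index difference to its nearest neighbor in seq2; return the
    difference whose nearest-neighbor distance is the smallest."""
    best_d, best_diff = None, None
    for i, v1 in enumerate(seq1, start=1):
        dists = [sq_dist(v1, v2) for v2 in seq2]
        d = min(dists)
        m = dists.index(d) + 1  # 1-based position in seq2
        if best_d is None or d < best_d:
            best_d, best_diff = d, abs(m - i)
    return best_diff

def offset_by_total_distance(seq1, seq2):
    """Alternative criterion: pick the difference value corresponding
    to the smallest total distance instead of the smallest distance."""
    best_t, best_diff = None, None
    for i, v1 in enumerate(seq1, start=1):
        dists = [sq_dist(v1, v2) for v2 in seq2]
        total = sum(dists)
        m = dists.index(min(dists)) + 1
        if best_t is None or total < best_t:
            best_t, best_diff = total, abs(m - i)
    return best_diff

# Vectors taken from the fig. 3 walkthrough:
seq1 = [[3, 5, 3, 1], [3, 9, 4, 1]]
seq2 = [[2, 9, 4, 1], [7, 8, 2, 1]]
offset = offset_by_min_distance(seq1, seq2)  # 1
```

On these vectors both criteria agree: the per-vector nearest-neighbor distances are 18 and 1, so the second difference value 1 is selected.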
In this embodiment, after the execution subject determines the time difference between the two frame sequences, the time difference may be used to correct the frame sequences of the two shooting devices, so as to calibrate the two shooting devices, that is, determine external parameters, distortion parameters, and the like of the two shooting devices. The calibration method can adopt some existing open-source calibration methods.
With continued reference to fig. 3, fig. 3 is a schematic diagram of an application scenario of the method for generating time information according to the present embodiment. In the application scenario of fig. 3, first, respective frame sequences, respectively denoted as a first frame sequence 301 and a second frame sequence 302, of two captured videos of a target object captured in the same time period may be obtained. Wherein the first frame sequence comprises three frames including frame 11, frame 12 and frame 13, and the second frame sequence comprises three frames including frame 21, frame 22 and frame 23.
Then, for first frame sequence 301, frame 11 and frame 12 are selected to form frame group 11, and frame 12 and frame 13 are selected to form frame group 12, thereby obtaining first frame group sequence 303. For the second frame sequence 302, a frame group 21 is formed by selecting the frame 21 and the frame 22, and a frame group 22 is formed by selecting the frame 22 and the frame 23, so as to obtain a second frame group sequence 304.
Then, for the first frame group sequence 303, a corresponding first relative position transformation matrix sequence 305 is determined, which includes the first relative position transformation matrix [3 5 3 1] corresponding to the frame group 11 and the first relative position transformation matrix [3 9 4 1] corresponding to the frame group 12. The first relative position transformation matrix [3 5 3 1] represents the position transformation relationship of the target object between the shooting times corresponding to the frame 11 and the frame 12 included in the frame group 11. The first relative position transformation matrix [3 9 4 1] represents the position transformation relationship of the target object between the shooting times corresponding to the frame 12 and the frame 13 included in the frame group 12.
For the second frame group sequence 304, a corresponding second relative position transformation matrix sequence 306 is determined, which includes the second relative position transformation matrix [2 9 4 1] corresponding to the frame group 21 and the second relative position transformation matrix [7 8 2 1] corresponding to the frame group 22. The second relative position transformation matrix [2 9 4 1] represents the position transformation relationship of the target object between the shooting times corresponding to the frame 21 and the frame 22 included in the frame group 21. The second relative position transformation matrix [7 8 2 1] represents the position transformation relationship of the target object between the shooting times corresponding to the frame 22 and the frame 23 included in the frame group 22.
Then, the first relative position transformation matrix sequence 305 is processed to obtain the first relative position transformation vector [3 5 3 1]^T corresponding to the first relative position transformation matrix [3 5 3 1] and the first relative position transformation vector [3 9 4 1]^T corresponding to the first relative position transformation matrix [3 9 4 1]. The first relative position transformation vectors [3 5 3 1]^T and [3 9 4 1]^T constitute a first relative position transformation vector sequence 307.
Then, the second relative position transformation matrix sequence 306 is processed to obtain the second relative position transformation vector [2 9 4 1]^T corresponding to the second relative position transformation matrix [2 9 4 1] and the second relative position transformation vector [7 8 2 1]^T corresponding to the second relative position transformation matrix [7 8 2 1]. The second relative position transformation vectors [2 9 4 1]^T and [7 8 2 1]^T constitute a second relative position transformation vector sequence 308.
Then, the time difference between the first frame sequence 301 and the second frame sequence 302 is determined based on the first sequence of relative position transformation vectors 307 and the second sequence of relative position transformation vectors 308.
Specifically, the distances between the first first relative position transformation vector [3 5 3 1]^T in the first relative position transformation vector sequence 307 and the second relative position transformation vectors [2 9 4 1]^T and [7 8 2 1]^T in the second relative position transformation vector sequence 308 are 18 and 26, respectively (here the squared Euclidean distance). Therefore, the smaller value 18 is selected as the first distance. Since the distance from the first relative position transformation vector [3 5 3 1]^T in sequence 307 to the first second relative position transformation vector [2 9 4 1]^T in sequence 308 is the smaller one, the first difference value is recorded as |1-1| = 0.
The distances between the second first relative position transformation vector [3 9 4 1]^T in the first relative position transformation vector sequence 307 and the second relative position transformation vectors [2 9 4 1]^T and [7 8 2 1]^T in the second relative position transformation vector sequence 308 are 1 and 21, respectively. Therefore, the smaller value 1 is selected as the second distance. Since the distance from the second first relative position transformation vector [3 9 4 1]^T in sequence 307 to the first second relative position transformation vector [2 9 4 1]^T in sequence 308 is the smaller one, the second difference value is recorded as |1-2| = 1.
Since the second distance 1 is smaller than the first distance 18, a second difference value 1 corresponding to the second distance is selected as a time difference value between the first frame sequence 301 and the second frame sequence 302.
The method for generating time information provided by the above embodiments of the present application first selects pairs of consecutive frames from the two frame sequences to obtain two frame group sequences. Then, a relative position transformation matrix sequence corresponding to each frame group sequence is generated, and the corresponding relative position transformation vectors are obtained through processing. Finally, a time difference value between the two frame sequences is obtained based on the relative position transformation vector sequences corresponding to the two frame sequences, thereby providing a method for determining the time difference between two frame sequences.
With further reference to fig. 4, a flow 400 of yet another embodiment of a method for generating time information is shown. The flow 400 of the method for generating time information comprises the following steps:
step 401, acquiring respective frame sequences of two captured videos of a target object captured in the same time period.
Step 402, for each frame sequence, sequentially selecting frame groups from the frame sequence to generate a frame group sequence.
For the specific implementation process of steps 401 and 402, reference may be made to the relevant description of steps 201 and 202 in the corresponding embodiment of fig. 2, and details are not repeated here.
Step 403, for each frame in each frame group in the frame group sequence, determining a position transformation matrix between the position of the target object corresponding to the frame and the position of the corresponding shooting device; and for each frame group in the frame group sequence, multiplying the position transformation matrix corresponding to the next frame in the frame group by the inverse matrix of the position transformation matrix corresponding to the previous frame to obtain the relative position transformation matrix corresponding to the frame group, thereby generating the relative position transformation matrix sequence corresponding to the frame sequence.
In this embodiment, the execution subject may first determine, based on the frame group sequence obtained in step 402, a position transformation matrix between the position of the target object corresponding to each frame in each frame group in the frame group sequence and the position of the corresponding shooting device. The position transformation matrix represents the transformation relationship between the position of the target object corresponding to the frame and the position of the shooting device. Then, the product of the position transformation matrix corresponding to the next frame in a frame group and the inverse matrix of the position transformation matrix corresponding to the previous frame may be used as the relative position transformation matrix corresponding to that frame group, thereby generating the relative position transformation matrix sequence corresponding to the frame sequence.
It should be noted that the two consecutive frames included in each frame group may be stored in the order in which they appear in the frame sequence; for example, in a frame group formed by the first frame and the second frame of the frame sequence, the first frame may be regarded as the previous frame and the second frame as the next frame. For a specific determination process of the position transformation matrix, reference may be made to the related description of the relative position transformation matrix in step 203 in the embodiment corresponding to fig. 2, which is not repeated here.
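The multiplication described in step 403 can be sketched for 4×4 homogeneous rigid transforms as follows. The helper names are illustrative assumptions; the closed-form inverse [R^T, -R^T t; 0 1] is the standard inverse of a rigid transform and avoids a general matrix inversion.

```python
def mat_mul(A, B):
    """Multiply two matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def rigid_inverse(T):
    """Inverse of a 4x4 homogeneous rigid transform [R t; 0 1],
    computed in closed form as [R^T, -R^T t; 0 1]."""
    R = [row[:3] for row in T[:3]]
    t = [row[3] for row in T[:3]]
    Rt = [[R[j][i] for j in range(3)] for i in range(3)]
    mt = [-sum(Rt[i][j] * t[j] for j in range(3)) for i in range(3)]
    return [Rt[0] + [mt[0]],
            Rt[1] + [mt[1]],
            Rt[2] + [mt[2]],
            [0.0, 0.0, 0.0, 1.0]]

def relative_transform(T_prev, T_next):
    """Relative position transformation matrix for a frame group: the
    next frame's position transformation matrix multiplied by the
    inverse of the previous frame's position transformation matrix."""
    return mat_mul(T_next, rigid_inverse(T_prev))
```

When the two frames share the same position transformation matrix, the relative position transformation matrix is the identity, as expected for a stationary target.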
Step 404, for each frame sequence, performing data processing on the relative position transformation matrix in the relative position transformation matrix sequence corresponding to the frame sequence to obtain a relative position transformation vector corresponding to the relative position transformation matrix, so as to generate a relative position transformation vector sequence corresponding to the frame sequence.
The specific execution process of step 404 may refer to the related description of step 204 in the corresponding embodiment of fig. 2, and is not repeated herein.
Step 405, selecting, from the two relative position transformation vector sequences, the sequence containing the smaller number of relative position transformation vectors, and, using the number of relative position transformation vectors contained in the selected sequence as a reference number, sequentially selecting synchronization units from the sequence containing the larger number of relative position transformation vectors, thereby generating a synchronization unit sequence.
In this embodiment, the execution subject may first compare the numbers of relative position transformation vectors contained in the relative position transformation vector sequences respectively corresponding to the two frame sequences, and take the smaller number as the reference number. Then, a reference number of consecutive relative position transformation vectors may be sequentially selected from the sequence containing the larger number of relative position transformation vectors as a synchronization unit, thereby obtaining a synchronization unit sequence. Each synchronization unit is thus a run of the reference number of consecutive relative position transformation vectors in the longer relative position transformation vector sequence.
For example, assume that two relative position transform vector sequences are denoted as a first relative position transform vector sequence and a second relative position transform vector sequence, respectively. Wherein the first relative position transformation vector sequence includes 15 relative position transformation vectors, and the second relative position transformation vector sequence includes 20 relative position transformation vectors, then 15 may be selected as the reference number. Then, the first relative position transform vector to the fifteenth relative position transform vector are sequentially selected from the second relative position transform vector sequence as a first synchronization unit. And selecting the second relative position transformation vector to the sixteenth relative position transformation vector as a second synchronization unit, and repeating the steps in sequence to obtain a synchronization unit sequence.
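The sliding-window selection of synchronization units can be sketched as follows (the function name is illustrative; scalars stand in for the relative position transformation vectors):

```python
def synchronization_units(seq_long, ref_count):
    """Slide a window of ref_count consecutive relative position
    transformation vectors over the longer sequence; each window is
    one synchronization unit."""
    return [seq_long[i:i + ref_count]
            for i in range(len(seq_long) - ref_count + 1)]

# With 20 vectors and a reference number of 15 (as in the example
# above), 6 synchronization units are produced: positions 1-15,
# 2-16, ..., 6-20.
units = synchronization_units(list(range(20)), 15)
```

A sequence of length L therefore yields L - ref_count + 1 synchronization units.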
As an example, after obtaining the sequence of synchronization units, for each synchronization unit in the sequence of synchronization units, the length of each relative position transformation vector in the synchronization unit may be determined first, then the sum of the lengths of each obtained relative position transformation vector is determined, and is divided by the number of relative position transformation vectors in the synchronization unit, so as to obtain the average distance corresponding to the synchronization unit. Then, the average distance corresponding to the relative position conversion vector sequence including a smaller number of relative position conversion vectors can be determined by the same method, and the determined average distance can be referred to as a reference average distance.
Then, the distance difference value between the average distance corresponding to each synchronization unit in the synchronization unit sequence and the reference average distance is determined, so that the synchronization unit with the smallest distance difference value from the reference average distance is selected, and the difference value obtained by subtracting 1 from the position of the first relative position transformation vector in the synchronization unit in the relative position transformation vector sequence corresponding to the synchronization unit is used as the time difference value between the two frame sequences.
For example, the relative position transformation vector sequence with the larger number of relative position transformation vectors is referred to as the first relative position transformation vector sequence; assuming it contains 10 relative position transformation vectors and the reference number is 5, 6 corresponding synchronization units can be obtained. Assuming that the distance difference corresponding to the 2nd synchronization unit is the smallest, this synchronization unit contains the second to sixth relative position transformation vectors of the first relative position transformation vector sequence. The position of the first relative position transformation vector of this synchronization unit in the first relative position transformation vector sequence is therefore 2; subtracting 1 gives a difference value of 1, so 1 is taken as the time difference value between the two frame sequences.
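The average-vector-length criterion described above can be sketched as follows (function names are illustrative assumptions; the "length" of a vector is taken to be its Euclidean norm):

```python
import math

def avg_length(vectors):
    """Average Euclidean norm of a list of vectors."""
    return sum(math.sqrt(sum(x * x for x in v)) for v in vectors) / len(vectors)

def offset_by_avg_length(seq_short, seq_long):
    """Select the window of the longer sequence whose average vector
    length is closest to the reference average length of the shorter
    sequence; the window's 1-based start position minus 1 is the
    time difference value."""
    ref = avg_length(seq_short)
    n = len(seq_short)
    windows = [seq_long[i:i + n] for i in range(len(seq_long) - n + 1)]
    diffs = [abs(avg_length(w) - ref) for w in windows]
    return diffs.index(min(diffs))  # 0-based index == position - 1

# The shorter sequence matches the window starting at position 2,
# so the returned offset is 1.
offset = offset_by_avg_length(
    [[1, 0], [0, 2]],
    [[5, 0], [1, 0], [0, 2], [9, 0]])
```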
As an example, after obtaining the sequence of synchronization units, for each synchronization unit in the sequence of synchronization units, an inner product of the synchronization unit and a sequence of relative position transform vectors containing a smaller number of relative position transform vectors may be determined as an average distance corresponding to the synchronization unit. Then, the synchronization unit with the minimum average distance is selected, and the difference value obtained by subtracting 1 from the position of the first relative position transformation vector in the synchronization unit in the corresponding relative position transformation vector sequence of the synchronization unit is used as the time difference value between the two frame sequences.
Step 406, for each synchronization unit in the sequence of synchronization units, performing the steps of: generating a relative position transform vector group sequence by using the synchronization unit and a relative position transform vector having the same position in a relative position transform vector sequence including a small number of relative position transform vectors as a relative position transform vector group; determining the distance corresponding to the relative position transformation vector group aiming at each relative position transformation vector group in the relative position transformation vector group sequence, and generating a distance sequence corresponding to the relative position transformation vector group sequence; and generating the average distance corresponding to the synchronization unit based on the distance sequence.
In this embodiment, the execution body may determine an average distance corresponding to each synchronization unit in the sequence of synchronization units. Wherein, for the average distance corresponding to each synchronization unit, the following steps can be performed:
First, each relative position transformation vector in the synchronization unit and the relative position transformation vector at the same position in the relative position transformation vector sequence containing the smaller number of relative position transformation vectors are taken together as one relative position transformation vector group.
For example, take the relative position transformation vector sequence containing the smaller number of relative position transformation vectors as the first relative position transformation vector sequence, and assume that the synchronization unit and this sequence each contain 10 relative position transformation vectors. Then the first relative position transformation vector in the synchronization unit and the first relative position transformation vector in the first relative position transformation vector sequence may be taken as the first relative position transformation vector group, the second relative position transformation vector in the synchronization unit and the second relative position transformation vector in the first relative position transformation vector sequence as the second relative position transformation vector group, and so on.
Then, the distance corresponding to each relative position transformation vector group is determined. In practice, for each relative position transformation vector group, if either of the two relative position transformation vectors it contains is a zero vector, the distance corresponding to that group can be taken as 0. For a relative position transformation vector group without a zero vector, the Euclidean distance, Minkowski distance, Chebyshev distance, cosine similarity, etc. of its two relative position transformation vectors may be taken as the distance corresponding to the group.
Then, the sum of the distances in the distance sequence corresponding to the synchronization unit may be determined and divided by the number of relative position transformation vectors contained in the synchronization unit to obtain the average distance corresponding to the synchronization unit.
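Steps 405 through 407 can be sketched end to end as follows. Function names are illustrative assumptions; the Euclidean distance is used for the vector groups, with the zero-vector rule from the step above.

```python
import math

def group_distance(v1, v2):
    """Distance for one relative position transformation vector group:
    0 if either member is a zero vector, else the Euclidean distance."""
    if all(x == 0 for x in v1) or all(x == 0 for x in v2):
        return 0.0
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(v1, v2)))

def average_distance(unit, seq_short):
    """Pair vectors at the same position and average their distances."""
    dists = [group_distance(u, v) for u, v in zip(unit, seq_short)]
    return sum(dists) / len(unit)

def time_difference(seq_short, seq_long):
    """Choose the synchronization unit with the smallest average
    distance; its 1-based start position minus 1 is the time
    difference value between the two frame sequences."""
    n = len(seq_short)
    units = [seq_long[i:i + n] for i in range(len(seq_long) - n + 1)]
    avgs = [average_distance(u, seq_short) for u in units]
    return avgs.index(min(avgs))  # 0-based index == position - 1

# The shorter sequence reappears one position into the longer one,
# so the time difference is 1.
offset = time_difference(
    [[1, 1], [2, 2]],
    [[9, 9], [1, 1], [2, 2], [0, 5]])
```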
Step 407 generates a time difference between two frame sequences based on the synchronization unit with the smallest average distance.
In this embodiment, the execution body may first select the synchronization unit with the smallest average distance based on the average distances corresponding to the synchronization units in the synchronization unit sequence determined in step 406. The position of the first relative position transformation vector of that synchronization unit in its corresponding relative position transformation vector sequence, minus 1, may then be used as the time difference value between the two frame sequences.
With continuing reference to fig. 5, fig. 5 is a schematic diagram of an application scenario of the method for generating time information according to the present embodiment. In the application scenario of fig. 5, first, respective frame sequences, respectively denoted as a first frame sequence 501 and a second frame sequence 502, of two captured videos of a target object captured in the same time period are obtained. The first frame sequence 501 includes three frames, i.e., a frame 11, a frame 12, and a frame 13. The second frame sequence 502 comprises four frames, frame 21, frame 22, frame 23 and frame 24.
Then, for the first frame sequence 501, frame 11 and frame 12 are selected to form frame group 11, and frame 12 and frame 13 are selected to form frame group 12, thereby obtaining a first frame group sequence 503. For the second frame sequence 502, a frame group 21 is formed by selecting the frame 21 and the frame 22, a frame group 22 is formed by selecting the frame 22 and the frame 23, and a frame group 23 is formed by selecting the frame 23 and the frame 24, so that a second frame group sequence 504 is obtained.
Then, for the first frame group sequence 503, position transformation matrices corresponding to the frames 11, 12, and 13, respectively, may be first determined, and then a product of the position transformation matrix corresponding to the frame 12 and an inverse matrix of the position transformation matrix corresponding to the frame 11 is taken as a first relative position transformation matrix M11 corresponding to the frame group 11, and a product of the position transformation matrix corresponding to the frame 13 and an inverse matrix of the position transformation matrix corresponding to the frame 12 is taken as a first relative position transformation matrix M12 corresponding to the frame group 12, thereby obtaining a first relative position transformation matrix sequence 505.
For the second frame group sequence 504, the position transformation matrices corresponding to the frame 21, the frame 22, the frame 23, and the frame 24 may be determined first; then the product of the position transformation matrix corresponding to the frame 22 and the inverse matrix of the position transformation matrix corresponding to the frame 21 is used as the second relative position transformation matrix M21 corresponding to the frame group 21, the product of the position transformation matrix corresponding to the frame 23 and the inverse matrix of the position transformation matrix corresponding to the frame 22 is used as the second relative position transformation matrix M22 corresponding to the frame group 22, and the product of the position transformation matrix corresponding to the frame 24 and the inverse matrix of the position transformation matrix corresponding to the frame 23 is used as the second relative position transformation matrix M23 corresponding to the frame group 23, so as to obtain the second relative position transformation matrix sequence 506.
Then, a relative position transformation vector V11 corresponding to the first relative position transformation matrix M11 and a relative position transformation vector V12 corresponding to the first relative position transformation matrix M12 may be determined, thereby obtaining a first relative position transformation vector sequence 507. Then, a relative position transformation vector V21 corresponding to the second relative position transformation matrix M21, a relative position transformation vector V22 corresponding to the second relative position transformation matrix M22, and a relative position transformation vector V23 corresponding to the second relative position transformation matrix M23 may be determined, so as to obtain a second relative position transformation vector sequence 508.
Since the first relative position transformation vector sequence 507 includes two relative position transformation vectors and the second relative position transformation vector sequence 508 includes three, 2 may be selected as the reference number. The vectors V21 and V22 are then selected from the second relative position transformation vector sequence 508 to form a first synchronization unit, and the vectors V22 and V23 are selected to form a second synchronization unit, so as to obtain a synchronization unit sequence.
Then, for the first synchronization unit, its first vector V21 and the first vector V11 of the first relative position transformation vector sequence form a relative position transformation vector group 510, and its second vector V22 and the second vector V12 of the first relative position transformation vector sequence form a relative position transformation vector group 511, so as to generate the relative position transformation vector group sequence corresponding to the first synchronization unit.
Next, the distances corresponding to the relative position transformation vector group 510 and the relative position transformation vector group 511 may be determined, and the average of the two distances may be used as the average distance corresponding to the first synchronization unit, recorded as the first average distance.
Similarly, for the second synchronization unit, its first vector V22 and the first vector V11 of the first relative position transformation vector sequence form a relative position transformation vector group 512, and its second vector V23 and the second vector V12 of the first relative position transformation vector sequence form a relative position transformation vector group 513, so as to generate the relative position transformation vector group sequence corresponding to the second synchronization unit. The distances corresponding to the relative position transformation vector group 512 and the relative position transformation vector group 513 may then be determined, and the average of the two distances may be used as the average distance corresponding to the second synchronization unit, recorded as the second average distance.
Then, the first average distance and the second average distance are compared. Assuming the second average distance is smaller, the first relative position transformation vector of the second synchronization unit, V22, is the second relative position transformation vector in the second relative position transformation vector sequence 508, i.e. its position is 2. Subtracting 1 from position 2 gives a difference of 1, so 1 may be taken as the time difference between the first frame sequence 501 and the second frame sequence 502.
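The worked example above can be sketched in code. The following Python snippet is a minimal illustration, assuming hypothetical two-dimensional relative position transformation vectors (in practice the vectors would be derived from the relative position transformation matrices):

```python
import numpy as np

# Hypothetical relative position transformation vectors (values are illustrative).
# The first sequence (507) holds two vectors, the second sequence (508) three.
V11, V12 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
V21, V22, V23 = np.array([0.3, 0.2]), np.array([1.1, 0.1]), np.array([0.1, 0.9])

short_seq = [V11, V12]              # reference number = 2
long_seq = [V21, V22, V23]
k = len(short_seq)

# Slide a window of size k over the longer sequence to form the
# synchronization units [V21, V22] and [V22, V23].
units = [long_seq[i:i + k] for i in range(len(long_seq) - k + 1)]

# Average Euclidean distance between vectors at corresponding positions.
avg_dists = [np.mean([np.linalg.norm(a - b) for a, b in zip(u, short_seq)])
             for u in units]

best = int(np.argmin(avg_dists))    # index of the unit with the smallest average distance
position = best + 1                 # 1-based position of the unit's first vector
time_difference = position - 1      # as in the example: position 2 gives a difference of 1
```

With these illustrative values the second synchronization unit is the closer one, so `time_difference` evaluates to 1, matching the example.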
As can be seen from fig. 5, compared with the embodiment corresponding to fig. 2, the scheme described in this embodiment may use the product of the position transformation matrix corresponding to the next frame and the inverse of the position transformation matrix corresponding to the previous frame in each frame group as the relative position transformation matrix corresponding to the frame group. Then, after the relative position transformation vector sequences corresponding to the two frame sequences are obtained, the number of relative position transformation vectors in the shorter sequence may be taken as a reference number, and synchronization units may be selected from the longer sequence based on this reference number to generate a synchronization unit sequence. Each synchronization unit is then compared in turn with the shorter sequence, and the synchronization unit with the smallest average distance is selected based on the distances between the relative position transformation vectors at corresponding positions. Finally, the time difference between the two frame sequences can be determined from the position, within the longer sequence, of the first relative position transformation vector of the synchronization unit with the smallest average distance.
With further reference to fig. 6, as an implementation of the methods shown in the above-mentioned figures, the present application provides an embodiment of an apparatus for generating time information, which corresponds to the method embodiment shown in fig. 2, and which is particularly applicable in various electronic devices.
As shown in fig. 6, the apparatus 600 for generating time information of the present embodiment includes an acquisition unit 601, a frame group sequence generation unit 602, a relative position transformation matrix sequence generation unit 603, a relative position transformation vector sequence generation unit 604, and a time difference value generation unit 605. The acquisition unit 601 is configured to acquire respective frame sequences of two captured videos of a target object captured in the same period of time. The frame group sequence generation unit 602 is configured to, for each frame sequence, sequentially select frame groups from the frame sequence and generate a frame group sequence, where a frame group includes two consecutive frames in the frame sequence. The relative position transformation matrix sequence generation unit 603 is configured to determine, for each frame group sequence, the relative position transformation matrix corresponding to each frame group in the frame group sequence and generate the relative position transformation matrix sequence corresponding to the frame sequence, where the relative position transformation matrix represents the transformation relationship between the positions of the target object corresponding to the two frames in the frame group. The relative position transformation vector sequence generation unit 604 is configured to, for each frame sequence, perform data processing on the relative position transformation matrices in the relative position transformation matrix sequence corresponding to the frame sequence to obtain the corresponding relative position transformation vectors, so as to generate the relative position transformation vector sequence corresponding to the frame sequence, where the relative position transformation vector is a vector representation of the relative position transformation matrix. The time difference value generation unit 605 is configured to generate the time difference value between the two frame sequences based on the relative position transformation vector sequences respectively corresponding to the two frame sequences.
In this embodiment, the specific processes of the acquisition unit 601, the frame group sequence generation unit 602, the relative position transformation matrix sequence generation unit 603, the relative position transformation vector sequence generation unit 604, and the time difference value generation unit 605 of the apparatus 600 for generating time information, and the technical effects they bring, can refer to the descriptions of steps 201 to 205 in the embodiment corresponding to fig. 2, and are not described herein again.
In some optional implementations of this embodiment, the relative position transformation matrix sequence generating unit 603 is further configured to: for each frame in each frame group in the frame group sequence, determining a position transformation matrix between the position of a target object corresponding to the frame and the position of corresponding shooting equipment, wherein the position transformation matrix is used for representing the transformation relation between the position of the target object corresponding to the frame and the position of the shooting equipment; and for each frame group in the frame group sequence, multiplying the position transformation matrix corresponding to the next frame in the frame group by the inverse matrix of the position transformation matrix corresponding to the previous frame to obtain the relative position transformation matrix corresponding to the frame group.
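The matrix computation described above can be illustrated with homogeneous pose matrices. The Python sketch below assumes each frame's position transformation matrix is a 4x4 homogeneous matrix relating the target object's position to the shooting device's position; the pose values themselves are made up for illustration:

```python
import numpy as np

def pose(angle_z, tx, ty, tz):
    """Hypothetical 4x4 homogeneous pose: rotation about z plus translation."""
    c, s = np.cos(angle_z), np.sin(angle_z)
    T = np.eye(4)
    T[0, 0], T[0, 1] = c, -s
    T[1, 0], T[1, 1] = s, c
    T[:3, 3] = [tx, ty, tz]
    return T

T_prev = pose(0.0, 1.0, 0.0, 2.0)   # position transformation matrix of the previous frame
T_next = pose(0.1, 1.2, 0.1, 2.0)   # position transformation matrix of the next frame

# Relative position transformation matrix of the frame group: the next frame's
# matrix multiplied by the inverse of the previous frame's matrix.
M_rel = T_next @ np.linalg.inv(T_prev)

# One possible vector representation: flatten the top three rows
# (rotation and translation) into a 12-dimensional vector.
v_rel = M_rel[:3, :].flatten()
```

By construction, `M_rel` carries the previous frame's pose onto the next frame's pose, i.e. `M_rel @ T_prev` equals `T_next`.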
In some optional implementations of this embodiment, the time difference value generation unit 605 is further configured to: select, from the two relative position transformation vector sequences, the sequence containing fewer relative position transformation vectors; take the number of relative position transformation vectors it contains as a reference number; and sequentially select synchronization units from the sequence containing more relative position transformation vectors to generate a synchronization unit sequence, where a synchronization unit includes a reference number of consecutive relative position transformation vectors in that sequence; and then generate the time difference value between the two frame sequences based on the synchronization unit sequence.
In some optional implementations of this embodiment, the time difference value generation unit 605 is further configured to: determine the average distance corresponding to each synchronization unit in the synchronization unit sequence; and generate the time difference value between the two frame sequences based on the synchronization unit with the smallest corresponding average distance.
In some optional implementations of this embodiment, the time difference value generation unit 605 is further configured to: for each synchronization unit in the synchronization unit sequence, perform the following steps: take the relative position transformation vectors at the same positions in the synchronization unit and in the relative position transformation vector sequence containing fewer relative position transformation vectors as relative position transformation vector groups, to generate a relative position transformation vector group sequence; for each relative position transformation vector group in that sequence, determine the corresponding distance, to generate a distance sequence corresponding to the relative position transformation vector group sequence; and generate the average distance corresponding to the synchronization unit based on the distance sequence.
In the apparatus provided by the above embodiment of the present application, the acquisition unit 601 obtains the frame sequences of the two captured videos, and the frame group sequence generation unit 602 selects pairs of consecutive frames from each frame sequence to obtain two frame group sequences. The relative position transformation matrix sequence generation unit 603 then generates the relative position transformation matrix sequence corresponding to each frame group sequence, and the relative position transformation vector sequence generation unit 604 processes each relative position transformation matrix into a corresponding relative position transformation vector, thereby generating the relative position transformation vector sequences. Finally, the time difference value generation unit 605 generates the time difference value between the two frame sequences based on the relative position transformation vector sequences respectively corresponding to the two frame sequences.
Referring now to FIG. 7, shown is a block diagram of a computer system 700 suitable for use in implementing a terminal device or server of an embodiment of the present application. The terminal device or the server shown in fig. 7 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 7, the computer system 700 includes a Central Processing Unit (CPU) 701, which can perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 702 or a program loaded from a storage section 708 into a Random Access Memory (RAM) 703. In the RAM 703, various programs and data necessary for the operation of the system 700 are also stored. The CPU 701, the ROM 702, and the RAM 703 are connected to each other via a bus 704. An input/output (I/O) interface 705 is also connected to bus 704.
The following components are connected to the I/O interface 705: an input portion 706 including a keyboard, a mouse, and the like; an output section 707 including a display such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker; a storage section 708 including a hard disk and the like; and a communication section 709 including a network interface card such as a LAN card, a modem, or the like. The communication section 709 performs communication processing via a network such as the internet. A drive 710 is also connected to the I/O interface 705 as needed. A removable medium 711 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 710 as necessary, so that the computer program read out therefrom is mounted in the storage section 708 as necessary.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 709, and/or installed from the removable medium 711. The computer program, when executed by a Central Processing Unit (CPU) 701, performs the above-described functions defined in the method of the present application.
It should be noted that the computer readable medium of the present application can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or hardware. The described units may also be provided in a processor, and may be described as: a processor includes an acquisition unit, a frame group sequence generation unit, a relative position transform matrix sequence generation unit, a relative position transform vector sequence generation unit, and a time difference value generation unit. Here, the names of these units do not constitute a limitation of the unit itself in some cases, and for example, the acquisition unit may also be described as a "unit that acquires respective frame sequences of two captured videos of the target object captured in the same period of time".
As another aspect, the present application also provides a computer-readable medium, which may be contained in the apparatus described in the above embodiments, or may exist separately without being assembled into the apparatus. The computer-readable medium carries one or more programs which, when executed by the apparatus, cause the apparatus to: acquire respective frame sequences of two captured videos of a target object captured in the same period of time; for each frame sequence, sequentially select frame groups from the frame sequence to generate a frame group sequence, where a frame group includes two consecutive frames in the frame sequence; for each frame group sequence, determine the relative position transformation matrix corresponding to each frame group in the frame group sequence and generate the relative position transformation matrix sequence corresponding to the frame sequence, where the relative position transformation matrix represents the transformation relationship between the positions of the target object corresponding to the two frames in the frame group; for each frame sequence, perform data processing on the relative position transformation matrices in the corresponding relative position transformation matrix sequence to obtain the corresponding relative position transformation vectors, so as to generate the relative position transformation vector sequence corresponding to the frame sequence, where the relative position transformation vector is a vector representation of the relative position transformation matrix; and generate the time difference value between the two frame sequences based on the relative position transformation vector sequences respectively corresponding to the two frame sequences.
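The steps carried by the programs above can be sketched end to end. The following Python sketch makes simplifying assumptions: each frame is represented directly by a hypothetical 4x4 pose matrix of the target object (in practice these would be estimated from the frames), and the relative position transformation vector is taken as a flattening of the relative position transformation matrix:

```python
import numpy as np

def relative_vectors(poses):
    """Consecutive frame groups -> relative position transformation vectors."""
    return [(poses[i + 1] @ np.linalg.inv(poses[i]))[:3, :].flatten()
            for i in range(len(poses) - 1)]

def time_difference(vecs_a, vecs_b):
    """Offset (in frames) of the shorter sequence within the longer one."""
    short, long_ = sorted([vecs_a, vecs_b], key=len)
    k = len(short)
    # Synchronization units: windows of the reference number k over the longer sequence.
    units = [long_[i:i + k] for i in range(len(long_) - k + 1)]
    avg = [np.mean([np.linalg.norm(a - b) for a, b in zip(u, short)])
           for u in units]
    # argmin equals the 1-based position of the best unit's first vector, minus 1.
    return int(np.argmin(avg))

# Illustrative poses: the object translates along x by t**2 at time t,
# so consecutive relative transformations differ from frame to frame.
def pose_x(tx):
    T = np.eye(4)
    T[0, 3] = tx
    return T

poses_long = [pose_x(t ** 2) for t in range(6)]   # frames at t = 0..5
poses_short = poses_long[1:5]                     # frames at t = 1..4, one frame late

td = time_difference(relative_vectors(poses_short), relative_vectors(poses_long))
```

With these made-up poses, the shorter sequence starts one frame later than the longer one, and `td` evaluates to 1.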
The foregoing description is only exemplary of the preferred embodiments of the application and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention herein disclosed is not limited to the particular combination of features described above, but also encompasses other arrangements in which any combination of the features described above or their equivalents does not depart from the spirit of the invention disclosed above. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (12)

1. A method for generating time information, comprising:
acquiring respective frame sequences of two shot videos of a target object shot in the same time period;
for each frame sequence, sequentially selecting frame groups from the frame sequence to generate a frame group sequence, wherein a frame group comprises two consecutive frames in the frame sequence;
for each frame group sequence, determining a relative position transformation matrix corresponding to a frame group in the frame group sequence, and generating a relative position transformation matrix sequence corresponding to the frame sequence, wherein the relative position transformation matrix is used for representing a transformation relation between positions of target objects respectively corresponding to two frames in the frame group;
for each frame sequence, carrying out data processing on a relative position transformation matrix in a relative position transformation matrix sequence corresponding to the frame sequence to obtain a relative position transformation vector corresponding to the relative position transformation matrix so as to generate a relative position transformation vector sequence corresponding to the frame sequence, wherein the relative position transformation vector is a vector representation of the relative position transformation matrix;
generating a time difference value between two frame sequences based on the relative position transformation vector sequences respectively corresponding to the two frame sequences;
and correcting respective frame sequences of the two shot videos by using the time difference value so as to realize the calibration of shooting equipment corresponding to the two shot videos respectively.
2. The method of claim 1, wherein the determining a relative position transformation matrix corresponding to a frame group in the sequence of frame groups comprises:
for each frame in each frame group in the frame group sequence, determining a position transformation matrix between the position of a target object corresponding to the frame and the position of corresponding shooting equipment, wherein the position transformation matrix is used for representing the transformation relation between the position of the target object corresponding to the frame and the position of the shooting equipment;
and for each frame group in the frame group sequence, multiplying the position transformation matrix corresponding to the next frame in the frame group by the inverse matrix of the position transformation matrix corresponding to the previous frame to obtain the relative position transformation matrix corresponding to the frame group.
3. The method according to claim 1 or 2, wherein the generating a time difference value between two frame sequences based on their corresponding relative position transformation vector sequences comprises:
selecting a relative position transformation vector sequence containing a smaller number of relative position transformation vectors from the two relative position transformation vector sequences, taking the number of relative position transformation vectors contained in the selected relative position transformation vector sequence as a reference number, and sequentially selecting synchronization units from the relative position transformation vector sequence containing a larger number of relative position transformation vectors to generate a synchronization unit sequence, wherein a synchronization unit comprises a reference number of consecutive relative position transformation vectors in the relative position transformation vector sequence;
based on the sequence of synchronization units, a time difference between the two sequences of frames is generated.
4. The method of claim 3, wherein the generating a time difference between the two frame sequences based on the synchronization unit sequence comprises:
determining an average distance corresponding to each synchronization unit in the synchronization unit sequence;
generating a time difference between the two frame sequences based on the synchronization unit having the smallest corresponding average distance.
5. The method of claim 4, wherein the determining an average distance for each synchronization unit in the sequence of synchronization units comprises:
for each synchronization unit in the sequence of synchronization units, performing the following steps for that synchronization unit:
generating a relative position conversion vector group sequence by using the synchronization unit and a relative position conversion vector with the same position in a relative position conversion vector sequence containing a small number of relative position conversion vectors as a relative position conversion vector group;
determining the distance corresponding to each relative position transformation vector group in the relative position transformation vector group sequence, and generating a distance sequence corresponding to the relative position transformation vector group sequence;
and generating an average distance corresponding to the synchronous unit based on the distance sequence.
6. An apparatus for generating time information, comprising:
an acquisition unit configured to acquire respective frame sequences of two captured videos of a target object captured within the same period of time;
a frame group sequence generation unit configured to, for each frame sequence, sequentially select frame groups from the frame sequence and generate a frame group sequence, wherein a frame group comprises two consecutive frames in the frame sequence;
a relative position transformation matrix sequence generating unit configured to determine, for each frame group sequence, a relative position transformation matrix corresponding to a frame group in the frame group sequence, and generate a relative position transformation matrix sequence corresponding to the frame sequence, wherein the relative position transformation matrix is used for representing a transformation relationship between positions of target objects respectively corresponding to two frames in the frame group;
a relative position transformation vector sequence generation unit, configured to perform data processing on a relative position transformation matrix in a relative position transformation matrix sequence corresponding to each frame sequence to obtain a relative position transformation vector corresponding to the relative position transformation matrix, so as to generate a relative position transformation vector sequence corresponding to the frame sequence, wherein the relative position transformation vector is a vector representation of the relative position transformation matrix;
a time difference value generation unit configured to generate a time difference value between two frame sequences based on relative position transformation vector sequences respectively corresponding to the two frame sequences;
and correcting respective frame sequences of the two shot videos by utilizing the time difference value so as to realize the calibration of shooting equipment corresponding to the two shot videos respectively.
7. The apparatus of claim 6, wherein the relative position transform matrix sequence generation unit is further configured to:
for each frame in each frame group in the frame group sequence, determining a position transformation matrix between the position of a target object corresponding to the frame and the position of corresponding shooting equipment, wherein the position transformation matrix is used for representing the transformation relation between the position of the target object corresponding to the frame and the position of the shooting equipment;
and for each frame group in the frame group sequence, multiplying the position transformation matrix corresponding to the next frame in the frame group by the inverse matrix of the position transformation matrix corresponding to the previous frame to obtain the relative position transformation matrix corresponding to the frame group.
8. The apparatus according to claim 6 or 7, wherein the time difference value generation unit is further configured to:
selecting a relative position transformation vector sequence containing a smaller number of relative position transformation vectors from the two relative position transformation vector sequences, taking the number of relative position transformation vectors contained in the selected relative position transformation vector sequence as a reference number, and sequentially selecting synchronization units from the relative position transformation vector sequence containing a larger number of relative position transformation vectors to generate a synchronization unit sequence, wherein a synchronization unit comprises a reference number of consecutive relative position transformation vectors in the relative position transformation vector sequence;
based on the sequence of synchronization units, a time difference between the two sequences of frames is generated.
9. The apparatus of claim 8, wherein the time difference generation unit is further configured to:
determining an average distance corresponding to each synchronization unit in the synchronization unit sequence;
and generating a time difference value between the two frame sequences based on the synchronization unit with the minimum corresponding average distance.
10. The apparatus of claim 9, wherein the time difference value generation unit is further configured to:
for each synchronization unit in the sequence of synchronization units, performing the following steps for that synchronization unit:
generating a relative position conversion vector group sequence by using the synchronization unit and a relative position conversion vector with the same position in a relative position conversion vector sequence with a small number of relative position conversion vectors as a relative position conversion vector group;
determining the distance corresponding to the relative position transformation vector group aiming at each relative position transformation vector group in the relative position transformation vector group sequence, and generating a distance sequence corresponding to the relative position transformation vector group sequence;
and generating the average distance corresponding to the synchronization unit based on the distance sequence.
11. An electronic device, comprising:
one or more processors;
a storage device having one or more programs stored thereon;
when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-5.
12. A computer-readable medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-5.
CN201810331501.6A 2018-04-13 2018-04-13 Method and apparatus for generating time information Active CN110381264B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810331501.6A CN110381264B (en) 2018-04-13 2018-04-13 Method and apparatus for generating time information

Publications (2)

Publication Number Publication Date
CN110381264A (en) 2019-10-25
CN110381264B (en) 2022-12-02

Family

ID=68243687

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810331501.6A Active CN110381264B (en) 2018-04-13 2018-04-13 Method and apparatus for generating time information

Country Status (1)

Country Link
CN (1) CN110381264B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130047856A (en) * 2011-11-01 2013-05-09 선문대학교 산학협력단 Apparatus and method for multi-view 3d video display
CN104063867A (en) * 2014-06-27 2014-09-24 浙江宇视科技有限公司 Multi-camera video synchronization method and multi-camera video synchronization device
US9049385B1 (en) * 2014-07-01 2015-06-02 Robert K. McCullough Tool for synchronizing video media clips
CN106991690A (en) * 2017-04-01 2017-07-28 电子科技大学 A kind of video sequence synchronous method based on moving target timing information

Similar Documents

Publication Publication Date Title
CN108492364B (en) Method and apparatus for generating image generation model
CN109191514B (en) Method and apparatus for generating a depth detection model
CN108830235B (en) Method and apparatus for generating information
CN107633218B (en) Method and apparatus for generating image
CN109255337B (en) Face key point detection method and device
CN109829432B (en) Method and apparatus for generating information
CN110516678B (en) Image processing method and device
CN110310299B (en) Method and apparatus for training optical flow network, and method and apparatus for processing image
CN110111241B (en) Method and apparatus for generating dynamic image
CN109118456B (en) Image processing method and device
CN111325792A (en) Method, apparatus, device, and medium for determining camera pose
CN108921792B (en) Method and device for processing pictures
CN114869528A (en) Scanning data processing method, device, equipment and medium
CN110381264B (en) Method and apparatus for generating time information
KR101806840B1 (en) High Resolution 360 degree Video Generation System using Multiple Cameras
CN112927340A (en) Three-dimensional reconstruction acceleration method, system and equipment independent of mechanical placement
CN109840059B (en) Method and apparatus for displaying image
CN109816791B (en) Method and apparatus for generating information
CN111369475A (en) Method and apparatus for processing video
CN115170395A (en) Panoramic image stitching method, panoramic image stitching device, electronic equipment, panoramic image stitching medium and program product
CN111314627B (en) Method and apparatus for processing video frames
CN115002345A (en) Image correction method and device, electronic equipment and storage medium
CN112492230B (en) Video processing method and device, readable medium and electronic equipment
CN113066166A (en) Image processing method and device and electronic equipment
CN113688928A (en) Image matching method and device, electronic equipment and computer readable medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant