CN115276871A - Image frame time stamp determination method, device, medium, data processor and system - Google Patents


Info

Publication number: CN115276871A
Application number: CN202210896646.7A
Authority: CN (China)
Other languages: Chinese (zh)
Inventors: 韩勇, 胡小波
Assignee (current and original): LeiShen Intelligent System Co Ltd
Legal status: Pending (an assumption by Google Patents, not a legal conclusion; no legal analysis was performed)
Prior art keywords: timestamp, image frame, exposure, currently received
Priority: CN202210896646.7A, filed by LeiShen Intelligent System Co Ltd

Classifications

    • H04J 3/0602 — Time-division multiplex systems; synchronising arrangements; systems characterised by the synchronising information used
    • H04J 3/065 — Clock or time synchronisation in a network; synchronisation among TDM nodes using timestamps


Abstract

The application discloses a method, an apparatus, a medium, a data processor and a system for determining the timestamp of an image frame. The method comprises: determining, according to the trigger timestamp and the exposure timestamp of the current image frame, whether the current image frame was exposed normally; if the exposure was normal, determining a synchronization timestamp for the current image frame according to the receive timestamp and the exposure timestamp; and if the exposure was abnormal, determining the synchronization timestamp for the current image frame according to the receive timestamp and the trigger timestamp. This scheme provides a fault-tolerance mechanism for determining the synchronization timestamp and improves its accuracy.

Description

Image frame time stamp determination method, device, medium, data processor and system
Technical Field
The present application relates to the field of computer application technologies, and in particular, to a method, an apparatus, a medium, a data processor, and a system for determining a timestamp of an image frame.
Background
Multi-sensor data fusion is widely used in autonomous driving and in surveying and mapping. Fusion is performed based on the timestamp information carried by each frame of sensor data, so the accuracy of that timestamp information directly affects the fusion result.
In the related art, when the data processing module receives sensor data it typically binds the timestamp information transmitted by the synchronization control module to that sensor data as the sensor data's timestamp.
However, owing to objective factors such as the electromagnetic environment, network state and system load, sensor data such as image frames, and the timestamp information sent by the synchronization control module, may suffer packet loss. This reduces the accuracy of the timestamp information that the related art assigns to sensor data.
Disclosure of Invention
The application provides a method, an apparatus, a medium, a data processor and a system for determining the timestamp of an image frame, which improve the accuracy of timestamp information.
According to a first aspect of the present application, there is provided a method for determining the timestamp of an image frame, the method comprising:
determining the image exposure duration corresponding to the currently received image frame according to the currently received trigger timestamp and exposure timestamp;
determining whether the currently received image frame was exposed normally by comparing the image exposure duration with a preset exposure duration;
if the currently received image frame was exposed normally, determining the image transmission duration corresponding to the currently received image frame according to the receive timestamp and the exposure timestamp, and determining a synchronization timestamp for the currently received image frame by comparing the image transmission duration with a preset transmission duration;
if the exposure of the currently received image frame was abnormal, determining the image acquisition duration corresponding to the currently received image frame according to the receive timestamp and the trigger timestamp, and determining a synchronization timestamp for the currently received image frame by comparing the image acquisition duration with a preset acquisition duration.
According to a second aspect of the present application, there is provided an apparatus for determining the timestamp of an image frame, the apparatus comprising:
an image exposure duration determining module, configured to determine the image exposure duration corresponding to the currently received image frame according to the currently received trigger timestamp and exposure timestamp;
an exposure condition determining module, configured to determine whether the currently received image frame was exposed normally by comparing the image exposure duration with a preset exposure duration;
a first synchronization timestamp determining module, configured to: if the currently received image frame was exposed normally, determine the image transmission duration corresponding to the currently received image frame according to the receive timestamp and the exposure timestamp, and determine a synchronization timestamp for the currently received image frame by comparing the image transmission duration with a preset transmission duration;
a second synchronization timestamp determining module, configured to: if the exposure of the currently received image frame was abnormal, determine the image acquisition duration corresponding to the currently received image frame according to the receive timestamp and the trigger timestamp, and determine a synchronization timestamp for the currently received image frame by comparing the image acquisition duration with a preset acquisition duration.
According to a third aspect of the present application, there is provided a computer-readable storage medium on which a computer program is stored; when executed by a processor, the program implements the method for determining the timestamp of an image frame described in the embodiments of the present application.
According to a fourth aspect of the present application, there is provided a data processor comprising a memory, a processor and a computer program stored in the memory and executable on the processor; when the processor executes the computer program, it implements the method for determining the timestamp of an image frame described in the embodiments of the present application.
According to a fifth aspect of the present application, there is provided a system for determining the timestamp of an image frame, the system comprising:
a data processor, configured to send a trigger pulse for starting the image acquisition module and to record the time of the trigger pulse as a trigger timestamp; the data processor is further configured to receive the image frames output by the image acquisition module and to record the time at which each image frame is received as a receive timestamp, and to receive an exposure timestamp output by the image acquisition module;
the data processor is further configured to determine the image exposure duration corresponding to the currently received image frame according to the currently received trigger timestamp and exposure timestamp; to determine whether the currently received image frame was exposed normally by comparing the image exposure duration with a preset exposure duration; if the currently received image frame was exposed normally, to determine the image transmission duration corresponding to the currently received image frame according to the receive timestamp and the exposure timestamp, and to determine a synchronization timestamp for the currently received image frame by comparing the image transmission duration with a preset transmission duration; and if the exposure of the currently received image frame was abnormal, to determine the image acquisition duration corresponding to the currently received image frame according to the receive timestamp and the trigger timestamp, and to determine a synchronization timestamp for the currently received image frame by comparing the image acquisition duration with a preset acquisition duration;
an image acquisition module, configured to receive the trigger pulse sent by the data processor, to perform exposure according to the trigger pulse to acquire an image frame, and to send the acquired image frame to the data processor; the module also records the time at which exposure completes as an exposure timestamp and sends that exposure timestamp to the data processor.
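The trigger-expose-receive flow carried out by these components can be sketched compactly. The following Python model is illustrative only (the patent publishes no source code, and all names here are assumptions); it shows which component stamps which time in an ideal, lossless cycle:

```python
from dataclasses import dataclass


@dataclass
class FrameRecord:
    """The three timestamps gathered for one image frame (seconds)."""
    trigger_ts: float   # data processor: time the trigger pulse was sent
    exposure_ts: float  # acquisition module: time exposure completed
    receive_ts: float   # data processor: time the image frame arrived


def capture_frame(now: float, exposure_time: float, transfer_time: float) -> FrameRecord:
    """Simulate one trigger -> expose -> transfer cycle with no packet loss."""
    trigger_ts = now                           # data processor fires the trigger pulse
    exposure_ts = trigger_ts + exposure_time   # acquisition module finishes exposing
    receive_ts = exposure_ts + transfer_time   # frame reaches the data processor
    return FrameRecord(trigger_ts, exposure_ts, receive_ts)
```

In this lossless model the three timestamps are strictly ordered. The fault-tolerance mechanism described below exists precisely because packet loss can break the correspondence between a frame and the trigger or exposure timestamps currently held by the data processor.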
According to the technical solution of the embodiments of the application, whether the current image frame was exposed normally is determined from its trigger timestamp and exposure timestamp, and a different way of determining the synchronization timestamp is selected depending on the exposure condition. This provides a fault-tolerance mechanism for determining the synchronization timestamp: even when timestamp information or image frame information is lost, an accurate synchronization timestamp can still be determined for the current image frame. The accuracy of the synchronization timestamp is thereby effectively improved, and failures of data fusion, or erroneous fusion results, caused by an inaccurate synchronization timestamp are effectively avoided.
It should be understood that the statements in this section are not intended to identify key or critical features of the embodiments of the present application, nor are they intended to limit the scope of the present application. Other features of the present application will become apparent from the following description.
Drawings
To explain the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present application; those skilled in the art may obtain other drawings from them without creative effort.
Fig. 1 is a flow chart of a method for timestamp determination of an image frame provided according to an embodiment;
fig. 2 is a flowchart of a time stamp determination method of an image frame provided according to the second embodiment;
fig. 3 is a flowchart of a time stamp determination method of an image frame provided according to the third embodiment;
fig. 4A is a schematic structural diagram of a time stamp determining system for an image frame according to an embodiment of the present application;
fig. 4B is a schematic structural diagram of another system for determining a timestamp of an image frame according to an embodiment of the present application.
Fig. 5 is a schematic structural diagram of an apparatus for determining a timestamp of an image frame according to a fourth embodiment of the present application;
fig. 6 is a schematic structural diagram of a data processor according to a fifth embodiment of the present application.
Detailed Description
To help those skilled in the art better understand the technical solutions, the solutions in the embodiments of the present application are described below clearly and completely with reference to the drawings. The described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person skilled in the art from these embodiments without creative effort fall within the protection scope of the present application.
It should be noted that the terms "first," "second," "target," and "candidate" and the like in the description and claims of this application and the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It should be understood that the data so used may be interchanged under appropriate circumstances such that embodiments of the application described herein may be implemented in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example one
Fig. 1 is a flowchart of a method for determining the timestamp of an image frame according to Embodiment 1. The method is applicable to multi-sensor fusion scenarios and is executed by an apparatus for determining timestamps of image frames; the apparatus may be implemented in hardware and/or software and may be integrated into an electronic device that runs the system, such as a data processor.
To facilitate understanding of the method for determining the timestamp of an image frame provided in the embodiments of the present application, the timestamp determination system in which the method runs is introduced first. Fig. 4A is a schematic structural diagram of a system for determining the timestamp of an image frame according to an embodiment of the present application. Referring to fig. 4A, a timestamp determination system 400 for image frames includes a data processor 410 and an image acquisition module 420, where the data processor 410 is communicatively coupled to the image acquisition module 420.
The data processor 410 is configured to send a trigger pulse for starting the image acquisition module 420 and to record the time of the trigger pulse as a trigger timestamp; it is further configured to receive the image frame output by the image acquisition module 420 and to record the time of receiving the image frame as a receive timestamp; and it is further configured to receive the exposure timestamp output by the image acquisition module 420.
the image acquisition module 420 is configured to receive a trigger pulse sent by the data processor 410, perform exposure according to the trigger pulse to acquire an image frame, and send the acquired image frame to the data processor 410; the time when the exposure is completed is recorded as an exposure time stamp, and the exposure time stamp is transmitted to the data processor 410.
The data processor 410 is further configured to determine a synchronization timestamp for a currently received image frame based on the trigger timestamp, the exposure timestamp and the receive timestamp.
Fig. 4B is a schematic structural diagram of another system for determining the timestamp of an image frame according to an embodiment of the present application. As shown in fig. 4B, this timestamp determination system includes a first data processor 411, a second data processor 412 and an image acquisition module 420. The first data processor 411 and the second data processor 412 jointly carry out the operations that the data processor 410 of fig. 4A performs when determining the synchronization timestamp for the currently received image frame; sharing the work in this way can improve the efficiency with which the synchronization timestamp is determined.
The first data processor 411 is communicatively connected to the image acquisition module 420 via a trigger interface (not shown), and the first data processor 411 is connected to the second data processor 412 via a communication interface (not shown). The second data processor 412 is communicatively coupled to the image acquisition module 420. Illustratively, the image acquisition module 420 may be a camera module. The first data processor 411 and the second data processor 412 carry a first data processing chip and a second data processing chip, respectively. The two chips may be of the same or different types; this is not limited here and depends on the actual situation. For example, each may be an MCU (Microcontroller Unit) with customization capability. The first data processing chip is programmed with first data processing logic and the second data processing chip with second data processing logic; the two are different. The first data processing logic defines the data processing flow of the first data processor 411, and the second data processing logic defines the data processing flow of the second data processor 412.
The first data processor 411 is configured to send a trigger pulse for starting the image acquisition module, record a time of the trigger pulse as a trigger timestamp, and send a trigger pulse signal to the image acquisition module 420 through the trigger interface; the trigger pulse signal is used for instructing the image acquisition module 420 to perform exposure and feed back an exposure pulse signal;
the image acquisition module 420 is configured to receive a trigger pulse sent by the first data processor 411, perform exposure according to the trigger pulse to acquire an image frame, and send the acquired image frame to the second data processor 412; the image capturing module 420 is further configured to record the time when the exposure is completed as an exposure time stamp, and send the exposure time stamp to the first data processor 411.
The first data processor 411 sends the trigger timestamp and the exposure timestamp to the second data processor 412 through the communication interface; the second data processor 412 determines a synchronization timestamp for the currently received image frame based on the trigger timestamp, the exposure timestamp and the receive timestamp.
Of course, it is understood that the image acquisition module 420 may also send the exposure timestamp to the second data processor 412 directly, rather than via the first data processor 411. The specific path used to transmit the exposure timestamp to the second data processor 412 may be chosen according to actual service requirements and is not limited here.
In a specific embodiment, the first data processor 411 sends a trigger pulse signal to the image acquisition module 420 at fixed times according to a preset image frame rate. The pulse output interface of the first data processor 411 is connected to the external trigger interface of the image acquisition module 420. On receiving the trigger pulse signal, the image acquisition module 420 starts image frame exposure and simultaneously outputs an exposure pulse signal on the external trigger interface; this interface is connected to the pulse input interface of the first data processor 411, so that the first data processor 411 can recognize the signal. The image acquisition module 420 records the exposure completion time as an exposure timestamp and sends it to the first data processor 411. The image acquisition module 420 supports pulse trigger input and exposure pulse signal output. The data output interface of the image acquisition module 420 is connected to the data input interface of the second data processor 412, through which the second data processor 412 reads the image frame data and records the time of receiving the image frame as a receive timestamp.
The second data processor 412 and the first data processor 411 are directly connected through a communication interface, over which the second data processor 412 receives the trigger timestamp, the exposure timestamp and so on sent by the first data processor 411. The second data processor 412 determines a synchronization timestamp for the currently received image frame based on the trigger timestamp, the exposure timestamp and the receive timestamp.
Alternatively, in the systems shown in fig. 4A and fig. 4B, the operation of "determining a synchronization timestamp for the currently received image frame according to the trigger timestamp, the exposure timestamp and the receive timestamp", as performed by the data processor 410 and the second data processor 412 respectively, includes: determining the image exposure duration corresponding to the currently received image frame according to the currently received trigger timestamp and exposure timestamp; determining whether the currently received image frame was exposed normally by comparing the image exposure duration with the preset exposure duration; if it was exposed normally, determining the image transmission duration corresponding to the currently received image frame according to the receive timestamp and the exposure timestamp, and determining a synchronization timestamp for the currently received image frame by comparing the image transmission duration with the preset transmission duration; and if its exposure was abnormal, determining the image acquisition duration corresponding to the currently received image frame according to the receive timestamp and the trigger timestamp, and determining a synchronization timestamp for the currently received image frame by comparing the image acquisition duration with the preset acquisition duration.
Next, a method for determining a timestamp of an image frame provided in an embodiment of the present application is described. Fig. 1 is a flowchart of a method for determining a timestamp of an image frame, according to an embodiment, as shown in fig. 1, the method including:
s110, determining image exposure duration corresponding to the currently received image frame according to the currently received trigger timestamp and exposure timestamp;
s120, determining whether the currently received image frame is normally exposed or not according to the relative size relation between the image exposure duration and the preset exposure duration;
s130, if the exposure of the currently received image frame is normal, determining the image transmission duration corresponding to the currently received image frame according to the receiving timestamp and the exposure timestamp; determining a synchronous timestamp for the currently received image frame according to the relative size relationship between the image transmission duration and the preset transmission duration;
s140, if the exposure of the currently received image frame is abnormal, determining the image acquisition duration corresponding to the currently received image frame according to the receiving timestamp and the triggering timestamp; and determining a synchronous timestamp for the currently received image frame according to the relative size relationship between the image acquisition time length and the preset acquisition time length.
The currently received image frame is the image frame received by the data processor in the current time window. For convenience, it is referred to below as the current image frame.
The current image frame is acquired by the image acquisition module, which sends it to the data processor. The current image frame has not yet been added to the image frame sequence in the data processor; that is, its synchronization timestamp has not yet been determined. The synchronization timestamp represents the actual acquisition time of the current image frame and is an important basis for subsequently fusing the current image frame with sensor data frames.
The trigger timestamp is the time at which a trigger pulse was sent, where the trigger pulse starts the image acquisition module's exposure for image acquisition. The exposure timestamp is generated by the image acquisition module and is the time at which the module completed exposure. Besides the exposure timestamp and the trigger timestamp, the data processor holds a receive timestamp for the current image frame: the time at which the data processor received the current image frame. Because the data processor generates the receive timestamp itself whenever it receives an image frame, the receive timestamp always actually corresponds to the current image frame.
Owing to objective factors such as the electromagnetic environment, network state and system load, image frames and exposure timestamps may be lost. As a result, the trigger timestamp and exposure timestamp currently held by the data processor may not actually correspond to the current image frame.
To guarantee the accuracy of the synchronization timestamp of the current image frame, the data processor must determine whether the currently received trigger timestamp and exposure timestamp actually correspond to the current image frame; that is, it must determine the transmission condition of the image frame and the exposure timestamp, for example whether either was lost, and then determine a synchronization timestamp for the current image frame according to that transmission condition.
The transmission condition of the image frames, trigger timestamps and exposure timestamps can be determined by the data processor from the trigger timestamp, the exposure timestamp and the receive timestamp. First, the data processor determines the image exposure duration corresponding to the currently received image frame from the currently received trigger timestamp and exposure timestamp, so as to decide whether the current image frame was exposed normally.
The moment at which the image acquisition module starts exposure can be determined from the trigger timestamp, and the moment at which it completes exposure from the exposure timestamp. Between the start and the completion of exposure, the module also needs to prepare for exposure, for example by waking up and starting the associated components. The image exposure duration quantifies the length of time from the start of exposure to its completion.
The image exposure duration depends on the wake-up time of the image acquisition module, the time of the input pulse, and the time the module takes to scan a line of pixels. For a given image acquisition module these are fixed, so its image exposure duration is likewise fixed; the exposure duration required for normal exposure of an image frame is taken as the reference exposure duration. Whether the current image frame was exposed normally can therefore be judged against the module's reference exposure duration. Abnormal transmission of the exposure timestamp or of the image frame, such as packet loss, can make the measured image exposure duration exceed the reference exposure duration.
The image exposure duration corresponding to the current image frame is determined from its trigger timestamp and exposure timestamp; optionally, the relative time interval between the exposure timestamp and the trigger timestamp is calculated and taken as the image exposure duration of the current image frame.
The preset exposure duration is related to the exposure wake-up parameters of the image acquisition module and may be determined from the module's reference exposure duration, which is not limited here. The preset exposure duration is typically on the order of milliseconds and is used to judge whether the current image frame was exposed normally.
If the image exposure duration is less than or equal to the preset exposure duration, the current image frame was exposed normally. In that case, the image transmission duration corresponding to the currently received image frame is determined from the receive timestamp and the exposure timestamp; whether the exposure timestamp actually corresponds to the current image frame is then determined by comparing the image transmission duration with a preset transmission duration, and a synchronization timestamp is determined for the currently received image frame based on the exposure timestamp.
The image transmission duration is the length of time from the moment the image acquisition module completes exposure to the moment the data processing module receives the current image frame. When the network state is stable, the image transmission duration of the current image frame is likewise fixed; the transmission duration required for normal transmission of an image frame is taken as the reference transmission duration. The preset transmission duration may be determined from the reference transmission duration, which is not limited here, and is typically on the order of milliseconds. It is used to judge whether the exposure timestamp actually corresponds to the current image frame, and hence whether the exposure timestamp can be used directly to determine the synchronization timestamp for the current image frame.
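This excerpt states that S130 compares the image transmission duration with the preset transmission duration but leaves the exact assignment rule to later embodiments. The sketch below is therefore one plausible reading, with the function name, parameter names and the fallback policy all being assumptions:

```python
def sync_timestamp_normal_exposure(receive_ts: float, exposure_ts: float,
                                   preset_transmission: float,
                                   fallback_ts: float) -> float:
    """S130 (normal exposure): if the frame arrived within the preset
    transmission duration, the exposure timestamp is taken to correspond
    to this frame and is used as the synchronization timestamp; otherwise
    a caller-supplied fallback is used. The fallback policy (e.g. a value
    derived from the receive timestamp) is an assumption, not stated here."""
    image_transmission_duration = receive_ts - exposure_ts
    if image_transmission_duration <= preset_transmission:
        return exposure_ts
    return fallback_ts
```

With a 20 ms preset transmission duration, a frame received 10 ms after exposure completion keeps its exposure timestamp as the synchronization timestamp, while a frame arriving half a second late falls back to the caller's substitute value.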
Correspondingly, if the image exposure duration is greater than the preset exposure duration, the exposure of the current image frame is abnormal and its exposure timestamp cannot be used to determine the synchronization timestamp. In that case, the image acquisition duration corresponding to the current image frame is determined according to the reception timestamp and the trigger timestamp of the current image frame, and a synchronization timestamp is determined for the currently received image frame according to the relative size relationship between the image acquisition duration and the preset acquisition duration.
The image acquisition duration measures the length of time from the start of exposure by the image acquisition module to the moment the data processor receives the current image frame. Optionally, the relative time interval between the reception timestamp and the trigger timestamp is determined as the image acquisition duration. The image acquisition duration includes at least the image exposure duration and the image transmission duration. The preset acquisition duration may be determined according to the reference exposure duration and the reference transmission duration.
Based on the relative size relationship between the image acquisition duration and the preset acquisition duration, it may be further determined whether the trigger timestamp actually corresponds to the current image frame, based on which a synchronization timestamp is determined for the current image frame.
It should be noted that step 130 and step 140 are in a parallel relationship and have no sequential logic relationship. In practical application, one of them is executed according to the exposure condition of the current image frame.
According to the technical scheme of the embodiment of the application, whether the current image frame is normally exposed is determined according to the trigger timestamp and the exposure timestamp of the current image frame, and different determination modes are selected to determine the synchronization timestamp for the current image frame based on its exposure condition. This provides a fault-tolerance mechanism for determining the synchronization timestamp: even when timestamp information or image frame information is lost, an accurate synchronization timestamp can still be determined for the current image frame, which effectively improves the accuracy of the synchronization timestamp and avoids situations where data fusion cannot be performed, or produces wrong results, due to an inaccurate synchronization timestamp.
In an alternative embodiment, after determining the synchronization timestamp for the currently received image frame, the timestamp determination method further comprises: determining an offset address of the currently received image frame according to the start address of the currently received image frame and the data size of the currently received image frame; and writing the synchronization timestamp to the currently received image frame based on the offset address.
The start address refers to the start address of the current image frame in the image frame sequence. The data size of the current image frame refers to the size of the storage space the current image frame requires. The offset address of the current image frame is determined according to its start address and data size; optionally, the data size of the current image frame is added to its start address to obtain the offset address. For example, the offset address of the current image frame may be determined from offset = x + y, where x represents the start address of the current image frame, y represents its data size, and offset represents its offset address.
Based on the offset address, a synchronization timestamp is written to the current image frame. Optionally, a synchronization timestamp is written at the offset address of the current image frame.
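The offset computation and write described above (offset = x + y, then write at offset) can be sketched as follows; the 8-byte little-endian layout and microsecond units are assumptions for illustration only, not specified by the text:

```python
# Sketch of computing the timestamp write position: offset = start address +
# frame data size, then the synchronization timestamp is written at that offset.
# The 8-byte little-endian layout is an assumption for illustration only.
import struct

def write_sync_timestamp(buffer: bytearray, start: int, frame_size: int,
                         sync_ts_us: int) -> int:
    """Write an 8-byte little-endian timestamp at offset = start + frame_size.

    Returns the offset so it can be reused when reading the timestamp back."""
    offset = start + frame_size           # offset = x + y, as in the text
    buffer[offset:offset + 8] = struct.pack("<Q", sync_ts_us)
    return offset

buf = bytearray(64)
off = write_sync_timestamp(buf, start=0, frame_size=16, sync_ts_us=1_650_000_123)
```

Because the offset is fixed by the start address and frame size, the timestamp can later be read back from the same position without scanning the frame.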
According to the technical scheme, the offset address of the current image frame is determined according to the starting address and the data size of the current image frame, and the synchronous timestamp is written into the current image frame based on the offset address. In the case that the current data frame is determined, the data size of the current data frame is determined, and based on the start address and the data size of the current image frame, the writing position of the synchronization timestamp in the current image frame can be determined, so that the synchronization timestamp can be subsequently and quickly read from the current image frame.
Example two
Fig. 2 is a flowchart of a time stamp determination method of an image frame provided according to the second embodiment. The present embodiment is further optimized based on the above embodiments, specifically, in a case that the exposure of the current image frame is normal, the operation "determining the synchronization timestamp for the currently received image frame according to the relative size relationship between the image transmission duration and the preset transmission duration" is refined.
As shown in fig. 2, the method includes:
s210, determining the image exposure duration corresponding to the currently received image frame according to the currently received trigger timestamp and the exposure timestamp.
S220, determining whether the currently received image frame is normally exposed or not according to the relative size relation between the image exposure duration and the preset exposure duration.
And S230, if the currently received image frame is normally exposed, determining the image transmission duration corresponding to the currently received image frame according to the receiving timestamp and the exposure timestamp.
And S240, if the exposure timestamp is earlier than the receiving timestamp and the image transmission duration is less than or equal to the preset transmission duration, determining the currently received exposure timestamp as the synchronous timestamp of the currently received image frame.
It is understood that the current image frame received by the data processor is acquired by the image acquisition module, so theoretically the exposure timestamp should be earlier than the reception timestamp. However, the exposure timestamp being earlier than the reception timestamp is not by itself sufficient to conclude that the exposure timestamp actually corresponds to the current image frame; the relative size relationship between the image transmission duration and the preset transmission duration must also be examined.
In the case that the exposure timestamp is earlier than the receive timestamp and the image transmission duration is less than or equal to the preset transmission duration, it may be determined that the exposure timestamp actually corresponds to the current image frame, and the exposure timestamp may be used to determine a synchronization timestamp for the current image frame. In particular, the exposure time stamp may be determined as a synchronization time stamp for the current image frame.
And S250, if the exposure timestamp is earlier than the receiving timestamp and the image transmission duration is longer than the preset transmission duration, determining the synchronous timestamp of the currently received image frame according to the receiving timestamp.
When the exposure timestamp is earlier than the reception timestamp but the image transmission duration is longer than the preset transmission duration, the exposure timestamp does not actually correspond to the current image frame and cannot be used to determine its synchronization timestamp. Since the exposure was judged normal from the trigger timestamp and the exposure timestamp, the trigger timestamp cannot be used to determine the synchronization timestamp for the current image frame either. In this case, the synchronization timestamp must be determined from the reception timestamp: the reception timestamp is generated by the data processor at the moment it receives the current image frame, involves no data transmission and hence no packet loss, and therefore always actually corresponds to the current image frame. Thus, when the exposure timestamp is unavailable, the synchronization timestamp may still be determined based on the reception timestamp.
In an alternative embodiment, determining a synchronization timestamp for the currently received image frame according to the reception timestamp comprises: defining the synchronization timestamp of the currently received image frame as t and the reception timestamp as t3, t is calculated by the following formula:

t = t3 - tλ

where tλ is the time from the end of exposure to the completion of image output by the image acquisition module.
It can be seen that, when data transmission is normal, i.e. there is no packet loss, the exposure timestamp is preferentially used as the synchronization timestamp of the current image frame, and the relative time interval between the exposure timestamp and the reception timestamp is the image transmission duration. Based on this, the exposure timestamp of the current image frame may be deduced backwards from the reception timestamp: optionally, the preset transmission duration is subtracted from the reception timestamp of the current image frame to determine the synchronization timestamp; optionally, tλ is determined according to the preset transmission duration.
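A minimal sketch of this fallback, assuming tλ is approximated by a hypothetical preset transmission duration in milliseconds:

```python
# Sketch of falling back to the reception timestamp when the exposure
# timestamp is unusable: t = t3 - t_lambda. Here t_lambda is taken as a
# hypothetical preset transmission duration, as the text suggests optionally.
T_LAMBDA_MS = 10.0  # hypothetical: time from end of exposure to image output

def sync_ts_from_receive(receive_ts_ms: float) -> float:
    """Back-derive the synchronization timestamp from the reception timestamp."""
    return receive_ts_ms - T_LAMBDA_MS
```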
And S260, if the exposure time stamp is later than the receiving time stamp, determining the currently received image frame as the last received image frame, and re-determining the synchronous time stamp of the last received image frame according to the trigger time stamp, the exposure time stamp and the receiving time stamp which are received last time.
As above, the exposure time stamp actually corresponding to the current image frame theoretically needs to be earlier than the reception time stamp. And if the exposure time stamp is later than the receiving time stamp, determining the currently received image frame as the last received image frame. And re-determining the synchronous time stamp of the image frame received last time according to the trigger time stamp, the exposure time stamp and the receiving time stamp received last time.
It is noted that steps 240 to 260 all correspond to the case where the current image frame is normally exposed. Step 240 corresponds to the case where neither the timestamp information nor the image frame is lost; step 250 corresponds to the case where both the exposure timestamp and the trigger timestamp actually corresponding to the current image frame are lost; step 260 corresponds to the case where the image frame is lost. That is, steps 240 to 260 are in a parallel relationship and have no sequential logic relationship; they are shown in sequence only for convenience, and in practical application one of them is executed according to the data loss condition.
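The three parallel branches above (S240/S250/S260) can be sketched as a single dispatch function; timestamps are floats in milliseconds and the preset values are hypothetical:

```python
# Sketch of the normal-exposure dispatch among steps S240/S250/S260.
# Preset values are hypothetical, chosen only for illustration.
PRESET_TRANSMIT_MS = 10.0  # hypothetical preset transmission duration
T_LAMBDA_MS = 10.0         # hypothetical end-of-exposure-to-output time

def sync_ts_normal_exposure(exposure_ts, receive_ts):
    """Return (source, value) for a normally exposed frame.

    ('previous_frame', None) means the frame must be re-attributed to the
    previously received frame and its timestamps (step S260)."""
    if exposure_ts > receive_ts:                      # S260: image frame was lost
        return ('previous_frame', None)
    if receive_ts - exposure_ts <= PRESET_TRANSMIT_MS:
        return ('exposure', exposure_ts)              # S240: use exposure timestamp
    return ('receive', receive_ts - T_LAMBDA_MS)      # S250: back-derive from receive
```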
According to the technical scheme, on the premise that the currently received image frame is determined to belong to normal exposure according to the exposure timestamp and the trigger timestamp, the synchronous timestamp is determined for the currently received image frame according to the front-back relation between the exposure timestamp and the receiving timestamp and the image transmission duration. The accuracy of the synchronous timestamp is guaranteed, the fault tolerance of the synchronous timestamp determining method is improved, and the synchronous timestamp can still be determined for the currently received image frame under the condition that the exposure timestamp does not actually correspond to the currently received image frame.
EXAMPLE III
Fig. 3 is a flowchart of a time stamp determination method for an image frame according to a third embodiment. The present embodiment is further optimized on the basis of the above embodiment, specifically, in the case that it is determined that the currently received image frame is abnormal in exposure according to the exposure timestamp and the trigger timestamp, the operation "determining a synchronization timestamp for the currently received image frame according to the relative size relationship between the image acquisition duration and the preset acquisition duration" is refined.
As shown in fig. 3, the method includes:
s310, determining the image exposure duration corresponding to the currently received image frame according to the currently received trigger timestamp and the exposure timestamp.
S320, determining whether the currently received image frame is normally exposed or not according to the relative size relation between the image exposure duration and the preset exposure duration.
S330, if the exposure of the currently received image frame is abnormal, determining the image acquisition duration corresponding to the currently received image frame according to the receiving time stamp and the triggering time stamp.
The trigger timestamp and the exposure timestamp are the key timestamps for determining whether an image frame is normally exposed. In case of an exposure anomaly, it must further be made explicit which of the exposure timestamp and the trigger timestamp is missing.
S340, if the triggering time stamp is earlier than the receiving time stamp and the image acquisition time length is less than the preset acquisition time length, determining the synchronous time stamp of the currently received image frame according to the currently received triggering time stamp, the receiving time stamp and the exposure time stamp.
It can be understood that, for the trigger timestamp to actually correspond to the current image frame, it theoretically must be earlier than the reception timestamp, and the image acquisition duration determined by the trigger timestamp and the reception timestamp must be less than the preset acquisition duration. That is, only when the trigger timestamp is earlier than the reception timestamp and the image acquisition duration is shorter than the preset acquisition duration may the trigger timestamp actually correspond to the current image frame; whether it in fact does must be further determined from the relative size relationships among the currently received trigger timestamp, exposure timestamp, and reception timestamp.
In an alternative embodiment, determining a synchronization timestamp for a currently received image frame from a currently received trigger timestamp, a receive timestamp, and an exposure timestamp comprises:
if the exposure timestamp is earlier than the trigger timestamp, defining the synchronization timestamp of the currently received image frame as t and the currently received trigger timestamp as t1, t is calculated by the following formula:

t = t1 + (tζ × tε + tδ)

where tζ is the exposure wake-up parameter of the image acquisition module; tε is the duration of one pulse input to the image acquisition module; tδ is the time for the image acquisition module to scan one line of pixels;
otherwise, the currently received image frame is determined as the last received image frame, and the synchronous time stamp of the last received image frame is determined again according to the trigger time stamp, the exposure time stamp and the receiving time stamp of the last received image frame.
It will be appreciated that where the exposure timestamp and the trigger timestamp both actually correspond to the currently received image frame, the exposure timestamp should be later than the trigger timestamp. If the exposure timestamp is earlier than the trigger timestamp, it may be determined that the exposure anomaly is caused by a lost exposure timestamp; the currently received trigger timestamp then actually corresponds to the currently received image frame and may be used to determine its synchronization timestamp. Optionally, the preset exposure duration is superimposed on the trigger timestamp to determine the synchronization timestamp; optionally, tζ × tε + tδ represents the preset exposure duration. Here tζ is related to the exposure wake-up parameter of the image acquisition module and is a positive integer; the exposure wake-up parameter depends on the image acquisition module and may differ between modules, which is not limited herein. Illustratively, the exposure wake-up parameter may be 61396. tε is the duration of one pulse input to the image acquisition module. tδ is the time required for the image acquisition module to scan one line of pixels, and may be determined by tδ = row × tPCLK, where row is the number of pixels in one line of the current image frame and tPCLK is the time required for the image acquisition module to scan one pixel. For ease of expression, the above case is denoted case 341.
Under the branch of step S340, a case 342 opposite to the case 341 is also included. Case 342 is:
Otherwise, when the image exposure is abnormal, the image acquisition duration is shorter than the preset acquisition duration, and the exposure timestamp is later than the trigger timestamp, the anomaly may be caused by the simultaneous loss of the trigger timestamp and the image frame. In this case, the currently received image frame is determined as the last received image frame, and the synchronization timestamp of the last received image frame is re-determined according to the last received trigger timestamp, exposure timestamp, and reception timestamp.
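The case-341 formula t = t1 + (tζ × tε + tδ), with tδ = row × tPCLK, can be sketched as follows; apart from the example wake-up parameter 61396 quoted in the text, all numeric values are hypothetical:

```python
# Sketch of t = t1 + (t_zeta * t_eps + t_delta), with t_delta = row * t_pclk.
# Only T_ZETA's example value (61396) comes from the text; the other numbers
# are hypothetical, chosen just to make the formula concrete.
T_ZETA = 61396        # exposure wake-up parameter (example from the text)
T_EPS_US = 0.01       # hypothetical duration of one input pulse, microseconds
ROW_PIXELS = 1920     # hypothetical number of pixels in one line
T_PCLK_US = 0.02      # hypothetical time to scan one pixel, microseconds

def sync_ts_from_trigger(trigger_ts_us: float) -> float:
    """Superimpose the preset exposure duration on the trigger timestamp."""
    t_delta = ROW_PIXELS * T_PCLK_US          # time to scan one line of pixels
    return trigger_ts_us + (T_ZETA * T_EPS_US + t_delta)
```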
And S350, if the triggering time stamp is earlier than the receiving time stamp and the image acquisition time length is more than or equal to the preset acquisition time length, determining the synchronous time stamp of the currently received image frame according to the receiving time stamp and the exposure time stamp.
When the trigger timestamp is earlier than the reception timestamp but the image acquisition duration is greater than or equal to the preset acquisition duration, the trigger timestamp does not actually correspond to the current image frame and cannot be used to determine its synchronization timestamp. In this case, a synchronization timestamp may be determined for the currently received image frame from the reception timestamp and the exposure timestamp; specifically, whether the exposure timestamp actually corresponds to the currently received image frame is determined from the relative relationship between the reception timestamp and the exposure timestamp.
In an alternative embodiment, determining a synchronization timestamp for the currently received image frame according to the reception timestamp and the exposure timestamp comprises: if the exposure timestamp is earlier than the reception timestamp and the image transmission duration is less than or equal to the preset transmission duration, determining the currently received exposure timestamp as the synchronization timestamp of the currently received image frame; otherwise, defining the synchronization timestamp of the currently received image frame as t and the reception timestamp as t3, t is calculated by the following formula:

t = t3 - tλ

where tλ is the time from the end of exposure to the completion of image output by the image acquisition module.
It can be seen that the exposure timestamp may actually correspond to the current image frame only if it is earlier than the reception timestamp, and whether it in fact does must be determined in combination with the image transmission duration. Specifically, when the exposure timestamp is earlier than the reception timestamp and the image transmission duration is less than or equal to the preset transmission duration, it may be determined that the currently received exposure timestamp actually corresponds to the current image frame and may be used to determine its synchronization timestamp; specifically, the currently received exposure timestamp is determined as the synchronization timestamp of the currently received image frame. For ease of expression, this case is denoted case 351.
The branch of step S350 also includes a case 352, opposite to case 351. Case 352 is: otherwise, the currently received exposure timestamp does not actually correspond to the current image frame, i.e. neither the currently received exposure timestamp nor the trigger timestamp can be used to determine the synchronization timestamp for the current image frame. Since the reception timestamp always actually corresponds to the current image frame, a synchronization timestamp may be determined for the current image frame based on it. Based on this, the exposure timestamp of the current image frame may be deduced backwards from the reception timestamp: optionally, the preset transmission duration is subtracted from the reception timestamp of the current image frame to determine the synchronization timestamp; optionally, tλ is determined according to the preset transmission duration, where tλ is the time from the end of exposure to the completion of image output by the image acquisition module.
And S360, if the triggering time stamp is later than the receiving time stamp, determining the currently received image frame as the last received image frame, and re-determining the synchronous time stamp of the last received image frame according to the last received triggering time stamp, the last received exposure time stamp and the last received receiving time stamp.
The trigger timestamp being later than the reception timestamp indicates that the trigger timestamp does not actually correspond to the current image frame. The currently received image frame is therefore determined as the last received image frame, and the synchronization timestamp of the last received image frame is re-determined according to the last received trigger timestamp, exposure timestamp, and reception timestamp.
It should be noted that steps 340 to 360 are parallel, and have no sequential logic relationship, and in the actual application process, one of the steps is selected to be executed according to the data loss condition.
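Steps S340 to S360, including cases 341/342 and 351/352, can be sketched as one dispatch function; all preset values are hypothetical, and the trigger-based result approximates the superimposed preset exposure duration with a single constant:

```python
# Sketch of the abnormal-exposure dispatch among steps S340/S350/S360.
# All preset values below are hypothetical, chosen only for illustration.
PRESET_ACQUIRE_MS = 40.0    # hypothetical preset acquisition duration
PRESET_TRANSMIT_MS = 10.0   # hypothetical preset transmission duration
PRESET_EXPOSURE_MS = 30.0   # hypothetical preset exposure duration
T_LAMBDA_MS = 10.0          # hypothetical end-of-exposure-to-output time

def sync_ts_abnormal_exposure(trigger_ts, exposure_ts, receive_ts):
    """Return (source, value) for an abnormally exposed frame.

    ('previous_frame', None) means the frame is re-attributed to the
    previously received frame and its timestamps."""
    if trigger_ts > receive_ts:                          # S360: frame was lost
        return ('previous_frame', None)
    if receive_ts - trigger_ts < PRESET_ACQUIRE_MS:      # S340
        if exposure_ts < trigger_ts:                     # case 341: exposure ts lost
            return ('trigger', trigger_ts + PRESET_EXPOSURE_MS)
        return ('previous_frame', None)                  # case 342
    # S350: trigger timestamp unusable; fall back to exposure/reception
    if exposure_ts < receive_ts and receive_ts - exposure_ts <= PRESET_TRANSMIT_MS:
        return ('exposure', exposure_ts)                 # case 351
    return ('receive', receive_ts - T_LAMBDA_MS)         # case 352
```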
According to the technical scheme, when the currently received image frame is determined to belong to abnormal exposure according to the exposure timestamp and the trigger timestamp, a synchronization timestamp is determined for the currently received image frame according to the chronological relationship between the trigger timestamp and the reception timestamp, in combination with the image acquisition duration. This provides a synchronization-timestamp determination method applicable when the current image frame is abnormally exposed, supplies a fault-tolerance mechanism for determining the synchronization timestamp, guarantees its accuracy, and improves the robustness of the method.
It can be seen that, due to objective factors such as the electromagnetic environment, network state, and system load, packet loss may occur for the image frame, the trigger timestamp, and the exposure timestamp. Thus the exposure timestamp and trigger timestamp received by the data processor within the current time window may not actually correspond to the image frame received within that window. Determining the synchronization timestamp for the currently received image frame using the timestamp information that actually corresponds to it ensures the accuracy of the synchronization timestamp. The second and third embodiments describe methods of determining synchronization timestamps for image frames under different data-loss conditions. For ease of understanding, these methods are further illustrated below, taking the cases where the image frame, the trigger timestamp, and the exposure timestamp are lost in different combinations as examples.
Assume that the image acquisition module has sequentially generated 2 image frames (denoted image 1 and image 2), so that the data processor holds 6 timestamps: the trigger timestamp of image 1 (denoted 1t1), the exposure timestamp of image 1 (1t2), the reception timestamp corresponding to image 1 (1t3), the trigger timestamp of image 2 (2t1), the exposure timestamp of image 2 (2t2), and the reception timestamp corresponding to image 2 (2t3). The preset exposure duration, preset transmission duration, and preset acquisition duration are all set according to the normal working condition of the corresponding system, and are all greater than 0. When neither the timestamp information nor the image frames are lost, the data received by the data processor in the two rounds is (1t1, 1t2, 1t3, image 1) and (2t1, 2t2, 2t3, image 2), respectively. In addition, the 6 timestamps have the following constraint relation in chronological order:
1t1<1t2<1t3<2t1<2t2<2t3
based on the timestamp and/or the loss of the image frame, the data processor may have the following reception events:
case 1: when no packet loss occurs, the latest data received by the data processor is as follows: 2t of1,2t2,2t3Image 2;
case 2: when the image 2 loses the packet, the latest received data by the data processor is as follows: 2t of1,2t2,1t3Image 1;
case 3:2t of1And (3) packet loss, wherein the latest received data by the data processor are as follows: 1t1,2t2,2t3Image 2;
case 4:2t of2And (3) packet loss, wherein the latest received data by the data processor is as follows: 2t of1,1t2,2t3Image 2;
case 5:2t of1And image 2 loses packet, the latest received data of the data processor are: 1t of1,2t2,1t3Image 1;
case 6:2t of2And image 2 loses packet, the data that the data processor receives latest are: 2t of1,1t2,1t3Image 1;
case 7:2t of1And 2t2And (3) packet loss, wherein the latest received data by the data processor is as follows: 1t1,1t2,2t3Image 2.
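The seven cases can be reproduced by a small simulation in which each slot (trigger, exposure, reception, image) simply retains its most recently received value; the numeric timestamps 1–6 stand in for 1t1 … 2t3 and are hypothetical placeholders satisfying the chronological constraint:

```python
# Sketch reproducing the reception cases: each field (trigger, exposure,
# reception, frame) independently keeps its most recently received value, so
# a lost packet leaves the previous frame's value in that slot. Losing the
# image also loses its reception timestamp, since 2t3 is generated only on
# actual reception.
FRAME1 = {'t1': 1, 't2': 2, 't3': 3, 'img': 'image 1'}
FRAME2 = {'t1': 4, 't2': 5, 't3': 6, 'img': 'image 2'}  # 1t1<1t2<1t3<2t1<2t2<2t3

def latest_after_loss(lost: set) -> tuple:
    """Return the (trigger, exposure, reception, image) tuple the data
    processor holds after frame 2's packets named in `lost` fail to arrive."""
    return tuple(FRAME1[k] if k in lost else FRAME2[k]
                 for k in ('t1', 't2', 't3', 'img'))
```

For instance, case 2 (image 2 lost) corresponds to `lost={'t3', 'img'}` and yields (2t1, 2t2, 1t3, image 1).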
The following specific analyses were carried out for various situations in conjunction with examples 2 and 3:
case 1: 2t since no data packet is lost1,2t2And 2t3Are all harmonized with the drawingLike 2 corresponds, i.e. 0<2t2-2t1Less than or equal to the preset exposure time, 0<2t3-2t2The preset transmission time is less than or equal to the preset transmission time. Wherein, 2t2-2t1Representing the image exposure duration, 2t2-2t1The exposure time is less than or equal to the preset exposure time, the image exposure time is less than or equal to the preset exposure time, and the exposure of the currently received image frame is normal; 2t of3-2t2Representing the image transmission duration, 2t3-2t2Less than or equal to the preset transmission time length, the image transmission time length is less than or equal to the preset transmission time length, 0<2t2-2t1And 0<2t3-2t2Respectively, that the currently received trigger timestamp is earlier than the exposure timestamp, which is earlier than the receive timestamp. In this case, the exposure time stamp (2 t)2) Determined as the synchronization timestamp of image 2. Corresponding to the branch of step S240 in the second embodiment.
Case 2: due to the loss of the image 2, the latest data received by the data processor is (2 t)1,2t2,1t3In image 1), the timestamp corresponding to image 1 is only 1t3Cannot pass through 2t1,2t2The synchronization timestamp of image 1 is determined. Thus, the trigger timestamp (1 t) may be based on the last receipt1) Exposure time stamp (1 t)2) And a reception time stamp (1 t)3) The synchronization timestamp of image 1 is re-determined. In the second and third embodiments, the expression 0 < 2t2-2t1Preset exposure time of 2t or less2-2t1Representing the image exposure duration, 2t2-2t1The exposure time is less than or equal to the preset exposure time, the image exposure time is less than or equal to the preset exposure time, and the currently received image frame is normally exposed; 1t3-2t2< 0, wherein 1t3-2t2< 0 indicates a currently received reception timestamp 1t3Earlier than the currently received exposure timestamp 2t2. Corresponding to the branch of step S260 in the second embodiment.
Case 3: due to 2t1The packet loss is (1 t) as the latest data received by the data processor1,2t2,2t3In image 2), in pair with image 2The corresponding time stamp has 2t2And 2t3. Thus, the exposure time can be stamped (2 t)2) Determined as the synchronization timestamp of image 2. In the second and third embodiments, the second embodiment is 2t2-1t1Preset exposure time, 2t3-1t1The preset acquisition time is more than or equal to 0 t and less than 2t3-2t2Less than or equal to the preset transmission time, wherein, 2t2-1t1Representing the image exposure duration, 2t2-1t1If the exposure time is longer than the preset exposure time, the image acquisition time is longer than the preset exposure time, and the exposure of the currently received image frame is abnormal; 2t of3-1t1Representing the image acquisition duration, 2t3-1t1The image acquisition time is greater than or equal to the preset acquisition time; 0 < 2t3-2t2Indicating that the exposure timestamp is earlier than the receive timestamp; 2t of3-2t2Representing the image transmission duration, 2t3-2t2The preset transmission time is less than or equal to the preset transmission time, and the image transmission time is less than or equal to the preset transmission time. Corresponding to the branch of step S350 in the third embodiment. Specifically, the case 351 in the branching of step S350 is corresponded. Specifically, case 351 is: if the exposure timestamp is earlier than the receiving timestamp and the image transmission duration is less than or equal to the preset transmission duration, the exposure timestamp can be determined to actually correspond to the currently received image frame, the abnormal exposure is caused by the loss of the triggering timestamp, and the exposure timestamp is used as a synchronous timestamp of the currently received image frame;
case 4: due to 2t2The packet loss is (2 t) as the latest data received by the data processor1,1t2,2t3Image 2), the time stamp corresponding to image 2 is 2t1And 2t3. Thus, the time stamp (2 t) may be based on the trigger1) The synchronization timestamp of image 2 is determined. In the second and third embodiments, however, |1t is reflected2-2t1| > Preset Exposure duration, 1t2-2t1<0,0<2t3-2t1< preset acquisition time, 1t2<2t3It is worth noting that |1t2-2t1I represents the image exposure duration, |1t2-2t1If the image exposure time is longer than the preset exposure time, and the current image frame is abnormally exposed. 1t2-2t1< 0 denotes an exposure time stamp 1t2Earlier than trigger timestamp 2t1。2t3-2t1Representing the image acquisition duration, 2t3-2t1The image acquisition time is less than the preset acquisition time; 0 < 2t3-2t1A trigger timestamp 2t indicating the current reception1Earlier than the reception timestamp 2t3This corresponds to the branch of step S340 in the third embodiment. Specifically, the case 341 is a case where the step S340 branches. Specifically, case 341 is: if the exposure timestamp is earlier than the trigger timestamp, it may be determined that the exposure anomaly is caused by the loss of the exposure timestamp, in which case the currently received trigger timestamp actually corresponds to the currently received image frame. A synchronization timestamp for a currently received image frame may be determined from a currently received trigger timestamp.
Case 5: because 2t1 and image 2 are lost, the latest data received by the data processor is (1t1, 2t2, 1t3, image 1), and the timestamps corresponding to image 1 are 1t1 and 1t3; the synchronization timestamp of image 1 cannot be determined through 2t2. The synchronization timestamp of image 1 is therefore re-determined based on the last received trigger timestamp (1t1), exposure timestamp (1t2) and receiving timestamp (1t3). In the second and third embodiments this corresponds to: 2t2-1t1 > the preset exposure duration, 0 < 1t3-1t1 < the preset acquisition duration, and 1t1 < 2t2. Here 2t2-1t1 represents the image exposure duration; its exceeding the preset exposure duration indicates that the currently received image frame is abnormally exposed. 1t3-1t1 represents the image acquisition duration, which is less than the preset acquisition duration, and 1t1 < 2t2 means that the currently received trigger timestamp 1t1 is earlier than the currently received exposure timestamp 2t2. This corresponds to the branch of step S340 in the third embodiment, specifically case 342 of that branch.
Case 6: because 2t2 and image 2 are lost, the latest data received by the data processor is (2t1, 1t2, 1t3, image 1), and the timestamps corresponding to image 1 are 1t2 and 1t3. The synchronization timestamp of image 1 is therefore re-determined based on the last received trigger timestamp (1t1), exposure timestamp (1t2) and receiving timestamp (1t3). In the second and third embodiments this corresponds to: 1t2-2t1 < 0 and 1t3-2t1 < 0, where 1t2-2t1 < 0 indicates that the currently received exposure timestamp is earlier than the currently received trigger timestamp, and 1t3-2t1 < 0 indicates that the receiving timestamp is earlier than the currently received trigger timestamp. This corresponds to the branch of step S360 in the third embodiment.
Case 7: because 2t1 and 2t2 are lost, the latest data received by the data processor is (1t1, 1t2, 2t3, image 2), and the only timestamp corresponding to image 2 is 2t3; the synchronization timestamp of image 2 cannot be determined through 1t1 and 1t2. The synchronization timestamp of image 2 is therefore determined based on the receiving timestamp (2t3). In the second and third embodiments this corresponds to: 0 < 1t2-1t1 ≤ the preset exposure duration and 2t3-1t2 > the preset transmission duration. Here 1t2-1t1 represents the image exposure duration, which is less than or equal to the preset exposure duration; 0 < 1t2-1t1 indicates that the exposure timestamp is later than the trigger timestamp; 2t3-1t2 represents the image transmission duration, which exceeds the preset transmission duration while the exposure timestamp remains earlier than the receiving timestamp. This corresponds to the branch of step S250 in the second embodiment.
It should be noted that the above scenarios are merely hypothetical and are used to illustrate how the synchronization timestamp of an image frame is determined when various timestamps and/or image frames are lost. They are not intended to limit the scope of the present application, and the determination method of the present application is not limited to these scenarios. As long as the different judgment conditions of the present application are satisfied, the synchronization timestamp corresponding to an image frame can be determined by the present application.
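The fault-tolerant decision flow walked through in the cases above can be sketched as follows. This is a minimal illustration only, not the patent's actual implementation: every identifier, the parameter grouping, and the convention of returning None when the current timestamps must be re-matched to the last received frame are assumptions made for readability.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Params:
    max_exposure: float      # preset exposure duration
    max_acquisition: float   # preset acquisition duration
    max_transmission: float  # preset transmission duration
    t_lambda: float          # exposure end -> image output complete
    t_zeta: float            # wake-up time of the image acquisition module
    t_eps: float             # input pulse time of the image acquisition module
    t_delta: float           # time to scan one line of pixels

def sync_timestamp(t1: float, t2: float, t3: float, p: Params) -> Optional[float]:
    """t1/t2/t3: trigger, exposure and receiving timestamps of the current frame.

    Returns the synchronization timestamp, or None when the currently received
    frame must be treated as the last received frame and re-determined from the
    last-received timestamps (the re-determination step is omitted here).
    """
    if 0 < t2 - t1 <= p.max_exposure:
        # Normal exposure: branch on the transmission duration (steps S250/S350).
        if t2 > t3:
            return None                    # exposure ts later than receiving ts
        if t3 - t2 <= p.max_transmission:
            return t2                      # exposure timestamp is trustworthy
        return t3 - p.t_lambda             # derive from the receiving timestamp
    # Abnormal exposure: a trigger or exposure timestamp packet was lost.
    if t1 > t3:
        return None                        # trigger ts later than receiving ts
    if t3 - t1 < p.max_acquisition:        # step S340 branch
        if t2 < t1:                        # exposure timestamp lost (case 341)
            return t1 + (p.t_zeta * p.t_eps + p.t_delta)
        return None                        # case 342: re-determine
    # Acquisition duration >= preset: step S350 branch (trigger timestamp lost).
    if t2 < t3 and t3 - t2 <= p.max_transmission:
        return t2                          # case 351: exposure ts still matches
    return t3 - p.t_lambda
```

For example, with illustrative thresholds, case 3 above (trigger timestamp lost, exposure timestamp usable) reaches the final `return t2` branch, while case 4 (exposure timestamp lost) reconstructs the timestamp from the trigger timestamp.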
Example four
Fig. 5 is a schematic structural diagram of an apparatus for determining the timestamp of an image frame according to the fourth embodiment of the present application, which is applicable to determining a synchronization timestamp for the current image frame in a multi-sensor fusion scenario. The apparatus can be implemented by software and/or hardware and can be integrated in a data processor such as an intelligent terminal.
As shown in fig. 5, the apparatus may include: an image exposure duration determination module 510, an exposure condition determination module 520, a first synchronization timestamp determination module 530, and a second synchronization timestamp determination module 540.
An image exposure duration determining module 510, configured to determine an image exposure duration corresponding to a currently received image frame according to a currently received trigger timestamp and an exposure timestamp;
an exposure condition determining module 520, configured to determine whether the currently received image frame is normally exposed according to a relative size relationship between the image exposure duration and a preset exposure duration;
a first synchronization timestamp determining module 530, configured to determine, if the currently received image frame is normally exposed, an image transmission duration corresponding to the currently received image frame according to the receiving timestamp and the exposure timestamp; determining a synchronous timestamp for the currently received image frame according to the relative size relation between the image transmission time and the preset transmission time;
a second synchronous timestamp determining module 540, configured to determine, if exposure of a currently received image frame is abnormal, an image acquisition duration corresponding to the currently received image frame according to the receiving timestamp and the trigger timestamp; and determining a synchronous timestamp for the currently received image frame according to the relative size relationship between the image acquisition time length and the preset acquisition time length.
According to the technical solution of this embodiment of the application, whether the current image frame is normally exposed is determined according to the trigger timestamp and the exposure timestamp of the current image frame, and different determination modes are then selected to determine the synchronization timestamp for the current image frame based on its exposure condition. This provides a fault-tolerance mechanism for determining the synchronization timestamp: even when timestamp information or image frame information is lost, an accurate synchronization timestamp can still be determined for the current image frame using the image frame timestamp determination method, which effectively improves the accuracy of the synchronization timestamp and avoids situations where data fusion cannot be performed, or the data fusion result is wrong, because of an inaccurate synchronization timestamp.
Optionally, the first synchronization timestamp determining module 530 includes: the first synchronous timestamp determining submodule is used for determining the currently received exposure timestamp as the synchronous timestamp of the currently received image frame if the exposure timestamp is earlier than the receiving timestamp and the image transmission duration is less than or equal to the preset transmission duration; the second synchronous timestamp determining submodule is used for determining the synchronous timestamp of the currently received image frame according to the receiving timestamp if the exposure timestamp is earlier than the receiving timestamp and the image transmission duration is longer than the preset transmission duration; and the third synchronous timestamp determining submodule is used for determining the currently received image frame as the last received image frame if the exposure timestamp is later than the receiving timestamp, and re-determining the synchronous timestamp of the last received image frame according to the trigger timestamp, the exposure timestamp and the receiving timestamp which are received last time.
Optionally, the second synchronization timestamp determining sub-module is specifically configured to define the synchronization timestamp of the currently received image frame as t and the receiving timestamp as t3, where t is calculated by the following formula:
t=t3-tλ
where tλ is the time taken by the image acquisition module from the end of exposure to the completion of image output.
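As an illustrative numeric use of this formula: when the exposure timestamp cannot be trusted, the synchronization timestamp is derived from the receiving timestamp t3 minus tλ. Both values below are assumed figures in microseconds, chosen only for the example.

```python
t3 = 2_000_000        # receiving timestamp (assumed, microseconds)
t_lambda = 1_500      # exposure end -> image output complete (assumed calibration)
t = t3 - t_lambda     # synchronization timestamp per the formula t = t3 - t_lambda
```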
Optionally, the second synchronization timestamp determining module 540 includes: a fourth synchronization timestamp determining sub-module, configured to determine a synchronization timestamp of the currently received image frame according to the currently received trigger timestamp, the receiving timestamp, and the exposure timestamp if the trigger timestamp is earlier than the receiving timestamp and the image acquisition duration is shorter than a preset acquisition duration; a fifth synchronous timestamp determining submodule, configured to determine a synchronous timestamp of the currently received image frame according to the receiving timestamp and the exposure timestamp if the triggering timestamp is earlier than the receiving timestamp and the image acquisition duration is greater than or equal to a preset acquisition duration; and the sixth synchronous time stamp determining submodule is used for determining the currently received image frame as the last received image frame if the triggering time stamp is later than the receiving time stamp, and re-determining the synchronous time stamp of the last received image frame according to the last received triggering time stamp, the exposure time stamp and the receiving time stamp.
Optionally, the fourth synchronization timestamp determining sub-module includes: a first synchronization timestamp determining unit, configured to, if the exposure timestamp is earlier than the trigger timestamp, define the synchronization timestamp of the currently received image frame as t and the currently received trigger timestamp as t1, where t is calculated by the following formula:
t=t1+(tζ*tε+tδ)
where tζ is the wake-up time of the image acquisition module, tε is the input pulse time of the image acquisition module, and tδ is the time for the image acquisition module to scan one line of pixels; and a second synchronization timestamp determining unit, configured to, otherwise, determine the currently received image frame as the last received image frame and re-determine the synchronization timestamp of the last received image frame according to the last received trigger timestamp, exposure timestamp and receiving timestamp.
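As an illustrative numeric use of this formula: when the exposure timestamp packet is lost, the exposure start is reconstructed from the trigger timestamp t1 plus the module's start-up delay. All parameter values below are assumed, in microseconds, purely for the example.

```python
t1 = 1_000_000   # currently received trigger timestamp (assumed)
t_zeta = 3       # wake-up time of the image acquisition module (assumed)
t_eps = 10       # input pulse time of the image acquisition module (assumed)
t_delta = 25     # time to scan one line of pixels (assumed)
t = t1 + (t_zeta * t_eps + t_delta)   # per the formula t = t1 + (t_zeta*t_eps + t_delta)
```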
Optionally, the fifth synchronization timestamp determining sub-module includes: a first synchronization timestamp determining unit, configured to determine the currently received exposure timestamp as the synchronization timestamp of the currently received image frame if the exposure timestamp is earlier than the receiving timestamp and the image transmission duration is less than or equal to the preset transmission duration; and a second synchronization timestamp determining unit, configured to, otherwise, define the synchronization timestamp of the currently received image frame as t and the receiving timestamp as t3, where t is calculated by the following formula:
t=t3-tλ
where tλ is the time taken by the image acquisition module from the end of exposure to the completion of image output.
Optionally, the apparatus further comprises: the offset address determining module is used for determining the offset address of the currently received image frame according to the starting address of the currently received image frame and the data size of the currently received image frame after determining the synchronous timestamp for the currently received image frame; and the synchronous time stamp writing module is used for writing the synchronous time stamp into the currently received image frame based on the offset address.
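The offset-address write described above can be sketched as follows. The patent only specifies that the offset is derived from the frame's start address and data size; the 8-byte little-endian field appended at that offset, the microsecond unit, and all names are assumptions for illustration.

```python
import struct

def write_sync_timestamp(buf: bytearray, start: int, data_size: int, ts_us: int) -> None:
    """Write the synchronization timestamp at offset = start address + data size."""
    offset = start + data_size
    buf[offset:offset + 8] = struct.pack("<Q", ts_us)  # assumed uint64 LE field

def read_sync_timestamp(buf: bytes, start: int, data_size: int) -> int:
    """Read back the timestamp written by write_sync_timestamp."""
    offset = start + data_size
    return struct.unpack("<Q", bytes(buf[offset:offset + 8]))[0]
```

In use, the data processor would call `write_sync_timestamp` once the synchronization timestamp for the frame has been determined, so that downstream fusion consumers can read it directly from the frame buffer.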
The image frame timestamp determining apparatus provided by this embodiment can execute the image frame timestamp determination method provided by any embodiment of the present application, and has the corresponding functional modules and beneficial effects for executing that method.
In the technical scheme of the disclosure, the collection, storage, use, processing, transmission, provision, disclosure and the like of the related user information all conform to the regulations of related laws and regulations, and do not violate the customs of the public order.
Example five
FIG. 6 illustrates a block diagram of a data processor 610 that can be used to implement embodiments of the present application. The data processor 610 includes at least one processor 611 and a memory communicatively connected to the at least one processor 611, such as a read-only memory (ROM) 612 and a random access memory (RAM) 613, in which computer programs executable by the at least one processor are stored. The processor 611 can perform various appropriate actions and processes according to a computer program stored in the ROM 612 or loaded from the storage unit 618 into the RAM 613. The RAM 613 can also store various programs and data necessary for the operation of the data processor 610. The processor 611, the ROM 612 and the RAM 613 are connected to each other through a bus 614. An input/output (I/O) interface 615 is also connected to the bus 614.
A number of components in the data processor 610 are connected to the I/O interface 615, including: an input unit 616 such as a keyboard, a mouse, or the like; an output unit 617 such as various types of displays, speakers, and the like; a storage unit 618, such as a magnetic disk, optical disk, or the like; and a communication unit 619 such as a network card, modem, wireless communication transceiver, or the like. The communication unit 619 allows the data processor 610 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The processor 611 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of processors 611 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various dedicated Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, and the like. The processor 611 performs the various methods and processes described above, such as a time stamp determination method for the image frame.
In some embodiments, the time stamp determination method for an image frame may be implemented as a computer program tangibly embodied in a computer-readable storage medium, such as storage unit 618. In some embodiments, part or all of the computer program may be loaded and/or installed onto the data processor 610 via the ROM 612 and/or the communication unit 619. When the computer program is loaded into the RAM 613 and executed by the processor 611, one or more steps of the method for time stamp determination of an image frame described above may be performed. Alternatively, in other embodiments, the processor 611 may be configured by any other suitable means (e.g., by means of firmware) to perform the timestamp determination method of the image frame.
Various implementations of the systems and techniques described here above may be implemented in digital electronic circuitry, integrated circuitry, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on a Chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for implementing the methods of the present application may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable image frame timestamp determination apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be performed. A computer program can execute entirely on a machine, partly on a machine, as a stand-alone software package partly on a machine and partly on a remote machine or entirely on a remote machine or server.
In the context of this application, a computer readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. A computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here may be implemented on a data processor having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user may provide input to the data processor. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here may be implemented in a computing system that includes a back-end component (e.g., as a timestamp determination server for an image frame), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user may interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: Local Area Networks (LANs), Wide Area Networks (WANs), blockchain networks, and the Internet.
The computing system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server (also called a cloud computing server or cloud host), a host product in a cloud computing service system, which overcomes the defects of high management difficulty and weak service scalability found in traditional physical hosts and VPS services.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present application may be executed in parallel, sequentially, or in different orders, as long as the desired results of the technical solution of the present application can be achieved; no limitation is imposed herein.
The above-described embodiments are not intended to limit the scope of the present disclosure. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (11)

1. A method for time stamp determination of an image frame, the method comprising:
determining an image exposure duration corresponding to the currently received image frame according to the currently received trigger timestamp and the exposure timestamp;
determining whether the currently received image frame is normally exposed or not according to the relative size relation between the image exposure duration and a preset exposure duration;
if the currently received image frame is normally exposed, determining an image transmission duration corresponding to the currently received image frame according to a receiving timestamp and the exposure timestamp; determining a synchronous timestamp for the currently received image frame according to the relative size relation between the image transmission time length and a preset transmission time length;
if the exposure of the currently received image frame is abnormal, determining an image acquisition duration corresponding to the currently received image frame according to a receiving timestamp and the triggering timestamp; and determining a synchronous time stamp for the currently received image frame according to the relative size relation between the image acquisition time length and a preset acquisition time length.
2. The timestamp determination method according to claim 1, wherein determining a synchronization timestamp for the currently received image frame based on a relative magnitude relationship between the image transmission duration and a preset transmission duration comprises:
if the exposure timestamp is earlier than the receiving timestamp and the image transmission time length is less than or equal to the preset transmission time length, determining the currently received exposure timestamp as a synchronous timestamp of the currently received image frame;
if the exposure timestamp is earlier than the receiving timestamp and the image transmission duration is longer than the preset transmission duration, determining a synchronous timestamp of the currently received image frame according to the receiving timestamp;
and if the exposure timestamp is later than the receiving timestamp, determining the currently received image frame as the last received image frame, and re-determining the synchronous timestamp of the last received image frame according to the last received trigger timestamp, the exposure timestamp and the receiving timestamp.
3. The timestamp determination method of claim 2, wherein determining a synchronization timestamp for the currently received image frame from the receive timestamp comprises:
defining the synchronization timestamp of the currently received image frame as t and the receiving timestamp as t3, wherein t is calculated by the following formula:
t=t3-tλ
wherein tλ is the time taken by the image acquisition module from the end of exposure to the completion of image output.
4. The timestamp determination method according to any one of claims 1 to 3, wherein determining a synchronization timestamp for the currently received image frame according to a relative magnitude relationship between the image acquisition duration and a preset acquisition duration comprises:
if the triggering time stamp is earlier than the receiving time stamp and the image acquisition time length is less than the preset acquisition time length, determining a synchronous time stamp of the currently received image frame according to the currently received triggering time stamp, the receiving time stamp and the exposure time stamp;
if the triggering timestamp is earlier than the receiving timestamp and the image acquisition duration is greater than or equal to a preset acquisition duration, determining a synchronous timestamp of the currently received image frame according to the receiving timestamp and the exposure timestamp;
and if the triggering time stamp is later than the receiving time stamp, determining the currently received image frame as the last received image frame, and re-determining the synchronous time stamp of the last received image frame according to the last received triggering time stamp, the last received exposure time stamp and the last received receiving time stamp.
5. The timestamp determination method of claim 4, wherein determining a synchronization timestamp for the currently received image frame from the currently received trigger timestamp, receive timestamp, and exposure timestamp comprises:
if the exposure timestamp is earlier than the trigger timestamp, defining the synchronization timestamp of the currently received image frame as t and the currently received trigger timestamp as t1, wherein t is calculated by the following formula:
t=t1+(tζ*tε+tδ)
wherein tζ is the wake-up time of the image acquisition module; tε is the input pulse time of the image acquisition module; and tδ is the time for the image acquisition module to scan one line of pixels;
otherwise, the currently received image frame is determined as the last received image frame, and the synchronous time stamp of the last received image frame is determined again according to the trigger time stamp, the exposure time stamp and the receiving time stamp of the last received image frame.
6. The timestamp determination method of claim 4, wherein determining the synchronization timestamp for the currently received image frame from the receive timestamp and the exposure timestamp comprises:
if the exposure timestamp is earlier than the receiving timestamp and the image transmission duration is less than or equal to the preset transmission duration, determining the currently received exposure timestamp as a synchronous timestamp of the currently received image frame;
otherwise, defining the synchronization timestamp of the currently received image frame as t and the receiving timestamp as t3, wherein t is calculated by the following formula:
t=t3-tλ
wherein tλ is the time taken by the image acquisition module from the end of exposure to the completion of image output.
7. The timestamp determination method of claim 1, after determining a synchronization timestamp for the currently received image frame, further comprising:
determining an offset address of the currently received image frame according to a start address of the currently received image frame and a data size of the currently received image frame;
writing the synchronization timestamp to the currently received image frame based on the offset address.
8. An apparatus for time stamp determination of an image frame, the apparatus comprising:
the image exposure duration determining module is used for determining the image exposure duration corresponding to the currently received image frame according to the currently received trigger timestamp and the exposure timestamp;
the exposure condition determining module is used for determining whether the currently received image frame is normally exposed or not according to the relative size relation between the image exposure duration and the preset exposure duration;
the first synchronous timestamp determining module is used for determining the image transmission duration corresponding to the currently received image frame according to the receiving timestamp and the exposure timestamp if the currently received image frame is normally exposed; determining a synchronous timestamp for the currently received image frame according to the relative size relation between the image transmission time length and a preset transmission time length;
the second synchronous timestamp determining module is used for determining an image acquisition time length corresponding to the currently received image frame according to a receiving timestamp and the trigger timestamp if the currently received image frame is abnormally exposed; and determining a synchronous time stamp for the currently received image frame according to the relative size relation between the image acquisition time length and a preset acquisition time length.
9. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out a method of time stamp determination of an image frame according to any one of claims 1 to 7.
10. A data processor comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements a method for time stamp determination of an image frame according to any of claims 1-7.
11. A time stamp determination system for an image frame, said time stamp determination system comprising:
the data processor is used for sending a trigger pulse to start the image acquisition module and recording the time of the trigger pulse as a trigger timestamp; and is further used for receiving the image frame output by the image acquisition module, recording the time of receiving the image frame as a receiving timestamp, and receiving the exposure timestamp output by the image acquisition module;
the data processor is further used for determining image exposure duration corresponding to the currently received image frame according to the currently received trigger timestamp and the exposure timestamp; determining whether the currently received image frame is normally exposed or not according to the relative size relation between the image exposure duration and a preset exposure duration; if the currently received image frame is normally exposed, determining an image transmission duration corresponding to the currently received image frame according to a receiving timestamp and the exposure timestamp; determining a synchronous timestamp for the currently received image frame according to the relative size relationship between the image transmission duration and a preset transmission duration; if the exposure of the currently received image frame is abnormal, determining an image acquisition duration corresponding to the currently received image frame according to a receiving timestamp and the triggering timestamp; determining a synchronous timestamp for the currently received image frame according to the relative size relationship between the image acquisition duration and a preset acquisition duration;
the image acquisition module is used for receiving the trigger pulse sent by the data processor, carrying out exposure acquisition on an image frame according to the trigger pulse and sending the acquired image frame to the data processor; and recording the time of completing the exposure as an exposure time stamp, and sending the exposure time stamp to the data processor.
CN202210896646.7A 2022-07-28 2022-07-28 Image frame time stamp determination method, device, medium, data processor and system Pending CN115276871A (en)

Publications (1)

Publication Number Publication Date
CN115276871A true CN115276871A (en) 2022-11-01

Family

ID=83772328


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination