CN116668602B - Camera image synchronization method, device, vehicle and storage medium - Google Patents


Info

Publication number
CN116668602B
CN116668602B
Authority
CN (China)
Prior art keywords
image, camera, container, target, time
Legal status
Active
Application number
CN202310769990.4A
Other languages
Chinese (zh)
Other versions
CN116668602A
Inventor
孙乾坤 (Sun Qiankun)
Current Assignee
Xiaomi Automobile Technology Co Ltd
Original Assignee
Xiaomi Automobile Technology Co Ltd
Application filed by Xiaomi Automobile Technology Co Ltd
Priority to CN202310769990.4A
Publication of CN116668602A
Application granted
Publication of CN116668602B

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/04Synchronising
    • H04N5/08Separation of synchronising signals from picture signals

Abstract

The disclosure relates to the field of automatic driving, and in particular to a camera image synchronization method and device, a vehicle, and a storage medium, which solve the technical problem that images from multiple cameras cannot be synchronized. The method comprises the following steps: determining a first target image with the smallest timestamp from a first container; for each camera associated with the automatic driving system other than the target camera, determining the image corresponding to that camera in the first container, a second container, and a third container according to the timestamp of the first target image, to obtain a plurality of second target images, wherein the target camera is the camera, among the cameras associated with the automatic driving system, that acquired the first target image; and packaging the first target image and the plurality of second target images to obtain a synchronous image package.

Description

Camera image synchronization method, device, vehicle and storage medium
Technical Field
The present disclosure relates to the field of automatic driving, and in particular to a camera image synchronization method, apparatus, vehicle, and storage medium.
Background
In the field of automatic driving, cameras in the same hardware group can ensure consistent image timestamps by triggering exposure simultaneously, i.e., the timestamps of the individual images differ by less than 1 millisecond.
In the related art, the cameras are divided into two camera software groups, a surround-view camera group and a panoramic camera group, and the images acquired by each software group are aligned group by group, i.e., each group of images submitted by a software group is guaranteed to be aligned in time. For example, the surround-view camera group includes four surround-view cameras, which are aligned according to the timestamps of their images because they belong to the same hardware group. The panoramic camera group includes a front-view camera, a rear-view camera, and a panoramic camera; because the front-view and rear-view cameras belong to one hardware group while the panoramic camera does not, an alignment algorithm is needed to ensure that the images submitted by the panoramic camera group each time are aligned in time.
However, when the load of the vehicle in which the automatic driving system is located changes or the vehicle runs in a special environment, the cameras may lose frames or produce erroneous frames, so the related art cannot guarantee that the camera images are aligned in time.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides a camera image synchronization method, apparatus, vehicle, and storage medium.
According to a first aspect of an embodiment of the present disclosure, there is provided a camera image synchronization method, including:
determining a first target image with a minimum timestamp from a first container for storing images acquired at a first time by each camera associated with an autopilot system of a vehicle;
determining, for each camera associated with the automatic driving system except for a target camera, an image corresponding to the camera in the first container, a second container, and a third container according to the timestamp of the first target image, so as to obtain a plurality of second target images, wherein the target camera is the camera, among the cameras associated with the automatic driving system, that acquired the first target image, the second container is used for storing images acquired by each camera associated with the automatic driving system at a second moment, the third container is used for storing images acquired by each camera associated with the automatic driving system at a third moment, the second moment is a moment before the first moment, and the third moment is a moment after the first moment;
and packaging the first target image and the plurality of second target images to obtain a synchronous image package.
Optionally, the determining, according to the timestamp of the first target image, the image corresponding to the camera from the first container, the second container and the third container includes:
determining images of all moments corresponding to the camera from the first container, the second container and the third container;
calculating an image time difference according to the time stamp of the first target image and the time stamp of the image for each image at each moment;
and taking the image with the smallest image time difference as a second target image of the camera in the images of all the moments corresponding to the camera.
Optionally, the determining the first target image with the smallest timestamp from the first container includes:
and sorting the images in the first container from largest to smallest according to the timestamps of the images stored in the first container, and taking the last image as the first target image.
Optionally, the camera image synchronization method further includes:
after receiving an image sent by the target camera, determining whether an image sent by the target camera already exists in the first container;
storing the received image in the third container in the case that such an image exists in the first container;
and storing the received image in the first container in the case that no such image exists in the first container.
Optionally, the determining the first target image with the smallest timestamp from the first container includes:
and under the condition that the first container is in a full-storage state, determining a first target image with the smallest timestamp from the first container, wherein the full-storage state represents that the first container stores one image corresponding to each camera.
Optionally, the camera image synchronization method further includes:
after the synchronous image package is obtained, the first container is defined as the new second container for the next moment, the third container is defined as the new first container for the next moment, and the second container is emptied and defined as the new third container for the next moment.
According to a second aspect of embodiments of the present disclosure, there is provided a camera image synchronization apparatus including:
a determination module configured to determine a first target image having a minimum timestamp from a first container for storing images acquired at a first time by each camera associated with an autopilot system of a vehicle;
an execution module configured to determine, for each camera associated with the autopilot system except for a target camera, an image corresponding to the camera in the first container, a second container, and a third container according to a timestamp of the first target image, so as to obtain a plurality of second target images, wherein the target camera is the camera, among the cameras associated with the autopilot system, that acquired the first target image, the second container is used for storing images acquired by each camera associated with the autopilot system at a second moment, the third container is used for storing images acquired by each camera associated with the autopilot system at a third moment, the second moment is a moment before the first moment, and the third moment is a moment after the first moment;
and a synchronization module configured to package the first target image and the plurality of second target images to obtain a synchronous image package.
According to a third aspect of embodiments of the present disclosure, there is provided a camera image synchronization apparatus including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
Determining a first target image with a minimum timestamp from a first container for storing images acquired at a first time by each camera associated with an autopilot system of a vehicle;
determining, for each camera associated with the automatic driving system except for a target camera, an image corresponding to the camera in the first container, a second container, and a third container according to the timestamp of the first target image, so as to obtain a plurality of second target images, wherein the target camera is the camera, among the cameras associated with the automatic driving system, that acquired the first target image, the second container is used for storing images acquired by each camera associated with the automatic driving system at a second moment, the third container is used for storing images acquired by each camera associated with the automatic driving system at a third moment, the second moment is a moment before the first moment, and the third moment is a moment after the first moment;
and packaging the first target image and the plurality of second target images to obtain a synchronous image package.
According to a fourth aspect of embodiments of the present disclosure, the present disclosure provides a vehicle, including the camera image synchronization apparatus of the third aspect.
According to a fifth aspect of embodiments of the present disclosure, there is provided a computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the steps of the camera image synchronization method provided by the first aspect of the present disclosure.
The technical scheme provided by the embodiments of the present disclosure can have the following beneficial effects: images acquired at three adjacent moments by each camera associated with the automatic driving system of the vehicle are stored in three containers; a first target image with the smallest timestamp is determined from the first container; second target images acquired by the other cameras associated with the automatic driving system, except the camera that acquired the first target image, are determined from the first, second, and third containers; and the first target image and the plurality of second target images are packaged into a synchronous image package. In this way, the images acquired by all cameras associated with the automatic driving system are aligned in time, and the situation that the automatic driving system of the vehicle cannot work normally due to frame loss or frame errors of the cameras is avoided.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a flowchart illustrating a camera image synchronization method according to an exemplary embodiment.
Fig. 2 is a flowchart illustrating step S12 according to an exemplary embodiment.
Fig. 3 is another flowchart illustrating a camera image synchronization method according to an exemplary embodiment.
Fig. 4 is a further flowchart illustrating a camera image synchronization method according to an exemplary embodiment.
Fig. 5 is a diagram illustrating the result of a camera image synchronization method according to an exemplary embodiment.
Fig. 6 is a block diagram illustrating a camera image synchronization apparatus according to an exemplary embodiment.
Fig. 7 is another block diagram of a camera image synchronization apparatus according to an exemplary embodiment.
Fig. 8 is a functional block diagram of a vehicle, according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present disclosure as detailed in the accompanying claims.
It should be noted that all image-acquisition actions in the present application are performed in compliance with the data protection regulations of the country where the device is located and with authorization from the owner of the corresponding device.
Fig. 1 is a flowchart illustrating a camera image synchronization method for use in an automatic driving system of a vehicle, as shown in fig. 1, according to an exemplary embodiment, comprising the steps of:
in step S11, a first target image having a minimum time stamp is determined from a first container for storing images acquired at a first time by each camera associated with an autopilot system of the vehicle.
The first container is used for storing one image acquired at the first time by each camera associated with the automatic driving system of the vehicle; the number of cameras associated with the automatic driving system is not limited. Each image in the first container carries a timestamp that is automatically generated when the camera acquires the image. The first time includes the current time.
Each camera associated with the automatic driving system can capture 30 images per second. When any camera loses a frame or produces an erroneous frame, the acquisition times of the images stored in the first container are not synchronized. The first target image may be the image acquired earliest by any camera, and it needs to be time-synchronized with the images acquired by the other cameras associated with the automatic driving system, except the target camera that acquired the first target image.
For example, suppose the cameras associated with the automatic driving system of the vehicle are camera 1, camera 2, camera 3, and camera 4, and four images are stored in the first container: image a1 acquired by camera 1, image b1 acquired by camera 2, image c1 acquired by camera 3, and image d1 acquired by camera 4, with timestamps t, t+31, t+29, and t+30, respectively. Then image a1, which has the smallest timestamp in the first container, is the first target image.
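The selection in the example above can be sketched in code. This is a minimal illustration; the `Image` tuple, the container layout, and the concrete timestamp values are assumptions for demonstration, not data structures defined by the patent.

```python
from collections import namedtuple

Image = namedtuple("Image", ["camera", "timestamp"])

# First container: one image per camera, timestamps relative to t = 1000.
first_container = [
    Image("camera1", 1000),   # t
    Image("camera2", 1031),   # t + 31
    Image("camera3", 1029),   # t + 29
    Image("camera4", 1030),   # t + 30
]

def first_target(container):
    """Return the image with the smallest timestamp (the earliest acquired)."""
    return min(container, key=lambda img: img.timestamp)

target = first_target(first_container)
print(target.camera)  # camera1
```

With these values, camera 1's image a1 (timestamp t) is selected, matching the example in the text.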
In step S12, for each camera associated with the autopilot system except for the target camera, determining the image corresponding to the camera in the first container, the second container and the third container according to the timestamp of the first target image, so as to obtain a plurality of second target images, where the target camera is a camera for acquiring the first target image in the cameras associated with the autopilot system, the second container is used for storing the image acquired by each camera associated with the autopilot system at the second moment, and the third container is used for storing the image acquired by each camera associated with the autopilot system at the third moment, the second moment is the moment before the first moment, and the third moment is the moment after the first moment.
The method comprises the steps of presetting three containers for storing images acquired by each camera at three different moments, wherein a first container is used for storing images acquired by each camera at a first moment, a second container is used for storing images acquired by each camera at a moment before the first moment, and a third container is used for storing images acquired by each camera at a moment after the first moment.
For example, in the case that the first time is the current time, the second time may be the previous time, the third time may be the next time, the first container is used for storing one image acquired by each camera associated with the autopilot system at the current time, the second container is used for storing one image acquired by each camera associated with the autopilot system at the previous time, and the third container is used for storing one image acquired by each camera associated with the autopilot system at the next time.
For example, for each camera associated with the autopilot system except for the target camera, an image with the smallest image time difference between the image acquired by the camera and the first target image is determined from a first container, a second container and a third container according to the time stamp of the first target image as a second target image, so that a plurality of second target images are obtained.
In step S13, the first target image and the plurality of second target images are packaged to obtain a synchronous image package.
Wherein the synchronous image package is used for controlling the vehicle to automatically drive.
Illustratively, the first target image is combined with a plurality of second target images and packaged into a synchronous image package, thereby achieving time synchronization of the camera images.
According to the method, a first target image with the smallest timestamp is determined from the first container; second target images acquired by the other cameras associated with the automatic driving system, except the camera that acquired the first target image, are then determined from the first, second, and third containers; and the first target image and the plurality of second target images are packaged to obtain a synchronous image package. Even when a camera loses frames or produces erroneous frames, the images acquired by all cameras associated with the automatic driving system can be aligned in time, thereby realizing camera image synchronization and ensuring normal operation of the automatic driving system.
In order to facilitate a better understanding of the camera image synchronization method provided by the present disclosure by those skilled in the art, the following describes in detail the relevant steps of the camera image synchronization method.
In a possible embodiment, in step S11, determining the first target image with the smallest timestamp from the first container may include:
the images in the first container are ordered from big to small according to the time stamps of the images stored in the first container, and the last image is taken as a first target image.
When a camera loses a frame or produces an erroneous frame, the images in the first container are not images acquired by each camera at the same time. The image with the smallest timestamp is the image acquired earliest by any camera associated with the automatic driving system, so it is determined as the first target image.
For example, suppose the cameras associated with the automatic driving system of the vehicle are camera 1, camera 2, camera 3, and camera 4, and the first container stores image a1 (camera 1, timestamp t), image b1 (camera 2, timestamp t+31), image c1 (camera 3, timestamp t+29), and image d1 (camera 4, timestamp t+30). Sorted from largest to smallest by timestamp, the order is: image b1, image d1, image c1, image a1. The last image, image a1, is taken as the first target image.
In the present disclosure, the earliest-transmitted image stored in the first container is taken as the first target image, which makes it convenient to determine, for each camera other than the target camera, the image with the smallest image time difference from the first target image, and to perform time synchronization accordingly.
In a possible embodiment, in step S11, determining the first target image with the smallest timestamp from the first container may further include:
the method includes determining a first target image with a minimum timestamp from a first container when the first container is in a full state, the full state representing that the first container has stored one image corresponding to each camera.
Wherein, in the case that the first container has stored one image corresponding to each camera, the first target image with the smallest time stamp is determined from the first container.
Only after the first container is in the full-storage state is the first target image determined from it, so that a plurality of second target images can then be determined according to its timestamp for time synchronization.
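The full-storage check can be sketched as follows. The dict-based container model (camera id mapped to image timestamp) and the camera names are illustrative assumptions, not the patent's API.

```python
CAMERAS = ["camera1", "camera2", "camera3", "camera4"]

def is_full(container, cameras=CAMERAS):
    # Full-storage state: the container holds exactly one image per
    # associated camera (dict keys are camera ids, so one entry each).
    return set(container) == set(cameras)

partial = {"camera1": 1000}
full = {"camera1": 1000, "camera2": 1031, "camera3": 1029, "camera4": 1030}
print(is_full(partial), is_full(full))  # False True
```

Modeling a container as a dict keyed by camera id also enforces the "only one image per camera" property mentioned later in the description.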
In a possible embodiment, referring to fig. 2, in step S12, determining the image corresponding to the camera from the first container, the second container, and the third container according to the time stamp of the first target image may include the following steps:
In step S21, images at all times corresponding to the camera are determined from the first container, the second container, and the third container.
The first container stores one image acquired at the current time by each camera associated with the automatic driving system of the vehicle, the second container stores one image acquired at the previous time by each camera, and the third container stores one image acquired at the next time by each camera.
For example, for each camera, all images corresponding to the camera are determined from the first container, the second container, and the third container.
In step S22, for each image at each time, an image time difference is calculated from the time stamp of the first target image and the time stamp of the image.
For all the determined images corresponding to the camera, the timestamp of the image is subtracted from the timestamp of the first target image, and the absolute value of the difference is taken as the image time difference; the image time difference is therefore a non-negative number.
For example, when the timestamp of the first target image is t and the timestamp of the image is t+1, the image time difference is |t - (t+1)| = |-1| = 1.
In step S23, among the images at all times corresponding to the camera, the image having the smallest image time difference is taken as the second target image of the camera.
For the images corresponding to each camera, the second target image is determined according to the magnitude of the image time difference, so that a second target image corresponding to each camera is obtained.
According to the method, for each camera, the image with the smallest image time difference from the first target image is selected as that camera's second target image, based on the image time differences between the first target image and all images corresponding to the camera. In this way, the best-matching images of all cameras associated with the automatic driving system are determined from the first, second, and third containers, which facilitates time synchronization of the camera images.
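Steps S21 to S23 can be sketched as follows, under the hypothetical modeling in which each container is a dict mapping camera id to timestamp; the names and timestamp values are illustrative.

```python
def second_target(camera, t_first, first_c, second_c, third_c):
    """Pick, among this camera's images in the three containers, the one
    whose timestamp is closest to the first target image's timestamp."""
    candidates = [c[camera] for c in (first_c, second_c, third_c) if camera in c]
    # Image time difference = |timestamp of first target - timestamp of image|
    return min(candidates, key=lambda ts: abs(t_first - ts))

second_c = {"camera2": 998}    # previous moment
first_c  = {"camera2": 1031}   # current moment
third_c  = {"camera2": 1064}   # next moment

# First target timestamp t = 1000: |1000-998| = 2 is the smallest
# difference, so the previous-moment image is selected.
print(second_target("camera2", 1000, first_c, second_c, third_c))  # 998
```

The `camera in c` guard also handles a camera whose image is missing from one of the containers due to frame loss.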
In a possible embodiment, referring to fig. 3, the camera image synchronization method may further include the steps of:
in step S31, after receiving the image transmitted by the target camera, it is determined whether or not there is an image transmitted by the target camera in the first container.
For example, when a camera associated with the automatic driving system loses a frame or produces an erroneous frame, the image acquired by the camera at the third moment may be received before the image acquired at the first moment; the image acquired at the third moment is then stored in the first container before the first-moment image arrives. Because the first container stores only one image per camera, after an image sent by the target camera is received, it is necessary to determine whether an image corresponding to that camera already exists in the first container.
In step S32, in the case where there is an image transmitted by the target camera in the first container, the image is stored in the third container.
For example, in the case where the image a2 acquired by the camera 1 at the third time is present in the first container, the image a1 of the camera 1 at the first time is stored in the third container.
In step S33, in the case where there is no image transmitted by the target camera in the first container, the image is stored in the first container.
For example, in the case where there is no image transmitted by the camera 1 in the first container, the image a1 transmitted by the camera 1 is stored in the first container.
After receiving an image sent by the target camera, the method determines whether an image corresponding to the target camera is already stored in the first container. When such an image exists in the first container, the newly received image is stored in the third container; when no such image exists, it is stored in the first container, thereby avoiding missed or overwritten storage.
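The routing logic of steps S31 to S33 can be sketched as below; the dict-based containers and the `store` helper are illustrative assumptions, not the patent's implementation.

```python
def store(camera, timestamp, first_c, third_c):
    """Route an incoming image: if the sending camera already has an image
    in the first container, the new image belongs to the next moment and
    goes to the third container; otherwise it goes to the first container."""
    if camera in first_c:
        third_c[camera] = timestamp
    else:
        first_c[camera] = timestamp

first_c, third_c = {}, {}
# Frame disorder: a later-moment image (timestamp 1060) arrives first
# and occupies camera1's slot in the first container.
store("camera1", 1060, first_c, third_c)
# The subsequent arrival from camera1 is routed to the third container.
store("camera1", 1000, first_c, third_c)
```

This mirrors the description's rule that only one image per camera is kept in the first container, with any additional arrival deferred to the third container.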
In a possible embodiment, the camera image synchronization method may further include:
after the synchronous image package is obtained, the first container is defined as the new second container for the next moment, the third container is defined as the new first container for the next moment, and the second container is emptied and defined as the new third container for the next moment.
For example, the current time of the present synchronization round is the previous time of the next round, and the next time of the present round is the current time of the next round. Therefore, the first container, which stored the current-time images in the present round, is defined as the new second container (previous time) for the next round; the third container, which stored the next-time images, is defined as the new first container (current time) for the next round; and the second container, which stored the previous-time images, is emptied and defined as the new third container (next time) for the next round.
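The container rotation can be sketched as follows; the function and variable names are illustrative, since the patent does not prescribe an implementation.

```python
def rotate(first_c, second_c, third_c):
    """After a synchronous image package is emitted: old first -> new second,
    old third -> new first, old second is emptied -> new third."""
    second_c.clear()  # previous-moment images are discarded
    # Returned as (new first, new second, new third).
    return third_c, first_c, second_c

f, s, t = {"a": 2}, {"a": 1}, {"a": 3}
f, s, t = rotate(f, s, t)
# f is now the old third container, s the old first, t the emptied old second.
```

Rotating references instead of copying images keeps the per-round bookkeeping O(1) apart from clearing the discarded container.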
For example, referring to fig. 4, the camera image synchronization method may include the steps of:
in step S41, an image transmitted by the target camera is received.
In step S42, it is determined whether or not there is an image transmitted by the target camera in the first container; in the case where the first container has an image transmitted by the target camera, step S44 is performed; in the case where there is no image transmitted by the target camera in the first container, step S43 is performed.
In step S43, the image is stored in the first container.
In step S44, the image is stored in a third container.
In step S45, it is determined whether the first container is in the full-storage state, which indicates that the first container has stored one image corresponding to each camera; if the first container is in the full-storage state, step S46 is performed; otherwise, step S41 is performed.
In step S46, the images in the first container are sorted from largest to smallest according to the timestamps of the images stored in the first container, and the last image is taken as the first target image.
In step S47, for each camera associated with the autopilot system other than the target camera, images at all times corresponding to the cameras are determined from the first container, the second container, and the third container.
In step S48, for each image at each time, an image time difference is calculated from the time stamp of the first target image and the time stamp of the image.
In step S49, among the images at all times corresponding to the camera, the image having the smallest image time difference is taken as the second target image of the camera.
In step S410, the first target image and the plurality of second target images are packaged to obtain a synchronous image package, which is used to control the automatic driving of the vehicle.
In step S411, the first container is defined as the new second container for the next time, the third container is defined as the new first container, and the second container is emptied and defined as the new third container; the process then returns to step S41.
For example, referring to fig. 5, in the case where the camera associated with the automated driving system includes camera 1, camera 2, camera 3, and camera 4, an image a3 transmitted by camera 1 is received;
steps S42 to S44 are performed, and the storage states of the three containers are shown in fig. 5: the first container stores image a2 (camera 1), image b2 (camera 2), image c2 (camera 3), and image d2 (camera 4); the second container stores image a1 (camera 1), image b1 (camera 2), image c1 (camera 3), and image d1 (camera 4); and the third container stores image a3 (camera 1), image b3 (camera 2), image c3 (camera 3), and image d3 (camera 4);
Steps S45 to S46 are performed to obtain image a2 as the first target image;
steps S47 to S49 are performed to obtain image b1 as the second target image corresponding to camera 2, image c1 as the second target image corresponding to camera 3, and image d1 as the second target image corresponding to camera 4.
Step S410 is performed to obtain a synchronous image packet, where the synchronous image packet includes image a2, image b1, image c1, and image d1.
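The selection performed in steps S45 to S49 for the example above can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation: the container structures, camera names, and timestamp values (in milliseconds) are all assumptions, since fig. 5 does not give concrete timestamps.

```python
# Containers map camera -> (image name, timestamp in ms).
# All names and timestamp values below are assumed for illustration.
first  = {"cam1": ("a2", 33), "cam2": ("b2", 63), "cam3": ("c2", 63), "cam4": ("d2", 63)}
second = {"cam1": ("a1", 0),  "cam2": ("b1", 30), "cam3": ("c1", 30), "cam4": ("d1", 30)}
third  = {"cam1": ("a3", 66), "cam2": ("b3", 96), "cam3": ("c3", 96), "cam4": ("d3", 96)}

def build_sync_packet(first, second, third):
    # Steps S45-S46: the first target image is the image with the
    # smallest timestamp in the first container.
    target_cam, (target_img, target_ts) = min(first.items(), key=lambda kv: kv[1][1])
    packet = {target_cam: target_img}
    # Steps S47-S49: for every other camera, take the candidate from the
    # three containers whose timestamp is closest to the target's.
    for cam in first:
        if cam == target_cam:
            continue
        candidates = [first[cam], second[cam], third[cam]]
        img, _ = min(candidates, key=lambda c: abs(c[1] - target_ts))
        packet[cam] = img
    return packet

# Step S410: the packet groups a2 with b1, c1, and d1, matching fig. 5.
print(build_sync_packet(first, second, third))
# -> {'cam1': 'a2', 'cam2': 'b1', 'cam3': 'c1', 'cam4': 'd1'}
```

With the assumed timestamps (the other cameras offset from camera 1 by a few milliseconds), the images of the earlier generation b1, c1, d1 are closest in time to a2, which reproduces the synchronous image packet of the example.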
Based on the same inventive concept, the present disclosure also provides a camera image synchronization apparatus, referring to fig. 6, the camera image synchronization apparatus 600 includes a determination module 601, an execution module 602, and a synchronization module 603.
Wherein the determining module 601 is configured to determine a first target image having a smallest timestamp from a first container for storing images acquired at a first time by each camera associated with an autopilot system of the vehicle.
The execution module 602 is configured to determine, for each camera associated with the autopilot system except for a target camera, an image corresponding to the camera in a first container, a second container and a third container according to a timestamp of the first target image, and obtain a plurality of second target images, where the target camera is a camera that acquires the first target image from the cameras associated with the autopilot system, the second container is used to store an image acquired by each camera associated with the autopilot system at a second time, and the third container is used to store an image acquired by each camera associated with the autopilot system at a third time, the second time is a time before the first time, and the third time is a time after the first time.
The synchronizing module 603 is configured to group the first target image with the plurality of second target images to obtain a synchronous image packet.
According to the above method, a first target image with the smallest timestamp is determined from the first container among the images acquired by the cameras associated with the automatic driving system of the vehicle. Second target images acquired by the other associated cameras, excluding the camera that acquired the first target image, are then determined from the first container, the second container, and the third container, and the first target image and the plurality of second target images are packaged to obtain a synchronous image packet. Even when a camera loses frames or produces erroneous frames, the images acquired by all cameras associated with the automatic driving system can thus be aligned in time, so that camera image synchronization is achieved and normal operation of the automatic driving system is ensured.
Further, the execution module 602 is configured to determine the images at all times corresponding to the camera from the first container, the second container, and the third container;
calculate, for each of those images, an image time difference from the timestamp of the first target image and the timestamp of the image; and
take, among all the images corresponding to the camera, the image with the smallest image time difference as the second target image of the camera.
Further, the determining module 601 is configured to sort the images in the first container from largest to smallest timestamp and take the last image, that is, the image with the smallest timestamp, as the first target image.
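This selection can be sketched as follows (the data layout is assumed): sorting the first container's images by timestamp in descending order and taking the last element is equivalent to taking the minimum.

```python
# Images as (name, timestamp) pairs; values assumed for illustration.
images = [("b2", 63), ("a2", 33), ("c2", 63)]
ordered = sorted(images, key=lambda im: im[1], reverse=True)  # largest first
first_target = ordered[-1]  # the last element has the smallest timestamp
print(first_target)  # ('a2', 33)
```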
Further, the camera image synchronization apparatus 600 further includes a judging module configured to judge, after receiving an image sent by the target camera, whether an image sent by the target camera already exists in the first container;
store the received image in the third container if an image sent by the target camera exists in the first container; and
store the received image in the first container if no image sent by the target camera exists in the first container.
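The judging module's storage decision can be sketched as follows; the dictionary-based containers and the function name are assumptions for illustration, not part of the disclosure.

```python
def store_image(first, third, camera_id, image):
    # If the first container already holds a frame from this camera,
    # the new frame belongs to a later time and goes into the third
    # container; otherwise it fills this camera's slot in the first.
    if camera_id in first:
        third[camera_id] = image
    else:
        first[camera_id] = image

first, third = {}, {}
store_image(first, third, "cam1", "a2")  # no cam1 frame yet -> first container
store_image(first, third, "cam1", "a3")  # cam1 already present -> third container
print(first, third)  # {'cam1': 'a2'} {'cam1': 'a3'}
```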
Further, the determining module 601 is further configured to determine the first target image with the smallest timestamp from the first container if the first container is in a full state, where the full state characterizes that the first container has stored one image corresponding to each camera.
Further, the synchronization module 603 is further configured to define the first container as a new second container at a next time instant, define the third container as a new first container at a next time instant, and empty the second container, and define the second container as a new third container at a next time instant after obtaining the synchronization image package.
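The container rotation described above can be sketched as a three-way swap; the names and structures are again assumed for illustration.

```python
def rotate_containers(first, second, third):
    # The old second container (the oldest images) is emptied and reused
    # as the new third container; the old third becomes the new first,
    # and the old first becomes the new second.
    second.clear()
    return third, first, second  # new (first, second, third)

first, second, third = {"cam1": "a2"}, {"cam1": "a1"}, {"cam1": "a3"}
first, second, third = rotate_containers(first, second, third)
print(first, second, third)  # {'cam1': 'a3'} {'cam1': 'a2'} {}
```

Reusing the emptied container avoids allocating a new buffer on every cycle, which matches the patent's description of redefining the same three containers at each time step.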
With respect to the camera image synchronization apparatus in the above-described embodiments, a specific manner in which each module performs an operation has been described in detail in the embodiments regarding the method, and will not be described in detail herein.
The present disclosure also provides a computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the steps of the camera image synchronization method provided by the present disclosure.
Based on the same inventive concept, the present disclosure also provides a camera image synchronization apparatus, including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
determining a first target image with a minimum timestamp from a first container, wherein the first container is used for storing images acquired at a first moment by each camera associated with an automatic driving system of a vehicle;
for each camera associated with the automatic driving system except for a target camera, determining images corresponding to the camera in a first container, a second container and a third container according to the timestamp of the first target image to obtain a plurality of second target images, wherein the target camera is the camera that acquires the first target image among the cameras associated with the automatic driving system, the second container is used for storing images acquired at a second moment by each camera associated with the automatic driving system, and the third container is used for storing images acquired at a third moment by each camera associated with the automatic driving system, the second moment being a moment before the first moment and the third moment being a moment after the first moment;
And grouping the first target image and the plurality of second target images to obtain a synchronous image package.
According to the above method, a first target image with the smallest timestamp is determined from the first container among the images acquired by the cameras associated with the automatic driving system of the vehicle. Second target images acquired by the other associated cameras, excluding the camera that acquired the first target image, are then determined from the first container, the second container, and the third container, and the first target image and the plurality of second target images are packaged to obtain a synchronous image packet. Even when a camera loses frames or produces erroneous frames, the images acquired by all cameras associated with the automatic driving system can thus be aligned in time, so that camera image synchronization is achieved and normal operation of the automatic driving system is ensured.
Fig. 7 is a block diagram illustrating a camera image synchronization apparatus 700 according to an exemplary embodiment. For example, the camera image synchronization apparatus 700 may be provided as a server. Referring to fig. 7, the camera image synchronization apparatus 700 includes a first processor 722, and a memory resource represented by a first memory 732 for storing instructions, such as an application program, executable by the first processor 722. The application program stored in the first memory 732 may include one or more modules each corresponding to a set of instructions. Further, the first processor 722 is configured to execute instructions to perform the camera image synchronization method described above.
The camera image synchronization device 700 may also include a power supply component 726 configured to perform power management of the camera image synchronization device 700, a wired or wireless network interface 750 configured to connect the camera image synchronization device 700 to a network, and an input/output interface 758. The camera image synchronization apparatus 700 may operate based on an operating system stored in the first memory 732.
In another exemplary embodiment, a computer program product is also provided, comprising a computer program executable by a programmable apparatus, the computer program having code portions for performing the above-described camera image synchronization method when executed by the programmable apparatus.
Based on the same inventive concept, the present disclosure also provides a vehicle including the camera image synchronization apparatus 700 described above.
According to the above method, a first target image with the smallest timestamp is determined from the first container among the images acquired by the cameras associated with the automatic driving system of the vehicle. Second target images acquired by the other associated cameras, excluding the camera that acquired the first target image, are then determined from the first container, the second container, and the third container, and the first target image and the plurality of second target images are packaged to obtain a synchronous image packet. Even when a camera loses frames or produces erroneous frames, the images acquired by all cameras associated with the automatic driving system can thus be aligned in time, so that camera image synchronization is achieved and normal operation of the automatic driving system is ensured.
Fig. 8 is a block diagram of a vehicle 800, according to an exemplary embodiment. For example, vehicle 800 may be a hybrid vehicle, but may also be a non-hybrid vehicle, an electric vehicle, a fuel cell vehicle, or other type of vehicle. Vehicle 800 may be an autonomous vehicle, a semi-autonomous vehicle, or a non-autonomous vehicle.
Referring to fig. 8, a vehicle 800 may include various subsystems, such as an infotainment system 810, a perception system 820, a decision control system 830, a drive system 840, and a computing platform 850. Vehicle 800 may also include more or fewer subsystems, and each subsystem may include multiple components. In addition, interconnections between each subsystem and between each component of the vehicle 800 may be achieved by wired or wireless means.
In some embodiments, infotainment system 810 may include a communication system, an entertainment system, a navigation system, and so forth.
The sensing system 820 may include several sensors for sensing information about the environment surrounding the vehicle 800. For example, the sensing system 820 may include a global positioning system (which may be a GPS system, a BeiDou system, or another positioning system), an inertial measurement unit (IMU), a lidar, a millimeter-wave radar, an ultrasonic radar, and a camera device.
Decision control system 830 may include a computing system, a vehicle controller, a steering system, a throttle, and a braking system.
The drive system 840 may include components that provide powered motion to the vehicle 800. In one embodiment, the drive system 840 may include an engine, an energy source, a transmission, and wheels. The engine may be one or a combination of an internal combustion engine, an electric motor, an air compression engine. The engine is capable of converting energy provided by the energy source into mechanical energy.
Some or all of the functions of vehicle 800 are controlled by computing platform 850. Computing platform 850 may include at least one second processor 851 and a second memory 852, where second processor 851 may execute instructions 853 stored in second memory 852.
The second processor 851 may be any conventional processor, such as a commercially available CPU. The processor may also be, for example, a graphics processing unit (GPU), a field-programmable gate array (FPGA), a system on chip (SoC), an application-specific integrated circuit (ASIC), or a combination thereof.
The second memory 852 may be implemented by any type or combination of volatile or nonvolatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
In addition to instructions 853, the second memory 852 may also store data such as road maps, route information, vehicle location, direction, speed, etc. The data stored by the second memory 852 may be used by the computing platform 850.
In an embodiment of the present disclosure, the second processor 851 may execute instructions 853 to complete all or part of the steps of the camera image synchronization method described above.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

1. A camera image synchronization method, comprising:
determining a first target image with a minimum time stamp from a first container for storing one image acquired at a first time by each camera associated with an autopilot system of a vehicle;
for each associated camera of the automatic driving system except for a target camera, determining images corresponding to the associated cameras in a first container, a second container and a third container according to the time stamp of the first target image to obtain a plurality of second target images, wherein the target camera is a camera which acquires the first target image in the cameras of the automatic driving system, the second container is used for storing one image which is acquired by each camera of the automatic driving system at a second time, the third container is used for storing one image which is acquired by each camera of the automatic driving system at a third time, the second time is a time before the first time, the third time is a time after the first time, and the second target image comprises an image with the smallest image time difference with the first target image;
And packaging the first target image and the plurality of second target images to obtain a synchronous image package.
2. The camera image synchronization method according to claim 1, wherein determining the image corresponding to the associated camera from the first container, the second container, and the third container according to the timestamp of the first target image includes:
determining images of all moments corresponding to the associated camera from the first container, the second container and the third container;
calculating an image time difference according to the time stamp of the first target image and the time stamp of the image for each image at each moment;
and taking the image with the smallest image time difference as a second target image of the associated camera in the images of all the moments corresponding to the associated camera.
3. The camera image synchronization method of claim 1, wherein determining the first target image from the first container having the smallest timestamp comprises:
and sequencing the images in the first container from big to small according to the time stamps of the images stored in the first container, and taking the last image as the first target image.
4. A camera image synchronization method according to any one of claims 1-3, characterized in that the camera image synchronization method further comprises:
after receiving the image sent by the target camera, judging whether the image sent by the target camera exists in the first container;
storing an image sent by the target camera in the third container in the case that the image exists in the first container;
in the case where there is no image transmitted by the target camera in the first container, the image is stored in the first container.
5. A camera image synchronization method according to any of claims 1-3, wherein said determining a first target image with a smallest timestamp from a first container comprises:
and under the condition that the first container is in a full-storage state, determining a first target image with the smallest timestamp from the first container, wherein the full-storage state represents that the first container stores one image corresponding to each camera.
6. A camera image synchronization method according to any one of claims 1-3, characterized in that the camera image synchronization method further comprises:
after the synchronous image package is obtained, the first container is defined as a new second container at the next moment, the third container is defined as a new first container at the next moment, the second container is emptied, and the second container is defined as a new third container at the next moment.
7. A camera image synchronization apparatus, comprising:
a determination module configured to determine a first target image having a minimum timestamp from a first container for storing one image acquired at a first time by each camera associated with an autopilot system of a vehicle;
an execution module configured to determine, for each camera associated with the autopilot system other than a target camera, an image corresponding to that camera in the first container, the second container, and a third container according to a timestamp of the first target image, to obtain a plurality of second target images, where the target camera is a camera that acquires the first target image from the cameras associated with the autopilot system, the second container is used to store one image acquired at a second time by each camera associated with the autopilot system, and the third container is used to store one image acquired at a third time by each camera associated with the autopilot system, the second time is a time before the first time, the third time is a time after the first time, and the second target image includes an image with a minimum image time difference from the first target image;
And the synchronization module is configured to group the first target image and the plurality of second target images to obtain a synchronization image packet.
8. A camera image synchronization apparatus, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
determining a first target image with a minimum time stamp from a first container for storing one image acquired at a first time by each camera associated with an autopilot system of a vehicle;
for each associated camera of the automatic driving system except for a target camera, determining images corresponding to the associated cameras in a first container, a second container and a third container according to the time stamp of the first target image to obtain a plurality of second target images, wherein the target camera is a camera which acquires the first target image in the cameras of the automatic driving system, the second container is used for storing one image which is acquired by each camera of the automatic driving system at a second time, the third container is used for storing one image which is acquired by each camera of the automatic driving system at a third time, the second time is a time before the first time, the third time is a time after the first time, and the second target image comprises an image with the smallest image time difference with the first target image;
And packaging the first target image and the plurality of second target images to obtain a synchronous image package.
9. A vehicle characterized in that it comprises the camera image synchronizing device of claim 8.
10. A computer readable storage medium having stored thereon computer program instructions, which when executed by a processor, implement the method of any of claims 1-6.
CN202310769990.4A 2023-06-27 2023-06-27 Camera image synchronization method, device, vehicle and storage medium Active CN116668602B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310769990.4A CN116668602B (en) 2023-06-27 2023-06-27 Camera image synchronization method, device, vehicle and storage medium


Publications (2)

Publication Number Publication Date
CN116668602A CN116668602A (en) 2023-08-29
CN116668602B true CN116668602B (en) 2024-03-19

Family

ID=87709792

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310769990.4A Active CN116668602B (en) 2023-06-27 2023-06-27 Camera image synchronization method, device, vehicle and storage medium

Country Status (1)

Country Link
CN (1) CN116668602B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013219514A (en) * 2012-04-06 2013-10-24 Sumitomo Electric Ind Ltd Image packet transmitter, image packet receiver, image transmission system, image transmission method, and image transmission program
CN109194436A (en) * 2018-11-01 2019-01-11 百度在线网络技术(北京)有限公司 Sensor time stabs synchronous detecting method, device, equipment, medium and vehicle
KR20190083901A (en) * 2018-01-05 2019-07-15 한국전자통신연구원 Method and apparatus for time synchronization of videos from an array of cameras
CN110505466A (en) * 2019-09-12 2019-11-26 北京四维图新科技股份有限公司 Image processing method, device, electronic equipment, storage medium and system
CN114660620A (en) * 2022-04-11 2022-06-24 深兰人工智能(深圳)有限公司 Camera and laser radar synchronous control system, method and storage medium
CN115878494A (en) * 2023-01-04 2023-03-31 小米汽车科技有限公司 Test method and device for automatic driving software system, vehicle and storage medium
CN116156081A (en) * 2021-11-22 2023-05-23 哲库科技(上海)有限公司 Image processing method and device and storage medium




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant