CN115345771A - Automatic driving simulation test image processing method and device

Info

Publication number
CN115345771A
CN115345771A
Authority
CN
China
Prior art keywords
image
queue
images
head
camera model
Prior art date
Legal status
Pending
Application number
CN202210896539.4A
Other languages
Chinese (zh)
Inventor
严宋扬
潘余曦
Current Assignee
Xi'an Xinxin Information Technology Co ltd
Original Assignee
Xi'an Xinxin Information Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Xi'an Xinxin Information Technology Co ltd
Priority to CN202210896539.4A
Publication of CN115345771A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 - General purpose image data processing
    • G06T1/60 - Memory management
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F12/00 - Accessing, addressing or allocating within memory systems or architectures
    • G06F12/02 - Addressing or allocation; Relocation
    • G06F12/0223 - User address space allocation, e.g. contiguous or non contiguous base addressing
    • G06F12/023 - Free address space management

Abstract

The application provides an automatic driving simulation test image processing method and device, relating to the technical field of automatic driving simulation testing, and solving the technical problem that images at different machine position display ends cannot be displayed synchronously during automatic driving testing. The method comprises the following steps: acquiring a plurality of images, acquired by each camera model in a plurality of camera models in the same time period, of a target vehicle model controlled by an automatic driving system in a simulation environment; determining an image buffer queue associated with each camera model according to the plurality of images acquired by that camera model; synchronizing, at preset time intervals, the timestamps of the images at the head-of-queue positions of the plurality of image buffer queues; and sending at least one image stored in the image buffer queue associated with each camera model to a machine position display end corresponding to that camera model, the at least one image comprising the image located at the head-of-queue position after timestamp synchronization.

Description

Automatic driving simulation test image processing method and device
Technical Field
The application belongs to the technical field of simulation testing, and particularly relates to an automatic driving simulation test image processing method and device.
Background
In the simulation test of an automatic driving system, a simulation test platform generally controls a simulator and the automatic driving system. A virtual simulation scene is rendered in the simulator; the automatic driving system is connected to the simulator through a communication interface and controls a test vehicle model in the simulator, so that the driving performance of the automatic driving system on the model vehicle is tested in simulation.
In order to facilitate monitoring of the test process by testers and review of the simulation test process, an image sensor model for image acquisition is generally arranged in the simulator. During testing, the image sensor model collects images of the simulation scene in the simulator and continuously transmits them to an execution engine of the simulation test platform. The execution engine sends the received images to an image processing module, the image processing module performs imaging processing on the images and transmits them to a display terminal, and the display terminal displays the continuous images as video.
During an automatic driving simulation test, image sensor models are generally arranged at a plurality of different positions in the simulator; that is, different machine positions are adopted to collect test process images at different sites of the simulation environment. In addition, the types of the image sensor models at different machine positions may differ due to different image acquisition requirements. These differences in the positions and types of the image sensor models cause differences in image transmission frequency; that is, images transmitted to the display terminal by different image sensor models at the same moment may have different actual acquisition times, so that the test process images of different machine positions cannot be displayed synchronously. When the test process images cannot be displayed synchronously, testers cannot accurately judge the current test progress and test conditions.
Disclosure of Invention
The embodiment of the application provides an automatic driving simulation test image processing method and device, which can solve the technical problem that, in an automatic driving simulation test, test process images of different machine positions cannot be displayed synchronously.
In a first aspect, an embodiment of the present application provides an automatic driving simulation test image processing method, where the method includes:
acquiring a plurality of images of a target vehicle model controlled by an automatic driving system, acquired by each camera model in a plurality of camera models in the same time period, in a simulation environment, wherein different camera models are used for acquiring images from different point positions;
determining an image cache queue associated with each camera model according to a plurality of images acquired by each camera model, wherein one image cache queue comprises a plurality of images which are sequentially stored in the image cache queue according to a time sequence and acquired by the corresponding camera model, each image has a time stamp, and the image cache queue is a first-in first-out queue;
synchronizing the timestamps of the images of which the plurality of image buffer queues are positioned at the head of the queue at preset time intervals;
and sending at least one image stored in the image cache queue associated with each camera model to a machine position display end corresponding to each camera model, wherein the at least one image comprises the image with the synchronized timestamp positioned at the head position of the image cache queue.
In the above embodiment, the multiple images, acquired by the multiple camera models in the same time period, of the target vehicle model controlled by the automatic driving system in the simulation environment are stored in the corresponding image buffer queues, and the timestamps of the images at the head of these queues are synchronized. The times at which the multiple image buffer queues transmit images to the corresponding machine position display ends are thereby synchronized, avoiding time differences between the images acquired by different camera models. Sending at least one image stored in the associated image buffer queue to the corresponding machine position display end then ensures that the images acquired by different camera models are displayed synchronously at the corresponding machine position display ends. The video images of different machine positions are thus played synchronously; that is, the video images of different machine positions display the simulation test content of the same moment at the same time.
In one possible implementation form of the first aspect, the time stamp is determined according to a time when the corresponding image is received by the automated driving simulation testing platform.
In a possible implementation manner of the first aspect, the synchronizing timestamps of the images of which the plurality of image buffer queues are located at head-of-line positions includes:
determining a maximum timestamp from timestamps corresponding to the images of which the image cache queues are located at the head of the image cache queues;
and updating the images at the head of the image queue in each other image cache queue according to the maximum timestamp so that the time deviation degree between the timestamp of the images at the head of the image queue in each other image cache queue and the maximum timestamp is less than or equal to a preset threshold, wherein the other image cache queues are queues except the image cache queue corresponding to the maximum timestamp in the plurality of image cache queues.
In a possible implementation manner of the first aspect, the updating, according to the maximum timestamp, the image at the head of the image buffer queue in each other image buffer queue, so that a time deviation degree between the timestamp of the image at the head of the image buffer queue in each other image buffer queue and the maximum timestamp is less than or equal to a preset threshold includes:
determining the time deviation degree between the timestamp corresponding to the image of each other image buffer queue at the head of the queue position and the maximum timestamp;
and for the first image buffer queue with the time deviation degree larger than a preset threshold value in the other image buffer queues, adjusting the time stamp of the image of the first image buffer queue at the head of the image queue, so that the time deviation degree between the time stamp of the image of the first image buffer queue at the head of the image queue after adjustment and the maximum time stamp is smaller than or equal to the preset threshold value.
In a possible implementation manner of the first aspect, the sending, to a machine position display end corresponding to each camera model, at least one image stored in the image buffer queue associated with each camera model includes:
determining the display frame rate of the machine position display end corresponding to each image cache queue;
and sending at least one image stored in each image cache queue to the corresponding machine position display end according to the display frame rate of the machine position display end corresponding to each image cache queue.
In the above embodiment, before sending the images in the image cache queue to the corresponding machine location display ends, the display frame rate of each machine location display end is determined first, and the image cache queue adjusts the speed at which the machine location display end sends the images according to the display frame rate, so that the speed at which the image cache queue sends the images is matched with the display frame rate, thereby realizing stable display of the machine location display ends.
In a possible implementation manner of the first aspect, the determining a display frame rate of the machine position display end corresponding to each image cache queue includes:
acquiring the number of the images stored in each image cache queue;
and determining the display frame rate corresponding to each machine position display end according to the number of the images stored in each image buffer queue.
In a possible implementation manner of the first aspect, the greater the number of the images stored in the image buffer queue, the greater the display frame rate of the corresponding machine position display end.
In a second aspect, an embodiment of the present application provides an automatic driving simulation test image processing apparatus, including:
the system comprises an acquisition unit, a processing unit and a display unit, wherein the acquisition unit is used for acquiring a plurality of images of a target vehicle model controlled by an automatic driving system, acquired by each camera model in a plurality of camera models in the same time period, in a simulation environment, and different camera models are used for acquiring images from different point positions;
the queue determining unit is used for determining an image cache queue associated with each camera model according to a plurality of images acquired by each camera model, wherein one image cache queue comprises a plurality of images which are sequentially stored in the image cache queue according to a time sequence and acquired by the corresponding camera model, each image has a time stamp, and the image cache queue is a first-in first-out queue;
the synchronization unit is used for synchronizing the timestamps of the images of which the plurality of image cache queues are positioned at the head of the queue at preset time intervals;
and the sending unit is used for sending at least one image stored in the image cache queue associated with each camera model to a machine position display end corresponding to each camera model, and the at least one image comprises the image with the synchronized timestamp positioned at the head position of the image cache queue.
In a third aspect, an embodiment of the present application provides a simulation test platform, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor executes the computer program to implement the steps of the method described in any one of the foregoing embodiments of the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a processor, the computer program implements the steps of the method as described in any one of the above embodiments of the first aspect.
In a fifth aspect, the present application provides a computer program product, which when run on a server, causes the server to perform the steps of the method described in any one of the above embodiments of the first aspect.
It is understood that the beneficial effects of the second aspect to the fifth aspect can be referred to the related description of the first aspect, and are not described herein again.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a schematic structural diagram of an automated driving simulation test system according to an embodiment of the present application;
FIG. 2 is a flowchart of an automated driving simulation test image processing method according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a three-dimensional rectangular coordinate system established with the geometric center of an EGO vehicle as the origin according to an embodiment of the present application;
FIG. 4 is a flowchart illustrating time synchronization of a plurality of other image buffer queues according to an embodiment of the present application;
FIG. 5 is a block diagram of an automatic driving simulation test image processing apparatus according to an embodiment of the present application;
fig. 6 is a schematic internal structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when," "upon," "in response to determining," or "in response to detecting." Similarly, the phrase "if it is determined" or "if a [described condition or event] is detected" may be interpreted contextually to mean "upon determining," "in response to determining," "upon detecting [the described condition or event]," or "in response to detecting [the described condition or event]."
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather mean "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless otherwise specifically stated.
The application provides an automatic driving simulation test image processing method, which stores a plurality of images, acquired by a plurality of camera models in the same time period, of a target vehicle model controlled by an automatic driving system in a simulation environment into corresponding image buffer queues, synchronizes the timestamps of the images located at the heads of the image buffer queues, and sends at least one image stored in each image buffer queue to the corresponding machine position display end. In this way, the times at which the plurality of image buffer queues transmit images to the corresponding machine position display ends are synchronized, avoiding time differences between images acquired by different camera models. Sending at least one image stored in each associated image buffer queue to the corresponding machine position display end then ensures that images acquired by different camera models are displayed synchronously at the corresponding machine position display ends, so that the video images of different machine positions are played synchronously, that is, they display the simulation test content of the same moment at the same time.
The following describes an exemplary method for processing an image of an automatic driving simulation test provided by the present application with reference to a specific embodiment.
Referring to fig. 1, the autopilot simulation test system provided by the present application includes a simulation test platform 110, an autopilot system 120, a simulator 130 and a display terminal 140. The simulation test platform 110 includes an execution engine 111 and an image processing module 112, where the execution engine 111 is used for overall control of the simulation test process. In this embodiment, the execution engine 111 transmits images acquired by the camera models in the simulator 130 to the image processing module 112, the image processing module 112 processes the images and transmits them to the display terminal 140, the display terminal 140 displays the received images as video, and a tester monitors the progress of the simulation test through the video output by the display terminal 140.
In the embodiment of the present application, the display terminal 140 is capable of simultaneously displaying images captured by camera models at a plurality of sites in the simulator 130. The display terminal 140 may be a collection of multiple terminals, or may be multiple display interfaces on the same terminal.
Based on the automatic driving simulation test system shown in fig. 1, as shown in fig. 2, a flowchart of an embodiment of an automatic driving simulation test image processing method provided by the present application is provided. The main execution subject of the method is the image processing module 112 in the simulation test platform 110 in fig. 1. Referring to fig. 2, the method may include:
step S201, acquiring a plurality of images of the target vehicle model controlled by the automatic driving system, acquired by each camera model in the plurality of camera models in the same time period, in a simulation environment, wherein different camera models are used for acquiring images from different point positions.
In an embodiment, the camera model, which may also be referred to as an image sensor model, is a self-contained functional module of the simulator 130, and is used for acquiring a simulation test image of a specific site of the simulator 130. Different camera models are used for acquiring images at different positions.
Optionally, the simulation test platform 110 may send an image acquisition instruction to the multiple camera models to trigger each camera model to acquire an image and report the acquired image, or the multiple camera models may also actively report the acquired image to the simulation test platform 110.
Generally, the camera models are set according to the requirements of the automatic driving simulation test, and their configuration differs with different simulation test requirements. Specifically, during the simulation test, a tester can increase or decrease the number of camera models as needed; different types of camera models can be selected according to different shooting requirements, for example, a color camera, a grayscale camera, a depth camera, and the like; and the specific position of each camera model in the simulation scene can be set according to the monitoring requirements.
In an embodiment, the specific position of a camera model in the simulation scene may be referred to as the machine position of the camera model. Machine positions may include moving machine positions and fixed machine positions. The camera model of a moving machine position moves following the motion of the test vehicle model (EGO vehicle) and is used for continuously acquiring images of the simulation environment around the EGO vehicle during the simulation test. The camera model of a fixed machine position acquires images of a fixed location in the simulation environment and does not move with the EGO vehicle; it mainly focuses on the simulation test conditions of the EGO vehicle at specific locations that need close observation, such as a downward-facing machine position arranged right above a crossroad, or machine positions arranged at a bend, right above a roundabout, and the like.
To make the concept of the moving machine position easier to understand, the following describes examples in which camera models are provided at different moving machine positions, with reference to the accompanying drawings.
In an embodiment, the camera model of a moving machine position moves along with the motion of the EGO vehicle, so the coordinates of the moving machine position are determined by the relative coordinates of the camera model and the EGO vehicle. As shown in fig. 3, a three-dimensional rectangular coordinate system is established with the geometric center of the EGO vehicle as the origin. The distances from the geometric center point to the foremost and rearmost ends of the EGO vehicle are equal, the distances to its leftmost and rightmost ends are equal, and the distances to its uppermost and lowermost ends are equal. The length, width and height of the EGO vehicle are denoted L, W and H respectively; the EGO vehicle is a built-in model of the simulator, so this vehicle information is obtained from the simulator.
As shown in fig. 3, Xv represents the direction of the x-axis (the traveling direction of the EGO vehicle), Yv represents the direction of the y-axis (the right direction of the EGO vehicle), and Zv represents the direction of the z-axis (the upward direction of the EGO vehicle). Roll indicates the angle of rotation about the x-axis in the direction indicated by the arrow (in degrees; if Roll equals 360, it indicates one full rotation about the x-axis, which is equivalent to no change from the perspective of the view). Pitch and Yaw have definitions similar to Roll, indicating rotation about the y-axis and z-axis, respectively.
For a camera at a moving machine position, its position and orientation in three-dimensional space can be represented by six values: x, y, z, Roll, Pitch and Yaw. In an embodiment, the moving machine positions may include a foreground view machine position, a vehicle view machine position, a rear view machine position, a left view machine position, a right view machine position, and the like. These machine positions are described below in terms of x, y, z, Roll, Pitch and Yaw (see the code sketch after the list).
(1) Foreground view machine position
The camera model corresponding to this machine position is located above the front cover of the vehicle, with x = L/2.1, y = 0, z = H/2.3, Roll = 0, Pitch = 20 and Yaw = 0, where Pitch = 20 is equivalent to tilting the camera slightly downward by 20 degrees in order to better focus on the area in front of the EGO vehicle, which is the area that generally needs attention during testing.
(2) Vehicle view machine position
The camera model corresponding to this machine position is located at the rear of the vehicle, with x = -L/1.9, y = 0, z = H/1.9, Roll = 0, Pitch = 20 and Yaw = 0. The lens faces the front of the vehicle and can capture the whole vehicle from back to front.
(3) Rearview machine position
The camera model corresponding to this machine position is positioned on the front cover of the vehicle, with x = -L/2.1, y = 0, z = H/2, Roll = 0, Pitch = 20 and Yaw = 180, where Yaw = 180 corresponds to the camera lens facing opposite to the driving direction of the vehicle so as to capture the area behind the vehicle. The view behind the vehicle can be seen from this machine position, and a camera model is arranged here when the test conditions behind the vehicle need attention.
(4) Left view machine position
The camera model corresponding to this machine position is positioned on the left side of the vehicle, with x = 0, y = 1.5W, z = H/2, Roll = 0, Pitch = 0 and Yaw = -90. This machine position is capable of photographing the vehicle from its left side.
(5) Right view machine position
The camera model corresponding to this machine position is positioned on the right side of the vehicle, with x = 0, y = -1.5W, z = H/2, Roll = 0, Pitch = 0 and Yaw = 90. This machine position is capable of photographing the vehicle from its right side.
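As a concrete illustration of the six pose values, the following is a minimal, non-normative Python sketch (not part of the patent; the class name, field names and helper function are assumptions) collecting the five example moving machine positions above:

```python
from dataclasses import dataclass

@dataclass
class CameraPose:
    """Pose of a moving-machine-position camera relative to the EGO vehicle center."""
    x: float      # offset along the driving direction
    y: float      # offset to the right of the vehicle
    z: float      # offset upward
    roll: float   # rotation about the x-axis, in degrees
    pitch: float  # rotation about the y-axis, in degrees
    yaw: float    # rotation about the z-axis, in degrees

def example_moving_poses(L: float, W: float, H: float) -> dict[str, CameraPose]:
    """The five example poses above, for an EGO vehicle of length L, width W, height H."""
    return {
        "foreground_view": CameraPose(L / 2.1, 0, H / 2.3, 0, 20, 0),
        "vehicle_view":    CameraPose(-L / 1.9, 0, H / 1.9, 0, 20, 0),
        "rear_view":       CameraPose(-L / 2.1, 0, H / 2, 0, 20, 180),
        "left_view":       CameraPose(0, 1.5 * W, H / 2, 0, 0, -90),
        "right_view":      CameraPose(0, -1.5 * W, H / 2, 0, 0, 90),
    }
```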
In the embodiment, each camera model continuously collects simulation images of its corresponding machine position and sends the collected images to the execution engine, which in turn sends the images to the image processing module.
Step S202, determining an image buffer queue associated with each camera model according to a plurality of images acquired by each camera model, wherein one image buffer queue comprises a plurality of images which are sequentially stored in the image buffer queue according to the time sequence and acquired by the corresponding camera model, each image has a time stamp, and the image buffer queue is a first-in first-out queue.
In the embodiment, each camera model corresponds to one image buffer queue, and the image processing module stores images into the corresponding image buffer queues in the time order in which the images are received. The image buffer queue is a first-in first-out queue: the image that enters the queue first is taken out of the queue first. For example, suppose image 1, image 2 and image 3 are all acquired by the same camera model A, the time corresponding to the timestamp of image 1 is earlier than that of image 2, and the time corresponding to the timestamp of image 2 is earlier than that of image 3. Then, in the image buffer queue corresponding to camera model A, image 1 is at the first position (i.e., the queue head position). Viewing the queue from left to right, the leftmost position can be regarded as the queue head and the rightmost position as the queue tail; image 2 and image 3 are located behind the first position in the queue. Image 1 is taken out first, and then image 2 and image 3 are taken in sequence.
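A minimal sketch of this first-in first-out behavior (the class and method names are hypothetical, not from the patent); later sketches in this description reuse this class:

```python
from collections import deque

class ImageBufferQueue:
    """First-in first-out buffer of (timestamp, image) entries for one camera model."""

    def __init__(self):
        self._entries = deque()

    def put(self, timestamp, image):
        # Images are stored in the time order in which they are received.
        self._entries.append((timestamp, image))

    def head(self):
        # The head-of-queue entry: the earliest image not yet taken out.
        return self._entries[0] if self._entries else None

    def pop_head(self):
        # Removing the head makes the next image (e.g. image 2 after image 1)
        # the new head of the queue.
        return self._entries.popleft()

    def __len__(self):
        return len(self._entries)
```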
In the embodiment, each camera model is actually an element in the simulator, and each element in the simulator has a unique label. When the image processing module of the simulation test platform receives an image, it can determine which camera model the image belongs to according to the running state of the camera models in the simulator, and thus store the image in the corresponding image buffer queue. Alternatively, each camera model may send, along with each collected image, the identification of the camera model that collected it, so that the image processing module can determine which camera model a received image comes from.
Each image has a time stamp, which may be a time stamp of an image acquisition time, a time stamp of a time when the image is received by the execution engine, or a time stamp of a time when the image is received by the image processing module.
If the time stamp is the capture time of the image in the simulator, the simulator is required to include the image capture time in the image data provided by the simulator to the simulation test platform. However, for different simulation test processes, the simulation test platform may interface different simulators, and the image data provided by different simulators may be different, so that it cannot be guaranteed that all simulators will necessarily provide a timestamp of the image acquisition instant.
In one embodiment, in order to enable the simulation test platform to effectively control the time stamp of the image, the time stamp of the corresponding image is determined according to the time when the simulation test platform receives the image. Specifically, the image processing module may determine the time stamp of the corresponding image according to the time when the corresponding image is received by itself, or the execution engine may determine the time stamp of the corresponding image according to the time when the corresponding image is received by itself and transmit the image and the corresponding time stamp to the image processing module.
In an embodiment, the simulation test platform automatically records the time when an image is received (whether the time when the execution engine receives the image or the time when the image processing module receives the image), so determining the timestamp from the receiving time improves the controllability of the timestamp. In addition, the time from image acquisition by the camera model to image reception by the image processing module is very short and is the same for the same simulator, so synchronism is not affected. Therefore, in the embodiment of the application, the time when the simulation test platform receives an image is selected to determine the timestamp of the corresponding image.
Optionally, each entry in the image buffer queue is actually an image-timestamp tuple formed by an image and its corresponding timestamp. The camera model continuously sends acquired images to the execution engine of the simulation test platform, the execution engine sends the images to the image processing module, and on receiving an image the image processing module combines it with the current system time of the simulation test platform to construct an image-timestamp tuple and stores the tuple into the corresponding image buffer queue.
It can be understood that a plurality of camera models may send acquired images to the execution engine simultaneously. In this case, the image processing module receives a plurality of images at the same time, forms a plurality of image-timestamp tuples using the current system time as the timestamp, and stores each image-timestamp tuple in its corresponding image buffer queue.
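A hedged sketch of this stamping step, reusing the ImageBufferQueue class above (the mapping from camera identifiers to queues is an assumed detail, not specified by the patent):

```python
import time

def on_image_received(camera_id, image, queues):
    """Combine an incoming image with the platform's current system time
    into an image-timestamp tuple and store it in the matching queue.

    queues: dict mapping each camera model's identifier to its ImageBufferQueue.
    """
    timestamp_ms = int(time.time() * 1000)  # current system time, in milliseconds
    queues[camera_id].put(timestamp_ms, image)
```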
And step S203, synchronizing the time stamps of the images of the plurality of image buffer queues at the head of the queue at preset time intervals.
In an embodiment, the image at the head of the image buffer queue is the image that enters the image buffer queue first in the current state, and may also be referred to as a head of the queue image. When one head image is taken out from the image buffer queue, the next image in the image buffer queue becomes the new head image of the image buffer queue. For example, image 1 and image 2 are in the same image buffer queue, image 1 is at the head of the queue, and the position of image 2 in the queue is sequentially after image 1, then when image 1 is sent, image 2 will be at the head of the queue.
Synchronizing the timestamps of the images at the queue head positions of the image buffer queues means that, after synchronization, the head images of the image buffer queues are aligned in time. Since the head image is the first image in each image buffer queue to be sent to the corresponding machine position display end, the images subsequently sent by the image buffer queues to the machine position display ends are synchronized in time.
In an embodiment, synchronizing the timestamps of the images at the head of the image buffer queues is performed periodically, rather than for every image in the queues. On the basis of ensuring synchronous display of images at different machine position display ends, this avoids the large data processing load that a synchronization operation for every image would bring, and thereby avoids excessive pressure on the simulation test platform.
In one embodiment, the specific process of synchronizing the timestamps of the images at the head of the image buffer queues includes: first, acquiring the timestamps corresponding to the head images of the image buffer queues and determining the maximum timestamp among them; then, updating the head images of the other image buffer queues according to the maximum timestamp, so that the timestamp of the head image of each other image buffer queue is aligned with the maximum timestamp, where the other image buffer queues are all image buffer queues except the one corresponding to the maximum timestamp. A timestamp is aligned with the maximum timestamp either when the two are identical, or when the time deviation degree between them is less than or equal to a preset threshold. The head image of a queue is the image located at the queue head position of that image buffer queue.
It can be understood that a larger timestamp indicates that the image entered the queue later, and was therefore acquired later. Accordingly, the head image corresponding to the maximum timestamp is the most recently acquired among the head images of all the image buffer queues. Updating the head images of the other image buffer queues makes their acquisition times closer to the acquisition time of the head image corresponding to the maximum timestamp, thereby ensuring the time synchronism of the images sent to the different machine position display ends.
Optionally, the method for making the time deviation degree between the maximum timestamp and the acquisition time of the head-of-line image be less than or equal to a preset threshold specifically includes: firstly, determining the time deviation degree between the time stamp corresponding to the image of each other image buffer queue at the head of the queue and the maximum time stamp; and for the first image cache queue with the time deviation degree larger than the preset threshold value in other image cache queues, adjusting the time stamp of the image of the first image cache queue at the head position of the image cache queue, so that the time deviation degree between the time stamp of the image of the adjusted first image cache queue at the head position of the image cache queue and the maximum time stamp is smaller than or equal to the preset threshold value. Wherein the first image buffer queue may be any one of the other image buffer queues.
Illustratively, the timestamp corresponding to the head image of each of the other image buffer queues is compared with the maximum timestamp to obtain the time deviation degree corresponding to each of the other image buffer queues.
Illustratively, the timestamp of the image at the head-of-queue position of the first image buffer queue is adjusted by replacing that image. Specifically: the current head image of the first image buffer queue is deleted; then, based on the time deviation degree between the timestamp of the next head image and the maximum timestamp, deleting the head image and checking the deviation of the new head image is repeated until the time deviation degree between the timestamp of the head image of the first image buffer queue and the maximum timestamp is less than or equal to the preset threshold.
In an embodiment, the time deviation degree is used to characterize the difference between two timestamps, and may be, for example, a ratio or a difference. For example, the time deviation degree may be represented by a difference between the maximum timestamp and the timestamp of the corresponding head-of-line image, or may be represented by a ratio of the maximum timestamp to the timestamp of the corresponding head-of-line image.
Illustratively, when the time deviation degree is a difference value, the preset threshold is a synchronous fault-tolerant time difference, which is the synchronization time difference acceptable between different machine position display ends. Specifically, when the timestamp difference between the images received by different machine position display ends is less than or equal to the synchronous fault-tolerant time difference, the different machine position display ends can be considered to be displaying synchronously.
It can be understood that, when the time deviation degree is a ratio, the preset threshold is a value close to 1; the closer the preset threshold is to 1, the smaller the acceptable synchronization time difference between different machine position display ends.
Next, the process of synchronizing the timestamps of the images at the head of the image buffer queues is described, taking the time deviation degree to be a difference value.
Fig. 4 is a schematic flow chart illustrating the process of synchronizing the timestamps of the images at the head-of-queue positions of a plurality of image buffer queues according to an embodiment of the present application. In this embodiment, δ represents the synchronous fault-tolerant time difference between different image queues, i.e., the timestamps of the head images of different image queues are allowed to differ by δ, where δ is a positive number; s represents the length of the period at which all image queues are periodically checked, where s is a positive number. As shown in fig. 4, the time synchronization process specifically includes:
s401, waiting for S duration.
S402, obtaining the maximum value t-max of the time stamps of the head images of all the current image buffer queues.
S403, putting all other image cache queues with time stamps smaller than t-max into a sequence L, and traversing the sequence L; wherein q is an image buffer queue in the sequence L, and the timestamp of the corresponding head image is t.
S404, judging whether the difference value between t-max and t is larger than delta.
S405, if the judgment result is yes, discarding the head image of the image cache queue q, making t equal to the time stamp of the next head image of the image cache queue q, and jumping to S404.
S406, if not, judging whether the image cache queue q is the last image cache queue of the sequence L.
And S407, if not, enabling the next image cache queue of the sequence L to be q, and jumping to S404.
And S408, if the judgment result is yes, ending the flow.
In one embodiment, δ is in milliseconds and defaults to 1000 milliseconds in application; s is also in milliseconds and defaults to 3000 milliseconds.
In the embodiment, the value of δ is 1000 milliseconds; that is, during time synchronization, a synchronization operation (deleting the corresponding head image) is performed only when the difference between the maximum timestamp and the timestamp of another image buffer queue is greater than 1000 milliseconds. A difference in synchronism within 1000 milliseconds between images on different machine position display ends is not easily recognized by the human eye, so when the difference between a timestamp and the maximum timestamp is less than 1000 milliseconds, the image corresponding to that timestamp and the image corresponding to the maximum timestamp can be considered synchronous.
In the time synchronization process, setting the fault-tolerant time difference δ prevents too many images in the image buffer queues from being discarded, which would affect the frame rate. The automatic driving simulation test image processing method in the embodiment of the application therefore ensures frame rate stability while ensuring the synchronism of image transmission.
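Under the assumptions of the sketches above, the S401-S408 flow could look as follows, with δ and s taking the default values of 1000 and 3000 milliseconds mentioned here (a non-normative sketch, not the patent's reference implementation):

```python
import time

def synchronize_heads(queues, delta_ms=1000):
    """One pass of S402-S408 over a list of ImageBufferQueue objects."""
    non_empty = [q for q in queues if len(q) > 0]
    if not non_empty:
        return
    # S402: maximum head-image timestamp t-max over all current queues.
    t_max = max(q.head()[0] for q in non_empty)
    # S403: sequence L of all other queues whose head timestamp is below t-max.
    lagging = [q for q in non_empty if q.head()[0] < t_max]
    for q in lagging:
        # S404/S405: discard head images while they lag t-max by more than delta.
        while len(q) > 0 and t_max - q.head()[0] > delta_ms:
            q.pop_head()

def run_periodic_sync(queues, delta_ms=1000, s_ms=3000):
    """S401: wait for duration s, then synchronize; repeated for the whole test run."""
    while True:
        time.sleep(s_ms / 1000)
        synchronize_heads(queues, delta_ms)
```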
And S204, sending at least one image stored in the image buffer queue associated with each camera model to a machine position display end corresponding to each camera model, wherein the at least one image comprises an image positioned at the head of the image buffer queue after the time stamps are synchronized.
In the embodiment, the camera models correspond one-to-one to the machine position display ends; that is, one machine position display end is used only to display the images acquired by its corresponding camera model, so there are a plurality of machine position display ends in the embodiment of the application. The plurality of machine position display ends can be a plurality of display interfaces on one display terminal, or a plurality of blocks into which the same display interface is divided. Optionally, each machine position display end may also correspond to one display terminal.
The images in the image buffer queue are sequentially sent to the corresponding machine position display ends, and because the image buffer queue is a first-in first-out queue, the image at the head position of the image buffer queue is firstly sent to the corresponding machine position display ends.
It can be understood that, because the time synchronization is performed periodically, the time stamp synchronization is not performed on each head image of the image buffer queue, and therefore, some images are subjected to the time stamp synchronization processing, but some images are directly transmitted to the corresponding machine side display side without being subjected to the time stamp synchronization processing, among the images transmitted to the machine side display side by the image processing module.
In one embodiment, before sending the images in the image buffer queue to the corresponding machine position display end, the image processing module further performs graphics processing on the images, for example: the necessary graphic adjustment of size, color, etc. is performed, and the graphic adjustment is mainly determined according to the specific simulation test requirements, which is not described herein again.
In one embodiment, sending at least one image stored in the corresponding image buffer queue to the machine position display end specifically includes: determining the display frame rate of the machine position display end corresponding to each image buffer queue; and sending at least one image stored in each image buffer queue to the corresponding machine position display end according to that display frame rate. In an embodiment, before the image processing module sends images to a corresponding machine position display end, it needs to determine the display frame rate of the machine position display end corresponding to each image buffer queue, and it determines the speed of sending images to the corresponding machine position display end according to that display frame rate.
Alternatively, the display frame rate of a machine position display end may be determined based on historical experience, for example, from the image transmission rate of the corresponding camera model in historical tests.
Optionally, an embodiment of the present application further provides a method for determining the display frame rate of a machine position display end, which specifically includes the following steps: first, acquiring the number of images stored in each image buffer queue; then, determining the display frame rate of each machine position display end according to the number of images stored in the corresponding image buffer queue.
In the embodiment, the number of the images in each image cache queue is associated with the display frame rate of the corresponding machine position display end, so that the influence of the image acquisition and sending frequency of a camera model on the video display of the machine position display end can be reduced, and the machine position display end can perform video display on the images according to the corresponding frame rate, so that the technical problem of unstable image frame rate of the machine position display end is solved, and the user experience is improved.
Optionally, the relationship between the display frame rate and the number of images in the image buffer queue specifically includes: the larger the number of the images corresponding to the image buffer queue is, the larger the display frame rate of the machine position display end corresponding to the image buffer queue is.
In the embodiment, the display frame rate may be understood as the number of pictures displayed per second of video, in units of FPS (frames per second); for example, a frame rate of 5 FPS indicates that 5 pictures are displayed per second. That is, the larger the display frame rate, the more images the corresponding machine position display end shows in unit time.
The larger the number of the images in the cache queue is, the faster the corresponding camera model acquires the images and sends the images to the outside, so that the image processing module can accelerate the image sending speed, and the corresponding machine position display end can provide a smoother picture to the outside. When the number of the images in the buffer queue is relatively small, the speed of acquiring the images and sending the images to the outside by the camera model is relatively low, so that the image sending speed is reduced by the image processing module, and although the image display speed of the corresponding machine position display end is reduced, the stability and the fluency of image display are improved.
In one embodiment, the display frame rate is converted into a sending waiting time between adjacent images sent to the machine position display end, where the sending waiting time is the reciprocal of the frame rate. For example, if the display frame rate of the machine position display end corresponding to an image buffer queue is determined to be 15 FPS, the corresponding sending waiting time is 1/15 second; that is, the image processing module sends one image from the image buffer queue to the corresponding machine position display end every 1/15 second.
Optionally, the display frame rate may be proportional to the number of images in the image buffer queue, or may be set according to a specific value.
In one embodiment, the number of the images stored in the image cache queue is m, when m is greater than 0 and less than or equal to 10, the display frame rate is determined to be 5FPS, and the sending waiting time is 0.2 second; when m is more than 10 and less than or equal to 100, determining that the display frame rate is 10FPS and the sending waiting time is 0.1 second; when m is more than 100, the display frame rate is determined to be 15FPS, and the sending waiting time is 1/15 second.
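The piecewise rule of this embodiment and the reciprocal sending waiting time could be sketched as follows (function names are assumptions):

```python
def display_frame_rate(m: int) -> int:
    """Display frame rate in FPS, chosen from the number m of buffered images."""
    if m <= 10:
        return 5    # 0 < m <= 10: 5 FPS (0.2 s waiting time)
    if m <= 100:
        return 10   # 10 < m <= 100: 10 FPS (0.1 s waiting time)
    return 15       # m > 100: 15 FPS (1/15 s waiting time)

def send_wait_seconds(m: int) -> float:
    # The sending waiting time between adjacent images is the reciprocal
    # of the display frame rate.
    return 1.0 / display_frame_rate(m)
```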
Optionally, the reciprocal of the display frame rate of each machine position display end is smaller than the preset threshold; that is, each sending waiting time is smaller than the synchronous fault-tolerant time difference δ. For example, if δ is 1000 milliseconds and the maximum sending waiting time is 0.2 second, the sending waiting time is much smaller than δ. In other words, compared with the acceptable time difference between images shown at the same moment on different machine position display ends, the adjustment of the sending waiting time (corresponding to the frame rates of different machine position display ends) across different image buffer queues is relatively small and does not affect the synchronism of image display between different machine position display ends.
In the embodiment of the application, the timestamps of the images at the head of the image queue of the image cache queues are synchronized by setting the image cache queues corresponding to the image sensor models, so that the images transmitted to the corresponding machine position display end by the image cache queues are synchronized in time, and the images acquired by the camera model are ensured to be synchronously displayed at the corresponding machine position display end.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Fig. 5 shows a block diagram of an automatic driving simulation test image processing apparatus provided in an embodiment of the present application, corresponding to the automatic driving simulation test image processing method of the above embodiment, and only the relevant parts of the embodiment of the present application are shown for convenience of description. Referring to fig. 5, the automated driving simulation test image processing apparatus 500 includes: acquisition unit 501, queue determination unit 502, synchronization unit 503, and transmission unit 504:
the acquiring unit 501 is configured to acquire multiple images of a target vehicle model controlled by an automatic driving system, acquired by each camera model in multiple camera models in the same time period, in a simulation environment, where different camera models are used to acquire images from different point locations;
a queue determining unit 502, configured to determine an image buffer queue associated with each camera model according to a plurality of images acquired by each camera model, where one image buffer queue includes a plurality of images sequentially stored in the image buffer queue according to a time sequence and acquired by the corresponding camera model, each image has a time stamp, and the image buffer queue is a first-in first-out queue;
a synchronizing unit 503, configured to synchronize timestamps of images of the plurality of image buffer queues at the head of the queue at preset time intervals;
a sending unit 504, configured to send at least one image stored in the image buffer queue associated with each camera model to the machine-side display end corresponding to each camera model, where the at least one image includes an image located at the head of the image buffer queue after the timestamp synchronization.
Optionally, the queue determining unit 502 is further configured to determine a timestamp of the image according to a time when the automatic driving simulation test platform receives the corresponding image.
Optionally, the synchronizing unit 503 synchronizing the timestamps of the images at the head-of-queue positions of the plurality of image buffer queues includes:
determining a maximum time stamp from the time stamps corresponding to the images of the image cache queues at the head of the image cache queues;
and updating the images at the head of the image queue in each other image cache queue according to the maximum timestamp so that the time deviation degree between the timestamp of the image at the head of the image queue in each other image cache queue and the maximum timestamp is less than or equal to a preset threshold, wherein the other image cache queues are the image cache queues except the image cache queue corresponding to the maximum timestamp in the plurality of image cache queues.
Optionally, the synchronizing unit 503 updating, according to the maximum timestamp, the image at the head-of-queue position in each other image buffer queue, so that the time deviation degree between the timestamp of that image and the maximum timestamp is less than or equal to a preset threshold, includes:
determining the time deviation between the timestamp corresponding to the image at the head-of-queue position of each other image buffer queue and the maximum timestamp;
and for a first image buffer queue, among the other image buffer queues, whose time deviation is greater than the preset threshold, adjusting the timestamp of the image at its head-of-queue position, so that the time deviation between the adjusted timestamp and the maximum timestamp is less than or equal to the preset threshold.
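A minimal sketch of this synchronization, assuming Python and a 50 ms threshold, is given below. The disclosure leaves open how the head image is "updated" or its timestamp "adjusted"; advancing a lagging queue past stale frames until its head is within the threshold is one plausible reading, not the only one.

```python
from collections import deque

def synchronize_heads(queues: dict, threshold_s: float = 0.05) -> None:
    """Align head-of-queue timestamps across all image buffer queues.

    `queues` maps a camera id to a deque of (timestamp, frame) tuples kept
    in time order; `threshold_s` stands in for the preset threshold.
    """
    heads = {cam: q[0][0] for cam, q in queues.items() if q}
    if not heads:
        return
    # Step 1: the largest head-of-queue timestamp serves as the reference.
    t_max = max(heads.values())
    # Step 2: in every other queue whose head lags the reference by more
    # than the threshold, advance past stale frames until the deviation of
    # the new head is within the threshold (keeping at least one frame).
    for q in queues.values():
        while len(q) > 1 and t_max - q[0][0] > threshold_s:
            q.popleft()  # discard the stale head; the next frame becomes head
```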
Optionally, the sending unit 504 being configured to send at least one image stored in the image buffer queue associated with each camera model to the machine position display end corresponding to that camera model includes:
determining the display frame rate of the machine position display end corresponding to each image buffer queue;
and sending at least one image stored in each image buffer queue to the corresponding machine position display end according to the display frame rate of the machine position display end corresponding to that image buffer queue.
Optionally, the sending unit 504 being configured to determine the display frame rate of the machine position display end corresponding to each image buffer queue includes:
acquiring the number of images stored in each image buffer queue;
and determining the display frame rate corresponding to each machine position display end according to the number of images stored in each image buffer queue.
Optionally, the larger the number of images stored in an image buffer queue, the higher the display frame rate of the corresponding machine position display end.
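For illustration, one monotone mapping satisfying this relation is sketched below; the disclosure fixes only that the rate grows with the backlog, so the linear ramp and every numeric constant here are assumptions.

```python
from collections import deque

def display_frame_rate(queue_len: int, base_fps: float = 30.0,
                       step_fps: float = 2.0, max_fps: float = 60.0) -> float:
    # More buffered images -> higher display frame rate, so a backlogged
    # queue drains faster; the linear ramp itself is our assumption.
    return min(max_fps, base_fps + step_fps * queue_len)

def send_batch(queue: deque, sink, interval_s: float) -> None:
    # Send as many head frames as the display end can show within one
    # sending interval at its current frame rate.
    n = int(display_frame_rate(len(queue)) * interval_s)
    for _ in range(min(n, len(queue))):
        sink(queue.popleft())
```

With these sketches, a driver could call `send_batch(queue, sink, interval_s=0.1)` once per sending cycle, so a fuller queue is drained at a proportionally higher rate.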
It should be noted that, for the information interaction, execution process, and other details of the above devices/units, the specific functions and technical effects are based on the same concept as the method embodiment of the present application; for details, reference may be made to the method embodiment, which is not repeated herein.
Based on the same inventive concept, an embodiment of the application further provides a simulation test platform, which may be a terminal device. As shown in Fig. 6, the terminal device 600 of this embodiment includes: a processor 601, a memory 602, and a computer program 604 stored in the memory 602 and executable on the processor 601. The computer program 604 may be executed by the processor 601 to generate instructions 603, and the processor 601 may implement the steps in the embodiments of the automatic driving simulation test image processing method according to the instructions 603. Alternatively, when executing the computer program 604, the processor 601 implements the functions of each module/unit in the apparatus embodiments described above, for example, the functions of the acquisition unit 501 to the sending unit 504 shown in Fig. 5.
Illustratively, the computer program 604 may be divided into one or more modules/units, which are stored in the memory 602 and executed by the processor 601 to carry out the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, and the instruction segments are used to describe the execution process of the computer program 604 in the terminal device 600.
Those skilled in the art will appreciate that Fig. 6 is only an example of the terminal device 600 and does not constitute a limitation on it; the terminal device 600 may include more or fewer components than those shown, or combine some components, or include different components. For example, the terminal device 600 may further include an input/output device, a network access device, a bus, and the like.
The processor 601 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 602 may be an internal storage unit of the terminal device 600, such as a hard disk or a memory of the terminal device 600. The memory 602 may also be an external storage device of the terminal device 600, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash card provided on the terminal device 600. Further, the memory 602 may include both an internal storage unit and an external storage device of the terminal device 600. The memory 602 is used to store computer programs and other programs and data required by the terminal device 600, and may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
The embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the steps in the above-mentioned method embodiments are implemented.
The embodiments of the present application further provide a computer program product which, when run on a server, enables the server to implement the steps in the above method embodiments.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the technical solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be through some interfaces, indirect coupling or communication connection of devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or may also be implemented in the form of a software functional unit.
The integrated module/unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the processes in the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include at least: any entity or device capable of carrying the computer program code to the apparatus/server, a recording medium, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, such as a USB flash disk, a removable hard disk, a magnetic disk, or an optical disk. In certain jurisdictions, in accordance with legislation and patent practice, the computer-readable medium may not be an electrical carrier signal or a telecommunications signal.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. An automatic driving simulation test image processing method, the method comprising:
acquiring, in a simulation environment, a plurality of images of a target vehicle model controlled by an automatic driving system, the images being acquired by each camera model among a plurality of camera models in the same time period, wherein different camera models are used to acquire images from different point locations;
determining an image buffer queue associated with each camera model according to the plurality of images acquired by that camera model, wherein one image buffer queue comprises a plurality of images which are acquired by the corresponding camera model and stored in the image buffer queue sequentially in time order, each image has a timestamp, and the image buffer queue is a first-in first-out queue;
synchronizing, at preset time intervals, the timestamps of the images at the head-of-queue positions of the plurality of image buffer queues;
and sending at least one image stored in the image buffer queue associated with each camera model to a machine position display end corresponding to that camera model, wherein the at least one image comprises the image located at the head-of-queue position of the image buffer queue after the timestamp synchronization.
2. The method of claim 1, wherein the timestamp is determined based on the time at which the corresponding image was received by the automatic driving simulation test platform.
3. The method of claim 1, wherein said synchronizing the timestamps of said images at the head-of-queue positions of the plurality of image buffer queues comprises:
determining a maximum timestamp from the timestamps corresponding to the images at the head-of-queue position of each image buffer queue;
and updating the image at the head-of-queue position in each other image buffer queue according to the maximum timestamp, so that the time deviation between the timestamp of the image at the head-of-queue position in each other image buffer queue and the maximum timestamp is less than or equal to a preset threshold, wherein the other image buffer queues are the queues, among the plurality of image buffer queues, other than the image buffer queue corresponding to the maximum timestamp.
4. The method as claimed in claim 3, wherein said updating the image at the head-of-queue position in each other image buffer queue according to the maximum timestamp, so that the time deviation between the timestamp of the image at the head-of-queue position in each other image buffer queue and the maximum timestamp is less than or equal to a preset threshold, comprises:
determining the time deviation between the timestamp corresponding to the image at the head-of-queue position of each other image buffer queue and the maximum timestamp;
and for a first image buffer queue, among the other image buffer queues, whose time deviation is greater than the preset threshold, adjusting the timestamp of the image at its head-of-queue position, so that the time deviation between the adjusted timestamp and the maximum timestamp is less than or equal to the preset threshold.
5. The method according to any one of claims 1 to 4, wherein the sending at least one image stored in the image buffer queue associated with each camera model to the corresponding machine position display end of each camera model comprises:
determining the display frame rate of the machine position display end corresponding to each image buffer queue;
and sending at least one image stored in each image buffer queue to the corresponding machine position display end according to the display frame rate of the machine position display end corresponding to that image buffer queue.
6. The method as claimed in claim 5, wherein said determining the display frame rate of the machine position display end corresponding to each image buffer queue comprises:
acquiring the number of images stored in each image buffer queue;
and determining the display frame rate corresponding to each machine position display end according to the number of images stored in each image buffer queue.
7. The method as claimed in claim 6, wherein the larger the number of images stored in the image buffer queue, the higher the display frame rate of the corresponding machine position display end.
8. An automatic driving simulation test image processing apparatus, characterized in that the apparatus comprises:
an acquisition unit, configured to acquire, in a simulation environment, a plurality of images of a target vehicle model controlled by an automatic driving system, the images being acquired by each camera model among a plurality of camera models in the same time period, wherein different camera models are used to acquire images from different point locations;
a queue determining unit, configured to determine an image buffer queue associated with each camera model according to the plurality of images acquired by that camera model, wherein one image buffer queue comprises a plurality of images which are acquired by the corresponding camera model and stored in the image buffer queue sequentially in time order, each image has a timestamp, and the image buffer queue is a first-in first-out queue;
a synchronizing unit, configured to synchronize, at preset time intervals, the timestamps of the images at the head-of-queue positions of the plurality of image buffer queues;
and a sending unit, configured to send at least one image stored in the image buffer queue associated with each camera model to a machine position display end corresponding to that camera model, wherein the at least one image comprises the image located at the head-of-queue position of the image buffer queue after the timestamp synchronization.
9. A simulation test platform comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 7.
CN202210896539.4A 2022-07-27 2022-07-27 Automatic driving simulation test image processing method and device Pending CN115345771A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210896539.4A CN115345771A (en) 2022-07-27 2022-07-27 Automatic driving simulation test image processing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210896539.4A CN115345771A (en) 2022-07-27 2022-07-27 Automatic driving simulation test image processing method and device

Publications (1)

Publication Number Publication Date
CN115345771A true CN115345771A (en) 2022-11-15

Family

ID=83951060

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210896539.4A Pending CN115345771A (en) 2022-07-27 2022-07-27 Automatic driving simulation test image processing method and device

Country Status (1)

Country Link
CN (1) CN115345771A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117409397A (en) * 2023-12-15 2024-01-16 河北远东通信系统工程有限公司 Real-time portrait comparison method, device and system based on position probability
CN117409397B (en) * 2023-12-15 2024-04-09 河北远东通信系统工程有限公司 Real-time portrait comparison method, device and system based on position probability

Similar Documents

Publication Publication Date Title
US9185269B2 (en) Imaging device, information processing device, information processing method, and method for synchronizing frame data output
CN104539929B (en) Stereo-image coding method and code device with motion prediction
CN106066701B (en) A kind of AR and VR data processing equipment and method
JP2013511176A (en) Camera synchronization in multi-view session capture
CN111736169A (en) Data synchronization method, device and system
CN115345771A (en) Automatic driving simulation test image processing method and device
CN114125301B (en) Shooting delay processing method and device for virtual reality technology
CN111565298A (en) Video processing method, device, equipment and computer readable storage medium
CN113923354B (en) Video processing method and device based on multi-frame images and virtual background shooting system
CN109302567A (en) Camera image low latency synchronization system and image low latency synchronous method
CN114866829A (en) Synchronous playing control method and device
KR20080006925A (en) The method and system to get the frame data of moving shot with camera on a vehicle and the location data from location base service or gps and the direction data of the vehicle to send to server through wireless internet by real time and to be used that by another vehicle
CN114697645A (en) VR equipment testing method, device, equipment, medium and program product
CN111988535A (en) System and method for optically positioning fusion picture
KR101649754B1 (en) Control signal transmitting method in distributed system for multiview cameras and distributed system for multiview cameras
CN105991952B (en) Filming apparatus, image-reproducing method and recording medium
CN112166594A (en) Video processing method and device
CN112446961A (en) Scene reconstruction system and method
CN111093041A (en) Novel automobile and automobile image processing system thereof
JP2006025163A (en) Photographing system
CN111949114A (en) Image processing method and device and terminal
CN113992885B (en) Data synchronization method and device
CN117278733B (en) Display method and system of panoramic camera in VR head display
JP2023177556A (en) Imaging apparatus and control method for the same
CN114339157B (en) Multi-camera real-time splicing system and method with adjustable observation area

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination