CN110162089A - Driverless driving simulation method and device - Google Patents
Driverless driving simulation method and device
- Publication number
- CN110162089A CN201910462332.4A CN201910462332A
- Authority
- CN
- China
- Prior art keywords
- simulation
- data
- detection range
- data set
- vehicle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
Abstract
This application discloses a driverless driving simulation method and device. In the method, the pre-stored sensor data sets corresponding to one simulated environment are read, and a simulation data set is chosen from the sensor data sets according to the detection range of the simulated vehicle's sensor and the recorded detection ranges of the real-vehicle sensors corresponding to each sensor data set. Then, for each unit detection range of the simulated vehicle's sensor, the sensor data corresponding to that unit detection range is determined among the sensor data contained in the simulation data set; simulation data are obtained according to the sensor data corresponding to each unit detection range, and the simulated vehicle is simulated according to the simulation data. The method requires no three-dimensional modeling and does not require the state of the simulated vehicle to match that of a real vehicle during simulation, which improves the flexibility of simulation. At the same time, the chosen simulation data set is real data collected by a real vehicle, which effectively guarantees the authenticity of the simulation.
Description
Technical field
This application relates to the technical field of driverless driving, and in particular to a driverless driving simulation method and device.
Background art
Current driverless driving simulation methods mainly fall into two kinds: one is three-dimensional modeling simulation, the other is real-data playback simulation.
Three-dimensional modeling simulation requires building a 1:1 three-dimensional model of the region to be simulated in advance, yielding a three-dimensional virtual simulation environment. During simulation, according to the current spatial positions and attitudes of sensors such as cameras and radars, the sensor data they would collect are generated from the three-dimensional modeling data of the virtual environment, and these sensor data are fed into the automatic driving system to obtain the simulation result.
Real-data playback simulation requires first driving a vehicle through the region to be simulated along a planned route, collecting images, point clouds, and other information along the driving trajectory with the on-board sensors such as cameras and radars, and saving the collected sensor data in a database. During simulation, the sensor data are read from the database and fed into the automatic driving system in chronological order to obtain the simulation result.
However, when three-dimensional modeling simulation builds the three-dimensional virtual simulation environment, automatic modeling software yields a model of low precision that current mainstream 3D engines render poorly, while manual modeling involves a heavy workload. In addition, the sensor data output by this method during simulation are overly idealized and deviate considerably from real sensor data, so the simulation results are less accurate.
As for real-data playback simulation, its sensor data are collected with the real vehicle in one relatively fixed state (e.g., vehicle speed, driving trajectory, sensor orientation), so the state of the simulated vehicle must match that of the real vehicle during simulation, which reduces the flexibility and validity of the simulation.
Summary of the invention
The embodiments of the present application provide a driverless driving simulation method and device to partly solve the above problems in the prior art.
The application adopts the following technical solutions:
This application provides a driverless driving simulation method, comprising:
reading pre-stored sensor data sets corresponding to one simulated environment, wherein the sensor data in different sensor data sets were collected in the simulated environment by real vehicles in different states;
choosing a simulation data set from the sensor data sets according to the detection range of the simulated vehicle's sensor and the recorded detection ranges of the real-vehicle sensors corresponding to each sensor data set;
for each unit detection range of the simulated vehicle's sensor, determining, among the sensor data contained in the simulation data set, the sensor data corresponding to that unit detection range;
obtaining simulation data according to the sensor data corresponding to each unit detection range, and simulating the simulated vehicle according to the simulation data.
Optionally, each sensor data set contains point cloud data and image data.
Optionally, pre-storing the sensor data sets corresponding to one simulated environment specifically comprises:
for each state of a real vehicle, obtaining the point cloud data and image data collected while the real vehicle travels in that state in the simulated environment;
determining the space coordinates of the image data according to the space coordinates of the point cloud data, the detection range of the radar that collected the point cloud data, and the detection range of the camera that collected the image data;
storing the point cloud data, the image data, the space coordinates of the point cloud data, and the space coordinates of the image data.
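The storage step above could be organised, for instance, as one record per real-vehicle state. The record layout, field names, and sample values below are illustrative assumptions, not a schema prescribed by the application:

```python
from dataclasses import dataclass
from typing import List, Tuple

Coord = Tuple[float, float, float]

@dataclass
class SensorDataSet:
    state_id: str              # which real-vehicle state produced this set
    point_cloud: List[Coord]   # radar samples, already converted to space coordinates
    image: List[List[int]]     # image data (grayscale rows, kept tiny for brevity)
    pixel_coords: List[Coord]  # space coordinate derived for each pixel

# One stored record: a single radar point and a one-pixel image whose
# space coordinate was resolved to that radar point.
record = SensorDataSet(
    state_id="vehicle-1/run-1",
    point_cloud=[(2.0, 0.0, 0.0)],
    image=[[128]],
    pixel_coords=[(2.0, 0.0, 0.0)],
)
```

A database would then hold many such records per simulated environment, one for each real-vehicle state.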
Optionally, determining the space coordinates of the image data according to the space coordinates of the point cloud data, the detection range of the radar that collected the point cloud data, and the detection range of the camera that collected the image data specifically comprises:
for each pixel in the image data, determining the space ray corresponding to the pixel according to the detection range of the camera;
judging whether the space ray corresponding to the pixel intersects the detection range of the radar;
if an intersection point exists, determining the space coordinate of the point cloud data corresponding to the intersection point as the space coordinate of the pixel;
otherwise, determining the space coordinate of the pixel according to the state parameters of the real vehicle in that state and the image data.
Optionally, choosing a simulation data set from the sensor data sets according to the detection range of the simulated vehicle's sensor and the recorded detection ranges of the real-vehicle sensors corresponding to each sensor data set specifically comprises:
determining the overlap between the detection range of the simulated vehicle's sensor and the detection range of the real-vehicle sensor corresponding to each sensor data set;
choosing a simulation data set from the sensor data sets according to these overlaps.
Optionally, for each unit detection range of the simulated vehicle's sensor, determining, among the sensor data contained in the simulation data set, the sensor data corresponding to that unit detection range specifically comprises:
determining the detection range of the radar corresponding to the point cloud data contained in the simulation data set;
for each unit detection range of the simulated vehicle's camera, determining the intersection point of the space ray corresponding to that unit detection range with the determined detection range of the radar;
determining the space coordinate of the point cloud data corresponding to the intersection point;
determining, in the image data contained in the simulation data set, the pixel corresponding to that space coordinate as the sensor data detected by the simulated vehicle's camera in that unit detection range.
Optionally, for each unit detection range of the simulated vehicle's sensor, determining, among the sensor data contained in the simulation data set, the sensor data corresponding to that unit detection range specifically comprises:
determining the detection range of the radar corresponding to the point cloud data contained in the simulation data set;
for each unit detection range of the simulated vehicle's radar, determining the intersection point of the space ray corresponding to that unit detection range with the detection range of the radar corresponding to the point cloud data contained in the simulation data set;
taking the point cloud data corresponding to the intersection point as the sensor data detected by the simulated vehicle's radar in that unit detection range.
This application provides a driverless driving simulation device, comprising:
a reading module for reading pre-stored sensor data sets corresponding to one simulated environment, wherein the sensor data in different sensor data sets were collected in the simulated environment by real vehicles in different states;
a choosing module for choosing a simulation data set from the sensor data sets according to the detection range of the simulated vehicle's sensor and the recorded detection ranges of the real-vehicle sensors corresponding to each sensor data set;
a determining module for determining, for each unit detection range of the simulated vehicle's sensor, the sensor data corresponding to that unit detection range among the sensor data contained in the simulation data set;
a simulation module for obtaining simulation data according to the sensor data corresponding to each unit detection range and simulating the simulated vehicle according to the simulation data.
This application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the above driverless driving simulation method.
This application provides an electronic device comprising a memory, a processor, and a computer program stored on the memory and runnable on the processor, the processor implementing the above driverless driving simulation method when executing the program.
At least one of the above technical solutions adopted by the application can achieve the following beneficial effects: as can be seen from the above, the pre-stored sensor data sets corresponding to one simulated environment are read; a simulation data set is chosen from the sensor data sets according to the detection range of the simulated vehicle's sensor and the recorded detection ranges of the real-vehicle sensors corresponding to each sensor data set; then, for each unit detection range of the simulated vehicle's sensor, the sensor data corresponding to that unit detection range is determined among the sensor data contained in the simulation data set; simulation data are obtained according to the sensor data corresponding to each unit detection range, and the simulated vehicle is simulated according to the simulation data.
Compared with the prior art, the driverless driving simulation method provided by the present application requires no three-dimensional modeling, and when simulating the simulated vehicle it does not require the state of the simulated vehicle to match that of a real vehicle, which effectively improves the flexibility of simulation. At the same time, given the detection range of the simulated vehicle's sensor, it can be determined which sensor data sets collected by real vehicles correspond to that detection range. That is, the simulation data set selected through the detection range of the simulated vehicle's sensor is actually real data collected by real vehicles, so the simulated vehicle need not carry out real data collection during simulation; hence, while greatly improving simulation efficiency, the authenticity of the simulation is also effectively guaranteed.
Detailed description of the invention
The drawings described herein are provided for a further understanding of the present application and constitute a part of this application; the illustrative embodiments of the application and their description are used to explain the application and do not constitute an undue limitation on it. In the drawings:
Fig. 1 is a schematic flow diagram of a driverless driving simulation method in an embodiment of the present application;
Fig. 2 is a schematic diagram, provided by an embodiment of the present application, of determining the space coordinate of a pixel from the pixel's space ray;
Fig. 3 is a schematic diagram, provided by an embodiment of the present application, of choosing a simulation data set by the overlap between the detection range of the simulated vehicle's sensor and the detection ranges of the real-vehicle sensors corresponding to each sensor data set;
Fig. 4 is a schematic diagram, provided by an embodiment of the present application, of choosing a simulation data set by the priority of each real-vehicle sensor's detection range relative to the detection range of the simulated vehicle's sensor;
Fig. 5 is a schematic diagram of a driverless driving simulation device provided by the present application;
Fig. 6 is a schematic diagram of the electronic device, provided by the present application, corresponding to Fig. 1.
Detailed description of the embodiments
To make the purposes, technical schemes, and advantages of the application clearer, the technical scheme of the application is described clearly and completely below in conjunction with specific embodiments of the application and the corresponding drawings. Obviously, the described embodiments are only some of the embodiments of the application, not all of them. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the application without creative work fall within the protection scope of this application.
The technical schemes provided by the embodiments of the present application are described in detail below in conjunction with the drawings.
Fig. 1 is a schematic flow diagram of a driverless driving simulation method in an embodiment of the present application, which specifically includes the following steps:
S101: read the pre-stored sensor data sets corresponding to one simulated environment, wherein the sensor data in different sensor data sets were collected in the simulated environment by real vehicles in different states.
In the embodiments of the present application, multiple sensor data sets collected in the simulated environment by real vehicles are stored in a database in advance. While driving through the simulated environment, a real vehicle collects data with the radar and camera mounted on it: the data collected by the radar are point cloud data, and the data collected by the camera are image data. So a real vehicle's sensor data set contains the point cloud data and image data collected by that real vehicle.
In practice, different real vehicles driving in the same simulated environment may be in different states. A state here refers to the condition, in the simulated environment, of the real vehicle itself and of the sensors mounted on it, and it can be characterized by parameters such as the placement angle of the camera on the real vehicle, the detection range of its radar, and the vehicle's driving trajectory in the simulated environment. Evidently, these parameters affect which sensor data the real vehicle collects in the simulated environment: different driving trajectories, camera placement angles, and radar detection ranges yield different point cloud data and image data.
So, for one simulated environment, the point cloud data and image data collected in it by different real vehicles can each be saved as one sensor data set collected in that simulated environment. In other words, one sensor data set in fact corresponds to the sensor data collected in the simulated environment by a real vehicle in one state.
In the embodiments of the present application, the simulated environment can refer to the actual environment in which a real vehicle carries out data collection. Since the point cloud data and image data (i.e., the sensor data sets) collected by a real vehicle differ across simulated environments, different simulated environments and their sensor data sets can be saved in the database in correspondence; that is, each simulated environment corresponds to multiple sensor data sets. On this basis, when simulating a simulated vehicle, the simulated environment in which the simulated vehicle is located during simulation can first be determined, the pre-stored sensor data sets corresponding to that simulated environment can be read from the database, and in the subsequent process a sensor data set can be chosen from them as the simulation data set according to the detection range of the simulated vehicle's sensor.
It should be noted that the executing subject of the simulation may be a terminal device such as a computer or a dedicated simulation device, or may be a server. For ease of description, the driverless driving simulation method provided by the present application is explained below with a terminal device such as a computer as the executing subject.
In the embodiments of the present application, besides the point cloud data and image data collected in the simulated environment by a real vehicle, a sensor data set can also record the space coordinates of the point cloud data and of the image data. Specifically, for each state of a real vehicle, the terminal device can obtain the point cloud data and image data collected while the real vehicle travels in that state in the simulated environment, and then determine the space coordinates of the image data according to the space coordinates of the point cloud data, the detection range of the radar when the point cloud data were collected, and the detection range of the camera when the image data were collected.
The space coordinates of the point cloud data can be converted directly from the point cloud data; the specific conversion is a conventional technique and is not detailed here. The space coordinates of the image data, however, are obtained by conversion based on the space coordinates of the point cloud data in combination with the detection ranges of the radar and the camera.
Specifically, for each pixel in the image data, the terminal device can determine the space ray corresponding to the pixel according to the detection range of the real vehicle's camera. Then, the terminal device can judge whether the space ray corresponding to the pixel intersects the detection range of the real vehicle's radar; if an intersection point exists, the space coordinate of the point cloud data corresponding to the intersection point can be determined as the space coordinate of the pixel; if not, the space coordinate of the pixel can be determined according to the state parameters of the real vehicle in that state and the image data.
For each pixel in the image data, the space ray corresponding to the pixel can be understood as the ray that starts from the camera as source point and passes through the pixel. After obtaining the space ray, the terminal device can judge whether it intersects the plane in which the radar's detection range lies and whether the intersection point falls inside the radar's detection range (in effect, determining whether the space ray can land in the radar's detection range, and on which point in that range it lands). If the intersection point lies in the radar's detection range, the space coordinate of the point cloud data corresponding to the intersection point can be determined as the space coordinate of the pixel, as shown in Fig. 2.
Fig. 2 is a schematic diagram, provided by an embodiment of the present application, of determining the space coordinate of a pixel from the pixel's space ray.
Suppose the terminal device is to determine the space coordinate of pixel A in the image data. It can take the camera as the source point, take the line from the source point through pixel A as pixel A's space ray, and then cast that space ray toward the plane where the radar's detection range lies. As can be seen from Fig. 2, the space ray intersects the radar's detection range at point B, so the terminal device can determine the space coordinate of the point cloud data corresponding to intersection point B as the space coordinate of pixel A.
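The Fig. 2 procedure can be sketched as a ray-plane intersection. This is a minimal sketch under the assumption that the radar's detection range lies in a single plane (here the plane z = 0); the function name, camera pose, and numbers are illustrative, not taken from the application:

```python
def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    """Return the point where the ray hits the plane, or None if it misses."""
    denom = sum(d * n for d, n in zip(direction, plane_normal))
    if abs(denom) < 1e-9:
        return None  # ray is parallel to the radar plane
    diff = [p - o for p, o in zip(plane_point, origin)]
    t = sum(d * n for d, n in zip(diff, plane_normal)) / denom
    if t < 0:
        return None  # intersection would lie behind the camera
    return tuple(o + t * d for o, d in zip(origin, direction))

# Camera 2 m above the plane of the radar's detection range (z = 0);
# pixel A's space ray points forward and down.
camera = (0.0, 0.0, 2.0)
ray_dir = (1.0, 0.0, -1.0)
hit = ray_plane_intersection(camera, ray_dir, (0.0, 0.0, 0.0), (0.0, 0.0, 1.0))
# hit plays the role of point B: the point-cloud coordinate found there
# becomes pixel A's space coordinate.
```

If the returned point also falls inside the radar's detection range, its point-cloud coordinate is assigned to the pixel, as in the Fig. 2 example above.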
It should be noted that, owing to the nature of point cloud data, not every point within the radar's detection range necessarily has corresponding point cloud data. So, if the terminal device determines that no corresponding point cloud data exist at the intersection point, it can determine the space coordinate of the pixel in other ways.
For example, among all points within the radar's detection range that have point cloud data, the terminal device can choose the point nearest the intersection point and take the space coordinate of that point's point cloud data as the space coordinate of the pixel. For another example, among all points within the radar's detection range that have point cloud data, the terminal device can choose the three points closest to the intersection point and determine their centroid; if the centroid has corresponding point cloud data, the space coordinate of that point cloud data can be taken as the space coordinate of the pixel. Other ways are not detailed here.
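The two fallbacks just described can be sketched as follows; the function names, the sample cloud, and the intersection point are illustrative assumptions:

```python
def nearest_point(intersection, cloud):
    """Point-cloud sample closest to the intersection point (first fallback)."""
    return min(cloud, key=lambda p: sum((a - b) ** 2 for a, b in zip(p, intersection)))

def centroid_of_k_nearest(intersection, cloud, k=3):
    """Centroid of the k closest point-cloud samples (second fallback)."""
    ranked = sorted(cloud, key=lambda p: sum((a - b) ** 2 for a, b in zip(p, intersection)))
    picked = ranked[:k]
    return tuple(sum(axis) / len(picked) for axis in zip(*picked))

# Intersection point B has no point-cloud sample at exactly that spot.
cloud = [(2.1, 0.0, 0.0), (1.8, 0.3, 0.0), (5.0, 5.0, 0.0), (2.0, -0.2, 0.0)]
b = (2.0, 0.0, 0.0)
fallback_1 = nearest_point(b, cloud)           # closest single sample
fallback_2 = centroid_of_k_nearest(b, cloud)   # centroid of the 3 closest
```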
If it is determined that the space ray corresponding to the pixel does not intersect the radar's detection range, the terminal device can determine the space coordinate of the pixel through image analysis. Since the geographical coordinates of the real vehicle when it collected the image data containing the pixel can be determined, the terminal device can, based on those geographical coordinates and the state parameters of the real vehicle in the simulated environment (e.g., the placement angle of the camera on the real vehicle, the detection range of the radar), analyze the image data and thereby determine the space coordinate of the pixel. Specifically, the terminal device can determine the depth of field of the image data and then determine the space coordinate of the pixel by image analysis, according to the view-frustum matrix of the camera, the offset matrix of the camera relative to the real vehicle, the depth of field, the direction of the pixel, and the geographical coordinates at which the real vehicle collected the image data. The specific process is a conventional technique and is not detailed here.
S102: choose a simulation data set from the sensor data sets according to the detection range of the simulated vehicle's sensor and the recorded detection ranges of the real-vehicle sensors corresponding to each sensor data set.
In the embodiments of the present application, each sensor data set recorded in the database is in fact the sensor data collected by a real vehicle in one state, and the state of the real vehicle in fact reflects the detection range of the real vehicle's sensor. Since the placement angle of the camera on the real vehicle, the detection range of its radar, and its driving trajectory determine which collection region in the simulated environment the real vehicle covers while collecting data, the detection range of a real-vehicle sensor here refers to the actual collection region covered in the simulated environment when the real vehicle collects data in that state.
In the embodiments of the present application, the database holds multiple sensor data sets for the simulated environment, and the detection ranges of the real-vehicle sensors corresponding to them may differ. So, when simulating the simulated vehicle, it is necessary to choose which sensor data set or sets to use as the simulation data set according to the detection range of the simulated vehicle's sensor in the simulated environment. That is, the terminal device needs to determine, according to the detection range of the simulated vehicle's sensor, which sensor data sets collected by real vehicles to choose for simulating the simulated vehicle. The detection range of the simulated vehicle's sensor mentioned here is essentially the same notion as the detection range of a real-vehicle sensor above, namely the collection region in the simulated environment corresponding to the simulated vehicle's sensor in a given state.
Specifically, the terminal device can determine the overlap between the detection range of the simulated vehicle's sensor and the detection range of the real-vehicle sensor corresponding to each sensor data set, and then choose a simulation data set from the sensor data sets of the simulated environment according to these overlaps, as shown in Fig. 3.
Fig. 3 is a schematic diagram, provided by an embodiment of the present application, of choosing a simulation data set by the overlap between the detection range of the simulated vehicle's sensor and the detection ranges of the real-vehicle sensors corresponding to each sensor data set.
Suppose three real vehicles collected three sensor data sets in the simulated environment along different trajectories, corresponding respectively to detection ranges A, B, and C of the three real vehicles' sensors. From the parameters characterizing the state of the simulated vehicle (its driving trajectory, the placement angle of its camera, the detection range of its radar, etc.), the detection range of the simulated vehicle's sensor can be determined to be D. As can be seen from Fig. 3, detection range D of the simulated vehicle's sensor overlaps most with detection range A of a real-vehicle sensor, so the sensor data set corresponding to detection range A can be chosen as the simulation data set.
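The overlap-based selection of Fig. 3 can be sketched by modelling each detection range as an axis-aligned rectangle (x0, y0, x1, y1); the rectangle model and the concrete coordinates are illustrative assumptions:

```python
def overlap_area(r1, r2):
    """Area of intersection of two axis-aligned rectangles (x0, y0, x1, y1)."""
    w = min(r1[2], r2[2]) - max(r1[0], r2[0])
    h = min(r1[3], r2[3]) - max(r1[1], r2[1])
    return max(w, 0) * max(h, 0)

# Detection ranges A, B, C of three real-vehicle sensors, and range D of
# the simulated vehicle's sensor.
real_ranges = {"A": (0, 0, 10, 10), "B": (8, 0, 18, 10), "C": (30, 30, 40, 40)}
sim_range = (2, 2, 12, 8)  # detection range D

# Choose the sensor data set whose real detection range overlaps D the most.
best = max(real_ranges, key=lambda k: overlap_area(real_ranges[k], sim_range))
```

With these numbers the overlap with A is largest, so A's sensor data set would be the simulation data set, mirroring the Fig. 3 example.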
Since the detection range of the simulated vehicle's sensor may overlap with the detection ranges of multiple real-vehicle sensors, in the embodiments of the present application the terminal device can determine the overlap between the detection range of the simulated vehicle's sensor and the detection range of each real-vehicle sensor, derive from these overlaps the priority of each real-vehicle sensor's detection range relative to the detection range of the simulated vehicle's sensor, and then choose simulation data sets from the sensor data sets according to these priorities, as shown in Fig. 4.
Fig. 4 is a schematic diagram, provided by an embodiment of the present application, of choosing a simulation data set by the priority of each real-vehicle sensor's detection range relative to the detection range of the simulated vehicle's sensor.
Fig. 4 shows the detection ranges of multiple real-vehicle sensors, and the detection range of the simulated vehicle's sensor overlaps with several of them. The part enclosed by the dotted frame is the detection range of the simulated vehicle's sensor; it overlaps with detection range A and detection range B of two real-vehicle sensors. As can be seen from Fig. 4, detection range A overlaps more with the simulated sensor's detection range, while detection range B overlaps less, so the priority of detection range A is higher than that of detection range B.
The terminal device can choose the sensor data sets corresponding to detection range A and detection range B of the real-vehicle sensors as simulation data sets. The detection range enclosed by the dotted frame in Fig. 4 can be regarded as composed of detection range A and the shaded part of detection range B. So, when obtaining simulation data with the chosen simulation data sets, the terminal device can first determine one part of the simulation data from the sensor data set corresponding to detection range A, and then determine the corresponding simulation data from the sensor data set corresponding to the shaded part of detection range B.
That is, according to the overlaps and priorities between the detection range of the simulated vehicle's sensor and the detection ranges of the real-vehicle sensors corresponding to the sensor data sets, the terminal device determines which real-vehicle sensors' detection ranges supply the sensor data sets to rely on subsequently, so as to combine them into the simulation data set needed for simulating the simulated vehicle.
S103: For each unit detection range of the simulated vehicle sensor, determine, among the sensor data contained in the simulation data set, the sensor data corresponding to that unit detection range.
After selecting the simulation data set, the terminal device can determine, for each unit detection range of each simulated sensor installed on the simulated vehicle, the sensor data corresponding to that unit detection range within the sensor data contained in the simulation data set. In essence, this step answers the question: given the selected simulation data set, which specific data would be collected if data acquisition were performed over the detection range of the simulated vehicle sensor?
The simulated vehicle is equipped with a radar and a camera, each with its own detection range, so the terminal device needs to determine the sensor data corresponding to the unit detection ranges of the radar and of the camera separately.
Specifically, in the embodiment of the present application, the database records the correspondence between each sensor data set and the detection range of each real-vehicle sensor, so the terminal device can determine the detection range of the radar corresponding to the point cloud data contained in the simulation data set. Then, for each unit detection range of the simulated vehicle's radar, the terminal device can determine the intersection point between the space ray corresponding to that unit detection range and the detection range of the radar corresponding to the point cloud data contained in the simulation data set, and take the point cloud data corresponding to that intersection point as the sensor data detected by the simulated vehicle's radar in that unit detection range.
The space ray corresponding to a unit detection range mentioned here is essentially the same as the space ray mentioned in step S101: taking the radar as the source point, the line from the source point to the unit detection range is the space ray corresponding to that unit detection range. The terminal device can cast this space ray onto the detection range of the radar corresponding to the point cloud data contained in the simulation data set, determine which point of that detection range the ray hits (i.e., the intersection point), and then take the point cloud data corresponding to that intersection point as the sensor data detected by the simulated vehicle's radar in that unit detection range.
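The ray-casting lookup described above can be sketched as follows: each unit detection range of the simulated radar defines a ray from the radar origin, and the recorded point cloud is searched for the point that ray would hit. Here the search is a nearest-azimuth match in 2-D with an assumed angular tolerance; the function name, tolerance, and 2-D simplification are illustrative assumptions, not the patent's method.

```python
import math

# Hypothetical sketch: find the recorded point-cloud return that the space ray
# of a given unit detection range (represented by its azimuth) intersects.
def ray_hit(azimuth_deg, point_cloud, tol_deg=1.0):
    """Return the recorded (x, y) point whose azimuth best matches the ray,
    or None if no return lies within the angular tolerance."""
    best, best_err = None, tol_deg
    for x, y in point_cloud:
        pt_az = math.degrees(math.atan2(y, x))
        err = abs(pt_az - azimuth_deg)
        if err <= best_err:
            best, best_err = (x, y), err
    return best

cloud = [(10.0, 0.0), (0.0, 10.0), (7.0, 7.0)]   # recorded points at 0°, 90°, 45°
print(ray_hit(45.0, cloud))   # → (7.0, 7.0): the point the 45° ray intersects
print(ray_hit(20.0, cloud))   # → None: no recorded return near that ray
```

The matched point then plays the role of the intersection point whose point cloud data is taken as the simulated radar's reading for that unit detection range.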
For the camera on the simulated vehicle: after determining the detection range of the radar corresponding to the point cloud data in the simulation data set, the terminal device can determine, for each unit detection range of the simulated vehicle's camera, the intersection point between the space ray corresponding to that unit detection range and the determined radar detection range. The terminal device can further determine the space coordinate of the point cloud data corresponding to that intersection point and, within the image data contained in the simulation data set, determine the pixel corresponding to that space coordinate; that pixel is then taken as the sensor data detected by the simulated vehicle's camera in that unit detection range.
The space ray corresponding to a unit detection range of the camera is likewise essentially the same as the space rays described above: taking the camera as the source point, the ray emitted from that source point toward the unit detection range is the space ray. The terminal device can further determine which point of the radar's detection range this ray hits (i.e., the intersection point). Since point cloud data and their space coordinates can be converted into each other in a conventional manner, the terminal device can directly determine the space coordinate of the point cloud data corresponding to the intersection point.
Having determined the space coordinate of the point cloud data corresponding to the intersection point, the terminal device can determine the corresponding pixel in the image data contained in the simulation data set, and take that pixel as the sensor data detected by the simulated vehicle's camera in that unit detection range. In other words, based on the space coordinate of the point cloud data corresponding to the intersection point, the terminal device determines which pixel in the image data the intersection point corresponds to, and treats that pixel as the pixel the simulated vehicle's camera would have collected when imaging the real environment over that unit detection range.
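The final step, mapping the intersection point's space coordinate to a pixel in the stored image, can be sketched with a standard pinhole projection. The intrinsics (fx, fy, cx, cy) and the function name are assumed for illustration; the patent leaves the coordinate conversion unspecified.

```python
# Hypothetical sketch: project a camera-frame space coordinate onto the stored
# image to find the pixel the simulated camera would have captured.
def project_to_pixel(point_xyz, fx=500.0, fy=500.0, cx=320.0, cy=240.0):
    """Pinhole projection: camera-frame (x, y, z) -> integer pixel (u, v)."""
    x, y, z = point_xyz
    if z <= 0.0:
        return None            # behind the camera: no pixel to look up
    u = int(round(fx * x / z + cx))
    v = int(round(fy * y / z + cy))
    return (u, v)

# A point 10 m straight ahead lands on the principal point of the image.
print(project_to_pixel((0.0, 0.0, 10.0)))   # → (320, 240)
print(project_to_pixel((1.0, 0.5, 10.0)))   # → (370, 265)
```

The returned pixel would then stand in as the simulated camera's reading for that unit detection range.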
It should be noted that the unit detection ranges of the radar and of the camera can be obtained by a preset division scheme, for example by dividing equally by angle or by dividing into units of a specified area; this is not described in further detail here.
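The equal-angle division mentioned above can be sketched as follows: a sensor's field of view is split into unit detection ranges of a fixed angular step. The function and parameter names are illustrative assumptions.

```python
# Hypothetical sketch: split a field of view [start, end) into equal angular
# unit detection ranges, with a possibly smaller final range at the boundary.
def unit_ranges(fov_start_deg, fov_end_deg, step_deg):
    """Return a list of (start, end) unit detection ranges covering the FOV."""
    ranges, a = [], fov_start_deg
    while a < fov_end_deg:
        ranges.append((a, min(a + step_deg, fov_end_deg)))
        a += step_deg
    return ranges

print(unit_ranges(0.0, 90.0, 30.0))   # → [(0.0, 30.0), (30.0, 60.0), (60.0, 90.0)]
```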
S104: Obtain simulation data according to the sensor data corresponding to each unit detection range, and simulate the simulated vehicle according to the simulation data.
After the terminal device has determined the sensor data corresponding to each unit detection range, it can use these sensor data as the simulation data and simulate the simulated vehicle based on them.
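A minimal sketch of this assembly step: the per-unit readings are gathered into one ordered simulation record that a downstream simulator loop could consume. The record structure, the function name, and the choice to drop unit ranges with no recorded data are assumptions, not the patent's specification.

```python
# Hypothetical sketch of step S104: merge {unit_range: reading} into a single
# ordered simulation data record, skipping unit ranges with no recorded data.
def assemble_frame(per_unit_readings):
    """Return the readings sorted by unit detection range, Nones dropped."""
    return [reading for _, reading in sorted(per_unit_readings.items())
            if reading is not None]

frame = assemble_frame({(0, 30): "pts_a", (60, 90): None, (30, 60): "pts_b"})
print(frame)   # → ['pts_a', 'pts_b']
```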
As can be seen from the above, based on the pre-stored sensor data sets collected by real vehicles in different states, the terminal device can determine, from the detection range of the simulated vehicle sensor, which point cloud data and image data the simulated vehicle would collect in the real environment, and then use the data so determined as simulation data to simulate the simulated vehicle. Compared with the prior art, the unmanned-driving simulation method provided by the present application requires no three-dimensional modeling and, when simulating the simulated vehicle, does not require the state of the simulated vehicle to match the state of a real vehicle at simulation time, thereby effectively improving the flexibility of simulation.
At the same time, given the detection range of the simulated vehicle sensor, it can be determined which sensor data sets collected by real vehicles correspond to that detection range. That is, when the simulated vehicle is simulated by the unmanned-driving simulation method provided by the present application, the simulation data set selected on the basis of the simulated sensor's detection range consists of real data actually collected by real vehicles. There is therefore no need to perform real data collection with the simulated vehicle, which greatly improves simulation efficiency while effectively guaranteeing the authenticity of the simulation.
The above is the unmanned-driving simulation method provided by one or more embodiments of the present application. Based on the same idea, the present application also provides a corresponding unmanned-driving simulation apparatus, as shown in Fig. 5.
Fig. 5 is a schematic diagram of an unmanned-driving simulation apparatus provided by the present application, which specifically includes:
a reading module 501, configured to read each pre-stored sensor data set corresponding to the same simulated environment, wherein the sensor data in different sensor data sets are collected in the simulated environment by real vehicles in different states;
a selection module 502, configured to select a simulation data set from the sensor data sets according to the detection range of the simulated vehicle sensor and the recorded detection ranges of the real-vehicle sensors corresponding to the sensor data sets;
a determination module 503, configured to determine, for each unit detection range of the simulated vehicle sensor, the sensor data corresponding to that unit detection range within the sensor data contained in the simulation data set;
a simulation module 504, configured to obtain simulation data according to the sensor data corresponding to each unit detection range, and to simulate the simulated vehicle according to the simulation data.
Optionally, each sensor data set contains point cloud data and image data.
Optionally, the apparatus further includes:
a storage module 505, configured to obtain, for each state of a real vehicle, the point cloud data and image data collected while the real vehicle travels in that state in the simulated environment; to determine the space coordinates of the image data according to the space coordinates of the point cloud data, the detection range of the radar that collected the point cloud data, and the detection range of the camera that collected the image data; and to store the point cloud data, the image data, the space coordinates of the point cloud data, and the space coordinates of the image data.
Optionally, the storage module 505 is specifically configured to: for each pixel in the image data, determine the space ray corresponding to that pixel according to the detection range of the camera; judge whether the space ray corresponding to the pixel intersects the detection range of the radar; if an intersection point exists, determine the space coordinate of the point cloud data corresponding to that intersection point as the space coordinate of the pixel; and otherwise, determine the space coordinate of the pixel according to the state parameters of the real vehicle in that state and the image data.
Optionally, the selection module 502 is specifically configured to: determine the degree of overlap between the detection range of the simulated vehicle sensor and the detection range of the real-vehicle sensor corresponding to each sensor data set; and select the simulation data set from the sensor data sets according to that degree of overlap.
Optionally, the determination module 503 is specifically configured to: determine the detection range of the radar corresponding to the point cloud data contained in the simulation data set; for each unit detection range of the simulated vehicle's camera, determine the intersection point between the space ray corresponding to that unit detection range and the determined radar detection range; determine the space coordinate of the point cloud data corresponding to the intersection point; and, in the image data contained in the simulation data set, determine the pixel corresponding to that space coordinate as the sensor data detected by the simulated vehicle's camera in that unit detection range.
Optionally, the determination module 503 is specifically configured to: determine the detection range of the radar corresponding to the point cloud data contained in the simulation data set; for each unit detection range of the simulated vehicle's radar, determine the intersection point between the space ray corresponding to that unit detection range and the detection range of the radar corresponding to the point cloud data contained in the simulation data set; and take the point cloud data corresponding to the intersection point as the sensor data detected by the simulated vehicle's radar in that unit detection range.
An embodiment of the present application also provides a computer-readable storage medium storing a computer program, which can be used to execute the unmanned-driving simulation method provided above with reference to Fig. 1.
An embodiment of the present application also provides the schematic structural diagram of the electronic device shown in Fig. 6. As shown in Fig. 6, at the hardware level the electronic device includes a processor, an internal bus, a network interface, a memory, and a non-volatile memory, and may of course also include hardware required for other services. The processor reads the corresponding computer program from the non-volatile memory into the memory and runs it, thereby implementing the unmanned-driving simulation method described above with reference to Fig. 1. Of course, besides a software implementation, this specification does not exclude other implementations, such as logic devices or a combination of software and hardware; that is, the execution subject of the following processing flow is not limited to logic units and may also be hardware or logic devices.
In the 1990s, an improvement in a technology could be clearly distinguished as either a hardware improvement (for example, an improvement to circuit structures such as diodes, transistors, and switches) or a software improvement (an improvement to a method flow). With the development of technology, however, improvements to many of today's method flows can be regarded as direct improvements to hardware circuit structures. Designers almost always obtain a corresponding hardware circuit structure by programming the improved method flow into a hardware circuit. Therefore, it cannot be said that an improvement to a method flow cannot be implemented by a hardware entity module. For example, a programmable logic device (PLD) (such as a field programmable gate array (FPGA)) is such an integrated circuit whose logic functions are determined by the user's programming of the device. A designer programs a digital system onto a single PLD by himself, without asking a chip manufacturer to design and fabricate a dedicated integrated circuit chip. Moreover, instead of manually fabricating integrated circuit chips, this kind of programming is nowadays mostly implemented with "logic compiler" software, which is similar to the software compiler used in program development; the source code to be compiled must likewise be written in a particular programming language, called a hardware description language (HDL). There is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language); at present, VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are the most commonly used. Those skilled in the art should also understand that a hardware circuit implementing a logical method flow can easily be obtained merely by slightly programming the method flow in logic using the above hardware description languages and programming it into an integrated circuit.
The controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (such as software or firmware) executable by the (micro)processor, logic gates, switches, an application-specific integrated circuit (ASIC), a programmable logic controller, or an embedded microcontroller. Examples of controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicone Labs C8051F320. A memory controller may also be implemented as part of the control logic of a memory. Those skilled in the art also know that, in addition to implementing the controller purely as computer-readable program code, the method steps can be programmed in logic so that the controller implements the same functions in the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Such a controller can therefore be regarded as a hardware component, and the devices included in it for implementing various functions can also be regarded as structures within the hardware component. Or even, the devices for implementing various functions can be regarded both as software modules implementing the method and as structures within the hardware component.
The systems, apparatuses, modules, or units illustrated in the above embodiments may specifically be implemented by a computer chip or entity, or by a product having certain functions. A typical implementation device is a computer. Specifically, the computer may be, for example, a personal computer, a laptop computer, a cellular phone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an e-mail device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
For convenience of description, the above apparatus is described by dividing its functions into various units. Of course, when the present application is implemented, the functions of the units may be implemented in one or more pieces of software and/or hardware.
Those skilled in the art should understand that the embodiments of the present invention may be provided as a method, a system, or a computer program product. Therefore, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, and optical storage) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of methods, devices (systems), and computer program products according to embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or another programmable data processing device to work in a particular manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus that implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or another programmable data processing device, so that a series of operation steps are executed on the computer or other programmable device to produce computer-implemented processing, whereby the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include non-persistent storage in a computer-readable medium, random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include persistent and non-persistent, removable and non-removable media that can store information by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.
It should also be noted that the terms "include", "comprise", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, commodity, or device that includes a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, commodity, or device. In the absence of further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, commodity, or device that includes that element.
Those skilled in the art should understand that the embodiments of the present application may be provided as a method, a system, or a computer program product. Therefore, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, the present application may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, and optical storage) containing computer-usable program code.
The present application may be described in the general context of computer-executable instructions executed by a computer, such as program modules. Generally, program modules include routines, programs, objects, components, data structures, and the like that perform specific tasks or implement specific abstract data types. The present application may also be practiced in distributed computing environments, in which tasks are performed by remote processing devices connected through a communication network. In a distributed computing environment, program modules may be located in both local and remote computer storage media, including storage devices.
The embodiments in this specification are described in a progressive manner; for identical or similar parts among the embodiments, reference may be made to one another, and each embodiment focuses on its differences from the other embodiments. In particular, since the system embodiment is substantially similar to the method embodiment, its description is relatively simple, and reference may be made to the description of the method embodiment for relevant parts.
The above descriptions are merely embodiments of the present application and are not intended to limit the present application. For those skilled in the art, various modifications and changes may be made to the present application. Any modifications, equivalent replacements, improvements, and the like made within the spirit and principles of the present application shall be included within the scope of the claims of the present application.
Claims (10)
1. An unmanned-driving simulation method, characterized in that the method includes:
reading each pre-stored sensor data set corresponding to the same simulated environment, wherein the sensor data in different sensor data sets are collected in the simulated environment by real vehicles in different states;
selecting a simulation data set from the sensor data sets according to the detection range of a simulated vehicle sensor and the recorded detection ranges of the real-vehicle sensors corresponding to the sensor data sets;
for each unit detection range of the simulated vehicle sensor, determining, among the sensor data contained in the simulation data set, the sensor data corresponding to that unit detection range; and
obtaining simulation data according to the sensor data corresponding to each unit detection range, and simulating the simulated vehicle according to the simulation data.
2. The method according to claim 1, characterized in that each sensor data set contains point cloud data and image data.
3. The method according to claim 2, characterized in that pre-storing the sensor data sets corresponding to the same simulated environment specifically includes:
for each state of a real vehicle, obtaining the point cloud data and image data collected while the real vehicle travels in that state in the simulated environment;
determining the space coordinates of the image data according to the space coordinates of the point cloud data, the detection range of the radar that collected the point cloud data, and the detection range of the camera that collected the image data; and
storing the point cloud data, the image data, the space coordinates of the point cloud data, and the space coordinates of the image data.
4. The method according to claim 3, characterized in that determining the space coordinates of the image data according to the space coordinates of the point cloud data, the detection range of the radar that collected the point cloud data, and the detection range of the camera that collected the image data specifically includes:
for each pixel in the image data, determining the space ray corresponding to that pixel according to the detection range of the camera;
judging whether the space ray corresponding to the pixel intersects the detection range of the radar;
if an intersection point exists, determining the space coordinate of the point cloud data corresponding to the intersection point as the space coordinate of the pixel; and
otherwise, determining the space coordinate of the pixel according to the state parameters of the real vehicle in that state and the image data.
5. The method according to claim 1, characterized in that selecting the simulation data set from the sensor data sets according to the detection range of the simulated vehicle sensor and the recorded detection ranges of the real-vehicle sensors corresponding to the sensor data sets specifically includes:
determining the degree of overlap between the detection range of the simulated vehicle sensor and the detection range of the real-vehicle sensor corresponding to each sensor data set; and
selecting the simulation data set from the sensor data sets according to that degree of overlap.
6. The method according to claim 3, characterized in that, for each unit detection range of the simulated vehicle sensor, determining, among the sensor data contained in the simulation data set, the sensor data corresponding to that unit detection range specifically includes:
determining the detection range of the radar corresponding to the point cloud data contained in the simulation data set;
for each unit detection range of the simulated vehicle's camera, determining the intersection point between the space ray corresponding to that unit detection range and the determined radar detection range;
determining the space coordinate of the point cloud data corresponding to the intersection point; and
in the image data contained in the simulation data set, determining the pixel corresponding to that space coordinate as the sensor data detected by the simulated vehicle's camera in that unit detection range.
7. The method according to claim 3, characterized in that, for each unit detection range of the simulated vehicle sensor, determining, among the sensor data contained in the simulation data set, the sensor data corresponding to that unit detection range specifically includes:
determining the detection range of the radar corresponding to the point cloud data contained in the simulation data set;
for each unit detection range of the simulated vehicle's radar, determining the intersection point between the space ray corresponding to that unit detection range and the detection range of the radar corresponding to the point cloud data contained in the simulation data set; and
taking the point cloud data corresponding to the intersection point as the sensor data detected by the simulated vehicle's radar in that unit detection range.
8. An unmanned-driving simulation apparatus, characterized in that the apparatus includes:
a reading module, configured to read each pre-stored sensor data set corresponding to the same simulated environment, wherein the sensor data in different sensor data sets are collected in the simulated environment by real vehicles in different states;
a selection module, configured to select a simulation data set from the sensor data sets according to the detection range of a simulated vehicle sensor and the recorded detection ranges of the real-vehicle sensors corresponding to the sensor data sets;
a determination module, configured to determine, for each unit detection range of the simulated vehicle sensor, the sensor data corresponding to that unit detection range within the sensor data contained in the simulation data set; and
a simulation module, configured to obtain simulation data according to the sensor data corresponding to each unit detection range, and to simulate the simulated vehicle according to the simulation data.
9. A computer-readable storage medium, characterized in that the storage medium stores a computer program which, when executed by a processor, implements the method of any one of claims 1 to 7.
10. An electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the program, implements the method of any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910462332.4A CN110162089B (en) | 2019-05-30 | 2019-05-30 | Unmanned driving simulation method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110162089A true CN110162089A (en) | 2019-08-23 |
CN110162089B CN110162089B (en) | 2020-11-03 |
Family
ID=67630094
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910462332.4A Active CN110162089B (en) | 2019-05-30 | 2019-05-30 | Unmanned driving simulation method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110162089B (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101825442A (en) * | 2010-04-30 | 2010-09-08 | 北京理工大学 | Mobile platform-based color laser point cloud imaging system |
US20120253580A1 (en) * | 2011-03-28 | 2012-10-04 | Al-Mahnna Khaled Abdullah M | GPS Navigation System |
CN103049912A (en) * | 2012-12-21 | 2013-04-17 | 浙江大学 | Random trihedron-based radar-camera system external parameter calibration method |
CN103226833A (en) * | 2013-05-08 | 2013-07-31 | 清华大学 | Point cloud data partitioning method based on three-dimensional laser radar |
CN108509918A (en) * | 2018-04-03 | 2018-09-07 | 中国人民解放军国防科技大学 | Target detection and tracking method fusing laser point cloud and image |
CN109211575A (en) * | 2017-07-05 | 2019-01-15 | 百度在线网络技术(北京)有限公司 | Pilotless automobile and its field test method, apparatus and readable medium |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110908387A (en) * | 2019-12-13 | 2020-03-24 | 齐鲁工业大学 | Method, medium and electronic device for planning paths of unmanned surface vehicle in dynamic environment |
CN111324945A (en) * | 2020-01-20 | 2020-06-23 | 北京百度网讯科技有限公司 | Sensor scheme determination method, device, equipment and storage medium |
CN111324945B (en) * | 2020-01-20 | 2023-09-26 | 阿波罗智能技术(北京)有限公司 | Sensor scheme determining method, device, equipment and storage medium |
US11953605B2 (en) | 2020-01-20 | 2024-04-09 | Beijing Baidu Netcom Science Technology Co., Ltd. | Method, device, equipment, and storage medium for determining sensor solution |
CN111339649A (en) * | 2020-02-20 | 2020-06-26 | 西南交通大学 | Simulation method, system and equipment for collecting vehicle track data |
CN112061131A (en) * | 2020-11-13 | 2020-12-11 | 奥特酷智能科技(南京)有限公司 | Road data-based method for driving simulated vehicle to avoid obstacles |
CN113650616A (en) * | 2021-07-20 | 2021-11-16 | 武汉光庭信息技术股份有限公司 | Vehicle behavior prediction method and system based on collected data |
Also Published As
Publication number | Publication date |
---|---|
CN110162089B (en) | 2020-11-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110162089A (en) | A kind of unpiloted emulation mode and device | |
CN110057352A (en) | A kind of camera attitude angle determines method and device | |
CN107976193A (en) | A kind of pedestrian's flight path estimating method, device, flight path infer equipment and storage medium | |
CN104781849A (en) | Fast initialization for monocular visual simultaneous localization and mapping (SLAM) | |
CN110245552A (en) | Interaction processing method, device, equipment and the client of vehicle damage image taking | |
CN110793548B (en) | Navigation simulation test system based on virtual-real combination of GNSS receiver hardware in loop | |
US20200082624A1 (en) | Providing augmented reality in a web browser | |
CN109074497A (en) | Use the activity in depth information identification sequence of video images | |
CN109186596B (en) | IMU measurement data generation method, system, computer device and readable storage medium | |
CN108286976A (en) | The fusion method and device and hybrid navigation system of a kind of point cloud data | |
CN108235809A (en) | End cloud combination positioning method and device, electronic equipment and computer program product | |
CN107656961A (en) | A kind of method for information display and device | |
CN107832331A (en) | Generation method, device and the equipment of visualized objects | |
CN107084740A (en) | A kind of air navigation aid and device | |
CN104913775A (en) | Method for measuring height of transmission line of unmanned aerial vehicle and method and device for positioning unmanned aerial vehicle | |
CN103076023A (en) | Method and device for calculating step | |
CN112987593A (en) | Visual positioning hardware-in-the-loop simulation platform and simulation method | |
CN110530398B (en) | Method and device for detecting precision of electronic map | |
CN109190674A (en) | The generation method and device of training data | |
CN110058684A (en) | A kind of geography information exchange method, system and storage medium based on VR technology | |
CN110008387A (en) | Flow-field visualized implementation method, device and electronic equipment | |
CN111127661B (en) | Data processing method and device and electronic equipment | |
CN106226740B (en) | Far field sonic location system and method | |
CN104913776B (en) | Unmanned plane localization method and device | |
WO2020106508A1 (en) | Experience driven development of mixed reality devices with immersive feedback |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||