CN109635870A - Data processing method and device - Google Patents
- Publication number: CN109635870A
- Application number: CN201811513244.4A
- Authority
- CN
- China
- Prior art keywords
- data
- frame
- type
- sensor
- fused
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
- G06F18/251—Fusion techniques of input or preprocessed data
Abstract
The embodiments of the present application disclose a data processing method and device. One specific embodiment of the method includes: obtaining sensor data to be fused, where the sensor data to be fused includes the sensor data collected by each of multiple types of sensors on a vehicle, and the sensor data collected by a sensor of one type includes multiple frames of sensor data; and generating, based on the relevance of acquisition times, multiple frames of fused data, and generating a data file corresponding to each frame of fused data in the multiple frames of fused data, where one frame of fused data includes at least one frame of sensor data collected by a sensor of each type. In this way, multiple frames of sensor data of different types that are associated in acquisition time are integrated into one frame of fused data, and a data file corresponding to each frame of fused data is generated, so that the generated data files can satisfy the needs of research and development tasks of various types that use sensor data.
Description
Technical field
The present application relates to the field of computers, specifically to the field of vehicles, and more particularly to a data processing method and device.
Background
In vehicle development projects, many research and development tasks, such as driving-environment simulation and generating labeled training samples for obstacle recognition models, need to use the sensor data collected by the sensors of a vehicle.

At present, in some datasets that are composed of sensor data collected by vehicle sensors and supplied to third parties for use, the sensor data is not organized in a manner suited to the tasks, and data of some types required by a task may be missing, among other problems, so the needs of research and development tasks cannot be met.
Summary of the invention
The embodiment of the present application provides data processing method and device.
In a first aspect, an embodiment of the present application provides a data processing method. The method includes: obtaining sensor data to be fused, where the sensor data to be fused includes the sensor data collected by each of multiple types of sensors on a vehicle, and the sensor data collected by a sensor of one type includes multiple frames of sensor data; and generating, based on the relevance of acquisition times, multiple frames of fused data, and generating a data file corresponding to each frame of fused data in the multiple frames of fused data, where one frame of fused data includes at least one frame of sensor data collected by a sensor of each type.
In a second aspect, an embodiment of the present application provides a data processing device. The device includes: an acquiring unit, configured to obtain sensor data to be fused, where the sensor data to be fused includes the sensor data collected by each of multiple types of sensors on a vehicle, and the sensor data collected by a sensor of one type includes multiple frames of sensor data; and a processing unit, configured to generate, based on the relevance of acquisition times, multiple frames of fused data, and to generate a data file corresponding to each frame of fused data in the multiple frames of fused data, where one frame of fused data includes at least one frame of sensor data collected by a sensor of each type.
With the data processing method and device provided by the embodiments of the present application, sensor data to be fused is obtained, where the sensor data to be fused includes the sensor data collected by each of multiple types of sensors on a vehicle, and the sensor data collected by a sensor of one type includes multiple frames of sensor data; based on the relevance of acquisition times, multiple frames of fused data are generated, and a data file corresponding to each frame of fused data in the multiple frames of fused data is generated, where one frame of fused data includes at least one frame of sensor data collected by a sensor of each type. In this way, multiple frames of sensor data of different types that are associated in acquisition time are integrated into one frame of fused data, and a data file corresponding to each frame of fused data is generated, so that the generated data files can satisfy the needs of research and development tasks of various types that use sensor data.
Description of the drawings
Other features, objects, and advantages of the present application will become more apparent by reading the following detailed description of non-limiting embodiments with reference to the accompanying drawings:

Fig. 1 shows an exemplary system architecture suitable for implementing the embodiments of the present application;

Fig. 2 shows a flowchart of one embodiment of the data processing method according to the present application;

Fig. 3 shows a structural schematic diagram of one embodiment of the data processing device according to the present application;

Fig. 4 shows a structural schematic diagram of a computer system suitable for implementing a server of the embodiments of the present application.
Detailed description of the embodiments
The present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the related invention, and are not a limitation of the invention. It should also be noted that, for ease of description, only the parts relevant to the related invention are shown in the drawings.

It should be noted that, in the absence of conflict, the embodiments of the present application and the features in the embodiments may be combined with each other. The present application is described in detail below with reference to the accompanying drawings and in conjunction with the embodiments.
Fig. 1 shows the exemplary system architecture for being suitable for being used to realize the embodiment of the present application.
As shown in Fig. 1, the system architecture may include a vehicle 101, a network 102, and a server 103. The network 102 is a wireless network.

The vehicle 101 may be an autonomous vehicle with automatic driving capability. The vehicle 101 has a sensing system, which includes, but is not limited to: cameras, a global positioning system (GPS), an inertial measurement unit (IMU), millimeter-wave radar, and lidar (Light Detection and Ranging, LIDAR). Each time the lidar rotates one full circle, it can scan one frame of laser point cloud; the laser points scanned in one rotation constitute one frame of laser point cloud. The point cloud data collected for a frame of laser point cloud includes the coordinates of the laser points in the lidar coordinate system.

The vehicle 101 can send the sensor data collected by its sensors while driving to the server 103.
Referring to Fig. 2, it shows the flow of one embodiment of the data processing method according to the present application. The data processing method provided by the embodiment of the present application can be executed by a server (for example, the server 103 shown in Fig. 1). The method includes the following steps:
Step 201: obtain the sensor data to be fused.
In this embodiment, the data collected by a sensor of any type on the vehicle can be referred to as sensor data. The sensor data to be fused includes the sensor data collected by each of the multiple types of sensors on the vehicle.

In this embodiment, the type of a frame of sensor data is determined according to the type of the sensor that collected it.

For example, a frame of image collected by a camera can be referred to as a frame of sensor data collected by the camera. A frame of data collected by a millimeter-wave radar can be referred to as a frame of sensor data collected by the millimeter-wave radar. For a lidar, each full rotation scans one frame of laser point cloud, and the laser points scanned in one rotation constitute one frame of laser point cloud; the point cloud data of a frame of laser point cloud collected by the lidar can be referred to as a frame of sensor data collected by the lidar.

In this embodiment, when the vehicle has multiple sensors of one type, any frame of sensor data collected by any one sensor belonging to that type can be referred to as a frame of sensor data collected by a sensor of that type.

In this embodiment, the sensor data collected by a sensor of one type among the multiple types of sensors on the vehicle includes multiple frames of sensor data collected by sensors of that type.

In this embodiment, the sensor data to be fused may be all the sensor data collected by the sensors of the vehicle within a period of time. For example, the sensor data to be fused may include the sensor data collected by each of the multiple types of sensors of the vehicle during one day of driving.
Step 202: based on the relevance of acquisition times, generate multiple frames of fused data, and generate a data file corresponding to each frame of fused data in the multiple frames of fused data.
In this embodiment, multiple frames of fused data are generated based on the relevance of acquisition times. One frame of fused data includes at least one frame of sensor data collected by a sensor of each type.

In this embodiment, when generating a frame of fused data, at least one frame of sensor data serving as a benchmark in the frame of fused data may be determined first. A frame of sensor data serving as a benchmark is collected by a sensor of the type with the lowest acquisition frequency among all the types. When there are multiple sensors of the type with the lowest acquisition frequency, the sensor data serving as the benchmark consists of multiple frames. The timestamp of a frame of laser point cloud data can serve as the acquisition time of that frame of laser point cloud data; the timestamp of a frame of millimeter-wave radar data can serve as the acquisition time of that frame of millimeter-wave radar data; and the timestamp of a frame of image can serve as the acquisition time of that frame of image. When generating the frame of fused data based on the relevance of acquisition times, for each frame of sensor data serving as a benchmark, the multiple frames of sensor data whose acquisition times are close to that of the benchmark frame can be found. As an example of finding the multiple frames of sensor data close in acquisition time to a benchmark frame, suppose the benchmark frame is a frame of laser point cloud data: the multiple frames of images collected during the period in which the frame of laser point cloud data was collected, the images acquired near the start and end of that period, and the frame of laser point cloud data can be combined into one frame of fused data. In other words, the frame of fused data includes: a frame of laser point cloud data, the multiple frames of images collected during the period in which the frame of laser point cloud data was collected, and the images acquired near the start and end of that period. In the above manner, for each frame of sensor data serving as a benchmark, the nearby multiple frames of sensor data in acquisition time are found respectively. Finally, each frame of sensor data serving as a benchmark and all the sensor data found form the frame of fused data.
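Under the assumption that the benchmark frame is a lidar point cloud swept over a known period, the grouping just described might be sketched as follows; the frame representation, field names (`t_start`, `t_end`, `timestamp`), and the "near" margin are hypothetical choices for illustration, not details given by the application:

```python
def fuse_around_sweep(lidar_frame, image_frames, margin=0.05):
    """Group a benchmark lidar frame with the images captured during its
    sweep period, plus images acquired near the period's start and end.

    lidar_frame: dict with "t_start" and "t_end" of the sweep (assumed fields)
    image_frames: list of dicts, each with an acquisition "timestamp"
    margin: how close to the start/end an image must be to count as "near"
    """
    t0, t1 = lidar_frame["t_start"], lidar_frame["t_end"]
    # Images collected during the sweep period itself.
    in_period = [f for f in image_frames if t0 <= f["timestamp"] <= t1]
    # Images acquired near the start or end of the period, not already kept.
    near_edges = [
        f for f in image_frames
        if min(abs(f["timestamp"] - t0), abs(f["timestamp"] - t1)) <= margin
        and f not in in_period
    ]
    # One frame of fused data: the benchmark point cloud plus the images
    # collected in (or near the boundaries of) its sweep period.
    return {"lidar": lidar_frame, "images": in_period + near_edges}
```

Each benchmark frame would be passed through a function like this in turn, yielding one frame of fused data per benchmark frame.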
For example, the sensors on the vehicle include a lidar and cameras; in other words, the types of sensors on the vehicle include the lidar type and the camera type. The lidar collects laser point cloud data, and the cameras collect images. The laser point cloud data of each frame of laser point cloud has a timestamp, and each image has a timestamp. When generating a frame of fused data, a frame of laser point cloud data may first be determined as the benchmark frame of sensor data in the frame of fused data. The multiple frames of images collected during the period in which the frame of laser point cloud data was collected, the images acquired near the start and end of that period, and the frame of laser point cloud data can be combined into one frame of fused data. In other words, the frame of fused data includes: a frame of laser point cloud data, the multiple frames of images collected during the period in which the frame of laser point cloud data was collected, and the images acquired near the start and end of that period.
In some optional implementations of this embodiment, the multiple types of sensors of the vehicle include: the lidar type, the millimeter-wave radar type, and the camera type. In other words, the vehicle has at least one lidar, at least one millimeter-wave radar, and at least one camera. One frame of fused data includes at least the laser point cloud data of one frame of laser point cloud, at least one frame of millimeter-wave radar data, and at least one frame of image. A frame of laser point cloud data, a frame of millimeter-wave radar data, or a frame of image can each be referred to as a frame of sensor data.

For example, a frame of fused data includes the laser point cloud data of one frame of laser point cloud, one frame of millimeter-wave radar data, and one frame of image. The number of sensors of one type on the vehicle may be one or more. When the vehicle has multiple sensors belonging to one type, then for the multiple sensors belonging to that type, a frame of fused data may include at least one frame of sensor data collected by each sensor belonging to that type. For example, if the vehicle has multiple cameras, a frame of fused data includes one frame of image collected by each camera.
In some optional implementations of this embodiment, a frame of fused data further includes: the timestamps of the multiple frames of sensor data collected by the sensors of each type, the position data associated with each frame of sensor data, the attitude data associated with each frame of sensor data, and parameter data. The timestamp of a frame of laser point cloud data can serve as the acquisition time of that frame of laser point cloud data; the timestamp of a frame of millimeter-wave radar data can serve as the acquisition time of that frame of millimeter-wave radar data; and the timestamp of a frame of image can serve as the acquisition time of that frame of image.

For example, the multiple types of sensors of the vehicle include the lidar type, the millimeter-wave radar type, and the camera type. There is one lidar, and there are multiple millimeter-wave radars and multiple cameras. The frame of fused data includes: a frame of laser point cloud data, multiple frames of millimeter-wave radar data, multiple frames of images, the position data associated with each frame of sensor data, the attitude data associated with each frame of sensor data, and parameter data. Each frame of millimeter-wave radar data in the multiple frames of millimeter-wave radar data is collected by one millimeter-wave radar, and each frame of image in the multiple frames of images is collected by one camera. The data file of the frame of fused data includes: a file corresponding to the camera type, a file corresponding to the millimeter-wave radar type, a file corresponding to the lidar type, and a file corresponding to the parameter data.
The timestamps of the multiple frames of sensor data collected by the sensors of one type in the frame of fused data include: the timestamp of each frame of sensor data collected by each sensor of that type. The position data includes the position data associated with each frame of sensor data, and the attitude data includes the attitude data associated with each frame of sensor data. For a frame of image in the frame of fused data, the position data associated with the frame of image includes: the position of the vehicle when the frame of image was collected, and the position of the camera when the frame of image was collected; the attitude data associated with the frame of image includes: the attitude of the vehicle when the frame of image was collected, the angle of the camera when the frame of image was collected, and so on. The position of the vehicle can be represented by coordinates in the world coordinate system. For a frame of millimeter-wave radar data in the frame of fused data, the position data associated with the frame of millimeter-wave radar data includes: the position of the vehicle when the frame of millimeter-wave radar data was collected, and the position of the center point of the millimeter-wave radar when the frame of millimeter-wave radar data was collected; the attitude data associated with the frame of millimeter-wave radar data includes the attitude of the vehicle when the frame of millimeter-wave radar data was collected. For a frame of laser point cloud data in the frame of fused data, the position data associated with the frame of laser point cloud data includes: the position of the vehicle when the frame of laser point cloud data was collected, and the position of the center point of the lidar when the frame of laser point cloud data was collected; the attitude data associated with the frame of laser point cloud data includes: the attitude of the vehicle when the frame of laser point cloud data was collected, and the attitude of the center point of the lidar when the frame of laser point cloud data was collected. The position of the vehicle can be represented by coordinates in the world coordinate system, and the position of the center point of the lidar can be represented by coordinates in the world coordinate system.
The parameter data in the frame of fused data includes: data associated with the cameras, data associated with the millimeter-wave radars, and data associated with the lidar. The data associated with the cameras includes the intrinsic and extrinsic parameters of each camera. The data associated with the millimeter-wave radars includes the conversion relation between the millimeter-wave radar coordinate system of each millimeter-wave radar and the world coordinate system. The data associated with the lidar includes the conversion relation between the lidar coordinate system of the lidar and the world coordinate system.
In some optional implementations of this embodiment, when generating a frame of fused data based on the relevance of acquisition times, a benchmark frame of sensor data in the frame of fused data to be generated may be determined, where the benchmark frame of sensor data is collected by a sensor of the type with the lowest acquisition frequency among the multiple types; for each type among the multiple types different from the type with the lowest acquisition frequency, at least one frame of sensor data whose acquisition time is nearest to the acquisition time of the benchmark frame of sensor data is found from the sensor data collected by the sensors of that type; and the frame of fused data is generated based on the benchmark frame of sensor data and the multiple frames of sensor data found.
For example, the multiple types of sensors of the vehicle include the lidar type, the millimeter-wave radar type, and the camera type. There is 1 lidar, there are 2 millimeter-wave radars, and there are 9 cameras. One frame of fused data includes one frame of laser point cloud data collected by the lidar, two frames of millimeter-wave radar data, and nine frames of images. The acquisition frequency of the lidar is lower than the acquisition frequencies of the millimeter-wave radars and the cameras, so the frame of laser point cloud data serves as the benchmark frame of sensor data. When generating a frame of fused data, the frame of laser point cloud data in the frame of fused data is first determined; the types different from the lidar type include the millimeter-wave radar type and the camera type. Then, for each millimeter-wave radar, the frame of millimeter-wave radar data whose acquisition time is nearest to the acquisition time of the frame of laser point cloud data is found from all the frames of millimeter-wave radar data collected by that millimeter-wave radar; and for each camera, the frame of image whose acquisition time is nearest to the acquisition time of the frame of laser point cloud data is found from all the frames of images collected by that camera. The frame of laser point cloud data, the two frames of millimeter-wave radar data found, and the nine frames of images found are thus combined into one frame of fused data.
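The per-sensor search for the frame whose acquisition time is nearest to the benchmark frame's can be done with a binary search over timestamps. This is a sketch under the assumption that each sensor's frame timestamps are kept sorted; the function names and data layout are illustrative:

```python
import bisect

def nearest_frame(timestamps, t_ref):
    """Return the index of the timestamp closest to t_ref.
    timestamps: sorted list of acquisition times for one sensor."""
    i = bisect.bisect_left(timestamps, t_ref)
    # The nearest frame is either the one just before or just at/after t_ref.
    candidates = [j for j in (i - 1, i) if 0 <= j < len(timestamps)]
    return min(candidates, key=lambda j: abs(timestamps[j] - t_ref))

def fuse(benchmark_t, sensors):
    """For each non-benchmark sensor, pick the frame nearest in acquisition
    time to the benchmark frame.
    sensors: mapping sensor_id -> sorted timestamp list (illustrative)."""
    return {sid: nearest_frame(ts, benchmark_t) for sid, ts in sensors.items()}
```

In the example above, `fuse` would be called once per lidar sweep, with the two radars and nine cameras as entries of `sensors`, yielding the eleven accompanying frames of the fused frame.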
In this embodiment, after the multiple frames of fused data are generated, a data file corresponding to each frame of fused data in the multiple frames of fused data can be generated. The fused data is stored using the data file corresponding to it; in other words, a frame of fused data is stored in the data file corresponding to that frame of fused data. Further, the data file can be compressed to obtain a compressed data file, and the compressed data file is stored.
Taking the generation of the data file of one frame of fused data as an example, the multiple types of sensors of the vehicle include the lidar type, the millimeter-wave radar type, and the camera type. There is one lidar, and there are multiple millimeter-wave radars and multiple cameras. The frame of fused data includes: a frame of laser point cloud data, multiple frames of millimeter-wave radar data, multiple frames of images, position data, attitude data, and parameter data. Each frame of millimeter-wave radar data in the multiple frames of millimeter-wave radar data is collected by one millimeter-wave radar, and each frame of image in the multiple frames of images is collected by one camera. The data file of the frame of fused data includes: a file corresponding to the camera type, a file corresponding to the millimeter-wave radar type, a file corresponding to the lidar type, and a file corresponding to the parameter data.

The file corresponding to the camera type includes a sub-folder corresponding to each camera. The sub-folder corresponding to one camera contains: a frame of image collected by that camera, the timestamp of the frame of image, the position data associated with the frame of image, and the attitude data associated with the frame of image. The position data associated with the frame of image includes: the position of the vehicle when the frame of image was collected, and the position of the camera when the frame of image was collected. The attitude data associated with the frame of image includes: the attitude of the vehicle when the frame of image was collected, the angle of the camera when the frame of image was collected, and so on. The position of the vehicle can be represented by coordinates in the world coordinate system.
The file corresponding to the millimeter-wave radar type includes a sub-folder corresponding to each millimeter-wave radar. The sub-folder corresponding to one millimeter-wave radar contains: a frame of millimeter-wave radar data collected by that millimeter-wave radar, the timestamp of the frame of millimeter-wave radar data, the position data associated with the frame of millimeter-wave radar data, and the attitude data associated with the frame of millimeter-wave radar data. The position data associated with the frame of millimeter-wave radar data includes: the position of the vehicle when the frame of millimeter-wave radar data was collected, and the position of the center point of the millimeter-wave radar when the frame of millimeter-wave radar data was collected. The attitude data associated with the frame of millimeter-wave radar data includes the attitude of the vehicle when the frame of millimeter-wave radar data was collected.
The file corresponding to the lidar type contains: a frame of laser point cloud data, the timestamp of the frame of laser point cloud data, the position data associated with the frame of laser point cloud data, and the attitude data associated with the frame of laser point cloud data. The position data associated with the frame of laser point cloud data includes: the position of the vehicle when the frame of laser point cloud data was collected, and the position of the center point of the lidar when the frame of laser point cloud data was collected. The attitude data associated with the frame of laser point cloud data includes: the attitude of the vehicle when the frame of laser point cloud data was collected, and the attitude of the center point of the lidar when the frame of laser point cloud data was collected. The position of the vehicle can be represented by coordinates in the world coordinate system, and the position of the center point of the lidar can be represented by coordinates in the world coordinate system.
The file corresponding to the parameter data contains the data obtained by calibrating each sensor. The file corresponding to the parameter data includes: data associated with the cameras, data associated with the millimeter-wave radars, and data associated with the lidar. The data associated with the cameras includes the intrinsic and extrinsic parameters of each camera. The data associated with the millimeter-wave radars includes the conversion relation between the millimeter-wave radar coordinate system of each millimeter-wave radar and the world coordinate system. The data associated with the lidar includes the conversion relation between the lidar coordinate system of the lidar and the world coordinate system.
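The per-frame file layout described above (one sub-folder per camera, one per millimeter-wave radar, plus folders for the lidar data and the parameter data) might be laid out on disk as follows; all folder names are illustrative assumptions, not names fixed by the application:

```python
import tempfile
from pathlib import Path

def make_frame_layout(root: Path, cameras, radars) -> Path:
    """Create the folder skeleton for one fused-data frame: a sub-folder
    per camera under camera/, a sub-folder per radar under radar/, plus
    lidar/ and params/ folders for the point cloud and calibration data."""
    frame = root / "frame_000001"
    for cam in cameras:
        (frame / "camera" / cam).mkdir(parents=True)
    for radar in radars:
        (frame / "radar" / radar).mkdir(parents=True)
    (frame / "lidar").mkdir(parents=True)
    (frame / "params").mkdir(parents=True)
    return frame

# Demo: one frame with two cameras and two millimeter-wave radars.
frame = make_frame_layout(Path(tempfile.mkdtemp()),
                          cameras=["cam_front", "cam_rear"],
                          radars=["radar_0", "radar_1"])
```

Each sub-folder would then receive the frame's data together with its timestamp, position data, and attitude data, as described above.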
In this embodiment, the data files corresponding to all the frames of fused data may constitute a data file set. In a vehicle development project, any research and development task that needs to use the sensor data collected by the sensors of the vehicle can use the data file set to complete the corresponding research and development task.

For example, consider the task of labeling obstacles to generate training samples for an obstacle recognition model. Since the multiple frames of sensor data of different types that are associated in acquisition time are integrated into one frame of fused data, and a data file corresponding to each frame of fused data is generated, with each frame's data file including multiple frames of sensor data of multiple different types together with timestamps, position data, attitude data, and so on, the data files can be used to rapidly complete the labeling of obstacles and generate training samples for the obstacle recognition model.
Referring to Fig. 3, as an implementation of the methods shown in the above figures, the present application provides an embodiment of a device; this device embodiment corresponds to the method embodiment shown in Fig. 2. Each unit in the device is configured to complete a corresponding operation, and the specific implementation of each operation can refer to the specific implementation of the corresponding operation described in the method embodiment.

As shown in Fig. 3, the data processing device of this embodiment includes: an acquiring unit 301 and a processing unit 302. The acquiring unit 301 is configured to obtain sensor data to be fused, where the sensor data to be fused includes the sensor data collected by each of multiple types of sensors on a vehicle, and the sensor data collected by a sensor of one type includes multiple frames of sensor data. The processing unit 302 is configured to generate, based on the relevance of acquisition times, multiple frames of fused data, and to generate a data file corresponding to each frame of fused data in the multiple frames of fused data, where one frame of fused data includes at least one frame of sensor data collected by a sensor of each type.
In some optional implementations of this embodiment, the multiple types include: the lidar type, the millimeter-wave radar type, and the camera type.

In some optional implementations of this embodiment, for each of the multiple types, the number of sensors belonging to that type is one or more.

In some optional implementations of this embodiment, a frame of fused data further includes: the timestamp of at least one frame of sensor data collected by the sensors of each of the multiple types, the position data associated with each frame of sensor data, the attitude data associated with each frame of sensor data, and parameter data.

In some optional implementations of this embodiment, the processing unit is further configured to: determine a benchmark frame of sensor data in a frame of fused data to be generated, where the benchmark frame of sensor data is collected by a sensor of the type with the lowest acquisition frequency among the multiple types; for each type among the multiple types different from the type with the lowest acquisition frequency, find, from the sensor data collected by the sensors of that type, at least one frame of sensor data whose acquisition time is nearest to the acquisition time of the benchmark frame of sensor data; and generate the frame of fused data based on the benchmark frame of sensor data and the multiple frames of sensor data found.
Fig. 4 shows a structural schematic diagram of a computer system suitable for implementing a server of the embodiments of the present application.

As shown in Fig. 4, the computer system includes a central processing unit (CPU) 401, which can perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 402 or a program loaded from a storage portion 408 into a random access memory (RAM) 403. The RAM 403 also stores various programs and data required for the operation of the computer system. The CPU 401, the ROM 402, and the RAM 403 are connected to each other through a bus 404. An input/output (I/O) interface 405 is also connected to the bus 404.

The following components are connected to the I/O interface 405: an input portion 406; an output portion 407; a storage portion 408 including a hard disk and the like; and a communication portion 409 including a network interface card such as a LAN card or a modem. The communication portion 409 performs communication processing via a network such as the Internet. A driver 410 is also connected to the I/O interface 405 as needed. A removable medium 411, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the driver 410 as needed, so that a computer program read from it can be installed into the storage portion 408 as needed.
In particular, the processes described in the embodiments of the present application may be implemented as computer programs. For example, an embodiment of the present application includes a computer program product, which includes a computer program carried on a computer-readable medium, the computer program including instructions for executing the method shown in the flowchart. The computer program can be downloaded and installed from a network through the communication portion 409, and/or installed from the removable medium 411. When the computer program is executed by the central processing unit (CPU) 401, the above-mentioned functions defined in the method of the present application are executed.
The present application further provides a server, which may include one or more processors and a memory for storing one or more programs. The one or more programs may include instructions for performing the operations described in the above embodiments. When the one or more programs are executed by the one or more processors, the one or more processors are caused to perform the operations described in the above embodiments.
The present application further provides a computer-readable medium. The computer-readable medium may be included in the server, or may exist alone without being assembled into the server. The computer-readable medium carries one or more programs, and when the one or more programs are executed by the server, the server is caused to perform the operations described in the above embodiments.
It should be noted that the computer-readable medium described herein may be a computer-readable signal medium, a computer-readable storage medium, or any combination of the two. A computer-readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above. In the present application, a computer-readable storage medium may be any tangible medium containing or storing a program, which may be used by or in combination with an instruction execution system, apparatus, or device. In the present application, a computer-readable signal medium may include a data signal propagated in a baseband or as part of a carrier wave, in which computer-readable program code is carried. Such a propagated data signal may take many forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the above. A computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium, and may send, propagate, or transmit a program for use by or in combination with an instruction execution system, apparatus, or device. The program code contained on the computer-readable medium may be transmitted by any suitable medium, including but not limited to: wireless, wire, optical cable, RF, or any suitable combination of the above.
The flowcharts and block diagrams in the accompanying drawings illustrate the possible architectures, functions, and operations of the systems, methods, and computer program products according to various embodiments of the present application. In this regard, each block in a flowchart or block diagram may represent a module, a program segment, or a portion of code, which comprises one or more executable instructions for implementing the specified logical function. It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur out of the order noted in the drawings. For example, two blocks shown in succession may in fact be executed substantially in parallel, or they may sometimes be executed in the reverse order, depending on the functions involved. It should also be noted that each block in the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
The above description is only a preferred embodiment of the present application and an explanation of the applied technical principles. Those skilled in the art should understand that the scope of the invention involved in the present application is not limited to technical embodiments formed by the specific combination of the above technical features, and should also cover, without departing from the inventive concept, other technical embodiments formed by any combination of the above technical features or their equivalent features, for example, technical embodiments formed by replacing the above features with (but not limited to) technical features having similar functions disclosed in the present application.
Claims (12)
1. A data processing method, comprising:
obtaining sensor data to be fused, the sensor data to be fused comprising: sensor data respectively collected by the sensor of each of a plurality of types of sensors on a vehicle, wherein the sensor data collected by a sensor of one type comprises multiple frames of sensor data; and
generating multiple frames of fused data based on an acquisition-time association, and generating a data file corresponding to each frame of fused data in the multiple frames of fused data, wherein one frame of fused data comprises at least one frame of sensor data respectively collected by the sensor of each type.
2. The method according to claim 1, wherein the plurality of types comprise: a lidar type, a millimeter-wave radar type, and a camera type.
3. The method according to claim 2, wherein for each of the plurality of types, the number of sensors belonging to the type is one or more.
4. The method according to claim 3, wherein one frame of fused data further comprises: a timestamp of each of the at least one frame of sensor data respectively collected by the sensor of each of the plurality of types, and the position data, attitude data, and parameter data respectively associated with each frame of sensor data.
5. The method according to any one of claims 1-4, wherein generating the multiple frames of fused data based on the acquisition-time association comprises:
determining a frame of reference sensor data in a frame of fused data to be generated, the frame of reference sensor data being collected by a sensor of the type with the lowest acquisition frequency among the plurality of types;
for each type of the plurality of types other than the type with the lowest acquisition frequency, finding, from the sensor data collected by the sensor of the type, at least one frame of sensor data whose acquisition time is closest to the acquisition time of the frame of reference sensor data; and
generating a frame of fused data based on the frame of reference sensor data and the found multiple frames of sensor data.
6. A data processing apparatus, comprising:
an acquiring unit, configured to obtain sensor data to be fused, the sensor data to be fused comprising: sensor data respectively collected by the sensor of each of a plurality of types of sensors on a vehicle, wherein the sensor data collected by a sensor of one type comprises multiple frames of sensor data; and
a processing unit, configured to generate multiple frames of fused data based on an acquisition-time association, and generate a data file corresponding to each frame of fused data in the multiple frames of fused data, wherein one frame of fused data comprises at least one frame of sensor data respectively collected by the sensor of each type.
7. The apparatus according to claim 6, wherein the plurality of types comprise: a lidar type, a millimeter-wave radar type, and a camera type.
8. The apparatus according to claim 7, wherein for each of the plurality of types, the number of sensors belonging to the type is one or more.
9. The apparatus according to claim 8, wherein one frame of fused data further comprises: a timestamp of each of the at least one frame of sensor data respectively collected by the sensor of each of the plurality of types, and the position data, attitude data, and parameter data respectively associated with each frame of sensor data.
10. The apparatus according to any one of claims 6-9, wherein the processing unit is further configured to: determine a frame of reference sensor data in a frame of fused data to be generated, the frame of reference sensor data being collected by a sensor of the type with the lowest acquisition frequency among the plurality of types; for each type of the plurality of types other than the type with the lowest acquisition frequency, find, from the sensor data collected by the sensor of the type, at least one frame of sensor data whose acquisition time is closest to the acquisition time of the frame of reference sensor data; and generate a frame of fused data based on the frame of reference sensor data and the found multiple frames of sensor data.
11. A server, comprising:
one or more processors; and
a memory for storing one or more programs,
wherein when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the method according to any one of claims 1-5.
12. A computer-readable medium, storing a computer program thereon, wherein the program, when executed by a processor, implements the method according to any one of claims 1-5.
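Claims 1 and 5 describe the core procedure: anchor each fused frame on a frame from the lowest-acquisition-frequency sensor, attach the nearest-in-time frame from every other sensor type, and write one data file per fused frame. The following is a minimal illustrative sketch of that procedure, not the patented implementation; the function names, the `(timestamp, frame)` stream layout, and the JSON file format are all assumptions for illustration:

```python
import json
from bisect import bisect_left

def fuse_frames(streams, reference_type):
    """Nearest-timestamp fusion sketch.

    streams: dict mapping sensor type -> list of (timestamp, frame),
             each list sorted by timestamp.
    reference_type: the type with the lowest acquisition frequency;
             each of its frames anchors one fused frame (claim 5).
    Returns a list of fused frames, one per reference frame.
    """
    fused = []
    for ref_ts, ref_frame in streams[reference_type]:
        frame = {reference_type: {"timestamp": ref_ts, "data": ref_frame}}
        for sensor_type, samples in streams.items():
            if sensor_type == reference_type:
                continue
            # Binary-search for the frame whose acquisition time is
            # closest to the reference frame's acquisition time.
            timestamps = [ts for ts, _ in samples]
            i = bisect_left(timestamps, ref_ts)
            candidates = samples[max(0, i - 1):i + 1]
            ts, data = min(candidates, key=lambda s: abs(s[0] - ref_ts))
            frame[sensor_type] = {"timestamp": ts, "data": data}
        fused.append(frame)
    return fused

def write_frame_files(fused, prefix="fused_frame"):
    """One data file per fused frame, as in claim 1 (JSON is an assumption)."""
    paths = []
    for idx, frame in enumerate(fused):
        path = f"{prefix}_{idx}.json"
        with open(path, "w") as f:
            json.dump(frame, f)
        paths.append(path)
    return paths
```

Using the lowest-frequency sensor as the reference keeps every fused frame complete: each higher-frequency stream is guaranteed to have at least one sample near the reference timestamp, and the per-frame lookup is O(log n) via binary search.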
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811513244.4A CN109635870A (en) | 2018-12-11 | 2018-12-11 | Data processing method and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109635870A true CN109635870A (en) | 2019-04-16 |
Family
ID=66072833
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811513244.4A Pending CN109635870A (en) | 2018-12-11 | 2018-12-11 | Data processing method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109635870A (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101510356A (en) * | 2009-02-24 | 2009-08-19 | 上海高德威智能交通系统有限公司 | Video detection system and data processing device thereof, video detection method |
US8699790B2 (en) * | 2011-11-18 | 2014-04-15 | Mitsubishi Electric Research Laboratories, Inc. | Method for pan-sharpening panchromatic and multispectral images using wavelet dictionaries |
CN103971392A (en) * | 2013-01-31 | 2014-08-06 | 北京四维图新科技股份有限公司 | Navigation-oriented three-dimensional video data processing method and device, system and terminal |
CN105711597A (en) * | 2016-02-25 | 2016-06-29 | 江苏大学 | System and method for sensing local driving environment in front |
Non-Patent Citations (2)
Title |
---|
王国锋, 许振辉: *Multi-source Lidar Data Integration Technology and Its Applications*, 30 November 2012 *
陈慧岩: *Theory and Application of Intelligent Vehicles*, 31 July 2018 *
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111429521A (en) * | 2020-03-05 | 2020-07-17 | 深圳市镭神智能系统有限公司 | External parameter calibration method, device, medium and electronic equipment for camera and laser radar |
CN112541527A (en) * | 2020-11-26 | 2021-03-23 | 深兰科技(上海)有限公司 | Multi-sensor synchronization method and device, electronic equipment and storage medium |
CN113611009A (en) * | 2021-08-02 | 2021-11-05 | 湖南酷陆网络科技有限公司 | Environmental sanitation vehicle cloud box sensing data fusion processing system |
CN113611009B (en) * | 2021-08-02 | 2023-04-18 | 湖南酷陆网络科技有限公司 | Sanitation vehicle cloud box sensing data fusion processing system |
CN113837385A (en) * | 2021-09-06 | 2021-12-24 | 东软睿驰汽车技术(沈阳)有限公司 | Data processing method, device, equipment, medium and product |
CN113837385B (en) * | 2021-09-06 | 2024-02-09 | 东软睿驰汽车技术(沈阳)有限公司 | Data processing method, device, equipment, medium and product |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109635870A (en) | Data processing method and device | |
US10360247B2 (en) | System and method for telecom inventory management | |
CN107871129B (en) | Method and apparatus for handling point cloud data | |
CN107945198A (en) | Method and apparatus for marking cloud data | |
CN106707293B (en) | Obstacle recognition method and device for vehicle | |
CN109508681A (en) | The method and apparatus for generating human body critical point detection model | |
CN109308681B (en) | Image processing method and device | |
CN109212530A (en) | Method and apparatus for determining barrier speed | |
CN103996036B (en) | A kind of map data collecting method and device | |
CN110400363A (en) | Map constructing method and device based on laser point cloud | |
CN109325429A (en) | A kind of method, apparatus, storage medium and the terminal of linked character data | |
CN110222641B (en) | Method and apparatus for recognizing image | |
Hinz | Detection and counting of cars in aerial images | |
US20210264198A1 (en) | Positioning method and apparatus | |
CN109409364A (en) | Image labeling method and device | |
EP3796262A1 (en) | Method and apparatus for calibrating camera | |
CN112651266A (en) | Pedestrian detection method and device | |
CN111598006A (en) | Method and device for labeling objects | |
CN109427031A (en) | A kind of data processing method and equipment | |
JP5878634B2 (en) | Feature extraction method, program, and system | |
CN112765302B (en) | Method and device for processing position information and computer readable medium | |
CN109903308B (en) | Method and device for acquiring information | |
CN109034214A (en) | Method and apparatus for generating label | |
JP5837404B2 (en) | Image processing apparatus and image processing method | |
CN112585957A (en) | Station monitoring system and station monitoring method |
Legal Events
Date | Code | Title | Description
---|---|---|---
 | PB01 | Publication | |
 | SE01 | Entry into force of request for substantive examination | |
 | RJ01 | Rejection of invention patent application after publication | Application publication date: 20190416 |