CN109871385A - Method and apparatus for handling data - Google Patents
- Publication number: CN109871385A (application CN201910151379.9A)
- Authority: CN (China)
- Legal status: Granted
Abstract
Embodiments of the disclosure provide a method and apparatus for handling data. One specific embodiment of the method includes: first receiving first data output by a first sensor; then determining whether the type of the first sensor matches a target sensor type, where the target sensor type is associated with a historical record of triggering data fusion operations; then, in response to determining that the type of the first sensor matches the target sensor type, triggering a data fusion operation; and finally, in response to triggering the data fusion operation, storing the type of the first sensor. This embodiment triggers data fusion without depending on a single sensor, improving the reliability of triggering data fusion.
Description
Technical field
Embodiments of the disclosure relate to the field of computer technology, and in particular to a method and apparatus for handling data.
Background
With the rapid development of sensing technology, multi-sensor data fusion has become increasingly important. In particular, in autonomous driving and driver assistance systems for vehicles, the fusion module serves as the underlying module of perception and merges the obstacle information sensed by different sensors. It therefore matters whether the triggering frequency of data fusion allows the data collected by each sensor to be fully used, and whether fusion processing occupies excessive system resources.
There are usually two related ways of triggering data fusion. One sets a certain sensor as the master sensor: when data transmitted by the master sensor is received, the data of the other sensors is obtained and data fusion is triggered. The other triggers data fusion at a preset frequency, for example once every 100 ms.
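The two conventional triggering modes can be sketched as follows. This is an illustrative simplification only; the class names and values are hypothetical, not taken from the disclosure:

```python
class MasterSensorTrigger:
    """Trigger fusion whenever data arrives from the designated master sensor."""
    def __init__(self, master_type):
        self.master_type = master_type

    def should_trigger(self, sensor_type):
        # Fusion depends entirely on this single sensor's output.
        return sensor_type == self.master_type


class FixedPeriodTrigger:
    """Trigger fusion at a preset frequency, e.g. once every 100 ms."""
    def __init__(self, period_s=0.1):
        self.period_s = period_s
        self.last_trigger = None

    def should_trigger(self, now_s):
        # Trigger when no fusion has run yet, or a full period has elapsed.
        if self.last_trigger is None or now_s - self.last_trigger >= self.period_s:
            self.last_trigger = now_s
            return True
        return False
```

The first mode's weakness, discussed later in the disclosure, is visible here: if the master sensor stops producing data, `MasterSensorTrigger` never fires.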
Summary of the invention
Embodiments of the disclosure propose a method and apparatus for handling data.
In a first aspect, embodiments of the disclosure provide a method for handling data, the method comprising: receiving first data output by a first sensor; determining whether the type of the first sensor matches a target sensor type, where the target sensor type is associated with a historical record of triggering data fusion operations; in response to determining that the type of the first sensor matches the target sensor type, triggering a data fusion operation; and in response to triggering the data fusion operation, storing the type of the first sensor.
In some embodiments, the method further includes: in response to determining that the type of the first sensor does not match the target sensor type, determining whether the difference between the timestamp corresponding to the moment the first data is received and a target timestamp is greater than a target trigger threshold; and in response to determining that the difference is greater than the target trigger threshold, triggering the data fusion operation.
In some embodiments, the target timestamp includes the timestamp corresponding to the triggering of the last data fusion operation, and the target trigger threshold includes a preset trigger time threshold corresponding to the first sensor. The method further includes: in response to triggering the data fusion operation, storing the timestamp at which the data fusion operation was triggered.
In some embodiments, triggering the data fusion operation includes: obtaining second data from a second sensor, where the second sensor is associated with the first sensor; and sending information characterizing the start of data fusion.
In some embodiments, the first sensor and the second sensor each correspond to a preset trigger time threshold.
In some embodiments, the target sensor type includes the type of the sensor corresponding to the triggering of the last data fusion operation.
In a second aspect, embodiments of the disclosure provide an apparatus for handling data, the apparatus comprising: a receiving unit configured to receive first data output by a first sensor; a determination unit configured to determine whether the type of the first sensor matches a target sensor type, where the target sensor type is associated with a historical record of triggering data fusion operations; a first trigger unit configured to trigger a data fusion operation in response to determining that the type of the first sensor matches the target sensor type; and a first storage unit configured to store the type of the first sensor in response to triggering the data fusion operation.
In some embodiments, the apparatus further includes a second trigger unit configured to: in response to determining that the type of the first sensor does not match the target sensor type, determine whether the difference between the timestamp corresponding to the moment the first data is received and a target timestamp is greater than a target trigger threshold; and in response to determining that the difference is greater than the target trigger threshold, trigger the data fusion operation.
In some embodiments, the target timestamp includes the timestamp corresponding to the triggering of the last data fusion operation, and the target trigger threshold includes the preset trigger time threshold corresponding to the first sensor. The apparatus further includes a second storage unit configured to store, in response to triggering the data fusion operation, the timestamp at which the data fusion operation was triggered.
In some embodiments, the first trigger unit includes: an acquisition module configured to obtain second data from a second sensor, where the second sensor is associated with the first sensor; and a sending module configured to send information characterizing the start of data fusion.
In some embodiments, the first sensor and the second sensor each correspond to a preset trigger time threshold.
In some embodiments, the target sensor type includes the type of the sensor corresponding to the triggering of the last data fusion operation.
In a third aspect, embodiments of the disclosure provide a multi-source sensor fusion triggering system, the system comprising a master sensor, a secondary sensor, and a processor, where the preset trigger time threshold of the master sensor is less than the data output time interval of the master sensor, the preset trigger time threshold of the secondary sensor is greater than the data output time interval of the master sensor, and the processor is used to implement the method described in any implementation of the first aspect.
In a fourth aspect, embodiments of the disclosure provide a terminal, the terminal comprising: one or more processors; and a storage device on which one or more programs are stored, where the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method described in any implementation of the first aspect.
In a fifth aspect, embodiments of the disclosure provide a computer-readable medium on which a computer program is stored, where the program, when executed by a processor, implements the method described in any implementation of the first aspect.
The method and apparatus for handling data provided by embodiments of the disclosure first receive first data output by a first sensor; then determine whether the type of the first sensor matches a target sensor type, where the target sensor type is associated with a historical record of triggering data fusion operations; then, in response to determining that the type of the first sensor matches the target sensor type, trigger a data fusion operation; and finally, in response to triggering the data fusion operation, store the type of the first sensor. Triggering data fusion thus does not depend on a single sensor, which improves the reliability of triggering data fusion.
Brief description of the drawings
Other features, objects, and advantages of the disclosure will become more apparent by reading the following detailed description of non-restrictive embodiments with reference to the accompanying drawings:
Fig. 1 is an exemplary system architecture diagram to which an embodiment of the disclosure may be applied;
Fig. 2 is a flowchart of one embodiment of the method for handling data according to the disclosure;
Fig. 3 is a schematic diagram of an application scenario of the method for handling data according to an embodiment of the disclosure;
Fig. 4 is a flowchart of another embodiment of the method for handling data according to the disclosure;
Fig. 5 is a structural schematic diagram of one embodiment of the apparatus for handling data according to the disclosure;
Fig. 6 is a timing diagram of an application scenario of the multi-source sensor fusion triggering system 600 according to an embodiment of the disclosure;
Fig. 7 is a structural schematic diagram of an electronic device suitable for implementing embodiments of the disclosure.
Detailed description of embodiments
The disclosure is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are used only to explain the related invention, not to restrict it. It should also be noted that, for convenience of description, only the parts relevant to the related invention are shown in the drawings.
It should be noted that, in the absence of conflict, the embodiments in the disclosure and the features in the embodiments may be combined with each other. The disclosure is described in detail below with reference to the accompanying drawings and in conjunction with the embodiments.
Fig. 1 shows an exemplary architecture 100 in which the method for handling data or the apparatus for handling data of the disclosure may be applied.
As shown in Fig. 1, the system architecture 100 may include sensors 101, 102, 103, a network 104, and a processor 105. The network 104 serves as a medium providing communication links between the sensors 101, 102, 103 and the processor 105. The network 104 may include various connection types, such as wired or wireless communication links, or fiber optic cables.
The sensors 101, 102, 103 interact with the processor 105 through the network 104 to receive or send messages. The sensors 101, 102, 103 may include various types of detection devices that convert the sensed measured information into electrical signals or other information formats for transmission, processing, storage, or display. The sensors may include, but are not limited to, at least one of the following: LiDAR (Light Detection And Ranging), imaging sensors (such as cameras), ultrasonic radar, and millimeter-wave radar.
The processor 105 may be any of various processors supporting data processing and information transmission, for example a processor that provides the trigger signal for the data fusion operation. The processor 105 can receive the data output by the sensors 101, 102, 103, record the type of the sensor that output the data, and determine whether to trigger the data fusion operation according to the type of the sensor.
It should be noted that the processor may be hardware or software. When the processor is hardware, it may be implemented as a group of multiple processors or as a single processor. When the processor is software, it may be implemented as a single piece of software or a software module, or as multiple pieces of software or software modules. No specific limitation is made here.
It should be noted that the method for handling data provided by embodiments of the disclosure is generally executed by the processor 105; correspondingly, the apparatus for handling data is generally provided in the processor 105.
It should be understood that the numbers of sensors, networks, and processors in Fig. 1 are merely schematic. Any number of sensors, networks, and processors may be provided according to implementation needs.
With continued reference to Fig. 2, a flow 200 of one embodiment of the method for handling data according to the disclosure is shown. The method for handling data includes the following steps:
Step 201: receive the first data output by the first sensor.
In this embodiment, the executing body of the method for handling data (such as the processor 105 shown in Fig. 1) may receive the first data output by the first sensor through a wired or wireless connection. The first sensor may include a sensor capable of triggering a data fusion operation. In practice, because a data fusion operation needs to analyze and integrate, under certain criteria, several pieces of data collected from each sensor, it occupies computing resources of the system. It is therefore usually preset that only one or some sensors can trigger the data fusion operation, while the data collected by the other sensors cannot. The multiple sensors communicatively connected to the executing body may each output their collected data at a respective preset data transmission frequency, so the executing body may receive data output by different sensors at different times. As an example, in the field of autonomous driving, the first sensor may include, but is not limited to, at least one of the following: LiDAR, a camera, or a millimeter-wave radar.
It should be noted that, in practice, there may be multiple sensors of the same type. In the data processing of different types of sensors, the data collected by different sensors of the same type is usually integrated first; in this case, the first data corresponds to the result of integrating the data collected by multiple sensors of the same type. As an example, an autonomous vehicle may be equipped with 3 cameras. In this case, the first data may be the result of integrating the image data collected by the 3 cameras, and correspondingly, the first sensor may be the 3 cameras.
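The pre-integration of same-type sensors described above could be sketched as follows. The function name and the use of simple concatenation are illustrative assumptions; the disclosure does not specify a particular integration method:

```python
def integrate_same_type(samples):
    """Merge the data collected by several sensors of the same type into
    one piece of 'first data'. Concatenation stands in here for whatever
    integration the system actually performs."""
    merged = []
    for sample in samples:
        merged.extend(sample)
    return merged

# e.g. image data from the 3 cameras of an autonomous vehicle
first_data = integrate_same_type([["img_a"], ["img_b"], ["img_c"]])
```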
Step 202: determine whether the type of the first sensor matches the target sensor type.
In this embodiment, the executing body may determine the type of the first sensor corresponding to the first data received in step 201, and then determine whether the type of the first sensor matches the target sensor type. The target sensor type may be associated with the historical record of triggering data fusion operations. As an example, the target sensor type may be the sensor type that has triggered the data fusion operation the most times in the historical record. In general, the more times a sensor has triggered the data fusion operation, the relatively more important its data is, which is equivalent to setting a higher triggering priority for that sensor.
It should be noted that the type of the sensor may usually include, but is not limited to, at least one of the following: LiDAR, camera, or millimeter-wave radar.
In this embodiment, the executing body may first determine the type of the sensor that has triggered the most times in the historical record of triggering data fusion operations, and then compare the type of the first sensor with the determined sensor type. In response to determining that the type of the first sensor is consistent with the determined sensor type, the executing body may determine that the type of the first sensor matches the target sensor type. In response to determining that the type of the first sensor is inconsistent with the determined sensor type, the executing body may determine that the type of the first sensor does not match the target sensor type.
In some optional implementations of this embodiment, the target sensor type may include the type of the sensor corresponding to the triggering of the last data fusion operation. The executing body may first determine the type of the sensor that triggered the last data fusion operation recorded in the historical record of triggering data fusion operations, and then compare the type of the first sensor with the determined sensor type. In response to determining that the type of the first sensor is consistent with the determined sensor type, the executing body may determine that the type of the first sensor matches the target sensor type; in response to determining that they are inconsistent, it may determine that the type of the first sensor does not match the target sensor type.
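The two ways of deriving the target sensor type from the trigger history (the most frequent type, or the type that triggered the previous fusion) could be sketched as follows; the function names are hypothetical:

```python
from collections import Counter

def target_type_most_frequent(history):
    """Target sensor type: the type that triggered fusion most often.
    `history` holds one sensor type per past fusion trigger."""
    return Counter(history).most_common(1)[0][0]

def target_type_last(history):
    """Optional variant: the type that triggered the last fusion."""
    return history[-1]
```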
Step 203: in response to determining that the type of the first sensor matches the target sensor type, trigger the data fusion operation.
In this embodiment, based on the determination result of step 202, in response to determining that the type of the first sensor matches the target sensor type, the executing body may trigger the data fusion operation in various ways. As an example, the executing body may analyze and process the acquired data according to a preset data fusion algorithm. As another example, the executing body may send an instruction characterizing the start of the data fusion operation to the electronic device that executes the data fusion operation.
In some optional implementations of this embodiment, the executing body may trigger the data fusion operation according to the following steps:
First, obtain second data from a second sensor.
In these implementations, the executing body may obtain the second data from a communicatively connected second sensor, where the second sensor may be associated with the first sensor. As an example, the first sensor and the second sensor may be sensors located, during operation, within a preset distance threshold of each other. As another example, the first sensor and the second sensor may be sensors whose data output frequencies differ by less than a preset threshold.
Optionally, the first sensor and the second sensor may also each correspond to a preset trigger time threshold. The trigger time threshold is usually related to the frequency at which the sensor outputs data; that frequency characterizes the sensor outputting one piece of data every certain time interval. As an example, in practice the preset trigger time threshold of the first sensor may be set to a value smaller than the time interval at which the first sensor outputs data, and the preset trigger time threshold of the second sensor may be set to a value larger than the time interval at which the first sensor outputs data. The first sensor is then equivalent to the master sensor for triggering the data fusion operation, and the second sensor is equivalent to a secondary sensor. The larger the preset trigger time threshold, the lower the priority of the corresponding sensor for triggering the data fusion operation. By setting different preset trigger time thresholds for the sensors, the priority of each sensor for triggering the data fusion operation can be set.
Second, send information characterizing the start of data fusion.
In these implementations, after obtaining the second data in the first step, the executing body may analyze and process the acquired first data and second data according to a preset data fusion algorithm. The executing body may also send information characterizing the start of data fusion; this information may take various forms, such as a control signal that turns on an indicator light, or a modification of the value of a flag bit. Optionally, the executing body may also send the acquired second data to the electronic device that executes the data fusion operation, and send to that electronic device an instruction characterizing the start of the data fusion operation.
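These two trigger steps can be sketched as follows; `FakeSensor`, `trigger_fusion`, and the dictionary form of the start-of-fusion information are illustrative assumptions:

```python
class FakeSensor:
    """Stand-in for an associated second sensor."""
    def read(self):
        return "radar_frame"

def trigger_fusion(first_data, second_sensor):
    """Sketch of the two trigger steps: obtain the second sensor's data,
    then emit information characterizing the start of data fusion."""
    second_data = second_sensor.read()
    return {"fusion": "start", "inputs": [first_data, second_data]}

signal = trigger_fusion("camera_frame", FakeSensor())
```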
Step 204: in response to triggering the data fusion operation, store the type of the first sensor.
In this embodiment, in response to triggering the data fusion operation, the executing body may store the type of the first sensor. Optionally, the executing body may also update the historical record of triggering data fusion operations according to the stored type of the first sensor.
With continued reference to Fig. 3, Fig. 3 is a schematic diagram of an application scenario of the method for handling data according to an embodiment of the disclosure. In the application scenario of Fig. 3, the autonomous driving system 302 mounted on the vehicle 301 receives image data output by a binocular camera 303. The autonomous driving system 302 determines, according to the historical record of triggering data fusion operations, that the target sensor type is laser radar. The autonomous driving system 302 then determines that the type of the binocular camera 303 does not match the target sensor type, and does not execute the data fusion operation. Next, the autonomous driving system 302 receives point cloud data output by a laser radar 304, determines that the type of the laser radar 304 matches the target sensor type, and triggers the data fusion operation. The autonomous driving system 302 then stores the type of the sensor that triggered this data fusion operation, namely laser radar. Optionally, the autonomous driving system 302 may also store the laser radar type and the timestamp of triggering this data fusion operation in the historical record of triggering data fusion operations.
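The scenario above can be condensed into a small end-to-end sketch, assuming the target type is taken from the last history entry; names and types are illustrative:

```python
def process(sensor_type, target_type, history):
    """Trigger fusion only when the incoming sensor's type matches the
    target type, then record the triggering type in the history."""
    if sensor_type == target_type:
        history.append(sensor_type)
        return True   # fusion triggered
    return False      # fusion not triggered

history = ["lidar"]                              # last trigger: laser radar
cam_triggered = process("binocular_camera", history[-1], history)
lidar_triggered = process("lidar", history[-1], history)
```

As in Fig. 3, the camera data is rejected and the laser radar data triggers fusion and is recorded.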
Currently, one prior art approach usually sets a certain sensor as the master sensor and, when data transmitted by the master sensor is received, obtains the data of the other sensors and triggers data fusion. However, this approach depends strongly on that master sensor: when the master sensor fails or the frequency of its output data is unstable, the triggering of the data fusion operation is directly affected, and the information collected by the multiple sensors cannot be used in time. In contrast, the method provided by the above embodiment of the disclosure determines whether to trigger the data fusion operation by determining whether the type of the sensor from which data is received matches the target sensor type. Data fusion operations can thus be triggered by different sensors, which enriches the triggering modes, more effectively embodies the advantage of redundancy backup in fusion, and improves the reliability of triggering data fusion.
With further reference to Fig. 4, a flow 400 of another embodiment of the method for handling data is shown. The flow 400 of the method for handling data includes the following steps:
Step 401: receive the first data output by the first sensor.
Step 402: determine whether the type of the first sensor matches the target sensor type.
Step 403: in response to determining that the type of the first sensor matches the target sensor type, trigger the data fusion operation.
Step 404: in response to triggering the data fusion operation, store the type of the first sensor.
The above steps 401, 402, 403, and 404 are respectively consistent with steps 201, 202, 203, and 204 in the previous embodiment, and the descriptions above of steps 201, 202, 203, and 204 also apply to steps 401, 402, 403, and 404; details are not repeated here.
Step 405: in response to determining that the type of the first sensor does not match the target sensor type, determine whether the difference between the timestamp corresponding to the moment the first data is received and the target timestamp is greater than the target trigger threshold; in response to determining that the difference is greater than the target trigger threshold, trigger the data fusion operation.
In this embodiment, based on the result determined in step 402, in response to determining that the type of the first sensor does not match the target sensor type, the executing body may determine the difference between the timestamp corresponding to the moment the first data is received and the target timestamp. A timestamp here may refer to the total number of seconds from 00:00:00 GMT on January 1, 1970 (08:00:00 Beijing time on January 1, 1970) to a particular moment (such as the moment the first data is received). The target timestamp may be a timestamp dynamically determined according to preset rules; as an example, it may characterize the time at which the sensor that last triggered the data fusion operation last output data. The executing body may then compare the determined difference with the target trigger threshold. The target trigger threshold may be any preset number, or a value determined according to the practical application; as an example, it may be the average duration between two data fusion operations indicated by the historical record of triggering data fusion operations. In response to determining that the determined difference is greater than the target trigger threshold, the executing body may trigger the data fusion operation.
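The fallback condition of step 405 reduces to a single comparison; in this sketch the timestamps are epoch seconds and the names are illustrative:

```python
def should_trigger_fallback(receive_ts, target_ts, threshold_s):
    """Trigger fusion for a non-matching sensor when its data arrives more
    than `threshold_s` seconds after the target timestamp (e.g. the moment
    the last fusion was triggered)."""
    return receive_ts - target_ts > threshold_s
```

For example, if the last fusion was triggered at t = 100.0 s and this sensor's preset trigger time threshold is 0.2 s, data received at t = 100.3 s triggers fusion, while data received at t = 100.1 s does not.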
In some optional implementations of this embodiment, the target timestamp may include the timestamp corresponding to the triggering moment of the last data fusion operation, and the target trigger threshold may include the preset trigger time threshold corresponding to the first sensor. In response to triggering the data fusion operation, the executing body may also store the timestamp at which the data fusion operation was triggered. Optionally, the executing body may also update the historical record of triggering data fusion operations according to the stored type of the first sensor.
As can be seen from Fig. 4, compared with the embodiment corresponding to Fig. 2, the flow 400 of the method for handling data in this embodiment embodies the step of determining, when the type of the first sensor does not match the target sensor type, whether to trigger the data fusion operation according to whether the difference between the timestamp corresponding to the moment the first data is received and the target timestamp is greater than the target trigger threshold. The scheme described in this embodiment thus realizes, by introducing the target trigger threshold, switching among different sensors for triggering the data fusion operation. When frame loss occurs in the data output of the master sensor, the data fusion operation can be triggered by other sensors, fully embodying the advantage of redundancy backup in data fusion.
With further reference to Fig. 5, as an implementation of the methods shown in the above figures, the disclosure provides an embodiment of an apparatus for handling data. The apparatus embodiment corresponds to the method embodiment shown in Fig. 2, and the apparatus may specifically be applied to various electronic devices.
As shown in Fig. 5, the apparatus 500 for handling data provided in this embodiment includes a receiving unit 501, a determination unit 502, a first trigger unit 503, and a first storage unit 504. The receiving unit 501 is configured to receive first data output by a first sensor. The determination unit 502 is configured to determine whether the type of the first sensor matches a target sensor type, where the target sensor type is associated with the historical record of triggering data fusion operations. The first trigger unit 503 is configured to trigger a data fusion operation in response to determining that the type of the first sensor matches the target sensor type. The first storage unit 504 is configured to store the type of the first sensor in response to triggering the data fusion operation.
In this embodiment, for the specific processing of the receiving unit 501, the determination unit 502, the first trigger unit 503, and the first storage unit 504 of the apparatus 500 for handling data, and the technical effects they bring, reference may be made respectively to the related descriptions of steps 201, 202, 203, and 204 in the embodiment corresponding to Fig. 2; details are not repeated here.
In some optional implementations of this embodiment, the apparatus 500 for handling data may further include a second trigger unit (not shown), which may be configured to: in response to determining that the type of the first sensor does not match the target sensor type, determine whether the difference between the timestamp corresponding to the moment the first data is received and the target timestamp is greater than the target trigger threshold; and in response to determining that the difference is greater than the target trigger threshold, trigger the data fusion operation.
In some optional implementations of this embodiment, the target timestamp may include the timestamp corresponding to the triggering of the last data fusion operation, and the target trigger threshold may include the preset trigger time threshold corresponding to the first sensor. The apparatus 500 for handling data may further include a second storage unit (not shown), which may be configured to store, in response to triggering the data fusion operation, the timestamp at which the data fusion operation was triggered.
In some optional implementations of this embodiment, the first trigger unit 503 may include an acquisition module (not shown) and a sending module (not shown). The acquisition module may be configured to obtain second data from a second sensor, where the second sensor is associated with the first sensor; the sending module may be configured to send information characterizing the start of data fusion.
In some optional implementations of this embodiment, the first sensor and the second sensor may each correspond to a preset trigger time threshold.
In some optional implementations of this embodiment, the target sensor type may include the type of the sensor corresponding to the triggering of the last data fusion operation.
In the apparatus provided by the above embodiment of the disclosure, the receiving unit 501 first receives the first data output by a first sensor; the determination unit 502 then determines whether the type of the first sensor matches a target sensor type, where the target sensor type is associated with the historical record of triggered data fusion operations; next, in response to determining that the type of the first sensor matches the target sensor type, the first trigger unit 503 triggers a data fusion operation; finally, in response to the data fusion operation being triggered, the first storage unit 504 stores the type of the first sensor. The triggering of data fusion thus does not depend on a single sensor, which improves the reliability of triggering data fusion.
With further reference to Fig. 6, a timing diagram of one application scenario of the multi-source sensor fusion triggering system 600 of an embodiment of the disclosure is illustrated. The multi-source sensor fusion triggering system 600 may include a primary sensor 601, a secondary sensor 602 and a processor (not shown). The preset trigger-time threshold of the primary sensor is usually less than the data output time interval of the primary sensor, and the preset trigger-time threshold of the secondary sensor is usually greater than that of the primary sensor. The processor may be used to implement a method such as that described in steps 201 to 204 or steps 401 to 405 of the previous embodiments.
As an example, data output by a binocular camera and by a laser radar may both trigger the data fusion operation. The data output time interval of the binocular camera may be 0.08 s, i.e., its data output frequency is 12.5 Hz; the data output time interval of the laser radar may be 0.1 s, i.e., its data output frequency is 10 Hz. To set the binocular camera as the primary sensor, its preset trigger-time threshold may be set to 0.06 s, i.e., a trigger frequency of 16.67 Hz. To set the laser radar as the secondary sensor, its preset trigger-time threshold may be set to 0.15 s, i.e., a trigger frequency of 6.67 Hz.
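The intervals and thresholds in this example are consistent with the stated frequencies, as a quick numeric check shows (values taken directly from the example above; the variable names are illustrative):

```python
# Data output intervals (s) and preset trigger-time thresholds (s)
# from the binocular-camera / laser-radar example.
camera_interval, lidar_interval = 0.08, 0.10
camera_threshold, lidar_threshold = 0.06, 0.15

# Frequency is the reciprocal of the interval or threshold.
print(round(1 / camera_interval, 2))   # 12.5  Hz camera output
print(round(1 / lidar_interval, 2))    # 10.0  Hz lidar output
print(round(1 / camera_threshold, 2))  # 16.67 Hz camera trigger bound
print(round(1 / lidar_threshold, 2))   # 6.67  Hz lidar trigger bound

# Primary-sensor condition: threshold below the sensor's own interval.
print(camera_threshold < camera_interval)  # True
# Secondary-sensor condition: threshold above the primary's interval.
print(lidar_threshold > camera_interval)   # True
```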
When the data output frequency of the binocular camera is normal, the data fusion operation is triggered at the frequency of the binocular camera, as in the front half of Fig. 6. If the binocular camera drops a frame at some intermediate moment (shown as the dotted line in Fig. 6), the processor may determine the difference between the timestamp of the current frame of laser radar output data and, from the historical record, the timestamp of the last triggered data fusion (triggered by the binocular sensor). In response to determining that this difference, 0.06 s, is less than the laser radar's preset trigger-time threshold of 0.15 s, the data fusion operation is not triggered. When the laser radar outputs data again, the difference reaches 0.16 s; in response to determining that 0.16 s is greater than the laser radar's preset trigger-time threshold of 0.15 s, the system may switch to triggering the data fusion operation with the data output by the laser radar, as in the middle section of Fig. 6. Even when the primary sensor drops frames, the information collected by the sensors is still fused and a correct perception output is produced. After the data output frequency of the binocular camera returns to normal, since both the data output time interval and the preset trigger-time threshold of the binocular camera are smaller, the system may switch back to triggering the data fusion operation with the data output by the binocular camera. The frequency of the data fusion operation is thereby kept from becoming too low, so that subsequent processing modules can perceive the surrounding environment in time based on the data fusion results.
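The switching behavior of this scenario can be replayed with a small simulation. This is a sketch under the assumptions of the example (camera frames every 0.08 s with a gap, lidar fallback threshold 0.15 s); the names and the exact event timeline are illustrative:

```python
def replay(events, primary="camera", fallback_threshold=0.15):
    """Replay a time-ordered stream of (timestamp, sensor) events and
    return the events that trigger data fusion.

    Every primary-sensor frame triggers fusion; another sensor's frame
    triggers only when more than `fallback_threshold` seconds have
    passed since the last triggered fusion.
    """
    triggers, last = [], None
    for ts, sensor in events:
        if sensor == primary or (last is not None
                                 and ts - last > fallback_threshold):
            triggers.append((ts, sensor))
            last = ts
    return triggers


# Camera frames at 0.00 and 0.08 s, then dropped frames until 0.32 s;
# lidar frames every 0.10 s starting at 0.05 s.
events = [(0.00, "camera"), (0.05, "lidar"), (0.08, "camera"),
          (0.15, "lidar"), (0.25, "lidar"), (0.32, "camera")]
print(replay(events))
# [(0.0, 'camera'), (0.08, 'camera'), (0.25, 'lidar'), (0.32, 'camera')]
```

After the camera's dropped frames, the lidar frame at 0.25 s is the first whose gap from the last fusion (at 0.08 s) exceeds 0.15 s, matching the switch to the laser radar described above; the camera frame at 0.32 s then takes over again.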
Referring now to Fig. 7, a structural schematic diagram of an electronic device (e.g., the terminal device of Fig. 1) 700 suitable for implementing embodiments of the disclosure is illustrated. The terminal device in the embodiments of the disclosure may include, but is not limited to, mobile terminals such as vehicle-mounted terminals (e.g., automated driving systems, driver assistance systems) and fixed terminals such as desktop computers. The terminal device shown in Fig. 7 is only an example and should not impose any restriction on the functions and scope of use of the embodiments of the disclosure.
As shown in Fig. 7, the electronic device 700 may include a processing apparatus (such as a central processing unit, a graphics processor, etc.) 701, which may execute various appropriate actions and processing according to a program stored in a read-only memory (ROM) 702 or a program loaded from a storage apparatus 708 into a random access memory (RAM) 703. The RAM 703 also stores the various programs and data needed for the operation of the electronic device 700. The processing apparatus 701, the ROM 702 and the RAM 703 are connected to each other through a bus 704. An input/output (I/O) interface 705 is also connected to the bus 704.
In general, the following apparatuses may be connected to the I/O interface 705: an input apparatus 706 including, for example, a touch screen, a touch pad, a keyboard, a mouse, a camera, a microphone, an accelerometer, a gyroscope, etc.; an output apparatus 707 including, for example, a liquid crystal display (LCD), a loudspeaker, a vibrator, etc.; a storage apparatus 708 including, for example, a magnetic tape, a hard disk, etc.; and a communication apparatus 709. The communication apparatus 709 may allow the electronic device 700 to communicate wirelessly or by wire with other devices to exchange data. Although Fig. 7 shows an electronic device 700 with various apparatuses, it should be understood that not all of the illustrated apparatuses are required to be implemented or provided; more or fewer apparatuses may alternatively be implemented or provided. Each block shown in Fig. 7 may represent one apparatus, or may represent multiple apparatuses as needed.
In particular, according to embodiments of the disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, an embodiment of the disclosure includes a computer program product comprising a computer program carried on a computer-readable medium, the computer program containing program code for executing the methods shown in the flowcharts. In such an embodiment, the computer program may be downloaded and installed from a network through the communication apparatus 709, or installed from the storage apparatus 708, or installed from the ROM 702. When the computer program is executed by the processing apparatus 701, the above functions defined in the methods of the embodiments of the disclosure are executed.
It should be noted that the computer-readable medium described in the embodiments of the disclosure may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the two. A computer-readable storage medium may be, for example, but is not limited to, an electric, magnetic, optical, electromagnetic, infrared or semiconductor system, apparatus or device, or any combination of the above. More specific examples of computer-readable storage media may include, but are not limited to: an electrical connection with one or more conducting wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any appropriate combination of the above. In the embodiments of the disclosure, a computer-readable storage medium may be any tangible medium containing or storing a program that may be used by, or in combination with, an instruction execution system, apparatus or device. In the embodiments of the disclosure, a computer-readable signal medium may include a data signal propagated in a baseband or as a part of a carrier wave, in which computer-readable program code is carried. Such a propagated data signal may take many forms, including but not limited to an electromagnetic signal, an optical signal, or any appropriate combination of the above. A computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium, which may send, propagate or transmit a program for use by, or in combination with, an instruction execution system, apparatus or device. The program code contained on a computer-readable medium may be transmitted by any suitable medium, including but not limited to: an electric wire, an optical cable, RF (Radio Frequency), etc., or any appropriate combination of the above.
The above computer-readable medium may be included in the above terminal device, or it may exist separately without being assembled into the terminal device. The above computer-readable medium carries one or more programs which, when executed by the terminal device, cause the terminal device to: receive first data output by a first sensor; determine whether the type of the first sensor matches a target sensor type, where the target sensor type is associated with the historical record of triggered data fusion operations; in response to determining that the type of the first sensor matches the target sensor type, trigger a data fusion operation; and, in response to the data fusion operation being triggered, store the type of the first sensor.
Computer program code for executing the operations of the embodiments of the disclosure may be written in one or more programming languages or a combination thereof, including object-oriented programming languages such as Java, Smalltalk and C++, as well as conventional procedural programming languages such as the "C" language or similar. The program code may be executed entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the internet using an internet service provider).
The flowcharts and block diagrams in the accompanying drawings illustrate the possible architecture, functions and operations of the systems, methods and computer program products according to various embodiments of the disclosure. In this regard, each box in a flowchart or block diagram may represent a module, a program segment or a part of code, which contains one or more executable instructions for implementing the specified logical function. It should also be noted that in some alternative implementations, the functions marked in the boxes may occur in an order different from that indicated in the drawings. For example, two successively represented boxes may in fact be executed substantially in parallel, or sometimes in the opposite order, depending on the functions involved. It should further be noted that each box in the block diagrams and/or flowcharts, and combinations of boxes in the block diagrams and/or flowcharts, may be implemented with a dedicated hardware-based system that executes the specified functions or operations, or with a combination of dedicated hardware and computer instructions.
The units described in the embodiments of the disclosure may be realized by software or by hardware. The described units may also be set in a processor, which may, for example, be described as: a processor including a receiving unit, a determination unit, a first trigger unit and a first storage unit. The names of these units do not, in some cases, constitute a restriction on the units themselves; for example, the receiving unit may also be described as "a unit that receives first data output by a first sensor".
The above description is only a preferred embodiment of the disclosure and an explanation of the applied technical principles. Those skilled in the art should appreciate that the scope of the invention involved in the embodiments of the disclosure is not limited to technical solutions formed by the specific combination of the above technical features, and should also cover other technical solutions formed by any combination of the above technical features or their equivalent features without departing from the above inventive concept, such as technical solutions formed by mutually replacing the above features with technical features with similar functions disclosed in (but not limited to) the embodiments of the disclosure.
Claims (15)
1. A method for processing data, comprising:
receiving first data output by a first sensor;
determining whether the type of the first sensor matches a target sensor type, wherein the target sensor type is associated with a historical record of triggered data fusion operations;
in response to determining that the type of the first sensor matches the target sensor type, triggering the data fusion operation; and
in response to triggering the data fusion operation, storing the type of the first sensor.
2. The method according to claim 1, wherein the method further comprises:
in response to determining that the type of the first sensor does not match the target sensor type, determining whether a difference between a timestamp corresponding to the moment the first data is received and a target timestamp is greater than a target trigger threshold; and in response to determining that the difference is greater than the target trigger threshold, triggering the data fusion operation.
3. The method according to claim 2, wherein the target timestamp comprises a timestamp corresponding to the last triggering of a data fusion operation, and the target trigger threshold comprises a preset trigger-time threshold corresponding to the first sensor; and
the method further comprises:
in response to triggering the data fusion operation, storing a timestamp of the moment the data fusion operation is triggered.
4. The method according to claim 1, wherein triggering the data fusion operation comprises:
obtaining second data from a second sensor, wherein the second sensor is associated with the first sensor; and
sending information characterizing that data fusion starts.
5. The method according to claim 4, wherein the first sensor and the second sensor each correspond to a preset trigger-time threshold.
6. The method according to any one of claims 1-5, wherein the target sensor type comprises the type of the sensor corresponding to the last triggering of a data fusion operation.
7. An apparatus for processing data, comprising:
a receiving unit configured to receive first data output by a first sensor;
a determination unit configured to determine whether the type of the first sensor matches a target sensor type, wherein the target sensor type is associated with a historical record of triggered data fusion operations;
a first trigger unit configured to, in response to determining that the type of the first sensor matches the target sensor type, trigger the data fusion operation; and
a first storage unit configured to, in response to triggering the data fusion operation, store the type of the first sensor.
8. The apparatus according to claim 7, wherein the apparatus further comprises:
a second trigger unit configured to, in response to determining that the type of the first sensor does not match the target sensor type, determine whether a difference between a timestamp corresponding to the moment the first data is received and a target timestamp is greater than a target trigger threshold; and, in response to determining that the difference is greater than the target trigger threshold, trigger the data fusion operation.
9. The apparatus according to claim 8, wherein the target timestamp comprises a timestamp corresponding to the last triggering of a data fusion operation, and the target trigger threshold comprises a preset trigger-time threshold corresponding to the first sensor; and the apparatus further comprises:
a second storage unit configured to, in response to triggering the data fusion operation, store a timestamp of the moment the data fusion operation is triggered.
10. The apparatus according to claim 7, wherein the first trigger unit comprises:
an acquisition module configured to obtain second data from a second sensor, wherein the second sensor is associated with the first sensor; and
a sending module configured to send information characterizing that data fusion starts.
11. The apparatus according to claim 10, wherein the first sensor and the second sensor each correspond to a preset trigger-time threshold.
12. The apparatus according to any one of claims 7-11, wherein the target sensor type comprises the type of the sensor corresponding to the last triggering of a data fusion operation.
13. A multi-source sensor fusion triggering system, comprising a primary sensor, a secondary sensor and a processor, wherein the preset trigger-time threshold of the primary sensor is less than the data output time interval of the primary sensor, the preset trigger-time threshold of the secondary sensor is greater than the data output time interval of the primary sensor, and the processor is configured to implement the method according to any one of claims 1-6.
14. A terminal, comprising:
one or more processors; and
a storage apparatus on which one or more programs are stored,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method according to any one of claims 1-6.
15. A computer-readable medium on which a computer program is stored, wherein the program, when executed by a processor, implements the method according to any one of claims 1-6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910151379.9A CN109871385B (en) | 2019-02-28 | 2019-02-28 | Method and apparatus for processing data |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109871385A true CN109871385A (en) | 2019-06-11 |
CN109871385B CN109871385B (en) | 2021-07-27 |
Family
ID=66919523
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910151379.9A Active CN109871385B (en) | 2019-02-28 | 2019-02-28 | Method and apparatus for processing data |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109871385B (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111310627A (en) * | 2020-02-07 | 2020-06-19 | 广州视源电子科技股份有限公司 | Detection method and device of sensing device and electronic equipment |
CN111753901A (en) * | 2020-06-23 | 2020-10-09 | 国汽(北京)智能网联汽车研究院有限公司 | Data fusion method, device and system and computer equipment |
CN112307094A (en) * | 2019-07-26 | 2021-02-02 | 北京百度网讯科技有限公司 | Automatic driving data reading method and device, computer equipment and storage medium |
CN113327344A (en) * | 2021-05-27 | 2021-08-31 | 北京百度网讯科技有限公司 | Fusion positioning method, device, equipment, storage medium and program product |
WO2022110801A1 (en) * | 2020-11-26 | 2022-06-02 | 浙江商汤科技开发有限公司 | Data processing method and apparatus, electronic device, and storage medium |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102461071A (en) * | 2009-06-15 | 2012-05-16 | 高通股份有限公司 | Method for configuration of a sensor in a network |
JP5753286B1 (en) * | 2014-02-05 | 2015-07-22 | 株式会社日立パワーソリューションズ | Information processing apparatus, diagnostic method, and program |
CN106017475A (en) * | 2016-07-04 | 2016-10-12 | 四川九洲电器集团有限责任公司 | Flight path updating method and flight path updating device |
CN106840242A (en) * | 2017-01-23 | 2017-06-13 | 驭势科技(北京)有限公司 | The sensor self-checking system and multi-sensor fusion system of a kind of intelligent driving automobile |
WO2018073960A1 (en) * | 2016-10-21 | 2018-04-26 | 日本電気株式会社 | Display method, display device, and program |
CN108663988A (en) * | 2018-05-31 | 2018-10-16 | 深圳市鑫汇达机械设计有限公司 | Numerically-controlled machine tool intelligent monitor system based on Internet of Things |
CN108983219A (en) * | 2018-08-17 | 2018-12-11 | 北京航空航天大学 | A kind of image information of traffic scene and the fusion method and system of radar information |
Non-Patent Citations (1)
Title |
---|
Hou Limei: "Research on algorithms for an intelligent navigation vehicle based on multi-sensor data fusion", China Master's Theses Full-text Database, Engineering Science and Technology II *
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||