CN109752717A - Apparatus and method for associating sensor data in a vehicle - Google Patents
Apparatus and method for associating sensor data in a vehicle
- Publication number
- CN109752717A (application CN201810685694.5A / CN201810685694A)
- Authority
- CN
- China
- Prior art keywords
- target
- vehicle
- sensor
- sensing data
- region
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
- 238000000034 method Methods 0.000 title claims abstract description 19
- 230000004927 fusion Effects 0.000 description 22
- 238000010586 diagram Methods 0.000 description 21
- 238000001514 detection method Methods 0.000 description 5
- 238000005259 measurement Methods 0.000 description 5
- 238000007781 pre-processing Methods 0.000 description 4
- 238000012545 processing Methods 0.000 description 4
- 230000006399 behavior Effects 0.000 description 2
- 230000006870 function Effects 0.000 description 2
- 238000011161 development Methods 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 230000002708 enhancing effect Effects 0.000 description 1
- 239000011159 matrix material Substances 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 238000003672 processing method Methods 0.000 description 1
- 239000004065 semiconductor Substances 0.000 description 1
- 238000012360 testing method Methods 0.000 description 1
- 238000012795 verification Methods 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
- B60W40/04—Traffic conditions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
- B60W40/06—Road conditions
- B60W40/072—Curvature of the road
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/10—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to vehicle motion
- B60W40/105—Speed
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/867—Combination of radar systems with cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
- G06V20/584—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of vehicle lights or traffic lights
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W2050/0001—Details of the control system
- B60W2050/0002—Automatic control, details of type of controller or control system architecture
- B60W2050/0004—In digital systems, e.g. discrete-time systems involving sampling
- B60W2050/0005—Processor details or data handling, e.g. memory registers or chip architecture
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo or light sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- B60W2420/408—
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2520/00—Input parameters relating to overall vehicle dynamics
- B60W2520/10—Longitudinal speed
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/40—Dynamic objects, e.g. animals, windblown objects
- B60W2554/404—Characteristics
- B60W2554/4042—Longitudinal speed
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/40—Dynamic objects, e.g. animals, windblown objects
- B60W2554/404—Characteristics
- B60W2554/4044—Direction of movement, e.g. backwards
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/80—Spatial relation or speed relative to objects
- B60W2554/802—Longitudinal distance
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Y—INDEXING SCHEME RELATING TO ASPECTS CROSS-CUTTING VEHICLE TECHNOLOGY
- B60Y2400/00—Special features of vehicle units
- B60Y2400/30—Sensors
Abstract
The present disclosure provides an apparatus and method for associating sensor data in a vehicle. The apparatus includes a plurality of sensors configured to sense an external object and a processor configured to be electrically connected with the plurality of sensors. The processor is configured to obtain, using the plurality of sensors, a sensor data set for a target located around the vehicle, to select at least part of the sensor data included in the sensor data set based on at least one of information about the target, the curvature of the road on which the vehicle or the target is traveling, and the type of the sensor configured to sense the target, and to generate tracking data of the target by associating the at least part of the sensor data.
Description
Cross-Reference to Related Applications
This application claims priority to and the benefit of Korean Patent Application No. 10-2017-0147439, filed on November 7, 2017, the entire contents of which are incorporated herein by reference.
Technical Field
The present disclosure relates to an apparatus and method for associating data obtained by a plurality of sensors.
Background Art
The statements in this section merely provide background information related to the present disclosure and may not constitute prior art.
With the development of the automotive industry, systems have been developed that can track surrounding vehicles and obtain various kinds of information about them, in order to provide convenience to the driver. For example, a vehicle may use various sensors, such as a front radar, corner radars, a front-view camera, side-view cameras, a rear-view camera, and a light detection and ranging (LiDAR) device, to obtain information about surrounding vehicles located in all directions. The vehicle may generate sensor fusion data by associating the sensor data obtained by each of the sensors. As a result, the accuracy of the data obtained by each sensor can be increased. The vehicle may associate the sensor fusion data of a previous frame with the sensor fusion data of the present frame.
Depending on the driving environment (that is, the movement of the vehicle, the movement of the target, the sensing range of each of the plurality of sensors, and the like), it may be difficult to associate the data of the previous frame with the data of the present frame. In this case, the attribute information (for example, information indicating the type of the target, such as a vehicle or a pedestrian) and the identifier (ID) (for example, a unique identification value assigned to the target) of the previous frame's sensor fusion data for a target may differ from the attribute information and ID of the present frame's sensor fusion data for the same target. When the data association of the vehicle fails, the operation of the various systems of the vehicle that use the sensors may also fail.
Summary of the invention
An aspect of the present disclosure provides an apparatus and method for associating sensor data that robustly associate the data obtained by the sensors while comprehensively considering the vehicle, the surrounding vehicles, the driving environment, and the like.
According to an aspect of the present disclosure, an apparatus for associating sensor data in a vehicle may include: a plurality of sensors configured to sense an external object; and a processor configured to be electrically connected with the plurality of sensors. The processor may be configured to obtain, using the plurality of sensors, a sensor data set for a target located around the vehicle, to select at least part of the sensor data included in the sensor data set based on at least one of information about the target, the curvature of the road on which the vehicle or the target is traveling, and the type of the sensor configured to sense the target, and to generate tracking data of the target by associating the at least part of the sensor data.
In some embodiments of the present disclosure, the plurality of sensors may include a radar and a camera.
In some embodiments of the present disclosure, the plurality of sensors may include a front radar, a front-view camera, and a side-view camera.
In some embodiments of the present disclosure, the sensor data set may include the sensor data obtained by each of the plurality of sensors.
In some embodiments of the present disclosure, the processor may be configured to: set a region for determining the validity of the sensor data set based on at least one of the information about the target, the curvature of the road on which the vehicle is traveling, and the type of the sensor configured to sense the target; and select, from the sensor data included in the sensor data set, the sensor data corresponding to the region.
In some embodiments of the present disclosure, the information about the target may include at least one of the distance between the vehicle and the target, the difference between the speed of the vehicle and the speed of the target, the direction of movement of the target, and the type of the target.
In some embodiments of the present disclosure, the processor may be configured to: set the region for determining the validity of the sensor data set based on the distance between the vehicle and the target; and select, from the sensor data included in the sensor data set, the sensor data corresponding to the region.
In some embodiments of the present disclosure, the processor may be configured to: enlarge the region when the distance between the vehicle and the target becomes greater than a predetermined distance; and reduce the region when the distance between the vehicle and the target becomes less than the predetermined distance.
In some embodiments of the present disclosure, the processor may be configured to: set the region for determining the validity of the sensor data set based on the difference between the speed of the vehicle and the speed of the target; and select, from the sensor data included in the sensor data set, the sensor data corresponding to the region.
In some embodiments of the present disclosure, the processor may be configured to: enlarge the region when the difference between the speed of the vehicle and the speed of the target becomes greater than a predetermined value; and reduce the region when the difference between the speed of the vehicle and the speed of the target becomes less than the predetermined value.
In some embodiments of the present disclosure, the processor may be configured to: set the region for determining the validity of the sensor data set based on the curvature of the road; and select, from the sensor data included in the sensor data set, the sensor data corresponding to the region.
In some embodiments of the present disclosure, the processor may be configured to: enlarge the region when the curvature of the road becomes greater than a predetermined range; and reduce the region when the curvature of the road becomes less than the predetermined range.
In some embodiments of the present disclosure, the processor may be configured to: set the region for determining the validity of the sensor data set based on the direction of movement of the target; and select, from the sensor data included in the sensor data set, the sensor data corresponding to the region.
In some embodiments of the present disclosure, the processor may be configured to increase the length of the region in the direction of movement of the target.
In some embodiments of the present disclosure, the processor may be configured to: set the region for determining the validity of the sensor data set based on the type of the sensor; and select, from the sensor data included in the sensor data set, the sensor data corresponding to the region.
In some embodiments of the present disclosure, the processor may be configured to: set the region for determining the validity of the sensor data set based on the type of the target; and select, from the sensor data included in the sensor data set, the sensor data corresponding to the region.
According to another aspect of the present disclosure, a method for associating sensor data in a vehicle may include: obtaining, using a plurality of sensors of the vehicle, a sensor data set for a target located around the vehicle; selecting at least part of the sensor data included in the sensor data set based on at least one of information about the target, the curvature of the road on which the vehicle or the target is traveling, and the type of the sensor configured to sense the target; and generating tracking data of the target by associating the at least part of the sensor data.
In some embodiments of the present disclosure, the sensor data set may include the sensor data obtained by each of the plurality of sensors.
In some embodiments of the present disclosure, selecting the at least part of the sensor data may include: setting a region for determining the validity of the sensor data set based on at least part of the information about the target, the curvature of the road on which the vehicle is traveling, and the type of the sensor configured to sense the target; and selecting, from the sensor data included in the sensor data set, the sensor data corresponding to the region.
In some embodiments of the present disclosure, the information about the target may include at least one of: the distance between the vehicle and the target, the difference between the speed of the vehicle and the speed of the target, the direction of movement of the target, and the type of the target.
Further areas of applicability will become apparent from the description provided herein. It should be understood that the description and specific examples are intended for purposes of illustration only and are not intended to limit the scope of the present disclosure.
Brief Description of the Drawings
In order that the disclosure may be well understood, various embodiments of the disclosure, given by way of example, will now be described with reference to the accompanying drawings, in which:
Fig. 1 is a block diagram showing the configuration of an apparatus for associating sensor data;
Fig. 2 is a block diagram showing the configuration of a program module included in the apparatus for associating sensor data;
Fig. 3 is a flowchart showing an algorithm used in the apparatus for associating sensor data;
Fig. 4 is a diagram showing an exemplary operation of the apparatus for associating sensor data;
Fig. 5 is a diagram showing an exemplary operation of the apparatus for associating sensor data;
Fig. 6 is a diagram showing an exemplary operation of the apparatus for associating sensor data;
Fig. 7 is a diagram showing an exemplary operation of the apparatus for associating sensor data;
Fig. 8 is a diagram showing an exemplary operation of the apparatus for associating sensor data;
Fig. 9 is a diagram showing an exemplary operation of the apparatus for associating sensor data;
Fig. 10 is a diagram showing an exemplary operation of the apparatus for associating sensor data;
Fig. 11 is a flowchart showing a method for associating sensor data; and
Fig. 12 is a block diagram showing the configuration of a computing system.
The drawings described herein are for illustration purposes only and are not intended to limit the scope of the present disclosure in any way.
Detailed Description
The following description is merely exemplary in nature and is not intended to limit the present disclosure, its application, or uses. It should be understood that, throughout the drawings, corresponding reference numerals indicate like or corresponding parts and features.
Hereinafter, embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. In adding reference numerals to the elements of each drawing, it should be noted that identical elements have the same reference numerals even when they are shown on different drawings. In addition, in describing embodiments of the present disclosure, a detailed description of a related known configuration or function will be omitted when it is determined that it would obscure the gist of the embodiments of the present disclosure.
In describing the elements of embodiments of the present disclosure, terms such as first, second, "first", "second", A, B, (a), and (b) may be used herein. These terms are only used to distinguish one element from another element, and they do not limit the corresponding elements, irrespective of the nature, order, or sequence of those elements. Unless otherwise defined, all terms used herein, including technical and scientific terms, have the same meanings as those generally understood by those skilled in the art to which the present disclosure pertains. Terms defined in commonly used dictionaries should be interpreted as having meanings consistent with their meanings in the context of the relevant field, and should not be interpreted as having ideal or excessively formal meanings unless explicitly so defined in the present application.
Fig. 1 is a block diagram showing the configuration of an apparatus for associating sensor data in some embodiments of the present disclosure.
Referring to Fig. 1, an apparatus 100 for associating sensor data in a vehicle (hereinafter, for convenience of description, referred to as the "apparatus 100") may include a plurality of sensors 111 to 113 and a processor 120. The apparatus 100 of Fig. 1 may be mounted in a vehicle.
Each of the plurality of sensors 111 to 113 may be configured to sense an external object. Each of the plurality of sensors 111 to 113 may be configured to obtain information about the external object. For example, each of the plurality of sensors 111 to 113 may obtain information about the position of the external object, the speed of the external object, the moving direction of the external object, and/or the type of the external object (for example, a vehicle, a pedestrian, a bicycle, a motorcycle, or the like). The plurality of sensors 111 to 113 may include, for example, a radar and a camera. For example, the first sensor 111 may be a front radar, the second sensor 112 may be a front-view camera, and the third sensor 113 may be a side-view camera. Fig. 1 depicts an illustrative embodiment of the apparatus 100 including three sensors 111 to 113. However, embodiments are not limited thereto. For example, the apparatus 100 may include two or more sensors and may include different types of sensors, such as a laser scanner and/or a corner radar.
The processor 120 may be electrically connected with the plurality of sensors 111 to 113. The processor 120 may control the plurality of sensors 111 to 113 and may perform various kinds of data processing and various arithmetic operations.
In some embodiments of the present disclosure, the processor 120 may use the plurality of sensors 111 to 113 to obtain a sensor data set for a target located around the vehicle. The processor 120 may detect the target through each of the plurality of sensors 111 to 113. The sensor data set may include the sensor data obtained by each of the plurality of sensors 111 to 113. For example, the processor 120 may obtain first data using the first sensor 111, second data using the second sensor 112, and third data using the third sensor 113. The processor 120 may obtain a sensor data set including the first data, the second data, and the third data.
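One way to picture the sensor data set described above is as a grouping of per-sensor detections of a single target. Every name in this sketch is an assumption for illustration, not taken from the patent:

```python
# Sketch (assumed names): group per-sensor detections of one target
# into a sensor data set keyed by the sensor that produced them.
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str   # e.g. "front_radar", "front_camera", "side_camera"
    x: float      # longitudinal position of the target, in meters
    y: float      # lateral position of the target, in meters

def collect_sensor_data_set(readings):
    """Return {sensor_name: [detections]} for one target."""
    data_set = {}
    for det in readings:
        data_set.setdefault(det.sensor, []).append(det)
    return data_set
```

Under this sketch, the "first data", "second data", and "third data" of the paragraph above are simply the lists stored under each sensor's key.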
In some embodiments of the present disclosure, the processor 120 may select at least part of the sensor data set based on at least part of the information about the target, the curvature of the road on which the vehicle or the target is traveling, and the type of the sensor that senses the target. The information about the target may include, for example, information about at least part of the distance between the vehicle and the target, the difference between the speed of the vehicle and the speed of the target, the direction of movement of the target, and the type of the target. The processor 120 may set a region for determining the validity of the sensor data set based on at least part of the information about the target, the curvature of the road on which the vehicle is traveling, and the type of the sensor that senses the target. The region may be a region for gating the sensor data set. The processor 120 may select, from the sensor data included in the sensor data set, the sensor data corresponding to the set region. The processor 120 may perform sensor fusion using the selected sensor data. A detailed description of the operation of setting the region for determining validity will be given with reference to Figs. 5 to 10.
In some embodiments of the present disclosure, the processor 120 may generate tracking data of the target by associating at least part of the selected sensor data. The tracking data may be target data obtained through sensor fusion. The processor 120 may store the tracking data of a previous frame. The processor 120 may predict the movement of the target using the tracking data of the previous frame to obtain prediction data. The processor 120 may associate the selected sensor data with the data predicted from the tracking data of the previous frame. For example, the processor 120 may associate the data using the coordinates included in the tracking data of the previous frame and the coordinates of the selected sensor data. The processor 120 may generate the tracking data of the present frame of the target through the association.
Fig. 2 is a block diagram showing the configuration of a program module included in the apparatus for associating sensor data in some embodiments of the present disclosure.
Referring to Fig. 2, the apparatus in some embodiments of the present disclosure may include a program module. The program module may be used for sensor fusion, may obtain target data from the plurality of sensors, and may generate sensor fusion tracking data of the target. The program module may include a preprocessing module 210, an association module 220, a tracking module 230, and a track management module 240.
The preprocessing module 210 may receive signals from the plurality of sensors. The preprocessing module 210 may perform temporal/spatial synchronization of the received signals. The preprocessing module 210 may classify a detected target as a moving object or a stationary object. The preprocessing module 210 may perform validity verification for each of the plurality of sensors (for example, fault detection and isolation (FDI) — track continuity, track movement history, measurement comparison, and the like).
The association module 220 may determine the association between the sensor fusion tracking data of the previous frame and the sensor data of the present frame. Based on the determination result, the association module 220 may change the association map. The association module 220 may generate a new association map (for example, a track candidate) using the unassociated sensor data of the present frame. The association module 220 may verify the association between the sensor fusion tracking data of the previous frame and the sensor data of the present frame.
The tracking module 230 may update the position information, the speed information, and the estimation error information of the sensor fusion tracking data. The tracking module 230 may update the estimation error covariance information.
The track management module 240 may manage the life cycle of the sensor fusion tracking data (for example, initialization, confirmation, or termination). The track management module 240 may classify the sensor fusion tracking data as data of a moving object or data of a stationary object. The track management module 240 may perform validity verification of the sensor fusion tracking data (FDI — track continuity, track movement history, spatial similarity, temporal similarity, and the like).
Fig. 3 is a flowchart showing an algorithm used in the apparatus for associating sensor data according to embodiments of the present disclosure.
Referring to Fig. 3, the apparatus in some embodiments of the present disclosure may use a nearest-neighbor (NN) data association technique. The NN data association technique can be implemented easily and can provide accurate results when the motion and measurement models of the system are accurate. However, when the NN data association technique is executed, improper data association may occur, and because of the improper data association, sensor fusion tracking data may be generated using improper measurements. This may degrade performance. The apparatus in some embodiments of the present disclosure may provide functions for enhancing the performance of the operations related to data association (for example, innovation measurement and calculation result verification, measurement association, and combined measurement results).
Fig. 4 is a diagram showing an exemplary operation of the apparatus for associating sensor data in some embodiments of the present disclosure.
Referring to Fig. 4, the apparatus in some embodiments of the present disclosure may determine the association between the sensor fusion tracking data at a previous time and newly measured sensor data. For example, the apparatus may generate the sensor fusion tracking data of a first frame (a previous frame) based on the sensor data (z2) 411 to 413 measured at the first frame by a camera, a corner radar, and a radar. The apparatus may predict the movement of the target based on the sensor fusion tracking data of the first frame, and may predict the sensor fusion tracking data 420 to be obtained at a second frame (the present frame). The apparatus may obtain the sensor data (z1) 431 to 433 measured at the second frame by the camera, the corner radar, and the radar. The apparatus may select, from the data measured by the plurality of sensors, the sensor data 411 to 413 and 431 to 433 that have coordinates located in a gate 450.
The example formula of data in 450 can be following formula 1.
[formula 1]
The first formula above may be the Mahalanobis distance formula, where z may denote the measured sensor data, ẑ may denote the predicted sensor fusion tracking data, d may denote the distance between the coordinates of the measured sensor data and the coordinates of the predicted sensor fusion tracking data, S may denote a covariance matrix, and γ may denote the gate threshold. The apparatus may select the sensor data satisfying the first formula as valid sensor data. The second and third formulas may be derived from the first formula. An elliptical gate may be formed by Formula 1, and the gate threshold may be changed according to the cases to be described with reference to Figs. 5 to 10.
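The text of Formula 1 did not survive extraction. Based on the surrounding definitions (measured data z, predicted track ẑ, innovation covariance S, gate threshold γ), the Mahalanobis gating test it describes is conventionally written as:

```latex
d^2 = (z - \hat{z})^{\top} S^{-1} (z - \hat{z}) \le \gamma
```

A measurement z is kept as valid sensor data when its squared Mahalanobis distance to the predicted track falls within the gate threshold γ; the level set d² = γ is the elliptical gate mentioned above. This is a conventional reconstruction, not the patent's verbatim formula.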
The apparatus may compare the distance between the coordinates of the predicted sensor fusion tracking data 420 and the coordinates of the sensor data 411 measured by the camera at the first frame with the distance between the coordinates of the predicted sensor fusion tracking data 420 and the coordinates of the sensor data 431 measured by the camera at the second frame. The apparatus may select the sensor data 431, whose coordinates are relatively close to the coordinates of the predicted sensor fusion tracking data 420. The apparatus may select the sensor data 432 and the sensor data 433 in a similar manner. The apparatus may generate the sensor fusion tracking data of the second frame using the selected sensor data 431 to 433.
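By way of illustration only (this sketch is not part of the original disclosure), the gate-then-nearest-neighbor selection described above can be written as follows; the function name, the 2-D state, and the numeric values are illustrative assumptions:

```python
import numpy as np

def gate_and_associate(z_hat, measurements, S, gamma):
    """Select measurements inside the Mahalanobis gate and return the one
    nearest to the predicted track position z_hat (NN association).

    z_hat: predicted track position, shape (2,)
    measurements: candidate sensor measurements, shape (m, 2)
    S: innovation covariance, shape (2, 2)
    gamma: gate threshold
    Returns the associated measurement, or None if the gate is empty.
    """
    S_inv = np.linalg.inv(S)
    best, best_d2 = None, gamma
    for z in measurements:
        nu = z - z_hat                  # innovation
        d2 = float(nu @ S_inv @ nu)     # squared Mahalanobis distance
        if d2 <= best_d2:               # inside the gate and closest so far
            best, best_d2 = z, d2
    return best

# Example: two camera detections near a predicted track at (10, 2);
# only the first lies inside the gate and is therefore associated.
z_hat = np.array([10.0, 2.0])
zs = np.array([[10.5, 2.1], [14.0, 5.0]])
print(gate_and_associate(z_hat, zs, np.eye(2), gamma=9.21))
```

Because the loop tracks the smallest gated distance, measurements outside the gate can never be associated, which is the improper-association safeguard Figs. 3 and 4 describe.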
Fig. 5 is a diagram illustrating an exemplary operation of the apparatus for associating sensor data in some embodiments of the disclosure.
The vehicle 510 shown in Fig. 5 may include the apparatus 100 of Fig. 1. In the description of Fig. 5, operations described as being performed by the vehicle 510 may be understood as being controlled by the processor 120 of the apparatus 100.
In some embodiments of the disclosure, the vehicle may set a region for determining the validity of the sensor data set based on the distance between the vehicle and a target, and may select, from the sensor data included in the sensor data set, the sensor data corresponding to the set region. For example, the vehicle may increase the area of the region when the distance between the vehicle and the target becomes longer, and may reduce the area of the region when the distance becomes shorter.
Referring to Fig. 5, the vehicle 510 may detect vehicles around it using its sensors. For example, the vehicle 510 may detect a target vehicle 520 located in front of the vehicle 510. When the target vehicle 520 accelerates so that the distance between the target vehicle 520 and the vehicle 510 becomes longer, the uncertainty of the measurements of the target vehicle 520 may increase. Therefore, when the distance between the vehicle 510 and the target vehicle 520 becomes longer, the vehicle 510 may increase the size of the gate.
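Claim 8 states this rule in terms of a preset distance. A minimal sketch of the rule follows; the preset distance and scaling factor are illustrative assumptions, not values from the patent:

```python
def adjust_gate_area(base_area, distance_m, preset_distance_m=30.0, factor=1.5):
    """Grow the gate area for targets farther than a preset distance,
    shrink it for nearer targets (all constants illustrative)."""
    if distance_m > preset_distance_m:
        return base_area * factor
    if distance_m < preset_distance_m:
        return base_area / factor
    return base_area
```

A larger gate admits more candidate measurements for a distant, uncertain target, at the cost of a higher risk of improper association.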
Fig. 6 is a diagram illustrating an exemplary operation of the apparatus for associating sensor data in some embodiments of the disclosure.
The vehicle 610 shown in Fig. 6 may include the apparatus 100 of Fig. 1. In the description of Fig. 6, operations described as being performed by the vehicle 610 may be understood as being controlled by the processor 120 of the apparatus 100.
In some embodiments of the disclosure, the vehicle may set a region for determining the validity of the sensor data set based on the difference between the speed of the vehicle and the speed of a target, and may select, from the sensor data included in the sensor data set, the sensor data corresponding to the set region. For example, the vehicle may increase the area of the region when the difference between the speed of the vehicle and the speed of the target becomes larger, and may reduce the area of the region when the difference becomes smaller.
Referring to Fig. 6, the vehicle 610 may detect vehicles around it using its sensors. For example, the vehicle 610 may detect a first target vehicle 620 located in front of and to the left of the vehicle 610 and a second target vehicle 630 located in front of and to the right of the vehicle 610. The vehicle 610 and the first target vehicle 620 may be traveling at a low speed, while the second target vehicle 630 may be traveling at a high speed. In this case, the relative velocity of the vehicle 610 with respect to the second target vehicle 630 may be higher than the relative velocity of the vehicle 610 with respect to the first target vehicle 620. The higher the relative velocity, the higher the uncertainty of the measurements may be. Therefore, the vehicle 610 may set the size of the gate for the second target vehicle 630 to be larger than the size of the gate for the first target vehicle 620.
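The same rule, keyed on the ego/target speed difference as in claim 10, can be sketched as follows; the threshold and factor are again illustrative assumptions:

```python
def adjust_gate_for_speed_diff(base_area, ego_speed, target_speed,
                               predetermined_diff=5.0, factor=1.5):
    """Enlarge the gate when the absolute ego/target speed difference
    exceeds a predetermined value, otherwise shrink it
    (threshold and factor are illustrative)."""
    diff = abs(ego_speed - target_speed)
    if diff > predetermined_diff:
        return base_area * factor
    if diff < predetermined_diff:
        return base_area / factor
    return base_area
```

In Fig. 6's scenario, the fast second target vehicle 630 would thus receive the enlarged gate while the slow first target vehicle 620 would not.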
Fig. 7 is a diagram illustrating an exemplary operation of the apparatus for associating sensor data in some embodiments of the disclosure.
The vehicle 710 shown in Fig. 7 may include the apparatus 100 of Fig. 1. In the description of Fig. 7, operations described as being performed by the vehicle 710 may be understood as being controlled by the processor 120 of the apparatus 100.
In some embodiments of the disclosure, the vehicle may set a region for determining the validity of the sensor data set based on the curvature of the road, and may select, from the sensor data included in the sensor data set, the sensor data corresponding to the set region. For example, the vehicle may increase the area of the region when the curvature of the road becomes larger, and may reduce the area of the region when the curvature becomes smaller.
Referring to Fig. 7, the vehicle 710 may detect vehicles around it using its sensors. For example, the vehicle 710 may detect a first target vehicle 720 located in front of the vehicle 710 and a second target vehicle 730 located in front of the first target vehicle 720. The vehicle 710 may obtain map information and may use the map information to obtain road curvature information at the position of the first target vehicle 720 and road curvature information at the position of the second target vehicle 730. The curvature at the position of the second target vehicle 730 may be greater than the curvature at the position of the first target vehicle 720. The greater the curvature at a target's position, the higher the uncertainty of the measurements may be. Therefore, the vehicle 710 may set the size of the gate for the second target vehicle 730 to be larger than the size of the gate for the first target vehicle 720.
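Per-target gate sizing from the map's curvature at each target's position can be sketched as below; the curvature threshold and factor are illustrative assumptions:

```python
def gate_areas_by_curvature(base_area, curvatures,
                            preset_curvature=0.01, factor=2.0):
    """Return one gate area per target: a larger gate where the road
    curvature at the target's position exceeds a preset range
    (threshold and factor are illustrative)."""
    return [base_area * factor if abs(c) > preset_curvature
            else base_area / factor
            for c in curvatures]
```

For Fig. 7, a nearly straight segment under the first target vehicle 720 and a curved segment under the second target vehicle 730 would yield a smaller and a larger gate, respectively.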
Fig. 8 is a diagram illustrating an exemplary operation of the apparatus for associating sensor data in some embodiments of the disclosure.
The vehicle 810 shown in Fig. 8 may include the apparatus 100 of Fig. 1. In the description of Fig. 8, operations described as being performed by the vehicle 810 may be understood as being controlled by the processor 120 of the apparatus 100.
In some embodiments of the disclosure, the vehicle may set a region for determining the validity of the sensor data set based on the direction of motion of a target, and may select, from the sensor data included in the sensor data set, the sensor data corresponding to the set region. For example, the vehicle may increase the length of the region in the direction of motion of the target.
Referring to Fig. 8, the vehicle 810 may detect vehicles around it using its sensors. The vehicle 810 may detect a first target vehicle 820 traveling toward the right of the vehicle 810 and a second target vehicle 830 traveling ahead of the vehicle 810. The vehicle 810 may obtain information about the direction of motion of each of the first target vehicle 820 and the second target vehicle 830. When a target moves, the uncertainty of the measurements along its direction of motion may become higher. For example, when a target moves in the x-axis direction, the uncertainty of the measured x-axis coordinate may become higher. Therefore, the vehicle 810 may set the gate for the first target vehicle 820 to be longer in the horizontal direction, and may set the gate for the second target vehicle 830 to be longer in the vertical direction.
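Stretching an elliptical gate along the target's direction of motion can be sketched as follows; the stretch factor and the axis-aligned simplification are illustrative assumptions:

```python
def gate_half_axes(base_half_axis, motion_dir, stretch=2.0):
    """Return (x, y) half-axes of an elliptical gate elongated along the
    target's 2-D unit motion direction (stretch factor illustrative)."""
    dx, dy = abs(motion_dir[0]), abs(motion_dir[1])
    return (base_half_axis * (1.0 + (stretch - 1.0) * dx),
            base_half_axis * (1.0 + (stretch - 1.0) * dy))
```

A target moving along x, like the first target vehicle 820, gets a gate twice as long horizontally; one moving along y, like the second target vehicle 830, gets a gate twice as long vertically.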
Fig. 9 is a diagram illustrating an exemplary operation of the apparatus for associating sensor data in some embodiments of the disclosure.
The vehicle 910 shown in Fig. 9 may include the apparatus 100 of Fig. 1. In the description of Fig. 9, operations described as being performed by the vehicle 910 may be understood as being controlled by the processor 120 of the apparatus 100.
In some embodiments of the disclosure, the vehicle may set a region for determining the validity of the sensor data set based on the type of the sensor, and may select, from the sensor data included in the sensor data set, the sensor data corresponding to the set region.
Referring to Fig. 9, the vehicle 910 may detect vehicles around it using its sensors. For example, the vehicle 910 may detect a first target vehicle 920, a second target vehicle 930, and a third target vehicle 940. The vehicle 910 may detect the first target vehicle 920 using only its front radar. The accuracy of the front radar may be high for the longitudinal position but low for the lateral position. Therefore, considering the characteristics of the front radar, which is the sensor sensing the first target vehicle 920, the vehicle 910 may set a gate for the first target vehicle 920 that is longer in the lateral direction. The vehicle 910 may detect the second target vehicle 930 using the front radar and a front-view camera. When a target is detected using two sensors, the accuracy of the measurement may be improved. Therefore, the vehicle 910 may set a smaller gate for the second target vehicle 930. The vehicle 910 may detect the third target vehicle 940 using a side radar. The accuracy of the side radar may be similar in the longitudinal and lateral directions. Therefore, considering the characteristics of the side radar, the vehicle 910 may set a circular gate for the third target vehicle 940.
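A lookup of gate geometry by the combination of sensors that detected the target, following Fig. 9's reasoning, might look like this; the sensor labels and numeric half-axes are illustrative assumptions:

```python
def gate_by_sensor_types(sensors, base=1.0):
    """Return (longitudinal, lateral) gate half-axes for a target given the
    set of sensors that detected it (all numbers illustrative): a lone
    front radar is laterally weak, radar+camera fusion shrinks the gate,
    and a side radar gets a circular gate."""
    s = frozenset(sensors)
    if s == {"front_radar"}:
        return (base, 2.0 * base)        # wider in the lateral direction
    if s == {"front_radar", "front_camera"}:
        return (0.5 * base, 0.5 * base)  # two sensors -> smaller gate
    if s == {"side_radar"}:
        return (base, base)              # similar accuracy -> circular
    return (base, base)
```

The table-like structure makes the per-sensor characteristics of Fig. 9 easy to extend to further sensor combinations.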
Fig. 10 is a diagram illustrating an exemplary operation of the apparatus for associating sensor data in some embodiments of the disclosure.
The vehicle 1010 shown in Fig. 10 may include the apparatus 100 of Fig. 1. In the description of Fig. 10, operations described as being performed by the vehicle 1010 may be understood as being controlled by the processor 120 of the apparatus 100.
In some embodiments of the disclosure, the vehicle may set a region for determining the validity of the sensor data set based on the type of the target, and may select, from the sensor data included in the sensor data set, the sensor data corresponding to the set region.
Referring to Fig. 10, the vehicle 1010 may detect various objects around it using its sensors. For example, the vehicle 1010 may detect a first target vehicle 1020, a second target vehicle 1030, and a pedestrian 1040. If the same gate were set for the first target vehicle 1020 and the pedestrian 1040, data about the first target vehicle 1020 could be associated with data about the pedestrian 1040, and the data about the first target vehicle 1020 could be fused with the data about the pedestrian 1040. The vehicle 1010 may use its sensors to obtain information about the types of the objects around the vehicle 1010 (e.g., the first target vehicle 1020, the second target vehicle 1030, the pedestrian 1040, a bicycle (not shown), a motorcycle (not shown), etc.). The vehicle 1010 may set the shape of the gate to suit the type of each object around the vehicle 1010. The vehicle 1010 may set an elliptical gate for each of the first target vehicle 1020 and the second target vehicle 1030, and may set for the pedestrian 1040 a circular gate smaller than the gates for the first target vehicle 1020 and the second target vehicle 1030.
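Gate geometry keyed on target type, as in Fig. 10, can be sketched as below; the type labels and dimensions are illustrative assumptions:

```python
def gate_by_target_type(target_type, base=1.0):
    """Elliptical gate for vehicles, smaller circular gate for pedestrians
    (shapes follow Fig. 10; all dimensions illustrative)."""
    if target_type == "vehicle":
        return {"shape": "ellipse", "half_axes": (2.0 * base, base)}
    if target_type == "pedestrian":
        return {"shape": "circle", "radius": 0.5 * base}
    return {"shape": "circle", "radius": base}
```

Keeping the pedestrian gate small prevents a nearby vehicle's measurements from falling inside it and being wrongly associated, which is exactly the failure mode the paragraph above warns about.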
Fig. 11 is a flowchart illustrating a method for associating sensor data in some embodiments of the disclosure.
In the following, it may be assumed that the apparatus 100 of Fig. 1 performs the processing of Fig. 11. Further, in the description of Fig. 11, operations described as being performed by the apparatus may be understood as being controlled by the processor 120 of the apparatus 100.
Referring to Fig. 11, in operation 1110 the apparatus may obtain, using multiple sensors, a sensor data set of a target located around the vehicle. For example, the apparatus may obtain first sensor data of the target using a first sensor, second sensor data of the target using a second sensor, and third sensor data of the target using a third sensor.
In operation 1120, the apparatus may select at least part of the sensor data included in the sensor data set based on at least one of information about the target, the curvature of the road, and the type of the sensor sensing the target. For example, the apparatus may set a gate based on the distance between the apparatus and the target, the magnitude of the relative velocity of the target, the curvature of the road at the target's position, the direction of motion of the target, the type of the sensor sensing the target, and/or the type of the target, and may select the sensor data included in the gate.
In operation 1130, the apparatus may generate tracking data of the target by associating the selected at least part of the sensor data. For example, the apparatus may calculate the distance between the tracking data predicted from the tracking data of the previous frame and the sensor data of the current frame, and may use the closer sensor data to generate the tracking data of the current frame.
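Operations 1110 to 1130 compose into a per-frame pipeline. The sketch below (not part of the original disclosure) keeps the three steps as placeholder callables, since each has several embodiment-specific variants described above:

```python
def update_tracks(predict, select, associate, prev_tracks, sensor_data_set):
    """One frame of the Fig. 11 method: predict each track forward, select
    the valid sensor data for it (operation 1120's gating), and associate
    to form the current frame's tracking data (operation 1130)."""
    new_tracks = {}
    for track_id, track in prev_tracks.items():
        z_hat = predict(track)                       # predicted track state
        candidates = select(z_hat, sensor_data_set)  # gate-based selection
        new_tracks[track_id] = associate(z_hat, candidates)
    return new_tracks
```

With 1-D toy callables (constant-position prediction, a fixed-width gate, nearest-candidate association), a far-away measurement is gated out while the nearby one updates the track.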
Fig. 12 is a block diagram illustrating the configuration of a computing system in some embodiments of the disclosure.
Referring to Fig. 12, the method described above in some embodiments of the disclosure may be implemented through a computing system. The computing system 1000 may include at least one processor 1100, a memory 1300, a user interface input device 1400, a user interface output device 1500, a storage device 1600, and a network interface 1700, which are connected to each other via a bus 1200.
The processor 1100 may be a central processing unit (CPU) or a semiconductor device that executes instructions stored in the memory 1300 and/or the storage device 1600. Each of the memory 1300 and the storage device 1600 may include various types of volatile or non-volatile storage media. For example, the memory 1300 may include a read-only memory (ROM) and a random access memory (RAM).
Accordingly, the operations of the methods or algorithms described in connection with some embodiments disclosed in this specification may be implemented directly in a hardware module executed by the processor 1100, in a software module, or in a combination of the two. A software module may reside on a storage medium (i.e., the memory 1300 and/or the storage device 1600) such as a RAM, a flash memory, a ROM, an erasable programmable ROM (EPROM), an electrically erasable programmable ROM (EEPROM), a register, a hard disk, a removable disk, or a compact disc ROM (CD-ROM). An exemplary storage medium may be coupled to the processor 1100. The processor 1100 may read information from, and write information to, the storage medium. Alternatively, the storage medium may be integrated with the processor 1100. The processor and the storage medium may reside in an application-specific integrated circuit (ASIC). The ASIC may reside in a user terminal. Alternatively, the processor and the storage medium may reside as discrete components in a user terminal.
The apparatus and method for associating sensor data in some embodiments of the disclosure may maintain, in a robust fashion, the association of data obtained by multiple sensors by dynamically adjusting values related to data association based on information about the target, the road curvature, the sensor type, and the like.
Furthermore, various effects directly or indirectly identified through the disclosure may be provided.
Although the disclosure has been described with reference to illustrative embodiments, it will be apparent to those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the disclosure. Therefore, the illustrative embodiments of the disclosure are exemplary rather than limiting, and the spirit and scope of the disclosure are not limited thereto. The spirit and scope of the disclosure should be construed by the appended claims, and it should be understood that all technical ideas equivalent to the disclosure are included in the spirit and scope of the disclosure.
The description of the disclosure is merely exemplary in nature, and variations that do not depart from the substance of the disclosure fall within the scope of the disclosure. Such variations are not to be regarded as a departure from the spirit and scope of the disclosure.
Claims (20)
1. An apparatus for associating sensor data in a vehicle, the apparatus comprising:
a plurality of sensors configured to sense an external object; and
a processor electrically connected to the plurality of sensors, wherein the processor is configured to:
obtain, using the plurality of sensors, a sensor data set of a target located around the vehicle, wherein the sensor data set includes sensor data;
select at least part of the sensor data based on first information, wherein the first information includes at least one of target information, a curvature of a road on which the vehicle or the target is traveling, and a type of a sensor configured to sense the target; and
generate tracking data of the target by associating the at least part of the sensor data.
2. The apparatus of claim 1, wherein the plurality of sensors includes a radar and a camera.
3. The apparatus of claim 1, wherein the plurality of sensors includes a front radar, a front-view camera, and a side-view camera.
4. The apparatus of claim 1, wherein the sensor data set includes sensor data obtained by each of the plurality of sensors.
5. The apparatus of claim 1, wherein the processor is configured to:
set, based on the first information, a region for determining the validity of the sensor data set; and
select the sensor data corresponding to the region from the sensor data.
6. The apparatus of claim 1, wherein the target information includes at least one of: a distance between the vehicle and the target, a difference between a speed of the vehicle and a speed of the target, a direction of motion of the target, and a type of the target.
7. The apparatus of claim 1, wherein the processor is further configured to:
set, based on a distance between the vehicle and the target, a region for determining the validity of the sensor data set; and
select the sensor data corresponding to the region from the sensor data.
8. The apparatus of claim 7, wherein the processor is configured to:
increase the region when the distance between the vehicle and the target becomes larger than a preset distance; and
reduce the region when the distance between the vehicle and the target becomes smaller than the preset distance.
9. The apparatus of claim 1, wherein the processor is configured to:
set, based on a difference between a speed of the vehicle and a speed of the target, a region for determining the validity of the sensor data set; and
select the sensor data corresponding to the region from the sensor data.
10. The apparatus of claim 9, wherein the processor is configured to:
increase the region when the difference between the speed of the vehicle and the speed of the target becomes larger than a predetermined value; and
reduce the region when the difference between the speed of the vehicle and the speed of the target becomes smaller than the predetermined value.
11. The apparatus of claim 1, wherein the processor is configured to:
set, based on the curvature of the road, a region for determining the validity of the sensor data set; and
select the sensor data corresponding to the region from the sensor data.
12. The apparatus of claim 11, wherein the processor is configured to:
increase the region when the curvature of the road becomes larger than a preset range; and
reduce the region when the curvature of the road becomes smaller than the preset range.
13. The apparatus of claim 1, wherein the processor is configured to:
set, based on a direction of motion of the target, a region for determining the validity of the sensor data set; and
select the sensor data corresponding to the region from the sensor data.
14. The apparatus of claim 13, wherein the processor is configured to:
increase a length of the region in the direction of motion of the target.
15. The apparatus of claim 1, wherein the processor is configured to:
set, based on the type of the sensor, a region for determining the validity of the sensor data set; and
select the sensor data corresponding to the region from the sensor data.
16. The apparatus of claim 1, wherein the processor is configured to:
set, based on a type of the target, a region for determining the validity of the sensor data set; and
select the sensor data corresponding to the region from the sensor data.
17. A method for associating sensor data in a vehicle, the method comprising:
obtaining, using a plurality of sensors of the vehicle, a sensor data set of a target located around the vehicle, wherein the sensor data set includes sensor data;
selecting at least part of the sensor data based on first information, wherein the first information includes at least one of target information, a curvature of a road on which the vehicle or the target is traveling, and a type of a sensor configured to sense the target; and
generating tracking data of the target by associating the at least part of the sensor data.
18. The method of claim 17, wherein the sensor data set includes sensor data obtained by each of the plurality of sensors.
19. The method of claim 17, wherein selecting the at least part of the sensor data based on the first information includes:
setting, based on the first information, a region for determining the validity of the sensor data set; and
selecting the sensor data corresponding to the region from the sensor data.
20. The method of claim 17, wherein the target information includes at least one of a distance between the vehicle and the target, a difference between a speed of the vehicle and a speed of the target, a direction of motion of the target, and a type of the target.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020170147439A KR102371616B1 (en) | 2017-11-07 | 2017-11-07 | Apparatus and method for associating sensor data of vehicle |
KR10-2017-0147439 | 2017-11-07 |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109752717A true CN109752717A (en) | 2019-05-14 |
CN109752717B CN109752717B (en) | 2023-10-17 |
Family
ID=66328665
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810685694.5A Active CN109752717B (en) | 2017-11-07 | 2018-06-28 | Apparatus and method for correlating sensor data in a vehicle |
Country Status (3)
Country | Link |
---|---|
US (1) | US11328516B2 (en) |
KR (1) | KR102371616B1 (en) |
CN (1) | CN109752717B (en) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113454970B (en) * | 2019-03-21 | 2023-10-10 | 杭州飞步科技有限公司 | Extensible data fusion architecture and related products |
US11214261B2 (en) * | 2019-06-11 | 2022-01-04 | GM Global Technology Operations LLC | Learn association for multi-object tracking with multi sensory data and missing modalities |
JP7332403B2 (en) * | 2019-09-11 | 2023-08-23 | 株式会社東芝 | Position estimation device, mobile control system, position estimation method and program |
US11954179B2 (en) * | 2020-09-11 | 2024-04-09 | Inceptio Hongkong Limited | Resolving out of sequence data in sensor tracking |
KR20220131646A (en) * | 2021-03-22 | 2022-09-29 | 현대자동차주식회사 | Method and apparatus for tracking an object, and recording medium for recording program performing the method |
KR20220142593A (en) | 2021-04-14 | 2022-10-24 | 현대자동차주식회사 | Method and apparatus for fusing sensor information, and recording medium for recording program performing the method |
US20220341753A1 (en) * | 2021-04-23 | 2022-10-27 | Honeywell International Inc. | Methods and systems for detecting wind shear conditions in an urban airspace |
US20220371533A1 (en) * | 2021-05-18 | 2022-11-24 | Motional Ad Llc | Distributed vehicle body sensors for event detection |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07242133A (en) * | 1994-03-04 | 1995-09-19 | Toyota Motor Corp | Scanning radar device |
US20060111819A1 (en) * | 2003-08-18 | 2006-05-25 | Llorenc Servera Serapio | System and method for monitoring the external environment of a motor vehicle |
CN101701826A (en) * | 2009-11-20 | 2010-05-05 | 西安电子科技大学 | Target tracking method of passive multi-sensor based on layered particle filtering |
US20100292886A1 (en) * | 2009-05-18 | 2010-11-18 | Gm Global Technology Operations, Inc. | Turn by turn graphical navigation on full windshield head-up display |
DE102010012811A1 (en) * | 2010-03-23 | 2011-09-29 | Jenoptik Robot Gmbh | Method for measurement of speed and assigning measured speed to car, involves fading marking in holder on car when car passes monitoring region, where marking represents measurement data of speed measured from car |
KR20150067682A (en) * | 2013-12-10 | 2015-06-18 | 현대모비스 주식회사 | Apparatus and method for tracking objects with optimizing region of interest |
CN105197011A (en) * | 2014-06-13 | 2015-12-30 | 现代摩比斯株式会社 | System and method for managing dangerous driving index for vehicle |
JP2016071830A (en) * | 2014-09-26 | 2016-05-09 | 日本電気株式会社 | Object tracking device, object tracking system, object tracking method, display control device, object detection device, program, and recording medium |
CN105953796A (en) * | 2016-05-23 | 2016-09-21 | 北京暴风魔镜科技有限公司 | Stable motion tracking method and stable motion tracking device based on integration of simple camera and IMU (inertial measurement unit) of smart cellphone |
CN106585623A (en) * | 2016-12-21 | 2017-04-26 | 驭势科技(北京)有限公司 | Detection system for detecting targets around vehicle and application of detection system |
CN106990403A (en) * | 2017-04-28 | 2017-07-28 | 西安电子科技大学 | Low-altitude target tracking method based on multiband two-stage information fusion |
CN107009968A (en) * | 2017-03-28 | 2017-08-04 | 驭势科技(北京)有限公司 | Mobile lidar control method, device and mobile unit |
CN107300916A (en) * | 2016-04-14 | 2017-10-27 | 沃尔沃汽车公司 | Method and apparatus for the performance of the emerging system that monitors and adjust autonomous vehicle |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7630806B2 (en) * | 1994-05-23 | 2009-12-08 | Automotive Technologies International, Inc. | System and method for detecting and protecting pedestrians |
AU2001259640A1 (en) * | 2000-05-08 | 2001-11-20 | Automotive Technologies International, Inc. | Vehicular blind spot identification and monitoring system |
US6687637B2 (en) * | 2001-06-18 | 2004-02-03 | Globvision Inc. | Data sensor validation system and method |
GB2416943A (en) * | 2004-08-06 | 2006-02-08 | Qinetiq Ltd | Target detection |
US20100253595A1 (en) * | 2009-04-02 | 2010-10-07 | Gm Global Technology Operations, Inc. | Virtual controls and displays by laser projection |
EP2720211B1 (en) * | 2011-06-09 | 2016-08-10 | Toyota Jidosha Kabushiki Kaisha | Other-vehicle detection device and other-vehicle detection method |
EP2639781A1 (en) | 2012-03-14 | 2013-09-18 | Honda Motor Co., Ltd. | Vehicle with improved traffic-object position detection |
US9031729B2 (en) * | 2012-11-29 | 2015-05-12 | Volkswagen Ag | Method and system for controlling a vehicle |
US9751534B2 (en) * | 2013-03-15 | 2017-09-05 | Honda Motor Co., Ltd. | System and method for responding to driver state |
WO2017159509A1 (en) * | 2016-03-15 | 2017-09-21 | 本田技研工業株式会社 | Vehicle control system, vehicle control method, and vehicle control program |
JP6654544B2 (en) * | 2016-10-21 | 2020-02-26 | 株式会社Soken | Sensor control device |
US11254329B2 (en) * | 2017-04-24 | 2022-02-22 | Mobileye Vision Technologies Ltd. | Systems and methods for compression of lane data |
DE112017008079T5 (en) * | 2017-11-10 | 2020-06-25 | Honda Motor Co., Ltd. | DISPLAY SYSTEM, DISPLAY METHOD AND PROGRAM |
US10509410B2 (en) * | 2017-12-06 | 2019-12-17 | Zoox, Inc. | External control of an autonomous vehicle |
- 2017-11-07 KR KR1020170147439A patent/KR102371616B1/en active IP Right Grant
- 2018-06-11 US US16/005,127 patent/US11328516B2/en active Active
- 2018-06-28 CN CN201810685694.5A patent/CN109752717B/en active Active
Patent Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07242133A (en) * | 1994-03-04 | 1995-09-19 | Toyota Motor Corp | Scanning radar device |
US20060111819A1 (en) * | 2003-08-18 | 2006-05-25 | Llorenc Servera Serapio | System and method for monitoring the external environment of a motor vehicle |
US20100292886A1 (en) * | 2009-05-18 | 2010-11-18 | Gm Global Technology Operations, Inc. | Turn by turn graphical navigation on full windshield head-up display |
CN101701826A (en) * | 2009-11-20 | 2010-05-05 | 西安电子科技大学 | Target tracking method of passive multi-sensor based on layered particle filtering |
DE102010012811A1 (en) * | 2010-03-23 | 2011-09-29 | Jenoptik Robot Gmbh | Method for measurement of speed and assigning measured speed to car, involves fading marking in holder on car when car passes monitoring region, where marking represents measurement data of speed measured from car |
KR20150067682A (en) * | 2013-12-10 | 2015-06-18 | 현대모비스 주식회사 | Apparatus and method for tracking objects with optimizing region of interest |
CN105197011A (en) * | 2014-06-13 | 2015-12-30 | 现代摩比斯株式会社 | System and method for managing dangerous driving index for vehicle |
JP2016071830A (en) * | 2014-09-26 | 2016-05-09 | 日本電気株式会社 | Object tracking device, object tracking system, object tracking method, display control device, object detection device, program, and recording medium |
CN107300916A (en) * | 2016-04-14 | 2017-10-27 | 沃尔沃汽车公司 | Method and apparatus for the performance of the emerging system that monitors and adjust autonomous vehicle |
CN105953796A (en) * | 2016-05-23 | 2016-09-21 | 北京暴风魔镜科技有限公司 | Stable motion tracking method and stable motion tracking device based on integration of simple camera and IMU (inertial measurement unit) of smart cellphone |
CN106585623A (en) * | 2016-12-21 | 2017-04-26 | 驭势科技(北京)有限公司 | Detection system for detecting targets around vehicle and application of detection system |
CN107009968A (en) * | 2017-03-28 | 2017-08-04 | 驭势科技(北京)有限公司 | Mobile lidar control method, device and mobile unit |
CN106990403A (en) * | 2017-04-28 | 2017-07-28 | 西安电子科技大学 | Low-altitude target tracking method based on multiband two-stage information fusion |
Non-Patent Citations (2)
Title |
---|
MICHAEL SCHUSTER et al.: "Tracking of vehicles on nearside lanes using multiple radar sensors", 2014 International Radar Conference, pp. 1-4 *
WANG Wenxue et al.: "Application of Kalman filtering in a robot soccer match system", Robot, vol. 28, no. 4, pp. 410-414 *
Also Published As
Publication number | Publication date |
---|---|
KR20190051601A (en) | 2019-05-15 |
CN109752717B (en) | 2023-10-17 |
US11328516B2 (en) | 2022-05-10 |
US20190138825A1 (en) | 2019-05-09 |
KR102371616B1 (en) | 2022-03-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109752717A (en) | Device and method for associating sensor data in a vehicle | |
KR102452550B1 (en) | Apparatus for aggregating object based on Lidar data, system having the same and method thereof | |
KR102168753B1 (en) | Electronic device for obtaining three-dimension object based on camera and radar sensor fusion, and operaing method thereof | |
KR101973343B1 (en) | Object detection method and object detection apparatus | |
US20190346561A1 (en) | Enhanced object detection and motion estimation for a vehicle environment detection system | |
JP2015041382A (en) | Object tracking method and object tracking device | |
CN112801225B (en) | Automatic driving multi-sensor fusion sensing method and system under limit working condition | |
US20150134234A1 (en) | Apparatus for determining motion characteristics of target and device for controlling driving route of vehicle including the same | |
KR20190081334A (en) | Method for tracking moving trajectory based on complex positioning and apparatus thereof | |
WO2019208271A1 (en) | Electronic control device, and computation method | |
KR102298652B1 (en) | Method and apparatus for determining disparty | |
CN109785362A (en) | Target object tracking, device and storage medium based on target object detection | |
Flores et al. | Efficient probability-oriented feature matching using wide field-of-view imaging | |
US20230025981A1 (en) | Method and apparatus with grid map generation | |
KR102498435B1 (en) | Apparatus and method for calibration of sensor system of autonomous vehicle | |
CN113203424B (en) | Multi-sensor data fusion method and device and related equipment | |
CN111989541A (en) | Stereo camera device | |
CN113033267B (en) | Vehicle positioning method, device, computer equipment and storage medium | |
Venugopala | Comparative study of 3D object detection frameworks based on LiDAR data and sensor fusion techniques | |
JP2020165945A (en) | Self-position estimating method and self-position estimating device | |
JP7254115B2 (en) | Other vehicle behavior prediction device and other vehicle behavior prediction method | |
KR102456151B1 (en) | Sensor fusion system based on radar and camera and method of calculating the location of nearby vehicles | |
US11807232B2 (en) | Method and apparatus for tracking an object and a recording medium storing a program to execute the method | |
JP6398966B2 (en) | Mobile body position and orientation estimation apparatus and mobile body autonomous traveling system | |
El Bouazzaoui et al. | An Extended HOOFR SLAM Algorithm Using IR-D Sensor Data for Outdoor Autonomous Vehicle Localization |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||