WO2017216856A1 - Inter-vehicle distance estimation method and inter-vehicle distance estimation device - Google Patents
Inter-vehicle distance estimation method and inter-vehicle distance estimation device
- Publication number
- WO2017216856A1 (PCT/JP2016/067609)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- inter-vehicle distance
- shielding
- tracking
- Prior art date
Classifications
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast, for receiving images from a single remote source
- B60W30/16—Control of distance between vehicles, e.g. keeping a distance to preceding vehicle
- B60R21/0134—Electrical circuits for triggering passive safety arrangements responsive to imminent contact with an obstacle, e.g. using radar systems
- B60W40/02—Estimation of driving parameters related to ambient conditions
- G01S13/66—Radar-tracking systems; Analogous systems
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems for anti-collision purposes
- G06F18/22—Pattern recognition: matching criteria, e.g. proximity measures
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0112—Measuring and analyzing of parameters relative to traffic conditions based on data from the vehicle, e.g. floating car data [FCD]
- G08G1/0133—Traffic data processing for classifying traffic situation
- G08G1/04—Detecting movement of traffic using optical or ultrasonic detectors
- G08G1/052—Detecting movement of traffic with provision for determining speed or overspeed
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- B60W2420/403—Image sensing, e.g. optical camera
- B60W2420/408—Radar; Laser, e.g. lidar
- B60W2420/54—Audio sensitive means, e.g. ultrasound
- B60W2554/00—Input parameters relating to objects
- B60W2554/4041—Dynamic objects: position
- B60W2554/801—Spatial relation to objects: lateral distance
- B60W2554/804—Spatial relation to objects: relative longitudinal speed
Definitions
- the present invention relates to an inter-vehicle distance estimation method and an inter-vehicle distance estimation device that estimate an inter-vehicle distance.
- Patent Document 1 discloses a target tracking device that, when a tracking target is shielded by a plurality of shielding objects, selects, as the object to track, the shielding object having the longest estimated shielding time, calculated based on the difference between the motion vector of the shielding object and the motion vector of the tracking target.
- Because Patent Document 1 uses the speed at the moment the tracking target becomes shielded as the speed of the tracking target, when this approach is used to estimate the inter-vehicle distances of a vehicle train including the tracking target, the estimation accuracy may deteriorate if the speed of the tracking target changes while it is shielded.
- an object of the present invention is to provide an inter-vehicle distance estimation method and an inter-vehicle distance estimation apparatus that can improve the accuracy of inter-vehicle distance estimation.
- An inter-vehicle distance estimation method estimates a shielding area shielded from a sensor by an obstacle and two non-shielding areas sandwiching the shielding area, and estimates the inter-vehicle distance to a tracking vehicle traveling on a lane in the shielding area, based on the speeds of two tracking vehicles traveling on the same lane in the two non-shielding areas, respectively.
- According to the present invention, it is possible to provide an inter-vehicle distance estimation method and an inter-vehicle distance estimation device that can improve the accuracy of inter-vehicle distance estimation.
- FIG. 1 is a schematic block diagram illustrating a basic configuration of an inter-vehicle distance estimation apparatus according to an embodiment of the present invention.
- FIG. 2 is a diagram illustrating a scene in which a vehicle equipped with the inter-vehicle distance estimation device according to the embodiment of the present invention merges into a main lane.
- FIG. 3A is a flowchart for explaining an example of an inter-vehicle distance estimation method by the inter-vehicle distance estimation apparatus according to the embodiment of the present invention.
- FIG. 3B is a flowchart for explaining an example of an inter-vehicle distance estimation method by the inter-vehicle distance estimation device according to the embodiment of the present invention.
- FIG. 4 is an example illustrating attribute information of a plurality of tracked vehicles.
- FIG. 5 is a graph for explaining a method of calculating the estimated value shown in FIG.
- FIG. 6A is a diagram illustrating a scene where the tracking vehicle in the shielding area accelerates.
- FIG. 6B is a diagram illustrating a scene in which the tracking vehicle in the shielding area accelerates.
- FIG. 7 is a graph showing the speed and position of a plurality of tracked vehicles.
- FIG. 8A is a diagram illustrating a scene where the tracking vehicle in the shielding area decelerates.
- FIG. 8B is a diagram illustrating a scene where the tracking vehicle in the shielding area decelerates.
- FIG. 9 is a graph showing the speed and position of a plurality of tracked vehicles.
- FIG. 1 is a block diagram showing a configuration of an inter-vehicle distance estimation device 20 according to the present embodiment.
- the inter-vehicle distance estimation device 20 includes a sensor 21, a map data storage unit 22, a self-position estimation unit 23, a motion information acquisition unit 24, an output unit 25, and a processing circuit 30.
- the inter-vehicle distance estimation device 20 is installed in a vehicle 11 (own vehicle) and estimates the inter-vehicle distance of a vehicle train including other vehicles shielded by an obstacle.
- the sensor 21 is mounted on the vehicle 11, detects position information of objects around the vehicle 11, and outputs it to the processing circuit 30.
- as the sensor 21, a distance measuring sensor such as a laser range finder (LRF), a millimeter-wave radar, an ultrasonic sensor, a stereo camera, or an image sensor can be adopted.
- the sensor 21 may be composed of a plurality of types of sensors, and may detect the speed, acceleration, shape, color, and the like of surrounding objects.
- the sensor 21 scans a predetermined range around the vehicle 11 to acquire the three-dimensional distance data of the surrounding environment.
- the three-dimensional distance data is point cloud data indicating a relative three-dimensional position from the sensor 21.
- the map data storage unit 22 is a storage device that stores high-definition map data.
- the map data storage unit 22 may be mounted on the vehicle 11 or installed in a server or the like via a communication line.
- the map data can record information on road structures, such as the position of each lane and traffic zone classifications, and on the position and shape of features around the road.
- the self position estimation unit 23 estimates the self position of the vehicle 11 in the map data stored in the map data storage unit 22.
- the self position includes the posture of the vehicle 11.
- the self-position estimation unit 23 estimates the self-position based on information acquired from a positioning device such as a global positioning system (GPS) receiver, and from an acceleration sensor, an angular velocity sensor, a steering angle sensor, a speed sensor, and the like mounted on the vehicle 11.
- the self-position estimation unit 23 may estimate a detailed self-position in the map data by calculating, from the information acquired from the sensor 21, the relative position of the vehicle 11 with respect to features recorded in the map data.
- the motion information acquisition unit 24 acquires motion information indicating a motion state such as the speed, acceleration, angular velocity, and steering angle of the vehicle 11.
- the motion information is acquired from a speed sensor, an acceleration sensor, an angular velocity sensor, a steering angle sensor, or the like mounted on the vehicle 11.
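As an illustration of how a self-position estimate could fold in such motion readings between positioning fixes, the following is a minimal dead-reckoning sketch. It is not taken from the patent; the function name and the (x, y, heading) pose representation are assumptions.

```python
import math

def dead_reckon(pose, speed, yaw_rate, dt):
    """Advance an (x, y, heading) pose by one time step using speed and
    yaw-rate readings -- a minimal odometry update between GPS fixes."""
    x, y, th = pose
    return (x + speed * math.cos(th) * dt,
            y + speed * math.sin(th) * dt,
            th + yaw_rate * dt)
```

In practice such a prediction would be fused with the GPS fix and the map-relative correction described above, for example in a Kalman-style filter.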
- the output unit 25 is an output interface (I / F) that outputs a calculation result by the processing circuit 30.
- the output unit 25 outputs the calculation result by the processing circuit 30 to, for example, a control circuit that automatically controls driving of the vehicle 11.
- the output destination of the output unit 25 may also be a display device or a speaker that presents information to occupants of the vehicle 11.
- the processing circuit 30 includes an object detection unit 31, an area estimation unit 32, an object collation unit 33, a recognition result storage unit 34, an inter-vehicle distance estimation unit 35, and an object motion prediction unit 40.
- the processing circuit 30 includes a programmed processing device, such as a processing device including electrical circuitry.
- the processing circuitry may include other devices such as application specific integrated circuits (ASICs) and circuit components arranged to perform the described functions.
- the processing circuit 30 can be configured by one or a plurality of processing circuits.
- the processing circuit 30 may also be used as an electronic control unit (ECU) used for other control related to the vehicle 11.
- the object detection unit 31 detects the observable object 13 around the vehicle 11 based on the information acquired by the sensor 21.
- the observable object 13 is an object that can be observed using the sensor 21 without being shielded from the sensor 21 by an obstacle.
- the object detection unit 31 uses the information acquired by the sensor 21, the map data stored in the map data storage unit 22, the self-position estimated by the self-position estimation unit 23, and the motion information acquired by the motion information acquisition unit 24. Based on this, the attribute information of the observable object 13 is acquired.
- the attribute information may include the position, speed, acceleration, posture, shape, color, and type of the observable object 13. Note that the speed and acceleration of the observable object 13 may include information on the rotation direction.
- the object detection unit 31 sets an identifier (ID) for the detected observable object 13 and determines the attribute information and ID of the observable object 13 as the object information of the observable object 13.
- the area estimation unit 32 estimates, around the vehicle 11, the shielding area 14 that is shielded from the sensor 21 by an obstacle and the non-shielding area 15 that is not shielded from the sensor 21.
- the obstacle is the observable object 13.
- the area estimation unit 32 extracts, from the point cloud data acquired by the sensor 21, the points within a predetermined height range from the ground surface, and determines the boundary between the non-shielding area 15 and the shielding area 14 by connecting the extracted points.
- the area estimation unit 32 estimates the area behind the determined boundary as the shielding area 14 and the near side as the non-shielding area 15.
- the area estimation unit 32 estimates the areas that sandwich the shielding area 14 in the horizontal direction as the two non-shielding areas 15.
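The behind-the-boundary test described above can be sketched as follows. This is an illustrative simplification (a single obstacle cluster, 2-D sensor-frame coordinates, no handling of the ±π angle wraparound), not the patent's implementation:

```python
import math

def shielded(query, obstacle_pts):
    """True if `query` (x, y in the sensor frame) lies in the shielding
    area behind the obstacle point cloud: within the angular sector the
    obstacle spans, and farther from the sensor than the obstacle."""
    angles = [math.atan2(y, x) for x, y in obstacle_pts]
    ranges = [math.hypot(x, y) for x, y in obstacle_pts]
    qa = math.atan2(query[1], query[0])
    qr = math.hypot(query[0], query[1])
    return min(angles) <= qa <= max(angles) and qr > max(ranges)
```

Points in front of the boundary, or outside the angular sector, fall in the non-shielding areas on either side.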
- the object collation unit 33 collates the observable object 13 detected by the object detection unit 31 with the predicted object predicted by the object motion prediction unit 40, and determines whether the observable object 13 and the predicted object correspond to each other.
- the object matching unit 33 determines whether or not the observable object 13 and the predicted object correspond to each other based on the similarity between the attribute information of the observable object 13 and the attribute information of the predicted object.
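A minimal sketch of such a similarity check, assuming attribute information reduced to position and speed and hypothetical gate thresholds (the patent does not specify values), might look like:

```python
import math

def attributes_match(observed, predicted, pos_gate=2.0, vel_gate=3.0):
    """Gate-style similarity test between a detected object's attributes
    and a predicted object's attributes (position in metres, speed in m/s).
    The gate thresholds are illustrative, not values from the patent."""
    d_pos = math.hypot(observed["x"] - predicted["x"],
                       observed["y"] - predicted["y"])
    d_vel = abs(observed["v"] - predicted["v"])
    return d_pos <= pos_gate and d_vel <= vel_gate
```

A fuller realization would also weigh shape, color, and type, as listed in the attribute information above.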
- the recognition result storage unit 34 stores the object information acquired from the object detection unit 31 as a recognition result, associated with the map data stored in the map data storage unit 22, based on the self-position estimated by the self-position estimation unit 23 and the motion information acquired by the motion information acquisition unit 24.
- the recognition result storage unit 34 maps the object information determined by the object detection unit 31 onto the map data.
- the recognition result storage unit 34 updates the object information determined by the object detection unit 31 according to the determination result of the object collation unit 33.
- the recognition result storage unit 34 tracks the tracking vehicle 12 traveling in the shielding area 14 or the non-shielding area 15 by holding the ID of the stored object information according to the determination result of the object matching unit 33.
- based on the self-position estimated by the self-position estimation unit 23 and the motion information acquired by the motion information acquisition unit 24, the recognition result storage unit 34 sets the shielding area 14 and the non-shielding areas 15 estimated by the area estimation unit 32 on the same lane recorded in the map data.
- the recognition result storage unit 34 stores the shielding area 14 and the non-shielding areas 15 set on the same lane in association with each other. Based on the map data, it estimates the lane area containing the mutually related shielding area 14 and non-shielding areas 15 as the traveling area 101 in which the tracking vehicles 12 to be tracked travel.
- the inter-vehicle distance estimation unit 35 estimates the inter-vehicle distances of the plurality of tracking vehicles 12 that travel in the same travel region 101 based on the object information stored in the recognition result storage unit 34.
- the inter-vehicle distance estimation unit 35 estimates inter-vehicle distances of the plurality of tracking vehicles 12 based on the estimated speed of the tracking vehicle 12 that travels in the shielding area 14.
- the object motion prediction unit 40 includes a tracking object group detection unit 41, a shielding determination unit 42, a speed estimation unit 43, a position estimation unit 44, and a posture estimation unit 45.
- the object motion prediction unit 40 predicts attribute information of the observable object 13 or the non-observable object 16 based on the object information of the observable object 13.
- the object motion prediction unit 40 outputs the predicted attribute information and ID of the observable object 13 or the non-observable object 16 as the object information of the predicted object.
- the tracking object group detection unit 41 detects, as the plurality of tracking vehicles 12, an object group that extends over the shielding area 14 and the non-shielding areas 15 and whose members share the same moving direction, based on the recognition result of the recognition result storage unit 34.
- the tracking object group detection unit 41 may detect the object group existing in the travel area 101 estimated by the recognition result storage unit 34 as a plurality of tracking vehicles 12.
- the tracking object group detection unit 41 may further detect the observable object 13 that moves in the traveling direction of the plurality of tracking vehicles 12 already detected in the non-shielding region 15 as the tracking vehicle 12.
- the shielding determination unit 42 determines whether each object of the object group detected by the tracking object group detection unit 41 is shielded from the sensor 21 by another obstacle. That is, the shielding determination unit 42 determines whether each object exists in either the shielding area 14 or the non-shielding area 15. The object determined not to be shielded by the shielding determination unit 42 is the observable object 13, and the object determined to be shielded is the non-observable object 16.
- the speed estimation unit 43 estimates the speeds of the plurality of tracking vehicles 12 detected by the tracking object group detection unit 41.
- the speed estimation unit 43 estimates the current speed of the non-observable object 16 existing in the shielding area 14, based on the current speeds of the two observable objects 13 moving in the two non-shielding areas 15 that sandwich that shielding area 14.
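The patent does not state a specific formula for this estimate; one plausible realization, assuming vehicle positions are expressed as scalar distances along the lane, is a linear interpolation between the observable vehicle behind and the observable vehicle ahead of the shielding area:

```python
def estimate_hidden_speed(x_hidden, x_ahead, v_ahead, x_behind, v_behind):
    """Estimate the speed of a vehicle inside the shielding area by
    linearly interpolating, by position along the lane, between the
    observable vehicle behind it and the observable vehicle ahead of it.
    Positions are distances along the lane (x_ahead > x_behind)."""
    w = (x_hidden - x_behind) / (x_ahead - x_behind)  # 0 at rear, 1 at front
    return (1.0 - w) * v_behind + w * v_ahead
```

Unlike carrying forward the speed at the moment of shielding, this estimate follows the speed changes of the surrounding traffic while the vehicle is hidden.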
- the position estimation unit 44 estimates the current position of the tracking vehicle 12 based on the speed estimated by the speed estimation unit 43 and the attribute information of the observable object 13.
- the posture estimation unit 45 estimates the current posture of the tracking vehicle 12 based on the speed estimated by the speed estimation unit 43 and the attribute information of the observable object 13.
- the posture estimation unit 45 may estimate the posture of the tracking vehicle 12 based on the shape of the road recorded in the map data.
- in step S10, the sensor 21 acquires information on the surrounding environment including the tracking objects (the plurality of tracking vehicles 12).
- the sensor 21 acquires at least position information of an object in front of the vehicle 11.
- in step S11, the object detection unit 31 detects the observable object 13 and acquires the object information of the observable object 13, based on the information acquired in step S10.
- the object detection unit 31 may detect the observable object 13 including the surrounding features and the object information of the observable object 13 based on the map data, the self position, and the motion information of the vehicle 11.
- in step S12, the area estimation unit 32 estimates, based on the information acquired in step S10, a plurality of shielding areas 14 that are shielded from the sensor 21 by obstacles and a plurality of non-shielding areas 15 that are not shielded from the sensor 21.
- in step S13, the object collation unit 33 collates the object information of the observable object 13 detected in step S11 with the object information of the predicted object predicted by the object motion prediction unit 40.
- step S13 presupposes that the object information of the predicted object acquired in steps S23 to S27, described later, has been input to the object matching unit 33.
- in step S14, the object matching unit 33 determines whether the observable object 13 and the predicted object correspond to each other, based on the similarity between the attribute information of the observable object 13 and the attribute information of the predicted object.
- when it determines that they correspond, the object matching unit 33 proceeds to step S15; when it determines that they do not, it proceeds to step S16.
- in step S15, the recognition result storage unit 34 updates the current object information using the attribute information of the observable object 13. That is, the recognition result storage unit 34 replaces the stored attribute information in the object information of the observable object 13 with the attribute information acquired in step S11 at the current time, and stores the result as the new object information of the observable object 13.
- in step S16, the object matching unit 33 determines whether or not the predicted object is shielded. That is, the object matching unit 33 determines whether the predicted object exists in the shielding area 14, based on the attribute information of the predicted object and the shielding area 14 estimated by the area estimation unit 32. When determining that the predicted object is shielded, the object matching unit 33 proceeds to step S17; when determining that it is not shielded, it proceeds to step S18.
- in step S17, the recognition result storage unit 34 updates the current object information using the attribute information of the predicted object. That is, the recognition result storage unit 34 replaces the stored attribute information in the object information of the observable object 13 with the attribute information of the predicted object input to the object matching unit 33 at the current time, and stores the result as the current object information.
- in step S18, the recognition result storage unit 34 deletes the object information of the predicted object input to the object matching unit 33 at the current time. That is, the recognition result storage unit 34 maintains the already stored object information of the observable object 13 without changing it.
- in addition, the recognition result storage unit 34 stores the object information of the observable object 13 detected in step S11.
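Steps S14 to S18 amount to a small update rule on the stored object information, keyed by ID. A hedged sketch of the matched/shielded/unshielded branches follows (dictionary-based store and field names are assumptions; storing brand-new detections is handled separately):

```python
def update_store(store, matched, predicted, detected=None):
    """One collation outcome for a tracked ID (steps S15, S17, S18 sketched):
    matched -> overwrite with the fresh detection (S15);
    unmatched but predicted to be shielded -> keep the prediction (S17);
    unmatched and not shielded -> drop the stale prediction (S18)."""
    oid = predicted["id"]
    if matched:
        store[oid] = detected        # S15: use the observed attributes
    elif predicted["shielded"]:
        store[oid] = predicted       # S17: carry the prediction forward
    else:
        store.pop(oid, None)         # S18: delete the unsupported prediction
    return store
```

Because the ID survives the shielded branch, the same vehicle keeps its identity when it later reappears in a non-shielding area.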
- in step S19, the recognition result storage unit 34 maps the object information of the observable object 13 or the non-observable object 16 stored in any of steps S15 to S18 onto the map data.
- the recognition result storage unit 34 maps the object information of the observable object 13 or the non-observable object 16 on the map data based on the map data, the self position, and the motion information of the vehicle 11.
- in step S20, the recognition result storage unit 34 estimates, among the plurality of shielding areas 14 and the plurality of non-shielding areas 15 estimated in step S12, the shielding areas 14 and non-shielding areas 15 that are related to each other, based on, for example, the map data, the self-position, and the motion information of the vehicle 11.
- the recognition result storage unit 34 estimates the traveling area 101 in a predetermined range on the lane where the plurality of shielding areas 14 and the plurality of non-shielding areas 15 are estimated.
- the recognition result storage unit 34 treats the plurality of shielding areas 14 and the plurality of non-shielding areas 15 estimated within the same traveling area 101 as related to each other. Note that the recognition result storage unit 34 may estimate the traveling area 101 from a region in which a plurality of objects having the same moving direction are detected, without using the map data.
- In step S21, the inter-vehicle distance estimation unit 35 estimates the inter-vehicle distances of the plurality of tracking vehicles 12 traveling in the same travel area 101 estimated in step S20.
- The plurality of tracking vehicles 12 traveling in the traveling area 101 consist of a plurality of observable objects 13 and a plurality of non-observable objects 16. That is, the inter-vehicle distance estimation unit 35 estimates the inter-vehicle distances based on the object information of the plurality of observable objects 13 and the plurality of non-observable objects 16 present in the travel area 101 estimated by the recognition result storage unit 34.
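Once the object information of both observable and non-observable objects in the travel area is available, the inter-vehicle distances reduce to consecutive gaps along the lane. The following is a minimal sketch, assuming each object is reduced to a single along-lane position in metres (the function and variable names are illustrative, not from the patent):

```python
def inter_vehicle_distances(positions):
    """Given the along-lane positions of all tracked objects in the
    travel area (observable and non-observable alike), return the
    gap between each pair of consecutive vehicles."""
    ordered = sorted(positions)
    return [b - a for a, b in zip(ordered, ordered[1:])]

# Vehicles at 0 m, 12 m and 30 m along the lane:
gaps = inter_vehicle_distances([12.0, 0.0, 30.0])  # [12.0, 18.0]
```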
- In step S22, the processing circuit 30 outputs the inter-vehicle distances of the plurality of tracking vehicles 12 estimated in step S21 to the output unit 25.
- The processing circuit 30 also outputs, to the output unit 25, the object information of the plurality of observable objects 13 and the plurality of non-observable objects 16, and the information on the plurality of shielding areas 14 and the plurality of non-shielding areas 15 stored in the recognition result storage unit 34.
- In step S23, the tracking object group detection unit 41 detects, as the plurality of tracking vehicles 12, an object group having the same movement direction among the observable objects 13 and the non-observable objects 16 present in the related plurality of shielding areas 14 and the plurality of non-shielding areas 15 estimated in step S20.
- the tracking object group detection unit 41 may simply detect an object group existing in the plurality of related shielding areas 14 and the plurality of non-shielding areas 15 as the plurality of tracking vehicles 12.
- In step S24, the shielding determination unit 42 determines, based on the object information of the object group detected in step S23, whether each object in the object group is shielded from the sensor 21 by an obstacle. For example, the shielding determination unit 42 determines whether each object is shielded by referring to information indicating the presence or absence of shielding included in the attribute information. In this case, in step S16, the object matching unit 33 may determine whether or not the predicted object is shielded and add the determination result to the attribute information.
- An object that is not shielded is an observable object 13, and an object that is shielded is a non-observable object 16.
- In step S25, the speed estimation unit 43 estimates the speed of the non-observable object 16 determined to be shielded in step S24, based on the attribute information of the observable objects 13 determined not to be shielded. Specifically, the speed estimation unit 43 estimates the speed of the non-observable object 16, which is one or more tracking vehicles 12 traveling in one shielding area 14, based on the speeds of the observable objects 13, which are the two tracking vehicles 12 traveling in the two non-shielding areas 15 sandwiching the shielding area 14.
- A method for specifically estimating the speed of the non-observable object 16 will now be described. In the following, a numerical value in parentheses denotes an estimated value.
- FIG. 5 is a graph for explaining a method of estimating the speed of each non-observable object 16 from the attribute information of each observable object 13 shown in FIG.
- The speed estimation unit 43 estimates, as the speed of the non-observable object 16, the value obtained by internally dividing the speeds of the two observable objects 13 sandwiching the non-observable object 16 by the ratio of the inter-object distances, using the position of the non-observable object 16 estimated at the previous time.
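The internal-division rule described here can be written as plain linear interpolation. The following is a minimal Python sketch, under the assumption that the hidden vehicle's previously estimated position lies between the two observable vehicles (function and parameter names are illustrative, not from the patent):

```python
def interpolate_speed(v_rear, v_front, d_rear, d_front, d_hidden):
    """Internally divide the speeds of the two observable vehicles by
    the ratio of the distances to the hidden vehicle's previously
    estimated position d_hidden."""
    ratio = (d_hidden - d_rear) / (d_front - d_rear)
    return v_rear + (v_front - v_rear) * ratio

# Rear observable vehicle at 0 m doing 40 km/h, front observable
# vehicle at 6 m doing 50 km/h, hidden vehicle last estimated at 2 m:
v_hidden = interpolate_speed(40.0, 50.0, 0.0, 6.0, 2.0)  # ≈ 43.3 km/h
```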
- In step S26, the position estimation unit 44 estimates the current position of the non-observable object 16 based on the speed estimated in step S25 and the attribute information of the observable object 13.
- In step S27, the posture estimation unit 45 estimates the current posture of the non-observable object 16 based on the speed estimated in step S25 and the attribute information of the observable object 13.
- For example, the speed of the tracking vehicle 12 of ID 13 at time t=0 is v13_0, the position of the tracking vehicle 12 of ID 12 estimated at the previous time is d12_0, and the speed of the tracking vehicle 12 of ID 13 at time t=T is v13_T.
- The inter-vehicle distance estimation unit 35 can accurately estimate the positions of the plurality of tracking vehicles 12 in the next processing cycle based on the estimated vehicle speed, and can thereby estimate the inter-vehicle distances with high accuracy.
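The next-cycle position update implied here can be sketched as a constant-velocity prediction. This is a minimal illustration, assuming straight-line motion over one processing cycle (the cycle length and names are illustrative, not from the patent):

```python
def predict_position(d_prev, v_est, cycle_time):
    """Advance the previously estimated position of the hidden vehicle
    by its estimated speed over one processing cycle, assuming the
    speed stays constant within the cycle."""
    return d_prev + v_est * cycle_time

# Hidden vehicle last estimated at 2 m, moving at an estimated
# 10 m/s, with a 0.5 s processing cycle:
d_next = predict_position(2.0, 10.0, 0.5)  # 7.0 m
```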
- The inter-vehicle distance estimation unit 35 estimates, as the velocity v12_0, the value obtained by internally dividing the speed v11 and the speed v13_0 by the ratio of the distance from 0 to d12_0 and the distance from d12_0 to d13_0.
- As described above, the inter-vehicle distance estimation device 20 estimates the speed of the vehicle traveling on the same lane in the shielding area 14 based on the speeds of the vehicles traveling on the same lane in the two non-shielding areas 15 sandwiching the shielding area 14.
- The inter-vehicle distance estimation device 20 estimates the current speed of the non-observable object 16 based on the current speeds of the two observable objects 13 that sandwich the non-observable object 16. Thereby, the inter-vehicle distance estimation device 20 can estimate the speed of the non-observable object 16 with high accuracy and, as a result, improve the accuracy of the inter-vehicle distance estimation.
- In the inter-vehicle distance estimation device 20, the travel area 101 in which the plurality of tracking vehicles 12 travel is estimated using the map data, and the plurality of non-shielding areas 15 are estimated in the travel area 101. Since the inter-vehicle distance estimation device 20 can thereby estimate the non-shielding areas 15 with high accuracy, the observable objects 13 can be detected efficiently.
- In the inter-vehicle distance estimation device 20, whether the observable object 13 and the predicted object correspond to each other is determined based on the similarity between the attribute information of the observable object 13 and the attribute information of the predicted object, and the tracking vehicle 12 is tracked accordingly. In particular, by adopting the shape, color, and the like as the attribute information of the tracking vehicle 12, the inter-vehicle distance estimation device 20 can track the same tracking vehicle 12 with high accuracy even if the tracking vehicle 12 temporarily enters the shielding area 14 and becomes a non-observable object 16.
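The similarity-based correspondence check can be illustrated with a toy attribute score; a sketch assuming each object carries shape and color attributes (the dictionary keys and the threshold are illustrative, not from the patent):

```python
def match_score(observed, predicted):
    """Toy similarity count over shape and color attributes of an
    observed object and a predicted object."""
    return int(observed["shape"] == predicted["shape"]) + \
           int(observed["color"] == predicted["color"])

def corresponds(observed, predicted, threshold=2):
    """Treat the observed object and the predicted object as the same
    tracking vehicle when enough attributes agree."""
    return match_score(observed, predicted) >= threshold

same = corresponds({"shape": "sedan", "color": "red"},
                   {"shape": "sedan", "color": "red"})  # True
```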
- Note that the object matching unit 33 may determine whether or not the observable object 13 and the predicted object correspond to each other using the posture as the attribute information.
- the inter-vehicle distance estimation device 20 can track the same tracking vehicle 12 with higher accuracy.
- the posture of the tracking vehicle 12 can be estimated based on, for example, a portion bent in an L shape in the point cloud data obtained by the sensor 21, a tangential direction of a movement history, a road shape, and the like.
- The present invention includes various embodiments not described here, such as configurations in which the above-described configurations are applied to one another. Therefore, the technical scope of the present invention is defined only by the matters specifying the invention according to the claims, as reasonably derived from the above description.
Description
FIG. 1 is a block diagram showing the configuration of the inter-vehicle distance estimation device 20 according to the present embodiment. The inter-vehicle distance estimation device 20 includes a sensor 21, a map data storage unit 22, a self-position estimation unit 23, a motion information acquisition unit 24, an output unit 25, and a processing circuit 30. As shown in FIG. 2, the inter-vehicle distance estimation device 20 is mounted on a vehicle 11 (host vehicle), for example, and estimates the inter-vehicle distances of a line of vehicles including other vehicles shielded by an obstacle.
Hereinafter, an example of the inter-vehicle distance estimation method performed by the inter-vehicle distance estimation device 20 will be described with reference to the flowcharts of FIGS. 3A and 3B. The series of processes described below is repeatedly executed at predetermined time intervals. As shown in FIG. 2, the description takes as an example a scene in which the vehicle 11 equipped with the inter-vehicle distance estimation device 20 estimates the inter-vehicle distances of a plurality of tracking vehicles 12 traveling in the far lane of the road 10 ahead, in order to merge into that lane.
v2=40+10×(2/6)≈43.3 …(1)
v3=40+10×(5/6)=48.3 …(2)
v5=35+15×(2/3)=45 …(3)
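Substituting the numbers confirms the internal-division arithmetic; note that the stated result of Eq. (3), 45 km/h, is reproduced only if the second pair of observed speeds differs by 15 km/h (e.g. 35 km/h and 50 km/h), which is assumed below:

```python
# Eq. (1): hidden vehicle 2/6 of the way between 40 km/h and 50 km/h observers
v2 = 40 + 10 * (2 / 6)
# Eq. (2): another hidden vehicle 5/6 of the way between the same pair
v3 = 40 + 10 * (5 / 6)
# Eq. (3): hidden vehicle 2/3 of the way between 35 km/h and 50 km/h observers
v5 = 35 + 15 * (2 / 3)
print(round(v2, 1), round(v3, 1), round(v5, 1))  # 43.3 48.3 45.0
```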
FIGS. 6A and 6B are diagrams explaining a scene in which, starting from a state at time t=0 where ID=11, 12, and 13 travel at the same speed, one non-observable object 16 (ID=12) sandwiched between two observable objects 13 (ID=11, 13) accelerates from time t=0 to time t=T. When it is observed that the relative speed of the tracking vehicle 12 of ID=13 with respect to the tracking vehicle 12 of ID=11 has increased, it can be estimated that the tracking vehicle 12 of ID=12, which is the non-observable object 16, accelerated from time t=0 to time t=T. As a result, the inter-vehicle distances of the three tracking vehicles 12 of ID=11 to 13 are each estimated to have narrowed.
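The inference of FIGS. 6A and 6B can be mimicked numerically; a minimal sketch, assuming the hidden middle vehicle sits midway between the two observable vehicles so that the internal-division ratio is 0.5 (all numbers are illustrative):

```python
def middle_speed(v_rear, v_front, ratio=0.5):
    """Speed inferred for the hidden middle vehicle (ID=12) by internal
    division of the two observable speeds."""
    return v_rear + (v_front - v_rear) * ratio

# t = 0: ID=11 and ID=13 both observed at 40 km/h
v12_t0 = middle_speed(40.0, 40.0)
# t = T: ID=13 now observed at 50 km/h while ID=11 stays at 40 km/h,
# so the hidden ID=12 is inferred to have accelerated
v12_tT = middle_speed(40.0, 50.0)
print(v12_t0, v12_tT)  # 40.0 45.0
```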
Although the present invention has been described above by way of the embodiment, the statements and drawings forming part of this disclosure should not be understood as limiting the present invention. Various alternative embodiments, examples, and operational techniques will become apparent to those skilled in the art from this disclosure.
12 Tracking vehicle
14 Shielding area
15 Non-shielding area
20 Inter-vehicle distance estimation device
21 Sensor
30 Processing circuit
101 Travel area
Claims (5)
- An inter-vehicle distance estimation method using a sensor that detects position information of objects around a vehicle and a processing circuit that estimates inter-vehicle distances of a plurality of tracking vehicles detected based on the position information, wherein the processing circuit estimates a shielding area shielded from the sensor by an obstacle and two non-shielding areas sandwiching the shielding area, and estimates the inter-vehicle distance with respect to the tracking vehicle traveling on the same lane in the shielding area, based on the speeds of the two tracking vehicles respectively traveling on the same lane in the two non-shielding areas.
- The inter-vehicle distance estimation method according to claim 1, wherein the speed of the tracking vehicle traveling on the same lane in the shielding area is estimated, and the inter-vehicle distance is estimated based on the estimated speed of the tracking vehicle.
- The inter-vehicle distance estimation method according to claim 1 or 2, wherein the processing circuit estimates, in map data, a travel area in which the plurality of tracking vehicles travel, and estimates the two non-shielding areas in the travel area.
- The inter-vehicle distance estimation method according to any one of claims 1 to 3, wherein the sensor detects at least one of a shape and a color of the tracking vehicle, and the processing circuit tracks the tracking vehicle based on at least one of the shape and the color of the tracking vehicle.
- An inter-vehicle distance estimation device comprising: a sensor that detects position information of objects around a vehicle; and a processing circuit that estimates inter-vehicle distances of a plurality of tracking vehicles detected based on the position information, wherein the processing circuit estimates a shielding area shielded from the sensor by an obstacle and two non-shielding areas sandwiching the shielding area, estimates the speed of the tracking vehicle traveling in the shielding area based on the speeds of the two tracking vehicles respectively traveling in the two non-shielding areas, and estimates the inter-vehicle distances of the plurality of tracking vehicles based on the estimated speed of the tracking vehicle.
Priority Applications (10)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201680086753.6A CN109314763B (zh) | 2016-06-14 | 2016-06-14 | 车间距离推定方法及车间距离推定装置 |
US16/307,278 US10501077B2 (en) | 2016-06-14 | 2016-06-14 | Inter-vehicle distance estimation method and inter-vehicle distance estimation device |
MX2018014953A MX368800B (es) | 2016-06-14 | 2016-06-14 | Método de estimación de distancia entre vehículos y dispositivo de estimación de distancia entre vehículos. |
PCT/JP2016/067609 WO2017216856A1 (ja) | 2016-06-14 | 2016-06-14 | 車間距離推定方法及び車間距離推定装置 |
EP16905411.1A EP3471408B1 (en) | 2016-06-14 | 2016-06-14 | Inter-vehicle distance estimation method and inter-vehicle distance estimation device |
BR112018075863-9A BR112018075863A2 (pt) | 2016-06-14 | 2016-06-14 | método de estimativa de distância entre veículos e dispositivo de estimativa de distância entre veículos |
JP2018523063A JP6699728B2 (ja) | 2016-06-14 | 2016-06-14 | 車間距離推定方法及び車間距離推定装置 |
RU2018146911A RU2693015C1 (ru) | 2016-06-14 | 2016-06-14 | Способ оценки расстояния между транспортными средствами и устройство оценки расстояния между транспортными средствами |
CA3027328A CA3027328A1 (en) | 2016-06-14 | 2016-06-14 | Inter-vehicle distance estimation method and inter-vehicle distance estimation device |
KR1020187035096A KR101980509B1 (ko) | 2016-06-14 | 2016-06-14 | 차간 거리 추정 방법 및 차간 거리 추정 장치 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2016/067609 WO2017216856A1 (ja) | 2016-06-14 | 2016-06-14 | 車間距離推定方法及び車間距離推定装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017216856A1 true WO2017216856A1 (ja) | 2017-12-21 |
Family
ID=60664511
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/067609 WO2017216856A1 (ja) | 2016-06-14 | 2016-06-14 | 車間距離推定方法及び車間距離推定装置 |
Country Status (10)
Country | Link |
---|---|
US (1) | US10501077B2 (ja) |
EP (1) | EP3471408B1 (ja) |
JP (1) | JP6699728B2 (ja) |
KR (1) | KR101980509B1 (ja) |
CN (1) | CN109314763B (ja) |
BR (1) | BR112018075863A2 (ja) |
CA (1) | CA3027328A1 (ja) |
MX (1) | MX368800B (ja) |
RU (1) | RU2693015C1 (ja) |
WO (1) | WO2017216856A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113613166A (zh) * | 2021-07-30 | 2021-11-05 | 安标国家矿用产品安全标志中心有限公司 | 井下带状定位目标的定位方法、装置及服务器 |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112650243B (zh) * | 2020-12-22 | 2023-10-10 | 北京百度网讯科技有限公司 | 车辆控制方法、装置、电子设备和自动驾驶车辆 |
CN117689907B (zh) * | 2024-02-04 | 2024-04-30 | 福瑞泰克智能系统有限公司 | 车辆跟踪方法、装置、计算机设备和存储介质 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007334631A (ja) * | 2006-06-15 | 2007-12-27 | Sony Corp | 画像監視システムおよび物体領域追跡方法 |
JP2010072836A (ja) * | 2008-09-17 | 2010-04-02 | Toyota Motor Corp | 周辺監視装置 |
JP2012191354A (ja) * | 2011-03-09 | 2012-10-04 | Canon Inc | 情報処理装置、情報処理方法及びプログラム |
JP2014203349A (ja) * | 2013-04-08 | 2014-10-27 | スズキ株式会社 | 車両運転支援装置 |
Family Cites Families (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4200572B2 (ja) * | 1999-01-18 | 2008-12-24 | 株式会社エクォス・リサーチ | 渋滞検出装置 |
JP2000222669A (ja) * | 1999-01-28 | 2000-08-11 | Mitsubishi Electric Corp | 交通流推定装置および交通流推定方法 |
JP4453217B2 (ja) * | 2001-04-11 | 2010-04-21 | 日産自動車株式会社 | 車間距離制御装置 |
JP2003346282A (ja) * | 2002-05-29 | 2003-12-05 | Toshiba Corp | 道路監視システム及び道路監視方法 |
JP3979382B2 (ja) * | 2003-12-03 | 2007-09-19 | 日産自動車株式会社 | 車線逸脱防止装置 |
AT502315B1 (de) * | 2004-01-19 | 2007-10-15 | Siemens Ag Oesterreich | Mobiles, kameragestütztes abstands- und geschwindigkeitsmessgerät |
JP2005284678A (ja) * | 2004-03-29 | 2005-10-13 | Mitsubishi Heavy Ind Ltd | 交通流計測装置 |
EP2084689B1 (en) * | 2006-11-15 | 2010-09-29 | Aitek S.P.A. | Method and apparatus for determining the distance between two vehicles running along a road or motorway section, particularly in a tunnel |
JP5233432B2 (ja) * | 2008-06-16 | 2013-07-10 | アイシン・エィ・ダブリュ株式会社 | 運転支援システム、運転支援方法及び運転支援プログラム |
WO2010073292A1 (ja) * | 2008-12-22 | 2010-07-01 | トヨタ自動車株式会社 | レーダ装置、及び当該レーダ装置において用いられる測定方法 |
SI2306428T1 (sl) * | 2009-10-01 | 2012-03-30 | Kapsch Trafficcom Ag | Naprava in postopek za določanje smeri hitrostiin ali razmaka vozil |
US20110082620A1 (en) * | 2009-10-05 | 2011-04-07 | Tesla Motors, Inc. | Adaptive Vehicle User Interface |
JP5312367B2 (ja) | 2010-02-12 | 2013-10-09 | 村田機械株式会社 | 走行台車システム |
JP5218861B2 (ja) | 2010-09-30 | 2013-06-26 | 株式会社Jvcケンウッド | 目標追跡装置、目標追跡方法 |
JP2012192878A (ja) * | 2011-03-17 | 2012-10-11 | Toyota Motor Corp | 危険度判定装置 |
KR101838710B1 (ko) * | 2011-06-20 | 2018-04-26 | 현대모비스 주식회사 | 선진 안전 차량에서 스쿨 존 안전 장치 및 그 방법 |
JP5922947B2 (ja) * | 2012-02-22 | 2016-05-24 | 富士重工業株式会社 | 車外環境認識装置および車外環境認識方法 |
RU2572952C1 (ru) * | 2012-07-27 | 2016-01-20 | Ниссан Мотор Ко., Лтд. | Устройство обнаружения трехмерных объектов и способ обнаружения трехмерных объектов |
US9988047B2 (en) * | 2013-12-12 | 2018-06-05 | Magna Electronics Inc. | Vehicle control system with traffic driving control |
CN103985252A (zh) * | 2014-05-23 | 2014-08-13 | 江苏友上科技实业有限公司 | 一种基于跟踪目标时域信息的多车辆投影定位方法 |
WO2016024314A1 (ja) * | 2014-08-11 | 2016-02-18 | 日産自動車株式会社 | 車両の走行制御装置及び方法 |
US9558659B1 (en) * | 2014-08-29 | 2017-01-31 | Google Inc. | Determining the stationary state of detected vehicles |
JP6447481B2 (ja) * | 2015-04-03 | 2019-01-09 | 株式会社デンソー | 起動提案装置及び起動提案方法 |
JP6558733B2 (ja) * | 2015-04-21 | 2019-08-14 | パナソニックIpマネジメント株式会社 | 運転支援方法およびそれを利用した運転支援装置、運転制御装置、車両、運転支援プログラム |
US10431094B2 (en) * | 2016-05-30 | 2019-10-01 | Nissan Motor Co., Ltd. | Object detection method and object detection apparatus |
CN109416883B (zh) * | 2016-06-27 | 2020-05-26 | 日产自动车株式会社 | 车辆控制方法及车辆控制装置 |
US10821976B2 (en) * | 2017-01-30 | 2020-11-03 | Telenav, Inc. | Navigation system with dynamic speed setting mechanism and method of operation thereof |
- 2016
- 2016-06-14 BR BR112018075863-9A patent/BR112018075863A2/pt unknown
- 2016-06-14 WO PCT/JP2016/067609 patent/WO2017216856A1/ja unknown
- 2016-06-14 CN CN201680086753.6A patent/CN109314763B/zh active Active
- 2016-06-14 KR KR1020187035096A patent/KR101980509B1/ko active IP Right Grant
- 2016-06-14 MX MX2018014953A patent/MX368800B/es active IP Right Grant
- 2016-06-14 JP JP2018523063A patent/JP6699728B2/ja active Active
- 2016-06-14 US US16/307,278 patent/US10501077B2/en active Active
- 2016-06-14 EP EP16905411.1A patent/EP3471408B1/en active Active
- 2016-06-14 CA CA3027328A patent/CA3027328A1/en not_active Abandoned
- 2016-06-14 RU RU2018146911A patent/RU2693015C1/ru active
Non-Patent Citations (1)
Title |
---|
See also references of EP3471408A4 * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113613166A (zh) * | 2021-07-30 | 2021-11-05 | 安标国家矿用产品安全标志中心有限公司 | 井下带状定位目标的定位方法、装置及服务器 |
CN113613166B (zh) * | 2021-07-30 | 2023-04-18 | 安标国家矿用产品安全标志中心有限公司 | 井下带状定位目标的定位方法、装置及服务器 |
Also Published As
Publication number | Publication date |
---|---|
EP3471408B1 (en) | 2021-04-21 |
KR20190005189A (ko) | 2019-01-15 |
CA3027328A1 (en) | 2017-12-21 |
KR101980509B1 (ko) | 2019-05-20 |
RU2693015C1 (ru) | 2019-07-01 |
US10501077B2 (en) | 2019-12-10 |
CN109314763A (zh) | 2019-02-05 |
MX2018014953A (es) | 2019-04-25 |
BR112018075863A2 (pt) | 2019-03-19 |
EP3471408A4 (en) | 2019-08-21 |
US20190276021A1 (en) | 2019-09-12 |
JP6699728B2 (ja) | 2020-05-27 |
JPWO2017216856A1 (ja) | 2019-06-13 |
MX368800B (es) | 2019-10-17 |
CN109314763B (zh) | 2021-01-05 |
EP3471408A1 (en) | 2019-04-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10019017B2 (en) | Autonomous driving system | |
JP6566132B2 (ja) | 物体検出方法及び物体検出装置 | |
US10493984B2 (en) | Vehicle control method and vehicle control device | |
CN108688660B (zh) | 运行范围确定装置 | |
US20190072674A1 (en) | Host vehicle position estimation device | |
US11847838B2 (en) | Recognition device | |
CN103608217B (zh) | 用于求取车辆与位于车辆侧面的对象之间的相对位置的方法 | |
US20240199006A1 (en) | Systems and Methods for Selectively Decelerating a Vehicle | |
WO2017216856A1 (ja) | 車間距離推定方法及び車間距離推定装置 | |
CN115050203B (zh) | 地图生成装置以及车辆位置识别装置 | |
CN114987529A (zh) | 地图生成装置 | |
CN115107798A (zh) | 车辆位置识别装置 | |
US20220120568A1 (en) | Electronic device for vehicle, and method of operating electronic device for vehicle | |
JP2020051943A (ja) | 車両用ステレオカメラ装置 | |
US12025752B2 (en) | Systems and methods for detecting erroneous LIDAR data | |
WO2021145032A1 (ja) | 周辺車両位置推定システム、周辺車両位置推定プログラム | |
JP2022129177A (ja) | 運転支援方法及び運転支援装置 | |
JP2022121835A (ja) | 距離算出装置および車両位置推定装置 | |
JP2022123988A (ja) | 区画線認識装置 | |
CN114987530A (zh) | 地图生成装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| ENP | Entry into the national phase | Ref document number: 2018523063; Country of ref document: JP; Kind code of ref document: A |
| ENP | Entry into the national phase | Ref document number: 20187035096; Country of ref document: KR; Kind code of ref document: A |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16905411; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 3027328; Country of ref document: CA |
| NENP | Non-entry into the national phase | Ref country code: DE |
| REG | Reference to national code | Ref country code: BR; Ref legal event code: B01A; Ref document number: 112018075863; Country of ref document: BR |
| ENP | Entry into the national phase | Ref document number: 2016905411; Country of ref document: EP; Effective date: 20190114 |
| ENP | Entry into the national phase | Ref document number: 112018075863; Country of ref document: BR; Kind code of ref document: A2; Effective date: 20181213 |