CN108352116B - Vehicle surrounding information management device - Google Patents

Vehicle surrounding information management device

Info

Publication number
CN108352116B
CN108352116B (application CN201680030600.XA)
Authority
CN
China
Prior art keywords
information
vehicle
data
region
management device
Prior art date
Legal status
Active
Application number
CN201680030600.XA
Other languages
Chinese (zh)
Other versions
CN108352116A (en)
Inventor
田中裕也
Current Assignee
Hitachi Astemo Ltd
Original Assignee
Hitachi Astemo Ltd
Priority date
Filing date
Publication date
Application filed by Hitachi Astemo Ltd
Publication of CN108352116A
Application granted
Publication of CN108352116B

Classifications

    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G06V20/588 Recognition of the road, e.g. of lane markings; recognition of the vehicle driving pattern in relation to the road
    • G06T1/00 General purpose image data processing
    • G08G1/16 Traffic control systems for road vehicles: anti-collision systems

Abstract

The invention provides a vehicle periphery information management device capable of suppressing the load on the network and the processing load on the side receiving the provided information. The device comprises: a data acquisition unit that acquires a plurality of types of external information data around a vehicle; and a data selection unit that selects, according to a predetermined position relative to the host vehicle, a part of the plurality of types of external information data acquired by the data acquisition unit and outputs the selected part to the outside.

Description

Vehicle surrounding information management device
Technical Field
The present invention relates to a vehicle periphery information management device that acquires and manages information on the surroundings of a vehicle and provides that information to the outside.
Background
In recent years, autonomous driving control, which the industry regards as increasingly important, has required vehicles to carry a plurality of information acquisition devices in order to secure the safety of the host vehicle in all directions. The information obtained from these devices covers many sources, such as detection results from external recognition sensors, map information, and vehicle-to-vehicle communication data, so the amount of information handled in vehicle control keeps increasing. The load on the in-vehicle network and the processing load of the control units are therefore expected to grow, and a device that can manage and provide this information efficiently is in demand.
As an example of a system for reducing the load on the in-vehicle network, patent document 1 discloses the following: an integration ECU is provided for each of a plurality of networks in a vehicle and selects, from the information it receives, only the information required by the other networks before forwarding it, thereby optimizing the amount of data. In patent document 1, the network is divided into a vehicle motion system managing braking, steering, ACC control, and the like, a power train system managing the engine, transmission, and the like, and a power supply system managing the battery, alternator, and the like, and the exchange of host vehicle information between them is made efficient.
Documents of the prior art
Patent document
Patent document 1: japanese patent laid-open publication No. 2004-136816
Disclosure of Invention
Problems to be solved by the invention
However, the amount of object recognition information to be acquired depends on the traveling situation; it increases, for example, when the surroundings of the host vehicle are congested. Because automatic driving control mounts a plurality of information acquisition devices as described above, the amount of data supplied from those devices grows in proportion to the number of recognized objects, road signs, and the like. Patent document 1 does not consider this situation-dependent growth of information, so the amount of transmitted data increases with the number of recognized objects and road signs, and the resulting increase in the network load and in the processing load on the side that acquires the provided information (the receiving side) becomes a problem.
An object of the present invention is to provide a vehicle periphery information management device capable of suppressing the load on the network and the processing load on the receiving side.
Means for solving the problems
In order to solve the above problem, the present invention provides a vehicle periphery information management device including: a data integration unit that integrates the acquired ambient environment information of the vehicle; and a filter unit that filters the data integrated by the data integration unit, wherein the filter unit performs a process of removing, from the integrated data, ambient environment information data located beyond a predetermined distance from the host vehicle.
Advantageous Effects of Invention
According to the present invention, it is possible to provide a vehicle periphery information management device capable of suppressing the load on the network and the processing load on the receiving side.
Drawings
Fig. 1 is a diagram showing an example of a configuration of an in-vehicle system using a host vehicle surrounding information management device according to the present invention.
Fig. 2 is a diagram showing an example of the external recognition range of the present invention.
Fig. 3 is a diagram showing an example of a processing block of the data integration unit according to the present invention.
Fig. 4 is a diagram showing an example of a processing block of the filter parameter determination unit according to the present invention.
Fig. 5 is a diagram showing an example of a processing block of the travelable region determination unit according to the present invention.
Fig. 6 is a diagram showing a travelable region calculation flowchart of the present invention.
Fig. 7 is a diagram illustrating an example of the search target travel area according to the present invention.
Fig. 8 is a diagram illustrating an example of the possibility of adjacent lane movement determination according to the present invention.
Fig. 9 is a diagram illustrating a travelable region scene 1 of the present invention.
Fig. 10 is a diagram illustrating travelable area adjacent information scene 1 according to the present invention.
Fig. 11 is a diagram illustrating a travelable region scene 2 of the present invention.
Fig. 12 is a diagram illustrating a travelable area adjacent information scene 2 according to the present invention.
Fig. 13 is a diagram illustrating a travelable area adjacent information scene 3 according to the present invention.
Fig. 14 is a diagram showing an example of each peripheral information filtering cycle list according to the present invention.
Fig. 15 is a diagram showing an example of each peripheral information filtering target data list according to the present invention.
Fig. 16 is a diagram showing a filtering parameter calculation flowchart according to the present invention.
Fig. 17 is a diagram illustrating comparison between the time of congestion traveling and the time of normal traveling according to the present invention.
Detailed Description
Embodiments of the present invention will be described below with reference to the drawings.
Example 1
Fig. 1 shows the configuration of the in-vehicle own vehicle surrounding information management device 01, its internal functions, the information acquisition devices, and the information providing destinations. The own vehicle periphery information management device 01 receives and manages, collectively through the input communication network 12, information indicating the behavior of the own vehicle acquired from the own vehicle behavior recognition sensor 02, information on external objects, obstacles, road signs, and the like acquired from the external recognition sensor 03, information such as road shape and road signs acquired from the map 04, and own vehicle position information acquired from the GPS05. The managed information is reduced to the minimum information necessary for the processing performed at each information providing destination, such as the automatic driving control ECU06, the drive ECU07, and the display ECU08, and is transmitted via the output communication network 13.
The vehicle behavior recognition sensor 02 includes, for example, a gyro sensor, a wheel speed sensor, a steering angle sensor, an acceleration sensor, and the like mounted on the vehicle, and can acquire a yaw rate, a wheel speed, a steering angle, an acceleration, and the like representing the behavior of the vehicle.
The external recognition sensor 03 includes a camera, a radar, and other sensors, and can acquire the relative positions and states of objects, obstacles, road signs, and the like around the vehicle.
As shown in fig. 2, the host vehicle F01 is equipped with a rear radar F02, an omnidirectional camera F03, a front camera F04, and a front radar F05, and is configured as a detection system capable of detecting omnidirectional information.
The more external recognition sensors 03 are mounted, the more information can be acquired and the greater the effect of reducing the data transmitted to the network, but the sensors need not cover all directions. For example, even in a configuration having only the front camera F04, the amount of acquirable data increases as more objects are recognized, so transmitting data to the network only after determining the travelable area ahead still reduces the data amount. In addition to the external recognition sensor 03, vehicle-to-vehicle communication and road-to-vehicle communication may be included in the configuration. Vehicle-to-vehicle communication can wirelessly acquire the yaw rate, steering angle, speed, acceleration, brake light state, direction indicator state, and the like representing the behavior of other vehicles. Road-to-vehicle communication can wirelessly acquire the positions and states of road signs, traffic lights, other vehicles passing nearby, and the like from roadside devices installed along the road.
The map 04 distributes road information within a range of several kilometers around the own vehicle. Examples of the distributed road information include speed limit information, road attributes such as curvature and gradient, road width, the number of lanes, and branching, merging, and toll booth positions.
The GPS05 is a GPS receiver that receives signals from satellites and can acquire the position of the own vehicle. The heading, speed, altitude, and the like of the own vehicle can also be acquired as supplementary information.
The input communication network 12 and the output communication network 13 exchange information through CAN (Controller Area Network) and serial communication between CPUs, both commonly used in vehicle-mounted systems, as well as Ethernet (registered trademark), which is expected to spread in vehicles in the future, wireless communication, and the like.
The automatic drive control ECU06 has a function of automatically operating the steering wheel, brake, and accelerator of the vehicle to reach a target position. The automatic driving control may be fully automatic in which the driver does not operate at all, or may be semi-automatic in which a part of the operation is automatically controlled.
The drive ECU07 performs control for maintaining the running stability of the vehicle, such as traction control and an anti-lock brake system.
The display ECU08 is an ECU for displaying information on the periphery of the own vehicle on a navigation system, an instrument panel, and the like. In the present invention, three ECUs, namely the automatic drive control ECU06, the drive ECU07, and the display ECU08, are given as examples of users of the information from the host vehicle surrounding information management device 01, but the set may be changed depending on the purpose of use and the configuration of the vehicle.
The host vehicle peripheral information management device 01 includes a data integration unit (also referred to as a data acquisition unit, the same applies hereinafter) 09 and a data selection unit 14. The data selection unit 14 is constituted by the filter parameter determination unit 10 and the output data filtering unit 11. The filter parameter determination unit 10, the output data filtering unit 11, and the data integration unit 09 each acquire, via the input communication network 12, the own vehicle behavior information, the external recognition information, the map information, and the own vehicle position information obtained from the own vehicle behavior recognition sensor 02, the external recognition sensor 03, the map 04, the GPS05, and the like. Because the data integration unit 09 receives information from various information acquisition devices, it discriminates the characteristics of each device and integrates the inputs into own vehicle information and own vehicle periphery information. The data integration unit 09 is a component that performs the preprocessing required by the filter parameter determination unit 10 and the output data filtering unit 11: it establishes consistency among the data from the plurality of asynchronously operating information acquisition devices so that the output data filtering unit can reduce the data amount. The filter parameter determination unit 10 calculates parameters that determine how the integrated information on the periphery of the host vehicle is to be filtered and how the transmitted data is to be reduced. The output data filtering unit 11 filters the vehicle periphery information based on the filter parameters calculated by the filter parameter determination unit 10 and transmits the reduced, filtered vehicle periphery information to the output communication network 13.
Fig. 3 shows an example of the processing block of the data integration unit 09.
The data integration unit 09 is composed of a coordinate conversion processing unit 20, a synchronization processing unit 21, an object grouping processing unit 22, an object tracking processing unit 23, and an object lane position determination unit 24. Its inputs are the information from the host vehicle behavior recognition sensor 02, the external recognition sensor 03, the map 04, and the GPS05. The coordinate conversion processing unit 20 and the synchronization processing unit 21 are needed to create own vehicle information and own vehicle peripheral information in which the inputs from the plurality of information acquisition devices are mutually consistent. The object grouping processing unit 22 and the object tracking processing unit 23 are needed to turn the per-device object information into information consistent with the real world. The object lane position determination unit 24 is needed to calculate object positions that are consistent with the lane information from the map.
Regarding the inputs handled by the data integration unit 09, the coordinate systems used by the individual information acquisition devices may differ. For example, a wheel speed sensor, one of the host vehicle behavior recognition sensors 02, uses an orthogonal coordinate system centered on each of the four wheel positions, while a millimeter-wave radar for forward collision avoidance, one of the external recognition sensors 03, uses an orthogonal coordinate system centered on the vehicle nose and oriented in the traveling direction. The map 04 uses a geodetic coordinate system, expressing positions on the earth by latitude and longitude. To absorb these differences, the coordinate conversion processing unit 20 converts all inputs into one representative coordinate system, and the converted information is used in subsequent processing. For example, an orthogonal coordinate system referenced to the vehicle center may be used as the representative system. Because everything is converted once into a single coordinate system, later steps that associate the information of the external recognition sensor 03 with the map 04 do not have to perform coordinate conversion for each information acquisition device.
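To make the conversion concrete, the following is a minimal sketch of transforming one detection from a sensor-local frame into a vehicle-centered representative frame. It is an illustration only; the function name, the 2D planar model, and the mounting values are assumptions, not anything prescribed by the patent.

```python
import math

def sensor_to_vehicle_frame(x_s, y_s, mount_x, mount_y, mount_yaw_rad):
    """Convert a point (x_s, y_s) in a sensor-local frame into the
    vehicle-centered orthogonal frame, given the sensor's mounting pose
    (translation plus yaw) on the vehicle."""
    x_v = mount_x + x_s * math.cos(mount_yaw_rad) - y_s * math.sin(mount_yaw_rad)
    y_v = mount_y + x_s * math.sin(mount_yaw_rad) + y_s * math.cos(mount_yaw_rad)
    return x_v, y_v

# A front radar assumed to be mounted 3.8 m ahead of the vehicle center, facing forward:
print(sensor_to_vehicle_frame(10.0, 0.5, mount_x=3.8, mount_y=0.0, mount_yaw_rad=0.0))
# (13.8, 0.5): a detection 10 m ahead of the radar lies 13.8 m ahead of the vehicle center.
```

Because every input is converted once into this single frame, downstream association between, say, radar objects and map lanes needs no per-device conversion.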
Next, the synchronization processing unit 21 corrects the differences in acquisition timing of the data obtained from each information acquisition device to achieve synchronization. In an in-vehicle system, each information acquisition device operates asynchronously, and their transmission cycles differ. If the object position information acquired from the devices were used as-is, the objects detected by different devices would be mutually shifted, so the positional relationships held by the own vehicle surrounding information management device would differ from those of the real world. In autonomous driving, where control decisions are made by judging the positional relationship of multiple vehicles, this increases the possibility of collision and is dangerous.
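A common way to realize such timing correction is to extrapolate each detection to a shared timestamp using its relative velocity. The sketch below illustrates that idea under simplifying assumptions; the dictionary layout and field names are invented for the example.

```python
def synchronize(det, t_common):
    """Extrapolate an asynchronous detection to a common timestamp so that
    positions reported by different sensors can be compared directly."""
    dt = t_common - det["t"]
    out = dict(det)
    out["t"] = t_common
    out["x"] = det["x"] + det["vx"] * dt
    out["y"] = det["y"] + det["vy"] * dt
    return out

# A radar and a camera report the same car 40 ms apart; align both to t = 0.05 s.
radar_obj  = {"t": 0.00, "x": 20.0, "y": 0.0, "vx": -5.0, "vy": 0.0}
camera_obj = {"t": 0.04, "x": 19.8, "y": 0.0, "vx": -5.0, "vy": 0.0}
aligned = [synchronize(o, t_common=0.05) for o in (radar_obj, camera_obj)]
print(aligned)  # both extrapolate to x = 19.75 m, so the apparent offset disappears
```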
The object grouping processing unit 22 determines, when the objects detected by the individual information acquisition devices are the same object in the real world, that they are indeed the same object. As the number of external recognition sensors 03 mounted on the vehicle increases, their detection areas may overlap. For example, in the external recognition range of fig. 2, the rear radar F02, the omnidirectional camera F03, the front camera F04, and the front radar F05 each detect objects in the overlapping area F06. In such a configuration, if the own vehicle surrounding information management device were to transmit one real-world object to the automatic driving control ECU06 as several different objects, control that should only have to predict and handle the behavior of one object would have to predict the behavior of several, and wasteful processing would occur. Therefore, if the objects detected by the individual external recognition sensors are the same object, they must be grouped and sent to the control as a single object.
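As an illustration of the grouping idea, the following sketch merges detections that fall within a small distance gate of an existing group. The gate value and the averaging fusion are illustrative choices, not the patent's method.

```python
import math

def group_detections(dets, gate=1.5):
    """Greedy grouping: detections (from different sensors) closer than
    `gate` metres to an existing group are fused into it by averaging."""
    groups = []
    for d in dets:
        for g in groups:
            if math.hypot(d["x"] - g["x"], d["y"] - g["y"]) < gate:
                n = g["n"]
                g["x"] = (g["x"] * n + d["x"]) / (n + 1)
                g["y"] = (g["y"] * n + d["y"]) / (n + 1)
                g["n"] = n + 1
                break
        else:
            groups.append({"x": d["x"], "y": d["y"], "n": 1})
    return groups

# The front camera and the front radar both see one car in the overlap area F06:
print(group_detections([{"x": 13.8, "y": 0.5}, {"x": 14.1, "y": 0.4}]))
# one fused group instead of two separate objects
```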
The object tracking processing unit 23 performs processing for continuously recognizing a given object as the same object even when it crosses the boundary between the recognition ranges of the information acquisition devices. Consider, in the external recognition range of fig. 2, another vehicle overtaking from the right rear of the host vehicle F01: it is first detected at the right rear by the rear radar F02, and as it moves to the right of the host vehicle, recognition switches from the right rear of the rear radar F02 to the right side of the omnidirectional camera F03. At the moment the other vehicle leaves the right-rear recognition range of the rear radar F02, which detected it first, the rear radar's information about that vehicle is interrupted; since the vehicle has moved to the right of the omnidirectional camera F03, the camera now outputs the information that the radar no longer provides. The object tracking processing unit 23 recognizes the object as the same one even though, viewed from each individual device, its information was interrupted. Without this processing, when recognition switches to the omnidirectional camera F03, an object that was previously perceived would disappear from the data and appear to reappear as a new object. The control side then could not control the same object continuously, the accuracy of its behavior prediction could fall, and the reliability of control would decrease. Moreover, in the present embodiment, if the continuity of the object/obstacle data is lost, the accuracy of the travelable region calculated from the positional relationship with objects/obstacles may fall, reducing the data reduction effect. Therefore, even when an object is detected by different external recognition sensors in turn, the tracking processing keeps it as the same object before it is sent to the control.
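One simple way to keep a stable ID across such a sensor handoff is nearest-neighbor association against the existing tracks, sketched below. The class name, gate value, and association rule are assumptions made for illustration.

```python
import math

class SimpleTracker:
    """Keep a stable track ID while an object crosses sensor boundaries
    (e.g. from the rear radar F02 to the omnidirectional camera F03)."""

    def __init__(self, gate=2.0):
        self.gate = gate
        self.tracks = {}   # track_id -> last known (x, y)
        self.next_id = 0

    def update(self, x, y):
        # Associate with the nearest existing track inside the gate,
        # otherwise open a new track.
        best, best_d = None, self.gate
        for tid, (tx, ty) in self.tracks.items():
            d = math.hypot(x - tx, y - ty)
            if d < best_d:
                best, best_d = tid, d
        if best is None:
            best = self.next_id
            self.next_id += 1
        self.tracks[best] = (x, y)
        return best

tracker = SimpleTracker()
tid1 = tracker.update(-6.0, 2.0)   # overtaking car seen by the rear radar
tid2 = tracker.update(-4.5, 2.1)   # same car, now seen by the camera
assert tid1 == tid2                # the ID survives the handoff
```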
The object lane position determination unit 24 determines in which lane on the road each object/obstacle detected by the information acquisition devices is located. Since the present embodiment assumes, in the following description, processing that calculates a travelable region per lane, the lane in which each detected object or obstacle lies must be calculated.
After the above processing, the own vehicle information and the own vehicle peripheral information are finally output. The own vehicle information here includes the own vehicle position, vehicle speed, yaw rate, steering angle, acceleration, direction indicator information, brake light information, and the like. The own vehicle peripheral information includes the relative positions, relative speeds, and lane positions of vehicles, pedestrians, two-wheeled vehicles, and the like present around the own vehicle, as well as road information, sign information, traffic signal information, and the like existing in the periphery of the own vehicle.
Fig. 4 shows an example of a processing block of the filter parameter determination unit 10.
The filter parameter determination unit 10 is constituted by a travelable region determination unit 30, a travelable region adjacent peripheral information determination unit 31, a peripheral information determination unit 32 for information other than the travelable region adjacent peripheral information, and a filter parameter integration unit 33. First, the travelable region determination unit 30 calculates the travelable region using the information on the periphery of the host vehicle from the data integration unit 09 as input. The travelable region determination unit 30 calculates the region in which the host vehicle can travel based on the host vehicle peripheral information, determining whether the host vehicle can move into each region. In the present embodiment, a region is basically judged movable when no other object, obstacle, or similar phenomenon would obstruct the host vehicle moving there. The travelable region adjacent information list is calculated by the travelable region adjacent peripheral information determination unit 31 using the travelable region output from the travelable region determination unit 30 and the own vehicle peripheral information from the data integration unit 09. This list enumerates objects, obstacles, and the like that can immediately enter the travelable region; since they can approach the host vehicle at once, their information has high safety priority. The filtering object information list is calculated by the peripheral information determination unit 32 using the travelable region adjacent information list and the own vehicle peripheral information from the data integration unit 09. It enumerates objects, obstacles, signs, and the like that are recognized by the external recognition sensor and the map but not registered in the travelable region adjacent information list; since they cannot approach the host vehicle immediately, their safety priority is low. The filter parameter integration unit 33 integrates the parameters using the filtering object information list output from the peripheral information determination unit 32, the filtering cycle list 34 for each piece of peripheral information, and the filtering object data list 35 for each piece of peripheral information, and outputs them to the output data filtering unit. The calculated filter parameters are used for the filtering process in the output data filtering unit 11. The filtering cycle list 34 and the filtering object data list 35 may be statically determined in advance by the system designer, or may be set dynamically by receiving them from the outside as parameters. When received from the outside, for example, a method of collecting the parameters from each of the automatic drive control ECU06, the drive ECU07, and the display ECU08 of fig. 1 is conceivable.
Fig. 5 shows an example of processing blocks of the travelable region determination unit 30.
The travelable region determination unit 30 includes a search target travel region calculation unit 40 and a travelable region calculation unit 41. The search target travel region calculation unit 40 receives the preset front maximum detection distance 42, rear maximum detection distance 43, and number of target lanes 44 as inputs, and outputs the search target travel region to the travelable region calculation unit 41. The search target travel region is the region to be searched for peripheral objects/obstacles in the subsequent travelable region calculation; here, for example, the region detectable by the external recognition sensors is set as the search target, which is why the front maximum detection distance 42 and the rear maximum detection distance 43 serve as inputs. The number of target lanes 44 is set to a fixed value because the number of lanes into which the host vehicle can move in the near future is small. The travelable region calculation unit 41 receives the search target travel region calculated by the search target travel region calculation unit 40, the lane information around the host vehicle, the distances from the host vehicle to the peripheral objects/obstacles, and the lane positions of those objects/obstacles as inputs, and outputs the travelable region. Fig. 7 illustrates the search target travel region. It shows the host vehicle F10, lanes F11, objects/obstacles F12 around the host vehicle detectable by the external recognition sensors (vehicles other than the host vehicle F10), the external recognition sensor detection area F13, and the search target travel region F14. The search target travel region F14 is calculated using the front maximum detectable distance F15 and the rear maximum detectable distance F16 of the detection area F13 and the preset number of target lanes F17. Here the number of target lanes F17 is three, namely the host vehicle's travel lane and the left and right lanes into which the host vehicle can move soonest. Centered on the host vehicle F10, the search covers the three lanes of F17 up to the front maximum detection distance F15 ahead and up to the rear maximum detection distance F16 behind. When the host vehicle is located in the rightmost lane, only two lanes, the host vehicle's travel lane and the lane to its left, may be set as search targets.
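A minimal sketch of this search target travel region calculation follows; the lane indexing, the default distances, and the edge-clipping rule are illustrative assumptions.

```python
def search_target_travel_area(ego_lane, num_road_lanes,
                              front_max_m=120.0, rear_max_m=80.0,
                              lane_span=1):
    """Return the lanes and the longitudinal range to be searched for
    peripheral objects/obstacles (region F14). lane_span=1 means the ego
    lane plus one lane on each side, i.e. at most 3 lanes; lanes beyond
    the road edge are clipped, as in the rightmost-lane case."""
    lanes = [l for l in range(ego_lane - lane_span, ego_lane + lane_span + 1)
             if 0 <= l < num_road_lanes]
    return {"lanes": lanes, "front_m": front_max_m, "rear_m": rear_max_m}

# Ego in the rightmost of 3 lanes (index 2): only 2 lanes remain to search.
print(search_target_travel_area(ego_lane=2, num_road_lanes=3))
```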
Fig. 6 shows an example of a flowchart of the travelable region calculation unit 41.
First, the search target travel region calculation unit 40 shown in fig. 5 calculates the search target travel region (S01).
Next, one lane is selected from the lanes in the search target travel region (S02). In the example of fig. 7 there are at most three lanes, so one of them is selected and the subsequent processes S03 to S15 are performed; these are repeated until all three lanes have been processed. The processes S03 to S15 comprise two parts: S05 to S09 calculate the travelable region ahead of the host vehicle, and S10 to S14 calculate the travelable region behind it.
Next, when the selected lane is a lane adjacent to the host vehicle (the left or right lane), it is determined whether the host vehicle can move into that adjacent lane (S03). This determination is explained with the example shown in fig. 8. Whether the own vehicle F20 in fig. 8(a) can move to the right lane is decided by whether another vehicle F21 has entered the movement possible/impossible zone F25; if it has, movement is considered impossible. In fig. 8(a) the other vehicle F21 has entered the zone F25, so movement is considered impossible; in fig. 8(b) the other vehicle F22 has not entered the zone F25, so movement is considered possible. The longitudinal extent of the movement possible/impossible zone F25 is the own vehicle full length F23 of the own vehicle F20 plus a margin distance F24 at the front and rear, as shown in fig. 8(a); its lateral extent is the lane width. The front and rear margin distance F24 is defined because, even if a gap exactly the size of the own vehicle full length F23 exists, the vehicle cannot actually move there, and a certain inter-vehicle distance must remain. If the host vehicle cannot move to the adjacent lane, it is determined that the selected lane contains no travelable region (S04). Taking travelable region scene 1 of fig. 9 as an example, since another vehicle is present in the right lane of the own vehicle, the lane F30 of the search target travel region F14 is excluded from the travelable region in S04. A lane into which the host vehicle cannot immediately move is excluded because, if it were included, a region into which the vehicle is supposed to be unable to move would be counted as travelable, and the effect of reducing the amount of transmitted data would shrink.
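The movability check of fig. 8 can be sketched as below; the ego length and margin values are placeholders, and reducing the zone to a one-dimensional longitudinal interval is a simplification assumed for the example.

```python
def can_move_to_adjacent_lane(adjacent_offsets_m, ego_length_m=4.5, margin_m=3.0):
    """Fig. 8 check: the ego vehicle may move into the adjacent lane only
    if no object occupies the movement possible/impossible zone F25, i.e.
    the ego full length F23 plus a front and rear margin distance F24.
    adjacent_offsets_m holds longitudinal offsets (m) of adjacent-lane
    objects measured from the ego vehicle center."""
    half_zone = ego_length_m / 2.0 + margin_m
    return all(abs(dx) > half_zone for dx in adjacent_offsets_m)

print(can_move_to_adjacent_lane([4.0]))    # False: inside the zone, as in fig. 8(a)
print(can_move_to_adjacent_lane([12.0]))   # True: clear of the zone, as in fig. 8(b)
```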
Next, when the host vehicle can move to the adjacent lane, it is determined whether another object/obstacle is present ahead in the selected lane (S05). If none is present, the distance to the other object/obstacle present in the lane adjacent to the selected lane is set as the travelable region (S06). Taking travelable region scene 2 of fig. 11 as an example, since there is no vehicle ahead in the right lane of the own vehicle, naively the travelable region would be the front maximum detection distance of the right lane. However, a region into which the vehicle cannot actually move soonest would then become travelable and the data reduction effect would shrink, so in the present embodiment, when there is no other object/obstacle ahead in the selected right lane, the travelable region F51 is set using the distance from the own vehicle to the other object/obstacle F50 (the object/obstacle ahead of the own vehicle) present in the lane adjacent to the left of the selected lane. If another object/obstacle is present ahead, the distance to the nearest one ahead in the selected lane is acquired (S07).
Next, it is determined whether the difference between the distance acquired in S07 and the distance to the other object/obstacle in the adjacent lane nearest the host vehicle is equal to or less than a predetermined value (S08). If it is greater than the predetermined value, the distance to the other object/obstacle present in the lane adjacent to the selected lane is set as the travelable region in S06. As shown in the example of travelable area adjacent information scene 3 in fig. 13, when the difference F73 between the distances to the other object/obstacle F70 ahead in the right lane and to the other object/obstacle F71 ahead of the host vehicle is large, the right lane would otherwise include a region into which the host vehicle cannot actually move soonest; in this case, the travelable region F72 is therefore set using the distance to the other object/obstacle F71 ahead of the host vehicle. When the difference F73 is equal to or less than the predetermined value, the distance from the host vehicle to the other object/obstacle ahead is set as the travelable region (S09). Taking the left lane of the host vehicle in travelable region scene 1 of fig. 9 as an example, since the difference F34 between the distances to the object/obstacle ahead of the host vehicle and to the object/obstacle ahead in the left lane is small, the travelable region F31 of the left lane extends to the object/obstacle nearest ahead in that lane. Likewise, for the host vehicle's travel lane, the travelable region F32 extends to the object/obstacle nearest ahead.
While the processes S05 to S09 above address the area ahead of the host vehicle, the processes S10 to S14 address the area behind it. It is determined whether an object/obstacle is present behind the host vehicle in the selected lane (S10). If none is present, the distance to the object/obstacle present in the lane adjacent to the selected lane is set as the travelable region (S11). If one is present, the distance to the other object/obstacle nearest behind the host vehicle in the selected lane is acquired (S12). Next, it is determined whether the difference between the acquired distance and the distance to the other object/obstacle in the adjacent lane nearest the host vehicle is equal to or less than a predetermined value (S13). If it is greater, the travelable region is set in S11; if it is equal to or less, the distance from the host vehicle to the other object/obstacle behind it is set as the travelable region (S14).
Next, it is determined whether the travelable regions of all lanes have been calculated by the processes S02 to S14 (S15); if not, the processing is repeated from S02. Once all lanes have been calculated, the travelable region calculation ends.
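As a worked illustration of the forward half of this flow (S05 to S09), the sketch below computes the forward extent of the travelable region for one lane; the threshold and maximum-distance values are assumptions, and the rearward steps S10 to S14 mirror the same logic with rear distances.

```python
def forward_travelable_distance(lane_objs_m, neighbor_objs_m,
                                front_max_m=120.0, diff_threshold_m=30.0):
    """Steps S05-S09 for one selected lane. lane_objs_m / neighbor_objs_m
    are forward distances (m) to objects in the selected lane and in the
    lane adjacent to it."""
    nearest_neighbor = min(neighbor_objs_m, default=front_max_m)
    if not lane_objs_m:                                    # S05 "no" -> S06
        return nearest_neighbor
    nearest_own = min(lane_objs_m)                         # S07
    if nearest_own - nearest_neighbor > diff_threshold_m:  # S08 "greater" -> S06
        return nearest_neighbor   # cap with the adjacent lane, as in fig. 13
    return nearest_own                                     # S09

# Scene of fig. 11: right lane empty ahead, adjacent-lane car 40 m ahead.
print(forward_travelable_distance(lane_objs_m=[], neighbor_objs_m=[40.0]))  # 40.0
```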
As a result of processing the travelable region calculation flowchart shown in fig. 6, a travelable region F33 is calculated, taking travelable region scene 1 in fig. 9 as an example. Taking travelable area scene 2 of fig. 11 as an example, travelable area F52 is calculated.
The travelable region adjacent periphery information determination unit 31 of fig. 4 will be described.
When travelable region scene 1 of fig. 9 is applied to the travelable region determination unit 30 of fig. 4, travelable region F33 is obtained. The travelable region adjacent periphery information is the information of the region adjacent to the travelable region F33 of fig. 9, namely the region F40 of travelable area adjacent information scene 1 of fig. 10. The minimum longitudinal width F43 of the travelable region adjacent region may be the full length of the other object/obstacle F41, or may be varied according to the vehicle speed of the other object/obstacle F41. The minimum lateral width F44 may use the full width of the other object/obstacle F41, or the lane width obtained from the map or a camera. The information of the other objects/obstacles F41 located in the travelable region adjacent region F40 is defined as the travelable region adjacent periphery information, and the travelable region adjacent information list, obtained by listing these objects/obstacles, is output from the travelable region adjacent periphery information determination unit 31. For the travelable region F52 shown in travelable region scene 2 of fig. 11, the region adjacent to the travelable region in travelable area adjacent information scene 2 of fig. 12 is F60; as in the example of fig. 10, the other objects/obstacles F61 located in the region F60 are listed as travelable region adjacent periphery information and output to the peripheral information determination unit 32. As described above, an object/obstacle assigned to the travelable region adjacent periphery information is adjacent to the travelable region of the host vehicle and is therefore highly likely to move into that region in the near future; the priority of its information is high, and its position, behavior, and the like must be provided to the user in detail.
The peripheral information determination unit 32 of fig. 4, which handles peripheral information other than the travelable region adjacent peripheral information, will be described.
The peripheral information determination unit 32 turns the other objects/obstacles F42 that were detected by the external recognition sensor but are not among the objects/obstacles F41 included in the travelable region adjacent information list of fig. 10 into the filtering object information list and outputs it to the filter parameter integration unit 33. Likewise for fig. 12, the other objects/obstacles F62 detected by the external recognition sensor, other than the objects/obstacles F61 in the travelable region adjacent information list, are made into the filtering object information list and output to the filter parameter integration unit 33. Unlike the high-priority objects/obstacles adjacent to the travelable region, the objects/obstacles selected here cannot move into the host vehicle's travelable region soonest, so their priority is low. Even if the data provided to the user for the objects and obstacles registered in the filtering object information list is reduced, the influence is expected to be small.
The method of calculating the parameters of the filter, which is executed by the filter parameter integrating unit 33 of fig. 4 on the filter target information list, will be described.
The filter parameter integration unit 33 selects the parameters to be used for filtering, applying the filtering cycle list 34 for each piece of peripheral information and the filtering object data list 35 for each piece of peripheral information to the filtering object information list received from the peripheral information determination unit 32. The filtering cycle list 34 reduces the data amount by thinning the data out in time; the filtering object data list 35 reduces the data amount by thinning out the transmitted contents.
First, an example of the filtering cycle list for each piece of peripheral information is described with reference to fig. 14. In this list, the own vehicle peripheral information object types comprise peripheral three-dimensional objects and peripheral road surface information. The peripheral three-dimensional objects include objects, traffic lights, signs, road edges, and the like. An object further includes information such as an object ID, relative position, relative speed, width, and height; a traffic light includes information such as a traffic light ID, category, relative position, and state; signs and road edges are likewise broken down into detailed items. The peripheral road surface information includes information such as lane markers and other painted markings. A lane marker further includes information such as a lane ID, lane type, relative position, and steering angle, and the other painted markings include a paint label ID, relative position, category, and the like. For each of these contents, the transmission cycle is defined by this list, with a default transmission cycle and a filtering cycle. Basically, the default transmission cycle is set faster than the filtering cycle and is assigned to the information in the travelable region adjacent peripheral information list. The filtering cycle, by contrast, is assigned to the filtering object information list whose transmitted data is to be reduced, suppressing the amount of data sent within a given period. For example, taking the relative position of a peripheral three-dimensional object, the default cycle of 60 ms is applied to the object/obstacle F41 of fig. 10, while the filtering cycle of 100 ms is used for the object/obstacle F42. In this way, for the low-priority object/obstacle F42, information can still be provided to the user while the data amount per unit time is suppressed compared with the high-priority object/obstacle F41.
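The temporal thinning can be pictured with the following sketch; the table values mirror the 60 ms / 100 ms example above, while the data structure and function are assumptions made for illustration.

```python
# Transmission periods (ms) per content, in the spirit of the list of fig. 14.
PERIODS_MS = {
    ("object", "relative_position"): {"default": 60, "filtered": 100},
    ("object", "relative_speed"):    {"default": 60, "filtered": 100},
}

def should_send(kind, content, is_filter_target, last_sent_ms, now_ms):
    """Temporal thinning: filter-target objects use the longer 'filtered'
    period, travelable-region-adjacent objects the 'default' one."""
    entry = PERIODS_MS.get((kind, content))
    if entry is None:
        return True   # content not listed: transmit at the producer's rate
    period = entry["filtered"] if is_filter_target else entry["default"]
    return now_ms - last_sent_ms >= period

# Low-priority object F42 is sent every 100 ms instead of every 60 ms:
print(should_send("object", "relative_position", True, last_sent_ms=0, now_ms=80))   # False
print(should_send("object", "relative_position", True, last_sent_ms=0, now_ms=100))  # True
```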
Next, an example of the filtering object data list for each piece of peripheral information is described with reference to fig. 15. This list has the same own vehicle peripheral information object categories as the filtering cycle list of fig. 14. Here, default transmission contents and filtered transmission contents are defined. Basically, the default transmission contents cover more items than the filtered ones and are assigned to the information in the travelable region adjacent peripheral information list. The filtered transmission contents, by contrast, are assigned to the filtering object information list whose transmitted data is to be reduced, thinning out the transmitted items to suppress the data amount. For example, taking the width of a peripheral three-dimensional object, the object/obstacle F41 of fig. 10 transmits the width data according to the default transmission contents, whereas for the object/obstacle F42 the filtered transmission contents apply and the width data is not transmitted. In this way, for the low-priority object/obstacle F42, information with a reduced data amount can be provided to the user compared with the high-priority object/obstacle F41. In the example of fig. 15 the contents themselves are thinned out to reduce the transmitted data, but a method of compressing the data before transmission may also be used.
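Content thinning can be sketched in the same spirit; the field sets below are illustrative stand-ins for the default and filtered transmission contents of fig. 15.

```python
# Fields allowed per priority class (illustrative, not the patent's table).
SEND_FIELDS = {
    "default":  {"id", "relative_position", "relative_speed", "width", "height"},
    "filtered": {"id", "relative_position", "relative_speed"},  # width/height dropped
}

def thin_object(obj, is_filter_target):
    """Content thinning: keep only the fields allowed for the object's
    priority class, shrinking the per-object payload."""
    allowed = SEND_FIELDS["filtered" if is_filter_target else "default"]
    return {k: v for k, v in obj.items() if k in allowed}

car = {"id": 7, "relative_position": (14.0, 0.5), "relative_speed": -5.0,
       "width": 1.8, "height": 1.5}
print(thin_object(car, is_filter_target=True))   # no width/height, like F42
print(thin_object(car, is_filter_target=False))  # full contents, like F41
```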
As shown in travelable area adjacent information scene 1 of fig. 10, the filter parameters may also be calculated by assigning a priority to each of the travelable region F45 around the host vehicle, the region F40 adjacent to the travelable region, and the other regions, and then using the priority determined for each region. For example, the travelable region F45 around the host vehicle is given high priority, the adjacent region F40 medium priority, and the other regions low priority. In this case, the filter parameters for the other objects, obstacles, and the like belonging to each region may be varied according to the priority. Taking the filtering cycle list of fig. 14 as an example: fig. 14 defines two cycles, a default cycle and a filtering cycle, but when priorities are used, a high-priority filtering cycle, a medium-priority filtering cycle, and a low-priority filtering cycle are defined, with faster transmission cycles assigned to higher priorities. Similarly, in the filtering object data list of fig. 15, the items to be reduced are defined for each priority.
Next, an example of the filter parameter calculation flowchart for the filter parameters output from the filter parameter integration unit 33 is described with reference to fig. 16. First, the filtering object information list is acquired (S21). Next, one piece of object information is selected from the acquired filtering object information list (S22). It is checked whether the object category of the selected information is registered in the filtering cycle list for each piece of peripheral information of fig. 14 (S23); in the example of fig. 14, peripheral three-dimensional objects and peripheral road surface information are listed as object categories. If it is not registered, the update of the filter parameters for the selected object information is skipped. If it is registered, the filtering cycle corresponding to each content of the selected object is stored in the filter parameters (S24); the contents here are items such as the object ID and relative position of an object, or the lane ID and lane type of a lane marker in fig. 14. Next, it is checked whether the selected object category is registered in the filtering object data list for each piece of peripheral information of fig. 15 (S25). If it is not registered, the update of the filter parameters for the selected object information is skipped. If it is registered, the filtered transmission contents corresponding to each content of the selected object are stored in the filter parameters (S26). Next, it is determined whether all object information has been selected (S27); if not, S22 to S26 are repeated. Once all object information has been selected, the filter parameter calculation ends.
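The flow of fig. 16 reduces to a loop over the filtering object information list with two table lookups, as in the hedged sketch below; the data shapes are invented for the example.

```python
def compute_filter_params(filter_target_list, period_list, content_list):
    """Steps S21-S27: for every entry of the filtering object information
    list, look up its category in the period list (fig. 14) and in the
    content list (fig. 15), and record the filtering cycle / filtered
    transmission contents; unregistered categories are skipped."""
    params = {}
    for obj in filter_target_list:                        # S22 / S27 loop
        entry = {}
        if obj["kind"] in period_list:                    # S23
            entry["period_ms"] = period_list[obj["kind"]]       # S24
        if obj["kind"] in content_list:                   # S25
            entry["fields"] = content_list[obj["kind"]]         # S26
        if entry:
            params[obj["id"]] = entry
    return params

period_list  = {"object": 100}                          # filtering cycle, ms
content_list = {"object": {"id", "relative_position"}}  # filtered contents
targets = [{"id": 42, "kind": "object"}, {"id": 43, "kind": "sign"}]
print(compute_filter_params(targets, period_list, content_list))
# {42: {'period_ms': 100, 'fields': {...}}}; id 43 is skipped (not registered)
```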
As shown in fig. 4, the filter parameters calculated by the flowchart of fig. 16 are output to the output data filtering unit 11. For each piece of the host vehicle peripheral information output from the data integration unit 09, the output data filtering unit 11 determines whether it is a target registered in the filter parameters; if it is registered, the data is reduced using the filter parameters, and if not, the data is transmitted with the default settings.
The effect of the present invention is illustrated by comparing congestion travel and normal travel in fig. 17, where (a) shows congestion travel and (b) normal travel. During the normal travel of fig. 17(b), the vehicle keeps a certain inter-vehicle distance from the vehicle ahead, so when the flowchart of fig. 6 is applied, the larger that distance, the larger the travelable region F83 becomes compared with congestion travel. The travelable region F83 then occupies most of the recognition area F82 of the external recognition sensor, so the filtering target region F84 ahead of the own vehicle is narrow and few objects/obstacles become filtering targets; however, because the inter-vehicle distances are large, the amount of data to be provided is not large to begin with. In contrast, during the congestion travel of fig. 17(a), many vehicles ahead of the host vehicle travel at low speed with short inter-vehicle distances, and the travelable region F80 becomes narrow. Compared with normal travel, the travelable region F80 covers only a small part of the recognition area F82, so the filtering target region F81 ahead of the host vehicle is larger than during normal travel. The number of vehicles detected by the external recognition sensor can increase in this situation, and with conventional processing the amount of provided information would grow. In the present invention, the filtering process is applied to the objects and obstacles included in the filtering target region F81, so the amount of provided data can be reduced. Therefore, even in situations where many vehicles are detected by the external recognition sensor, as in congestion travel, the network load caused by information provision can be leveled, and the processing load on the receiving side can be reduced.
Description of the symbols
01 … own vehicle peripheral information management device, 02 … own vehicle behavior recognition sensor, 03 … external recognition sensor, 04 … map, 05 … GPS, 06 … automatic drive control ECU, 07 … drive ECU, 08 … display ECU, 09 … data integration section (data acquisition section), 10 … filter parameter determination section, 11 … output data filter section, 12 … input communication network, 13 … output communication network, 14 … data selection section.

Claims (10)

1. A device for managing information on the periphery of a vehicle, comprising:
a data acquisition unit that acquires a plurality of types of external information data around a vehicle; and
a data selection unit that selects a part of the external information data and outputs the selected part to the outside,
the data selection unit determines a first region, which is a travelable region in which the host vehicle can move, based on whether the host vehicle can move to an adjacent lane and a distance between the host vehicle and another object or an obstacle, based on the outside environment information data, and deletes data of surrounding environment information existing outside the first region and a second region adjacent to the first region.
2. The own-vehicle surrounding information management device according to claim 1,
when a region not adjacent to the first region is set as a third region, the data selection unit gives priority to the ambient environment information data for each of the regions, and changes the content and the amount of data to be output for each of the given priorities.
3. The own-vehicle surrounding information management device according to claim 1,
the deleted information is determined based on low priority information that is information relating to an object that cannot immediately approach the own vehicle.
4. The own-vehicle surrounding information management device according to claim 3,
the deleting includes making a transmission cycle of data of the ambient environment information existing outside the first area and the second area longer than a default value according to the low priority information.
5. The own-vehicle surrounding information management device according to claim 3,
the deleting includes removing the transmission contents of the data of the ambient environment information existing outside the first area and the second area at intervals according to the low priority information.
6. The own-vehicle surrounding information management device according to claim 3,
and deleting the transmission content of the data including the ambient environment information existing outside the first area and the second area in accordance with the low priority information.
7. The own vehicle peripheral information management device according to any one of claims 1 to 6, characterized by having:
a coordinate conversion processing unit for integrating coordinate systems between the plurality of types of external information data,
the data selection unit identifies the travelable region from outside world information data in which the coordinates are integrated.
8. The own vehicle peripheral information management device according to any one of claims 1 to 6, characterized by having:
a synchronization processing unit that synchronizes acquisition timings of the plurality of types of external information data,
the data selection unit identifies the travelable region from the synchronized external information data.
9. The own vehicle peripheral information management device according to any one of claims 1 to 6, characterized by having:
a grouping processing unit that groups the same object among the plurality of types of external information data as a same object,
the data selection unit identifies the travelable region from the grouped external information data.
10. The vehicle periphery information management device according to any one of claims 1 to 6, further comprising:
a tracking processing unit that performs tracking processing on the same object across the plurality of types of external information data,
wherein the data selection unit identifies the travelable region from the tracked external information data.
CN201680030600.XA 2015-07-31 2016-07-20 Vehicle surrounding information management device Active CN108352116B (en)
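
The claims recite functions rather than an implementation. As a non-authoritative illustration only, the following Python sketch models the region-based selection of claims 1 and 2; the region bounds, the priority values, and all identifiers (ObjectInfo, Region, classify) are hypothetical assumptions and do not come from the patent.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ObjectInfo:
    obj_id: int
    x: float  # longitudinal offset from the host vehicle [m]
    y: float  # lateral offset from the host vehicle [m]

@dataclass
class Region:
    x_min: float
    x_max: float
    y_min: float
    y_max: float
    priority: int  # lower value = higher priority (claim 2)

    def contains(self, obj: ObjectInfo) -> bool:
        return self.x_min <= obj.x <= self.x_max and self.y_min <= obj.y <= self.y_max

def classify(obj: ObjectInfo, regions: List[Region]) -> Optional[int]:
    """Return the priority of the first region containing the object, or None."""
    for region in regions:
        if region.contains(obj):
            return region.priority
    return None  # outside the first and second regions: deleted per claim 1

# Assumed geometry: the first region is the travelable lane ahead of the host
# vehicle; the second region covers the adjacent lanes on either side.
first_region = Region(0.0, 60.0, -1.75, 1.75, priority=0)
second_region = Region(0.0, 60.0, -5.25, 5.25, priority=1)

objects = [ObjectInfo(1, 20.0, 0.0), ObjectInfo(2, 30.0, -3.5), ObjectInfo(3, 10.0, 9.0)]
kept = [o for o in objects if classify(o, [first_region, second_region]) is not None]
print([o.obj_id for o in kept])  # [1, 2]; object 3 lies outside both regions and is dropped

Claim 1 additionally conditions the first region on whether the host vehicle can move to an adjacent lane; that decision is abstracted here into the assumed bounds of the two regions.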
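
Claims 4 to 6 name three concrete realizations of the deletion for low-priority objects: a lengthened transmission cycle, thinned-out transmission contents, and removed transmission contents. A minimal sketch, assuming a 50 ms default transmission cycle and illustrative field names (the patent specifies neither):

DEFAULT_PERIOD_MS = 50  # assumed default transmission cycle

def transmission_period_ms(low_priority: bool) -> int:
    """Claim 4: lengthen the transmission cycle beyond the default value."""
    return DEFAULT_PERIOD_MS * 4 if low_priority else DEFAULT_PERIOD_MS

def thin_out(frames: list, low_priority: bool, keep_every: int = 3) -> list:
    """Claim 5: thin out the transmission contents, e.g. keep every third frame."""
    return frames[::keep_every] if low_priority else frames

def strip_fields(record: dict, low_priority: bool) -> dict:
    """Claim 6: remove non-essential content from the transmission entirely."""
    if not low_priority:
        return record
    essential = {"obj_id", "x", "y"}  # assumed minimum field set
    return {k: v for k, v in record.items() if k in essential}

record = {"obj_id": 3, "x": 10.0, "y": 9.0, "velocity": 1.2, "covariance": [0.1] * 9}
print(transmission_period_ms(True))       # 200 (ms) instead of 50
print(thin_out(list(range(10)), True))    # [0, 3, 6, 9]
print(strip_fields(record, True))         # {'obj_id': 3, 'x': 10.0, 'y': 9.0}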
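
Claims 7 to 10 place a preprocessing chain ahead of the data selection unit: coordinate unification, time synchronization, grouping of identical objects, and tracking. The sketch below strings these steps together for two sensors; the sensor poses, the constant-velocity extrapolation, the 2 m grouping gate, and the exponential smoother standing in for a full tracking filter are all assumptions made for illustration.

import math

def to_vehicle_frame(detection, sensor_x, sensor_y, sensor_yaw):
    """Claim 7: convert a sensor-local (x, y) detection into the host-vehicle frame."""
    x, y = detection
    c, s = math.cos(sensor_yaw), math.sin(sensor_yaw)
    return (sensor_x + c * x - s * y, sensor_y + s * x + c * y)

def synchronize(position, velocity, t_meas, t_ref):
    """Claim 8: extrapolate a measurement to a common reference time."""
    dt = t_ref - t_meas
    return (position[0] + velocity[0] * dt, position[1] + velocity[1] * dt)

def group(detections, gate=2.0):
    """Claim 9: greedily merge detections closer than `gate` metres into one object."""
    groups = []
    for d in detections:
        for g in groups:
            if math.dist(d, g[0]) < gate:
                g.append(d)
                break
        else:
            groups.append([d])
    # Represent each group by the centroid of its members.
    return [(sum(p[0] for p in g) / len(g), sum(p[1] for p in g) / len(g)) for g in groups]

def track(previous, measurement, alpha=0.5):
    """Claim 10: an exponential smoother standing in for a tracking filter."""
    return tuple(p + alpha * (m - p) for p, m in zip(previous, measurement))

# A camera and a radar see the same pedestrian at slightly different times.
cam = synchronize(to_vehicle_frame((12.0, 0.4), 2.0, 0.0, 0.0), (0.0, -1.0), 0.00, 0.05)
radar = synchronize(to_vehicle_frame((13.8, -0.2), 0.5, 0.0, 0.0), (0.0, -1.0), 0.02, 0.05)
fused = group([cam, radar])[0]        # one grouped object in unified coordinates
previous_estimate = (14.0, 0.1)       # assumed track state from the previous cycle
print(track(previous_estimate, fused))  # updated track handed to the data selection unit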

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2015-151449 2015-07-31
JP2015151449 2015-07-31
PCT/JP2016/071198 WO2017022475A1 (en) 2015-07-31 2016-07-20 Vehicle periphery information management device

Publications (2)

Publication Number Publication Date
CN108352116A (en) 2018-07-31
CN108352116B (en) 2022-04-05

Family

ID=57942863

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201680030600.XA Active CN108352116B (en) 2015-07-31 2016-07-20 Vehicle surrounding information management device

Country Status (5)

Country Link
US (2) US10493907B2 (en)
EP (1) EP3330946A4 (en)
JP (2) JP6567056B2 (en)
CN (1) CN108352116B (en)
WO (1) WO2017022475A1 (en)

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6332384B2 * 2016-09-29 2018-05-30 Mazda Motor Corporation Vehicle target detection system
US10663974B2 (en) * 2016-11-23 2020-05-26 Electronics And Telecommunications Research Institute Object recognition device, autonomous driving system including the same, and object recognition method using the object recognition device
EP3360747A1 (en) * 2017-02-13 2018-08-15 Autoliv Development AB Improvements in or relating to driver assistance systems
JP6661695B2 * 2018-05-09 2020-03-11 Mitsubishi Electric Corp Moving object detection device, vehicle control system, moving object detection method, and vehicle control method
DE102018133457B4 (en) * 2018-12-21 2020-07-09 Volkswagen Aktiengesellschaft Method and system for providing environmental data
EP3700796B1 (en) * 2018-12-26 2021-04-14 Baidu.com Times Technology (Beijing) Co., Ltd. Methods for obstacle filtering for non-nudge planning system in autonomous driving vehicle
CN113345269B * 2018-12-28 2022-11-08 Beijing Baidu Netcom Science and Technology Co., Ltd. Vehicle danger early warning method, device and equipment based on V2X vehicle networking cooperation
US11003195B2 (en) * 2019-02-28 2021-05-11 GM Global Technology Operations LLC Method to prioritize the process of receiving for cooperative sensor sharing objects
US10741070B1 (en) * 2019-03-04 2020-08-11 GM Global Technology Operations LLC Method to prioritize transmission of sensed objects for cooperative sensor sharing
JP7226544B2 * 2019-06-13 2023-02-21 Nissan Motor Co Ltd VEHICLE TRIP CONTROL METHOD AND TRIP CONTROL DEVICE
JP7363118B2 * 2019-06-14 2023-10-18 Mazda Motor Corporation External environment recognition device
WO2020262718A1 * 2019-06-25 2020-12-30 LG Electronics Inc. Method for transmitting sensing information for remote driving in automated vehicle & highway systems, and apparatus therefor
JP7307170B2 * 2019-07-10 2023-07-11 Hitachi Astemo, Ltd. SENSING PERFORMANCE EVALUATION DIAGNOSTIC SYSTEM OF EXTERNAL WORLD RECOGNITION SENSOR AND SENSING PERFORMANCE EVALUATION DIAGNOSTIC METHOD
DE102019216916A1 (en) * 2019-11-04 2021-05-06 Robert Bosch Gmbh Method for transmitting a message in a communication network for communication between a road user and at least one other road user
JP7392506B2 * 2020-02-13 2023-12-06 Aisin Corporation Image transmission system, image processing system and image transmission program
CN113705272A * 2020-05-20 2021-11-26 Huawei Technologies Co., Ltd. Method, device, equipment and storage medium for detecting travelable area
DE102021209575B3 (en) * 2021-08-31 2023-01-12 Volkswagen Aktiengesellschaft Method and assistance device for supporting vehicle functions in a parking space and motor vehicle

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09257924A (en) * 1996-03-19 1997-10-03 Mitsubishi Electric Corp Moving body monitoring method
JP2000310677A (en) * 1999-04-28 2000-11-07 Honda Motor Co Ltd Obstacle detector apparatus
JP2003217099A (en) * 2002-01-23 2003-07-31 Mitsubishi Electric Corp On-vehicle surrounding monitoring device
JP2004136816A (en) * 2002-10-18 2004-05-13 Denso Corp Control system for vehicle
JP2006285335A (en) * 2005-03-31 2006-10-19 Nissan Motor Co Ltd Obstacle detector and obstacle detection method
JP2009187351A (en) * 2008-02-07 2009-08-20 Fujitsu Ten Ltd Obstacle detecting device and obstacle detecting method
EP2093741A1 (en) * 2006-11-10 2009-08-26 Toyota Jidosha Kabushiki Kaisha Obstacle course predicting method, device and program
CN101585361A (en) * 2009-05-25 2009-11-25 郭文艺 Control device for preventing automobile from colliding and deviating roadway
WO2011101988A1 * 2010-02-22 2011-08-25 Toyota Motor Corp Risk degree calculation device
JP2011227582A (en) * 2010-04-15 2011-11-10 Toyota Motor Corp Vehicle collision prediction apparatus
CN104537889A (en) * 2014-12-30 2015-04-22 四川九洲电器集团有限责任公司 Anti-collision method and system under different vehicle conditions

Family Cites Families (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7202776B2 (en) * 1997-10-22 2007-04-10 Intelligent Technologies International, Inc. Method and system for detecting objects external to a vehicle
US7899616B2 (en) * 1997-10-22 2011-03-01 Intelligent Technologies International, Inc. Method for obtaining information about objects outside of a vehicle
JP2001052297A (en) * 1999-08-06 2001-02-23 Fujitsu Ltd Method and device for supporting safe travel and recording medium
JP4692831B2 * 2006-03-28 2011-06-01 Aisin AW Co., Ltd. Peripheral situation recognition device and method
JP5012396B2 2007-10-16 2012-08-29 Nissan Motor Co Ltd White line detection device, parking and stopping support device, and white line detection method
DE102008011228A1 (en) * 2008-02-26 2009-08-27 Robert Bosch Gmbh Method for assisting a user of a vehicle, control device for a driver assistance system of a vehicle and vehicle having such a control device
US8605947B2 (en) * 2008-04-24 2013-12-10 GM Global Technology Operations LLC Method for detecting a clear path of travel for a vehicle enhanced by object detection
JP2010076460A (en) * 2008-09-23 2010-04-08 Denso Corp Control device for blinker
US8352111B2 (en) * 2009-04-06 2013-01-08 GM Global Technology Operations LLC Platoon vehicle management
CN102449672B * 2009-06-02 2013-05-01 Toyota Motor Corp Vehicular peripheral surveillance device
JP5327321B2 2009-06-04 2013-10-30 Toyota Motor Corp Vehicle periphery monitoring device and vehicle periphery monitoring method
JP5197679B2 2010-06-09 2013-05-15 Toyota Central R&D Labs Inc Object detection apparatus and program
JP5206752B2 * 2010-08-30 2013-06-12 Denso Corp Driving environment recognition device
JP5338801B2 * 2010-12-23 2013-11-13 Denso Corp In-vehicle obstacle information notification device
US9146898B2 (en) 2011-10-27 2015-09-29 Magna Electronics Inc. Driver assist system with algorithm switching
US9324235B2 (en) * 2011-12-27 2016-04-26 Honda Motor Co., Ltd. Driving assistance system
US9412273B2 (en) * 2012-03-14 2016-08-09 Autoconnect Holdings Llc Radar sensing and emergency response vehicle detection
JP6015329B2 * 2012-10-11 2016-10-26 Denso Corp Convoy travel system and convoy travel device
KR101491256B1 2013-05-28 2015-02-06 Hyundai Motor Co Apparatus and Method for Recognizing for Traffic Lane using Wireless Communication
JP6441610B2 * 2013-10-30 2018-12-19 Denso Corp Travel control device and server
JP2015102349A * 2013-11-21 2015-06-04 Denso Corp Operation assist device
JP6327078B2 * 2013-12-23 2018-05-23 Denso Corp Driving assistance device
US9632502B1 (en) * 2015-11-04 2017-04-25 Zoox, Inc. Machine-learning systems and techniques to optimize teleoperation and/or planner decisions

Also Published As

Publication number Publication date
JP6804596B2 (en) 2020-12-23
JP6567056B2 (en) 2019-08-28
JP2019215893A (en) 2019-12-19
EP3330946A4 (en) 2019-07-03
JPWO2017022475A1 (en) 2018-02-08
US20200055447A1 (en) 2020-02-20
US20180154825A1 (en) 2018-06-07
CN108352116A (en) 2018-07-31
US10759336B2 (en) 2020-09-01
EP3330946A1 (en) 2018-06-06
WO2017022475A1 (en) 2017-02-09
US10493907B2 (en) 2019-12-03

Similar Documents

Publication Publication Date Title
CN108352116B (en) Vehicle surrounding information management device
US10783789B2 (en) Lane change estimation device, lane change estimation method, and storage medium
CN110366513B (en) Vehicle control system, vehicle control method, and storage medium
US10943133B2 (en) Vehicle control device, vehicle control method, and storage medium
US11004000B1 (en) Predicting trajectory intersection by another road user
CN110267856B (en) Vehicle control device, vehicle control method, and storage medium
US10796574B2 (en) Driving assistance method and device
US9550498B2 (en) Traffic light anticipation
US20190315348A1 (en) Vehicle control device, vehicle control method, and storage medium
KR102546343B1 (en) Driving support method and driving support device
US20200385020A1 (en) Vehicle control device, vehicle control method, and storage medium
CN110087964B (en) Vehicle control system, vehicle control method, and storage medium
CN110167811B (en) Vehicle control system, vehicle control method, and storage medium
JP6954469B2 (en) Driving support method and driving support device
CN110001641B (en) Vehicle control device, vehicle control method, and storage medium
CN110949376B (en) Vehicle control device, vehicle control method, and storage medium
US20190035278A1 (en) Driving Assistance Method and Device
US20190276029A1 (en) Vehicle control device, vehicle control method, and storage medium
US10854083B2 (en) Vehicle control device, vehicle control method, and storage medium
CN113085852A (en) Behavior early warning method and device for automatic driving vehicle and cloud equipment
JP6809087B2 (en) Driving support method and driving support device
JP2020125988A (en) Entrance lane estimation system, entrance lane estimation method, and entrance lane estimation program
CN109314763B (en) Method and device for estimating vehicle-to-vehicle distance
SE541480C2 (en) Method and system for estimating traffic flow
JP7291015B2 (en) Surrounding object recognition method and surrounding object recognition device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: Ibaraki

Applicant after: Hitachi Astemo, Ltd.

Address before: Ibaraki

Applicant before: HITACHI AUTOMOTIVE SYSTEMS, Ltd.

GR01 Patent grant