CN113479209A - Travel control device, vehicle, travel control method, and storage medium


Info

Publication number: CN113479209A
Application number: CN202110252626.1A
Authority: CN (China)
Language: Chinese (zh)
Prior art keywords: lane, map information, vehicle, recognition, travel control
Other versions: CN113479209B (granted publication)
Inventor: 田村祥
Assignee (current and original): Honda Motor Co Ltd
Legal status: Granted; Active

Classifications

    • B60W30/12 — Lane keeping (purposes of road vehicle drive control systems; path keeping)
    • B60W40/06 — Road conditions (estimation of driving parameters related to ambient conditions)
    • B60W60/001 — Planning or execution of driving tasks (drive control systems specially adapted for autonomous road vehicles)
    • B60W60/005 — Handover processes
    • B60W60/0059 — Estimation of the risk associated with autonomous or manual driving, e.g. situation too complex, sensor failure or driver incapacity
    • G06V20/588 — Recognition of the road, e.g. of lane markings; recognition of the vehicle driving pattern in relation to the road
    • B60W2420/403 — Image sensing, e.g. optical camera
    • B60W2552/53 — Road markings, e.g. lane marker or crosswalk
    • B60W2556/40 — High definition maps
    • B60W2556/50 — External transmission of positioning data to or from the vehicle, e.g. GPS [Global Positioning System] data

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Mathematical Physics (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
  • Navigation (AREA)

Abstract

The invention relates to a travel control device, a vehicle, a travel control method, and a storage medium, and improves the accuracy of travel control when map information and information acquired by a camera do not match. A recognition unit recognizes the lane in which a mobile body travels based on an image captured by a camera provided on the mobile body. A control unit performs travel control of the mobile body based on the recognition result of the recognition unit and map information of the periphery of the mobile body. A first determination unit determines whether a first angular difference between the lane recognized by the recognition unit and the lane based on the map information is continuously in a first state of being equal to or greater than a first threshold within a first range in front of the mobile body. A second determination unit determines whether a second angular difference between the lane recognized by the recognition unit and the lane based on the map information is in a second state of being equal to or greater than a second threshold within a second range that is in front of the mobile body and closer to the mobile body than the first range.

Description

Travel control device, vehicle, travel control method, and storage medium
Technical Field
The invention relates to a travel control device, a vehicle, a travel control method, and a storage medium.
Background
There is known a vehicle that recognizes the lane of the road on which it is traveling and performs travel control based on the recognized lane. Patent document 1 discloses a technique in which the presence or absence of a construction section is confirmed by a camera, and, when a construction section is confirmed, travel environment information for the construction section is acquired from a map or the like.
Documents of the prior art
Patent document
Patent document 1: japanese patent laid-open publication No. 2019-156195
Disclosure of Invention
Problems to be solved by the invention
Here, for example, the position of a dividing line may change before and after road construction, because the dividing line is redrawn after the construction. In such a case, if the change in the dividing line due to the construction is not reflected in the map information, the shape of the dividing line recognized by the camera may not match the shape of the dividing line based on the map information. That is, the map information may not match the information acquired by the camera. In such a case, there is room for improvement in the accuracy of the travel control.
The purpose of the present invention is to provide a technique for improving the accuracy of travel control when map information and information acquired by a camera do not match.
Means for solving the problems
According to an aspect of the present invention, there is provided a running control apparatus,
the travel control device includes:
a recognition unit that recognizes a lane on which a moving object travels, based on an image captured by a camera provided on the moving object;
a control means for performing travel control of the mobile body based on the recognition result of the recognition means and map information of the periphery of the mobile body;
a first determination unit configured to determine whether or not a first angular difference between the lane recognized by the recognition unit and the lane based on the map information is continuously in a first state equal to or larger than a first threshold value in a first range in front of the mobile object; and
a second determination unit that determines whether or not a second angular difference between the lane recognized by the recognition unit and the lane based on the map information is in a second state equal to or larger than a second threshold value within a second range that is ahead of the mobile body and closer to the mobile body side than the first range,
the control means performs the travel control in which the recognition result of the recognition means is prioritized over the map information when the first determination means determines that the first state exists, when the second determination means determines that the second state exists, or when both determinations are made.
In addition, according to another aspect of the present invention, there is provided a running control method,
the travel control method includes:
a recognition step of recognizing a lane on which a mobile body travels based on an image captured by a camera provided to the mobile body;
a control step of performing travel control of the mobile body based on a recognition result of the recognition step and map information of the periphery of the mobile body;
a first determination step of determining whether or not a first angular difference between the lane recognized by the recognition step and a lane based on map information on the periphery of the mobile object is continuously in a first state of being equal to or greater than a first threshold value in a first range in front of the mobile object; and
a second determination step of determining whether or not a second angular difference between the lane identified by the identification step and the lane based on the map information is in a second state equal to or larger than a second threshold value in a second range ahead of the mobile body and closer to the mobile body side than the first range,
in the control step, the travel control in which the recognition result of the recognition step is prioritized over the map information is performed when the first determination step determines that the first state exists, when the second determination step determines that the second state exists, or when both determinations are made.
In addition, according to other aspects of the present invention, there is provided a storage medium, which is a computer-readable storage medium,
the storage medium stores a program that causes a computer to function as:
a recognition unit that recognizes a lane on which a moving object travels, based on an image captured by a camera provided on the moving object;
a control means for performing travel control of the mobile body based on the recognition result of the recognition means and map information of the periphery of the mobile body;
a first determination unit configured to determine whether or not a first angular difference between the lane recognized by the recognition unit and the lane based on the map information is continuously in a first state equal to or larger than a first threshold value in a first range in front of the mobile object; and
a second determination unit that determines whether or not a second angular difference between the lane recognized by the recognition unit and the lane based on the map information is in a second state equal to or larger than a second threshold value within a second range that is ahead of the mobile body and closer to the mobile body side than the first range,
the control means performs the travel control in which the recognition result of the recognition means is prioritized over the map information when the first determination means determines that the first state exists, when the second determination means determines that the second state exists, or when both determinations are made.
Effects of the invention
According to the present invention, it is possible to improve the accuracy of travel control in the case where the map information and the information acquired by the camera do not match.
Drawings
Fig. 1 is a block diagram of a vehicle control device according to an embodiment.
Fig. 2 is a diagram illustrating switching of the travel mode by the control unit.
Fig. 3 (a) to 3 (c) are diagrams illustrating the shapes of the lane based on the map information and of the lane based on the information acquired by the camera.
Fig. 4 is a flowchart showing an example of processing by the control unit.
Fig. 5 (a) and 5 (b) are flowcharts showing an example of processing by the control unit.
Fig. 6 is a flowchart showing an example of processing by the control unit.
Description of the reference numerals
1: a vehicle; 2: a control unit; 20: an ECU.
Detailed Description
Hereinafter, embodiments will be described in detail with reference to the drawings. The following embodiments do not limit the invention according to the claims, and not all combinations of the features described in the embodiments are necessarily essential to the invention. Two or more of the features described in the embodiments may be combined arbitrarily. The same or similar components are denoted by the same reference numerals, and redundant description thereof is omitted.
< first embodiment >
Fig. 1 is a block diagram of a vehicle control device according to an embodiment of the present invention, which controls a vehicle 1. Fig. 1 shows an outline of the vehicle 1 in a plan view and a side view. As an example, the vehicle 1 is a sedan-type four-wheeled passenger vehicle. In the following description, left and right are defined with respect to the forward direction of the vehicle 1. Although a four-wheeled passenger vehicle is described as an example in the present embodiment, the configuration of the present embodiment can also be applied to a saddle-ride type vehicle such as a motorcycle or to other moving bodies capable of moving on a road.
The control device of fig. 1 comprises a control unit 2. The control unit 2 includes a plurality of ECUs 20 to 29 that are connected so as to be able to communicate using an in-vehicle network. Each ECU includes a processor typified by a CPU, a storage device such as a semiconductor memory, an interface with an external device, and the like. The storage device stores a program executed by the processor, data used by the processor in processing, and the like. Each ECU may include a plurality of processors, storage devices, interfaces, and the like. Instead of the above, each ECU may include an application specific integrated circuit such as an ASIC for executing processing performed by each ECU.
Hereinafter, functions and the like of the ECUs 20 to 29 will be described. The number of ECUs and the functions to be assigned to the ECUs can be appropriately designed, and can be further detailed or integrated than the present embodiment.
The ECU20 executes control related to automatic driving of the vehicle 1. In the autonomous driving, at least one of steering, acceleration, and deceleration of the vehicle 1 is automatically controlled. In a control example described later, the ECU20 automatically controls at least the steering of the vehicle 1 to thereby execute stop control of the vehicle 1. In this way, in a certain aspect, the ECU20 functions as a travel control device of the vehicle 1.
The ECU21 controls the electric power steering device 3. The electric power steering apparatus 3 includes a mechanism for steering the front wheels in accordance with a driving operation (steering operation) of the steering wheel 31 by the driver. The electric power steering apparatus 3 includes a motor that generates a driving force for assisting a steering operation or automatically steering front wheels, a sensor that detects a steering angle, and the like. When the driving state of the vehicle 1 is the automatic driving, the ECU21 automatically controls the electric power steering device 3 in accordance with an instruction from the ECU20 to control the traveling direction of the vehicle 1.
The ECUs 22 and 23 control the detection units 41 to 43 for detecting the surrounding conditions of the vehicle and process the detection results. The detection unit 41 is a camera (hereinafter, may be referred to as a camera 41) that captures an image of the front of the vehicle 1, and in the case of the present embodiment, is attached to the vehicle interior side of the front window at the front roof portion of the vehicle 1. By analyzing the image captured by the camera 41, the outline of the target object and the lane lines (white lines, etc.) on the road can be extracted.
The Detection unit 42 is a Light Detection and Ranging (LIDAR: optical radar) (hereinafter, may be referred to as an optical radar 42) and detects a target object around the vehicle 1 or measures a distance to the target object. In the present embodiment, five optical radars 42 are provided, one at each corner of the front portion of the vehicle 1, one at the center of the rear portion, and one at each side of the rear portion. The detection means 43 is a millimeter wave radar (hereinafter, may be referred to as a radar 43) and detects a target object around the vehicle 1 or measures a distance to the target object. In the present embodiment, five radars 43 are provided, one at the center of the front portion of the vehicle 1, one at each corner portion of the front portion, and one at each corner portion of the rear portion.
The ECU22 controls one of the cameras 41 and the optical radars 42 and performs information processing of detection results. The ECU23 controls the other camera 41 and each radar 43 and performs information processing of the detection results. By providing two sets of devices for detecting the surrounding conditions of the vehicle, the reliability of the detection result can be improved, and by providing different types of detection means such as a camera, a radar, and an optical radar, the surrounding environment of the vehicle can be analyzed in various ways.
The ECU24 controls the gyro sensor 5, the GPS sensor 24b, and the communication device 24c and processes the detection result or the communication result. The gyro sensor 5 detects a rotational motion of the vehicle 1. The course of the vehicle 1 can be determined from the detection result of the gyro sensor 5, the wheel speed, and the like. The GPS sensor 24b detects the current position of the vehicle 1. The communication device 24c wirelessly communicates with a server that provides map information and traffic information, and acquires these pieces of information. The ECU24 can access the database 24a of map information constructed in the storage device, and the ECU24 performs a route search from the current position to the destination, and the like.
The ECU25 includes a communication device 25a for vehicle-to-vehicle communication. The communication device 25a performs wireless communication with other vehicles in the vicinity to exchange information between the vehicles.
The ECU26 controls the power plant 6. The power plant 6 is a mechanism that outputs a driving force for rotating the driving wheels of the vehicle 1, and includes, for example, an engine and a transmission. For example, the ECU26 controls the output of the engine in accordance with the driver's driving operation (accelerator operation) detected by an operation detection sensor 7A provided at the accelerator pedal 7A, or switches the gear position of the transmission based on information such as the vehicle speed detected by a vehicle speed sensor 7c. When the driving state of the vehicle 1 is automatic driving, the ECU26 automatically controls the power plant 6 in response to instructions from the ECU20 to control acceleration and deceleration of the vehicle 1.
The ECU27 controls lighting devices (headlamps, tail lamps, etc.) including a direction indicator 8 (turn signal lamp). In the case of the example of fig. 1, the direction indicator 8 is provided at the front, door mirror, and rear of the vehicle 1.
The ECU28 controls the input/output device 9. The input/output device 9 outputs information to the driver and receives input of information from the driver. The sound output device 91 notifies the driver of information by sound. The display device 92 notifies the driver of information by displaying an image; it is disposed in front of the driver's seat and constitutes, for example, an instrument panel. Although sound and display are used as examples here, information may also be reported by vibration or light, and a plurality of sounds, displays, vibrations, or lights may be combined. Further, the combination or the manner of notification may differ depending on the level of the information to be reported (e.g., the degree of urgency).
The input device 93 is a switch group that is disposed at a position where the driver can operate and gives an instruction to the vehicle 1, but may include a voice input device.
The ECU29 controls the brake device 10 and a parking brake (not shown). The brake device 10 is, for example, a disc brake device provided on each wheel of the vehicle 1, and decelerates or stops the vehicle 1 by applying resistance to the rotation of the wheels. The ECU29 controls the operation of the brake device 10 in accordance with, for example, the driver's driving operation (braking operation) detected by an operation detection sensor 7B provided on the brake pedal 7B. When the driving state of the vehicle 1 is automatic driving, the ECU29 automatically controls the brake device 10 in response to instructions from the ECU20 to decelerate and stop the vehicle 1. The brake device 10 and the parking brake can also be operated to maintain the stopped state of the vehicle 1. In addition, when the transmission of the power plant 6 includes a parking lock mechanism, it can also be operated to maintain the stopped state of the vehicle 1.
< switching of drive mode >
Fig. 2 is a diagram showing mode switching in the travel control of the vehicle 1 by the control unit 2. In the present embodiment, the control unit 2 performs the travel control of the vehicle 1 while switching the control mode among the manual driving mode, the combination mode, and the camera priority mode.
The manual driving mode is a mode in which manual driving is performed by the driver. In other words, the manual driving mode is a control mode in which automatic control of steering and acceleration/deceleration of the vehicle 1 is not performed by the control unit 2. In this mode, for example, corresponding ECUs in the control unit 2 control the electric power steering device 3, the power unit 6, the brake device 10, and the like in accordance with operations of the steering wheel 31, the accelerator pedal 7A, and the brake pedal 7B by the driver.
The combination mode is a mode in which the travel control is performed using the map information and the information acquired from the camera 41, in a state in which these pieces of information match each other. Specifically, in this mode, the control unit 2 performs the travel control in a state in which matching is achieved between the map information acquired from the communication device 24c and the information of the lane recognized based on the captured image of the camera 41 or the like. In one embodiment, the control unit 2 may compare the shape of the lane in front of the vehicle 1 based on the captured image of the camera 41 or the like with the shape of the lane in front of the vehicle 1 based on the map information and the current position acquired from the GPS sensor 24b or the like. The control unit 2 may then determine that matching is achieved between the map information and the information acquired by the camera 41, that is, that the above pieces of information match, when the shapes coincide or when the difference between the shapes is within an allowable range.
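For illustration, the matching determination described above can be sketched as follows. This is a minimal sketch, not the patented implementation; the function name, the sampled-heading representation, and the tolerance value are assumptions.

```python
def lanes_match(camera_headings, map_headings, tolerance_deg=1.0):
    """Return True when the camera-based and map-based lane shapes agree.

    Both arguments are lane heading angles (degrees) sampled at the same
    longitudinal positions ahead of the vehicle; the shapes are considered
    to match when every sampled difference is within the tolerance.
    """
    return all(
        abs(cam - mp) <= tolerance_deg
        for cam, mp in zip(camera_headings, map_headings)
    )
```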
As the travel control performed by the control unit 2 in the combination mode, lane keeping control for keeping the vehicle 1 traveling in the center of the travel lane may be cited, for example. In the combination mode, the control unit 2 uses both the map information and the information acquired by the camera 41, and can therefore execute the travel control with high accuracy. Thus, in the combination mode, the control unit 2 can perform the travel control in a hands-off state in which the driver is not required to hold the steering wheel.
The camera priority mode is a mode in which the information acquired by the camera 41 is prioritized over the map information to perform travel control. For example, in this mode, the control unit 2 prioritizes the information acquired by the camera 41 and executes the travel control when the matching between the map information and the information acquired by the camera 41 cannot be achieved, when the map information cannot be acquired, or the like. As the travel control performed by the control unit 2 in the camera priority mode, for example, lane keeping control for controlling the vehicle 1 to travel in the center of the travel lane and the like can be cited. In the camera priority mode, the control unit 2 may execute the travel control in a grip state in which the driver is requested to grip the steering wheel.
Although not shown in fig. 2, the control unit 2 may have a travel mode in which priority is given to the map information when matching cannot be achieved between the map information and the information acquired by the camera 41, when a lane cannot be recognized by the camera, or the like.
In addition, when the matching between the map information and the information acquired by the camera 41 is not achieved, it is possible to appropriately set which of these pieces of information is prioritized. For example, when the deviation between the map information and the information acquired by the camera 41 is relatively small, it is considered that the road width is temporarily reduced by the construction, or the deviation occurs between the map information and the actual lane by newly dividing the lane after the construction. Thus, in such a case, the control unit 2 may also execute the travel control in the camera priority mode based on the current surrounding situation. On the other hand, when the deviation between the map information and the information acquired by the camera 41 is relatively large, the erroneous detection of the lane by the camera or the like is considered. Therefore, in such a case, the travel control may be performed based on the existing road information in the map priority mode.
In the present embodiment, the control unit 2 performs the travel control while switching the travel mode among the manual driving mode, the combination mode, and the camera priority mode. For example, in the manual driving mode, the control unit 2 may switch to the combination mode when there is an occupant operation for starting automatic driving, such as turning on a switch, and the map information matches the information acquired by the camera 41. Likewise, in the manual driving mode, the control unit 2 may switch to the camera priority mode when there is such an occupant operation and the dividing line can be recognized by the camera even though the map and the camera do not match. In the combination mode, when the map information and the information acquired by the camera 41 no longer match but the dividing line can still be recognized by the camera, the control unit 2 may switch to the camera priority mode. In the combination mode and the camera priority mode, the control unit 2 may switch to the manual driving mode when the camera can no longer recognize the dividing line. When switching from the combination mode or the camera priority mode to the manual driving mode, the control unit 2 may issue a request to the driver to take over manual driving (take-over request).
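The transitions above amount to a small state machine. The sketch below renders them under stated assumptions; the mode strings and the three predicates are illustrative names, not identifiers from the patent.

```python
def next_mode(mode, switch_on, map_matches_camera, camera_sees_lines):
    # Hypothetical state machine for the mode transitions described above.
    if mode == "manual" and switch_on:
        if map_matches_camera:
            return "combination"
        if camera_sees_lines:
            return "camera_priority"
    elif mode == "combination":
        if not camera_sees_lines:
            return "manual"  # accompanied by a take-over request to the driver
        if not map_matches_camera:
            return "camera_priority"
    elif mode == "camera_priority":
        if not camera_sees_lines:
            return "manual"  # accompanied by a take-over request to the driver
    return mode
```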
< description of the shape of a lane based on map information and information acquired by a camera >)
Fig. 3 (a) to 3 (c) are diagrams illustrating the shape of the lane based on the map information and on the information acquired by the camera 41. In the present embodiment, the control unit 2 compares, based on the map information and the information acquired by the camera 41, the shapes of the lane in a distant section B in front of the vehicle 1 (the host vehicle) and in a nearby section A that is in front of the vehicle 1 and closer to the vehicle 1 than the distant section B. The control unit 2 then determines whether the map information and the information acquired by the camera 41 match, based on the result of the comparison.
Fig. 3 (a) shows a state in which the map information matches the information acquired by the camera 41. In fig. 3 (a), the shapes of the left and right boundaries L, R of the actual road, the left and right lanes ML1 and MR1 based on the map information, and the left and right lanes CL1 and CR1 based on the information acquired by the camera 41 coincide with each other in both the nearby section A and the distant section B.
Fig. 3 (b) shows a state in which the map information and the information acquired by the camera 41 do not match in the distant section B. Specifically, in the nearby section A, the lanes ML2, MR2 based on the map information and the lanes CL2, CR2 based on the information acquired by the camera 41 coincide with each other, but in the distant section B, a deviation of Δθ1 (°) arises between the lanes ML2, MR2 and the lanes CL2, CR2 at positions beyond the point P1. More specifically, in fig. 3 (b), the lanes CL2 and CR2 follow the actual boundaries L, R, whereas the lanes ML2 and MR2 deviate from the actual boundaries L, R beyond the point P1 in the distant section B.
Fig. 3 (c) shows a state in which the map information and the information acquired by the camera 41 do not match in the nearby section A. Specifically, in the nearby section A, a deviation of Δθ2 (°) arises between the lanes ML3, MR3 based on the map information and the lanes CL3, CR3 based on the camera. More specifically, in fig. 3 (c), the lanes CL3 and CR3 follow the actual boundaries L, R, whereas the lanes ML3 and MR3 deviate from the actual boundaries L, R from the point P2 in the nearby section A through the distant section B.
In one embodiment, the nearby section A may be 0 to 30 m ahead of the vehicle, and the distant section B may be 30 to 100 m ahead. In one embodiment, the nearby section A and the distant section B may partially overlap each other. For example, the nearby section A may be 0 to 40 m ahead of the vehicle, and the distant section B may be 30 to 100 m ahead. That is, the nearby section A may include an area closer to the host vehicle than the distant section B. In the examples of fig. 3 (a) to 3 (c), an upper limit is set on the distance from the vehicle 1 to the far end of the distant section B, but such an upper limit need not be set; in that case, the recognition limit distance of the camera 41 effectively serves as the upper limit of the distance of the distant section B from the vehicle 1. From another viewpoint, the section between the vehicle 1 and a predetermined position in front of the vehicle 1 may correspond to the nearby section A, and the section farther than the predetermined position may correspond to the distant section B.
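Under the example distances above, the two sections can be represented as follows; the bounds, names, and sample format are assumptions for illustration only.

```python
NEARBY_SECTION = (0.0, 30.0)     # section A: 0 to 30 m ahead of the vehicle
DISTANT_SECTION = (30.0, 100.0)  # section B: 30 to 100 m ahead (alternatively
                                 # capped by the camera's recognition limit)

def split_by_section(samples):
    """Split (distance_m, angle_diff_deg) samples into sections A and B."""
    near = [s for s in samples if NEARBY_SECTION[0] <= s[0] <= NEARBY_SECTION[1]]
    far = [s for s in samples if DISTANT_SECTION[0] <= s[0] <= DISTANT_SECTION[1]]
    return near, far
```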
Note that, although dividing lines are shown as the left and right boundaries L, R of the actual road in fig. 3 (a) to 3 (c), the constituent elements of a lane are not limited to dividing lines and may be road boundaries such as curbs and guardrails.
In the present embodiment, the control unit 2 determines whether the map information and the information acquired by the camera 41 match in each of the nearby section A and the distant section B, and selects the travel mode to be used in the travel control of the vehicle 1 based on the determination results. A processing example is described below.
< example of processing by control Unit >
Fig. 4 is a flowchart showing an example of processing performed by the control unit 2, namely an example of processing for selecting the travel mode of the vehicle 1. More specifically, it shows a processing example in which the control unit 2 switches between the combination mode and the camera priority mode according to the situation while performing automatic driving control of the vehicle 1. The control unit 2 can execute this processing periodically during execution of the automatic driving control.
The processing of fig. 4 is realized, for example, by the processor of each ECU of the control unit 2 executing a program stored in each ECU. Alternatively, at least a portion of each step may be performed by dedicated hardware (e.g., a circuit).
In S1, the ECU22 performs a dividing line recognition process. For example, the ECU22 recognizes the dividing lines of the road on which the vehicle 1 travels based on the detection result of the camera 41 or the like. The ECU22 acquires various information such as the line type, width, and angle of each dividing line based on the recognition result. The ECU22 can also recognize road boundaries such as guardrails and curbs in addition to the dividing lines.
In S2, the ECU24 acquires map information of the periphery of the vehicle 1 via the communication device 24 c. More specifically, the ECU24 acquires various information such as the line type, width, and angle of a dividing line of a road on which the vehicle 1 is traveling. In addition, the ECU24 can acquire the current position of the vehicle 1 from the GPS sensor 24 b. The map information is not limited to being acquired by the communication device 24c, and may be map information stored in a storage device or the like in the vehicle 1.
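The inputs gathered in S1 and S2 can be pictured with the following containers. The field names mirror the attributes mentioned in the text (line type, width, angle) but are assumptions, not the patent's data layout.

```python
from dataclasses import dataclass

@dataclass
class DividingLine:
    line_type: str    # e.g. "solid" or "dashed"
    width_m: float    # width in meters
    angle_deg: float  # heading angle of the line

@dataclass
class CycleInputs:
    camera_lines: list  # S1: dividing lines recognized from the camera image
    map_lines: list     # S2: dividing lines taken from the map information
    position: tuple     # current position of the vehicle from the GPS sensor
```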
In S3, the ECU20 determines the matching state of the map information and the information acquired by the camera 41 based on the processing of S1 and S2. Then, in S4, the ECU20 selects the travel mode based on the process of S3, and ends the processing cycle once. Details of S3 and S4 will be described later.
Fig. 5 (a) and 5 (b) are flowcharts showing subroutines of the process of S3 in fig. 4. The control unit 2 can execute the processing of fig. 5 (a) and fig. 5 (b) sequentially or in parallel.
First, the process of fig. 5 (a) will be described.
In S301, the ECU20 confirms, based on the information acquired in the processes of S1 and S2, whether the angle difference Δθ1 between the lane based on the map information and the lane based on the information acquired by the camera 41 in the distant section B is continuously in a state of being equal to or greater than the threshold T1. The ECU20 proceeds to S302 if the angle difference Δθ1 is continuously equal to or greater than the threshold T1, and otherwise proceeds to S303. For example, in the case of fig. 3 (b), the ECU20 confirms whether the angle difference Δθ1 between the lanes ML2, MR2 and the lanes CL2, CR2 is continuously equal to or greater than the threshold T1. The threshold T1 may be, for example, 1.0° to 3.0°; it may be 1.5°.
In one embodiment, the ECU20 may determine that the angle difference Δθ1 is continuously equal to or greater than the threshold T1 when that state continues for a predetermined time, for example 0.5 to 3 seconds.
In one embodiment, the ECU20 may determine that the angle difference Δθ1 is continuously equal to or greater than the threshold T1 when the vehicle 1 continues to travel a predetermined distance with the angle difference Δθ1 equal to or greater than the threshold T1, for example 5 to 30 m or more; in particular, 15 m or more may be used.
In one embodiment, the ECU20 may determine that the angle difference Δθ1 is continuously equal to or greater than the threshold T1 when the angle difference Δθ1 is equal to or greater than the threshold T1 over a portion of the lane within the distant section B that has at least a predetermined length. In the example of fig. 3 (b), it may be determined that the angle difference Δθ1 is continuously equal to or greater than the threshold T1 when the length of the portion X of the lane in the distant section B in which the angle difference Δθ1 is equal to or greater than the threshold T1 is 5 to 30 m or more.
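As one concrete reading of the duration-based criterion, the distant-section flag could be accumulated across control cycles as below. The 1.5° threshold and 1.0 s hold time follow the example values in the text; the function name and the 0.1 s control period are assumptions.

```python
def update_far_state(held_s, delta_theta1_deg, dt_s=0.1,
                     t1_deg=1.5, hold_s=1.0):
    """Track how long |delta_theta1| has stayed at or above T1.

    Returns the updated accumulated time and whether the 'first state'
    (continuous exceedance) is considered established.
    """
    held_s = held_s + dt_s if abs(delta_theta1_deg) >= t1_deg else 0.0
    return held_s, held_s >= hold_s
```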
In S302, the ECU20 determines that the map information and the information acquired by the camera 41 are in a non-matching state in the distant section B, and ends the subroutine. In S303, the ECU20 determines that they are in a matching state in the distant section B, and ends the subroutine.
Next, the process of fig. 5 (b) will be described.
In S311, the ECU20 confirms, based on the information acquired in the processes of S1 and S2, whether the angle difference Δθ2 between the lane based on the map information and the lane based on the information acquired by the camera 41 in the nearby section A is equal to or greater than the threshold T2. If the angle difference is equal to or greater than the threshold, the ECU20 proceeds to S312; otherwise, it proceeds to S313. For example, in the case of fig. 3 (c), the ECU20 confirms whether the angle difference Δθ2 between the lanes ML3, MR3 and the lanes CL3, CR3 is equal to or greater than the threshold T2. In one embodiment, the threshold T2 may be, for example, 1.0° to 5.0°; it may be 3.0°.
In S312, the ECU20 determines that the map information and the information acquired by the camera 41 are in a non-matching state in the nearby section A, and ends the subroutine. In S313, the ECU20 determines that they are in a matching state in the nearby section A, and ends the subroutine.
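By contrast, the nearby-section check of S311 to S313 is instantaneous, as a sketch makes plain; the 3.0° default follows the example value for T2.

```python
def near_mismatch(delta_theta2_deg, t2_deg=3.0):
    # A single cycle at or above T2 suffices in the nearby section A;
    # no continuity tracking is applied, unlike the distant section B.
    return abs(delta_theta2_deg) >= t2_deg
```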
Comparing the process of fig. 5 (a) with that of fig. 5 (b), the ECU20 determines a non-matching state in the nearby section A as soon as the angle difference Δθ2 is equal to or greater than the threshold T2, whereas in the distant section B it determines a non-matching state only when the angle difference Δθ1 is continuously equal to or greater than the threshold T1. This allows the travel mode to be switched appropriately according to the position at which the map information and the information acquired by the camera 41 fail to match.
More specifically, in the distant section B, considering the continuity of the state in which the angle difference Δθ1 is equal to or greater than the threshold T1 suppresses the influence of erroneous detection and the like, since detection accuracy in the distant section B may be lower than in the nearby section A, and thereby enables more appropriate mode switching. In the nearby section A, on the other hand, switching the mode earlier ensures that the mode is switched reliably before the vehicle 1 reaches the position where the mismatch occurs.
In one embodiment, the threshold T2 for the nearby section A may be set larger than the threshold T1 for the distant section B. Since continuity of the state in which the angle difference Δθ2 is equal to or greater than the threshold T2 is not considered in the nearby section A, making the threshold T2 larger than the threshold T1 can suppress erroneous determinations and the like.
In the present embodiment, the ECU20 determines the matching state by checking, for both the left and right lane boundaries, whether the angle difference between the lane based on the map information and the lane based on the information acquired by the camera 41 is equal to or greater than the threshold. However, the determination may instead be made for only one of the left and right boundaries. In one embodiment, the ECU20 may target both of the left and right boundaries in the distant section B and only one of them in the nearby section A. In the distant section B, targeting both boundaries allows a more accurate determination, while in the nearby section A, targeting only one boundary allows an earlier determination.
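The asymmetric left/right strategy can be stated in two lines; the predicate names are assumptions.

```python
def far_section_diverges(left_exceeds_t1, right_exceeds_t1):
    # Distant section B: require both boundaries to diverge (higher confidence).
    return left_exceeds_t1 and right_exceeds_t1

def near_section_diverges(left_exceeds_t2, right_exceeds_t2):
    # Nearby section A: one diverging boundary is enough (earlier reaction).
    return left_exceeds_t2 or right_exceeds_t2
```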
Fig. 6 is a flowchart showing a subroutine of the process of S4 of fig. 4.
In S401, the ECU20 confirms whether the map information and the information acquired by the camera 41 are in the matching state in the distant section B based on the processing in S3, and proceeds to S402 if the matching state is established, and proceeds to S404 if the non-matching state is established.
In S402, the ECU20 confirms whether the map information and the information acquired by the camera 41 are in the matching state in the nearby section a based on the processing in S3, and proceeds to S403 if the map information and the information are in the matching state, and proceeds to S404 if the map information and the information are in the non-matching state.
In S403, the ECU20 selects the combination mode, and ends the subroutine. In S404, the ECU20 selects the camera priority mode, and ends the subroutine.
According to the processing of fig. 6, the ECU20 selects the combination mode when the map information and the information acquired by the camera 41 are in a matching state in both the nearby section A and the distant section B. On the other hand, the ECU20 selects the camera priority mode when they are in a non-matching state in at least one of the nearby section A and the distant section B.
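The selection logic of S401 to S404 reduces to the following sketch; the mode strings are illustrative.

```python
def select_mode(far_matches, near_matches):
    if far_matches and near_matches:
        return "combination"      # S403: both sections match
    return "camera_priority"      # S404: mismatch in at least one section
```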
As described above, in the present embodiment, the control unit 2 determines whether the angle difference Δθ1 is continuously equal to or greater than the threshold T1 in the distant section B, and determines whether the angle difference Δθ2 is equal to or greater than the threshold T2 in the nearby section A. Based on these determination results, the control unit 2 performs travel control that gives priority to the lane recognition result of the camera 41. This can improve the accuracy of the travel control when the information acquired by the camera does not match the map information.
< summary of the embodiments >
The above embodiment discloses at least the following travel control device, vehicle, travel control method, and program.
1. The travel control device (for example 2) of the above embodiment includes:
a recognition means (e.g., 22, S1) that recognizes a lane in which a moving body travels, based on an image captured by a camera provided to the moving body;
control means (e.g., 20, S4) for performing travel control of the mobile object based on the recognition result of the recognition means and map information of the periphery of the mobile object;
first determination means (e.g., 20, S301 to S303) for determining whether or not a first angular difference (e.g., Δθ1) between the lane recognized by the recognition means and the lane based on the map information is continuously in a first state equal to or greater than a first threshold value (e.g., T1) within a first range (e.g., B) in front of the mobile body; and
a second determination means (e.g., 20, S311 to S313) for determining whether or not a second angular difference (e.g., Δθ2) between the lane recognized by the recognition means and the lane based on the map information is in a second state equal to or greater than a second threshold (e.g., T2) in a second range (e.g., A) ahead of the mobile body and closer to the mobile body than the first range,
the control means performs travel control in which the recognition result of the recognition means is prioritized over the map information when the first determination means determines that the first state exists, when the second determination means determines that the second state exists, or when both determinations are made (for example, S401 to S404).
According to this embodiment, the first determination means and the second determination means each determine, under different conditions, the state of agreement between the lane based on the map information and the lane based on the information acquired by the camera 41. The control means can therefore select the travel mode more appropriately according to the condition of the road, and the accuracy of travel control when the map information and the information acquired by the camera do not match can be improved.
2. According to the above-described embodiment of the present invention,
the first determination means and the second determination means perform determination with respect to one of right and left constituent elements (e.g., L, R) constituting a lane.
According to this embodiment, the first determination means and the second determination means perform determination with respect to one of the left and right constituent elements, and thus the determination can be performed more easily.
3. According to the above-described embodiment of the present invention,
the first determination means performs determination with both of the left and right constituent elements constituting the lane as targets,
the second determination means performs determination with respect to one of the left and right constituent elements constituting the lane.
According to this embodiment, since the first determination means targets both of the left and right components, it is possible to perform determination with higher accuracy in the distant section. Further, since the second determination means targets one of the left and right components, it is possible to perform determination earlier in the nearby section.
4. According to the above-described embodiment of the present invention,
the left and right components constituting the lane are a dividing line (e.g., L, R) or a road boundary.
According to this embodiment, the determination can be made for a dividing line or a road boundary.
5. According to the above-described embodiment of the present invention,
the first determination unit determines that the first state is present when a situation in which the first angle difference is equal to or greater than the first threshold value continues for a predetermined time, when the first angle difference is equal to or greater than the first threshold value continuously while the mobile object travels a predetermined distance, or when the first angle difference in a portion (for example, X) of the lane within the first range that is equal to or greater than a predetermined length is equal to or greater than the first threshold value.
According to this embodiment, whether or not the first state is present can be determined based on the travel time of the mobile body, the travel distance of the mobile body, or the length of the lane in which the first angle difference is equal to or greater than the first threshold value.
6. According to the above-described embodiment of the present invention,
the second threshold is greater than the first threshold.
According to this embodiment, the second determination means for determining in the vicinity section does not consider the continuity of the state in which the second angle difference is equal to or greater than the second threshold value, and therefore, by making the second threshold value larger than the first threshold value, erroneous determination and the like can be suppressed.
7. The vehicle (e.g., 1) of the above embodiment is equipped with the travel control device (e.g., 2) of any one of items 1 to 6 above.
According to this embodiment, a vehicle is provided that is capable of improving the accuracy of travel control in the case where the map information and the information acquired by the camera do not match.
8. The travel control method of the above embodiment includes:
a recognition step (e.g., S1) of recognizing a lane in which a moving body travels, based on an image captured by a camera provided to the moving body;
a control step (e.g., S4) of performing travel control of the moving body based on a recognition result of the recognition step and map information of the periphery of the moving body;
a first determination step (e.g., S301 to S303) of determining whether or not a first angle difference between the lane identified by the identification step and the lane based on the map information is continuously in a first state of being equal to or greater than a first threshold value in a first range in front of the mobile object; and
a second determination step (e.g., S311-S313) of determining whether or not a second angular difference between the lane identified by the identification step and the lane based on the map information is in a second state equal to or larger than a second threshold value in a second range ahead of the mobile body and closer to the mobile body side than the first range,
in the control step, the travel control in which the recognition result of the recognition step is prioritized over the map information is performed when the first determination step determines that the first state exists, when the second determination step determines that the second state exists, or when both determinations are made (for example, S401 to S404).
According to this embodiment, there is provided a travel control method capable of improving the accuracy of travel control in the case where map information and information acquired by a camera do not match.
9. The computer-readable storage medium of the above embodiment stores a program for causing a computer to function as:
a recognition means (e.g., S1) that recognizes a lane on which a moving body travels, based on an image captured by a camera provided to the moving body;
control means (e.g., S4) for performing travel control of the mobile body based on the recognition result of the recognition means and map information;
a first determination means (e.g., S301 to S303) that determines whether or not a first angular difference between the lane recognized by the recognition means and the lane based on the map information is continuously in a first state of being equal to or greater than a first threshold value in a first range ahead of the mobile body; and
a second determination means (e.g., S311 to S313) for determining whether or not a second angular difference between the lane recognized by the recognition means and the lane based on the map information is in a second state equal to or greater than a second threshold value in a second range ahead of the mobile body and closer to the mobile body side than the first range,
the control means performs travel control in which the recognition result of the recognition means is prioritized over the map information when the first determination means determines that the first state exists, when the second determination means determines that the second state exists, or when both determinations are made (for example, S401 to S404).
According to this embodiment, a program capable of improving the accuracy of travel control in the case where the map information and the information acquired by the camera do not match is provided.
The present invention is not limited to the above-described embodiments, and various modifications and changes can be made within the scope of the present invention.

Claims (9)

1. A travel control device is characterized in that,
the travel control device includes:
a recognition unit that recognizes a lane on which a moving object travels, based on an image captured by a camera provided on the moving object;
a control means for performing travel control of the mobile body based on the recognition result of the recognition means and map information of the periphery of the mobile body;
a first determination unit configured to determine whether or not a first angular difference between the lane recognized by the recognition unit and the lane based on the map information is continuously in a first state equal to or larger than a first threshold value in a first range in front of the mobile object; and
a second determination unit that determines whether or not a second angular difference between the lane recognized by the recognition unit and the lane based on the map information is in a second state equal to or larger than a second threshold value within a second range that is ahead of the mobile body and closer to the mobile body side than the first range,
the control means performs the travel control in which the recognition result of the recognition means is prioritized over the map information when the first determination means determines that the first state exists, when the second determination means determines that the second state exists, or when both determinations are made.
2. The travel control device according to claim 1, wherein the first determination unit and the second determination unit perform the determination with respect to one of the left and right constituent elements that constitute the lane.
3. The travel control device according to claim 1, wherein
the first determination unit performs the determination with respect to both of the left and right constituent elements that constitute the lane, and
the second determination unit performs the determination with respect to one of the left and right constituent elements that constitute the lane.
4. The travel control device according to claim 2 or 3, wherein the left and right constituent elements that constitute the lane are dividing lines or road boundaries.
5. The travel control device according to any one of claims 1 to 3, wherein the first determination unit determines the first state when the first angular difference remains equal to or greater than the first threshold value for a predetermined time, when the first angular difference remains equal to or greater than the first threshold value while the mobile body travels a predetermined distance, or when the first angular difference is equal to or greater than the first threshold value over a portion of the lane within the first range that has at least a predetermined length.
6. The travel control device according to any one of claims 1 to 3, characterized in that the second threshold value is larger than the first threshold value.
7. A vehicle on which the travel control device according to any one of claims 1 to 3 is mounted.
8. A travel control method, characterized in that
the travel control method comprises:
a recognition step of recognizing a lane on which a mobile body travels, based on an image captured by a camera provided on the mobile body;
a control step of performing travel control of the mobile body based on the recognition result of the recognition step and map information of the periphery of the mobile body;
a first determination step of determining whether or not a first angular difference between the lane recognized in the recognition step and the lane based on the map information is continuously in a first state of being equal to or greater than a first threshold value in a first range ahead of the mobile body; and
a second determination step of determining whether or not a second angular difference between the lane recognized in the recognition step and the lane based on the map information is in a second state of being equal to or greater than a second threshold value in a second range that is ahead of the mobile body and closer to the mobile body than the first range,
wherein in the control step, the travel control in which the recognition result of the recognition step is prioritized over the map information is performed when at least one of the following holds: the first determination step determines the first state; the second determination step determines the second state.
9. A storage medium, which is a computer-readable storage medium, characterized in that
the storage medium stores a program that causes a computer to function as:
a recognition unit that recognizes a lane on which a mobile body travels, based on an image captured by a camera provided on the mobile body;
a control unit that performs travel control of the mobile body based on the recognition result of the recognition unit and map information of the periphery of the mobile body;
a first determination unit that determines whether or not a first angular difference between the lane recognized by the recognition unit and the lane based on the map information is continuously in a first state of being equal to or greater than a first threshold value in a first range ahead of the mobile body; and
a second determination unit that determines whether or not a second angular difference between the lane recognized by the recognition unit and the lane based on the map information is in a second state of being equal to or greater than a second threshold value in a second range that is ahead of the mobile body and closer to the mobile body than the first range,
wherein the control unit performs the travel control in which the recognition result of the recognition unit is prioritized over the map information when at least one of the following holds: the first determination unit determines the first state; the second determination unit determines the second state.
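Claims 2 to 4 vary which of the left and right constituent elements (dividing lines or road boundaries) each determination targets. A sketch of that per-side selection, assuming per-side heading angles in radians are available; the Side type, the dictionary layout, and the default side are assumptions for illustration only.

```python
from typing import Literal

Side = Literal["left", "right"]

def angular_difference(recognized_heading: float, map_heading: float) -> float:
    """Absolute angular difference (radians) between a camera-recognized
    lane element and the corresponding map-based element."""
    return abs(recognized_heading - map_heading)

def first_state_both_sides(recognized: dict, mapped: dict,
                           first_threshold: float) -> bool:
    """Claim 3 first determination: both left and right constituent
    elements (dividing lines or road boundaries, claim 4) must diverge."""
    return all(angular_difference(recognized[s], mapped[s]) >= first_threshold
               for s in ("left", "right"))

def second_state_one_side(recognized: dict, mapped: dict,
                          second_threshold: float, side: Side = "left") -> bool:
    """Claims 2 and 3 second determination: one chosen side is enough;
    the default side here is an arbitrary assumption."""
    return angular_difference(recognized[side], mapped[side]) >= second_threshold
```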
CN202110252626.1A 2020-03-17 2021-03-09 Travel control device, vehicle, travel control method, and storage medium Active CN113479209B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-046811 2020-03-17
JP2020046811A JP7377143B2 (en) 2020-03-17 2020-03-17 Travel control device, vehicle, travel control method and program

Publications (2)

Publication Number Publication Date
CN113479209A (en) 2021-10-08
CN113479209B (en) 2023-12-22

Family

ID=77747399

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110252626.1A Active CN113479209B (en) 2020-03-17 2021-03-09 Travel control device, vehicle, travel control method, and storage medium

Country Status (3)

Country Link
US (1) US11938933B2 (en)
JP (1) JP7377143B2 (en)
CN (1) CN113479209B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7449971B2 (en) 2022-03-25 2024-03-14 本田技研工業株式会社 Vehicle control device, vehicle control method, and program
JP7425133B1 (en) 2022-08-10 2024-01-30 本田技研工業株式会社 Vehicle control device, vehicle control method, and program
JP7425132B1 (en) 2022-08-10 2024-01-30 本田技研工業株式会社 Vehicle control device, vehicle control method, and program
JP7433382B1 (en) 2022-08-12 2024-02-19 本田技研工業株式会社 Vehicle control device, vehicle control method, and program

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100138115A1 (en) * 2008-11-28 2010-06-03 Toyota Jidosha Kabushiki Kaisha On-board apparatus and method used by said apparatus
CN105984465A (en) * 2015-03-23 2016-10-05 富士重工业株式会社 Travel control apparatus for vehicle
CN106767853A * 2016-12-30 2017-05-31 中国科学院合肥物质科学研究院 High-precision positioning method for autonomous vehicles based on multi-information fusion
CN108139217A * 2015-09-30 2018-06-08 日产自动车株式会社 Travel control method and travel control device
JP2019038289A * 2017-08-22 2019-03-14 株式会社Subaru Vehicle driving support device
CN110120084A * 2019-05-23 2019-08-13 广东星舆科技有限公司 Method for generating lane lines and road surfaces

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9123152B1 (en) * 2012-05-07 2015-09-01 Google Inc. Map reports from vehicles in the field
CN103682676B * 2012-09-03 2015-12-09 万都株式会社 Antenna device and radar device for improved radiation efficiency
US9483700B1 (en) * 2015-05-13 2016-11-01 Honda Motor Co., Ltd. System and method for lane vehicle localization with lane marking detection and likelihood scoring
US10853680B2 (en) * 2015-07-14 2020-12-01 Panasonic Intellectual Property Management Co., Ltd. Identification medium recognition device and identification medium recognition method
DE102017215406A1 (en) * 2017-09-04 2019-03-07 Bayerische Motoren Werke Aktiengesellschaft A method, mobile user device, computer program for generating visual information for at least one occupant of a vehicle
JP6981850B2 (en) * 2017-11-09 2021-12-17 株式会社Soken Driving support system
CN108297866B * 2018-01-03 2019-10-15 西安交通大学 Lane keeping control method for a vehicle
US10990832B2 (en) * 2018-03-06 2021-04-27 Phantom AI, Inc. Lane line reconstruction using future scenes and trajectory
JP6754386B2 (en) 2018-03-14 2020-09-09 本田技研工業株式会社 Vehicle control device
WO2020008221A1 (en) 2018-07-04 2020-01-09 日産自動車株式会社 Travel assistance method and travel assistance device
JP2020026985A (en) 2018-08-09 2020-02-20 株式会社豊田中央研究所 Vehicle position estimation device and program
US11117576B2 (en) * 2019-02-04 2021-09-14 Denso Corporation Vehicle lane trace control system
JP2021077099A (en) * 2019-11-08 2021-05-20 トヨタ自動車株式会社 Information provision system

Also Published As

Publication number Publication date
JP2021149321A (en) 2021-09-27
US20210291830A1 (en) 2021-09-23
US11938933B2 (en) 2024-03-26
JP7377143B2 (en) 2023-11-09
CN113479209B (en) 2023-12-22

Similar Documents

Publication Publication Date Title
CN110281930B (en) Vehicle control device, vehicle control method, and storage medium
CN110001643B (en) Vehicle control device, vehicle control method, storage medium, and information acquisition device
CN110599788B (en) Automatic driving system and control method of automatic driving system
CN113479209B (en) Travel control device, vehicle, travel control method, and storage medium
CN111532267B (en) Vehicle, control device for vehicle, and control method for vehicle
CN109421712B (en) Vehicle control device, vehicle control method, and storage medium
CN109917783B (en) Driving support device
CN111731295B (en) Travel control device, travel control method, and storage medium storing program
CN111587206B (en) Vehicle control device, vehicle having the same, and control method
CN111434551B (en) Travel control device, travel control method, and storage medium storing program
US20210139019A1 (en) Driving assistance apparatus
CN113511196A (en) Vehicle and control device thereof
CN111731318A (en) Vehicle control device, vehicle control method, vehicle, and storage medium
CN113386788A (en) Control device and vehicle
CN112046474B (en) Vehicle control device, method for operating same, vehicle, and storage medium
CN113320530A (en) Travel control device, vehicle, travel control method, and storage medium
CN113060140A (en) Path planning before lane change based on center line displacement
CN112046476A (en) Vehicle control device, method for operating vehicle control device, vehicle, and storage medium
US20230075153A1 (en) Arithmetic device
US11654931B2 (en) Driving assistance device and vehicle
CN113370972B (en) Travel control device, travel control method, and computer-readable storage medium storing program
CN112046478B (en) Vehicle control device, method for operating same, vehicle, and storage medium
CN113511195A (en) Vehicle and control device thereof
CN113386749A (en) Travel control device, vehicle, travel control method, and storage medium
JP2020200029A (en) Vehicle control apparatus, vehicle, method for operating vehicle control apparatus, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant