CN116923420A - Vehicle control system and method for facilitating user getting on and off - Google Patents


Info

Publication number
CN116923420A
CN116923420A
Authority
CN
China
Prior art keywords
vehicle
location
user
obstacle
surrounding environment
Prior art date
Legal status
Pending
Application number
CN202210367461.7A
Other languages
Chinese (zh)
Inventor
蔡剑飞
曹靓
Current Assignee
Mobility Asia Smart Technology Co Ltd
Original Assignee
Mobility Asia Smart Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Mobility Asia Smart Technology Co Ltd filed Critical Mobility Asia Smart Technology Co Ltd
Priority to CN202210367461.7A
Priority to DE102023108328.7A
Publication of CN116923420A
Legal status: Pending

Classifications

    • B60W60/00 Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001 Planning or execution of driving tasks
    • B60W60/0013 Planning or execution of driving tasks specially adapted for occupant comfort
    • B60W60/0025 Planning or execution of driving tasks specially adapted for specific operations
    • B60W60/00253 Taxi operations
    • B60W40/02 Estimation or calculation of non-directly measurable driving parameters related to ambient conditions
    • B60W40/06 Road conditions
    • B60K35/22 Display screens
    • B60K35/28 Output arrangements characterised by the type or purpose of the output information
    • B60W30/06 Automatic manoeuvring for parking
    • B60W30/18109 Braking (propelling the vehicle in particular drive situations)
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/0098 Details of control systems ensuring comfort, safety or stability not otherwise provided for
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • B60K2360/173 Reversing assist
    • B60K2360/175 Autonomous driving
    • B60K2360/177 Augmented reality
    • B60K2360/178 Warnings
    • B60K2360/1868 Displaying information according to relevancy according to driving situations
    • B60W2050/0002 Automatic control, details of type of controller or control system architecture
    • B60W2050/0062 Adapting control system settings
    • B60W2050/146 Display means
    • B60W2400/00 Indexing codes relating to detected, measured or calculated conditions or factors
    • B60W2420/403 Image sensing, e.g. optical camera
    • B60W2540/01 Occupants other than the driver
    • B60W2540/041 Potential occupants
    • B60W2540/215 Selection or confirmation of options
    • B60W2552/50 Barriers (input parameters relating to infrastructure)
    • B60W2556/45 External transmission of data to or from the vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present invention relates to a system and method for controlling a vehicle to park at a location that is convenient for a user to board or alight. The system comprises: a driving control system for controlling the running of the vehicle; and an obstacle recognition system for recognizing, in the surrounding environment of the vehicle, obstacles that would hinder a user boarding or alighting. The driving control system is further configured to park the vehicle at a location that avoids the identified obstacles. Embodiments of the invention can ensure user safety and improve the user's boarding and alighting experience.

Description

Vehicle control system and method for facilitating user getting on and off
Technical Field
The present invention relates to vehicle control technology, and more particularly to a system and method for controlling a vehicle to park at a position that is convenient for a user to board or alight.
Background
At present, when a passenger boards or alights, the driver typically looks for a suitable place to stop the vehicle. However, the chosen parking position may be inconvenient for the passenger, because the driver's vision is limited (for example, at night) and the driver's ability to judge the actual distance to an obstacle is limited. An automated driving system may stop at a preset pick-up or drop-off point, but if a condition unsuitable for boarding or alighting arises there, for example a puddle or mud pit on the ground, an open manhole cover, or an obstacle beside the door (such as another vehicle parked too close to the parking space), boarding or alighting becomes very inconvenient and gives the passenger a very poor experience. In such cases, the vehicle cannot dynamically adjust its parking position according to the real-time changing surroundings to improve the passenger's boarding or alighting experience.
Disclosure of Invention
The present invention provides a system and method for controlling a vehicle to stop at a position that is convenient for a user to board or alight, thereby ensuring user safety and improving user experience.
An embodiment of the present invention provides a system for controlling a vehicle, the system comprising: a driving control system configured to control the running of the vehicle; and an obstacle recognition system configured to recognize, in the surrounding environment of the vehicle, obstacles that would hinder a user boarding or alighting; wherein the driving control system is further configured to park the vehicle at a location that avoids the identified obstacles.
Embodiments of the present invention also provide a method for controlling a vehicle to park at a location that is convenient for a user to board or alight, the method comprising: identifying, in the surrounding environment of the vehicle, obstacles that would hinder the user boarding or alighting; and controlling the vehicle to park at a location that avoids the identified obstacles.
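The two steps of this method can be sketched as a minimal helper; the clearance radius, coordinate convention, and function names below are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class Obstacle:
    x: float  # longitudinal offset from the vehicle's door zone, in metres (assumed frame)
    y: float  # lateral offset, in metres

def choose_parking_spot(candidates, obstacles, clearance=1.5):
    """Return the first candidate stop point whose door zone keeps at least
    `clearance` metres from every identified obstacle, or None if none qualifies."""
    for cx, cy in candidates:
        if all(((cx - o.x) ** 2 + (cy - o.y) ** 2) ** 0.5 >= clearance
               for o in obstacles):
            return (cx, cy)
    return None
```

In practice the candidate list would come from the driving control system's map of legal stopping areas, and the obstacle list from the recognition system.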
Embodiments of the present invention also provide a vehicle comprising the above system for controlling the vehicle.
Embodiments of the present invention also provide a computer-readable storage medium storing computer-readable instructions which, when executed, cause a vehicle to perform the method described in the above embodiments for controlling a vehicle to park at a location that is convenient for a user to board or alight.
Embodiments of the present invention may be used to control a vehicle, automatically or semi-automatically, to park at a position that avoids obstacles beside the vehicle and is convenient for the user to board or alight, thereby ensuring user safety and improving the boarding and alighting experience.
Drawings
Fig. 1 shows a block diagram of a system for controlling a vehicle according to an embodiment of the present invention.
Fig. 2 shows a flowchart of a vehicle control method according to an embodiment of the present invention.
Fig. 3A, 3B and 3C illustrate views of a scene in which a vehicle is parked in an embodiment of the present invention.
Fig. 4 is a schematic diagram showing an operation flow of adjusting the get-off position in the embodiment of the present invention.
Fig. 5 shows a schematic of a modular structure of a system for controlling a vehicle in an embodiment of the present invention.
Detailed Description
Embodiments of the present invention are further described below with reference to the accompanying drawings.
Fig. 1 shows a block diagram of a system for controlling a vehicle according to an embodiment of the present invention. The system 100 includes: an obstacle recognition system 120 configured to recognize, in the surrounding environment of the vehicle, obstacles that would hinder a user boarding or alighting; and a driving control system 140 configured to control the running of the vehicle, the driving control system 140 being further configured to park the vehicle at a location that avoids the identified obstacles. The obstacle recognition system 120 recognizes obstacles in the surrounding environment that may hinder a user boarding or alighting and transmits the recognition result to the driving control system 140. Based on the recognition result, the driving control system 140 determines a parking position that avoids the obstacles and is suitable for the user to board or alight, and controls the vehicle to park there. The system 100 is mounted on the vehicle it controls; the driving control system 140 may automatically park the vehicle at a parking position determined by the system 100, or at a parking position manually selected by the user from several determined candidates. The vehicle in an embodiment of the invention may be an autonomous vehicle or a human-driven vehicle.
In some embodiments, the obstacle recognition system 120 may use at least one of an electromagnetic-wave target detection and recognition system (e.g., radar), an infrared target detection and recognition system, and an image target detection and recognition system to identify obstacles in the surrounding environment that affect boarding or alighting, such as an inspection well beside the road with its cover open, puddles, stones, signs, garbage cans, construction sites, and so on. In some examples, the electromagnetic-wave or infrared target detection and recognition system may be used to identify obstacles at night or in dark scenes, while the image target detection and recognition system may be used during the day or in brightly lit scenes.
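The day/night choice of modality described above amounts to a simple switch on ambient light; the lux threshold and return labels here are illustrative assumptions, not values given in the patent:

```python
def pick_sensor(lux: float, low_light_threshold: float = 50.0) -> str:
    """Choose a detection modality from ambient illuminance (lux).
    The 50-lux threshold is an assumed illustrative value."""
    if lux < low_light_threshold:
        return "infrared_or_radar"  # night or dark scenes
    return "camera"                 # daytime or brightly lit scenes
```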
Embodiments of the invention can thereby spare the user the discomfort and inconvenience caused by obstacles while boarding or alighting, and improve the user's experience.
In some embodiments, the obstacle recognition system 120 is further configured to identify obstacles in the surrounding environment using a pre-trained obstacle recognition model, and to obtain the location and a recognition confidence for each obstacle. The recognition confidence indicates the degree of confidence in the recognition result; if it exceeds a certain threshold, an obstacle that may hinder boarding or alighting is considered present at that location. A machine learning algorithm may be employed to enhance the recognition capability of the obstacle recognition system 120.
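The confidence thresholding described above can be sketched as a simple filter over model detections; the tuple layout and the 0.6 threshold are assumptions for illustration, not specified by the patent:

```python
def filter_detections(detections, threshold=0.6):
    """Keep detections whose recognition confidence exceeds the threshold.
    Each detection is assumed to be (label, (x, y), confidence)."""
    return [d for d in detections if d[2] > threshold]
```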
In some embodiments, the system 100 further includes a position selection system 130 configured to select, in the vehicle's surroundings, positions that avoid the identified obstacles and are convenient for the user to board or alight; the driving control system 140 is further configured to park the vehicle at the position selected by the position selection system 130. The optional position selection system 130 sits between the obstacle recognition system 120 and the driving control system 140. It receives the recognition results about obstacles in the environment from the obstacle recognition system 120, provides several parking positions in the surroundings that avoid the obstacles and allow the user to board or alight, and displays these positions on an interactive interface for the user to choose from; the selected parking position is then sent to the driving control system 140, which parks the vehicle there.
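One plausible way for such a position selection system to rank the candidates it displays (an assumption for illustration, not the patent's stated algorithm) is by clearance to the nearest identified obstacle:

```python
def recommend_positions(candidates, obstacles, top_k=2):
    """Rank candidate stop points by their clearance to the nearest obstacle
    (largest clearance first) and return the top_k for the user to choose from.
    Points and obstacles are (x, y) tuples in a common ground frame (assumed)."""
    def clearance(p):
        return min(((p[0] - ox) ** 2 + (p[1] - oy) ** 2) ** 0.5
                   for ox, oy in obstacles)
    return sorted(candidates, key=clearance, reverse=True)[:top_k]
```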
In some embodiments, the system further comprises an image acquisition system 110 configured to acquire images of the vehicle's surroundings, and the obstacle recognition system 120 is configured to recognize, in the images acquired by the image acquisition system 110, obstacles that would hinder a user boarding or alighting. The optional image acquisition system 110 may acquire images of the surroundings using at least one of radio-wave imaging, infrared imaging, and optical imaging; for example, it may be at least one of an optical camera, an infrared thermal-imaging camera, and a radar. The image acquisition system 110 may be a stand-alone system or may be included within the obstacle recognition system 120. In one example, the obstacle recognition system 120 includes an image acquisition unit (e.g., an optical camera, an infrared thermal-imaging camera, a radar, etc.) and a target recognition unit for recognizing obstacles in the acquired images.
In other embodiments, the system 100 further includes a position selection system 130 configured to display the captured image of the surrounding environment, the identified obstacles, and an indicator for selecting a position, and to receive an indication of a parking position from the user; the user's indication of the parking position is generated in response to the user performing a position-selection interaction with the indicator. The driving control system 140 is further configured to control the vehicle to park at the indicated parking position.
In some embodiments, the position selection system 130 includes a display screen configured to display, using augmented reality (AR) technology, images of the surrounding environment, the identified obstacles, and an indicator for selecting a position. The display screen is disposed in front of a user seat of the vehicle and includes at least one of the center control screen of the vehicle and a display screen located in the rear row. The display screen presents the interactive interface of the position selection system 130; it may be the center control screen at the front passenger seat, the entertainment screen for a rear seat, or a screen at any position the user can reach while riding. Based on the recognition result of the obstacle recognition system 120, a mark (such as a red circle or box, or another form of marking) may be highlighted in the image to show the user the environmental factors that are unfavorable for boarding or alighting.
Fig. 3A shows a view of a parking scene in an embodiment of the present invention, fig. 3B shows a 360-degree surround view centered on the vehicle, and fig. 3C shows a view from a camera on the right side of the vehicle. In an embodiment with an autonomous vehicle, the driving control system may pre-delineate a drop-off parking area 320 near the road edge 310 at each parking spot, as shown by the dashed box in fig. 3A. A user of the autonomous vehicle may sit in the rear row, where a display screen 330 (e.g., a tablet computer) may be disposed to show real-time information about the current autonomous drive, such as a 3D map of the surrounding environment, information on surrounding vehicles and pedestrians, traffic-light information, and the vehicle's position and speed. If the autonomous vehicle allows the user to sit in the front passenger seat, the display screen may also be the center screen in front of that seat. On the display screen 330 the user can also see obstacles 340 (e.g., an open well cover, a puddle, etc.) present in the vehicle's surroundings that would hinder alighting. Through interaction on these displays, the selection of the parking position can be completed.
In some embodiments, the display screen in the location selection system 130 is further configured to display a distance indicator and at least one of a recommended parking location, the location of the vehicle, and the location of the obstacle; the distance indicator indicates the recommended parking location, the location of the vehicle, the location of the obstacle, and the distances associated with the respective locations. As shown in figs. 3B and 3C, after the autonomous vehicle pulls over, the environment in the user's alighting direction is inspected to determine whether a factor inconvenient for alighting (i.e., an obstacle) is present. If an obstacle is present, an interactive interface of the location selection system appears on the display screen 330 once the vehicle has stopped. The interface displays a recommended parking position 350 (shown as a small black circle at the upper right of fig. 3B and the upper left of fig. 3C), the position 360 of the vehicle (shown as a small black circle at the middle of figs. 3B and 3C, corresponding to the user's alighting position), and a distance indicator 370. Because there is an obstacle 340 on the right side of the vehicle as shown in fig. 3A, the recommended parking position 350 is placed so as to avoid the obstacle 340, for example further forward or further backward than the obstacle 340 in the scene view; accordingly, figs. 3B and 3C each show two recommended parking positions 350. In addition, a distance indicator 370 drawn in black lines is shown at a position on the right side of the vehicle in fig. 3B, and from the upper left to the lower right in fig. 3C. The distance indicator 370 includes a reference line, shown as a long line, and unit-length (e.g., one-meter) tick lines, shown as a number of short lines perpendicular to the reference line. Through the distance indicator 370, the user can intuitively and accurately read the recommended parking position, the position of the vehicle (i.e., the boarding or alighting position corresponding to the door), the position of the obstacle, and the distances associated with these positions (e.g., the distance between the recommended parking position and the obstacle).
Various existing ranging algorithms can be used to calculate, for the distance indicator 370, the distance between an object in the picture and the camera. For example, taking the image in the user's alighting direction (here, the right side) as the reference and centering on the position of the vehicle, the image points at specific distances (such as 10 meters) toward the vehicle's heading are computed through a target distance and size calculation algorithm, and a distance indicator (such as 370 in figs. 3B and 3C) is then formed and superimposed on the real-scene image. The rendering of the distance indicator may be graduated with distance from the vehicle (for example, in fig. 3C the line width of the distance indicator 370 is wider at the bottom and narrower at the top), giving the user a near-to-far three-dimensional display effect.
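The projection behind such a distance indicator can be sketched with an ideal pinhole ground-plane model. This is an illustrative assumption rather than the patent's actual ranging algorithm: it presumes a flat road, a known camera mounting height and focal length, and a known horizon row in the image.

```python
def ground_row(focal_px: float, cam_height_m: float, dist_m: float,
               horizon_row: float) -> float:
    """Image row (pixels from top) of a ground point dist_m ahead, for an
    ideal pinhole camera mounted cam_height_m above a flat road."""
    return horizon_row + focal_px * cam_height_m / dist_m

def tick_rows(focal_px: float, cam_height_m: float, horizon_row: float,
              max_dist_m: float, step_m: float = 1.0):
    """Rows of the per-meter tick marks of a distance indicator like 370."""
    rows, d = [], step_m
    while d <= max_dist_m:
        rows.append((d, ground_row(focal_px, cam_height_m, d, horizon_row)))
        d += step_m
    return rows

def line_width_px(base_width_px: float, dist_m: float) -> float:
    """Perspective taper: a fixed real-world line width shrinks as 1/distance,
    giving the near-wide / far-narrow effect described for fig. 3C."""
    return base_width_px / dist_m
```

With a 1000 px focal length and a 1.2 m camera height, a point 10 m away lands 120 rows below the horizon, and the tick spacing compresses toward the horizon, which is exactly the graduated effect described above.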
In some embodiments, the indicator for selecting a location is an interactive control configured to receive an indication of the user's selection among a plurality of recommended parking locations. For example, using Augmented Reality (AR) technology, the live-action video image acquired by the camera serves as the background on the display screen 330, and the indicator, an interactive control for selecting a position, is superimposed on that background, so that the user can select a parking position 350 from the plurality of recommended parking positions through the interactive control. The driving control system then controls the vehicle such that its right-side door (e.g., the rear-row right door or the front-row right door) faces the user-selected drop-off position 350 when parked.
The image acquisition system 110 may include one or more cameras. In some embodiments, the image acquisition system 110 includes a plurality of cameras disposed at different locations on the vehicle, including at least a right-side camera; the images acquired by the plurality of cameras are stitched into an image of the surrounding environment of the vehicle. For example, the scene view shown in fig. 3B may be a 360-degree top-view scene view surrounding the vehicle, formed by stitching images collected by four cameras disposed on the front, rear, left, and right sides of the vehicle, so that the user can easily see the vehicle's surroundings and the boarding and alighting positions. Various existing image stitching techniques (including feature point extraction, image registration, image warping and fusion, etc.) may be used to form the look-around scene view. The scene view shown in fig. 3C may be a right-side scene view formed from at least the image captured by a camera on the right side of the vehicle, so that the user can easily see the environment outside the right door when getting on or off. In the embodiment of the present invention, the right-side camera in the image acquisition system 110 is used to monitor the scene and environment where the user gets on and off on the right side of the vehicle, but the invention is not limited thereto. For example, in a country where vehicles drive on the left side of the road, the cameras in the image acquisition system 110 include at least a left-side camera. In one example, images acquired by three cameras disposed on the front, rear, and right sides of the vehicle may be stitched into a right-side scene view, so that the user can simultaneously see the front, rear, and right-side scenes, including other vehicles, pedestrians, etc. appearing at the front right and rear right. In another example, images captured by two cameras disposed on the left and right sides of the vehicle may be used to form left and right scene views, respectively, to help the user see the surroundings outside the door when getting on or off through the left or right door.
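A minimal sketch of the geometric core of such top-view stitching: each calibrated camera contributes an image-to-ground homography, and points from all cameras are mapped into one common ground-plane frame. The 3x3 matrices here are toy placeholders; a real system obtains them from calibration, and full stitching additionally involves the feature matching, warping, and fusion steps noted above.

```python
def apply_homography(H, pt):
    """Map an image point (u, v) to ground-plane coordinates via a 3x3
    homography H given as nested lists."""
    u, v = pt
    x = H[0][0] * u + H[0][1] * v + H[0][2]
    y = H[1][0] * u + H[1][1] * v + H[1][2]
    w = H[2][0] * u + H[2][1] * v + H[2][2]
    return (x / w, y / w)

def stitch_to_ground(per_camera):
    """Fuse per-camera image points into one top-view point set.

    per_camera: list of (H, points) pairs, one per camera, where H is that
    camera's image-to-ground homography (an assumption of this sketch)."""
    ground = []
    for H, pts in per_camera:
        ground.extend(apply_homography(H, p) for p in pts)
    return ground
```

The identity homography leaves a point unchanged, while a scaling homography doubles its coordinates, which is enough to check the projective division is applied correctly.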
When the system 100 is applied to an autonomous vehicle, the vehicle may be automatically controlled, as a user boards, to park at a position avoiding any obstacle that hinders boarding. When a user alights, the vehicle may be automatically controlled to park at a position avoiding any obstacle that hinders alighting, or the user may first select a desired parking position from a plurality of available parking positions through the position selection system 130 on the vehicle, after which the vehicle parks at the selected position for the user to get off.
In some embodiments, the system 100 further comprises a location identification system 150 configured to identify, using a pre-trained location identification model, whether the surroundings of the vehicle belong to a parking limited location; the driving control system 140 is further configured to control the vehicle to avoid parking at the identified parking limited location. The location identification system 150 may be disposed between the image acquisition system 110 and the driving control system 140. It receives the captured image from the image acquisition system 110, identifies from the image whether the surrounding environment of the vehicle belongs to a parking limited location using the location identification model, and transmits the identification result to the driving control system 140 for controlling the parking position of the vehicle.
In one example, one or more cameras mounted under the right-side rear-view mirror of the vehicle serve as the right-side camera. During use, a video stream is acquired from the right-side camera, and a pre-trained recognition model for specific types of adverse environmental factors (including but not limited to puddles, manhole covers, obstacles, etc.; further adverse factors can later be supported by adding new classification models) is applied to each frame of the video stream, yielding the position and recognition confidence of each adverse environmental factor in the image. If the confidence exceeds a certain threshold, an adverse environmental factor of that class is taken to be present at that position in the image.
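The per-frame confidence gating described here can be sketched as follows; the `Detection` type, the labels, and the 0.6 threshold are illustrative assumptions, and the detector model itself is left abstract.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # e.g. "puddle", "open_manhole" (hypothetical labels)
    bbox: tuple        # (x, y, w, h) in image pixels
    confidence: float  # model score in [0, 1]

def confirmed_obstacles(frame_detections, threshold: float = 0.6):
    """Keep only detections whose confidence exceeds the threshold, i.e. the
    positions where an adverse factor of some class is taken as present."""
    return [d for d in frame_detections if d.confidence > threshold]

def scan_stream(frames, detect, threshold: float = 0.6):
    """Run the (assumed) detector over every frame of a video stream and
    collect (frame_index, detection) pairs that pass the confidence gate."""
    found = []
    for i, frame in enumerate(frames):
        found.extend((i, d) for d in confirmed_obstacles(detect(frame), threshold))
    return found
```

The gate mirrors the text: a detection counts as an adverse environmental factor only when its confidence clears the threshold, and extending to new factor classes just means the detector emits new labels.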
In another example, key traffic locations (corresponding to parking limited locations, including but not limited to intersections, sidewalks, bus stops, etc.; further key traffic locations can later be supported by adding new classification models) are identified by pre-trained models along the direction of the distance indicator 370. One or more positions that are neither key traffic locations nor adverse environmental factors identified by the obstacle recognition system 120 are found along the distance-indicator direction on the screen as recommended positions and presented on the lines of the distance indicator (e.g., as red or green circles) for the user to select. The user may directly tap a recommended parking position on the display screen 330 to select the drop-off point, or drag the indication point of the vehicle's own position in the distance indicator 370 to a certain position to set the drop-off point; after the user finishes selecting, the display screen 330 may prompt the user to further confirm the selected drop-off point through a prompt interface. Once the user confirms the drop-off point, the position selection system 130 provides the driving control system with information associated with the drop-off point, such as the direction and target position, to control or adjust the vehicle so that it stops at the confirmed drop-off point. If the user is still dissatisfied with the current parking position, the position may be reselected through the position selection system 130 to readjust the parking position.
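The search for recommended positions can be sketched as a 1-D scan along the curb: candidate stop points survive only if they clear every blocked interval (identified obstacles and key traffic locations), and the nearest survivors become the recommendations. The coordinate convention (meters along the curb, door at 0) and the 0.5 m clearance are assumptions for illustration.

```python
def recommend_positions(candidates, blocked_intervals,
                        clearance: float = 0.5, max_n: int = 2):
    """Pick up to max_n candidate stop points that keep `clearance` meters
    from every blocked interval, ordered nearest to the vehicle first.

    candidates: curb coordinates in meters, door at 0.
    blocked_intervals: (lo, hi) extents of obstacles / key traffic locations."""
    def is_free(x):
        return all(x < lo - clearance or x > hi + clearance
                   for lo, hi in blocked_intervals)
    free = sorted((x for x in candidates if is_free(x)), key=abs)
    return free[:max_n]
```

For an obstacle spanning -1 m to +1 m directly beside the door, the nearest clear candidates lie just behind and just ahead of it, matching the "further forward or further backward than the obstacle" placement of the two recommended positions 350.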
Fig. 2 shows a flowchart of a method for controlling a vehicle to park at a location that facilitates a user to get on and off the vehicle, in accordance with an embodiment of the present invention. As shown in fig. 2, the method 200 includes: identifying obstacles in the surrounding environment of the vehicle that affect the user's boarding or disembarking (step 220); and controlling the vehicle to park at a location avoiding the identified obstacle (step 230). Method 200 may be implemented by system 100 shown in fig. 1 and achieve similar technical effects. The vehicle may be an autonomous vehicle or a manually driven vehicle.
In some embodiments, the identifying step 220 may use at least one of electromagnetic-wave target detection and recognition (e.g., radar) technology, infrared target detection and recognition technology, and image target detection and recognition technology to identify obstacles in the surrounding environment that affect getting on and off, such as inspection wells with open covers, puddles, stones, signs, garbage cans, and construction sites beside the roadway. In some examples, electromagnetic-wave or infrared target detection and recognition techniques may be used to identify obstacles at night or in dimly lit scenes, while image target detection and recognition techniques can be used in daytime or brightly lit scenes.
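The lighting-based choice among these techniques amounts to a simple dispatch. A hedged sketch, with the lux threshold chosen arbitrarily for illustration:

```python
def pick_detection_modalities(ambient_lux: float,
                              dark_threshold_lux: float = 50.0):
    """Choose recognition techniques by lighting: electromagnetic-wave
    (radar) or infrared detection at night / in dim scenes, image-based
    detection in daytime / bright scenes."""
    if ambient_lux < dark_threshold_lux:
        return ["electromagnetic_wave", "infrared"]
    return ["image"]
```

A real system might instead fuse all available sensors and weight them by conditions; this sketch only mirrors the either/or phrasing of the embodiment.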
In some embodiments, the identifying step 220 includes: the method comprises the steps of identifying obstacles in the surrounding environment by using a pre-trained identification model, and obtaining the position and the identification confidence of each obstacle in the surrounding environment.
In some embodiments, the controlling step 230 includes: selecting a location in the surrounding environment of the vehicle that avoids the identified obstacle and facilitates a user to get on or off the vehicle; and controlling the vehicle to park at the selected location.
In some embodiments, the method 200 further comprises: collecting an image of the surroundings of the vehicle (step 210); wherein the identifying step 220 includes: identifying, in the acquired image, obstacles in the surroundings of the vehicle that affect the user getting on or off.
In some embodiments, the controlling step 230 includes: displaying the captured image of the surrounding environment, the identified obstacle, and an indicator for selecting a location on a display screen; receiving an indication of a parking location from a user on a display screen; wherein the indication of the parking position by the user is generated in response to the user performing an interactive operation with respect to the indicator in relation to the position selection; and controlling the vehicle to park at the indicated parking position based on the user's indication of the parking position. The displaying step may include: displaying an image of the surrounding environment, the identified obstacle, and an indicator for selecting a location on a display screen using augmented reality technology; the display screen may be disposed in front of a user seat of the vehicle, and the display screen may include at least one of a center control screen in the vehicle and a display screen located at a rear row of the vehicle.
In some embodiments, the method 200 further comprises: displaying at least one of a recommended parking position, a position of the vehicle, and a position of the obstacle, and a distance indicator; the distance indicators are used to indicate recommended parking locations, locations of vehicles, locations of obstacles, and distances associated with the respective locations.
In some embodiments, the indicator for selecting the location is an interactive control configured to receive an indication of a user selection of a plurality of recommended parking locations.
In some embodiments, the image acquisition step 210 includes: acquiring images of the surrounding environment of the vehicle by using a plurality of cameras arranged at different positions on the vehicle; wherein the plurality of cameras at least comprises a right side camera on the vehicle; and stitching the images acquired by the plurality of cameras into an image of the surrounding environment of the vehicle.
In some embodiments, the method 200 further comprises: using a pre-trained position recognition model to recognize whether the surrounding environment of the vehicle belongs to a parking-limited position; and controlling the vehicle to avoid parking at the identified parking limited location.
Those skilled in the art will appreciate that the various embodiments described above with respect to method 200 may be implemented in the various embodiments of the system described in connection with fig. 1. These embodiments contain substantially the same or similar technical details, which are not described in detail herein.
Fig. 4 is a schematic diagram of an operation flow for adjusting the drop-off position in an embodiment of the present invention. In method 400, an autonomous vehicle first enters a drop-off point to park (step 410). Based on the image collected by the right-side camera, the obstacle recognition system examines the surrounding environment outside the right door for environmental factors inconvenient for alighting, such as puddles, inspection wells with open covers, stones, etc. (step 420). It is then judged whether such a factor exists (step 430); if not (No), the autonomous vehicle stops at the drop-off point normally; if so (Yes), a parking position selection system (such as the position selection system 130 shown in fig. 1) is turned on (step 440) to select a parking position. Alternatively, if the user in the vehicle considers the current drop-off point less than optimal and wants to adjust the drop-off position, the user may perform a user operation (step 450) to turn on the parking position selection system. After the parking position selection system is turned on, the environment is modeled based on images of the surroundings acquired by the camera systems (e.g., front, rear, left, and right cameras) mounted on the vehicle (step 460). After the environment is modeled, the identified adverse environmental factors may be highlighted in the scene view (e.g., with red boxes).
Based on the established model, the distance indication information is calculated, a recommended parking position is found, and both are presented in the scene view (step 470), as shown in fig. 3B or fig. 3C. The user may then select an alighting position on the scene view presented by the parking position selection system (step 480). If the user is not satisfied with the current alighting position before getting off, the user may continue to adjust the parking position through the parking position selection system (step 490), so that the user can alight at the position in the surrounding environment most favorable for getting off.
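The fig. 4 flow (steps 410-490) can be condensed into a small control loop. The callbacks stand in for the obstacle recognition system, the recommendation logic, and the on-screen user interaction; their concrete forms are assumptions of this sketch.

```python
def adjust_drop_off(initial_pos, detect_obstacle, recommend, user_pick,
                    user_wants_adjust: bool = False):
    """Sketch of the fig. 4 flow: park at the drop-off point; if the
    right-side environment holds an adverse factor (or the user asks to
    adjust), open the selection system, present recommendations, and
    re-park at the position the user picks until a clear spot is reached.

    detect_obstacle(pos) -> bool: is alighting at pos obstructed?
    recommend(pos) -> list:       recommended alternative positions.
    user_pick(options) -> pos | None: user's choice (None = keep current)."""
    pos = initial_pos                       # step 410: enter drop-off point
    if not detect_obstacle(pos) and not user_wants_adjust:
        return pos                          # steps 420/430: stop normally
    while True:                             # steps 440-490: selection loop
        options = recommend(pos)            # steps 460/470: model + recommend
        choice = user_pick(options)         # step 480: on-screen selection
        if choice is None:                  # user keeps the current spot
            return pos
        pos = choice                        # step 490: re-park, re-check
        if not detect_obstacle(pos):
            return pos
```

With an obstacle at position 0 and a single recommended alternative at 2, the loop moves the stop point to 2; with a clear initial spot it returns immediately.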
The above embodiments are equally applicable to selecting a boarding location. When applied to an autonomous vehicle, the selection and adjustment of the boarding location may be accomplished automatically by the autonomous vehicle, by a passenger in the vehicle through the vehicle-mounted location selection system, or by a user preparing to board through a location selection system on a mobile device (e.g., a smartphone).
Fig. 5 shows a schematic diagram of the modular structure of a system for controlling a vehicle in an embodiment of the present invention. The system 500 shown in fig. 5 is a specific embodiment of the system 100 shown in fig. 1 and may be mounted on a vehicle 506. The modules in the system 500 may be implemented in hardware, software, or a combination thereof; for example, software modules may be executed by a processor in combination with a memory. Specifically, in system 500, the video processing module 510 receives and processes video images acquired by the front camera 501, left camera 502, right camera 503, and rear camera 504, and sends the processed video images to the image synthesis module 520 and the obstacle recognition/identification module 530. The image synthesis module 520 stitches and composites the video images from the cameras to obtain an overall surround view; the obstacle recognition/identification module 530 performs obstacle recognition on the video images and marks the recognized obstacles on the surround view. The look-around display module 540 stitches the video pictures shot by the front, rear, left, and right cameras into a top-view scene view based on a 360-degree look-around and sends it to the central control display 507 for the user 505 to view. The position distance calculation module 550 calculates the actual distances between each target position (e.g., recommended parking position, obstacle position, vehicle position) and the camera in the scene view, as well as the actual distances between the positions, to form a distance indicator.
The location recommendation module 560 generates one or more recommended parking locations based on the obstacles recognized and marked by the obstacle recognition/identification module 530, and sends them to the indicator enhancement display module 570. The indicator enhancement display module 570 forms the distance indicator and superimposes the distance indicator, the recommended parking locations, and the interactive controls on the top view from the look-around display module 540, which is then shown on the central control display 507. The user 505 can interactively select, adjust, and confirm the parking position on the central control display 507 and/or the interactive controls. The target position calculation module 580 calculates a target parking position based on the user's interaction. The control instruction conversion/transmission module 590 converts the calculated target parking position into a vehicle control instruction and transmits it to the driving control system of the vehicle 506, controlling the vehicle 506 to park at the target parking position.
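The hand-off from the target position calculation module 580 to the control instruction conversion/transmission module 590 might look like the following sketch; the instruction format is invented for illustration and is not the patent's actual interface.

```python
def to_control_instruction(vehicle_pos_m: float, target_pos_m: float) -> dict:
    """Convert the target parking position chosen on screen into a simple
    longitudinal move instruction for the driving control system.

    Positions are 1-D curb coordinates in meters (an assumed convention);
    positive offsets mean the target lies ahead of the vehicle."""
    dx = target_pos_m - vehicle_pos_m
    direction = "forward" if dx > 0 else "backward" if dx < 0 else "hold"
    return {"direction": direction, "distance_m": abs(dx)}
```

A real driving control system would consume richer commands (trajectory, speed profile, door alignment), but the essential content is the same: a direction and an offset derived from the confirmed drop-off point.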
Embodiments of the present invention also provide a vehicle (e.g., vehicle 506 of fig. 5) including the system for controlling a vehicle (e.g., system 100 of fig. 1, system 500 of fig. 5) described in the above embodiments.
Embodiments of the present invention also provide a computer readable instruction storage medium having stored therein computer readable instructions that, when executed, cause a vehicle to perform a method for controlling a vehicle to park at a location that facilitates a user getting on and off as described in the above embodiments (e.g., method 200 shown in fig. 2). The computer readable instruction storage medium may be used on, for example, the system 100 shown in fig. 1, the system 500 shown in fig. 5, and the instructions stored on the computer readable instruction storage medium may be executed by the system 100 or the system 500.
The above description covers only preferred embodiments of the invention and is not to be construed as limiting the invention in any way. Various modifications and variations in form and detail may be made by those skilled in the art without departing from the principles of the present invention, and all such modifications fall within the scope of this patent as defined by the appended claims.

Claims (24)

1. A system for controlling a vehicle, comprising:
a driving control system configured to control running of the vehicle; and
an obstacle recognition system configured to recognize an obstacle affecting a user getting on or off in a surrounding environment of the vehicle;
wherein the driving control system is further configured to control the vehicle to park at a location avoiding the identified obstacle.
2. The system of claim 1, wherein the obstacle recognition system is further configured to recognize the obstacles using a pre-trained obstacle recognition model, and to obtain a location and recognition confidence of each obstacle in the surrounding environment.
3. The system of claim 1, further comprising:
a position selection system configured to select a position in a surrounding environment of the vehicle that avoids the identified obstacle and facilitates a user to get on or off;
wherein the driving control system is further configured to control the vehicle to park at the position selected by the position selection system.
4. The system of claim 1, further comprising:
an image acquisition system configured to acquire an image of a surrounding environment of the vehicle;
wherein the obstacle recognition system is configured to recognize an obstacle affecting a user to get on or off in a surrounding environment of the vehicle in an image acquired by the image acquisition system.
5. The system of claim 4, further comprising:
a position selection system configured to display the captured image of the surrounding environment, the identified obstacle, and an indicator for selecting a position, and to receive an indication of the user regarding a parking position; wherein the indication of the parking location by the user is generated in response to the user performing an interactive operation with respect to the indicator in relation to a location selection;
wherein the driving control system is further configured to control the vehicle to park at a parking position in the indication based on the user's indication of the parking position.
6. The system of claim 5, wherein the location selection system comprises a display screen configured to display an image of the surrounding environment, the identified obstacle, and the indicator for selecting a location using augmented reality technology;
wherein the display screen is disposed in front of a user seat of the vehicle, the display screen including at least one of a center control screen in the vehicle and a display screen located in a rear row of the vehicle.
7. The system of claim 6, wherein the display screen is further configured to display a distance indicator and at least one of a recommended parking location, a location of the vehicle, and a location of the obstacle; the distance indicator is for indicating the recommended parking position, the position of the vehicle, the position of the obstacle, and the distances associated with the respective positions.
8. The system of claim 5 or 6, wherein the indicator for selecting a location is an interactive control configured to receive an indication of the user's selection of a plurality of recommended parking locations.
9. The system of claim 4, wherein the image acquisition system comprises a plurality of cameras disposed at different locations on the vehicle; wherein the plurality of cameras comprises at least a right side camera on the vehicle; the images acquired by the plurality of cameras are stitched into an image of the surroundings of the vehicle.
10. The system according to any one of claims 1 to 9, wherein the vehicle is an autonomous vehicle.
11. The system of claim 10, further comprising:
a position recognition system configured to recognize whether a surrounding environment of the vehicle belongs to a parking-limited position using a pre-trained position recognition model;
wherein the driving control system is further configured to control the vehicle to avoid parking at the identified parking limited location.
12. A method for controlling a vehicle to park at a location that facilitates a user getting on and off, comprising:
identifying obstacles in the surrounding environment of the vehicle that affect the user to get on or off the vehicle; and
the vehicle is controlled to park where the identified obstacle is avoided.
13. The method of claim 12, wherein the identifying step comprises:
identifying the obstacles in the surrounding environment using a pre-trained identification model; and
obtaining the location and identification confidence of each obstacle in the surrounding environment.
14. The method of claim 12, wherein the controlling step comprises:
selecting a location in the surrounding environment of the vehicle that avoids the identified obstacle and facilitates a user to get on or off the vehicle; and
controlling the vehicle to park at the selected location.
15. The method of claim 12, further comprising:
acquiring an image of the surroundings of the vehicle;
wherein the identifying step comprises: identifying, in the acquired image, obstacles in the surrounding environment of the vehicle that affect the user getting on or off the vehicle.
16. The method of claim 15, wherein the controlling step comprises:
displaying the acquired image of the surrounding environment, the identified obstacle, and an indicator for selecting a location on a display screen;
receiving an indication of the user regarding a parking location on the display screen; wherein the indication of the parking location by the user is generated in response to the user performing an interactive operation with respect to the indicator in relation to a location selection; and
controlling the vehicle to park at a park position in the indication based on the user indication of the park position.
17. The method of claim 16, wherein the displaying step comprises: displaying an image of the surrounding environment, the identified obstacle, and the indicator for selecting a location on the display screen using augmented reality technology;
wherein the display screen is disposed in front of a user seat of the vehicle, the display screen including at least one of a center control screen in the vehicle and a display screen located in a rear row of the vehicle.
18. The method of claim 17, further comprising:
displaying a distance indicator and at least one of a recommended parking location, a location of the vehicle, and a location of the obstacle;
wherein the distance indicator is for indicating the recommended parking position, the position of the vehicle, the position of the obstacle, and the distances associated with the respective positions.
19. The method of claim 16 or 17, wherein the indicator for selecting a location is an interactive control configured to receive an indication of the user's selection of a plurality of recommended parking locations.
20. The method of claim 15, wherein the image acquisition step comprises:
acquiring images of a surrounding environment of the vehicle using a plurality of cameras disposed at different positions on the vehicle; wherein the plurality of cameras comprises at least a right side camera on the vehicle; and
and splicing the images acquired by the cameras into images of the surrounding environment of the vehicle.
21. The method according to any one of claims 12 to 20, wherein the vehicle is an autonomous vehicle.
22. The method of claim 21, further comprising:
using a pre-trained position recognition model to recognize whether the surrounding environment of the vehicle belongs to a parking-limited position; and
the vehicle is controlled to avoid parking at the identified parking limited location.
23. A vehicle comprising the system of any one of claims 1 to 11.
24. A computer readable instruction storage medium in which computer readable instructions are stored which, when executed, cause a vehicle to perform the method of any one of claims 12 to 22.