CN113655790A - Vehicle control method, device, equipment, storage medium and program product - Google Patents

Vehicle control method, device, equipment, storage medium and program product Download PDF

Info

Publication number
CN113655790A
Authority
CN
China
Prior art keywords
vehicle
information
cloud server
perception information
automatic driving
Prior art date
Legal status
Pending
Application number
CN202110894828.6A
Other languages
Chinese (zh)
Inventor
门寿蓬
李永晨
郑思宜
闫婧
Current Assignee
Apollo Zhilian Beijing Technology Co Ltd
Original Assignee
Apollo Zhilian Beijing Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Apollo Zhilian Beijing Technology Co Ltd filed Critical Apollo Zhilian Beijing Technology Co Ltd
Priority to CN202110894828.6A
Publication of CN113655790A

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0214 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D1/0221 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • G05D1/0223 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0242 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using non-visible light signals, e.g. IR or UV signals
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0257 Control of position or course in two dimensions specially adapted to land vehicles using a radar
    • G05D1/0259 Control of position or course in two dimensions specially adapted to land vehicles using magnetic or electromagnetic means
    • G05D1/0276 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16Y INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
    • G16Y10/00 Economic sectors
    • G16Y10/40 Transportation
    • G16Y20/00 Information sensed or collected by the things
    • G16Y20/10 Information sensed or collected by the things relating to the environment, e.g. temperature; relating to location
    • G16Y20/20 Information sensed or collected by the things relating to the thing itself
    • G16Y40/00 IoT characterised by the purpose of the information processing
    • G16Y40/30 Control

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Automation & Control Theory (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Computing Systems (AREA)
  • Electromagnetism (AREA)
  • Development Economics (AREA)
  • Accounting & Taxation (AREA)
  • Toxicology (AREA)
  • Environmental & Geological Engineering (AREA)
  • Health & Medical Sciences (AREA)
  • General Business, Economics & Management (AREA)
  • Economics (AREA)
  • General Health & Medical Sciences (AREA)
  • Business, Economics & Management (AREA)
  • Operations Research (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)

Abstract

In the scheme provided by the disclosure, the cloud server can take over the vehicle when the vehicle meets a certain condition, so that the processing capability of the cloud server is utilized to handle the complex environment around the vehicle and improve the automatic driving performance of the vehicle.

Description

Vehicle control method, device, equipment, storage medium and program product
Technical Field
The present disclosure relates to automatic driving technology and intelligent transportation technology in the field of artificial intelligence, and more particularly, to a vehicle control method, apparatus, device, storage medium, and program product.
Background
With the development of automatic driving technology, more and more vehicles are equipped with automatic driving systems. Based on hardware such as vehicle-mounted radar, sensors, and cameras, together with on-board algorithms, an autonomous vehicle can complete driving tasks and monitor the driving environment in certain environments and under specific conditions.
In the prior art, an autonomous vehicle relies on vehicle-end equipment to perceive the surrounding environment and relies on the capability and algorithms of the vehicle-end equipment to formulate a driving strategy.
Because this scheme is implemented entirely with equipment arranged at the vehicle end, its ability to handle complex road-condition scenarios is relatively weak, being limited by the capability of the vehicle-end equipment and the vehicle-end algorithms.
Disclosure of Invention
The present disclosure provides a vehicle control method, apparatus, device, storage medium, and program product to solve the problem in the prior art that the processing capability of an autonomous vehicle for a complex road condition scene is relatively weak.
According to a first aspect of the present disclosure, there is provided a vehicle control method including:
acquiring perception information of a vehicle in a driving process, and reporting the perception information to a cloud server;
sending a takeover request to the cloud server when the vehicle meets a takeover condition, wherein the takeover request is used for indicating that an automatic driving strategy of the vehicle is generated according to the perception information;
and receiving and executing the automatic driving strategy sent by the cloud server.
According to a second aspect of the present disclosure, there is provided a vehicle control method including:
receiving perception information reported by a vehicle, and generating an automatic driving strategy of the vehicle according to the perception information after receiving a takeover request sent by the vehicle;
and transmitting the automatic driving strategy to the vehicle, wherein the automatic driving strategy is executed by the vehicle.
According to a third aspect of the present disclosure, there is provided a vehicle control apparatus including:
the reporting unit is used for acquiring the perception information of the vehicle in the running process and reporting the perception information to the cloud server;
the request unit is used for sending a takeover request to the cloud server when the vehicle meets a takeover condition, wherein the takeover request is used for indicating that an automatic driving strategy of the vehicle is generated according to the perception information;
and the execution unit is used for receiving and executing the automatic driving strategy sent by the cloud server.
According to a fourth aspect of the present disclosure, there is provided a vehicle control apparatus comprising:
the strategy making unit is used for receiving perception information reported by a vehicle and generating an automatic driving strategy of the vehicle according to the perception information after receiving a takeover request sent by the vehicle;
and the transmitting unit is used for transmitting the automatic driving strategy to the vehicle, wherein the automatic driving strategy is executed by the vehicle.
According to a fifth aspect of the present disclosure, there is provided an electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of the first or second aspect.
According to a sixth aspect of the present disclosure, there is provided a non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method of the first or second aspect.
According to a seventh aspect of the present disclosure, there is provided a computer program product comprising: a computer program, stored in a readable storage medium, from which at least one processor of an electronic device can read the computer program, execution of the computer program by the at least one processor causing the electronic device to perform the method of the first or second aspect.
With the vehicle control method, apparatus, device, storage medium, and program product provided by the present disclosure, the cloud server can take over the vehicle when the vehicle meets a certain condition, so that the processing capability of the cloud server is used to handle the complex environment around the vehicle and improve the automatic driving performance of the vehicle.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present disclosure, nor do they limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The drawings are included to provide a better understanding of the present solution and are not to be construed as limiting the present disclosure. Wherein:
FIG. 1 is a diagram of an application scenario;
fig. 2 is a flowchart illustrating a vehicle control method according to a first example embodiment of the disclosure;
FIG. 3 is a system architecture diagram illustrating an exemplary embodiment of the present disclosure;
fig. 4 is a flowchart illustrating a vehicle control method according to a second example embodiment of the disclosure;
FIG. 5 is a diagram illustrating an application scenario according to an exemplary embodiment of the present disclosure;
FIG. 6 is a diagram illustrating an application scenario according to another exemplary embodiment of the present disclosure;
fig. 7 is a flowchart illustrating a vehicle control method according to a third example embodiment of the disclosure;
fig. 8 is a flowchart illustrating a vehicle control method according to a fourth example embodiment of the disclosure;
fig. 9 is a schematic configuration diagram of a vehicle control apparatus shown in a first example embodiment of the disclosure;
fig. 10 is a schematic configuration diagram of a vehicle control apparatus shown in a second example embodiment of the disclosure;
fig. 11 is a schematic configuration diagram of a vehicle control apparatus shown in a third example embodiment of the disclosure;
fig. 12 is a schematic configuration diagram of a vehicle control apparatus shown in a fourth example embodiment of the disclosure;
FIG. 13 is a block diagram of an electronic device used to implement an embodiment of the disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below with reference to the accompanying drawings, in which various details of the embodiments of the disclosure are included to assist understanding, and which are to be considered as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
Fig. 1 is an application scenario diagram.
As shown in fig. 1, a plurality of vehicles may travel on a road, which may include a vehicle having an automatic driving function.
If the environment around the autonomous vehicle is simple, the vehicle can handle the scene on its own, for example overtaking or avoiding pedestrians. However, when the environment around the vehicle is complex, the autonomous vehicle may handle it poorly, resulting in a poor riding experience.
As shown in fig. 1, for example, there are two one-way lanes and an obstacle vehicle 12 is in front of the autonomous vehicle 11; the autonomous vehicle 11 stops behind the obstacle vehicle 12 and cannot overtake from the side lane in time, giving passengers a poor riding experience.
To improve the controllability of the vehicle, in the scheme provided by the present disclosure, the vehicle can send a takeover request to the cloud server when a takeover condition is met, and the cloud server then controls the vehicle. Because the cloud server has powerful processing capability, the complex environment in which the vehicle is located can be handled by the cloud server, improving the user's riding experience.
Fig. 2 is a flowchart illustrating a vehicle control method according to a first example embodiment of the disclosure.
The scheme provided by the present disclosure is applied to a vehicle-mounted terminal arranged in a vehicle. The vehicle-mounted terminal can interact with the cloud server, reporting a takeover request and receiving the automatic driving strategy issued by the cloud server.
As shown in fig. 2, the present disclosure provides a vehicle control method including:
step 201, obtaining the perception information of the vehicle in the driving process, and reporting the perception information to a cloud server.
Fig. 3 is a system architecture diagram illustrating an exemplary embodiment of the present disclosure.
As shown in fig. 3, the vehicle 31 and the cloud server 32 may be connected through a network, specifically, a vehicle-mounted terminal in the vehicle 31 is connected to the cloud server 32, and the vehicle-mounted terminal may obtain the sensing information and report the sensing information to the cloud server 32 through the network.
The vehicle can acquire perception information through the vehicle-mounted terminal in the driving process, and the perception information can comprise various information, such as image information acquired by cameras inside and outside the vehicle, point cloud data scanned by a radar, positioning results acquired by a positioning system and the like.
Specifically, the vehicle-mounted terminal can report the perception information to the cloud server in real time, or report it at regular intervals. For example, the vehicle-mounted terminal can report image information to the cloud server in real time and report point cloud data at regular intervals, which can be configured as required.
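By way of illustration only, the following Python sketch shows one possible form of such a reporting scheme, in which image information is pushed immediately and point cloud data is uploaded at a fixed interval; the class, method, and parameter names (PerceptionReporter, cloud_client, pointcloud_interval_s, and so on) are hypothetical and are not defined by this disclosure.

import time


class PerceptionReporter:
    """Sketch of an in-vehicle reporting loop: camera images are sent to the
    cloud server in real time, while point cloud frames are uploaded at a
    fixed interval (cloud_client is assumed to expose send(topic, payload))."""

    def __init__(self, cloud_client, pointcloud_interval_s=1.0):
        self.cloud_client = cloud_client
        self.pointcloud_interval_s = pointcloud_interval_s
        self._last_pointcloud_upload = 0.0

    def on_camera_frame(self, vehicle_id, image):
        # Image information is reported in real time.
        self.cloud_client.send("perception/image",
                               {"vehicle_id": vehicle_id, "image": image})

    def on_lidar_frame(self, vehicle_id, points):
        # Point cloud data is reported at regular intervals to save bandwidth.
        now = time.monotonic()
        if now - self._last_pointcloud_upload >= self.pointcloud_interval_s:
            self.cloud_client.send("perception/pointcloud",
                                   {"vehicle_id": vehicle_id, "points": points})
            self._last_pointcloud_upload = now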
Step 202, sending a takeover request to the cloud server when the vehicle meets a takeover condition, wherein the takeover request is used for indicating that an automatic driving strategy of the vehicle is generated according to the perception information.
Furthermore, a takeover condition can be preset, and when the vehicle meets the takeover condition, the vehicle-mounted terminal can be triggered to send a takeover request to the cloud server, so that the cloud server takes over the vehicle and controls the vehicle.
In practical applications, the takeover condition can be set as required. For example, the vehicle-mounted terminal can determine the state of the vehicle according to the perception information, and if the vehicle state indicates that the environment of the vehicle is complex, the vehicle-mounted terminal can send the takeover request to the cloud server so that the cloud server takes over the vehicle.
Optionally, if a vehicle suddenly comes from the side of the vehicle, the environment where the vehicle is located may be considered to be complex, and if an intersection exists in front of the vehicle, the environment where the vehicle is located may also be considered to be complex.
A takeover key may also be provided in the vehicle, and the passenger can press the takeover key according to the actual situation, which triggers the vehicle-mounted terminal to send a takeover request to the cloud server. For example, when a traffic accident occurs in front of the vehicle, the passenger can press the takeover key so that the cloud server handles the current complex road condition.
Specifically, after receiving a take-over request sent by the vehicle, the cloud server can generate an automatic driving strategy of the vehicle according to the perception information of the vehicle. For example, the cloud server may store the sensing information reported by each vehicle within a preset time period, and after receiving the takeover request, the cloud server may obtain the sensing information corresponding to the vehicle identifier according to the vehicle identifier, and generate an automatic driving policy according to the sensing information.
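As an illustrative sketch only, the cloud side might cache recent perception reports per vehicle and look them up when a takeover request arrives, as in the following Python outline; the names (CloudTakeoverService, window_seconds, the placeholder _generate_strategy) are hypothetical, and the planning step itself is not detailed here.

import time
from collections import defaultdict, deque


class CloudTakeoverService:
    """Sketch: keep a sliding window of each vehicle's perception reports,
    keyed by vehicle identifier, and use them to generate a driving strategy
    when that vehicle requests a takeover."""

    def __init__(self, window_seconds=30.0):
        self.window_seconds = window_seconds
        self._reports = defaultdict(deque)  # vehicle_id -> deque of (timestamp, perception)

    def on_perception_report(self, vehicle_id, perception):
        now = time.time()
        reports = self._reports[vehicle_id]
        reports.append((now, perception))
        # Keep only reports within the preset time window.
        while reports and now - reports[0][0] > self.window_seconds:
            reports.popleft()

    def on_takeover_request(self, vehicle_id):
        recent = [p for _, p in self._reports[vehicle_id]]
        if not recent:
            return None  # no perception information known for this vehicle
        return self._generate_strategy(recent)

    def _generate_strategy(self, perception_history):
        # Placeholder for the cloud-side planning algorithm.
        return {"actions": [{"type": "set_speed", "target_speed_mps": 5.0}]}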
Furthermore, in the process that the cloud server takes over the vehicle, the vehicle-mounted terminal reports the perception information, and specifically the perception information can be reported in real time, so that the cloud server can make an automatic driving strategy according to the real-time environment around the vehicle.
In practical application, the automatic driving strategy can comprise various information such as lane change, acceleration, deceleration, braking, parking and the like.
After the cloud server generates the automatic driving strategy, the automatic driving strategy can be issued to the vehicle, so that the vehicle can execute the automatic driving strategy.
In an optional implementation manner, after receiving a takeover request sent by a vehicle, the cloud server may further determine whether to take over the vehicle. For example, a takeover reason may be included in the takeover request, such as because the vehicle is trapped, and such as because the passenger requested takeover. The cloud server may determine whether the vehicle needs to be taken over according to the reason for taking over.
If the computing power of the cloud server side is insufficient, the cloud server can reject the takeover request of the vehicle under some conditions. For example, when the vehicle sends the takeover request due to an accident in front, the cloud server can reject the takeover request if the computing power is insufficient.
Step 203, receiving and executing the automatic driving strategy sent by the cloud server.
Specifically, if the cloud server generates an automatic driving strategy of the vehicle, the cloud server can send the automatic driving strategy to the vehicle-mounted terminal, so that the vehicle-mounted terminal can receive the automatic driving strategy.
Further, the in-vehicle terminal may also execute the received automatic driving maneuver. For example, if the automatic driving strategy includes lane change information, the vehicle-mounted terminal may control the vehicle to change the lane according to the information, and for example, if the automatic driving strategy includes acceleration information, the vehicle-mounted terminal may accelerate the vehicle according to the information.
In actual application, the vehicle-mounted terminal can parse the automatic driving strategy and split it into the individual pieces of information it contains, so that each piece can be executed. For example, the automatic driving strategy may include braking information and lane change information; the vehicle-mounted terminal can extract these two pieces of information from the strategy and execute both.
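Purely as an illustration of such a dispatch step, a minimal Python sketch follows; the action names and the vehicle_controls interface (change_lane, set_speed, brake, pull_over) are assumed for the example and are not specified by this disclosure.

class StrategyExecutor:
    """Sketch: split a received strategy into individual actions and dispatch
    each one to the corresponding vehicle control interface."""

    def __init__(self, vehicle_controls):
        self.controls = vehicle_controls

    def execute(self, strategy):
        for action in strategy.get("actions", []):
            kind = action.get("type")
            if kind == "change_lane":
                self.controls.change_lane(action["direction"])
            elif kind == "set_speed":
                self.controls.set_speed(action["target_speed_mps"])
            elif kind == "brake":
                self.controls.brake(action.get("deceleration_mps2", 2.0))
            elif kind == "pull_over":
                self.controls.pull_over()
            else:
                continue  # unknown actions are ignored rather than executed blindly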
In an optional implementation manner, after the cloud server takes over the vehicle, it may stop taking over once the vehicle has traveled for a preset time or a preset distance, after which the vehicle-mounted terminal controls the driving again. In this embodiment, after guiding the vehicle out of the complex driving environment, the cloud server can return control of the vehicle to the vehicle-mounted terminal, releasing the computing power of the cloud server.
The present disclosure provides a vehicle control method including: acquiring perception information of a vehicle in a driving process, and reporting the perception information to a cloud server; sending a takeover request to the cloud server when the vehicle meets a takeover condition, wherein the takeover request is used for indicating that an automatic driving strategy of the vehicle is generated according to the perception information; and receiving and executing the automatic driving strategy sent by the cloud server. In the scheme provided by the present disclosure, when the vehicle meets a certain condition, the cloud server can take over the vehicle, so that the processing capability of the cloud server is used to handle the complex environment around the vehicle and improve the automatic driving performance of the vehicle.
Fig. 4 is a flowchart illustrating a vehicle control method according to a second exemplary embodiment of the present disclosure.
The scheme provided by the present disclosure is applied to a vehicle-mounted terminal arranged in a vehicle. The vehicle-mounted terminal can interact with the cloud server, reporting a takeover request and receiving the automatic driving strategy issued by the cloud server.
As shown in fig. 4, the present disclosure provides a vehicle control method including:
step 401, obtaining perception information of a vehicle in a driving process, and reporting the perception information to a cloud server.
The execution manner and principle of step 401 are similar to those of step 201, and are not described again.
Step 402, if it is determined according to the perception information that a takeover condition is met, sending a takeover request to the cloud server.
The vehicle-mounted terminal can determine whether the takeover conditions are met or not according to the sensing information, and if any takeover condition is met, the vehicle-mounted terminal can send a takeover request to the cloud server and take over the vehicle through the cloud server.
In this embodiment, the takeover condition can be set as required, which improves the flexibility of the scheme. For example, the takeover condition may be set according to the driving environment of the vehicle: if the vehicle drives in a campus, the takeover condition can be set according to the campus environment; if the vehicle drives on a main urban road, the takeover condition can be set according to the environment of the main urban road. In this way, the autonomous vehicle can adapt to different application scenarios.
The perception information includes positioning information. If the distance between the vehicle and a preset station and/or an intersection in front of the vehicle is determined, according to the positioning information, to be smaller than a preset value, it is determined that a takeover condition is met.
In one embodiment, the perception information reported by the vehicle may include positioning information. In this embodiment, the in-vehicle terminal may send a takeover request to the cloud server when the location of the vehicle meets a certain condition.
For example, preset stations where the vehicle stops can be configured in advance; when the vehicle approaches a preset station ahead on its route, the takeover request can be sent to the cloud server so that the cloud server controls the vehicle to stop at the preset station ahead.
Fig. 5 is a diagram illustrating an application scenario according to an exemplary embodiment of the present disclosure.
As shown in fig. 5, a plurality of stations 51 are provided. When a station 51 exists in front of the vehicle and the distance between the vehicle and the station 51 is smaller than a preset value, it can be considered that the vehicle is about to reach a preset station, and therefore the vehicle-mounted terminal in the vehicle can send a takeover request to the cloud server.
For example, when the vehicle is a bus-type passenger vehicle, it needs to stop at the stations along its route; if other vehicles or people are near a station, making the environment near the station complex, the vehicle may fail to stop at an accurate position or may even fail to stop at all. In the method provided by this application, the cloud server can take over the vehicle before it reaches the preset station, so that the vehicle can park steadily near the preset station even when the environment around the station is complex, which improves the driving performance of the vehicle.
For another example, the vehicle-mounted terminal can also determine the position of an intersection in front of the vehicle according to the high-precision map, and when the vehicle is close to the intersection in front of the vehicle, the vehicle-mounted terminal can send a take-over request to the cloud server, so that the cloud server can process the complex environment of the intersection in front of the vehicle.
Traffic conditions near an intersection are complex: vehicles or pedestrians may rush out suddenly, and vehicles or pedestrians may run a red light. Handling these situations puts processing pressure on the vehicle-mounted terminal, so in this application scenario the cloud server can take over the vehicle.
Fig. 6 is a diagram illustrating an application scenario according to another exemplary embodiment of the present disclosure.
As shown in fig. 6, when an intersection 61 exists in front of the vehicle and the distance between the vehicle and the intersection 61 is smaller than a preset value, it can be considered that the vehicle is about to reach the intersection, and therefore the vehicle-mounted terminal in the vehicle can send a takeover request to the cloud server.
The vehicle can be taken over by the cloud server before reaching the intersection, so that the cloud server processes the complex road condition of the intersection position, the vehicle can safely and stably pass through the intersection, and the driving performance of the vehicle is improved.
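As a purely illustrative sketch of the positioning-based check described above, the following Python function compares the vehicle position against preset station or intersection coordinates; the function name, the flat-earth distance approximation, and the 100 m threshold are assumptions for the example, not values fixed by this disclosure.

import math


def near_preset_point(vehicle_position, preset_points, threshold_m=100.0):
    """Sketch: return True when the distance to any preset station or
    intersection is smaller than a preset value (planar coordinates are
    assumed; the vehicle heading is ignored for simplicity)."""
    vx, vy = vehicle_position
    return any(math.hypot(px - vx, py - vy) < threshold_m
               for px, py in preset_points)


# Example: a station 80 m ahead triggers the takeover request.
if near_preset_point((0.0, 0.0), [(0.0, 80.0)]):
    print("send takeover request to cloud server")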
In another optional implementation, the perception information reported by the vehicle may include external environment information. In this embodiment, the in-vehicle terminal may send a takeover request to the cloud server when the external environment information of the vehicle meets a certain condition.
Specifically, if it is determined according to the external environment information that the driving environment is in any one of the following states, it is determined that the takeover condition is met: a traffic jam, a vehicle suddenly approaching from the side, an obstacle in front of the vehicle, or a traffic accident in front of the vehicle.
Further, the vehicle-mounted terminal can determine, according to the external environment information, whether the vehicle is in a traffic jam, whether a vehicle is suddenly approaching from the side, whether there is an obstacle in front of the vehicle, whether there is a traffic accident in front of the vehicle, and the like. If the vehicle is in any of these states, the vehicle is considered trapped and the cloud server is required to take over the vehicle.
In the embodiment, the vehicle-mounted terminal can send the takeover request to the cloud server under the condition that the surrounding environment of the vehicle is complex, so that the cloud server controls the vehicle, and the performance of the automatic driving vehicle is improved.
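By way of illustration only, such a state-based check could take the following minimal Python form; the state labels and the structure of the external environment information are assumptions for the example and are not defined by this disclosure.

TRAPPED_STATES = {"traffic_jam", "vehicle_cut_in", "obstacle_ahead", "accident_ahead"}


def external_takeover_condition_met(external_environment):
    """Sketch: the external environment information is assumed to carry a set
    of detected state labels; any trapped state triggers a takeover request."""
    return bool(TRAPPED_STATES & set(external_environment.get("states", ())))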
In actual application, the vehicle-mounted terminal can acquire roadside data sent by roadside equipment and sensor data collected by sensors, and determine the external environment information according to the roadside data and the sensor data.
Roadside devices are devices deployed along the road that can communicate with the vehicle-mounted terminal and send roadside data to it, for example information about vehicles around the road or traffic light information ahead.
The sensors arranged on the vehicle also collect sensor data, which the vehicle-mounted terminal can acquire, such as point cloud data collected by a radar, images collected by a camera, and position information collected by a positioning system.
Specifically, the vehicle-mounted terminal may determine external environment information of the vehicle by combining the roadside data and the sensor data.
In one embodiment, the vehicle-mounted terminal may filter and fuse the roadside data, and then determine the external environment information of the vehicle according to the processed roadside data and the sensor data, including, for example, geographic location information, obstacles, nearby vehicle conditions, traffic light conditions, pedestrian conditions, objects spilled on the road, and the like.
In this embodiment, the vehicle determines its external environment information by combining the data collected by the roadside equipment with the sensor data, so that the determined external environment information covers a wider range and the automatic driving performance of the vehicle can be improved.
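As an illustrative sketch of this fusion step only, the following Python function merges roadside messages near the vehicle with on-board detections; the field names (position, obstacles, type, and so on) and the 200 m relevance radius are assumptions for the example.

def build_external_environment(roadside_messages, sensor_frame):
    """Sketch: filter roadside messages to those near the vehicle, then merge
    them with on-board sensor detections into one external environment view."""
    ego_x, ego_y = sensor_frame["position"]

    def is_relevant(msg, radius_m=200.0):
        dx = msg["position"][0] - ego_x
        dy = msg["position"][1] - ego_y
        return (dx * dx + dy * dy) ** 0.5 <= radius_m

    relevant = [m for m in roadside_messages if is_relevant(m)]
    return {
        "position": (ego_x, ego_y),
        "obstacles": sensor_frame.get("obstacles", [])
                     + [m for m in relevant if m["type"] == "obstacle"],
        "nearby_vehicles": [m for m in relevant if m["type"] == "vehicle"],
        "traffic_lights": [m for m in relevant if m["type"] == "traffic_light"],
    }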
Step 403, responding to the takeover instruction acting on the takeover key, and sending a takeover request to the cloud server.
Optionally, a take-over button may be provided in the vehicle, for example, a take-over button may be provided next to each seat.
The user can press the takeover key to trigger the vehicle-mounted terminal to send a takeover request to the cloud server. For example, the user can judge from the environment the vehicle is in; when the road environment is complex, the user can operate the key so that the cloud server takes over the vehicle. For example, when there are many vehicles around, the user may press the takeover key.
In the embodiment, the cloud server can take over the vehicle through user operation, so that a passenger can control the cloud server to take over the vehicle according to the actual road environment, and the riding experience of the user is improved.
Step 404, receiving and executing the automatic driving strategy sent by the cloud server.
Step 404 is similar to the implementation of step 203 and will not be described again.
In an optional implementation manner, when the user presses the takeover key so that the vehicle-mounted terminal sends a takeover request to the cloud server, the cloud server may, after receiving the request, generate a driving strategy for controlling the vehicle to pull over. For example, if the vehicle is a minibus, a passenger can press the takeover key before arriving at the destination, so that the cloud server generates a driving strategy for controlling the vehicle to pull over nearby and issues it to the vehicle; when executing this strategy, the vehicle pulls over a short distance ahead.
In this embodiment, the vehicle can stop at any time without stations being set up in advance, which is particularly suitable for application scenarios in which multiple passengers with different destinations are carried.
Fig. 7 is a flowchart illustrating a vehicle control method according to a third example embodiment of the disclosure.
The scheme provided by the present disclosure is applied to the cloud server. The vehicle-mounted terminal arranged in the vehicle can interact with the cloud server, so that after receiving a takeover request, the cloud server can formulate the automatic driving strategy of the vehicle according to the perception information of the vehicle.
As shown in fig. 7, the present disclosure provides a vehicle control method including:
and 701, receiving the sensing information reported by the vehicle, and generating an automatic driving strategy of the vehicle according to the sensing information after receiving a take-over request sent by the vehicle.
The vehicle can acquire perception information through the vehicle-mounted terminal in the driving process, and the perception information can comprise various information, such as image information acquired by cameras inside and outside the vehicle, point cloud data scanned by a radar, positioning results acquired by a positioning system and the like.
The vehicle-mounted terminal can report the perception information to the cloud server in real time or at regular intervals. For example, the vehicle-mounted terminal can report image information to the cloud server in real time and report point cloud data at regular intervals, which can be configured as required. In this way, the cloud server receives the perception information reported by the vehicle.
Furthermore, a takeover condition can be preset, and when the vehicle meets the takeover condition, the vehicle-mounted terminal can be triggered to send a takeover request to the cloud server, so that the cloud server takes over the vehicle and controls the vehicle.
Specifically, after receiving a take-over request sent by the vehicle, the cloud server can generate an automatic driving strategy of the vehicle according to the perception information of the vehicle. For example, the cloud server may store the sensing information reported by each vehicle within a preset time period, and after receiving the takeover request, the cloud server may obtain the sensing information corresponding to the vehicle identifier according to the vehicle identifier, and generate an automatic driving policy according to the sensing information.
In practical application, the automatic driving strategy can comprise various information such as lane change, acceleration, deceleration, braking, parking and the like.
Step 702, transmitting the automatic driving strategy to the vehicle, wherein the automatic driving strategy is executed by the vehicle.
After the cloud server generates the automatic driving strategy, the automatic driving strategy can be issued to the vehicle, so that the vehicle can execute the automatic driving strategy.
In an optional implementation manner, after receiving a takeover request sent by a vehicle, the cloud server may further determine whether to take over the vehicle. For example, a takeover reason may be included in the takeover request, such as because the vehicle is trapped, and such as because the passenger requested takeover. The cloud server may determine whether the vehicle needs to be taken over according to the reason for taking over.
If the computing power of the cloud server side is insufficient, the cloud server can reject the takeover request of the vehicle under some conditions. For example, when the vehicle sends the takeover request due to an accident in front, the cloud server can reject the takeover request if the computing power is insufficient.
The vehicle-mounted terminal may execute the received automatic driving strategy. For example, if the automatic driving strategy includes lane change information, the vehicle-mounted terminal may control the vehicle to change lanes according to the information; if the automatic driving strategy includes acceleration information, the vehicle-mounted terminal may accelerate the vehicle according to the information.
In the scheme provided by the present disclosure, when the vehicle meets a certain condition, the cloud server can take over the vehicle, so that the processing capability of the cloud server is used to handle the complex environment around the vehicle and improve the automatic driving performance of the vehicle.
Fig. 8 is a flowchart illustrating a vehicle control method according to a fourth example embodiment of the disclosure.
The scheme provided by the present disclosure is applied to the cloud server. The vehicle-mounted terminal arranged in the vehicle can interact with the cloud server, so that after receiving a takeover request, the cloud server can formulate the automatic driving strategy of the vehicle according to the perception information of the vehicle.
As shown in fig. 8, the present disclosure provides a vehicle control method including:
Step 801, receiving perception information reported by vehicles, and after receiving a takeover request sent by a vehicle, determining, from the received perception information of a plurality of vehicles, the perception information of surrounding vehicles whose distance from that vehicle is less than a preset value.
After receiving the takeover request sent by one vehicle, the cloud server can, according to the position information of each vehicle, acquire the perception information of the surrounding vehicles whose distance from that vehicle is less than the preset value.
For example, the vehicle a may send a takeover request to the cloud server, and the cloud server may obtain perception information reported by vehicles within a range of 500 meters around the vehicle a.
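As a purely illustrative sketch of this selection step, the following Python function filters the reports of other vehicles by distance to the vehicle that requested the takeover; the report structure is an assumption for the example, and the 500 m radius simply follows the figure used above.

import math


def surrounding_vehicle_reports(all_reports, ego_vehicle_id, radius_m=500.0):
    """Sketch: from the perception reports of all connected vehicles, keep
    those whose reported position lies within a preset radius of the vehicle
    that requested the takeover."""
    ego_x, ego_y = all_reports[ego_vehicle_id]["position"]
    nearby = {}
    for vehicle_id, report in all_reports.items():
        if vehicle_id == ego_vehicle_id:
            continue
        px, py = report["position"]
        if math.hypot(px - ego_x, py - ego_y) <= radius_m:
            nearby[vehicle_id] = report
    return nearby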
Step 802, generating an automatic driving strategy of the vehicle according to the perception information of the surrounding vehicles and the perception information of the vehicle.
Specifically, the cloud server can determine the environment information outside the vehicle according to the perception information of the surrounding vehicles and the perception information of the vehicle, and generate an automatic driving strategy of the vehicle according to the environment information outside the vehicle.
Further, the cloud server can process the perception information of the surrounding vehicles to obtain information about each vehicle in the environment around the vehicle to be taken over, such as the speed and position of each surrounding vehicle.
In actual application, the cloud server can also determine information such as the relative positions and speeds of the vehicle to be taken over and the surrounding vehicles according to the perception information of the vehicle to be taken over and the perception information of the surrounding vehicles. In this way, the environment information outside the vehicle is obtained from the perception information of the surrounding vehicles and the perception information of the vehicle.
The cloud server can formulate a driving strategy for the vehicle to be taken over according to the environment information outside the vehicle, so that the vehicle avoids the surrounding vehicles.
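Purely as an illustration of this cloud-side planning step, the following Python sketch derives relative positions of the surrounding vehicles and then makes a very simple speed decision; the report fields, the safety gap, and the chosen speeds are assumptions for the example, and a real planner would be far more elaborate.

def plan_for_trapped_vehicle(ego_report, surrounding_reports, safety_gap_m=10.0):
    """Sketch: derive relative positions of surrounding vehicles, then choose
    a strategy that keeps a safety gap to all of them."""
    ego_x, ego_y = ego_report["position"]
    too_close = False
    for report in surrounding_reports.values():
        dx = report["position"][0] - ego_x
        dy = report["position"][1] - ego_y
        if (dx * dx + dy * dy) ** 0.5 < safety_gap_m:
            too_close = True
            break

    # Slow down while any surrounding vehicle is inside the safety gap,
    # otherwise proceed at a moderate speed.
    target = 1.0 if too_close else 5.0
    return {"actions": [{"type": "set_speed", "target_speed_mps": target}]}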
Step 803, transmitting the automatic driving strategy to the vehicle, wherein the automatic driving strategy is executed by the vehicle.
Step 803 is similar to step 702 and will not be described again.
Fig. 9 is a schematic configuration diagram of a vehicle control device shown in the first example embodiment of the disclosure.
As shown in fig. 9, the present disclosure provides a vehicle control apparatus 900 including:
the reporting unit 910 is configured to obtain perception information of a vehicle during a driving process, and report the perception information to a cloud server;
a requesting unit 920, configured to send a takeover request to the cloud server when the vehicle meets a takeover condition, where the takeover request is used to instruct to generate an automatic driving policy of the vehicle according to the sensing information;
and the execution unit 930 is configured to receive and execute the automatic driving policy sent by the cloud server.
The vehicle control device provided by the present disclosure is similar to the embodiment shown in fig. 2, and is not described again.
Fig. 10 is a schematic configuration diagram of a vehicle control device shown in a second example embodiment of the disclosure.
As shown in fig. 10, in the vehicle control apparatus 1000 provided by the present disclosure, the reporting unit 1010 is similar to the reporting unit 910 shown in fig. 9, the requesting unit 1020 is similar to the requesting unit 920 shown in fig. 9, and the executing unit 1030 is similar to the executing unit 930 shown in fig. 9.
As shown in fig. 10, in a vehicle control device 1000 provided by the present disclosure, a request unit 1020 includes:
a first reporting module 1021, configured to send the takeover request to the cloud server if it is determined that the takeover condition is met according to the sensing information.
Wherein, the perception information comprises positioning information; the first reporting module 1021 is specifically configured to:
and if the distance between the vehicle and a preset station and/or intersection in front of the vehicle is determined to be less than a preset value according to the positioning information, determining that a take-over condition is met.
Wherein the perception information comprises external environment information;
the first reporting module 1021 is specifically configured to:
if the running environment meets any one of the following states according to the external environment information, determining that the taking over condition is met:
traffic jam, sudden traffic coming from the side, obstacles in front of the vehicle and traffic accidents in front of the vehicle.
The apparatus further comprises a fusion unit 1040, configured to:
the method comprises the steps of obtaining roadside data sent by roadside equipment and sensor data collected by a sensor;
and determining the external environment information according to the roadside data and the sensor data.
The requesting unit 1020 includes:
the second reporting module 1022 is configured to send a takeover request to the cloud server in response to a takeover instruction acting on a takeover key.
The automatic driving strategy received by the execution unit 1030 is a driving strategy for pulling over.
The vehicle control device provided by the present disclosure is similar to the embodiment shown in fig. 4, and is not described again.
Fig. 11 is a schematic configuration diagram of a vehicle control device shown in a third example embodiment of the disclosure.
As shown in fig. 11, the present disclosure provides a vehicle control apparatus 1100 including:
the strategy making unit 1110 is configured to receive perception information reported by a vehicle, and generate an automatic driving strategy of the vehicle according to the perception information after receiving a takeover request sent by the vehicle;
a transmitting unit 1120, configured to transmit the automatic driving strategy to the vehicle, wherein the automatic driving strategy is executed by the vehicle.
The vehicle control device provided by the present disclosure is similar to the embodiment shown in fig. 7, and is not described again.
Fig. 12 is a schematic configuration diagram of a vehicle control device shown in a fourth example embodiment of the disclosure.
As shown in fig. 12, the present disclosure provides a vehicle control apparatus 1200, in which the policy making unit 1210 is similar to the policy making unit 1110 shown in fig. 11, and the transmitting unit 1220 is similar to the transmitting unit 1120 shown in fig. 11.
In the vehicle control apparatus provided by the present disclosure, the policy making unit 1210 includes:
the information acquisition module 1221 is configured to determine, from the received perception information of multiple vehicles, perception information of a peripheral vehicle that is less than a preset value away from the vehicle;
a generating module 1222, configured to generate an automatic driving strategy of the vehicle according to the perception information of the surrounding vehicle and the perception information of the vehicle.
The generating module 1222 is specifically configured to:
determining environmental information outside the vehicle according to the perception information of the surrounding vehicle and the perception information of the vehicle;
and generating an automatic driving strategy of the vehicle according to the environment information outside the vehicle.
The present disclosure provides a vehicle control method, apparatus, device, storage medium, and program product, which are applied to automatic driving technology and intelligent transportation technology in the field of artificial intelligence, so as to solve the problem in the prior art that the processing capability of an autonomous vehicle for complex road condition scenes is relatively weak.
In the technical solution of the present disclosure, the acquisition, storage, and application of the personal information of the users involved comply with the provisions of relevant laws and regulations and do not violate public order and good customs.
The present disclosure also provides an electronic device, a readable storage medium, and a computer program product according to embodiments of the present disclosure.
According to an embodiment of the present disclosure, the present disclosure also provides a computer program product comprising: a computer program, stored in a readable storage medium, from which at least one processor of the electronic device can read the computer program, the at least one processor executing the computer program causing the electronic device to perform the solution provided by any of the embodiments described above.
Fig. 13 illustrates a schematic block diagram of an example electronic device 1300 that can be used to implement embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be examples only, and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 13, the device 1300 includes a computing unit 1301 that can perform various appropriate actions and processes according to a computer program stored in a Read Only Memory (ROM) 1302 or a computer program loaded from a storage unit 1308 into a Random Access Memory (RAM) 1303. In the RAM 1303, various programs and data necessary for the operation of the device 1300 can also be stored. The computing unit 1301, the ROM 1302, and the RAM 1303 are connected to each other via a bus 1304. An input/output (I/O) interface 1305 is also connected to bus 1304.
A number of components in the device 1300 connect to the I/O interface 1305, including: an input unit 1306 such as a keyboard, a mouse, or the like; an output unit 1307 such as various types of displays, speakers, and the like; storage unit 1308, such as a magnetic disk, optical disk, or the like; and a communication unit 1309 such as a network card, modem, wireless communication transceiver, etc. The communication unit 1309 allows the device 1300 to exchange information/data with other devices through a computer network such as the internet and/or various telecommunication networks.
Computing unit 1301 may be a variety of general and/or special purpose processing components with processing and computing capabilities. Some examples of computing unit 1301 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, and so forth. The calculation unit 1301 executes the respective methods and processes described above, such as the vehicle control method. For example, in some embodiments, the vehicle control method may be implemented as a computer software program tangibly embodied on a machine-readable medium, such as the storage unit 1308. In some embodiments, some or all of the computer program may be loaded onto and/or installed onto device 1300 via ROM 1302 and/or communications unit 1309. When the computer program is loaded into the RAM 1303 and executed by the computing unit 1301, one or more steps of the vehicle control method described above may be performed. Alternatively, in other embodiments, the computing unit 1301 may be configured to perform the vehicle control method in any other suitable manner (e.g., by means of firmware).
Various implementations of the systems and techniques described here may be implemented in digital electronic circuitry, integrated circuitry, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program codes, when executed by the processor or controller, cause the functions/operations specified in the flowchart and/or block diagram to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and a server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or cloud host, which is a host product in a cloud computing service system and overcomes the drawbacks of difficult management and weak service scalability found in traditional physical hosts and Virtual Private Server (VPS) services. The server may also be a server of a distributed system, or a server incorporating a blockchain.
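On the cloud server side, the strategy generation recited in claims 8 to 10 might look roughly like the following Python sketch: the server keeps the latest perception report of each vehicle, selects the reports of vehicles within a preset distance of the requesting vehicle, merges them into an external environment view, and derives an automatic driving strategy. The data shapes and the deliberately simple decision rule are assumptions for illustration only.

import math

SURROUNDING_RADIUS_M = 100.0          # hypothetical preset value for "surrounding vehicles"
latest_reports: dict = {}             # latest perception report keyed by vehicle id

def surrounding_vehicles(vehicle_id: str) -> list:
    # Select reports from vehicles whose distance to the requesting vehicle is below the preset value.
    own = latest_reports[vehicle_id]
    return [report for vid, report in latest_reports.items()
            if vid != vehicle_id
            and math.dist(own["position"], report["position"]) < SURROUNDING_RADIUS_M]

def generate_strategy(vehicle_id: str) -> dict:
    # Merge the vehicle's own report with those of surrounding vehicles and derive a strategy.
    own = latest_reports[vehicle_id]
    neighbours = surrounding_vehicles(vehicle_id)
    traffic_jam = own["traffic_jam"] or any(r["traffic_jam"] for r in neighbours)
    if own["obstacle_ahead"]:
        return {"action": "pull_over"}                         # cf. the roadside-parking strategy
    if traffic_jam:
        return {"action": "slow_down", "target_speed_kmh": 20}
    return {"action": "keep_lane"}

A production system would replace the in-memory dictionary with persistent storage and a far richer planning step; the sketch only mirrors the filter-then-generate structure of the claims.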
It should be understood that the various forms of flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present disclosure may be executed in parallel, sequentially, or in a different order, as long as the desired results of the technical solutions disclosed in the present disclosure can be achieved, and no limitation is imposed herein.
The above detailed description should not be construed as limiting the scope of the disclosure. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present disclosure should be included in the scope of protection of the present disclosure.

Claims (23)

1. A vehicle control method comprising:
acquiring perception information of a vehicle during driving, and reporting the perception information to a cloud server;
sending a takeover request to the cloud server when the vehicle meets a takeover condition, wherein the takeover request is used for indicating that an automatic driving strategy of the vehicle is to be generated according to the perception information;
and receiving and executing the automatic driving strategy sent by the cloud server.
2. The method of claim 1, wherein sending a takeover request to the cloud server when the vehicle satisfies a takeover condition comprises:
and if it is determined, according to the perception information, that the takeover condition is met, sending the takeover request to the cloud server.
3. The method of claim 2, wherein the perception information includes positioning information;
and if it is determined, according to the positioning information, that the distance between the vehicle and a preset station and/or intersection in front of the vehicle is less than a preset value, determining that the takeover condition is met.
4. The method according to claim 2 or 3, wherein the perception information comprises external environment information;
if it is determined, according to the external environment information, that the driving environment is in any one of the following states, determining that the takeover condition is met:
a traffic jam, a vehicle suddenly approaching from the side, an obstacle in front of the vehicle, or a traffic accident in front of the vehicle.
5. The method of claim 4, further comprising:
acquiring roadside data sent by a roadside device and sensor data collected by a sensor;
and determining the external environment information according to the roadside data and the sensor data.
6. The method of any one of claims 1-5, wherein sending a takeover request to the cloud server when the vehicle satisfies a takeover condition comprises:
and in response to a takeover instruction acting on a takeover button, sending the takeover request to the cloud server.
7. The method of claim 6, further comprising:
the received automatic driving strategy is a driving strategy for pulling over to the roadside.
8. A vehicle control method comprising:
receiving perception information reported by a vehicle, and generating an automatic driving strategy of the vehicle according to the perception information after receiving a takeover request sent by the vehicle;
and transmitting the automatic driving strategy to the vehicle, wherein the automatic driving strategy is executed by the vehicle.
9. The method of claim 8, wherein the generating an autonomous driving maneuver of the vehicle according to the perception information comprises:
determining, from received perception information of a plurality of vehicles, perception information of surrounding vehicles whose distance from the vehicle is less than a preset value;
and generating an automatic driving strategy of the vehicle according to the perception information of the surrounding vehicle and the perception information of the vehicle.
10. The method of claim 9, wherein generating an autonomous driving maneuver for the vehicle based on the perception information of the nearby vehicle and the perception information of the vehicle comprises:
determining environmental information outside the vehicle according to the perception information of the surrounding vehicle and the perception information of the vehicle;
and generating an automatic driving strategy of the vehicle according to the environment information outside the vehicle.
11. A vehicle control apparatus comprising:
the reporting unit is used for acquiring the perception information of the vehicle during driving and reporting the perception information to the cloud server;
the request unit is used for sending a takeover request to the cloud server when the vehicle meets a takeover condition, wherein the takeover request is used for indicating that an automatic driving strategy of the vehicle is to be generated according to the perception information;
and the execution unit is used for receiving and executing the automatic driving strategy sent by the cloud server.
12. The apparatus of claim 11, wherein the request unit comprises:
and the first reporting module is used for sending the takeover request to the cloud server if it is determined, according to the perception information, that the takeover condition is met.
13. The apparatus of claim 12, wherein the perception information comprises positioning information; the first reporting module is specifically configured to:
and if it is determined, according to the positioning information, that the distance between the vehicle and a preset station and/or intersection in front of the vehicle is less than a preset value, determining that the takeover condition is met.
14. The apparatus of claim 12 or 13,
the first reporting module is specifically configured to:
if it is determined, according to the external environment information, that the driving environment is in any one of the following states, determining that the takeover condition is met:
a traffic jam, a vehicle suddenly approaching from the side, an obstacle in front of the vehicle, or a traffic accident in front of the vehicle.
15. The apparatus of claim 14, further comprising a fusion unit to:
acquire roadside data sent by a roadside device and sensor data collected by a sensor;
and determine the external environment information according to the roadside data and the sensor data.
16. The apparatus according to any one of claims 11-15, wherein the request unit comprises:
and the second reporting module is used for responding to a takeover instruction acting on a takeover button and sending the takeover request to the cloud server.
17. The apparatus of claim 16, wherein the automatic driving strategy received by the execution unit is a driving strategy for pulling over to the roadside.
18. A vehicle control apparatus comprising:
the strategy making unit is used for receiving perception information reported by a vehicle and generating an automatic driving strategy of the vehicle according to the perception information after receiving a takeover request sent by the vehicle;
and the transmitting unit is used for transmitting the automatic driving strategy to the vehicle, wherein the automatic driving strategy is executed by the vehicle.
19. The apparatus of claim 18, wherein the strategy making unit comprises:
the information acquisition module is used for determining, from the received perception information of a plurality of vehicles, perception information of surrounding vehicles whose distance from the vehicle is less than a preset value;
and the generating module is used for generating an automatic driving strategy of the vehicle according to the perception information of the surrounding vehicle and the perception information of the vehicle.
20. The apparatus of claim 19, wherein the generation module is specifically configured to:
determining environmental information outside the vehicle according to the perception information of the surrounding vehicle and the perception information of the vehicle;
and generating an automatic driving strategy of the vehicle according to the environment information outside the vehicle.
21. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-10.
22. A non-transitory computer readable storage medium having stored thereon computer instructions for causing a computer to perform the method of any one of claims 1-10.
23. A computer program product comprising a computer program which, when executed by a processor, implements the method according to any one of claims 1-10.
CN202110894828.6A 2021-08-05 2021-08-05 Vehicle control method, device, equipment, storage medium and program product Pending CN113655790A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110894828.6A CN113655790A (en) 2021-08-05 2021-08-05 Vehicle control method, device, equipment, storage medium and program product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110894828.6A CN113655790A (en) 2021-08-05 2021-08-05 Vehicle control method, device, equipment, storage medium and program product

Publications (1)

Publication Number Publication Date
CN113655790A true CN113655790A (en) 2021-11-16

Family

ID=78478472

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110894828.6A Pending CN113655790A (en) 2021-08-05 2021-08-05 Vehicle control method, device, equipment, storage medium and program product

Country Status (1)

Country Link
CN (1) CN113655790A (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108462726A (en) * 2017-02-14 2018-08-28 广州市联奥信息科技有限公司 Vehicle assistant drive decision system and device towards unknown situation
CN109756551A (en) * 2018-06-12 2019-05-14 启迪云控(北京)科技有限公司 The intelligent network connection of cloud platform auxiliary drives vehicle emergency call method, system
WO2020133208A1 (en) * 2018-12-28 2020-07-02 驭势科技(北京)有限公司 Control method for self-driving vehicle, and self-driving system
CN110033618A (en) * 2019-04-23 2019-07-19 吉林大学 A kind of vehicle travel control method based on cloud control platform
CN111009147A (en) * 2019-11-25 2020-04-14 奇瑞汽车股份有限公司 Vehicle remote entrusted designated driving system and application method thereof
WO2021134187A1 (en) * 2019-12-30 2021-07-08 深圳元戎启行科技有限公司 Network monitoring-based vehicle control method and apparatus, and computer device
CN111949037A (en) * 2020-08-26 2020-11-17 北京享云智汇科技有限公司 Automatic driving system and method for internet vehicle
CN112622930A (en) * 2020-12-22 2021-04-09 北京百度网讯科技有限公司 Unmanned vehicle driving control method, device and equipment and automatic driving vehicle
CN112687122A (en) * 2020-12-22 2021-04-20 北京百度网讯科技有限公司 Information transmission method, vehicle, cloud end and cockpit in automatic driving process
CN113771874A (en) * 2021-08-02 2021-12-10 北京百度网讯科技有限公司 Control method and device for automatic driving vehicle, electronic equipment and readable storage medium

Similar Documents

Publication Publication Date Title
CN108399792B (en) Unmanned vehicle avoidance method and device and electronic equipment
CN113741485A (en) Control method and device for cooperative automatic driving of vehicle and road, electronic equipment and vehicle
KR20210038852A (en) Method, apparatus, electronic device, computer readable storage medium and computer program for early-warning
CN112700667A (en) Method, apparatus, electronic device, and medium for assisting vehicle driving
US20200271456A1 (en) Method for the generation of a merged free-space map, electronic control device and storage medium
CN108958248A (en) Standby system
CN115092130A (en) Vehicle collision prediction method, device, electronic apparatus, medium, and vehicle
CN114964274A (en) Map updating method, path planning method, device, electronic equipment and medium
CN114212108A (en) Automatic driving method, device, vehicle, storage medium and product
CN114677848B (en) Perception early warning system, method, device and computer program product
CN114283609B (en) Information display method and device for automatic driving vehicle and platform terminal
CN115973190A (en) Decision-making method and device for automatically driving vehicle and electronic equipment
CN114333312A (en) Road traffic environment information display method and device and electronic equipment
CN113655790A (en) Vehicle control method, device, equipment, storage medium and program product
CN114779705A (en) Method, device, electronic equipment and system for controlling automatic driving vehicle
CN114559958A (en) Method and device for determining trapped-person escaping strategy, electronic equipment and storage medium
CN114379547A (en) Brake control method, brake control device, vehicle, electronic device, and storage medium
CN114333381A (en) Data processing method and device for automatic driving vehicle and electronic equipment
CN114299758A (en) Vehicle control method and apparatus, device, medium, and product
CN114228735A (en) Visualization method, device and system for intelligent driving vehicle
CN114379588B (en) Inbound state detection method, apparatus, vehicle, device and storage medium
CN114407916B (en) Vehicle control and model training method and device, vehicle, equipment and storage medium
CN114120651B (en) Method, apparatus, device, medium and product for testing perceived target number
JP7212708B2 (en) Traffic signal control method and device
CN115171412B (en) Method, system and device for displaying running state of vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20211116