CN108614566B - Operation method for parallel driving - Google Patents


Info

Publication number
CN108614566B
Authority
CN
China
Prior art keywords
vehicle
control
data
processor
server
Prior art date
Legal status
Active
Application number
CN201810635557.0A
Other languages
Chinese (zh)
Other versions
CN108614566A (en)
Inventor
张德兆
王肖
霍舒豪
李晓飞
张放
Current Assignee
Beijing Idriverplus Technologies Co Ltd
Original Assignee
Beijing Idriverplus Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Idriverplus Technologies Co Ltd
Priority to CN201810635557.0A
Publication of CN108614566A
Application granted
Publication of CN108614566B
Status: Active

Classifications

    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/024 Using optical obstacle or wall sensors in combination with a laser
    • G05D1/0214 Defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D1/0221 Defining a desired trajectory involving a learning process
    • G05D1/0223 Defining a desired trajectory involving speed control of the vehicle
    • G05D1/0246 Using a video camera in combination with image processing means
    • G05D1/0257 Using a radar
    • G05D1/0276 Using signals provided by a source external to the vehicle
    • G05D1/0278 Using satellite positioning signals, e.g. GPS
    • Y02P90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Optics & Photonics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Traffic Control Systems (AREA)

Abstract

The embodiment of the invention relates to an operation method for parallel driving, which comprises the following steps: a console in the operation control end receives control data input by a user according to first vehicle state data and sends the control data to a processor; the processor parses the control data through a simulation device control model to obtain vehicle control data, which consists of control device information and control parameters, and sends the vehicle control data to a server; the server sends the control device information and control parameters to the vehicle end, which operates according to them; the vehicle end sends feedback data and second vehicle state data to the server according to the control device information and control parameters; and when the feedback data is not empty, the server sends the second vehicle state data to a display in the operation control end, which displays it.

Description

Operation method for parallel driving
Technical Field
The invention relates to the field of automatic driving, in particular to an operation method for parallel driving.
Background
At present, most systems related to parallel driving remain at the theoretical stage. The initial idea of parallel driving took shape in the mid-1990s, and the concept was formally proposed in 2005, when the idea of virtual-real interaction between an artificial system and an actual system was applied to the driving field, forming the prototype of today's parallel driving theory. For automatic driving, the driving algorithm and deep learning are certainly important. However, in order to bring automatic driving products to market and reduce their maintenance burden, automatic driving and simulated driving must be combined to realize parallel driving in the true sense, i.e. seamless switching between an automatic driving mode and a simulated driving mode.
A remote monitoring background for automatic driving is an important node for ensuring safety: in the many traffic emergencies that machine algorithms cannot yet handle, manual intervention is needed and the current events must be recorded. When an unmanned vehicle operates on a normal road, failures caused by force majeure inevitably occur. When the event playback function in the vehicle is insufficient to diagnose the fault, a dedicated person must transport the vehicle back for maintenance, which increases both labor cost and time cost.
Disclosure of Invention
The invention aims to provide an operation method for parallel driving that overcomes the defects of the prior art. By establishing operation control and operation feedback processes between an operation control end and a vehicle end, operators can accurately know, in real time, the surrounding environment of an unmanned vehicle in operation. In special situations, when the unmanned vehicle cannot run autonomously because the automatic driving function has failed, an operator can drive the vehicle back to a maintenance point through the remote control function and video service, instead of requiring dedicated personnel to transport the vehicle back, thereby reducing labor cost and time cost.
In order to achieve the above object, an embodiment of the present invention provides an operation method of parallel driving, including:
a console in an operation control end receives control data input by a user according to first vehicle state data, and sends the control data to a processor in the operation control end;
the processor analyzes the control data through a simulation equipment control model to obtain vehicle control data, and sends the vehicle control data to the server; the vehicle control data includes control device information and control parameters;
the server sends the control equipment information and the control parameters to the vehicle end, so that the vehicle end works according to the control equipment information and the control parameters;
the vehicle end sends feedback data and second vehicle state data to the server according to the control equipment information and the control parameters;
and when the feedback data is not empty, the server sends the second vehicle state data to a display in the operation control end, and the display is used for displaying the second vehicle state data.
Preferably, before the console in the operation control end receives the control data input by the user according to the first vehicle state data, the method further includes:
the server receives first vehicle state data sent by the vehicle end;
a processor in the operation control end acquires first vehicle state data sent by the server according to an acquisition instruction;
and the processor sends the first vehicle state data to a display in the operation control end, so that the display displays the first vehicle state data to a user.
Further preferably, the vehicle state data includes: vehicle environment video data and vehicle status information; the vehicle environment video data includes a plurality of camera position information.
Further preferably, the sending, by the processor, the first vehicle state data to a display in the operation control end, and displaying, by the display, the first vehicle state data to a user specifically includes:
and the processor compresses the vehicle environment video data according to the camera position information to obtain a plurality of compressed vehicle environment video data, and the display is used for displaying the plurality of compressed vehicle environment video data to the user.
Preferably, the console comprises: the device comprises a steering wheel control module, a gear control module, an accelerator control module and a brake control module.
Further preferably, the control data includes: steering wheel angle data generated by the steering wheel control module, gear control data generated by the gear control module, accelerator control data generated by the accelerator control module, and brake control data generated by the brake control module.
Preferably, before the processor analyzes the manipulation data through a simulation device manipulation model, the method further includes:
the processor acquires the control data according to the first time parameter;
when the control data acquired by the processor according to the first time parameter is empty, the processor sends the last vehicle control data to the server;
the server sends the last vehicle control data to the vehicle end, so that the vehicle end works according to the last vehicle control data;
when the control data acquired by the processor according to the second time parameter is empty, the processor sends a braking instruction to the server;
and the server sends the braking instruction to the vehicle end for the vehicle end to work according to the braking instruction.
Preferably, the receiving, by the processor in the operation control terminal, the vehicle state data sent by the server according to the acquisition instruction is specifically:
and the processor in the operation control terminal receives the vehicle state data sent by the server through a first network protocol according to the acquisition instruction.
Preferably, after the vehicle end operates according to the control device information and the control parameter, the method further includes:
the vehicle end obtains environmental laser point data of the vehicle;
when the environmental laser point data does not correspond to the control equipment information and the control parameter, generating the braking instruction to enable the vehicle end to work according to the braking instruction; and generating alarm information and sending the alarm information to the operation control terminal.
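The safety check in this preferred embodiment can be sketched as follows: when the environmental laser-point data conflicts with the commanded motion (e.g. an obstacle lies inside a minimum clearance ahead), the vehicle substitutes a braking instruction and raises alarm information. The geometry, thresholds and field names are illustrative assumptions, not part of the claims.

```python
def safety_check(laser_points, control_device, control_params,
                 min_clear_m=2.0):
    """Return (command, alarm) given laser points [(x, y), ...] in the
    vehicle frame (x forward, metres) and the pending control command."""
    moving_forward = (control_device == "throttle"
                      and control_params.get("value", 0) > 0)
    # Nearest obstacle in a corridor directly ahead of the vehicle.
    ahead = [x for x, y in laser_points if x > 0 and abs(y) < 1.0]
    nearest = min(ahead) if ahead else float("inf")
    if moving_forward and nearest < min_clear_m:
        # Command conflicts with the environment: brake and alarm.
        brake = ("brake", {"value": 1.0})
        alarm = {"type": "obstacle_conflict", "distance_m": nearest}
        return brake, alarm
    return (control_device, control_params), None
```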
According to the parallel driving operation method provided by the embodiment of the invention, by establishing operation control and operation feedback processes between the operation control end and the vehicle end, operators can accurately know, in real time, the surrounding environment of an unmanned vehicle in operation. In special situations, when the unmanned vehicle cannot run autonomously because the automatic driving function has failed, an operator can drive the vehicle back to a maintenance point over the network through the remote control function and video service, instead of requiring dedicated personnel to transport it back, thereby reducing labor cost and time cost.
Drawings
Fig. 1 is a flowchart of an operation method of parallel driving according to an embodiment of the present invention.
Detailed Description
The technical solution of the present invention is further described in detail by the accompanying drawings and embodiments.
In order to better understand the execution flow of the parallel driving control method proposed by the present invention, the concept of parallel driving will be explained first.
Parallel driving combines automatic driving and simulated driving, and true parallel driving is realized only when the system can switch safely and stably between the automatic driving mode and the simulated driving mode. The automatic driving mode can be understood as a driving mode in which the unmanned vehicle drives automatically and safely along a required trajectory without any active human operation, relying on the cooperation of artificial intelligence, visual computing, radar, monitoring devices and a global positioning system. The simulated driving mode can be understood as a driving mode in which a user controls the unmanned vehicle to travel along a desired trajectory through a virtual driving device in a virtual driving environment.
According to the parallel driving operation method provided by the embodiment of the invention, data from the operation control end is collected, processed, and sent to the unmanned vehicle, and the unmanned vehicle feeds operation data back to the operation control end, thereby realizing remote control of the unmanned vehicle. The flow of the method is shown in figure 1, and the method comprises the following steps:
step 110, a processing unit in a vehicle end acquires vehicle state data through a sensing unit;
in particular, the vehicle end may be understood as an unmanned vehicle comprising a processing unit, a sensing unit and a vehicle control unit.
The processing unit can be understood as the brain of the unmanned vehicle and is mainly used for processing and calculating various instructions to obtain various control parameters.
The sensing unit acquires the vehicle state data of the vehicle. It can be understood as a unit that monitors the vehicle and the surrounding driving environment in real time by fusing multiple sensors, providing detailed and accurate data for automatic driving. The sensing unit comprises an environment sensing module for acquiring vehicle environment video data representing the driving environment around the vehicle, and a vehicle state sensing module for acquiring vehicle state information representing the state of the vehicle itself. The environment sensing module comprises a plurality of camera devices; each camera device monitors a different direction around the vehicle, so the vehicle environment video data it produces represents a different orientation, and the video data for each orientation corresponds to the position information of one camera. This makes it possible to distinguish which camera device the vehicle environment video data comes from. The vehicle state sensing module includes, but is not limited to, one or more of a thermometer, a wheel speed meter, a fuel level sensor, a remaining charge sensor, a battery state-of-health sensor, a wear sensor and a tire pressure sensor.
The vehicle control unit may be understood as a unit for controlling the operation of the unmanned vehicle, including controlling the vehicle traveling direction and traveling speed, and the like.
It should be noted that, the units included in the vehicle end are not limited to the processing unit, the sensing unit and the vehicle control unit, and those skilled in the art may set other units or components included in the vehicle end according to needs.
In some preferred embodiments, four cameras are respectively arranged at the front, the rear, the left and the right of the vehicle, and the monitoring angle formed by the four cameras is 360 degrees.
In some preferred embodiments, the environment sensing module includes, in addition to the plurality of camera devices, one or more of, but not limited to, a GPS device, inertial navigation, lidar, and millimeter-wave radar; the GPS device is used for acquiring longitude and latitude information of the vehicle body so as to position the vehicle body; the inertial navigation estimates the motion attitude of the vehicle according to the acceleration information of six degrees of freedom of the vehicle body and corrects the positioning information; the laser radar is used for detecting lane line edges, obstacle information, vehicles and pedestrians; the millimeter wave radar is used for detecting vehicles on the structured road; cameras are used to detect traffic lights, traffic signs, obstacle information, vehicles, pedestrians, and other objects that cannot be accurately identified by sensors.
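The vehicle state data assembled by the sensing unit might be laid out as in the following sketch: per-camera environment video tagged with camera position, plus vehicle state readings, alongside the vehicle ID. The class and field names are illustrative assumptions, not defined by the patent.

```python
from dataclasses import dataclass, field

@dataclass
class CameraFrame:
    position: str        # e.g. "front", "rear", "left", "right"
    frame: bytes         # one encoded video frame from that camera

@dataclass
class VehicleStateData:
    vehicle_id: str
    video: list = field(default_factory=list)   # list of CameraFrame
    state: dict = field(default_factory=dict)   # wheel speed, tire pressure, ...

# Example: the sensing unit tags each frame with its camera position,
# so the operation control end can tell which orientation it shows.
state = VehicleStateData(vehicle_id="AV-001")
state.video.append(CameraFrame(position="front", frame=b"\x00"))
state.state["wheel_speed_kmh"] = 12.5
```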
Step 120, sending the vehicle environment video data, the vehicle state information and the vehicle ID information to a server;
Specifically, the processing unit sends the vehicle environment video data, the vehicle state information and the vehicle ID information to the server through the first network protocol according to the acquisition instruction. The vehicle ID information can be understood as unique identification information for the vehicle. The first network protocol is the WebSocket protocol.
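The upload in step 120 might package the three pieces of data into a single message for the WebSocket channel, as in this sketch. The JSON schema and field names are assumptions; the patent only specifies WebSocket as the transport protocol.

```python
import json

def build_upload_message(vehicle_id, video_by_camera, state_info):
    """Package vehicle ID, per-camera video references and state
    readings into one payload for the WebSocket channel."""
    return json.dumps({
        "vehicle_id": vehicle_id,
        "video": video_by_camera,   # e.g. {"front": "<frame ref>", ...}
        "state": state_info,        # e.g. {"battery_pct": 87, ...}
    })

msg = build_upload_message("AV-001",
                           {"front": "frame-0001"},
                           {"battery_pct": 87})
```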
Step 130, the server sends the vehicle environment video data and the vehicle state information to the operation control end through a first network protocol, so that the operation control end displays the vehicle data state;
specifically, the operation control terminal may be understood as a simulation console for the user to perform control of the unmanned vehicle. The operation control end comprises a display, a console and a processor. The operating platform comprises but is not limited to a steering wheel control module, a gear control module, an accelerator control module and a brake control module.
The processor in the operation control end receives, according to the acquisition instruction, the first vehicle state data sent by the server, composed of first vehicle environment video data and first vehicle state information, and sends it to the display in the operation control end so that the display can present it to the user. The first vehicle state data can be understood as the vehicle state data before the vehicle is remotely operated; the first vehicle environment video data as the data of the driving environment around the vehicle before remote operation; and the first vehicle state information as the vehicle's own state information before remote operation.
In some preferred embodiments, since the definition requirements for videos in different orientations generally differ, before the processor in the operation control end sends the first vehicle state data to the display, it compresses the vehicle environment video data according to the position information of each camera to obtain a plurality of compressed vehicle environment video data, which the display shows to the user. This process can be understood as selecting, for each camera according to its orientation, the image resolution and frame rate at which its video is transmitted, and compressing the video accordingly. In general, the image resolution of the vehicle environment video data whose camera position is "front" is higher than that of the other vehicle environment video data.
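The per-camera compression policy above can be sketched as a simple profile lookup: the front view keeps a higher resolution and frame rate than the other views. The concrete resolutions and frame rates are illustrative choices, not values from the patent.

```python
def compression_profile(camera_position):
    """Select the transmitted resolution and frame rate for a camera
    based on its position on the vehicle (values are illustrative)."""
    if camera_position == "front":
        # Forward view is most safety-critical: keep more detail.
        return {"resolution": (1280, 720), "fps": 25}
    # Rear, left and right views can be compressed more aggressively.
    return {"resolution": (640, 360), "fps": 10}
```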
In some preferred embodiments, the processor may acquire the vehicle environment video data with stutter or delay due to network congestion or other factors, so that for a short period the processor obtains no video data. Two cases are handled. First, the processor acquires the vehicle environment video data according to a first time parameter. When the data acquired within the first time parameter is not empty, the remote control channel is unobstructed, and the currently acquired video data is displayed. When the data acquired within the first time parameter is empty, the remote control channel is lagging, and the processor continues to display the previous image of the vehicle environment video data. Second, when the video data acquired within a second time parameter is still empty, the remote control channel is blocked; the processor of the operation control end then sends a braking instruction through the server to the processing unit in the vehicle end, which forwards it to the vehicle control unit so that the vehicle control unit acts on the braking instruction. The time represented by the second time parameter is greater than that represented by the first time parameter.
This process can be understood as follows: if the processor obtains no vehicle environment video data within a short period, indicating that the remote control channel is not clear, it defaults to displaying the previous video frame. If the processor still obtains no video data over a longer period, the vehicle is brought to an emergency stop to ensure driving safety.
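The two-stage fallback above can be sketched as a simple operator-side watchdog: within the first time parameter the current frame is shown; between the first and second, the previous frame is held; past the second, a braking instruction is issued. The timeout values are illustrative assumptions, not taken from the patent.

```python
FIRST_TIMEOUT_S = 0.5    # hypothetical first time parameter
SECOND_TIMEOUT_S = 2.0   # hypothetical second time parameter (> first)

def video_watchdog(elapsed_since_last_frame_s):
    """Decide the console action given the time since the last frame."""
    if elapsed_since_last_frame_s <= FIRST_TIMEOUT_S:
        return "display_current"          # channel smooth
    if elapsed_since_last_frame_s <= SECOND_TIMEOUT_S:
        return "hold_last_frame"          # lag: keep previous image
    return "send_brake_instruction"       # blocked: emergency stop
```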
Step 140, sending a mode switching instruction to a processing unit in the vehicle end, so that the processing unit works according to the mode switching instruction;
Specifically, the mode switching instruction can be understood as an instruction that switches the vehicle from the automatic driving mode to the simulated driving mode. The user sends the mode switching instruction to the processing unit in the vehicle end through the console or a remote control device, so that the vehicle exits the automatic driving mode according to the instruction, enters the simulated driving mode, and waits for the corresponding control data.
In some preferred embodiments, this step may also be performed before step 110. That is, the mode switching instruction may be sent to the vehicle end either before or after the vehicle state data is acquired. When it is sent before the vehicle state data is acquired, the processing unit of the vehicle end sends the vehicle state data and the vehicle ID information to the server in response to the mode switching instruction. In this case, acquiring the vehicle state occurs after the mode switching instruction is received, and the user does not need to see the current vehicle state before switching to the simulated driving mode. However, once the user has switched to the simulated driving mode, the user must operate according to the current vehicle state data, so the processing unit must send the vehicle state data for each orientation and the vehicle ID information to the server. When the mode switching instruction is sent after the vehicle state data is acquired, the user determines whether to switch to the simulated driving mode according to the current vehicle state data, so acquiring the vehicle state data occurs before the mode switching instruction is received.
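The mode switch handling described above can be sketched as follows: on receiving the switch instruction the vehicle leaves the automatic driving mode, enters the simulated driving mode, and begins uploading its state data to the server. The mode names and the handler shape are assumptions for illustration.

```python
class VehicleModeHandler:
    def __init__(self):
        self.mode = "automatic"        # vehicle starts in automatic driving
        self.uploading_state = False   # state upload to server not started

    def on_instruction(self, instruction):
        if instruction == "switch_to_simulated":
            # Exit automatic driving and wait for remote control data.
            self.mode = "simulated"
            # Steps 110/120 may run after the switch: start streaming
            # vehicle state data and vehicle ID to the server.
            self.uploading_state = True

v = VehicleModeHandler()
v.on_instruction("switch_to_simulated")
```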
Step 150, the console receives control data input by a user according to the first vehicle state data, and sends the control data to the processor;
Specifically, by observing the current driving state of the vehicle shown on the display, the user inputs the control data to the operation control end using the console. After receiving the control data, the console sends it to the processor of the operation control end. The control data includes, but is not limited to, steering wheel angle data generated by the steering wheel control module, gear control data generated by the gear control module, accelerator control data generated by the accelerator control module, and brake control data generated by the brake control module.
In some preferred embodiments, the steering wheel control module, the gear control module, the accelerator control module and the brake control module in the console can be integrated into a handle or a keyboard, so that the operation of a user is facilitated.
Step 160, the processor analyzes the control data to obtain vehicle control data, and sends the vehicle control data to the server;
Specifically, the processor stores a simulation device control model, which parses the control data generated by the console into vehicle control data that the vehicle end can recognize. The vehicle control data includes control device information and control parameters. The control device information can be understood as identifying which device in the vehicle end the current manipulation targets, and the control parameters as the parameters applied to that device for the current manipulation.
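An illustrative version of the simulation device control model: it maps raw console readings to (control device, control parameter) pairs the vehicle end can act on. The device names, field names and scaling factor are assumptions, not defined in the patent.

```python
def parse_control_data(control_data):
    """control_data: raw console readings, e.g. steering wheel angle in
    degrees and pedal positions in [0, 1]. Returns a list of
    (control device, control parameters) pairs."""
    out = []
    if "steering_deg" in control_data:
        # Map the console wheel angle to a road-wheel angle command
        # (0.06 is an illustrative steering ratio).
        out.append(("steering",
                    {"angle_deg": control_data["steering_deg"] * 0.06}))
    if "throttle" in control_data:
        out.append(("throttle", {"value": control_data["throttle"]}))
    if "brake" in control_data:
        out.append(("brake", {"value": control_data["brake"]}))
    if "gear" in control_data:
        out.append(("gear", {"position": control_data["gear"]}))
    return out
```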
Step 170, the server sends the control device information and the control parameters to the vehicle end, so that the vehicle end works according to the control device information and the control parameters;
specifically, the processing unit at the vehicle end acquires the control device information and the control parameters from the server according to the first time parameter. When the data acquired according to the first time parameter is not empty, the remote control channel is unblocked: the processing unit analyzes the control device information and the control parameters, determines how the vehicle should run according to the user's remote operation instruction, generates the corresponding parameters, and sends them to the vehicle control unit, so that the vehicle control unit controls the running direction, running speed and the like of the vehicle according to the user's remote operation instruction. When the data acquired according to the first time parameter is empty, the remote control channel is not smooth: data transmission is delayed and the user's data has not yet arrived. In this case the processing unit works according to the previous control device information and control parameters, generates the corresponding parameters, and sends them to the vehicle control unit, so that the vehicle control unit keeps controlling the vehicle's running direction, running speed and the like according to the user's previous remote operation instruction. When the control device information and the control parameters acquired according to the second time parameter are still empty, the remote control channel is not smooth and data transmission is seriously delayed; the processing unit therefore sends a braking instruction to the vehicle control unit in the vehicle, so that the vehicle control unit works according to the braking instruction.
The second time parameter represents a time later than that represented by the first time parameter. This process may be understood as follows: if the processing unit does not receive the control device information and control parameters within a short period of time, indicating that the remote control channel is not smooth, the vehicle defaults to executing the previous control device information and control parameters, so that the vehicle keeps its current state. If the processing unit still receives nothing within a longer period of time, the vehicle brakes to a stop to ensure driving safety.
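The two-stage timeout just described can be sketched as a small selection function run each control cycle at the vehicle end. The threshold value and the command format are illustrative assumptions:

```python
# Hypothetical sketch of the two-stage timeout: a fresh command is executed
# immediately; an empty fetch within the grace window repeats the last
# command; an empty fetch beyond the (assumed) second time parameter
# triggers an emergency brake. Threshold values are illustrative.

def select_command(last_command, fetched, elapsed, t2=1.0):
    """Choose what the vehicle control unit should execute now.

    fetched: command acquired from the server this cycle, or None if empty.
    elapsed: seconds since the last non-empty command arrived.
    t2: assumed second time parameter, beyond which the vehicle must brake.
    """
    if fetched is not None:
        return fetched                              # channel unblocked
    if elapsed < t2:
        return last_command                         # brief delay: hold state
    return {"device": "brake", "param": 100.0}      # severe delay: stop
```

The design choice here is conservative: a brief gap in the channel preserves the current motion, while a prolonged gap always degrades to a full stop rather than guessing at the operator's intent.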
In some preferred embodiments, there are special situations that require attention when a user remotely controls the vehicle through the operation control end. When the vehicle environment picture shown on the display of the operation control end is still, there are two possible causes: the user has parked the vehicle through the console, or the processor has failed to acquire the environment video data because the remote control channel is not smooth. That is, when the user operates the vehicle to park through the console, a still vehicle environment picture on the display does not necessarily mean that parking has succeeded. If success were judged only from the before-and-after frames of the environment picture shown on the display, misjudgment could occur in some extreme cases. Therefore, in this embodiment, whether parking has succeeded is determined by comparing the laser point data collected by the vehicle with the control device information and the control parameters.
Further specifically, the laser radar module in the sensing unit collects laser points from objects in the surrounding environment of the vehicle, so that the sensing unit obtains environmental laser point data. If the vehicle is stationary, the environmental laser point data should also be stationary as a whole; if the vehicle is moving, the environmental laser point data should shift accordingly. In the latter case, even though the vehicle environment picture shown to the user on the display page is still, the vehicle has not actually parked. The processing unit then generates a braking instruction and sends it to the vehicle control unit so that the vehicle parks. The processing unit also generates alarm information and sends it to the display page, so that from the displayed alarm the user learns that the environment video data could not be acquired because the remote control channel was not smooth, that the picture was therefore not refreshed in time, that the parking operation did not actually park the vehicle, and that a relevant inspection is needed.
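One way to realize this stationarity check is to compare consecutive frames of environmental laser points and confirm parking only when a brake command was issued and the scene has stopped moving. The tolerance, the 2-D point format, and the per-index point matching are assumptions for illustration:

```python
import math

# Hypothetical sketch: decide whether the vehicle is actually parked by
# comparing consecutive laser point frames, independently of the (possibly
# stale) video picture. Points are assumed to be (x, y) pairs in the
# vehicle frame, matched by index; the tolerance is illustrative.

def points_stationary(prev_frame, curr_frame, tol=0.05):
    """Return True if the environment point cloud is effectively static."""
    if len(prev_frame) != len(curr_frame) or not prev_frame:
        return False
    total = sum(math.dist(p, q) for p, q in zip(prev_frame, curr_frame))
    return total / len(prev_frame) < tol     # mean displacement in metres

def parking_succeeded(prev_frame, curr_frame, last_command):
    """Parking is confirmed only when the last command was a brake AND
    the surrounding laser points have stopped moving."""
    return (last_command.get("device") == "brake"
            and points_stationary(prev_frame, curr_frame))
```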
In some preferred embodiments, when the vehicle switches from the automatic driving mode to the simulated driving mode according to the mode switching instruction, the vehicle first decelerates or stops and then waits for the control device information and control parameters of simulated driving. This avoids the situation in which, because the vehicle speed is too high, the control data cannot take effect on the vehicle in time, and makes the switching process between modes safer and smoother.
More specifically, the processing unit sends a braking instruction to the vehicle control unit in the vehicle according to the mode switching instruction. The braking instruction may be understood as a controlled braking command. The vehicle control unit first works according to the braking instruction, that is, it decelerates or stops. Then the processing unit receives the control device information and the control parameters and sends them to the vehicle control unit, so that the vehicle control unit works according to the control device information and the control parameters after having executed the braking instruction.
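The brake-first switching sequence can be sketched with a minimal stand-in for the vehicle control unit; the class and method names below are illustrative, not from the patent:

```python
class VehicleControlUnit:
    """Minimal illustrative stand-in for the vehicle control unit."""
    def __init__(self):
        self.log = []                 # commands executed, in order

    def execute(self, cmd):
        self.log.append(cmd)


def switch_to_simulated_driving(vcu, remote_commands):
    # Brake first so the vehicle decelerates or stops before any
    # remote-control command is applied, as described above.
    vcu.execute({"device": "brake", "param": 100.0})
    for cmd in remote_commands:       # then apply queued remote commands
        vcu.execute(cmd)
```

The point of the sketch is the ordering guarantee: the brake command always precedes the first simulated-driving command in the execution log.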
Step 180, the vehicle end sends feedback data and second vehicle state data to a server according to the control equipment information and the control parameters, and a display is used for displaying the second vehicle state data;
in particular, the feedback data may be understood as data indicating whether the vehicle end has responded to the operation at all. When the feedback data received by the server is empty, the vehicle end has not responded to the console operation; the server then generates response failure information and sends it to the operation control end, and the user takes the next action according to the response failure information shown on the display of the operation control end. When the feedback data received by the server is not empty, the server sends the second vehicle state data to the display in the operation control end, which shows it to the user, so that the user can check the state of the vehicle end in real time during operation. Corresponding to the first vehicle state data, the second vehicle state data may be understood as the vehicle state data after the vehicle has been remotely operated; it includes second vehicle environment video data, representing the driving environment around the vehicle after remote operation, and second vehicle state information, representing the state of the vehicle itself after remote operation. This process may be understood as the vehicle end feeding back the operation result to the server and reporting the vehicle state. For the method of feeding the second vehicle state data back from the vehicle end to the operation control end, refer to steps 110 and 130 above.
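The server-side branch on the feedback data might look like the following sketch; the message fields are illustrative assumptions:

```python
# Hypothetical sketch of the server's feedback handling: empty feedback
# means the vehicle end did not respond, so response-failure information
# goes to the operation control end; otherwise the second vehicle state
# data is forwarded to the display. Field names are illustrative.

def handle_feedback(feedback, second_state):
    """Route vehicle feedback at the server."""
    if not feedback:
        return {"type": "response_failure",
                "message": "vehicle end did not respond to the operation"}
    return {"type": "state_update", "state": second_state}
```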
According to the parallel driving operation method provided by the embodiment of the invention, by establishing the operation control and operation feedback processes between the operation control end and the vehicle end, operators can accurately know, in real time, the various surrounding environment states of an unmanned vehicle in operation. In some special environments, when the unmanned vehicle cannot operate autonomously because its automatic driving function has failed, the operators can drive the vehicle back to a maintenance point for repair through the remote control function and video service over the network, instead of requiring dedicated personnel to transport the vehicle back, thereby reducing labor cost and time cost.
Those of skill would further appreciate that the various illustrative components and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, the various illustrative components and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied in hardware, in a software module executed by a user terminal, or in a combination of the two. A software module may reside in random access memory (RAM), flash memory, read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The above-mentioned embodiments are intended to illustrate the objects, technical solutions and advantages of the present invention in further detail, and it should be understood that the above-mentioned embodiments are merely exemplary embodiments of the present invention, and are not intended to limit the scope of the present invention, and any modifications, equivalent substitutions, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (7)

1. An operating method for parallel driving, characterized in that the method comprises:
the method comprises the steps that a console in an operation control end receives control data input by a user according to first vehicle state data, and sends the control data to a processor in the operation control end;
the processor analyzes the control data through a simulation device control model to obtain vehicle control data, and sends the vehicle control data to a server; the vehicle control data includes control device information and control parameters;
the server sends the control device information and the control parameters to the vehicle end, so that the vehicle end works according to the control device information and the control parameters;
the vehicle end sends feedback data and second vehicle state data to the server according to the control device information and the control parameters;
when the feedback data is not empty, the server sends the second vehicle state data to a display in the operation control end, and the display is used for displaying the second vehicle state data;
before the processor analyzes the control data through the simulation device control model, the method further comprises:
the processor acquires the control data according to a first time parameter;
when the control data acquired by the processor according to the first time parameter is empty, the processor sends the last vehicle control data to the server;
the server sends the last vehicle control data to the vehicle end, so that the vehicle end works according to the last vehicle control data;
when the control data acquired by the processor according to the second time parameter is empty, the processor sends a braking instruction to the server;
the server sends the braking instruction to the vehicle end, so that the vehicle end works according to the braking instruction;
after the vehicle end works according to the control device information and the control parameters, the method further comprises:
the vehicle end acquires environmental laser point data of the vehicle;
and after the braking instruction is executed, comparing whether the environmental laser point data collected by the vehicle is consistent with the control device information and the control parameters; if so, parking has succeeded; otherwise, parking has failed, and alarm information is generated and sent to the operation control end.
2. The operating method for parallel driving according to claim 1, wherein before the console in the operation control end receives the control data input by the user according to the first vehicle state data, the method further comprises:
the server receives first vehicle state data sent by the vehicle end;
a processor in the operation control end acquires first vehicle state data sent by the server according to an acquisition instruction;
and the processor sends the first vehicle state data to a display in the operation control end, so that the display displays the first vehicle state data to a user.
3. The operating method for parallel driving according to claim 2, wherein the vehicle state data includes: vehicle environment video data and vehicle state information; the vehicle environment video data includes a plurality of pieces of camera position information.
4. The operating method for parallel driving according to claim 3, wherein the processor sends the first vehicle status data to a display in the operation control end, and the display is used for displaying the first vehicle status data to a user specifically as follows:
and the processor compresses the vehicle environment video data according to the camera position information to obtain a plurality of compressed vehicle environment video data, and the display is used for displaying the plurality of compressed vehicle environment video data to the user.
5. The operating method for parallel driving according to claim 1, wherein the console comprises: the device comprises a steering wheel control module, a gear control module, an accelerator control module and a brake control module.
6. The operating method for parallel driving according to claim 5, wherein the control data includes: steering wheel angle data generated by the steering wheel control module, gear control data generated by the gear control module, accelerator control data generated by the accelerator control module, and brake control data generated by the brake control module.
7. The operating method for parallel driving according to claim 1, wherein the processor in the operation control end receiving the vehicle state data sent by the server according to the acquisition instruction specifically includes:
the processor in the operation control end receives the vehicle state data sent by the server through the first network protocol according to the acquisition instruction.
CN201810635557.0A 2018-06-20 2018-06-20 Operation method for parallel driving Active CN108614566B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810635557.0A CN108614566B (en) 2018-06-20 2018-06-20 Operation method for parallel driving

Publications (2)

Publication Number Publication Date
CN108614566A CN108614566A (en) 2018-10-02
CN108614566B true CN108614566B (en) 2022-05-24

Family

ID=63665420

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810635557.0A Active CN108614566B (en) 2018-06-20 2018-06-20 Operation method for parallel driving

Country Status (1)

Country Link
CN (1) CN108614566B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110300285B (en) * 2019-07-17 2021-09-10 北京智行者科技有限公司 Panoramic video acquisition method and system based on unmanned platform
CN110460651A (en) * 2019-07-26 2019-11-15 阿尔法巴人工智能(深圳)有限公司 A kind of 5G remotely drives integrated control system and method
CN110708504A (en) * 2019-09-06 2020-01-17 北京智行者科技有限公司 Data processing method and system based on parallel driving
CN111497835B (en) * 2020-04-24 2022-03-08 北京智行者科技有限公司 Vehicle parallel driving and automatic anti-collision system
CN112147990A (en) * 2020-09-21 2020-12-29 新石器慧义知行智驰(北京)科技有限公司 Vehicle remote monitoring method and device, electronic equipment and readable storage medium
CN112399380A (en) * 2020-10-28 2021-02-23 星火科技技术(深圳)有限责任公司 Communication method, device, equipment and storage medium based on Internet of vehicles
CN112351102A (en) * 2020-11-10 2021-02-09 上海汽车集团股份有限公司 Remote driving method and system
CN112622931A (en) * 2020-12-22 2021-04-09 北京百度网讯科技有限公司 Abnormity processing method in parallel driving, automatic driving vehicle and cloud driving cabin
CN114056352A (en) * 2021-12-24 2022-02-18 上海海积信息科技股份有限公司 Automatic driving control device and vehicle

Citations (10)

Publication number Priority date Publication date Assignee Title
CN102591358A (en) * 2012-03-12 2012-07-18 北京航空航天大学 Multi-UAV (unmanned aerial vehicle) dynamic formation control method
CN105589459A (en) * 2015-05-19 2016-05-18 中国人民解放军国防科学技术大学 Unmanned vehicle semi-autonomous remote control method
CN105610984A (en) * 2016-03-07 2016-05-25 安徽江淮汽车股份有限公司 Method for remotely controlling vehicle
CN107221182A (en) * 2016-03-21 2017-09-29 中国移动通信集团广东有限公司 Method that vehicle termination adheres in car networking, roadway segment equipment, vehicle termination
CN107215332A (en) * 2017-06-14 2017-09-29 深圳市车米云图科技有限公司 A kind of safety driving assist system and control method
CN107589745A (en) * 2017-09-22 2018-01-16 京东方科技集团股份有限公司 Drive manner, vehicle carried driving end, remotely drive end, equipment and storage medium
CN107664957A (en) * 2016-07-28 2018-02-06 比亚迪股份有限公司 Emergency driving method, system and vehicle based on vehicle remote control
CN207115197U (en) * 2017-08-28 2018-03-16 北京华清智能科技有限公司 A kind of automatic Pilot delivery car tele-control system
CN107878460A (en) * 2016-09-30 2018-04-06 Lg电子株式会社 The control method and server of automatic driving vehicle
CN107991944A (en) * 2017-12-29 2018-05-04 河南护航实业股份有限公司 A kind of vehicle remote controls security system

Family Cites Families (14)

Publication number Priority date Publication date Assignee Title
US6396395B1 (en) * 2000-09-11 2002-05-28 Andrew J. Zielinski Programmable vehicle stopping system and process for route learning
DE102012213815A1 (en) * 2011-08-03 2013-02-07 Continental Teves Ag & Co. Ohg Method and system for stopping a motor vehicle
US20130282238A1 (en) * 2011-11-16 2013-10-24 Flextronics Ap, Llc Monitoring state-of-health of processing modules in vehicles
US9564053B2 (en) * 2012-01-12 2017-02-07 Honda Motor Co., Ltd. Synchronized driving assist apparatus and synchronized driving assist system
US9274525B1 (en) * 2012-09-28 2016-03-01 Google Inc. Detecting sensor degradation by actively controlling an autonomous vehicle
KR102011618B1 (en) * 2014-10-30 2019-08-16 미쓰비시덴키 가부시키가이샤 Automatic drive assist system, automatic drive monitoring device, road management device, and automatic drive information collection device
CN106297406B (en) * 2015-05-15 2019-07-05 深圳市金溢科技股份有限公司 The system and method for parking lot induction parking and/or reverse car search
CN107771142B (en) * 2015-07-31 2020-10-27 大陆-特韦斯股份有限公司 Parking brake actuation method for a motor vehicle parking brake system driven by an electric motor
CN105577755A (en) * 2015-12-10 2016-05-11 安徽海聚信息科技有限责任公司 Car networking terminal service system
CN105966395A (en) * 2016-05-24 2016-09-28 北京新能源汽车股份有限公司 Vehicle and parking control method and device thereof
CN106043169A (en) * 2016-07-01 2016-10-26 百度在线网络技术(北京)有限公司 Environment perception device and information acquisition method applicable to environment perception device
CN106297283A (en) * 2016-08-11 2017-01-04 深圳市元征科技股份有限公司 Safe driving appraisal procedure based on vehicle intelligent unit and system
US10384675B2 (en) * 2016-10-17 2019-08-20 GM Global Technology Operations LLC Methods and systems for remote parking assistance
CN106864437A (en) * 2017-03-17 2017-06-20 奇瑞汽车股份有限公司 A kind of emergency brake of vehicle system and its control method


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: B4-006, maker Plaza, 338 East Street, Huilongguan town, Changping District, Beijing 100096

Patentee after: Beijing Idriverplus Technology Co.,Ltd.

Address before: B4-006, maker Plaza, 338 East Street, Huilongguan town, Changping District, Beijing 100096

Patentee before: Beijing Idriverplus Technology Co.,Ltd.
