CN109017814B - Vehicle-mounted human-computer interaction system - Google Patents
Vehicle-mounted human-computer interaction system
- Publication number
- CN109017814B (application CN201810902413.7A)
- Authority
- CN
- China
- Prior art keywords
- vehicle
- information
- control unit
- parking space
- display device
- Prior art date
- Legal status (an assumption, not a legal conclusion; Google has not performed a legal analysis)
- Active
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/45—Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/45—Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
- G01S19/47—Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement the supplementary measurement being an inertial measurement, e.g. tightly coupled inertial
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/146—Display means
Abstract
The invention provides a vehicle-mounted human-computer interaction system, comprising a vehicle control unit and a first display device. The vehicle control unit receives travel task information, calls an environment map file, and generates path planning data. The first display device determines the current path; it also receives parking space information and a user's parking space selection instruction and determines the current parking space information. The vehicle control unit further receives the current path, a vehicle operation mode selection instruction, and environment perception data, and generates steering control information and torque control information; it then generates steering prompt information from the steering control information and target vehicle speed information from the torque control information, and finally generates the vehicle's estimated arrival time information from the current path and the target speed information. It also obtains parking space information from the environment perception data, sends it to the first display device, and generates control information to control the vehicle to park in the selected space. The system thereby improves the user experience.
Description
Technical Field
The invention relates to the technical field of control, in particular to a vehicle-mounted human-computer interaction system.
Background
With the rapid development of computer technology and artificial intelligence, intelligent robot technology has become a research hotspot for numerous scholars at home and abroad.
Classifying the levels of autonomous driving has become important for distinguishing and defining the technology. Currently, the two level schemes recognized by the global automobile industry were proposed by the U.S. National Highway Traffic Safety Administration (NHTSA) and SAE International (the Society of Automotive Engineers), respectively. In these schemes, the L4 and L5 levels are referred to as fully autonomous: at these levels the vehicle can perform all driving operations without any intervention from the driver, who may attend to other matters such as work or rest. The difference between the two is that L4 automation applies only in certain scenarios, usually in cities or on highways, whereas L5 requires that the vehicle be able to drive itself in any scenario.
In the prior art, a display device on a vehicle can display certain current states of the vehicle, such as the vehicle speed, but the vehicle cannot be controlled through the display device.
Disclosure of Invention
The embodiment of the invention aims to provide a vehicle-mounted human-computer interaction system to solve the problem that a vehicle cannot be controlled through a display device in the prior art.
In order to solve the above problems, the present invention provides a vehicle-mounted human-computer interaction system, comprising:
the vehicle control unit is used for receiving travel task information, calling an environment map file according to the travel task information and generating path planning data; wherein the path planning data comprises at least one path information;
the first end of the first display device is connected with the vehicle control unit and used for receiving the path planning data sent by the vehicle control unit and a path selection instruction of a user, determining a current path according to the at least one piece of path information and the path selection instruction, and sending the current path to the vehicle control unit;
the vehicle control unit is further used for receiving the current path sent by the first display device, a vehicle operation mode selection instruction sent by a bottom vehicle controller BVCU and environment perception data sent by a sensing module, generating decision result information according to the current path, the vehicle operation mode selection instruction and the environment perception data, processing the decision result information and generating steering control information and torque control information; then according to the steering control information, generating steering prompt information, and according to the torque control information, generating target vehicle speed information; finally, according to the current path and the target speed information, generating estimated arrival time information of the vehicle; wherein the context awareness data comprises a current location of the vehicle and a current speed of the vehicle;
the vehicle control unit is further used for acquiring parking space information according to the environment sensing data and sending the parking space information to the first display device;
the first display device is further used for receiving the parking space information sent by the vehicle control unit and a parking space selection instruction of a user, determining current parking space information and sending the current parking space information to the vehicle control unit;
the vehicle control unit is further configured to receive the parking space information sent by the first display device, the environment perception data of the parking space sent by the sensing module, and generate control information according to the parking space information and the environment perception data of the parking space, so as to control the vehicle to park in the parking space.
Preferably, the vehicle-mounted human-computer interaction system further comprises a second display device;
the second display device is connected with the vehicle control unit and used for receiving the current path, the steering prompt information, the target vehicle speed information, the estimated arrival time information, the current position and the current speed sent by the vehicle control unit, displaying the current position of the vehicle on the current path, and displaying the steering prompt information, the target vehicle speed information, the estimated arrival time information and the current speed.
Preferably, the first display device is further configured to receive the current path, the current position, and the current speed sent by the vehicle control unit, display the current position of the vehicle on the current path, and display the current speed.
Preferably, the vehicle control unit is further configured to acquire failure information of an electronic control unit ECU;
the first display device is also used for receiving fault information sent by the vehicle control unit and displaying the fault information.
Preferably, the second end of the first display device is connected to the server, and is configured to receive a touch instruction input by a user in an abnormal condition, and send the touch instruction to the server, so that the server adjusts the operating state of the vehicle according to the touch instruction.
Preferably, the vehicle operation mode selection instruction includes:
a first driving mode when the lateral control switch is pressed;
a second driving mode when the longitudinal control switch is pressed;
a third driving mode in which the lateral control switch and the longitudinal control switch are simultaneously pressed; wherein the lateral control switch controls steering of the vehicle and the longitudinal control switch controls speed of the vehicle.
Preferably, the trip task information is sent to the vehicle control unit by a server; or,
and the travel task information is sent to the vehicle control unit by the first display device.
Preferably, the travel task information includes: a departure place, a destination, and a departure time, or a departure place and a destination.
Preferably, the vehicle control unit is specifically adapted to,
receiving travel task information;
sending map calling request information to a server according to the travel task information; wherein the map invocation request information includes: a departure location and a destination;
receiving an environment map file sent by a server;
and generating path planning data according to the environment map file.
Therefore, by applying the vehicle-mounted human-computer interaction system provided by the invention, the control of the vehicle is realized, and the user experience is improved.
Drawings
Fig. 1 is a schematic structural diagram of a vehicle-mounted human-computer interaction system according to an embodiment of the present invention.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be further noted that, for the convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
Hereinafter, the terms "first" and "second" are used merely for distinction and have no other meaning.
Fig. 1 is a schematic structural diagram of a vehicle-mounted human-computer interaction system according to an embodiment of the present invention. The vehicle-mounted human-computer interaction system 100 can be applied to an automatic driving vehicle. As shown in Fig. 1, the vehicle-mounted human-computer interaction system 100 includes: a vehicle control unit 110, a first display device 120, and a second display device 130. In the vehicle commissioning phase, the vehicle control unit 110 may be an industrial personal computer; after the vehicle leaves the factory, the vehicle control unit 110 may be an automatic vehicle control unit (AVCU). The bottom vehicle controller referred to hereinafter is the bottom vehicle control unit (BVCU).
The vehicle control unit 110 is configured to receive the trip task information, and call an environment map file according to the trip task information to generate path planning data; wherein the path planning data comprises at least one piece of path information.
The travel task information may be sent to the vehicle control unit 110 by the server, or may be sent to the vehicle control unit 110 by the first display device 120.
When the travel task information is transmitted from the server to the vehicle control unit 110, the user may select a departure place, a destination, and a travel time through a terminal. By way of example and not limitation, the terminal may be a smart device such as a mobile phone or a tablet, with an application (APP) installed that communicates with the server.
The vehicle control unit 110 is specifically configured to receive the trip task information and send map calling request information, which includes the departure place and the destination, to the server according to the trip task information; to receive the environment map file sent by the server; and to generate path planning data according to the environment map file. The path planning data comprises multiple pieces of path information with different mileages, and the pieces of path information are prioritized according to mileage. Each piece of path information may carry a priority label: for example, with 3 alternative paths, the labels 1, 2, and 3 indicate the priority of the alternatives, where, by way of example and not limitation, the mileage increases from 1 to 3.
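The request-map-then-plan sequence described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; `send_map_request` and `plan_on_map` are hypothetical stand-ins for the server call and the path planner.

```python
def request_and_plan(trip_task, send_map_request, plan_on_map):
    """Sketch of the planning flow: build the map calling request from the
    trip task, fetch the environment map file, then plan candidate paths.

    `send_map_request` and `plan_on_map` are injected callables standing in
    for the server round-trip and the planner (hypothetical names).
    """
    # The map calling request carries only the departure place and destination.
    request = {"departure": trip_task["departure"],
               "destination": trip_task["destination"]}
    env_map = send_map_request(request)    # server returns the environment map file
    return plan_on_map(env_map, request)   # -> list of candidate path records
```

Injecting the server call and planner as parameters keeps the sketch self-contained and testable without a real server.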
Further, the travel task information may include a departure place and a destination, and may additionally include a travel time. The vehicle control unit 110 may perform path planning according to the travel time and the environment map file and generate path planning data; for example, different routes are planned depending on whether the travel time falls in a peak period. According to the time period of the travel time, the multiple pieces of route information may be prioritized: for example, if the travel time is 9 a.m. and there are 3 alternative routes, the routes carry congestion identifiers A, B, and C, where, by way of example and not limitation, the congestion degree increases from A to C. When the travel task information is selected by the user on the first display device 120, the first display device 120 transmits the departure place and destination input by the user to the vehicle control unit 110 so that the vehicle control unit 110 performs path planning.
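The mileage-based priority labeling described above (labels 1, 2, 3 with mileage increasing) can be sketched as follows; the field names are illustrative assumptions, not from the patent.

```python
def label_paths_by_mileage(paths):
    """Assign priority labels 1..n, shortest mileage first, as in the
    example above. Each path record is assumed to carry "id" and
    "mileage_km" fields (hypothetical names)."""
    ranked = sorted(paths, key=lambda p: p["mileage_km"])  # shortest first
    # Priority 1 = shortest path; mileage increases with the label.
    return [dict(p, priority=i + 1) for i, p in enumerate(ranked)]
```

The same pattern would apply to the congestion-based labels A, B, C, with a congestion estimate as the sort key instead of mileage.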
The first end of the first display device 120 is connected to the vehicle control unit 110, and is configured to receive the route planning data and the route selection instruction of the user sent by the vehicle control unit 110, determine a current route according to at least one piece of route information and the route selection instruction, and send the current route to the vehicle control unit 110.
The first Display device 120 may be a Liquid Crystal Display (LCD).
Specifically, after the vehicle control unit 110 sends the route planning data to the first display device 120, the user determines the current route by inputting a route selection instruction. The path selection instruction may be generated by the user touching the first display device 120. By way of example and not limitation, the user may determine that path 1 is the current path by touching path 1 for a preset time from among path 1, path 2, and path 3.
The vehicle control unit 110 is further configured to receive the current path sent by the first display device 120, the vehicle operation mode selection instruction sent by the BVCU, and the environment sensing data sent by the sensing module, generate decision result information according to the current path, the vehicle operation mode selection instruction, and the environment sensing data, process the decision result information, and generate steering control information and torque control information; then according to the steering control information, generating steering prompt information, and according to the torque control information, generating target vehicle speed information; finally, according to the current path and the target speed information, generating estimated arrival time information of the vehicle; wherein the context awareness data comprises a current location of the vehicle and a current speed of the vehicle.
Specifically, the whole vehicle is provided with a control panel carrying a power switch, an automatic driving switch, a lateral control switch, and a longitudinal control switch. The lateral control switch controls the steering of the vehicle; when it is pressed (the first driving mode), the user must manually control the vehicle speed. The longitudinal control switch controls the speed of the vehicle; when it is pressed (the second driving mode), the user must manually control the steering. When the lateral and longitudinal control switches are pressed simultaneously (the third driving mode), driving is fully automatic. In the first driving mode only steering control information is generated; in the second, only torque control information; in the third, both torque control information and steering control information are generated.
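The switch-to-mode mapping above can be sketched as a small lookup; the return labels are hypothetical names, not from the patent text.

```python
def select_mode(lateral_pressed: bool, longitudinal_pressed: bool) -> str:
    """Map control-panel switch states to a driving mode, per the table
    in the description (mode names are illustrative)."""
    if lateral_pressed and longitudinal_pressed:
        return "third"   # fully automatic: steering and speed both controlled
    if lateral_pressed:
        return "first"   # automatic steering, manual speed
    if longitudinal_pressed:
        return "second"  # automatic speed, manual steering
    return "manual"      # neither switch pressed
```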
To prevent the lateral and longitudinal control switches from being pressed by mistake, an enabling switch may also be provided. Pressing the enabling switch and the lateral control switch simultaneously selects the first driving mode; similarly, pressing the enabling switch and the longitudinal control switch simultaneously selects the second driving mode. Pressing the enabling switch and the lateral control switch, then releasing the lateral control switch and pressing the longitudinal control switch, selects the third driving mode.
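The enabling-switch guard can be sketched as follows: a lateral or longitudinal press only registers while the enabling switch is held, so stray presses are ignored. The function and set representation are illustrative assumptions.

```python
def guarded_press(enable_held, switch, active):
    """Return the updated set of registered switches. A press of the
    lateral or longitudinal switch registers only while the enabling
    switch is held; otherwise the state is unchanged."""
    if enable_held:
        return active | {switch}
    return set(active)  # press ignored: enabling switch not held
```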
The whole vehicle is powered on by pressing the power switch. After power-on, the vehicle control unit 110 and the BVCU perform a self-check, and the system enters standby mode once the self-check succeeds.
The BVCU is connected to the control panel through an Input/Output (IO) interface, and is configured to receive a vehicle operation mode selection instruction sent by the control panel, and send the vehicle operation mode selection instruction to the vehicle control unit 110 through a Controller Area Network (CAN) bus.
Specifically, the automatic driving mode is entered by pressing an automatic driving switch, and the vehicle mode selection command is generated by pressing a lateral control switch and/or a longitudinal control switch, and the vehicle mode selection command may be used to determine which automatic driving mode of the first to third driving modes the vehicle is in. The BVCU receives the ignition signal from the ignition system and then receives the vehicle mode selection signal from the control panel.
The vehicle control unit 110 and the server may communicate with each other by using a fourth generation communication system (4G) technology, a fifth generation communication system (5G) technology, a Wireless Fidelity (WI-FI) technology, or the like.
Next, the environmental sensing data of the above-mentioned sensing module will be specifically described.
The sensing module mentioned above may include a laser radar, a combined navigation system, a millimeter wave radar. The environment awareness data may be a general term for the plurality of signals, for example, the environment awareness data may include first environment awareness data of the lidar, second environment awareness data of the integrated navigation system, and third environment awareness data of the millimeter wave radar.
In the following, the source of each sensing data will be described in detail.
The laser radar is configured to acquire first environment sensing data of the vehicle and send it to the switch, which forwards the first environment sensing data to the vehicle control unit 110. By way of example and not limitation, there may be three lidars: two 16-line lidars and one 32-line lidar. The two 16-line lidars may be located on the left and right sides of the vehicle, and the 32-line lidar on the roof. Each of the three lidars produces its own environment sensing data, collectively referred to as the first environment sensing data. Working together, the three lidars reduce the blind area of the laser scan.
The integrated navigation system is used to acquire second environment perception data of the vehicle and send it to the communication interface conversion processing module. The integrated navigation system includes a Differential Global Positioning System (DGPS) chip and an Inertial Measurement Unit (IMU). The DGPS chip is externally connected to a primary GPS antenna and a secondary GPS antenna to obtain the current position and current speed of the vehicle. The IMU measures the angular velocity and acceleration of the moving vehicle. The integrated navigation system measures the second environment perception data, which is format-converted by the communication interface conversion processing module and sent to the vehicle control unit 110.
And the millimeter wave radar is configured to acquire third environment sensing data of the vehicle and send the third environment sensing data to the vehicle control unit 110. Wherein, by way of example and not limitation, the number of millimeter wave radars may be two, the first being disposed in front of the vehicle and the other being disposed behind the vehicle for better monitoring.
Finally, the vehicle control unit 110 generates obstacle information according to the first to third environmental awareness data (collectively referred to as environmental awareness data), processes the current path and the obstacle information, generates decision result information, processes the decision result information, and generates steering control information and torque control information.
Specifically, the vehicle control unit 110 may generate the prompt information such as "turn left" according to the steering control information. The target vehicle speed information may be generated based on the torque control information. And the estimated arrival time information can also be generated according to the current path and the target vehicle speed information. The estimated time of arrival information is used to indicate an estimated time for the vehicle to reach the destination.
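The estimated arrival time described above follows from the remaining path length and the target vehicle speed. A minimal sketch, with the function name and units as illustrative assumptions:

```python
def estimated_arrival_minutes(remaining_km, target_speed_kmh):
    """Estimate minutes to destination from the remaining mileage of the
    current path and the target vehicle speed (a constant-speed
    approximation, not the patent's exact method)."""
    if target_speed_kmh <= 0:
        raise ValueError("target speed must be positive")
    return remaining_km / target_speed_kmh * 60.0
```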
And a second display device 130 connected to the vehicle control unit 110, for receiving the current path, the steering prompt information, the target vehicle speed information, the estimated arrival time information, the current position and the current speed sent by the vehicle control unit 110, displaying the current position of the vehicle on the current path, and displaying the steering prompt information, the target vehicle speed information, the estimated arrival time information and the current speed.
Specifically, the second display device 130 may be disposed at the rear of the vehicle for prompting a pedestrian. The second display device 130 may also be an LCD, and in the vehicle commissioning phase, the second display device 130 may facilitate vehicle commissioning, and after the vehicle leaves the factory, the second display device 130 may prompt a pedestrian. Therefore, the second display device 130 displays the vehicle state information, and achieves the warning or reminding function.
The vehicle control unit 110 may transmit the current path, the steering prompt information, the target vehicle speed information, the estimated arrival time information, the current position, and the current speed to the second display device 130 for display. This serves the purpose of reminding pedestrians.
Further, the first display device 120 is further configured to receive the current path, the current position, and the current speed sent by the vehicle control unit 110, display the current position of the vehicle on the current path, and display the current speed. Therefore, the purpose of reminding passengers is achieved.
Further, the vehicle control unit 110 is also configured to acquire fault information of an electronic control unit (ECU). The first display device 120 is also configured to receive the fault information sent by the vehicle control unit 110 and display it.
The ECU includes, but is not limited to, the vehicle control unit 110 and the BVCU. Displaying the fault information alerts the passenger and enables maintenance personnel to service the vehicle quickly, improving the fault-handling speed of the whole vehicle.
Further, a second end of the first display device 120 is connected to the server, and is configured to receive a touch instruction input by the user in an abnormal condition, and send the touch instruction to the server, so that the server adjusts the operating state of the vehicle according to the touch instruction.
When an abnormality occurs in the vehicle (such as a fault or an abnormal road surface), the passenger may generate a touch instruction by touching the first display device 120. The touch instruction may include an alarm message, which is sent to the server; the server then acquires the operating state of the vehicle so that service personnel can check it and, if necessary, repair the vehicle. The vehicle operating state can thus be reported quickly and conveniently, improving the handling efficiency of abnormal conditions.
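One way to sketch the touch-triggered alarm report and the server-side handling is shown below. The JSON schema, field names, and the safe-stop command are all hypothetical; the patent does not define a message format:

```python
import json
import time

def build_alarm_message(vehicle_id, fault_code, position):
    """Package a touch-triggered alarm into a JSON payload for the server.
    All field names are illustrative; the patent does not define a schema."""
    return json.dumps({
        "vehicle_id": vehicle_id,
        "type": "alarm",
        "fault_code": fault_code,
        "position": position,         # (latitude, longitude), serialized as a list
        "timestamp": int(time.time()),
    })

def server_handle(payload):
    """Server side: read the alarm and decide how to adjust the vehicle's
    operating state (here, a hypothetical safe-stop command)."""
    msg = json.loads(payload)
    if msg["type"] == "alarm":
        return {"vehicle_id": msg["vehicle_id"], "command": "safe_stop"}
    return {"vehicle_id": msg["vehicle_id"], "command": "none"}

reply = server_handle(build_alarm_message("AV-001", "ECU_TIMEOUT", (39.9, 116.3)))
```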
Further, the vehicle control unit 110 is further configured to obtain parking space information from the environment perception data and send it to the first display device 120. The first display device 120 is further configured to receive the parking space information sent by the vehicle control unit 110 and a parking space selection instruction of a user, determine the current parking space information, and send it to the vehicle control unit 110. The vehicle control unit 110 is further configured to receive the current parking space information sent by the first display device 120 and the environment perception data of the parking space sent by the sensing module, and to generate control information from them so as to control the vehicle to park in the selected parking space.
The vehicle control unit 110 may extract the parking space information from the environment perception data and display it on the first display device 120; the passenger may then generate a parking space selection instruction by touching one of the displayed parking spaces, so that the vehicle is controlled to park in the space the passenger selected. User experience is thereby improved.
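The parking space selection flow, extracting candidate spaces from perception data and resolving the passenger's touch to the current parking space information, might be sketched as follows (the dictionary schema and the space ids are illustrative assumptions):

```python
def list_parking_spaces(perception_data):
    """Extract candidate parking spaces from environment perception data.
    Here each space is just an id plus a free flag (illustrative schema)."""
    return [s for s in perception_data if s["free"]]

def select_space(spaces, touched_id):
    """Resolve the passenger's touch on the first display device to the
    current parking space information sent back to the control unit."""
    for s in spaces:
        if s["id"] == touched_id:
            return s
    raise ValueError(f"space {touched_id} is not available")

spaces = list_parking_spaces([
    {"id": "P1", "free": True},
    {"id": "P2", "free": False},   # occupied spaces are filtered out
    {"id": "P3", "free": True},
])
chosen = select_space(spaces, "P3")
```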
Therefore, by applying the vehicle-mounted human-computer interaction system provided by the embodiment of the invention, the control of the vehicle is realized, and the user experience is improved.
Those of skill would further appreciate that the various illustrative components and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative components and steps have been described above generally in terms of their functionality in order to clearly illustrate this interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in Random Access Memory (RAM), Read-Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The above embodiments are provided to further explain the objects, technical solutions and advantages of the present invention in detail, it should be understood that the above embodiments are merely exemplary embodiments of the present invention and are not intended to limit the scope of the present invention, and any modifications, equivalents, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.
Claims (8)
1. A vehicle-mounted human-computer interaction system is characterized by comprising:
the vehicle control unit is used for receiving travel task information, calling an environment map file according to the travel task information and generating path planning data; wherein the path planning data comprises at least one path information;
the first end of the first display device is connected with the vehicle control unit and used for receiving the path planning data sent by the vehicle control unit and a path selection instruction of a user, determining a current path according to the at least one piece of path information and the path selection instruction, and sending the current path to the vehicle control unit;
the vehicle control unit is further used for receiving the current path sent by the first display device, a vehicle operation mode selection instruction sent by a bottom vehicle controller BVCU and environment perception data sent by a sensing module, generating decision result information according to the current path, the vehicle operation mode selection instruction and the environment perception data, processing the decision result information and generating steering control information and torque control information; then according to the steering control information, generating steering prompt information, and according to the torque control information, generating target vehicle speed information; finally, according to the current path and the target vehicle speed information, generating estimated arrival time information of the vehicle; wherein the vehicle operation mode selection instruction comprises: a first driving mode when the lateral control switch is pressed, a second driving mode when the longitudinal control switch is pressed, and a third driving mode when the lateral control switch and the longitudinal control switch are simultaneously pressed; the lateral control switch controls the steering of the vehicle, and the longitudinal control switch controls the speed of the vehicle; wherein the environment perception data comprises a current position of the vehicle and a current speed of the vehicle;
the vehicle control unit is further used for acquiring parking space information according to the environment sensing data and sending the parking space information to the first display device;
the first display device is further used for receiving the parking space information sent by the vehicle control unit and a parking space selection instruction of a user, determining current parking space information and sending the current parking space information to the vehicle control unit;
the vehicle control unit is further configured to receive the parking space information sent by the first display device, the environment perception data of the parking space sent by the sensing module, and generate control information according to the parking space information and the environment perception data of the parking space, so as to control the vehicle to park in the parking space.
2. The vehicle human-computer interaction system of claim 1, further comprising a second display device;
the second display device is connected with the vehicle control unit and used for receiving the current path, the steering prompt information, the target vehicle speed information, the estimated arrival time information, the current position and the current speed sent by the vehicle control unit, displaying the current position of the vehicle on the current path, and displaying the steering prompt information, the target vehicle speed information, the estimated arrival time information and the current speed.
3. The vehicle-mounted human-computer interaction system of claim 1, wherein the first display device is further configured to receive the current path, the current position and the current speed sent by the vehicle control unit, display the current position of the vehicle on the current path, and display the current speed.
4. The vehicle-mounted human-computer interaction system according to claim 1, wherein the vehicle control unit is further configured to acquire fault information of an Electronic Control Unit (ECU);
the first display device is also used for receiving fault information sent by the vehicle control unit and displaying the fault information.
5. The vehicle-mounted human-computer interaction system according to claim 1, wherein a second end of the first display device is connected with the server, and is used for receiving a touch instruction input by a user under an abnormal condition and sending the touch instruction to the server, so that the server adjusts the running state of the vehicle according to the touch instruction.
6. The vehicle-mounted human-computer interaction system according to claim 1, wherein the travel task information is sent to the vehicle control unit by a server; or the travel task information is sent to the vehicle control unit by the first display device.
7. The vehicle-mounted human-computer interaction system of claim 1, wherein the travel task information comprises: a departure place, a destination, and a departure time, or a departure place and a destination.
8. The vehicle-mounted human-computer interaction system according to claim 7, wherein the vehicle control unit is specifically configured to:
receiving travel task information;
sending map calling request information to a server according to the travel task information; wherein the map invocation request information includes: a departure location and a destination;
receiving the environment map file sent by the server; and
and generating path planning data according to the environment map file.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810902413.7A CN109017814B (en) | 2018-08-09 | 2018-08-09 | Vehicle-mounted human-computer interaction system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109017814A CN109017814A (en) | 2018-12-18 |
CN109017814B true CN109017814B (en) | 2020-01-24 |
Family
ID=64632438
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810902413.7A Active CN109017814B (en) | 2018-08-09 | 2018-08-09 | Vehicle-mounted human-computer interaction system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109017814B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113183976B (en) * | 2021-04-30 | 2024-08-13 | 广东以诺通讯有限公司 | Automobile system control method |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5167051B2 (en) * | 2008-09-30 | 2013-03-21 | 富士重工業株式会社 | Vehicle driving support device |
CN104260722B (en) * | 2014-09-23 | 2017-06-06 | 北京理工大学 | A kind of automated parking system |
CN204647178U (en) * | 2015-02-26 | 2015-09-16 | 陕西法士特齿轮有限责任公司 | A kind of transmission for engineering machinery |
CN104908734A (en) * | 2015-05-19 | 2015-09-16 | 奇瑞汽车股份有限公司 | Control method and system of intelligent vehicle |
CN105620393B (en) * | 2015-12-25 | 2017-08-04 | 福建省汽车工业集团云度新能源汽车股份有限公司 | A kind of adaptive vehicle man machine's exchange method and its system |
CN107621267A (en) * | 2017-09-05 | 2018-01-23 | 上海博泰悦臻网络技术服务有限公司 | A kind of navigation method and system, car-mounted terminal based on road conditions camera |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||
GR01 | Patent grant ||
CP01 | Change in the name or title of a patent holder ||
Address after: B4-006, maker Plaza, 338 East Street, Huilongguan town, Changping District, Beijing 100096
Patentee after: Beijing Idriverplus Technology Co.,Ltd.
Address before: B4-006, maker Plaza, 338 East Street, Huilongguan town, Changping District, Beijing 100096
Patentee before: Beijing Idriverplus Technology Co.,Ltd.