CN111596814A - Man-machine interaction method and device for unmanned vehicle and unmanned vehicle - Google Patents
- Publication number
- CN111596814A CN111596814A CN202010299858.8A CN202010299858A CN111596814A CN 111596814 A CN111596814 A CN 111596814A CN 202010299858 A CN202010299858 A CN 202010299858A CN 111596814 A CN111596814 A CN 111596814A
- Authority
- CN
- China
- Prior art keywords
- multimedia file
- unmanned vehicle
- vehicle
- playing
- guiding
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R11/00—Arrangements for holding or mounting articles, not otherwise provided for
- B60R11/02—Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
- B60R11/0264—Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof for control means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Mechanical Engineering (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The application relates to a man-machine interaction method and device for an unmanned vehicle, and to the unmanned vehicle. The method comprises the following steps: loading, according to a preset vehicle identity, multimedia files matched with the unmanned vehicle and used for guiding an operation flow, wherein the multimedia files comprise a first multimedia file and a second multimedia file; when a human body approach signal is detected, sending the first multimedia file to a playing device carried by the unmanned vehicle for playing; and when an operation instruction input through the interactive interface is detected, controlling the playing device to broadcast the second multimedia file corresponding to the operation instruction. The scheme of the application guides the operation flow in the form of multimedia files, which is rich in presentation and highly efficient; different guidance is provided at different stages, so the guidance follows the user's operation more closely and is more effective; the scheme can be applied to an unmanned vehicle with a human-computer interaction interface to improve the human-computer interaction experience of the autonomous vehicle.
Description
Technical Field
The application relates to the technical field of the Internet of Vehicles, and in particular to a human-computer interaction method and device for an unmanned vehicle, and to the unmanned vehicle.
Background
With the development of artificial intelligence technology, artificial intelligence and automatic control have permeated every aspect of daily life. Advances in information technology and electronic technology provide a solid foundation for intelligent transportation, which has become the direction of future traffic development. The unmanned vehicle is undoubtedly the development direction of future automobiles: it is safe, reliable, efficient and convenient, and can compensate for various shortcomings of human-driven vehicles.
With the development of unmanned vehicle technology, technical schemes that use unmanned vehicles for food delivery, retail and other services have appeared to solve the last-mile problem of logistics.
In the related art, the interaction between the user and the unmanned vehicle relies entirely on explanation by on-site personnel, which is inefficient and costly.
Disclosure of Invention
In order to overcome the problems in the related art at least to some extent, the application provides a human-computer interaction method and device for an unmanned vehicle, and an unmanned vehicle.
According to a first aspect of embodiments of the present application, there is provided a human-computer interaction method for an unmanned vehicle, including:
loading multimedia files matched with the unmanned vehicle and used for guiding an operation process according to a preset vehicle identity, wherein the multimedia files comprise a first multimedia file and a second multimedia file;
when a human body approach signal is detected, the first multimedia file is sent to a playing device carried by an unmanned vehicle for playing;
and when an operation instruction input by the interactive interface is detected, controlling the playing equipment to broadcast a second multimedia file corresponding to the operation instruction.
Further, the loading of the multimedia file matched with the unmanned vehicle for guiding the operation process comprises:
establishing communication connection with a server through a preset vehicle identity;
acquiring configuration information issued by a server, wherein the configuration information is issued after the server determines according to a vehicle identity;
and loading the multimedia file for guiding the operation flow according to the configuration information.
Further, the vehicle identity is a vehicle ID of the unmanned vehicle, and the configuration information includes a vehicle type of the unmanned vehicle;
the configuration information is determined by the server querying a database for the corresponding vehicle type based on the vehicle ID.
Further, the method further comprises:
when a human body approaching signal is detected, waking up the playing equipment;
preferably, when the human body approach signal disappears and the duration reaches a preset first time threshold, the playing device is turned off;
wherein, the human body approach signal is obtained by detecting through an infrared sensor and/or an ultrasonic radar.
Further, the playback device includes at least one of: a display screen and a loudspeaker;
the first multimedia file includes at least one of: images, text, voice;
the sending of the first multimedia file to a playing device carried by an unmanned vehicle for playing comprises:
playing the guidance information in the form of images and/or text through the display screen; and/or
playing the guidance information in the form of voice through the loudspeaker.
Furthermore, the display screen is a touch screen, and the interactive interface is the same touch screen;
the method further comprises the following steps:
and when the touch screen plays the first multimedia file, if a touch signal input through the touch screen is detected, controlling the touch screen to stop playing the first multimedia file, and switching the display content of the touch screen into an operation interface.
Further, the controlling the playing device to broadcast the second multimedia file corresponding to the operation instruction includes:
when the display content of the touch screen is an operation interface, acquiring an operation instruction input through the operation interface;
determining the next operation to be carried out according to the operation instruction;
and controlling the loudspeaker to broadcast corresponding voice information to prompt the next operation.
According to a second aspect of embodiments of the present application, there is provided a human-computer interaction device for an unmanned vehicle, the device comprising:
the loading module is used for loading a multimedia file which is matched with the unmanned vehicle and used for guiding an operation flow according to a preset vehicle identity, wherein the multimedia file comprises a first multimedia file and a second multimedia file;
the starting module is used for sending the first multimedia file to the playing equipment carried by the unmanned vehicle for playing when a human body approaching signal is detected;
and the guiding module is used for controlling the playing equipment to broadcast the second multimedia file corresponding to the operation instruction when the operation instruction input by the interactive interface is detected.
According to a third aspect of embodiments of the present application, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method according to any of the above embodiments.
According to a fourth aspect of embodiments of the present application, there is provided an unmanned vehicle including:
a vehicle body;
the playing device is used for playing the multimedia file guiding the operation flow;
the interactive interface is used for inputting an operation instruction;
a controller disposed within the vehicle body;
the playing device and the interactive interface are respectively electrically connected with the controller, and the controller is used for executing the operation steps of the method according to any one of the above embodiments.
The technical scheme provided by the embodiment of the application has the following beneficial effects:
the scheme of the application guides the flow in the form of the multimedia file, and the representation form is rich and the efficiency is high; the corresponding guide process can be loaded according to the identity of the vehicle, so that the vehicle is more flexible to use; different guiding processes are carried out at different stages, so that guiding is closer to the operation of a user, and the effect is better; the scheme can be applied to the unmanned vehicle with the human-computer interaction interface so as to improve the human-computer interaction experience of the automatic driving vehicle.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application.
Fig. 1 is a schematic application environment diagram of a human-computer interaction method for an unmanned vehicle according to an embodiment.
FIG. 2 is a flow diagram illustrating a human-machine interaction method for an unmanned vehicle, according to an embodiment.
Fig. 3 is a schematic structural diagram of an unmanned vehicle according to an embodiment.
FIG. 4 is a flow diagram illustrating an interaction of an unmanned vehicle with a user, according to an embodiment.
Fig. 5 is a block diagram of a human-computer interaction device for an unmanned vehicle according to an embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present application; rather, they are merely examples of methods and apparatus consistent with certain aspects of the present application, as detailed in the appended claims.
The man-machine interaction method for the unmanned vehicle provided by the application can be applied to the application environment shown in FIG. 1. The application environment comprises an unmanned vehicle 1, a network 2 and a server 3, and the unmanned vehicle 1 and the server 3 can be communicatively connected through the network 2. The network system formed by the unmanned vehicle 1, the network 2 and the server 3 may be based on the Internet, on a local area network, or on a combined network of the two, which is not described in detail here.
The network 2 is used for realizing network connection between the unmanned vehicle 1 and the server 3, and between the server 3 and the user terminal 4, and may include various types of wired or wireless networks. The server 3 may be implemented by an independent server or a server cluster composed of a plurality of servers.
FIG. 2 is a flow diagram illustrating a human-machine interaction method for an unmanned vehicle, according to an example embodiment. The method specifically comprises the following steps:
step S1: loading multimedia files matched with the unmanned vehicle and used for guiding an operation process according to a preset vehicle identity, wherein the multimedia files comprise a first multimedia file and a second multimedia file;
step S2: when a human body approach signal is detected, the first multimedia file is sent to a playing device carried by an unmanned vehicle for playing;
step S3: and when an operation instruction input by the interactive interface is detected, controlling the playing equipment to broadcast a second multimedia file corresponding to the operation instruction.
The scheme of the application guides the operation flow in the form of multimedia files, which is rich in presentation and highly efficient; the corresponding guidance flow can be loaded according to the identity of the vehicle, making the vehicle more flexible to use; different guidance is provided at different stages, so the guidance follows the user's operation more closely and is more effective; the scheme can be applied to an unmanned vehicle with a human-computer interaction interface to improve the human-computer interaction experience of the autonomous vehicle.
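For orientation only, steps S1 to S3 can be pictured as a simple polling loop on the vehicle controller. The following Python sketch is a hypothetical illustration: the `player`, `sensors` and `touch_ui` objects and every method name on them are assumptions made for this example, not part of the disclosure.

```python
# Hypothetical sketch of the S1-S3 flow; all object interfaces are assumptions.
import time


class GuidanceController:
    def __init__(self, vehicle_id, player, sensors, touch_ui):
        self.vehicle_id = vehicle_id
        self.player = player        # playing device: screen and/or loudspeaker
        self.sensors = sensors      # infrared / ultrasonic proximity detection
        self.touch_ui = touch_ui    # interactive interface (touch screen)
        self.first_file = None      # guidance played on approach (S2)
        self.second_files = {}      # per-instruction guidance (S3)

    def load_guidance(self, load_fn):
        # S1: load multimedia matched to this vehicle by its preset identity
        self.first_file, self.second_files = load_fn(self.vehicle_id)

    def run_once(self):
        # S2: play the first multimedia file when a human body approach
        # signal is detected
        if self.sensors.human_nearby() and self.first_file is not None:
            self.player.play(self.first_file)
        # S3: broadcast the second multimedia file matching the instruction
        # received from the interactive interface
        instruction = self.touch_ui.poll_instruction()
        media = self.second_files.get(instruction)
        if media is not None:
            self.player.play(media)

    def run(self, period_s=0.2):
        while True:
            self.run_once()
            time.sleep(period_s)
```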
In some embodiments, the loading of a multimedia file matched with the unmanned vehicle for guiding an operation flow comprises:
establishing communication connection with a server through a preset vehicle identity;
acquiring configuration information issued by a server, wherein the configuration information is issued after the server determines according to a vehicle identity;
and loading the multimedia file for guiding the operation flow according to the configuration information.
Specifically, loading the multimedia file may mean downloading the corresponding multimedia file from the server, or selecting, from multimedia files of several different types stored locally in advance, the type matched with the configuration information.
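A minimal sketch of this loading step is given below, assuming a plain HTTP configuration endpoint on the server and a local directory of pre-stored guidance packs; the endpoint path, the response fields (`vehicle_type`, `media_urls`) and the directory layout are invented for illustration and are not defined in the disclosure.

```python
# Sketch of loading guidance media by vehicle identity; the endpoint,
# response schema and local directory layout are assumptions.
import json
import os
import urllib.request


def fetch_configuration(server_url, vehicle_id):
    """Ask the server for configuration matched to this vehicle ID."""
    url = f"{server_url}/config?vehicle_id={vehicle_id}"   # hypothetical endpoint
    with urllib.request.urlopen(url, timeout=5) as resp:
        return json.loads(resp.read().decode("utf-8"))     # e.g. {"vehicle_type": "retail"}


def load_guidance_files(server_url, vehicle_id, local_root="/opt/guidance"):
    config = fetch_configuration(server_url, vehicle_id)
    vehicle_type = config["vehicle_type"]

    # Option A: guidance files of this type were stored locally in advance.
    local_dir = os.path.join(local_root, vehicle_type)
    if os.path.isdir(local_dir):
        return sorted(os.path.join(local_dir, f) for f in os.listdir(local_dir))

    # Option B: otherwise download them from URLs listed in the configuration.
    downloaded = []
    for media_url in config.get("media_urls", []):
        target = os.path.join(local_dir, os.path.basename(media_url))
        os.makedirs(local_dir, exist_ok=True)
        urllib.request.urlretrieve(media_url, target)
        downloaded.append(target)
    return downloaded
```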
In some embodiments, the vehicle identification is a vehicle ID of the unmanned vehicle, and the configuration information includes a vehicle type of the unmanned vehicle;
the configuration information is determined by the server querying a database for the corresponding vehicle type based on the vehicle ID.
For example, unmanned vehicles may be of various types, including a retail type, a delivery type and the like, while the vehicle itself may be general-purpose. In this case, when a specific unmanned vehicle operates, it is first necessary to confirm what type of task the unmanned vehicle is executing. The task assignment information of the unmanned vehicle is stored on the server side, so the unmanned vehicle needs to confirm it by querying the server.
In some embodiments, the method further comprises:
when a human body approaching signal is detected, waking up the playing equipment;
preferably, when the human body approach signal disappears and the duration reaches a preset first time threshold, the playing device is turned off;
wherein, the human body approach signal is obtained by detecting through an infrared sensor and/or an ultrasonic radar.
For example, if the unmanned vehicle is of the retail type, the playing device needs to be turned off when no user is nearby, so as to save electric energy and prolong the endurance time. The playing device is woken up to start guidance only when a user approaches; after the user has been gone for 10 seconds (i.e., the first time threshold, which may also be set to 15 seconds, 20 seconds, etc.), the device turns itself off.
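A minimal sketch of this wake/sleep behaviour follows, assuming a `sensor` object wrapping the infrared/ultrasonic detection and a `player` object with `wake()`/`shutdown()` methods; both interfaces are assumptions for this example.

```python
# Sketch of waking/sleeping the playing device on the proximity signal;
# the sensor and player interfaces are assumptions.
import time


def proximity_loop(sensor, player, off_delay_s=10.0, poll_s=0.1):
    """Wake the player when a person is detected; turn it off once the
    proximity signal has been absent for off_delay_s seconds."""
    last_seen = None
    awake = False
    while True:
        if sensor.human_nearby():              # infrared and/or ultrasonic radar
            last_seen = time.monotonic()
            if not awake:
                player.wake()                  # start playing the guidance
                awake = True
        elif awake and last_seen is not None:
            if time.monotonic() - last_seen >= off_delay_s:
                player.shutdown()              # save energy, extend endurance
                awake = False
        time.sleep(poll_s)
```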
In some embodiments, the playback device comprises at least one of: a display screen and a loudspeaker;
the first multimedia file includes at least one of: images, text, voice;
the sending of the first multimedia file to a playing device carried by an unmanned vehicle for playing comprises:
playing the guidance information in the form of images and/or text through the display screen; and/or
playing the guidance information in the form of voice through the loudspeaker.
This scheme provides guidance by combining video and audio, so the guidance content is vivid and intuitive, the guidance effect is good, and the interactive experience is good.
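As an illustration of dispatching the first multimedia file to whichever playing devices are present, here is a short Python sketch; the `display`/`speaker` objects, their method names (`show_image`, `show_text`, `play_audio`) and the dict-shaped `guidance` structure are assumptions made for this example.

```python
# Sketch of dispatching the first multimedia file to the available playing
# devices; device objects and guidance fields are assumptions.
def play_first_guidance(guidance, display=None, speaker=None):
    """guidance is a dict that may carry image, text and/or voice parts."""
    if display is not None:
        if "image" in guidance:
            display.show_image(guidance["image"])   # image-form guidance
        if "text" in guidance:
            display.show_text(guidance["text"])     # text-form guidance
    if speaker is not None and "voice" in guidance:
        speaker.play_audio(guidance["voice"])       # voice-form guidance
```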
In some embodiments, the display screen is a touch screen, and the interactive interface is the same touch screen;
the method further comprises the following steps:
and when the touch screen plays the first multimedia file, if a touch signal input through the touch screen is detected, controlling the touch screen to stop playing the first multimedia file, and switching the display content of the touch screen into an operation interface.
At present, the most common human-computer interaction interface is the touch screen. When the user is not operating it, the touch screen serves as a display screen to play the video guidance flow; when the user touches any position on the touch screen, it switches to the operation mode, stops the video and displays the operation interface, so that the user can conveniently perform subsequent operations.
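A minimal sketch of this mode switch is given below, assuming a hypothetical screen API (`play_video`, `stop_video`, `show_operation_interface`, `forward_touch`); none of these names come from the patent.

```python
# Sketch of switching the touch screen from guidance playback to the
# operation interface on the first touch; the screen API is an assumption.
class TouchScreenGuide:
    def __init__(self, screen):
        self.screen = screen
        self.mode = "guide"

    def start_guidance(self, first_file):
        self.mode = "guide"
        self.screen.play_video(first_file)

    def on_touch(self, x, y):
        # Any touch during guidance stops the video and shows the operation
        # interface; later touches are forwarded to that interface.
        if self.mode == "guide":
            self.screen.stop_video()
            self.screen.show_operation_interface()
            self.mode = "operate"
        else:
            self.screen.forward_touch(x, y)
```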
In some embodiments, the controlling the playing device to broadcast the second multimedia file corresponding to the operation instruction includes:
when the display content of the touch screen is an operation interface, acquiring an operation instruction input through the operation interface;
determining the next operation to be carried out according to the operation instruction;
and controlling the loudspeaker to broadcast corresponding voice information to prompt the next operation.
For example, the second multimedia file may be a voice file. When the user has selected the goods to purchase, a voice prompt such as "Please pay XX yuan" may be given; when the user completes payment, a voice prompt such as "Please take your goods from the pickup port below" may be given.
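The mapping from operation instructions to voice prompts might be sketched as below; the instruction names, prompt texts and the `speaker`/`tts` objects are illustrative assumptions rather than the actual implementation.

```python
# Sketch of broadcasting a voice prompt for the next step after an operation
# instruction; instruction names and prompt texts are examples only.
def prompt_next_step(instruction, speaker, tts):
    prompts = {
        "goods_selected": "Please pay {amount} yuan.",
        "payment_done": "Please take your goods from the pickup port below.",
    }
    template = prompts.get(instruction.get("type"))
    if template is None:
        return
    text = template.format(**instruction.get("params", {}))
    speaker.play_audio(tts.synthesize(text))   # the second multimedia file (voice)
```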
The scheme of the application is further described below with reference to a specific unmanned vehicle application scenario.
As shown in fig. 3, the method of the above-described embodiments of the present application may be applied to an unmanned vehicle, which may be a retail-type unmanned vehicle as shown in the figure. One side of the unmanned vehicle 1 is provided with a touch screen 101, a window 102 and a pickup port 103; the user can view the goods placed inside the unmanned vehicle 1 through the window 102, perform the purchase operation through the touch screen 101, and take the goods away through the pickup port 103 after payment is completed.
The unmanned vehicle may be provided with one screen and two loudspeakers: the screen on the right side of the vehicle body displays image guidance, and the loudspeakers arranged at the front and the rear of the vehicle broadcast voice. The display of the TP (Touch Panel) on the LCD is controlled by a VCI (Vehicle Cloud Interface) system, and the functions provided by the unmanned vehicle are guided and operated by controlling video playing through software. Specifically, the cloud issues different instructions according to the different service functions of the unmanned vehicle product, triggering the on-board control system to control the screen through a software program to output different image guidance and voice broadcast prompts, thereby truly realizing unmanned interactive experience.
As shown in fig. 4, first the vehicle is started, and the unmanned vehicle establishes a persistent (long) connection with the cloud using its vehicle ID; the cloud queries a database according to the vehicle ID to match the corresponding vehicle type and then issues the vehicle type to the unmanned vehicle; the unmanned vehicle receives the vehicle type instruction (for example, the current type is a vending-type unmanned vehicle) and executes the corresponding image playing and voice broadcast service functions according to the received current vehicle type.
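Purely as an illustration of this handshake, the sketch below pairs a cloud-side handler that looks up the vehicle type by ID with a vehicle-side registration call over a plain TCP socket; the message format, the stand-in vehicle-type table and the use of raw sockets are assumptions, not the actual VCI protocol.

```python
# Sketch of the start-up handshake: the vehicle reports its ID over a long
# connection and the cloud replies with the matched vehicle type; the message
# format, stand-in database and raw-socket transport are assumptions.
import json
import socket

VEHICLE_TYPES = {"CN-0001": "vending", "CN-0002": "delivery"}   # stand-in database


def cloud_handle_connection(conn):
    hello = json.loads(conn.recv(4096).decode("utf-8"))          # {"vehicle_id": ...}
    vehicle_type = VEHICLE_TYPES.get(hello["vehicle_id"], "unknown")
    conn.sendall(json.dumps({"vehicle_type": vehicle_type}).encode("utf-8"))


def vehicle_register(cloud_host, cloud_port, vehicle_id):
    conn = socket.create_connection((cloud_host, cloud_port), timeout=5)
    conn.sendall(json.dumps({"vehicle_id": vehicle_id}).encode("utf-8"))
    reply = json.loads(conn.recv(4096).decode("utf-8"))
    return conn, reply["vehicle_type"]   # keep conn open as the long connection
```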
When a person approaches the unmanned vehicle, the LCD screen and the voice broadcast function are automatically woken up by means of infrared and ultrasonic radar detection. The user is guided by screen images and voice broadcasts; after shopping is finished, the user is prompted "Please take your goods from the pickup port below". When the infrared and ultrasonic radar detection indicates that the user has left, a farewell such as "Welcome again" is broadcast, and the screen and the voice module are then turned off to reduce power consumption.
The scheme of the application helps the user interact with the vehicle more conveniently by means of the LCD screen and voice, improves the human-computer interaction effect from both the visual and auditory perspectives, enhances friendly human-vehicle interaction, and improves the interactive experience of the unmanned vehicle.
FIG. 5 is a block diagram illustrating a human-computer interaction device for an unmanned vehicle, according to an exemplary embodiment. The device includes:
the loading module is used for loading a multimedia file which is matched with the unmanned vehicle and used for guiding an operation flow according to a preset vehicle identity, wherein the multimedia file comprises a first multimedia file and a second multimedia file;
the starting module is used for sending the first multimedia file to the playing equipment carried by the unmanned vehicle for playing when a human body approaching signal is detected;
and the guiding module is used for controlling the playing equipment to broadcast the second multimedia file corresponding to the operation instruction when the operation instruction input by the interactive interface is detected.
With regard to the device in the above embodiments, the specific manner in which each module performs its operations has been described in detail in the embodiments of the method, and is not repeated here. The modules of the interaction device may be implemented in whole or in part by software, hardware, or a combination thereof. The modules may be embedded in, or independent of, a processor of the computer device in hardware form, or stored in a memory of the computer device in software form, so that the processor can invoke them and execute the operations corresponding to the modules.
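The three modules could be sketched as thin Python classes as below; the constructor arguments and method names are assumptions chosen for illustration rather than the actual implementation.

```python
# Sketch of the three modules as thin wrappers; names mirror the description,
# implementations are illustrative assumptions.
class LoadingModule:
    def __init__(self, loader):
        self.loader = loader                    # e.g. a function keyed by vehicle ID

    def load(self, vehicle_id):
        return self.loader(vehicle_id)          # first and second multimedia files


class StartingModule:
    def __init__(self, player):
        self.player = player

    def on_human_nearby(self, first_file):
        self.player.play(first_file)            # play guidance on approach


class GuidingModule:
    def __init__(self, player):
        self.player = player

    def on_instruction(self, instruction, second_files):
        media = second_files.get(instruction)
        if media is not None:
            self.player.play(media)             # broadcast the matching guidance
```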
In some embodiments, there is provided a computer readable storage medium having stored thereon a computer program which when executed by a processor performs the steps of: loading multimedia files matched with the unmanned vehicle and used for guiding an operation process according to a preset vehicle identity, wherein the multimedia files comprise a first multimedia file and a second multimedia file; when a human body approach signal is detected, the first multimedia file is sent to a playing device carried by an unmanned vehicle for playing; and when an operation instruction input by the interactive interface is detected, controlling the playing equipment to broadcast a second multimedia file corresponding to the operation instruction.
In some embodiments, there is provided an unmanned vehicle comprising:
a vehicle body;
the playing device is used for playing the multimedia file guiding the operation flow;
the interactive interface is used for inputting an operation instruction;
a controller disposed within the vehicle body;
the playing device and the interactive interface are respectively electrically connected with the controller, and the controller is used for executing the following steps: loading multimedia files matched with the unmanned vehicle and used for guiding an operation process according to a preset vehicle identity, wherein the multimedia files comprise a first multimedia file and a second multimedia file; when a human body approach signal is detected, the first multimedia file is sent to a playing device carried by an unmanned vehicle for playing; and when an operation instruction input by the interactive interface is detected, controlling the playing equipment to broadcast a second multimedia file corresponding to the operation instruction.
It is understood that the same or similar parts in the above embodiments may be mutually referred to, and the same or similar parts in other embodiments may be referred to for the content which is not described in detail in some embodiments.
It should be noted that, in the description of the present application, the terms "first", "second", etc. are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. Further, in the description of the present application, the meaning of "a plurality" means at least two unless otherwise specified.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present application includes other implementations in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present application may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc.
In the description herein, reference to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Although embodiments of the present application have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present application, and that variations, modifications, substitutions and alterations may be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.
Claims (10)
1. A human-computer interaction method for an unmanned vehicle, comprising:
loading multimedia files matched with the unmanned vehicle and used for guiding an operation process according to a preset vehicle identity, wherein the multimedia files comprise a first multimedia file and a second multimedia file;
when a human body approach signal is detected, the first multimedia file is sent to a playing device carried by an unmanned vehicle for playing;
and when an operation instruction input by the interactive interface is detected, controlling the playing equipment to broadcast a second multimedia file corresponding to the operation instruction.
2. The method of claim 1, wherein loading the multimedia file matched to the unmanned vehicle for guiding the operation flow comprises:
establishing communication connection with a server through a preset vehicle identity;
acquiring configuration information issued by a server, wherein the configuration information is issued after the server determines according to a vehicle identity;
and loading the multimedia file for guiding the operation flow according to the configuration information.
3. The method of claim 2, wherein: the vehicle identity is the vehicle ID of the unmanned vehicle, and the configuration information comprises the vehicle type of the unmanned vehicle;
the configuration information is determined by the server querying a database for the corresponding vehicle type based on the vehicle ID.
4. The method according to any one of claims 1-3, further comprising:
when a human body approaching signal is detected, waking up the playing equipment;
preferably, when the human body approach signal disappears and the duration reaches a preset first time threshold, the playing device is turned off;
wherein, the human body approach signal is obtained by detecting through an infrared sensor and/or an ultrasonic radar.
5. The method according to any of claims 1-4, wherein the playback device comprises at least one of: a display screen and a loudspeaker;
the first multimedia file includes at least one of: images, text, voice;
the sending of the first multimedia file to a playing device carried by an unmanned vehicle for playing comprises:
playing the guidance information in the form of images and/or text through the display screen; and/or
playing the guidance information in the form of voice through the loudspeaker.
6. The method of claim 5, wherein the display screen is a touch screen and the interactive interface is the same touch screen;
the method further comprises the following steps:
and when the touch screen plays the first multimedia file, if a touch signal input through the touch screen is detected, controlling the touch screen to stop playing the first multimedia file, and switching the display content of the touch screen into an operation interface.
7. The method according to claim 6, wherein the controlling the playback device to broadcast the second multimedia file corresponding to the operation instruction comprises:
when the display content of the touch screen is an operation interface, acquiring an operation instruction input through the operation interface;
determining the next operation to be carried out according to the operation instruction;
and controlling the loudspeaker to broadcast corresponding voice information to prompt the next operation.
8. A human-computer interaction device for an unmanned vehicle, the device comprising:
the loading module is used for loading a multimedia file which is matched with the unmanned vehicle and used for guiding an operation flow according to a preset vehicle identity, wherein the multimedia file comprises a first multimedia file and a second multimedia file;
the starting module is used for sending the first multimedia file to the playing equipment carried by the unmanned vehicle for playing when a human body approaching signal is detected;
and the guiding module is used for controlling the playing equipment to broadcast the second multimedia file corresponding to the operation instruction when the operation instruction input by the interactive interface is detected.
9. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 7.
10. An unmanned vehicle, comprising:
a vehicle body;
the playing device is used for playing the multimedia file guiding the operation flow;
the interactive interface is used for inputting an operation instruction;
a controller disposed within the vehicle body;
wherein the playback device and the interactive interface are electrically connected to the controller, respectively, and the controller is configured to perform the method according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010299858.8A CN111596814A (en) | 2020-04-16 | 2020-04-16 | Man-machine interaction method and device for unmanned vehicle and unmanned vehicle |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010299858.8A CN111596814A (en) | 2020-04-16 | 2020-04-16 | Man-machine interaction method and device for unmanned vehicle and unmanned vehicle |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111596814A true CN111596814A (en) | 2020-08-28 |
Family
ID=72189018
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010299858.8A Pending CN111596814A (en) | 2020-04-16 | 2020-04-16 | Man-machine interaction method and device for unmanned vehicle and unmanned vehicle |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111596814A (en) |
2020-04-16 CN CN202010299858.8A patent/CN111596814A/en active Pending
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105346483A (en) * | 2015-11-04 | 2016-02-24 | 常州加美科技有限公司 | Man-machine interactive system for unmanned vehicle |
CN106203929A (en) * | 2016-07-28 | 2016-12-07 | 百度在线网络技术(北京)有限公司 | The purchase method of unmanned vehicle and device |
US20180067631A1 (en) * | 2016-09-08 | 2018-03-08 | Dji Technology, Inc. | Graphical user interface customization in a movable object environment |
CN108621150A (en) * | 2017-03-17 | 2018-10-09 | 北京京东尚科信息技术有限公司 | Dispensing machine people control method, device and dispensing machine people |
CN110103878A (en) * | 2019-05-22 | 2019-08-09 | 北京百度网讯科技有限公司 | Method and apparatus for controlling unmanned vehicle |
CN110851070A (en) * | 2019-11-12 | 2020-02-28 | 新石器慧通(北京)科技有限公司 | Parking method and device, unmanned vehicle and storage medium |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114802029A (en) * | 2022-04-19 | 2022-07-29 | 中国第一汽车股份有限公司 | Vehicle-mounted multi-screen control method, device and system and vehicle |
Similar Documents
Publication | Title
---|---
CN108137059B (en) | Driving support device
CN113460070B (en) | Vehicle control method and device
WO2019201304A1 (en) | Face recognition-based voice processing method, and device
EP4072122A1 (en) | Method and system for controlling vehicle-mounted camera by means of mobile device, and device
US20210122242A1 (en) | Motor Vehicle Human-Machine Interaction System And Method
CN110197400B (en) | Advertisement pushing method and device, head-up display (HUD) and server
WO2022062491A1 (en) | Vehicle-mounted smart hardware control method based on smart cockpit, and smart cockpit
CN106427840A (en) | Method of self-adaptive vehicle driving mode and terminal
CN112959998A (en) | Vehicle-mounted human-computer interaction method and device, vehicle and electronic equipment
KR20210000466A (en) | Control system using gesture in vehicle
CN111885572A (en) | Communication control method based on intelligent cabin and intelligent cabin
CN109774601B (en) | Holographic projection method and system and automobile
CN105824255A (en) | Vehicle-mounted terminal control method and device
CN111596814A (en) | Man-machine interaction method and device for unmanned vehicle and unmanned vehicle
CN110852774A (en) | Vehicle-mounted advertisement pushing method based on starting picture, vehicle networking terminal and vehicle
CN105620392B (en) | Method and apparatus for state dependent micro-interaction completion
CN111050105A (en) | Video playing method and device, toy robot and readable storage medium
CN112437246B (en) | Video conference method based on intelligent cabin and intelligent cabin
CN110822647B (en) | Control method of air conditioner, air conditioner and storage medium
WO2023098564A1 (en) | Voice assistant display method and related device
CN109658924B (en) | Session message processing method and device and intelligent equipment
KR20210000465A (en) | Control system using gesture in vehicle
CN115734206A (en) | Vehicle-mounted machine Bluetooth connection method and device, electronic equipment, storage medium and vehicle
CN113709954B (en) | Control method and device of atmosphere lamp, electronic equipment and storage medium
CN113778633A (en) | Operation control method and device of vehicle machine
Legal Events
Code | Title | Description
---|---|---
PB01 | Publication |
SE01 | Entry into force of request for substantive examination |
RJ01 | Rejection of invention patent application after publication | Application publication date: 20200828