CN105719648B - Personalized unmanned vehicle interaction method and unmanned vehicle - Google Patents


Info

Publication number
CN105719648B
CN105719648B
Authority
CN
China
Prior art keywords
user
unmanned vehicle
voice
address
information
Prior art date
Legal status
Active
Application number
CN201610256753.8A
Other languages
Chinese (zh)
Other versions
CN105719648A (en)
Inventor
Zhai Hong (翟宏)
Current Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201610256753.8A priority Critical patent/CN105719648B/en
Publication of CN105719648A publication Critical patent/CN105719648A/en
Application granted granted Critical
Publication of CN105719648B publication Critical patent/CN105719648B/en

Classifications

    • G - PHYSICS
    • G10 - MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L - SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00 - Speech recognition
    • G10L15/22 - Procedures used during a speech recognition process, e.g. man-machine dialogue
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G05D1/021 - Control of position or course in two dimensions specially adapted to land vehicles

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Computational Linguistics (AREA)
  • Multimedia (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a personalized unmanned vehicle interaction method and an unmanned vehicle. One embodiment of the method comprises: receiving and storing unmanned vehicle information and user information set by a user, wherein the unmanned vehicle information comprises at least one of the following items: the name of the unmanned vehicle, the gender of the unmanned vehicle, and the voice of a simulated character; collecting and storing the voice of the user; receiving a voice instruction from the user, matching the voice instruction with the stored user voices to identify the user who issued the instruction, and searching for the unmanned vehicle information and the user information set by that user; and selecting the voice of the unmanned vehicle to converse with the user according to the found unmanned vehicle information. The embodiment realizes personalized human-computer interaction and improves the user's convenience and comfort when riding.

Description

Personalized unmanned vehicle interaction method and unmanned vehicle
Technical Field
The present application relates to the field of automotive technology, in particular to the field of unmanned vehicles, and more particularly to a personalized unmanned vehicle interaction method and an unmanned vehicle.
Background
An unmanned vehicle can automatically identify traffic signs and driving information; it is equipped with electronic facilities such as radar, cameras, and global satellite navigation, as well as synchronized sensors. The vehicle owner only needs to enter a destination into the navigation system, and the vehicle drives to it automatically. During driving, the vehicle uploads road condition information through its sensing equipment and performs real-time positioning analysis on the basis of large amounts of data, so as to determine the driving direction and speed.
In current unmanned vehicles, human-vehicle interaction is mainly realized in two ways: (1) touch screen control, in which an individual device of the unmanned vehicle is controlled by directly touching a switch or the corresponding control on a control panel; however, the user must physically reach the device switch or the panel, so the operation is limited; (2) voice control, where most current unmanned vehicles implement only one of the speech recognition and speech synthesis functions, or merely add a music playing function; a human-vehicle conversation function is mostly absent, or the conversation voice never changes, so the human-vehicle interaction experience is weak and personalized human-computer interaction cannot be realized.
Disclosure of Invention
The present application aims to provide an improved personalized unmanned vehicle interaction method and an unmanned vehicle, so as to solve the technical problems mentioned in the background art.
In a first aspect, the present application provides a personalized unmanned vehicle interaction method, the method comprising: receiving and storing unmanned vehicle information and user information set by a user, wherein the unmanned vehicle information comprises at least one of the following items: the name of the unmanned vehicle, the gender of the unmanned vehicle, and the voice of a simulated character; collecting and storing the voice of the user; receiving a voice instruction from the user, matching the voice instruction with the stored user voices to identify the user who issued the instruction, and searching for the unmanned vehicle information and the user information set by that user; and selecting the voice of the unmanned vehicle to converse with the user according to the found unmanned vehicle information.
In some embodiments, the user information comprises at least one of the following: name, gender, home address, work address, phone book, notepad.
In some embodiments, the method further comprises: parsing the address keywords in the voice instruction, and matching the address keywords with the found user information to determine a specific destination address.
In some embodiments, the method further comprises: selecting the voice of the unmanned vehicle to read out the contents of the notepad in the user information.
In some embodiments, the method further comprises: detecting a fuel indicator lamp of the unmanned vehicle; and, when the fuel indicator lamp is on, selecting the voice of the unmanned vehicle to remind the user to go to a gas station.
In some embodiments, the method further comprises: counting, within a preset period, the number of times the user has traveled to each address in the unmanned vehicle; classifying the addresses the user has reached; setting, for each category, the address the user has reached most often as the default address of that category, wherein the categories comprise at least one of the following: restaurants, cinemas, hospitals, malls; and, when the user's voice instruction contains only a destination address category and no specific destination address, using the default address as the specific destination address.
In a second aspect, the present application provides a personalized unmanned vehicle, the unmanned vehicle comprising: a collecting unit configured to receive unmanned vehicle information and user information set by a user and to collect the voice of the user; a storage unit configured to store the unmanned vehicle information set by the user, the user information, and the user's voice, wherein the unmanned vehicle information comprises at least one of the following: the name of the unmanned vehicle, the gender of the unmanned vehicle, and the voice of a simulated character; a recognition unit configured to receive a voice instruction from the user, match the voice instruction with the stored user voices to identify the user who issued the instruction, and search for the unmanned vehicle information and the user information set by that user; and a control unit configured to select the voice of the unmanned vehicle to converse with the user according to the found unmanned vehicle information.
In some embodiments, the user information comprises at least one of the following: name, gender, home address, work address, phone book, notepad.
In some embodiments, the control unit is further configured to: parse the address keywords in the voice instruction, and match the address keywords with the found user information to determine a specific destination address.
In some embodiments, the control unit is further configured to: select the voice of the unmanned vehicle to read out the contents of the notepad in the user information.
In some embodiments, the control unit is further configured to: detect a fuel indicator lamp of the unmanned vehicle; and, when the fuel indicator lamp is on, select the voice of the unmanned vehicle to remind the user to go to a gas station.
In some embodiments, the control unit is further configured to: count, within a preset period, the number of times the user has traveled to each address in the unmanned vehicle; classify the addresses the user has reached; set, for each category, the address the user has reached most often as the default address of that category, wherein the categories comprise at least one of the following: restaurants, cinemas, hospitals, malls; and, when the user's voice instruction contains only a destination address category and no specific destination address, use the default address as the specific destination address.
According to the personalized unmanned vehicle interaction method and unmanned vehicle provided by the present application, unmanned vehicle information and user information are set by the user, the user's voice is collected so that different users can be distinguished by voice, and the unmanned vehicle voice preset by the identified user is selected; this realizes personalized human-computer interaction and improves the user's convenience and comfort when riding.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram in which the present application may be applied;
FIG. 2 is a flow diagram of one embodiment of a personalized unmanned vehicle interaction method according to the application;
FIG. 3 is a schematic diagram of one application scenario of a personalized unmanned vehicle interaction method according to the present application;
FIG. 4 is a flow diagram of yet another embodiment of a method of personalized unmanned vehicle interaction according to the present application;
FIG. 5 is a schematic diagram of an embodiment of a personalized unmanned vehicle according to the present application;
FIG. 6 is a schematic structural diagram of a computer system suitable for implementing the vehicle-mounted intelligent brain according to an embodiment of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that the embodiments in the present application, and the features of those embodiments, may be combined with each other in the absence of conflict. The present application will be described in detail below with reference to the accompanying drawings in conjunction with the embodiments.
Fig. 1 illustrates an exemplary system architecture 100 to which embodiments of the personalized unmanned vehicle interaction method or the personalized unmanned vehicle of the present application may be applied.
As shown in fig. 1, the system architecture 100 may include users 101, 102, 103, an unmanned vehicle 104, and an onboard intelligent brain 105. The vehicle-mounted intelligent brain 105 is installed in the unmanned vehicle 104 and performs voice interaction with the users 101, 102, 103 in the unmanned vehicle 104.
The users 101, 102, 103 interact with the in-vehicle smart brain 105 through voice to issue voice commands or receive voice feedback of the unmanned vehicle, etc.
The users 101, 102, 103 have the right to configure the in-vehicle intelligent brain 105.
The vehicle-mounted intelligent brain 105 may be a vehicle-mounted intelligent brain providing various services, for example, receiving and storing the unmanned vehicle information and the user information set by the user, collecting and storing the voice of the user, receiving the voice instruction of the user, matching the voice instruction with the stored voice of the user to identify the user who sends the voice instruction, searching the unmanned vehicle information and the user information set by the user who sends the voice instruction, and selecting the voice of the unmanned vehicle to talk with the user according to the searched unmanned vehicle information.
It should be noted that the personalized unmanned vehicle interaction method provided by the embodiments of the present application is generally executed by the in-vehicle intelligent brain 105.
It should be understood that the numbers of users, unmanned vehicles, and vehicle-mounted intelligent brains in fig. 1 are merely illustrative. There may be any number of users, unmanned vehicles, and vehicle-mounted intelligent brains, as required by the implementation.
With continued reference to fig. 2, a flow 200 of one embodiment of a method of personalized unmanned vehicle interaction according to the present application is shown. The personalized unmanned vehicle interaction method comprises the following steps:
Step 201, receiving and storing the unmanned vehicle information and the user information set by the user.
In this embodiment, the personalized unmanned vehicle interaction method runs on an electronic device of the unmanned vehicle (for example, the vehicle-mounted intelligent brain shown in fig. 1). The unmanned vehicle can receive and store unmanned vehicle information and user information set by a user, where the unmanned vehicle information includes at least one of the following: the name of the unmanned vehicle, the gender of the unmanned vehicle, and the voice of a simulated character. The unmanned vehicle can support information input from multiple users, and each user can customize exclusive unmanned vehicle information and personal information. For example, user A sets the unmanned vehicle's name to "Xiaoming" and its gender to "male", and sets the name in his personal information to "Master"; user A can then address the unmanned vehicle as "Xiaoming", and the unmanned vehicle addresses user A as "Master" and talks with user A in a male voice. User B sets the unmanned vehicle's name to "Zhiling", its gender to "female", the voice of the imitated character to "Zhiling's voice", and the name in her personal information to "Boss"; user B can then address the unmanned vehicle as "Zhiling", and the unmanned vehicle addresses user B as "Boss" and talks with user B in "Zhiling's" voice.
In some optional implementations of this embodiment, the user information includes at least one of the following: name, gender, home address, work address, phone book, notepad. For example, the user may set a "name" and "gender", and the unmanned vehicle will address the user with the preset appellation. The user can also set common address information, such as a home address and a work address; once this information is set, the user only needs to say the place name rather than the specific address, and the unmanned vehicle can find the specific address to use as the destination. The user's phone book and notepad can be imported into the memory of the unmanned vehicle.
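Purely as an illustration (this code is not part of the patent, and all class and field names are hypothetical), the per-user profiles described above could be represented and stored as follows, in Python:

from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class VehicleInfo:
    # Unmanned vehicle information set by one user (step 201).
    vehicle_name: Optional[str] = None    # e.g. "Xiaoming"
    vehicle_gender: Optional[str] = None  # e.g. "male"
    imitated_voice: Optional[str] = None  # e.g. "Zhiling"

@dataclass
class UserInfo:
    # Personal information set by the same user.
    name: Optional[str] = None            # e.g. "Master"
    gender: Optional[str] = None
    home_address: Optional[str] = None
    work_address: Optional[str] = None
    phone_book: Dict[str, str] = field(default_factory=dict)
    notepad: List[str] = field(default_factory=list)

@dataclass
class UserProfile:
    vehicle_info: VehicleInfo
    user_info: UserInfo

# The storage unit can then be as simple as a mapping from user id to profile.
profiles: Dict[str, UserProfile] = {
    "user_a": UserProfile(
        VehicleInfo(vehicle_name="Xiaoming", vehicle_gender="male"),
        UserInfo(name="Master", home_address="No. Y, X Road"),
    )
}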
Step 202, collecting and storing the voice of the user.
In this embodiment, the unmanned vehicle collects and stores a segment of the user's voice through an audio input device such as a microphone. This voice is used to identify different users through voice recognition technology, so as to find the unmanned vehicle information and user information set by each user.
Step 203, receiving a voice instruction from the user, matching the voice instruction with the stored user voices to identify the user who issued the instruction, and searching for the unmanned vehicle information and the user information set by that user.
In this embodiment, the unmanned vehicle recognizes the user's identity by voice and finds the unmanned vehicle information and user information preset by that user. For example, the user says "go home"; the unmanned vehicle recognizes through voice recognition that the speaker is user A, and finds the unmanned vehicle information set by user A (name "Xiaoming", gender "male") and user A's user information (appellation "Master").
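A minimal sketch of this matching step, assuming each enrolled voice and the incoming instruction have already been reduced to fixed-length voiceprint embeddings by some upstream model (the embedding extraction and the 0.75 threshold are assumptions, not specified by the patent):

import numpy as np
from typing import Dict, Optional

def identify_speaker(instruction_vec: np.ndarray,
                     enrolled: Dict[str, np.ndarray],
                     threshold: float = 0.75) -> Optional[str]:
    # Return the user id whose enrolled voiceprint is most similar to the
    # instruction's voiceprint, or None if no enrolled user is similar enough.
    best_user: Optional[str] = None
    best_score = threshold
    for user_id, voiceprint in enrolled.items():
        # Cosine similarity between the two embeddings.
        score = float(np.dot(instruction_vec, voiceprint)
                      / (np.linalg.norm(instruction_vec) * np.linalg.norm(voiceprint)))
        if score > best_score:
            best_user, best_score = user_id, score
    return best_user

# identify_speaker(vec, {"user_a": enrolled_a, "user_b": enrolled_b}) -> "user_a",
# whose profile is then looked up in `profiles` from the earlier sketch.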
Step 204, selecting the voice of the unmanned vehicle to converse with the user according to the found unmanned vehicle information.
In this embodiment, the unmanned vehicle uses the unmanned vehicle information found in step 203 to select the voice in which to talk to the user. For example, after user A is identified in step 203, the gender of the unmanned vehicle is found to be "male", so the vehicle converses with user A in a male voice.
As another example, after user B is identified in step 203, the voice of the imitated character is found to be "Zhiling's voice", so the contents of user B's notepad are read out in "Zhiling's" voice.
As another example, when the unmanned vehicle detects that its fuel indicator light is on, it can say in "Zhiling's" voice: "Boss, the tank is low; we can only travel another 20 kilometers, so we should go refuel."
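Combining the identified profile with the stored vehicle information, the voice-selection logic of step 204 might be sketched as follows, reusing the dataclasses from the first sketch; speak() here is a placeholder for whatever text-to-speech engine the vehicle actually uses, not a real API:

def speak(text: str, voice: str) -> None:
    # Stand-in for the real TTS engine on the vehicle.
    print(f"[{voice}] {text}")

def choose_tts_voice(vehicle_info: VehicleInfo) -> str:
    # Prefer the imitated character's voice if the user set one; otherwise
    # fall back to a generic voice matching the gender assigned to the vehicle.
    if vehicle_info.imitated_voice:
        return vehicle_info.imitated_voice
    return "generic_male" if vehicle_info.vehicle_gender == "male" else "generic_female"

def on_fuel_light(profile: UserProfile, remaining_km: int) -> None:
    # The fuel reminder from the description, spoken in the user's chosen voice.
    voice = choose_tts_voice(profile.vehicle_info)
    speak(f"{profile.user_info.name}, the tank is low; we can only travel "
          f"another {remaining_km} kilometers, so we should go refuel.", voice)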
With continued reference to fig. 3, which is a schematic diagram of an application scenario of the personalized unmanned vehicle interaction method according to this embodiment: in the application scenario of fig. 3, the unmanned vehicle information 303 and the user information 304 are first set in the menu 302. After the information is set, the unmanned vehicle 301 receives the user's voice and performs user voice recognition; once the user is recognized, the conversation with the unmanned vehicle 301 is carried out in the "guo-de" voice set by the user, with the dialogue content shown at reference numeral 305. The unmanned vehicle 301 may set the destination address according to the home address information set by the user.
According to the method provided by this embodiment of the present application, the unmanned vehicle information and the user information are set according to the user's personal preferences, realizing personalized unmanned vehicle interaction; the user does not need to manually input complete address information when taking the vehicle, which improves the user's convenience and comfort when riding.
With further reference to fig. 4, a flow 400 of yet another embodiment of the personalized unmanned vehicle interaction method is illustrated. The flow 400 comprises the following steps:
Step 401, receiving and storing the unmanned vehicle information and the user information set by the user.
Step 402, collecting and storing the voice of the user.
Step 403, receiving a voice instruction from the user, matching the voice instruction with the stored user voices to identify the user who issued the instruction, and searching for the unmanned vehicle information and the user information set by that user.
Step 404, selecting the voice of the unmanned vehicle to converse with the user according to the found unmanned vehicle information.
Since steps 401-404 are substantially the same as steps 201-204, they are not described again here.
Step 405, parsing the address keywords in the voice instruction and matching them with the found user information to determine a specific destination address.
In this embodiment, the user's voice instruction may contain an abbreviated address rather than a specific address. For example, user A says "go home"; after recognizing user A's identity by voice, the unmanned vehicle finds the home address set by user A and can thus determine that the specific address is No. Y, X Road. The address keywords may include: home, organization, company, school, hospital, and the like.
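A sketch of this keyword step under the simplifying assumption that keyword spotting is plain substring matching (a real system would use proper language understanding); it reuses the UserInfo dataclass and `profiles` mapping from the earlier sketch:

from typing import Optional

def resolve_keyword_address(instruction: str, user_info: UserInfo) -> Optional[str]:
    # Map spoken address keywords to the corresponding stored address field.
    keyword_to_address = {
        "home": user_info.home_address,
        "work": user_info.work_address,
        "company": user_info.work_address,
    }
    text = instruction.lower()
    for keyword, address in keyword_to_address.items():
        if keyword in text and address:
            return address
    return None  # no keyword matched; fall back to the category defaults below

# resolve_keyword_address("go home", profiles["user_a"].user_info)
# -> "No. Y, X Road"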
In some optional implementations of this embodiment, the number of times the user has traveled to each address in the unmanned vehicle within a preset period is counted; the addresses the user has reached are classified; and, for each category, the address the user has reached most often is set as the default address of that category, where the categories include at least one of: restaurant, cinema, hospital, mall. When the user's voice instruction contains only a destination address category and no specific destination address, the default address is used as the specific destination address. For example, the addresses user A has frequently visited in the unmanned vehicle within one month are classified into restaurants, cinemas, hospitals, and malls: among restaurants, user A visited roast duck restaurant X 6 times, hot pot restaurant Y 2 times, and western restaurant Z 1 time; among cinemas, one cinema 4 times and another 1 time; among malls, supermarket J 3 times and mall K 1 time. Roast duck restaurant X is therefore set as user A's default restaurant address, the most-visited cinema as the default cinema address, and supermarket J as the default mall address. When user A says "go to a restaurant", the unmanned vehicle takes roast duck restaurant X as the specific destination address.
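The category-default logic above is essentially a frequency count over (category, address) trip records within the preset period; a minimal sketch, with the trip-log format assumed rather than taken from the patent:

from collections import Counter
from typing import Dict, List, Tuple

def default_addresses(trips: List[Tuple[str, str]]) -> Dict[str, str]:
    # trips holds one (category, address) record per completed ride,
    # e.g. ("restaurant", "roast duck restaurant X").
    counts = Counter(trips)
    best_count: Dict[str, int] = {}
    defaults: Dict[str, str] = {}
    for (category, address), n in counts.items():
        if n > best_count.get(category, 0):
            best_count[category] = n
            defaults[category] = address
    return defaults

trips = ([("restaurant", "roast duck restaurant X")] * 6
         + [("restaurant", "hot pot restaurant Y")] * 2
         + [("mall", "supermarket J")] * 3)
# default_addresses(trips)
# -> {"restaurant": "roast duck restaurant X", "mall": "supermarket J"}
# "Go to a restaurant" with no specific address then resolves to roast duck restaurant X.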
As can be seen from fig. 4, compared with the embodiment corresponding to fig. 2, the flow 400 of the personalized unmanned vehicle interaction method in this embodiment highlights the steps of recognizing address keywords and matching the specific address. The scheme described in this embodiment can therefore spare the user from repeatedly entering addresses, saving time and effort.
With further reference to fig. 5, as an implementation of the methods shown in the above figures, the present application provides an embodiment of a personalized unmanned vehicle, which corresponds to the method embodiment shown in fig. 2. The personalized unmanned vehicle embodiment can be applied to various types of unmanned vehicles.
As shown in fig. 5, the personalized unmanned vehicle 500 of this embodiment includes: an acquisition unit 501, a storage unit 502, a recognition unit 503, and a control unit 504. The acquisition unit 501 is configured to receive the unmanned vehicle information and user information set by the user and to collect the user's voice; the storage unit 502 is configured to store the unmanned vehicle information set by the user, the user information, and the user's voice, where the unmanned vehicle information includes at least one of the following: the name of the unmanned vehicle, the gender of the unmanned vehicle, and the voice of a simulated character; the recognition unit 503 is configured to receive a voice instruction from the user, match the voice instruction with the stored user voices to identify the user who issued the instruction, and search for the unmanned vehicle information and the user information set by that user; and the control unit 504 is configured to select the voice of the unmanned vehicle to converse with the user according to the found unmanned vehicle information.
In some optional implementations of this embodiment, the user information includes at least one of the following: name, gender, home address, work address, phone book, notepad.
In some optional implementations of this embodiment, the control unit 504 is further configured to: parse the address keywords in the voice instruction, and match the address keywords with the found user information to determine a specific destination address.
In some optional implementations of this embodiment, the control unit 504 is further configured to select the voice of the unmanned vehicle to read out the contents of the notepad in the user information.
In some optional implementations of this embodiment, the control unit 504 is further configured to detect a fuel indicator lamp of the unmanned vehicle and, when the fuel indicator lamp is on, select the voice of the unmanned vehicle to remind the user to go to a gas station.
In some optional implementations of this embodiment, the control unit 504 is further configured to: count, within a preset period, the number of times the user has traveled to each address in the unmanned vehicle; classify the addresses the user has reached; set, for each category, the address the user has reached most often as the default address of that category, wherein the categories comprise at least one of the following: restaurants, cinemas, hospitals, malls; and, when the user's voice instruction contains only a destination address category and no specific destination address, use the default address as the specific destination address.
Referring now to fig. 6, there is shown a schematic block diagram of a computer system 600 suitable for implementing an in-vehicle intelligent brain according to an embodiment of the present application.
As shown in fig. 6, the computer system 600 includes a central processing unit (CPU) 601 that can perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 602 or a program loaded from a storage section 608 into a random access memory (RAM) 603. The RAM 603 also stores the various programs and data necessary for the operation of the system 600. The CPU 601, the ROM 602, and the RAM 603 are connected to one another via a bus 604. An input/output (I/O) interface 605 is also connected to the bus 604.
The following components are connected to the I/O interface 605: an input portion 606 including a keyboard, a mouse, a microphone, and the like; an output portion 607 including a display such as a cathode ray tube (CRT) or liquid crystal display (LCD), and a speaker; a storage section 608 including a hard disk and the like; and a communication section 609 including a network interface card such as a LAN card or a modem. The communication section 609 performs communication processing via a network such as the Internet. A drive 610 is also connected to the I/O interface 605 as needed. A removable medium 611, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 610 as necessary, so that a computer program read out therefrom is installed into the storage section 608 as needed.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program tangibly embodied on a machine-readable medium, the computer program comprising program code for performing the method illustrated in the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 609, and/or installed from the removable medium 611. When executed by the central processing unit (CPU) 601, the computer program performs the above-described functions defined in the method of the present application.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special-purpose hardware-based systems which perform the specified functions or acts, or by combinations of special-purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or hardware. The described units may also be provided in a processor, which may be described as: a processor comprising an acquisition unit, a storage unit, a recognition unit, and a control unit. The names of these units do not, in some cases, constitute a limitation on the units themselves; for example, the acquisition unit may also be described as "a unit that receives unmanned vehicle information and user information set by a user and collects the voice of the user".
As another aspect, the present application also provides a non-volatile computer storage medium, which may be the non-volatile computer storage medium included in the apparatus of the above-described embodiments, or a non-volatile computer storage medium that exists separately and is not assembled into the terminal. The non-volatile computer storage medium stores one or more programs that, when executed by a device, cause the device to: receive and store unmanned vehicle information and user information set by a user, wherein the unmanned vehicle information comprises at least one of the following items: the name of the unmanned vehicle, the gender of the unmanned vehicle, and the voice of a simulated character; collect and store the voice of the user; receive a voice instruction from the user, match the voice instruction with the stored user voices to identify the user who issued the instruction, and search for the unmanned vehicle information and the user information set by that user; and select the voice of the unmanned vehicle to converse with the user according to the found unmanned vehicle information.
The above description is only a preferred embodiment of the present application and an illustration of the principles of the technology employed. It will be appreciated by persons skilled in the art that the scope of the invention referred to in the present application is not limited to technical solutions formed by the specific combination of the above technical features, and should also cover other technical solutions formed by any combination of the above technical features or their equivalents without departing from the inventive concept, for example, technical solutions in which the above features are replaced with (but not limited to) technical features with similar functions disclosed in the present application.

Claims (8)

1. A method of personalized unmanned vehicle interaction, the method comprising:
receiving and storing unmanned vehicle information and user information set by a user, wherein the unmanned vehicle information comprises the voice of a simulated character and at least one of the following: the name of the unmanned vehicle, the gender of the unmanned vehicle;
collecting and storing the voice of the user;
receiving a voice instruction from the user, matching the voice instruction with the stored user voices to identify the user who issued the instruction, searching for the unmanned vehicle information and the user information set by that user, parsing address keywords in the voice instruction, and matching the address keywords with the found user information to determine a specific destination address;
selecting the voice of the unmanned vehicle to converse with the user according to the found unmanned vehicle information;
counting, within a preset period, the number of times the user has traveled to each address in the unmanned vehicle;
classifying the addresses the user has reached in the unmanned vehicle;
setting, for each category, the address the user has reached most often in the unmanned vehicle as the default address of that category, wherein the categories comprise at least one of the following: restaurants, cinemas, hospitals, malls; and
when the user's voice instruction contains only a destination address category and no specific destination address, using the default address as the specific destination address.
2. The personalized unmanned vehicle interaction method of claim 1, wherein the user information comprises at least one of the following: name, gender, home address, work address, phone book, notepad.
3. The personalized unmanned vehicle interaction method of claim 2, further comprising:
selecting the voice of the unmanned vehicle to read out the contents of the notepad in the user information.
4. The personalized unmanned vehicle interaction method of claim 1, further comprising:
detecting a fuel indicator lamp of the unmanned vehicle; and
when the fuel indicator lamp is on, selecting the voice of the unmanned vehicle to remind the user to go to a gas station.
5. A personalized unmanned vehicle, comprising:
a collecting unit configured to receive unmanned vehicle information and user information set by a user and to collect the voice of the user;
a storage unit configured to store the unmanned vehicle information set by the user, the user information, and the user's voice, wherein the unmanned vehicle information comprises the voice of a simulated character and at least one of the following: the name of the unmanned vehicle, the gender of the unmanned vehicle;
a recognition unit configured to receive a voice instruction from the user, match the voice instruction with the stored user voices to identify the user who issued the instruction, and search for the unmanned vehicle information and the user information set by that user; and
a control unit configured to parse address keywords in the voice instruction, match the address keywords with the found user information to determine a specific destination address, and select the voice of the unmanned vehicle to converse with the user according to the found unmanned vehicle information;
The control unit is further configured to:
count, within a preset period, the number of times the user has traveled to each address in the unmanned vehicle;
classify the addresses the user has reached in the unmanned vehicle;
set, for each category, the address the user has reached most often in the unmanned vehicle as the default address of that category, wherein the categories comprise at least one of the following: restaurants, cinemas, hospitals, malls; and
when the user's voice instruction contains only a destination address category and no specific destination address, use the default address as the specific destination address.
6. The personalized unmanned vehicle of claim 5, wherein the user information comprises at least one of the following: name, gender, home address, work address, phone book, notepad.
7. The personalized unmanned vehicle of claim 6, wherein the control unit is further configured to:
select the voice of the unmanned vehicle to read out the contents of the notepad in the user information.
8. The personalized unmanned vehicle of claim 5, wherein the control unit is further configured to:
detect a fuel indicator lamp of the unmanned vehicle; and
when the fuel indicator lamp is on, select the voice of the unmanned vehicle to remind the user to go to a gas station.
CN201610256753.8A 2016-04-22 2016-04-22 personalized unmanned vehicle interaction method and unmanned vehicle Active CN105719648B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610256753.8A CN105719648B (en) 2016-04-22 2016-04-22 personalized unmanned vehicle interaction method and unmanned vehicle


Publications (2)

Publication Number Publication Date
CN105719648A CN105719648A (en) 2016-06-29
CN105719648B (en) 2019-12-13

Family

ID=56161512

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610256753.8A Active CN105719648B (en) 2016-04-22 2016-04-22 personalized unmanned vehicle interaction method and unmanned vehicle

Country Status (1)

Country Link
CN (1) CN105719648B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106203929A (en) * 2016-07-28 2016-12-07 百度在线网络技术(北京)有限公司 The purchase method of unmanned vehicle and device
CN106297789B (en) * 2016-08-19 2020-01-14 北京光年无限科技有限公司 Personalized interaction method and system for intelligent robot
CN107025908B (en) * 2017-04-06 2020-11-03 英华达(上海)科技有限公司 Control method and control system of unmanned vehicle
CN107390689B (en) * 2017-07-21 2019-05-14 北京图森未来科技有限公司 Realize system and method, the relevant device of vehicle automatic transportation
US10453456B2 (en) * 2017-10-03 2019-10-22 Google Llc Tailoring an interactive dialog application based on creator provided content
CN107909999A (en) * 2017-11-14 2018-04-13 深圳市可可卓科科技有限公司 Car networking intelligent response method and system
CN109272984A (en) * 2018-10-17 2019-01-25 百度在线网络技术(北京)有限公司 Method and apparatus for interactive voice
CN113110435A (en) * 2021-04-06 2021-07-13 新石器慧通(北京)科技有限公司 Unmanned vehicle driving mode switching method and device, electronic equipment and medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102636171A (en) * 2012-04-27 2012-08-15 深圳市凯立德科技股份有限公司 Voice navigation method and device
CN103118193A (en) * 2013-01-30 2013-05-22 广东欧珀移动通信有限公司 Method and device for switching users by voice based on mobile terminal
CN103187051A (en) * 2011-12-28 2013-07-03 上海博泰悦臻电子设备制造有限公司 Vehicle-mounted interaction device
CN103714727A (en) * 2012-10-06 2014-04-09 南京大五教育科技有限公司 Man-machine interaction-based foreign language learning system and method thereof
CN104123938A (en) * 2013-04-29 2014-10-29 富泰华工业(深圳)有限公司 Voice control system, electronic device and voice control method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8214219B2 (en) * 2006-09-15 2012-07-03 Volkswagen Of America, Inc. Speech communications system for a vehicle and method of operating a speech communications system for a vehicle


Also Published As

Publication number Publication date
CN105719648A (en) 2016-06-29

Similar Documents

Publication Publication Date Title
CN105719648B (en) personalized unmanned vehicle interaction method and unmanned vehicle
CN108984594B (en) Presenting related points of interest
US9188456B2 (en) System and method of fixing mistakes by going back in an electronic device
US9699617B2 (en) Sharing location information among devices
JP5234160B2 (en) Vehicle apparatus and information display system
CN109878434A (en) External information is presented
CN109841212B (en) Speech recognition system and speech recognition method for analyzing commands with multiple intents
US20140244259A1 (en) Speech recognition utilizing a dynamic set of grammar elements
EP3166023A1 (en) In-vehicle interactive system and in-vehicle information appliance
CN107463700B (en) Method, device and equipment for acquiring information
CN103038818A (en) Communication system and method between an on-vehicle voice recognition system and an off-vehicle voice recognition system
CN104284257A (en) System and method for mediation of oral session service
CN104750767A (en) Method and system for providing user with information in vehicle
US20130325483A1 (en) Dialogue models for vehicle occupants
CN110770819A (en) Speech recognition system and method
JP5942775B2 (en) Facility display data creation device, facility display system, and facility display data creation program
CN105987707B (en) Entering navigation target data into a navigation system
CN108351886A (en) The system for determining vehicle driver common interest
US20120299714A1 (en) Human-machine interface (hmi) auto-steer based upon-likelihood to exceed eye glance guidelines
CN112242143B (en) Voice interaction method and device, terminal equipment and storage medium
CN115905734A (en) Method and device for carrying out intelligent recommendation based on geographic position information
US11874129B2 (en) Apparatus and method for servicing personalized information based on user interest
US10915565B2 (en) Retrieval result providing device and retrieval result providing method
JP7058914B2 (en) Search device, search system, program
US20240127810A1 (en) Dialogue Management Method, Dialogue Management System, And Computer-Readable Recording Medium

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant