CN111811534B - Navigation control method, device, storage medium and equipment based on voice instruction - Google Patents

Navigation control method, device, storage medium and equipment based on voice instruction

Info

Publication number
CN111811534B
Authority
CN
China
Prior art keywords
state
navigation
instruction
end point
switching
Prior art date
Legal status
Active
Application number
CN201911359715.5A
Other languages
Chinese (zh)
Other versions
CN111811534A (en)
Inventor
张旭东
Current Assignee
Beijing Didi Infinity Technology and Development Co Ltd
Original Assignee
Beijing Didi Infinity Technology and Development Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Didi Infinity Technology and Development Co Ltd
Priority to CN201911359715.5A
Publication of CN111811534A
Application granted
Publication of CN111811534B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3626 Details of the output of route guidance instructions
    • G01C21/3629 Guidance using speech or audio output, e.g. text-to-speech
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3679 Retrieval, searching and output of POI information, e.g. hotels, restaurants, shops, filling stations, parking facilities
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/95 Retrieval from the web
    • G06F16/953 Querying, e.g. by the use of web search engines
    • G06F16/9537 Spatial or temporal dependent retrieval, e.g. spatiotemporal queries
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16 Sound input; Sound output
    • G06F3/167 Audio in a user interface, e.g. using voice commands for navigating, audio feedback

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Automation & Control Theory (AREA)
  • General Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Data Mining & Analysis (AREA)
  • Navigation (AREA)

Abstract

The present disclosure provides a navigation control method, device, storage medium and equipment based on voice instructions. The method includes: determining a current first navigation state of a user; and, in response to a voice control instruction of the user, switching from the first navigation state to a second navigation state. When a specific voice control instruction of the user is received, the method and device execute different instruction processing logic in different navigation states in combination with the user's current navigation state, so that a ride-hailing driver can enter navigation directly, with the destination taken from the current order instead of being entered separately; no additional operation is needed, and the user experience is improved.

Description

Navigation control method, device, storage medium and equipment based on voice instruction
Technical Field
The disclosure relates to the field of mobile internet, and in particular relates to a navigation control method, device, storage medium and equipment based on voice instructions.
Background
Most currently available navigation voice control technologies are aimed at self-driving users, who must explicitly state the destination they want to go to in order to enter navigation. In the ride-hailing scenario, however, the driver in most cases receives an order assigned by the system, and that order already has a definite starting point and destination, so the driver should not need to input the destination actively. Moreover, during ride-hailing operation the navigation end point frequently needs to be switched, for example between the pick-up point and the drop-off point. With existing navigation voice control technology, the user has to perform additional operations through redundant control instructions in these situations, which degrades the user experience.
Disclosure of Invention
The embodiments of the present disclosure aim to provide a navigation control method, device, storage medium and equipment based on voice instructions, so as to solve the problem in the prior art that, when a ride-hailing driver uses navigation, the user has to perform additional operations through redundant control instructions, which degrades the user experience.
In order to solve the above technical problems, the embodiments of the present disclosure adopt the following technical solutions: a navigation control method based on voice instruction comprises the following steps: determining a current first navigation state of a user; responding to a voice control instruction of a user, and switching from the first navigation state to a second navigation state; wherein, the voice control instruction is any one of the following instructions: a start navigation instruction, a specified destination navigation instruction, an exit navigation instruction, a search information point instruction and a sequence number clear instruction; the first navigation state and the second navigation state are any one of the following states: wait state, end point input state, end point confirmation state, and navigation on-the-way state.
Further, when the voice control command is a start navigation command, the switching from the first navigation state to the second navigation state according to the voice control command includes: switching from the waiting state to the end point input state or the end point confirmation state; and switching from the end point confirmation state to the navigation en-route state.
Further, the switching from the waiting state to the endpoint input state or the endpoint confirmation state includes: judging whether the user is currently configured with a pick-up driving order; switching from the waiting state to the end point input state when the user is not currently configured to pick up a driving order; judging whether the pick-up driving order is in a pick-up driving state or not under the condition that the user is currently configured with the pick-up driving order; when the pick-up driving order is in a pick-up driving state, taking a boarding point of the pick-up driving order as a navigation terminal point, and switching from the waiting state to the terminal point confirmation state; and when the pick-up driving order is not in the pick-up driving state, taking a get-off point of the pick-up driving order as a navigation terminal point, and switching from the waiting state to the terminal point confirmation state.
Further, when the voice control command is a specified destination navigation command, the switching from the first navigation state to the second navigation state according to the voice control command includes: switching from the waiting state to the end point input state or the end point confirmation state according to a navigation end point search result; switching from the end point confirmation state to the end point input state or maintaining the end point confirmation state according to a navigation end point search result; and switching from the navigation in-process state to the end point input state or the end point confirmation state according to a navigation end point search result.
Further, when the voice control command is a search information point command, the switching from the first navigation state to the second navigation state according to the voice control command includes: and switching from the end point input state to the end point confirmation state or maintaining the end point input state according to the information point search result.
Further, when the voice control command is a sequence number clear instruction, the switching from the first navigation state to the second navigation state according to the voice control command includes: when the end point confirmation corresponding to the sequence number clear instruction is successful and navigation is started successfully, switching from the end point confirmation state to the navigation in-process state; when the end point confirmation corresponding to the sequence number clear instruction is successful but navigation fails to start, switching from the end point confirmation state to the waiting state; and when the end point confirmation corresponding to the sequence number clear instruction fails, maintaining the end point confirmation state.
Further, when the voice control command is an exit navigation command, the switching from the first navigation state to the second navigation state according to the voice control command includes: switching from any one of the end point confirmation state, the end point input state and the navigation on-the-way state to the waiting state.
Further, the voice control instruction further comprises at least one of a broadcast start instruction, a broadcast stop instruction, a volume adjustment instruction and a base map adjustment instruction; when the control instruction is a broadcast start instruction or a broadcast stop instruction, the navigation broadcast is started or stopped accordingly; when the control instruction is a volume adjustment instruction, the volume of the navigation broadcast is adjusted according to the volume adjustment instruction; and when the control instruction is a base map adjustment instruction, the size of the base map is adjusted according to the base map adjustment instruction.
The embodiment of the disclosure also discloses a navigation control device based on the voice instruction, comprising: the state determining module is used for determining the current first navigation state of the user; the state switching module is used for responding to a voice control instruction of a user and switching from the first navigation state to the second navigation state; wherein, the voice control instruction is any one of the following instructions: a start navigation instruction, a specified destination navigation instruction, an exit navigation instruction, a search information point instruction and a sequence number clear instruction; the first navigation state and the second navigation state are any one of the following states: wait state, end point input state, end point confirmation state, and navigation on-the-way state.
Further, when the voice control instruction is a start navigation instruction, the state switching module is specifically configured to: switching from the waiting state to the end point input state or the end point confirmation state; and switching from the end point confirmation state to the navigation en-route state.
Further, the state switching module is specifically configured to: judging whether the user is currently configured with a pick-up driving order; switching from the waiting state to the end point input state when the user is not currently configured to pick up a driving order; judging whether the pick-up driving order is in a pick-up driving state or not under the condition that the user is currently configured with the pick-up driving order; when the pick-up driving order is in a pick-up driving state, taking a boarding point of the pick-up driving order as a navigation terminal point, and switching from the waiting state to the terminal point confirmation state; and when the pick-up driving order is not in the pick-up driving state, taking a get-off point of the pick-up driving order as a navigation terminal point, and switching from the waiting state to the terminal point confirmation state.
Further, when the voice control instruction is a specified destination navigation instruction, the state switching module is specifically configured to: switching from the waiting state to the end point input state or the end point confirmation state according to a navigation end point search result; switching from the end point confirmation state to the end point input state or maintaining the end point confirmation state according to a navigation end point search result; and switching from the navigation in-process state to the end point input state or the end point confirmation state according to a navigation end point search result.
Further, when the voice control instruction is a search information point instruction, the state switching module is specifically configured to: and switching from the end point input state to the end point confirmation state or maintaining the end point input state according to the information point search result.
Further, when the voice control instruction is a sequence number clear instruction, the state switching module is specifically configured to: when the end point confirmation corresponding to the sequence number clear instruction is successful and navigation is started successfully, switch from the end point confirmation state to the navigation in-process state; when the end point confirmation corresponding to the sequence number clear instruction is successful but navigation fails to start, switch from the end point confirmation state to the waiting state; and when the end point confirmation corresponding to the sequence number clear instruction fails, maintain the end point confirmation state.
Further, when the voice control instruction is an exit navigation instruction, the state switching module is specifically configured to: switch from any one of the end point confirmation state, the end point input state and the navigation on-the-way state to the waiting state.
Further, the voice control instruction further comprises at least one of a broadcast start instruction, a broadcast stop instruction, a volume adjustment instruction and a base map adjustment instruction; the state switching module is specifically configured to: when the control instruction is a broadcast starting instruction or a broadcast closing instruction, corresponding to the broadcast starting instruction or the broadcast closing instruction, starting navigation broadcasting or closing navigation broadcasting; when the control instruction is a volume adjustment instruction, adjusting the volume of the navigation broadcasting according to the volume adjustment instruction; and when the control instruction is a base map adjusting instruction, adjusting the size of the base map according to the base map adjusting instruction.
The embodiment of the disclosure further provides a storage medium storing a computer program, wherein the computer program is executed by a processor to implement the steps of the method in any one of the above technical solutions.
The embodiment of the disclosure further provides an apparatus, at least comprising a memory and a processor, where the memory stores a computer program, and the processor, when executing the computer program on the memory, implements the steps of the method in any of the foregoing technical solutions.
The beneficial effects of the embodiments of the present disclosure are as follows: when a specific voice control instruction of the user is received, different instruction processing logic is executed in different navigation states in combination with the user's current navigation state, so that a ride-hailing driver can enter navigation directly, with the destination taken from the current order instead of being entered separately; no additional operation is required, and the user experience is improved.
Drawings
In order to more clearly illustrate the embodiments of the present disclosure or the technical solutions in the prior art, the drawings that are required to be used in the embodiments or the description of the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments described in the present disclosure, and other drawings may be obtained according to these drawings without inventive effort to a person of ordinary skill in the art.
FIG. 1 is a flow chart of a navigation control method based on voice instructions in a first embodiment of the present disclosure;
FIG. 2 is a schematic diagram illustrating switching between navigation states according to a first embodiment of the present disclosure;
FIG. 3 is a schematic structural diagram of a navigation control device based on voice instructions according to a second embodiment of the present disclosure;
fig. 4 is a schematic structural view of an apparatus according to a fourth embodiment of the present disclosure;
fig. 5 is a specific architecture and an interaction diagram of a user side and a server side in a fourth embodiment of the disclosure.
Detailed Description
Various aspects and features of the disclosure are described herein with reference to the drawings.
It should be understood that various modifications may be made to the embodiments of the application herein. Therefore, the above description should not be taken as limiting, but merely as exemplification of the embodiments. Other modifications within the scope and spirit of this disclosure will occur to persons of ordinary skill in the art.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and, together with a general description of the disclosure given above and the detailed description of the embodiments given below, serve to explain the principles of the disclosure.
These and other characteristics of the present disclosure will become apparent from the following description of a preferred form of embodiment, given as a non-limiting example, with reference to the accompanying drawings.
It should also be understood that, although the present disclosure has been described with reference to some specific examples, a person skilled in the art will certainly be able to achieve many other equivalent forms of the present disclosure, having the characteristics as set forth in the claims and hence all coming within the field of protection defined thereby.
The above and other aspects, features and advantages of the present disclosure will become more apparent in light of the following detailed description when taken in conjunction with the accompanying drawings.
Specific embodiments of the present disclosure will be described hereinafter with reference to the accompanying drawings; however, it is to be understood that the disclosed embodiments are merely examples of the disclosure, which may be embodied in various forms. Well-known and/or repeated functions and constructions are not described in detail to avoid obscuring the disclosure with unnecessary detail. Therefore, specific structural and functional details disclosed herein are not intended to be limiting, but merely serve as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the present disclosure in virtually any appropriately detailed structure.
The specification may use the word "in one embodiment," "in another embodiment," "in yet another embodiment," or "in other embodiments," which may each refer to one or more of the same or different embodiments in accordance with the disclosure.
The first embodiment of the present disclosure provides a navigation control method based on voice instructions, intended mainly for use by a ride-hailing driver while picking up and carrying passengers. Its flowchart is shown in fig. 1, and it mainly includes steps S101 and S102:
s101, determining a current first navigation state of a user;
s102, responding to a voice control instruction of a user, and switching from the first navigation state to the second navigation state.
During a trip, a ride-hailing driver is usually in one of the following states: the waiting state, the end point input state, the end point confirmation state, or the navigation in-process state; that is, both the first navigation state and the second navigation state may be any one of these states. The waiting state is the state of waiting for a voice control instruction from the user while not navigating; the end point input state is the state in which the user has not been assigned a pick-up driving order but needs to enter navigation, and is guided to input an end point; the end point confirmation state is the state in which, after the user actively inputs an end point through a voice instruction or is assigned a pick-up driving order, the system waits for the user to confirm whether the end point given by the voice instruction, or the boarding or alighting point in the order, is to be used as the navigation end point; the navigation in-process state is the state in which navigation has been started and the user's voice control instructions can still be received during navigation.
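Purely as an illustration, the four navigation states above can be modeled as a small enumeration; the type and member names in the following sketch are assumptions introduced here, not identifiers from this disclosure.

```python
from enum import Enum, auto

class NavState(Enum):
    """Hypothetical model of the four navigation states described above."""
    WAITING = auto()           # waiting state: not navigating, waiting for a voice command
    ENDPOINT_INPUT = auto()    # end point input state: guide the user to speak a destination
    ENDPOINT_CONFIRM = auto()  # end point confirmation state: wait for the user to confirm the end point
    NAVIGATING = auto()        # navigation in-process state: navigation started, commands still accepted
```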
The voice control instruction is mainly any one of the following instructions: a start navigation instruction, a specified destination navigation instruction, an exit navigation instruction, a search information point instruction, and a sequence number clear instruction. After the instruction spoken by the user is received, the voice can be converted into text through a speech conversion service, so that the back-end server or processor can determine which voice control instruction the user actually issued. Specifically, the start navigation instruction means the user currently wants to open or enter the navigation function, and corresponds to utterances such as "start navigation", "enter navigation" or "navigation start"; the specified destination navigation instruction means the user wants to enter navigation toward a specific destination, and often corresponds to utterances such as "……" or "……"; the exit navigation instruction means the user wants to end or exit the navigation function, and corresponds to utterances such as "end navigation" or "stop navigation"; the search information point instruction means the user wants to search for a specific kind of place to go to, and often corresponds to utterances such as "help me find a nearby ……" or "where is the nearest ……", where the place spoken by the user is usually an information point such as a bank, a subway station, a convenience store or an office building; the sequence number clear instruction carries a sequence number specified by the user, and often corresponds to utterances such as "the ……th result" or "choose the ……th one".
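As a rough, non-limiting sketch of how the converted text might be mapped to the five instruction types, the snippet below uses hypothetical English keyword patterns (the actual utterances are in Chinese); the function name, patterns and return convention are all assumptions, not the recognizer used by this disclosure.

```python
import re
from enum import Enum, auto
from typing import Optional, Tuple

class Instruction(Enum):
    START_NAV = auto()   # start navigation instruction
    GOTO_DEST = auto()   # specified destination navigation instruction
    EXIT_NAV = auto()    # exit navigation instruction
    SEARCH_POI = auto()  # search information point instruction
    PICK_INDEX = auto()  # sequence number clear instruction (user names a result number)

def parse_instruction(text: str) -> Tuple[Instruction, Optional[str]]:
    """Map recognized text to (instruction type, optional payload); patterns are illustrative only."""
    text = text.lower().strip()
    if re.search(r"\b(start|enter|begin) navigation\b", text):
        return Instruction.START_NAV, None
    if re.search(r"\b(end|exit|stop) navigation\b", text):
        return Instruction.EXIT_NAV, None
    m = re.search(r"\b(?:navigate|go) to (.+)", text)
    if m:
        return Instruction.GOTO_DEST, m.group(1)
    m = re.search(r"\b(?:find|where is)(?: the)?(?: nearest)? (.+)", text)
    if m:
        return Instruction.SEARCH_POI, m.group(1)
    m = re.search(r"\b(?:the )?(\d+)(?:st|nd|rd|th)? (?:result|one)\b", text)
    if m:
        return Instruction.PICK_INDEX, m.group(1)
    raise ValueError(f"unrecognized voice control instruction: {text!r}")
```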
After the user's voice is received and converted into a text instruction, the voice control instruction actually corresponding to that text is determined. The user's current first navigation state is then obtained, and in response to the determined voice control instruction, the current first navigation state is switched to a second navigation state according to the specific content of the instruction. The first navigation state and the second navigation state may each be any one of the waiting state, the end point input state, the end point confirmation state and the navigation in-process state, and they may be the same state or different states; if they are the same, the current control instruction has no substantial effect on the navigation state. The switching logic between the navigation states is described in detail below in connection with fig. 2.
Fig. 2 is a schematic diagram illustrating switching between the navigation states. Solid boxes correspond to the navigation states, and dashed boxes correspond to intermediate states of transition between them; an intermediate state refers to processing performed by the system or software while responding to a control instruction, during which the system or software cannot respond to voice control instructions issued by the user, and from which it switches to different navigation states according to the outcome of the processing. The arrows in the figure indicate switches of navigation state, and the text on each arrow is the voice control instruction currently received, with text in brackets indicating the execution result of the intermediate state.
When the voice control instruction issued by the user is a start navigation instruction: if the user's current navigation state (i.e. the first navigation state) is the waiting state, the waiting state is switched to the end point input state or the end point confirmation state (i.e. the second navigation state) in the course of entering navigation, according to whether the user is currently assigned a pick-up driving order; if the user's current navigation state is the end point confirmation state, receiving a start navigation instruction is taken to mean the user confirms the currently displayed place as the navigation end point, the navigation function is started, and the state is switched to the navigation in-process state once navigation starts successfully, while if navigation fails to start, the start operation can be repeated until it succeeds; in the end point input state and the navigation in-process state, a start navigation instruction issued by the user does not affect the current navigation state, so the current navigation state is kept unchanged (not shown in fig. 2).
Specifically, when the user is not assigned a pick-up driving order, there is no designated destination the driver needs to reach, so the end point input state can be entered to receive a destination actively input by the driver as the navigation end point. If the user is currently assigned a pick-up driving order, the navigation end point can be determined automatically from the current order state, the state is switched to the end point confirmation state, and the driver decides whether to navigate to the place given in the current order. Specifically, the order state mainly comprises a pick-up driving state and a drop-off driving state: in the pick-up driving state, the driver needs to travel to the boarding point selected by the passenger to pick the passenger up, so the passenger's boarding point should be used as the navigation end point; after the passenger is picked up, the order switches to the drop-off driving state, i.e. the passenger is to be taken to the alighting point, so the alighting point selected by the passenger should be used as the navigation end point. Therefore, before switching from the waiting state to the end point confirmation state, whether the user's order is currently in the pick-up driving state needs to be taken into account, so that the appropriate navigation end point is obtained automatically.
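A minimal sketch of this branch, building on the NavState enumeration above; the Order class, its passenger_on_board flag and the set_navigation_endpoint helper are assumptions made for illustration, not part of this disclosure.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Order:
    """Hypothetical pick-up driving order: where to board the passenger and where to drop them off."""
    pickup_point: str
    dropoff_point: str
    passenger_on_board: bool  # False -> pick-up driving state, True -> drop-off driving state

def on_start_navigation_from_waiting(order: Optional[Order]) -> "NavState":
    """Waiting state + start navigation instruction (NavState as sketched earlier)."""
    if order is None:
        # No pick-up driving order assigned: ask the driver to speak a destination.
        return NavState.ENDPOINT_INPUT
    if not order.passenger_on_board:
        # Pick-up leg: the passenger's boarding point becomes the navigation end point.
        set_navigation_endpoint(order.pickup_point)   # hypothetical helper
    else:
        # Drop-off leg: the passenger's alighting point becomes the navigation end point.
        set_navigation_endpoint(order.dropoff_point)  # hypothetical helper
    return NavState.ENDPOINT_CONFIRM                  # wait for the driver to confirm
```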
When the voice control instruction issued by the user is a specified destination navigation instruction: if the user's current navigation state is the waiting state, the specified destination carried in the instruction is searched, and the end point input state or the end point confirmation state is entered according to the search result; specifically, if the search fails, the state switches to the end point input state and the user is prompted to input the navigation end point again, while if the search succeeds, the place corresponding to the specified destination is taken as the navigation end point and the state switches to the end point confirmation state to wait for the user's confirmation. If the user's current navigation state is the end point confirmation state, receiving a specified destination navigation instruction means the place currently awaiting confirmation is not the place the user actually wants to reach; the specified destination carried in the instruction is then searched again, and the end point input state or the end point confirmation state is entered according to the search result. If a specified destination navigation instruction is received while the current navigation state is the navigation in-process state, the user may need to change the driving destination temporarily, for example because the passenger currently being carried wants to change the destination without going through the system software; in this case the state likewise switches to the end point input state or the end point confirmation state according to the search result for the specified destination carried in the instruction.
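The corresponding logic for a specified destination navigation instruction might look roughly as follows, again building on the earlier sketches; search_destination and set_navigation_endpoint are hypothetical helpers.

```python
def on_specified_destination(destination: str) -> "NavState":
    """Specified destination navigation instruction; as described above, the resulting state
    depends only on the search result, whether it arrives in the waiting, end point
    confirmation or navigation in-process state."""
    result = search_destination(destination)  # hypothetical geocoding / map-search helper
    if result is None:
        # Search failed: fall back to the end point input state and prompt the user again.
        return NavState.ENDPOINT_INPUT
    # Search succeeded: take the found place as the navigation end point and wait for confirmation.
    set_navigation_endpoint(result)            # hypothetical helper
    return NavState.ENDPOINT_CONFIRM
```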
When the voice control instruction issued by the user is a search information point instruction: if the user's current navigation state is the end point input state, the information point carried in the instruction is searched and the end point input state or the end point confirmation state is entered according to the search result. Specifically, if the search fails, the system cannot determine the information point specified in the instruction, so it switches to the end point input state and prompts the user to input the information point or end point again; if the search succeeds, the retrieved information point results are presented to the user, the state switches to the end point confirmation state, and the system waits for the user to specify, through a sequence number clear instruction, which search result is to be used as the navigation end point. If the user's current navigation state is the waiting state, the end point confirmation state or the navigation in-process state, the operations and processing logic on receiving a search information point instruction are similar to those in the end point input state and are not repeated here; the corresponding switching logic is not shown in fig. 2.
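A sketch of the search information point branch under the same assumptions; search_nearby_poi and present_candidates are hypothetical helpers, not APIs named by this disclosure.

```python
def on_search_poi(poi_query: str) -> "NavState":
    """Search information point instruction, e.g. "find a nearby bank"."""
    results = search_nearby_poi(poi_query)  # hypothetical POI search call
    if not results:
        # Search failed: stay in (or return to) the end point input state and re-prompt the user.
        return NavState.ENDPOINT_INPUT
    # Present the numbered candidates and wait for a sequence number clear instruction.
    present_candidates(results)              # hypothetical UI / voice-broadcast helper
    return NavState.ENDPOINT_CONFIRM
```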
When the voice control instruction issued by the user is a sequence number clear instruction: after an information point search succeeds, the system waits in the end point confirmation state for the user to select a specific search result as the navigation end point. If the end point confirmation corresponding to the sequence number given in the instruction succeeds and navigation starts normally, the end point confirmation state is switched to the navigation in-process state. If the end point confirmation succeeds but navigation fails to start, the start operation can be repeated; if navigation still fails to start after several consecutive attempts, the end point confirmation state is switched to the waiting state, the user can be prompted by text or voice that navigation failed to start, and restarting the software or consulting maintenance personnel can be recommended. If the end point confirmation corresponding to the sequence number in the instruction fails, for example when three search results are presented but the user selects the fifth result in the control instruction, the system cannot confirm the end point; it then remains in the end point confirmation state and waits for other control instructions from the user. In the waiting state, the end point input state and the navigation in-process state, a sequence number clear instruction issued by the user does not affect the current navigation state, so the current navigation state is kept unchanged (not shown in fig. 2).
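The sequence number branch, including the bounded retry of the navigation start described above, could be sketched as follows; start_navigation, set_navigation_endpoint and prompt_user are hypothetical helpers, and the retry limit of three is an assumption rather than a value given in this disclosure.

```python
def on_sequence_number(index: int, candidates: list, max_retries: int = 3) -> "NavState":
    """Sequence number clear instruction in the end point confirmation state; index is 1-based."""
    if not 1 <= index <= len(candidates):
        # e.g. "the fifth result" when only three candidates were presented:
        # confirmation fails, keep waiting in the end point confirmation state.
        return NavState.ENDPOINT_CONFIRM
    set_navigation_endpoint(candidates[index - 1])  # hypothetical helper
    for _ in range(max_retries):
        if start_navigation():                      # hypothetical navigation-engine call
            return NavState.NAVIGATING
    # Navigation repeatedly failed to start: prompt the user and fall back to waiting.
    prompt_user("Navigation failed to start; please retry or restart the app.")  # hypothetical
    return NavState.WAITING
```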
The end point confirmation state, the end point input state and the navigation in-process state are all states in which the user is about to navigate or is navigating. If the user issues an exit navigation instruction in any of these three states, the user wants to stop navigating, and the state can be switched directly from any of the three to the waiting state. If the user is currently in the waiting state, no navigation-related operation is actually in progress, so if an exit navigation instruction is received the system simply remains in the waiting state (not shown in fig. 2).
Further, the voice control instruction may also include at least one of a start broadcast instruction, a stop broadcast instruction, a volume adjustment instruction and a base map adjustment instruction, which are used for basic operations while the user is using the software. For example, during navigation the user can start or stop the navigation broadcast by issuing a start broadcast instruction or a stop broadcast instruction; or, in any navigation state, adjust the volume of the navigation broadcast through a volume adjustment instruction, for example turning the sound up or down or muting it (not shown in fig. 2); or adjust the size of the base map through a base map adjustment instruction, for example setting the base map level to the maximum or minimum, or zooming the base map in or out (not shown in fig. 2).
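Since these auxiliary instructions adjust settings rather than switching between the four navigation states in fig. 2, a handler for them can simply return the current state; the sketch below makes that assumption, with tts_broadcast and map_view as hypothetical controller objects.

```python
from typing import Optional

def on_auxiliary_instruction(state: "NavState", kind: str, value: Optional[str] = None) -> "NavState":
    """Broadcast / volume / base map instructions: adjust settings, keep the navigation state."""
    if kind == "broadcast_on":
        tts_broadcast.enable()             # hypothetical broadcast controller
    elif kind == "broadcast_off":
        tts_broadcast.disable()
    elif kind == "volume":
        tts_broadcast.set_volume(value)    # e.g. "up", "down", "mute"
    elif kind == "base_map":
        map_view.adjust_zoom(value)        # hypothetical map view, e.g. "max", "min", "in", "out"
    return state                           # the second navigation state equals the first
```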
When a specific voice control instruction of the user is received, this embodiment executes different instruction processing logic in different navigation states in combination with the user's current navigation state, so that a ride-hailing driver can enter navigation directly, with the destination taken from the current order instead of being entered separately; no additional operation is required, and the user experience is improved.
The second embodiment of the present disclosure provides a navigation control device based on voice instructions, which may be deployed in a back-end server of ride-hailing software or navigation software and interact with the user terminal used by a ride-hailing driver. Its structural schematic diagram is shown in fig. 3, and it mainly includes a state determining module 10 and a state switching module 20 coupled to each other, where the state determining module 10 is mainly used to determine the current first navigation state of the user, and the state switching module 20 is mainly used to switch from the first navigation state to the second navigation state in response to a voice control instruction of the user.
During a trip, a ride-hailing driver is usually in one of the following states: the waiting state, the end point input state, the end point confirmation state, or the navigation in-process state; that is, both the first navigation state and the second navigation state may be any one of these states. The voice control instruction is mainly any one of the following instructions: a start navigation instruction, a specified destination navigation instruction, an exit navigation instruction, a search information point instruction, and a sequence number clear instruction. After the user terminal receives the instruction spoken by the user, it may convert the voice into text through a speech conversion service and send the text to the back-end server, so that the back-end server or processor can determine the voice control instruction the user actually issued; alternatively, the user terminal may send the spoken instruction directly to the back-end server, which performs the speech-to-text conversion and determines the corresponding voice control instruction from the resulting text.
After determining the voice control command sent by the user, the state determining module 10 obtains the current first navigation state of the user, responds to the determined voice control command through the state switching module 20, and switches the current first navigation state to the second navigation state according to the specific content of the control command, wherein the first navigation state and the second navigation state can be any one of a waiting state, an end point input state, an end point confirmation state and a navigation in-process state, and the first navigation state and the second navigation state can be the same state or different states, and if the first navigation state and the second navigation state are the same, the current control command is proved to have no substantial influence on the navigation state. The switching logic between the navigational states is described in detail below in connection with fig. 2.
When it is determined that the voice control instruction issued by the user is a start navigation instruction: if the user's current navigation state (i.e. the first navigation state) is the waiting state, the state switching module 20 switches the waiting state to the end point input state or the end point confirmation state (i.e. the second navigation state) according to whether the user is currently assigned a pick-up driving order; if the user's current navigation state is the end point confirmation state, receiving a start navigation instruction is taken to mean the user confirms the currently displayed place as the navigation end point, the navigation function is started, and the state is switched to the navigation in-process state once navigation starts successfully, while if navigation fails to start, the start operation can be repeated until it succeeds; in the end point input state and the navigation in-process state, a start navigation instruction issued by the user does not affect the current navigation state, so the current navigation state is kept unchanged.
Specifically, when the state switching module 20 determines that the user is not currently assigned a pick-up driving order, there is no designated destination the driver needs to reach, so the end point input state can be entered to receive a destination actively input by the driver as the navigation end point. If it is determined that the user is currently assigned a pick-up driving order, the navigation end point can be determined automatically from the current order state, the state is switched to the end point confirmation state, and the driver decides whether to navigate to the place given in the current order. Specifically, the order state mainly comprises a pick-up driving state and a drop-off driving state: in the pick-up driving state, the driver needs to travel to the boarding point selected by the passenger to pick the passenger up, so the passenger's boarding point should be used as the navigation end point; after the passenger is picked up, the order switches to the drop-off driving state, i.e. the passenger is to be taken to the alighting point, so the alighting point selected by the passenger should be used as the navigation end point. Therefore, before switching from the waiting state to the end point confirmation state, whether the user's order is currently in the pick-up driving state needs to be taken into account, so that the appropriate navigation end point is obtained automatically.
When determining that the voice control instruction sent by the user is a specified destination navigation instruction, if the current navigation state of the user is a waiting state, the state switching module 20 searches for a specified destination carried in the instruction when receiving the specified destination navigation instruction, and enters a destination input state or a destination confirmation state according to a search result; if the current navigation state of the user is an end point confirmation state, when a specified end point navigation instruction is received, the position confirmed by the current user is not the position which the user actually wants to reach, at the moment, the specified end point carried in the instruction is searched again, and the end point input state or the end point confirmation state is entered according to the search result; if the user receives the specified destination navigation command when the current navigation state is the navigation in-process state, the user may need to change the driving destination temporarily, or the passenger currently driving wants to change the destination when the passenger does not pass through the system software, and the like, at this time, the user correspondingly switches to the destination input state or the destination confirmation state according to the search result of the specified destination carried in the command.
When determining that the voice control instruction sent by the user is an instruction for searching information points, if the current navigation state of the user is an end point input state, the state switching module 20 searches information points carried in the instruction, and enters an end point input state or an end point confirmation state according to a search result; if the current navigation state of the user is a waiting state, an end point confirmation state or an in-process navigation state, the specific operation and processing logic when the instruction of the search information point is received are similar to those when the end point input state receives the instruction of the search information point, and the detailed description is not repeated here.
When it is determined that the voice control instruction issued by the user is a sequence number clear instruction, the state switching module 20, after an information point search has succeeded, waits in the end point confirmation state for the user to select a specific search result as the navigation end point. If the end point confirmation corresponding to the sequence number given in the instruction succeeds and navigation starts normally, the end point confirmation state is switched to the navigation in-process state. If the end point confirmation succeeds but navigation fails to start, the start operation can be repeated; if navigation still fails to start after several consecutive attempts, the end point confirmation state is switched to the waiting state, the user can be prompted by text or voice that navigation failed to start, and restarting the software or consulting maintenance personnel can be recommended. If the end point confirmation corresponding to the sequence number in the instruction fails, for example when three search results are presented but the user selects the fifth result in the control instruction, the system cannot confirm the end point; it then remains in the end point confirmation state and waits for other control instructions from the user. In the waiting state, the end point input state and the navigation in-process state, a sequence number clear instruction issued by the user does not affect the current navigation state, so the current navigation state is kept unchanged.
The end point confirmation state, the end point input state and the navigation in-process state are all states that the user is about to navigate or is navigating, if the user sends out a navigation exit instruction in any one of the three states, the user is indicated to want to stop navigating currently, and at this time, the state switching module 20 only needs to switch from any one of the three states to the waiting state directly; if the user is currently in a waiting state, the user does not actually perform any navigation-related operation, and if an exit navigation instruction is received at the moment, the user is kept in the waiting state.
Further, the voice control instruction further includes at least one of an on-broadcasting instruction, an off-broadcasting instruction, a volume adjustment instruction, and a base map adjustment instruction, which are used for performing some basic operations of the user in the process of using the software, and the state switching module 20 can implement corresponding operations according to the above instructions. For example, in the navigation process, a user can control the start or stop of the navigation broadcasting by sending a start broadcasting instruction or a stop broadcasting instruction; or the volume of the navigation broadcasting is adjusted through a volume adjusting instruction in any navigation state, so as to realize operations such as sound increase, sound decrease, silence and the like; or the size of the base map is adjusted through the base map adjusting instruction, so that the operations of adjusting the base map level to the maximum, adjusting the base map level to the minimum, enlarging the base map, shrinking the base map and the like are realized.
When a specific voice control instruction of the user is received, this embodiment executes different instruction processing logic in different navigation states in combination with the user's current navigation state, so that a ride-hailing driver can enter navigation directly, with the destination taken from the current order instead of being entered separately; no additional operation is required, and the user experience is improved.
A third embodiment of the present disclosure provides a storage medium, which is a computer-readable medium storing a computer program that, when executed by a processor, implements the method provided by any embodiment of the present disclosure, including steps S11 and S12 as follows:
s11, determining a current first navigation state of a user;
s12, responding to a voice control instruction of a user, and switching from the first navigation state to the second navigation state.
Specifically, the voice control instruction is any one of the following instructions: a start navigation instruction, a specified destination navigation instruction, an exit navigation instruction, a search information point instruction and a sequence number clear instruction; the first navigation state and the second navigation state are any one of the following states: wait state, end point input state, end point confirmation state, and navigation on-the-way state.
When the voice control command is a start navigation command, the computer program is executed by the processor and is switched from the first navigation state to the second navigation state according to the voice control command, and the processor specifically executes the following steps: switching from the waiting state to an end point input state or an end point confirmation state; and switching from the end point confirmation state to the navigation en-route state.
The computer program is executed by the processor to switch from the waiting state to the end point input state or the end point confirmation state, and the processor specifically executes the following steps: judging whether the user is currently configured with a pick-up driving order; switching from the waiting state to the end point input state under the condition that the user is not configured with a pick-up driving order currently; judging whether the pick-up driving order is in a pick-up driving state or not under the condition that the user is currently configured with the pick-up driving order; when the pick-up driving order is in the pick-up driving state, taking a get-on point of the pick-up driving order as a navigation terminal point, and switching from a waiting state to a terminal point confirmation state; and when the pick-up driving order is not in the pick-up driving state, taking the get-off point of the pick-up driving order as a navigation terminal point, and switching from the waiting state to the terminal point confirmation state.
When the voice control instruction is a specified destination navigation instruction, the computer program is executed by the processor and is switched from the first navigation state to the second navigation state according to the voice control instruction, and the processor specifically executes the following steps: switching from the waiting state to an end point input state or an end point confirmation state according to the navigation end point search result; switching from the end point confirmation state to the end point input state or maintaining the end point confirmation state according to the navigation end point search result; and switching from the navigation in-process state to the end point input state or the end point confirmation state according to the navigation end point search result.
When the voice control instruction is a search information point instruction, the computer program is executed by the processor and is switched from the first navigation state to the second navigation state according to the voice control instruction, and the processor specifically executes the following steps: and switching from the end point input state to the end point confirmation state or maintaining the end point input state according to the information point search result.
When the voice control command is a sequence number clear instruction, and the computer program is executed by the processor to switch from the first navigation state to the second navigation state according to the voice control command, the processor specifically executes the following steps: switching from the end point confirmation state to the navigation in-process state under the condition that the end point confirmation corresponding to the sequence number clear instruction succeeds and navigation starts successfully; switching from the end point confirmation state to the waiting state under the condition that the end point confirmation corresponding to the sequence number clear instruction succeeds but navigation fails to start; and maintaining the end point confirmation state when the end point confirmation corresponding to the sequence number clear instruction fails.
When the voice control command is an exit navigation command, and the computer program is executed by the processor to switch from the first navigation state to the second navigation state according to the voice control command, the processor specifically executes the following step: switching from any one of the end point confirmation state, the end point input state and the navigation in-process state to the waiting state.
The voice control instruction further comprises at least one of a broadcast start instruction, a broadcast stop instruction, a volume adjustment instruction and a base map adjustment instruction; the computer program is further executed by the processor to: when the control instruction is a broadcast starting instruction or a broadcast closing instruction, the navigation broadcast is started or closed correspondingly according to the broadcast starting instruction or the broadcast closing instruction; when the control instruction is a volume adjustment instruction, adjusting the volume of the navigation broadcasting according to the volume adjustment instruction; and when the control instruction is a base map adjusting instruction, adjusting the size of the base map according to the base map adjusting instruction.
When a specific voice control instruction of the user is received, this embodiment executes different instruction processing logic in different navigation states in combination with the user's current navigation state, so that a ride-hailing driver can enter navigation directly, with the destination taken from the current order instead of being entered separately; no additional operation is required, and the user experience is improved.
The fourth embodiment of the present disclosure provides an apparatus, which may serve as a back-end server of ride-hailing software or navigation software. Its structural schematic diagram is shown in fig. 4; it at least includes a memory 100 and a processor 200, where the memory 100 stores a computer program, and the processor 200 implements the method provided by any embodiment of the present disclosure when executing the computer program on the memory 100. Exemplarily, the computer program implements steps S21 and S22 as follows:
S21, determining a current first navigation state of a user;
s22, responding to a voice control instruction of a user, and switching from the first navigation state to the second navigation state.
Specifically, the voice control instruction is any one of the following instructions: a start navigation instruction, a specified destination navigation instruction, an exit navigation instruction, a search information point instruction and a sequence number clear instruction; the first navigation state and the second navigation state are any one of the following states: wait state, end point input state, end point confirmation state, and navigation on-the-way state.
When the voice control command is a start navigation command, the processor executes the following computer program when the processor is switched from the first navigation state to the second navigation state according to the voice control command, wherein the computer program is stored in the memory: switching from the waiting state to an end point input state or an end point confirmation state; and switching from the end point confirmation state to the navigation en-route state.
When executing the switch from the waiting state to the end point input state or the end point confirmation state stored in the memory, the processor specifically executes the following computer program: judging whether the user is currently configured with a pick-up driving order; switching from the waiting state to the end point input state when the user is not currently configured with a pick-up driving order; judging whether the pick-up driving order is in a pick-up driving state when the user is currently configured with a pick-up driving order; when the pick-up driving order is in the pick-up driving state, taking the boarding point of the pick-up driving order as the navigation end point and switching from the waiting state to the end point confirmation state; and when the pick-up driving order is not in the pick-up driving state, taking the drop-off point of the pick-up driving order as the navigation end point and switching from the waiting state to the end point confirmation state.
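A minimal sketch of this start-navigation branch follows. It assumes the pick-up driving order is a dictionary with illustrative "status", "boarding_point", and "dropoff_point" fields; these field names and status values are assumptions made for the example, not part of the disclosure.

```python
from typing import Optional, Tuple

WAITING, END_POINT_INPUT, END_POINT_CONFIRM = "waiting", "end_point_input", "end_point_confirm"


def handle_start_navigation(state: str, order: Optional[dict]) -> Tuple[str, Optional[str]]:
    """Return (second navigation state, navigation end point if already known)."""
    if state != WAITING:
        return state, None                   # this branch only applies in the waiting state
    if order is None:                        # no pick-up driving order configured
        return END_POINT_INPUT, None         # ask the user to input an end point
    if order["status"] == "picking_up":      # order is in the pick-up driving state
        return END_POINT_CONFIRM, order["boarding_point"]   # boarding point as end point
    return END_POINT_CONFIRM, order["dropoff_point"]        # drop-off point as end point


print(handle_start_navigation(WAITING, None))
print(handle_start_navigation(WAITING, {"status": "picking_up",
                                        "boarding_point": "Terminal 2, Gate 3",
                                        "dropoff_point": "Hotel A"}))
```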
When the voice control instruction is a specified destination navigation instruction, in switching from the first navigation state to the second navigation state according to the voice control instruction, the processor executes the following computer program stored in the memory: switching from the waiting state to the end point input state or the end point confirmation state according to the navigation end point search result; switching from the end point confirmation state to the end point input state, or maintaining the end point confirmation state, according to the navigation end point search result; and switching from the navigation in-process state to the end point input state or the end point confirmation state according to the navigation end point search result.
When the voice control instruction is a search information point instruction, in switching from the first navigation state to the second navigation state according to the voice control instruction, the processor executes the following computer program stored in the memory: switching from the end point input state to the end point confirmation state, or maintaining the end point input state, according to the information point search result.
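The two search-driven branches above (the specified destination navigation instruction and the search information point instruction) can be sketched together. The disclosure only names the possible target states for each starting state, so the rule used here, that a successful search leads to the end point confirmation state while a failed search leads to or keeps the end point input state, is an assumption made for illustration.

```python
WAITING, END_POINT_INPUT, END_POINT_CONFIRM, NAVIGATING = (
    "waiting", "end_point_input", "end_point_confirm", "navigating")


def handle_specified_destination(state: str, endpoint_found: bool) -> str:
    """Specified destination navigation: applies in the waiting, end point
    confirmation, and navigation in-process states."""
    if state in (WAITING, END_POINT_CONFIRM, NAVIGATING):
        return END_POINT_CONFIRM if endpoint_found else END_POINT_INPUT
    return state


def handle_search_info_point(state: str, info_point_found: bool) -> str:
    """Search information point: applies in the end point input state."""
    if state == END_POINT_INPUT:
        return END_POINT_CONFIRM if info_point_found else END_POINT_INPUT
    return state


print(handle_specified_destination(NAVIGATING, endpoint_found=True))      # end_point_confirm
print(handle_search_info_point(END_POINT_INPUT, info_point_found=False))  # end_point_input
```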
When the voice control instruction is a sequence number clear instruction, in switching from the first navigation state to the second navigation state according to the voice control instruction, the processor executes the following computer program stored in the memory: switching from the end point confirmation state to the navigation in-process state when the end point confirmation corresponding to the sequence number clear instruction succeeds and navigation starts successfully; switching from the end point confirmation state to the waiting state when the end point confirmation corresponding to the sequence number clear instruction succeeds but navigation fails to start; and maintaining the end point confirmation state when the end point confirmation corresponding to the sequence number clear instruction fails.
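Restated as a sketch, with two boolean flags standing in for "the end point confirmation corresponding to the spoken sequence number succeeded" and "navigation started successfully"; how those flags are produced is outside this snippet and is not specified here.

```python
END_POINT_CONFIRM, NAVIGATING, WAITING = "end_point_confirm", "navigating", "waiting"


def handle_sequence_number_clear(state: str, confirm_ok: bool, start_ok: bool) -> str:
    if state != END_POINT_CONFIRM:
        return state                  # the branch applies only in the end point confirmation state
    if not confirm_ok:
        return END_POINT_CONFIRM      # end point confirmation failed: stay and re-prompt
    return NAVIGATING if start_ok else WAITING  # started -> navigating, otherwise back to waiting


print(handle_sequence_number_clear(END_POINT_CONFIRM, confirm_ok=True, start_ok=False))  # waiting
```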
When the voice control instruction is an exit navigation instruction, in switching from the first navigation state to the second navigation state according to the voice control instruction, the processor executes the following computer program stored in the memory: switching from any one of the end point confirmation state, the end point input state, and the navigation in-process state to the waiting state.
Specifically, the voice control instruction further comprises at least one of a broadcast start instruction, a broadcast stop instruction, a volume adjustment instruction, and a base map adjustment instruction; the processor further executes the following computer program: when the control instruction is a broadcast start instruction or a broadcast stop instruction, turning the navigation broadcast on or off accordingly; when the control instruction is a volume adjustment instruction, adjusting the volume of the navigation broadcast according to the volume adjustment instruction; and when the control instruction is a base map adjustment instruction, adjusting the size of the base map according to the base map adjustment instruction.
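These auxiliary instructions do not change the navigation state; one way they might adjust a settings record is sketched below. The settings dictionary, value ranges, and instruction names are illustrative assumptions, not part of the disclosure.

```python
def apply_auxiliary_instruction(settings: dict, instruction: str, value: float = 0.0) -> dict:
    if instruction == "broadcast_on":
        settings["broadcast"] = True                    # turn the navigation broadcast on
    elif instruction == "broadcast_off":
        settings["broadcast"] = False                   # turn the navigation broadcast off
    elif instruction == "adjust_volume":
        settings["volume"] = min(max(value, 0.0), 1.0)  # clamp broadcast volume to [0, 1]
    elif instruction == "adjust_base_map":
        settings["base_map_scale"] = max(value, 0.1)    # resize the base map
    return settings


settings = {"broadcast": True, "volume": 0.5, "base_map_scale": 1.0}
print(apply_auxiliary_instruction(settings, "adjust_volume", 0.8))
```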
When a specific voice control instruction is received from a user, this embodiment executes different instruction processing logic in different navigation states in combination with the user's current navigation state. Combined with the current order situation, a ride-hailing driver can thus enter the navigation state directly, without additionally inputting a destination or performing extra operations, which improves the user experience.
Fig. 5 shows a specific architecture and interaction diagram of the client and the server in actual use. As shown in fig. 5, the client mainly includes a voice interaction component, which receives the user's voice and presents the server's feedback such as voice broadcasts and text prompts, and a voice assistant component, which implements voice conversion, instruction disassembly, and invoking the navigation component. The server mainly comprises a voice recognition service layer, a voice assistant service layer, an instruction recognition service layer, a retrieval service layer, and a navigation service layer. The voice recognition service layer provides a speech-to-text service for the client's voice assistant component; the voice assistant service layer acquires the client state and decides which operation to execute, that is, it implements the switching logic among the navigation states shown in fig. 2 of the first embodiment of the present disclosure; the instruction recognition service layer converts the recognized text into the corresponding specific voice control instruction; the retrieval service layer provides retrieval services; and the navigation service layer serves the client's navigation component and implements the specific navigation functions.
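For orientation only, the division of labor among these server layers can be sketched as a pipeline. Every function below is a placeholder with an assumed name; a real system would implement each layer as a separate service.

```python
def speech_to_text(audio: bytes) -> str:
    """Placeholder for the voice recognition service layer."""
    return "navigate to the airport"


def recognize_instruction(text: str) -> str:
    """Placeholder for the instruction recognition service layer."""
    return "specified_destination" if text.startswith("navigate to") else "unknown"


def voice_assistant_service(audio: bytes, first_state: str) -> dict:
    """Placeholder for the voice assistant service layer: acquires the client's
    state and decides the operation (the state switching of fig. 2); the
    per-instruction handlers sketched earlier would be applied at this point."""
    text = speech_to_text(audio)
    instruction = recognize_instruction(text)
    return {"text": text, "instruction": instruction, "first_state": first_state}


print(voice_assistant_service(b"...", "waiting"))
```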
In some implementations, the client and server may communicate using any currently known or future-developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected by digital data communication in any form or medium (e.g., a communication network). Examples of communication networks include a local area network (LAN), a wide area network (WAN), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future-developed network.
The storage medium may be contained in the electronic device; or may exist alone without being incorporated into the electronic device.
The storage medium carries one or more programs that, when executed by the electronic device, cause the electronic device to: acquiring at least two internet protocol addresses; sending a node evaluation request comprising at least two internet protocol addresses to node evaluation equipment, wherein the node evaluation equipment selects an internet protocol address from the at least two internet protocol addresses and returns the internet protocol address; receiving an Internet protocol address returned by node evaluation equipment; wherein the acquired internet protocol address indicates an edge node in the content distribution network.
Alternatively, the storage medium carries one or more programs that, when executed by the electronic device, cause the electronic device to: receiving a node evaluation request comprising at least two internet protocol addresses; selecting an internet protocol address from at least two internet protocol addresses; returning the selected internet protocol address; wherein the received internet protocol address indicates an edge node in the content distribution network.
Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages, including, but not limited to, object oriented programming languages such as Java, Smalltalk, and C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
It should be noted that the storage medium described in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any storage medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a storage medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present disclosure may be implemented by means of software or by means of hardware. In some cases, the names of the units do not constitute a limitation of the units themselves.
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The foregoing description is only of the preferred embodiments of the present disclosure and an explanation of the technical principles employed. It will be appreciated by persons skilled in the art that the scope of the disclosure is not limited to technical solutions formed by the specific combinations of features described above, and also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the spirit of the disclosure, for example, technical solutions formed by substituting the above features with technical features having similar functions disclosed in (but not limited to) the present disclosure.
Moreover, although operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limiting the scope of the present disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are example forms of implementing the claims.
While various embodiments of the present disclosure have been described in detail, the present disclosure is not limited to these specific embodiments; various modifications and variations can be made by those skilled in the art on the basis of the concepts of the present disclosure, and such modifications and variations shall fall within the scope claimed by the present disclosure.

Claims (14)

1. A navigation control method based on voice instructions, comprising:
determining a current first navigation state of a user;
responding to a voice control instruction of a user, and switching from the first navigation state to a second navigation state;
wherein if the voice control instruction is a start navigation instruction, the first navigation state includes a waiting state, the second navigation state includes an end point input state or an end point confirmation state, and switching from the first navigation state to the second navigation state includes:
Judging whether the user is currently configured with a pick-up driving order;
switching from the waiting state to the end point input state when the user is not currently configured with a pick-up driving order;
judging whether the pick-up driving order is in a pick-up driving state or not under the condition that the user is currently configured with the pick-up driving order;
when the pick-up driving order is in a pick-up driving state, taking a boarding point of the pick-up driving order as a navigation terminal point, and switching from a waiting state to a terminal point confirmation state;
and when the pick-up driving order is not in the pick-up driving state, taking a drop-off point of the pick-up driving order as a navigation terminal point, and switching from the waiting state to the terminal point confirmation state.
2. The navigation control method of claim 1, wherein switching from the first navigation state to the second navigation state if the voice control instruction is a specified destination navigation instruction comprises:
switching from the waiting state to an end point input state or the end point confirmation state according to a navigation end point search result;
switching from the end point confirmation state to the end point input state or maintaining the end point confirmation state according to a navigation end point search result;
And switching from the navigation in-process state to the end point input state or the end point confirmation state according to the navigation end point search result.
3. The navigation control method of claim 1, wherein switching from the first navigation state to a second navigation state if the voice control instruction is a retrieve information point instruction comprises:
and switching from the end point input state to the end point confirmation state or maintaining the end point input state according to the information point search result.
4. The navigation control method according to claim 1, wherein switching from the first navigation state to a second navigation state if the voice control instruction is a sequence number clear instruction comprises:
when the end point confirmation corresponding to the sequence number clear instruction is successful and the navigation is started successfully, switching from the end point confirmation state to the navigation in-process state;
switching from the end point confirmation state to the waiting state under the condition that the end point confirmation corresponding to the sequence number clear instruction is successful and the navigation starting fails;
and when the end point confirmation corresponding to the sequence number clear instruction fails, maintaining the end point confirmation state.
5. The navigation control method according to claim 1, wherein switching from the first navigation state to a second navigation state if the voice control instruction is an exit navigation instruction, comprises:
and switching from any one of the end point confirmation state, the end point input state, and the navigation in-process state to the waiting state.
6. The navigation control method of claim 1, wherein the voice control instruction further comprises at least one of a broadcast start instruction, a broadcast stop instruction, a volume adjustment instruction, and a base map adjustment instruction; wherein,
when the control instruction is a broadcast start instruction or a broadcast stop instruction, turning the navigation broadcast on or off according to the broadcast start instruction or the broadcast stop instruction;
when the control instruction is a volume adjustment instruction, adjusting the volume of the navigation broadcasting according to the volume adjustment instruction;
and when the control instruction is a base map adjusting instruction, adjusting the size of the base map according to the base map adjusting instruction.
7. A navigation control device based on voice instructions, comprising:
the state determining module is used for determining the current first navigation state of the user;
The state switching module is used for responding to a voice control instruction of a user and switching from the first navigation state to the second navigation state;
wherein, if the voice control instruction is a start navigation instruction, the first navigation state includes a waiting state, the second navigation state includes an end point input state or an end point confirmation state, and the state switching module is specifically configured to:
judging whether the user is currently configured with a pick-up driving order;
switching from the waiting state to the end point input state when the user is not currently configured with a pick-up driving order;
judging whether the pick-up driving order is in a pick-up driving state or not under the condition that the user is currently configured with the pick-up driving order;
when the pick-up driving order is in a pick-up driving state, taking a boarding point of the pick-up driving order as a navigation terminal point, and switching from a waiting state to a terminal point confirmation state;
and when the pick-up driving order is not in the pick-up driving state, taking a drop-off point of the pick-up driving order as a navigation terminal point, and switching from the waiting state to the terminal point confirmation state.
8. The navigation control device of claim 7, wherein if the voice control command is a specified destination navigation command, the state switching module is specifically configured to:
Switching from the waiting state to an end point input state or the end point confirmation state according to a navigation end point search result;
switching from the end point confirmation state to the end point input state or maintaining the end point confirmation state according to a navigation end point search result;
and switching from the navigation in-process state to the end point input state or the end point confirmation state according to the navigation end point search result.
9. The navigation control device of claim 7, wherein if the voice control instruction is a retrieve information point instruction, the state switching module is specifically configured to:
and switching from the end point input state to the end point confirmation state or maintaining the end point input state according to the information point search result.
10. The navigation control device of claim 7, wherein if the voice control instruction is a sequence number clear instruction, the state switching module is specifically configured to:
when the end point confirmation corresponding to the sequence number clear instruction is successful and the navigation is started successfully, switching from the end point confirmation state to the navigation in-process state;
switching from the end point confirmation state to the waiting state under the condition that the end point confirmation corresponding to the sequence number clear instruction is successful and the navigation starting fails;
And when the end point confirmation corresponding to the sequence number clear instruction fails, maintaining the end point confirmation state.
11. The navigation control device of claim 7, wherein if the voice control command is an exit navigation command, the state switching module is specifically configured to:
and switching from any one of the end point confirmation state, the end point input state, and the navigation in-process state to the waiting state.
12. The navigation control device of claim 7, wherein the voice control instruction further comprises at least one of a broadcast start instruction, a broadcast stop instruction, a volume adjustment instruction, and a base map adjustment instruction; the state switching module is specifically configured to:
when the control instruction is a broadcast start instruction or a broadcast stop instruction, turning the navigation broadcast on or off according to the broadcast start instruction or the broadcast stop instruction;
when the control instruction is a volume adjustment instruction, adjusting the volume of the navigation broadcasting according to the volume adjustment instruction;
and when the control instruction is a base map adjusting instruction, adjusting the size of the base map according to the base map adjusting instruction.
13. A storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the steps of the method according to any one of claims 1 to 6.
14. A navigation control device based on voice instructions, comprising at least a memory, a processor, said memory having stored thereon a computer program, characterized in that said processor, when executing the computer program on said memory, implements the steps of the method according to any of claims 1 to 6.
CN201911359715.5A 2019-12-25 2019-12-25 Navigation control method, device, storage medium and equipment based on voice instruction Active CN111811534B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911359715.5A CN111811534B (en) 2019-12-25 2019-12-25 Navigation control method, device, storage medium and equipment based on voice instruction

Publications (2)

Publication Number Publication Date
CN111811534A CN111811534A (en) 2020-10-23
CN111811534B true CN111811534B (en) 2023-10-31

Family

ID=72844598

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911359715.5A Active CN111811534B (en) 2019-12-25 2019-12-25 Navigation control method, device, storage medium and equipment based on voice instruction

Country Status (1)

Country Link
CN (1) CN111811534B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
BE1013328A5 (en) * 1998-08-06 2001-12-04 Bosch Gmbh Robert Navigation system and method for moving average.
CN103591947A (en) * 2012-08-13 2014-02-19 百度在线网络技术(北京)有限公司 Voice background navigation method of mobile terminal and mobile terminal
CN105115491A (en) * 2015-08-13 2015-12-02 努比亚技术有限公司 Navigation method, master device and slave devices
CN107305483A (en) * 2016-04-25 2017-10-31 北京搜狗科技发展有限公司 A kind of voice interactive method and device based on semantics recognition
CN107682536A (en) * 2017-09-25 2018-02-09 努比亚技术有限公司 A kind of sound control method, terminal and computer-readable recording medium
CN108592938A (en) * 2018-06-11 2018-09-28 百度在线网络技术(北京)有限公司 Navigation route planning method, apparatus and storage medium
CN108694452A (en) * 2017-04-06 2018-10-23 北京嘀嘀无限科技发展有限公司 Vehicle-mounted order method and device, server, onboard system, vehicle
CN110132300A (en) * 2019-05-22 2019-08-16 未来(北京)黑科技有限公司 The air navigation aid and device, storage medium and electronic device of net about vehicle
CN110472095A (en) * 2019-08-16 2019-11-19 百度在线网络技术(北京)有限公司 Voice guide method, apparatus, equipment and medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9230556B2 (en) * 2012-06-05 2016-01-05 Apple Inc. Voice instructions during navigation

Also Published As

Publication number Publication date
CN111811534A (en) 2020-10-23

Similar Documents

Publication Publication Date Title
US10354478B2 (en) Method, apparatus, and system for automatic refueling of driverless vehicle
CN105702254A (en) Voice control system based on mobile terminal and voice control method thereof
US11067400B2 (en) Request and provide assistance to avoid trip interruption
CN103680134A (en) Method, device and system of providing taxi calling service
CN109272133A (en) Method and apparatus for reserving vehicle
CN113135178A (en) Parking route sharing method, device, equipment and storage medium
JP2012256001A (en) Device and method for voice recognition in mobile body
CN110019740A (en) Exchange method, car-mounted terminal, server and the storage medium of car-mounted terminal
KR101889046B1 (en) Method and system for processing an order for traffic demand service
JP6966598B2 (en) How to handle a feasible set of transfers to calculate an itinerary within a multi-modal transportation network
CN101620814A (en) Tour guide and navigation method and tour guide and navigation terminal
CN110489670B (en) Civil service mobile application platform system based on multidimensional map and application method thereof
CN111811534B (en) Navigation control method, device, storage medium and equipment based on voice instruction
WO2022205357A1 (en) Autonomous driving control method, electronic device, mobile terminal, and vehicle
JP2002150039A (en) Service intermediation device
KR102099144B1 (en) Travel help service providing system based on location
CN112163685A (en) Intelligent trip matching method and system based on voice AI
US11320804B2 (en) Multi information provider system of guidance robot and method thereof
US20200226707A1 (en) Vehicle information processing apparatus, vehicle information processing system, and method of processing vehicle information
JPWO2017159662A1 (en) Route learning system and route learning program
JP2017228221A (en) Reservation device, reservation method and on-vehicle system
WO2020102372A1 (en) Predictive cleanliness function for vehicle-sharing fleet
CN103188138A (en) Interactive message data processing system
CN114840653B (en) Dialogue processing method, device, equipment and storage medium
CN109357682A (en) A kind of road navigation method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant