CN117590998A - Human-vehicle interaction method combining physical keys in voice scene and related equipment - Google Patents

Human-vehicle interaction method combining physical keys in voice scene and related equipment

Info

Publication number
CN117590998A
Authority
CN
China
Prior art keywords
page turning
control
display screen
page
central control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311691474.0A
Other languages
Chinese (zh)
Inventor
樊倩
张亭亭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Great Wall Motor Co Ltd
Original Assignee
Great Wall Motor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Great Wall Motor Co Ltd filed Critical Great Wall Motor Co Ltd
Priority to CN202311691474.0A
Publication of CN117590998A
Legal status: Pending


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0489 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using dedicated keyboard keys or combinations thereof
    • G06F3/04897 Special input arrangements or commands for improving display capability
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16 Sound input; Sound output
    • G06F3/167 Audio in a user interface, e.g. using voice commands for navigating, audio feedback

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a human-vehicle interaction method combining physical keys in a voice scene, and related equipment. When a voice instruction triggers a control interface and the central control display screen cannot display all of the interface's content, a physical key is switched from its current control mode to a page turning control mode and the triggering operation performed on the key is determined, so that the key is multiplexed through mode switching. The central control display screen is then controlled, according to the triggering operation, to turn pages when displaying the control interface, realizing multimodal interaction between voice and physical keys, letting the user reach the target quickly and efficiently, and avoiding invalid or delayed page turning control. If a selection confirmation instruction is received after page turning is finished, the physical key is switched from the page turning control mode back to the current control mode; restoring the original mode of the key completes its multiplexing and enables human-vehicle interaction in multiple modes.

Description

Human-vehicle interaction method combining physical keys in voice scene and related equipment
Technical Field
The present application relates to the technical field of human-vehicle interaction, and in particular to a human-vehicle interaction method combining physical keys in a voice scene and related equipment.
Background
With the development of intelligent automobiles, user demands keep growing, and more and more vehicles add physical keys to the steering wheel to meet personalized requirements. Because of their convenience, physical keys let a user assign a desired function so that it can be started quickly; in the related art, however, each physical key corresponds to a single function once it has been set. Also in the related art, page turning must be controlled by voice instructions such as 'previous page' or 'next page', yet the voice dialogue process is not intelligent or efficient enough, page turning control easily becomes invalid or delayed, and the vehicle interaction experience suffers.
Disclosure of Invention
In view of this, the present application aims to provide a human-vehicle interaction method combining physical keys in a voice scene, and related equipment.
Based on the above object, a first aspect of the present application provides a human-vehicle interaction method combined with physical keys in a voice scene, including:
receiving a voice instruction of a user;
in response to the voice instruction triggering a control interface while a central control display screen cannot display all display contents of the control interface, switching a physical key from a current control mode to a page turning control mode, and determining a triggering operation for the physical key;
controlling, according to the triggering operation, the central control display screen to turn pages when displaying the control interface;
and in response to receiving a selection confirmation instruction, switching the physical key from the page turning control mode back to the current control mode.
A second aspect of the present application provides a human-vehicle interaction device that combines physical keys in a voice scene, including:
an instruction receiving module configured to: receive a voice instruction of a user;
a mode switching module configured to: in response to the voice instruction triggering a control interface while a central control display screen cannot display all display contents of the control interface, switch a physical key from a current control mode to a page turning control mode, and determine a triggering operation for the physical key;
a page turning display module configured to: control, according to the triggering operation, the central control display screen to turn pages when displaying the control interface;
a mode recovery module configured to: in response to receiving a selection confirmation instruction, switch the physical key from the page turning control mode back to the current control mode.
A third aspect of the present application provides an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the method as provided in the first aspect of the present application when executing the program.
A fourth aspect of the present application provides a non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the method provided in the first aspect of the present application.
From the above it can be seen that, according to the human-vehicle interaction method combining physical keys in a voice scene and the related equipment provided by the application, after a voice instruction of the user is received, if the instruction triggers the control interface and the central control display screen cannot display all of the interface's content, the physical key is switched from the current control mode to the page turning control mode and the triggering operation on the key is determined, so that the key is multiplexed through mode switching and human-vehicle interaction in multiple modes is realized. The central control display screen is then controlled, according to the triggering operation, to turn pages when displaying the control interface, realizing multimodal interaction between voice and physical keys, letting the user reach the target quickly and efficiently, and avoiding invalid or delayed page turning control. If a selection confirmation instruction is received after page turning is finished, the physical key is switched from the page turning control mode back to the current control mode; restoring the original mode of the key completes its multiplexing.
Drawings
In order to more clearly illustrate the technical solutions of the present application or of the related art, the drawings needed in the description of the embodiments or of the related art are briefly introduced below. It is apparent that the drawings described below show only embodiments of the present application, and that a person of ordinary skill in the art can obtain other drawings from them without inventive effort.
FIG. 1 is a schematic diagram of a control interface corresponding to a voice dialogue scene in an embodiment of the present application;
FIG. 2 is a flowchart of a method for human-vehicle interaction with physical keys in a voice scenario according to an embodiment of the present application;
FIG. 3 is a flowchart of a control interface page turning display according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a position distribution of physical keys according to an embodiment of the present application;
FIG. 5 is a flowchart of a page turning display when only one physical key exists in the embodiment of the present application;
FIG. 6 is a flowchart of controlling a central control display screen to display a page according to a default page turning direction according to an embodiment of the present application;
FIG. 7 is a flowchart of continuous page turning when the operation type is the second type of operation in the embodiment of the present application;
FIG. 8 is a flowchart of a page turning display when at least two physical keys exist in an embodiment of the present application;
FIG. 9 is a flowchart of a method for human-vehicle interaction with physical keys in a voice scenario according to another embodiment of the present application;
FIG. 10 is a schematic structural diagram of a human-vehicle interaction device with physical keys in a voice scenario according to an embodiment of the present application;
FIG. 11 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail below with reference to the accompanying drawings.
It should be noted that, unless otherwise defined, technical or scientific terms used in the embodiments of the present application have the ordinary meaning understood by a person of ordinary skill in the art to which the present application belongs. Terms such as "first" and "second" used in the embodiments of the present application do not denote any order, quantity, or importance, but are only used to distinguish one element from another. Words such as "comprising" or "comprises" mean that the element or item preceding the word covers the elements or items listed after the word and their equivalents, without excluding other elements or items. Terms such as "connected" are not limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. "Upper", "lower", "left", "right", and the like are used only to indicate relative positional relationships, which may change accordingly when the absolute position of the described object changes.
In this document, it should be understood that any number of elements in the drawings is for illustration and not limitation, and that any naming is used only for distinction and not for any limitation.
Based on the above description of the background art, there are also the following cases in the related art:
steering wheels are the most frequent vehicle components that the driver contacts during driving. In order to provide convenience to the driver, a plurality of steering wheel keys are generally arranged on the steering wheel, so that the driver can realize a plurality of functions, such as call receiving, music playing switching and the like, by pressing the steering wheel keys during driving. However, the related steering wheel keys are generally provided with only one function, and the function cannot be changed at will. This creates a significant problem: the driver cannot customize different keys to realize different functions according to own requirements. Even if the function can be changed, the driver is required to manually change, and the function cannot be switched according to the scene, so that the function cannot be switched when the scene with higher timeliness requirement is faced, and the method is not efficient enough.
In the voice dialogue scene shown in fig. 1, the in-vehicle voice system often encounters interfaces whose content exceeds the central control display screen during a dialogue, so that the screen cannot display all of the interface's content. For example, as shown in fig. 1, suppose a selection control interface exceeds the current screen: if its content includes 14 options while the central control display screen can show only 4 options at a time, the selection control interface must be displayed over 4 pages, with the first 3 pages holding 4 options each and the last page holding 2. The central control display screen then displays the first page of the selection control interface, which contains options 1, 2, 3, and 4 together with a 'previous page' control and a 'next page' control for page turning. When none of the 4 options on the first page matches the user's need, the user can either click the 'next page' control to turn the page backward or instruct 'next page' by voice to turn the page backward.
However, clicking the 'next page' control to make the central control display screen turn pages requires the driver to take a hand off the steering wheel and touch the screen, which affects the driver's control of the vehicle; and when page turning is controlled by voice instructions such as 'previous page' or 'next page', the voice dialogue process is not intelligent or efficient enough, so page turning control easily becomes invalid or delayed.
According to the human-vehicle interaction method combining physical keys in a voice scene and the related equipment provided by the application, after a voice instruction of the user is received, if the instruction triggers a control interface and the central control display screen cannot display all of the interface's content, the physical key is switched from the current control mode to the page turning control mode and the triggering operation on the key is determined, so that the key is multiplexed through mode switching. The central control display screen is then controlled, according to the triggering operation, to turn pages when displaying the control interface, realizing multimodal interaction between voice and physical keys, letting the user reach the target quickly and efficiently, and avoiding invalid or delayed page turning control. If a selection confirmation instruction is received after page turning is finished, the physical key is switched from the page turning control mode back to the current control mode; restoring the original mode of the key completes its multiplexing and enables human-vehicle interaction in multiple modes.
A human-vehicle interaction method combined with physical keys in a voice scene according to an exemplary embodiment of the present application is described below with reference to the accompanying drawings.
In some embodiments, as shown in fig. 2, a human-vehicle interaction method combined with physical keys in a voice scene includes:
step 201: and receiving a voice instruction of a user.
In a specific implementation, if the user issues instructions by touching the central control display screen or in a similar way, the user clearly has enough spare attention to issue instructions by touch without affecting control of the steering wheel. In that case the user usually turns pages through the display controls on the central control display screen, but for the user's convenience the mode of the physical key may still be switched, so that page turning on the screen can be controlled through the physical key, through voice interaction, or through the display controls. Since the display controls are the most convenient option in this situation, it may also be chosen not to switch the mode of the physical key. Optionally, the physical key may be a custom key arranged on the steering wheel, or a key arranged on the dashboard.
If a voice instruction is received, however, it indicates that the user cannot currently turn pages on the central control display screen through the display controls. To avoid invalid or delayed page turning control, the mode of the physical key therefore needs to be switched, which adds another means of page turning control for the user and improves control efficiency.
Step 202: and responding to the voice command to trigger the control interface, wherein the central control display screen cannot display all display contents of the control interface, switching the physical key from the current control mode to the page turning control mode, and determining triggering operation for the physical key.
When the method is implemented, after a voice control instruction is received, firstly, whether the voice instruction triggers a control interface needs to be determined, if the voice instruction does not trigger the control interface, page turning control on a central control display screen is not needed, and physical key is not needed to perform mode switching naturally. If the voice command is "opening the window", the corresponding window control device is directly controlled to shake the window, so that the window is opened, and further control is not needed through the central control display screen in the whole process, so that the voice command for opening the window does not trigger the control interface, page turning display control is not needed for the central control display screen, and mode switching of physical keys is not needed.
If the voice command triggers the control interface, whether the central control display screen can display all the display contents of the control interface is further determined, if the central control display screen can display all the display contents of the control interface, page turning control on the central control display screen is not needed, and a user directly performs next selection control according to the display contents of the central control display screen, so that mode switching of physical keys is not needed. For example, if the voice command is "turn on navigation", the control interface of whether to turn on the positioning authorization is triggered, and the content displayed on the control interface is "whether to turn on the positioning authorization? "text prompt, and is located at what is a certain authorization to be opened? The parallel display control under the characters is yes display control and no display control, the display content of the control interface is less, and at the moment, the user can realize positioning authorization control directly through the voice command of yes.
However, for some control interfaces with more options or more display contents, the whole content of the control interface cannot be displayed only by single page display of the central control display screen, and at this time, the display content required by the user needs to be displayed by page turning display of the central control display screen. At this time, it is necessary to switch modes of the physical keys, and further, means for performing page turning control by the user is added. When the mode is switched, the current control mode of the physical key is firstly determined according to the bound function of the physical key at the current moment. For example, if the function currently bound by the physical key is automatic parking, the current control mode of the self-defined key is an automatic parking start-stop control mode, and a single click of the physical key enters an automatic parking mode, and in the automatic parking mode, the single click of the physical key exits the automatic parking mode.
When the voice command triggers the control interface and the central control display screen cannot display all the display contents of the control interface, the physical keys are required to be switched from the current control mode to the page turning control mode, namely, at the moment, the automatic parking function and the physical keys are temporarily unbinding, the physical keys are bound with the page turning function, the automatic parking start-stop control mode of the physical keys is switched to the page turning control mode, and the switching from the current control mode to the page turning control mode is realized. For example, if the voice command is "open music list", since the music list is composed of information such as song names of a plurality of songs, and the display content of the music list control interface includes a plurality of options, it is generally impossible to perform single-page display of all display contents through the central control display screen, and the switching condition of the mode for performing the physical key is met, it is necessary to switch the physical key from the current control mode to the page-turning control mode, and realize page-turning control according to the triggering operation of the user on the physical key.
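Purely as an illustration of the mode switching just described, and not as the actual vehicle implementation, the decision can be sketched as follows; all class, attribute, and function names here are assumptions introduced for the example:

    # Illustrative sketch only; names and structure are assumptions.
    class PhysicalKey:
        def __init__(self, bound_function: str):
            self.bound_function = bound_function   # e.g. "automatic_parking"
            self.saved_function = None             # remembered so the mode can be restored later

        def enter_page_turning_mode(self) -> None:
            # Temporarily unbind the current function and bind the page turning function.
            self.saved_function = self.bound_function
            self.bound_function = "page_turning"

        def restore_previous_mode(self) -> None:
            # Rebind the function that was active before the page turning mode.
            if self.saved_function is not None:
                self.bound_function = self.saved_function
                self.saved_function = None

    def handle_voice_instruction(triggers_interface: bool,
                                 content_fits_on_one_page: bool,
                                 key: PhysicalKey) -> None:
        # The key is switched only when an interface is triggered AND the central
        # control display screen cannot show all of its content at once.
        if triggers_interface and not content_fits_on_one_page:
            key.enter_page_turning_mode()

In this sketch the key simply remembers its previous binding, which is what later allows the restoration described in step 204.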
Step 203: and controlling the central control display screen to turn pages for displaying the control interface according to the triggering operation.
In specific implementation, the number of keys can be divided into two cases, wherein the first case is a case where only one physical key exists, and the second case is a case where at least two physical keys exist.
In the first case there is only one physical key, yet page turning actions such as turning backward one page, turning forward one page, turning backward continuously, and turning forward continuously all need to be supported, so a different triggering operation must be defined for each action. Illustratively, after the mode switch is complete, a single press of the physical key turns one page backward. The page turning direction is switched by pressing the key briefly twice in quick succession: the default direction after the mode switch is backward page turning, and two quick short presses switch it to forward page turning. After the direction has been switched, a single press of the key continues to turn one page at a time in the new direction, displaying the page adjacent to the current page in that direction, and a long press of the key turns pages continuously in that direction.
In the second case, since there are at least two physical keys, two of them are selected for page turning control: one key for backward page turning and the other for forward page turning. The page turning manner of each key is the same as with a single key, i.e. a short press of the backward key turns one page backward and a long press of it turns pages backward continuously, while a short press of the forward key turns one page forward and a long press of it turns pages forward continuously. Compared with the first case, the triggering operation for switching the page turning direction is no longer needed.
Short presses and long presses are distinguished by a preset time threshold: if the physical key is held for no longer than the threshold, the triggering operation is a short press; if it is held longer, the operation is a long press, and the continuous page turning corresponding to a long press proceeds at a certain rate, for example one page per second. Whether two consecutive short presses mean two single page turns or a switch of the page turning direction is distinguished by a preset interval threshold: if the interval between the two short presses is no longer than the interval threshold, the triggering operation is a switch of the page turning direction; if it is longer, the operation is two single page turns in the current direction.
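The two thresholds can be illustrated with a minimal sketch; the threshold values below are assumptions chosen only for the example and are not prescribed by the application:

    # Illustrative threshold values; the actual values are a design choice.
    PRESS_TIME_THRESHOLD_S = 0.5       # at most this long: short press; longer: long press
    PRESS_INTERVAL_THRESHOLD_S = 0.3   # two short presses within this interval switch the direction

    def classify_press(hold_duration_s: float) -> str:
        # Distinguish a short press from a long press by the preset time threshold.
        return "short" if hold_duration_s <= PRESS_TIME_THRESHOLD_S else "long"

    def classify_two_short_presses(interval_s: float) -> str:
        # Two short presses close together switch the page turning direction;
        # otherwise they count as two separate single page turns.
        if interval_s <= PRESS_INTERVAL_THRESHOLD_S:
            return "switch_direction"
        return "two_single_page_turns"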
The central control display screen is thus controlled, according to the number of keys and the triggering operation on the physical key, to turn pages when displaying the control interface. This realizes multimodal interaction between voice and physical keys, lets the user reach the target quickly and efficiently, and avoids invalid or delayed page turning control.
Step 204: In response to receiving a selection confirmation instruction, switch the physical key from the page turning control mode back to the current control mode.
In a specific implementation, take the scene shown in fig. 1 as an example. After the control interface is triggered, if the option the user needs is option 10, the user can turn two pages by pressing the physical key twice; the central control display screen then shows options 9 to 12. Seeing option 10 on the screen, the user can confirm the selection with the voice instruction '10' or with the text corresponding to option 10, which yields a selection confirmation instruction for option 10. This indicates that the user has finished selecting and needs no further page turning, so the physical key is switched from the page turning control mode back to the current control mode, which avoids interfering with the user's subsequent use of the key and completes the multiplexing of the physical key by restoring its original mode.
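Continuing the illustrative PhysicalKey sketch above, again with assumed names, the confirmation step simply restores the previous binding:

    def on_selection_confirmed(key: PhysicalKey) -> None:
        # Leaving the page turning mode rebinds the key to its original function,
        # e.g. automatic parking, so later presses behave as the user expects.
        key.restore_previous_mode()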
In summary, according to the human-vehicle interaction method combining physical keys in a voice scene provided by this embodiment of the application, when the voice instruction triggers the control interface and the central control display screen cannot display all of its content, the physical key is switched from the current control mode to the page turning control mode and the triggering operation on the key is determined, so that the key is multiplexed through mode switching. The central control display screen is then controlled, according to the triggering operation, to turn pages when displaying the control interface, realizing multimodal interaction between voice and physical keys, letting the user reach the target quickly and efficiently, and avoiding invalid or delayed page turning control. If a selection confirmation instruction is received after page turning is finished, the physical key is switched back from the page turning control mode to the current control mode, and restoring the original mode of the key completes its multiplexing.
In some embodiments, as shown in fig. 3, the step of controlling the central control display screen to display the control interface by turning pages according to the triggering operation includes:
step 301: the number of keys of the physical keys is determined.
In a specific implementation, after the mode switching of the physical key is completed, the number of physical keys needs to be further determined, because the triggering operations used for page turning control differ with the number of keys. If only one physical key exists, different page turning actions are executed according to different triggering operations on that key. If at least two physical keys exist, different page turning actions are executed according to triggering operations on different keys. In both cases, the mode switching of the physical keys serves to multiplex their function.
Step 302: and controlling the central control display screen to turn pages for displaying aiming at the triggering operation of the physical keys in response to the fact that the number of the keys is one.
In the specific implementation, for the case that only one physical key exists, the page turning actions such as page turning backward, page turning forward, page turning backward continuously, page turning forward continuously, page turning direction switching and the like need to be realized because only one physical key exists, and different trigger operations need to be determined for different page turning actions. For example, if the operation of switching the page turning direction is not performed, since the default page turning direction is page turning backward, the page turning action of page turning backward is realized by clicking (i.e. pressing one short time) a physical key; the continuous page-backward turning action is realized by pressing the physical key for a long time. If the operation of switching the page turning direction is performed, the page turning direction is switched to forward page turning, and the page turning action of the backward previous page is realized by clicking (i.e. short pressing once) a physical key; the continuous forward page turning action is realized by long pressing of the physical key.
Step 303: and determining a first physical key for page turning and a second physical key for page turning backward in response to the number of keys being at least two, and performing page turning display for a first triggering operation of the first physical key and a second triggering operation of the first physical key.
In particular, in the case that there are at least two physical keys, two physical keys (preferably two physical keys in the same column or the same row are selected for facilitating the user to perform page turning control) need to be selected from the at least two physical keys, and a first physical key for page turning and a second physical key for page turning backward are determined in the two physical keys. For example, if the first physical key and the second physical key are located in the same row, the physical key located on the left side is selected as the first physical key, and the physical key located on the right side is selected as the second physical key. If the first physical key and the second physical key are positioned in the same column, selecting the physical key positioned at the upper side as the first physical key and selecting the physical key positioned at the lower side as the second physical key.
For example, as shown in fig. 4, taking a physical key as a custom key disposed on a steering wheel as an example, if the distribution of two custom keys has no explicit positional relationship, the user is led to make the page turning direction corresponding to the two custom keys clear in the central control display screen.
The page turning mode is the same as the case of only one custom key, namely, the first custom key is pressed for realizing backward page turning, and the first custom key is pressed for long time for realizing backward continuous page turning. And the second user-defined key is pressed for a short time to realize forward page turning, and the second user-defined key is pressed for a long time to realize forward continuous page turning. Therefore, compared with the condition that only one self-defined key is provided, the triggering operation for determining the page turning direction is omitted, and the page turning control is simpler.
In some embodiments, as shown in fig. 5, in response to the number of keys being one, controlling the central control display screen to turn pages according to the triggering operation on the physical key includes:
Step 501: In response to the triggering operation containing no direction control operation, control the central control display screen to turn pages in the default page turning direction according to the page turning control operation in the triggering operation.
In a specific implementation, if the user has not switched the page turning direction with two quick short presses, the triggering operation contains no direction control operation, and the central control display screen keeps turning pages in the default direction according to the page turning control operation in the triggering operation. For example, since the default direction is backward page turning, a single press (one short press) of the physical key makes the screen turn one page backward, and a long press makes it turn pages backward continuously.
Step 502: In response to the triggering operation containing a direction control operation, replace the default page turning direction with the opposite target page turning direction, and control the central control display screen to turn pages in the target direction according to the page turning control operation in the triggering operation.
In a specific implementation, if the user has switched the page turning direction with two quick short presses, the default direction is replaced by the opposite target direction, and the central control display screen turns pages in the target direction according to the page turning control operation in the triggering operation. For example, since the default direction is backward page turning, it is replaced by forward page turning: a single press (one short press) of the physical key then makes the screen turn one page forward, and a long press makes it turn pages forward continuously.
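A minimal sketch of the single-key behaviour of steps 501 and 502 is given below; the class and attribute names are assumptions, and 'backward' here means turning to the next page, as in the description above:

    class SingleKeyPager:
        # Illustrative sketch of single-key page turning with a default backward direction.
        def __init__(self, total_pages: int):
            self.total_pages = total_pages
            self.current_page = 1
            self.direction = "backward"   # default: turn to the next page

        def toggle_direction(self) -> None:
            # Two quick short presses flip between backward and forward page turning.
            self.direction = "forward" if self.direction == "backward" else "backward"

        def turn_once(self) -> None:
            # One short press turns a single page in the current direction,
            # clamped to the valid page range of the control interface.
            step = 1 if self.direction == "backward" else -1
            self.current_page = min(max(self.current_page + step, 1), self.total_pages)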
In some embodiments, the default direction is backward page turning. As shown in fig. 6, controlling the central control display screen to turn pages in the default page turning direction according to the page turning control operation in the triggering operation includes:
Step 601: Determine the operation type of the page turning control operation and the current page of the control interface.
In a specific implementation, the operation types of the page turning control operation include short press, long press, and consecutive short presses. Short presses and long presses are distinguished by a preset time threshold: if the key is held for no longer than the threshold, the operation is a short press; if it is held longer, the operation is a long press, and the continuous page turning corresponding to a long press proceeds at a certain rate, for example one page per second. Two consecutive single page turns are distinguished from a switch of the page turning direction by a preset interval threshold: if the interval between two short presses is no longer than the interval threshold, the triggering operation is a switch of the page turning direction; if it is longer, the operation is two single page turns in the current direction. Typically, the current page is the first page of the control interface.
Step 602: and responding to the operation type as the first type of operation, and controlling the central control display screen to display the next page of the current page.
In the implementation, the first type of operation is a short-press operation, namely, an operation of page backward turning, so that when the operation type is the first type of operation, the central control display screen is controlled to display a page next to the current page.
Step 603: and responding to the operation type as the second type of operation, and controlling the central control display screen to continuously turn pages according to the default page turning direction.
In the implementation, the second type of operation is a long-press operation, i.e. a backward continuous page turning operation, so when the operation type is the second type of operation, the central control display screen is controlled to perform continuous page turning display according to the default page turning direction, for example, page turning display of backward page turning is performed once per second.
In some embodiments, as shown in fig. 7, in response to the operation type being the second type of operation, controlling the central control display screen to turn pages continuously in the default page turning direction includes:
Step 701: In response to the operation type being the second type of operation, determine the time interval of continuous page turning and the duration of the second type of operation.
In a specific implementation, after the operation type is determined to be the second type, the time interval of continuous page turning and the duration of the operation are determined. The time interval means that one page is turned per interval; if the interval is 1 second, a page is turned every second. The duration is how long the physical key is held during the long press; if it is 3.5 seconds, 3 pages have been turned backward, and the 4th page turn happens at 4 seconds.
Step 702: Determine the number of pages to turn from the duration and the time interval.
In a specific implementation, the number of pages turned is the ratio of the duration to the time interval, rounded down: when the duration is 3.5 seconds and the time interval is 1 second, the ratio is 3.5, which rounds down to 3.
Step 703: Control the central control display screen to turn pages continuously in the default page turning direction.
In a specific implementation, when the number of pages is 3, the screen turns 3 pages backward; when the duration reaches 4 seconds, the 4th page turn is performed.
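The page count used in steps 701 to 703 is simply the duration divided by the interval, rounded down; the small sketch below reproduces the 3.5-second example from the description:

    import math

    def pages_turned(hold_duration_s: float, interval_s: float = 1.0) -> int:
        # Number of pages turned during a long press: duration / interval, rounded down.
        return math.floor(hold_duration_s / interval_s)

    assert pages_turned(3.5, 1.0) == 3   # the 4th page turn happens at 4 seconds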
In some embodiments, as shown in fig. 8, turning pages on the display according to the first triggering operation on the first physical key and the second triggering operation on the second physical key includes:
step 801: and responding to the first triggering operation as a first type operation, and controlling the central control display screen to display a page next to the current page.
In the implementation, if the first triggering operation is a first type operation, the user is indicated to press the first determination key for a short time, page turning is performed backwards, and the central control display screen is controlled to display a next page of the current page.
Step 802: and responding to the first triggering operation as a second type operation, and controlling the central control display screen to continuously turn pages backwards for display.
When the method is implemented, if the first triggering operation is the second type operation, the fact that the user presses the first defining key for a long time is indicated, the page turning number is determined according to the time interval of continuous page turning and the duration time of the second type operation, and the central control display screen is controlled to continuously and backwards turn pages according to the page turning number and the time interval to display.
Step 803: and responding to the second triggering operation as the first type operation, and controlling the central control display screen to display the previous page of the current page.
In the implementation, if the second triggering operation is the first type operation, the user is indicated to press the second definition key for a short time, forward page turning is performed, and the central control display screen is controlled to display the previous page of the current page.
Step 804: and responding to the second trigger operation as a second type operation, and controlling the central control display screen to continuously forward page turning display.
In the implementation, if the second trigger operation is the second type operation, it is indicated that the user presses the second definition key for a long time, the page turning number is determined according to the time interval of continuous page turning and the duration of the second type operation, and the central control display screen is controlled to continuously forward page turning display according to the page turning number and the time interval.
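Using the illustrative SingleKeyPager and pages_turned sketches above (the names are assumptions), the two-key dispatch of steps 801 to 804 could be expressed as:

    def dispatch_two_key_press(key_id: str, press_type: str,
                               hold_duration_s: float, pager: SingleKeyPager) -> None:
        # The first physical key pages backward (next page), the second pages forward
        # (previous page); a short press turns one page, a long press turns one page
        # per time interval for as long as the key is held.
        pager.direction = "backward" if key_id == "first" else "forward"
        if press_type == "short":
            pager.turn_once()
        else:  # long press
            for _ in range(pages_turned(hold_duration_s)):
                pager.turn_once()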
In some embodiments, as shown in fig. 9, the human-vehicle interaction method combined with physical keys in the voice scene further includes:
Step 901: and responding to the voice command to trigger the control interface, and controlling the central control display screen to display the first page of the control interface and display a first control for backward page turning and a second control for forward page turning, wherein the central control display screen cannot display all display contents of the control interface.
In specific implementation, as shown in fig. 1, when the voice command triggers the control interface and the central control display screen cannot display all the display contents of the control interface, the central control display screen is controlled by default to display a first page of the control interface, and display a first control for backward page turning and a second control for forward page turning, and the manner of performing page turning control by the user is increased, for example, the first display control is "next page" and the second display control is "last page".
Step 902: and in response to the fact that the direction control operation is not detected, controlling the central control display screen to highlight the first control.
In the specific implementation, for the case that only one physical key is provided, if the direction control operation is not detected, the central control display screen is controlled to highlight the first control, and the user is prompted to turn the page backwards in the page turning direction at the moment, namely, the page can be turned backwards when the physical key is pressed for a short time.
Step 903: and in response to detecting the direction control operation, controlling the central control display screen to highlight the second control.
In the specific implementation, for the case that only one physical key is provided, if the direction control operation is detected, the central control display screen is controlled to highlight the second control, so that the user is prompted that the page turning direction is switched at the moment, namely, the current page turning direction is forward page turning, namely, the physical key is pressed for a short time to forward page turning.
The current page turning direction is prompted by highlighting, so that the page turning control error caused by forgetting the page turning direction by the user is avoided.
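A sketch of the highlighting in steps 902 and 903 is shown below; the display object and its highlight method are assumptions standing in for whatever interface the central control display screen actually exposes:

    def update_direction_highlight(display, direction_switched: bool) -> None:
        # Highlight the control that matches the current page turning direction
        # so the user does not lose track of it.
        if direction_switched:
            display.highlight("previous_page_control")   # forward page turning is active
        else:
            display.highlight("next_page_control")       # default backward page turning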
It should be noted that the method of the embodiments of the present application may be performed by a single device, such as a computer or a server. The method of the embodiments may also be applied in a distributed scenario and completed by a plurality of devices cooperating with one another. In such a distributed scenario, one of the devices may perform only one or more steps of the method of the embodiments of the present application, and the devices interact with each other to complete the method.
It should be noted that some embodiments of the present application are described above. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments described above and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
Based on the same inventive concept, the application also provides a human-vehicle interaction device combined with physical keys in a voice scene, which corresponds to the method of any embodiment.
Referring to fig. 10, the human-vehicle interaction device combined with physical keys in the voice scene includes:
an instruction receiving module 10 configured to: receive a voice instruction of a user;
a mode switching module 20 configured to: in response to the voice instruction triggering the control interface while the central control display screen cannot display all display contents of the control interface, switch the physical key from the current control mode to the page turning control mode, and determine the triggering operation for the physical key;
a page turning display module 30 configured to: control, according to the triggering operation, the central control display screen to turn pages when displaying the control interface;
a mode recovery module 40 configured to: in response to receiving a selection confirmation instruction, switch the physical key from the page turning control mode back to the current control mode.
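As a structural illustration only, the four modules can be pictured as a simple composition; the wiring below is an assumption and not the claimed device:

    class HumanVehicleInteractionDevice:
        # Illustrative composition of the four modules described above.
        def __init__(self, instruction_receiving_module, mode_switching_module,
                     page_turning_display_module, mode_recovery_module):
            self.instruction_receiving_module = instruction_receiving_module  # receives voice instructions
            self.mode_switching_module = mode_switching_module                # current mode to page turning mode
            self.page_turning_display_module = page_turning_display_module    # drives the central control display screen
            self.mode_recovery_module = mode_recovery_module                  # page turning mode back to current mode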
For convenience of description, the above devices are described as being functionally divided into various modules, respectively. Of course, the functions of each module may be implemented in the same piece or pieces of software and/or hardware when implementing the present application.
The device of the above embodiment is used to implement the human-vehicle interaction method combined with physical keys in a voice scene of any of the foregoing embodiments, and has the beneficial effects of the corresponding method embodiment, which are not repeated here.
Based on the same inventive concept, the application also provides an electronic device corresponding to the method of any embodiment, which comprises a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the processor realizes the human-vehicle interaction method combined with the physical key in the voice scene of any embodiment when executing the program.
Fig. 11 is a schematic diagram showing a hardware structure of a more specific electronic device according to the present embodiment, where the device may include: a processor 1010, a memory 1020, an input/output interface 1030, a communication interface 1040, and a bus 1050. Wherein processor 1010, memory 1020, input/output interface 1030, and communication interface 1040 implement communication connections therebetween within the device via a bus 1050.
The processor 1010 may be implemented by a general-purpose CPU (Central Processing Unit), a microprocessor, an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), or one or more integrated circuits, and is configured to execute relevant programs to implement the technical solutions provided in the embodiments of the present application.
The memory 1020 may be implemented in the form of ROM (Read Only Memory), RAM (Random Access Memory), a static storage device, a dynamic storage device, or the like. The memory 1020 may store an operating system and other application programs; when the embodiments of the present application are implemented in software or firmware, the associated program code is stored in the memory 1020 and executed by the processor 1010.
The input/output interface 1030 is used to connect with an input/output module for inputting and outputting information. The input/output module may be configured as a component in a device (not shown) or may be external to the device to provide corresponding functionality. Wherein the input devices may include a keyboard, mouse, touch screen, microphone, various types of sensors, etc., and the output devices may include a display, speaker, vibrator, indicator lights, etc.
Communication interface 1040 is used to connect communication modules (not shown) to enable communication interactions of the present device with other devices. The communication module may implement communication through a wired manner (such as USB, network cable, etc.), or may implement communication through a wireless manner (such as mobile network, WIFI, bluetooth, etc.).
Bus 1050 includes a path for transferring information between components of the device (e.g., processor 1010, memory 1020, input/output interface 1030, and communication interface 1040).
It should be noted that although the above-described device only shows processor 1010, memory 1020, input/output interface 1030, communication interface 1040, and bus 1050, in an implementation, the device may include other components necessary to achieve proper operation. Furthermore, it will be understood by those skilled in the art that the above-described apparatus may include only the components necessary to implement the embodiments of the present description, and not all the components shown in the drawings.
The electronic device of the foregoing embodiment is used to implement the human-vehicle interaction method combined with physical keys in a voice scene of any of the foregoing embodiments, and has the beneficial effects of the corresponding method embodiment, which are not repeated here.
Based on the same inventive concept, corresponding to the method of any embodiment, the application further provides a non-transitory computer readable storage medium, wherein the non-transitory computer readable storage medium stores computer instructions, and the computer instructions are used for enabling the computer to execute the human-vehicle interaction method combined with the physical key in the voice scene according to any embodiment.
The computer readable media of the present embodiments include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of storage media for a computer include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read only memory (ROM), electrically erasable programmable read only memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium, which can be used to store information that can be accessed by a computing device.
The computer instructions stored in the storage medium of the foregoing embodiments are used to cause the computer to execute the human-vehicle interaction method combined with physical keys in a voice scene of any of the foregoing embodiments, and have the beneficial effects of the corresponding method embodiments, which are not repeated here.
It will be appreciated that, before the technical solutions of the various embodiments of the disclosure are used, the user may be informed in an appropriate manner of the type, scope of use, and usage scenarios of the personal information involved, and the user's authorization may be obtained.
For example, in response to receiving an active request from a user, prompt information is sent to the user to explicitly remind the user that the requested operation will require the user's personal information to be obtained and used. The user can then decide, based on the prompt information, whether to provide personal information to the software or hardware, such as an electronic device, application program, server, or storage medium, that performs the operations of the technical solution.
As an optional but non-limiting implementation, in response to receiving an active request from the user, the prompt information may be sent to the user through a popup window, for example, in which the prompt information may be presented as text. The popup window may also carry a selection control allowing the user to choose 'agree' or 'disagree' to providing the personal information to the electronic device.
It will be appreciated that the above-described notification and user authorization process is merely illustrative and does not limit the implementations of the present disclosure; other ways of satisfying relevant laws and regulations may also be applied to the implementations of the present disclosure.
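As an illustration only, the following minimal Python sketch models the pop-up consent flow described above: prompt the user, read an "agree"/"disagree" choice, and perform the requested operation only after consent. All names are hypothetical and are not part of the disclosed in-vehicle implementation.

```python
# Minimal sketch (assumed names): before any personal information is used,
# prompt the user and record consent.
from dataclasses import dataclass


@dataclass
class ConsentPrompt:
    """Models the pop-up window carrying prompt text and an agree/disagree control."""
    prompt_text: str

    def show(self, choose) -> bool:
        # 'choose' stands in for the UI callback returning the user's selection.
        print(self.prompt_text)
        return choose() == "agree"


def handle_active_request(perform_operation, choose):
    prompt = ConsentPrompt(
        "The requested operation needs to obtain and use your personal information. Agree?"
    )
    if prompt.show(choose):
        return perform_operation()
    return None  # user declined; personal information is neither obtained nor used


# Usage example: the user agrees, so the operation runs.
if __name__ == "__main__":
    result = handle_active_request(lambda: "operation done", lambda: "agree")
    print(result)
```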
Those of ordinary skill in the art will appreciate that the discussion of any of the above embodiments is merely exemplary and is not intended to suggest that the scope of the present application (including the claims) is limited to these examples. Within the idea of the present application, the technical features of the above embodiments or of different embodiments may also be combined, the steps may be implemented in any order, and there are many other variations of the different aspects of the embodiments of the present application as described above, which are not provided in detail for the sake of brevity.
Additionally, well-known power/ground connections to Integrated Circuit (IC) chips and other components may or may not be shown within the provided figures, in order to simplify the illustration and discussion, and so as not to obscure the embodiments of the present application. Furthermore, the devices may be shown in block diagram form in order to avoid obscuring the embodiments of the present application, and this also takes into account the fact that specifics with respect to implementation of such block diagram devices are highly dependent upon the platform on which the embodiments of the present application are to be implemented (i.e., such specifics should be well within purview of one skilled in the art). Where specific details (e.g., circuits) are set forth in order to describe example embodiments of the application, it should be apparent to one skilled in the art that embodiments of the application can be practiced without, or with variation of, these specific details. Accordingly, the description is to be regarded as illustrative in nature and not as restrictive.
While the present application has been described in conjunction with specific embodiments thereof, many alternatives, modifications, and variations of those embodiments will be apparent to those skilled in the art in light of the foregoing description. For example, other memory architectures (e.g., dynamic RAM (DRAM)) may use the embodiments discussed.
The present embodiments are intended to embrace all such alternatives, modifications, and variations that fall within the broad scope of the appended claims. Accordingly, any omissions, modifications, equivalents, improvements, and the like that are within the spirit and principles of the embodiments are intended to be included within the scope of protection of the present application.

Claims (10)

1. A human-vehicle interaction method combining physical keys in a voice scene, characterized by comprising the following steps:
receiving a voice instruction of a user;
in response to the voice instruction triggering a control interface and a central control display screen being unable to display all display contents of the control interface, switching a physical key from a current control mode to a page turning control mode, and determining a triggering operation for the physical key;
controlling the central control display screen to perform page turning display of the control interface according to the triggering operation;
and in response to receiving a selection confirmation instruction, switching the physical key from the page turning control mode to the current control mode.
2. The method according to claim 1, wherein the controlling the central control display screen to perform page turning display of the control interface according to the triggering operation comprises:
determining the number of keys of the physical keys;
in response to the number of keys being one, controlling the central control display screen to perform page turning display according to the triggering operation of the physical key;
and in response to the number of keys being at least two, determining a first physical key for backward page turning and a second physical key for forward page turning, and performing page turning display according to a first triggering operation of the first physical key and a second triggering operation of the second physical key.
3. The method according to claim 2, wherein the controlling the central control display screen to perform page turning display according to the triggering operation of the physical key in response to the number of keys being one comprises:
in response to the triggering operation not including a direction control operation, controlling the central control display screen to perform page turning display in a default page turning direction according to the page turning control operation in the triggering operation;
and in response to the triggering operation including a direction control operation, replacing the default page turning direction with a target page turning direction opposite to the default page turning direction, and controlling the central control display screen to perform page turning display in the target page turning direction according to the page turning control operation in the triggering operation.
4. The method according to claim 3, wherein the default page turning direction is backward page turning; and the controlling the central control display screen to perform page turning display in the default page turning direction according to the page turning control operation in the triggering operation comprises:
determining an operation type of the page turning control operation and a current page of the control interface;
in response to the operation type being a first type operation, controlling the central control display screen to display a page subsequent to the current page;
and in response to the operation type being a second type operation, controlling the central control display screen to perform continuous page turning display in the default page turning direction.
5. The method according to claim 4, wherein the controlling the central control display screen to perform continuous page turning display in the default page turning direction in response to the operation type being a second type operation comprises:
in response to the operation type being the second type operation, determining a time interval of continuous page turning and a duration of the second type operation;
determining a page turning number according to the duration and the time interval;
and controlling the central control display screen to continuously turn, from the current page, the page turning number of pages in the default page turning direction.
6. The method according to claim 2, wherein the performing page turning display according to the first triggering operation of the first physical key and the second triggering operation of the second physical key comprises:
in response to the first triggering operation being a first type operation, controlling the central control display screen to display a page subsequent to the current page;
in response to the first triggering operation being a second type operation, controlling the central control display screen to perform continuous backward page turning display;
in response to the second triggering operation being a first type operation, controlling the central control display screen to display a page previous to the current page;
and in response to the second triggering operation being a second type operation, controlling the central control display screen to perform continuous forward page turning display.
7. The method as recited in claim 1, further comprising:
in response to the voice instruction triggering the control interface and the central control display screen being unable to display all display contents of the control interface, controlling the central control display screen to display a first page of the control interface, and displaying a first control for backward page turning and a second control for forward page turning;
in response to no direction control operation being detected, controlling the central control display screen to highlight the first control;
and in response to a direction control operation being detected, controlling the central control display screen to highlight the second control.
8. A human-vehicle interaction apparatus combining physical keys in a voice scene, characterized by comprising:
an instruction receiving module configured to: receive a voice instruction of a user;
a mode switching module configured to: in response to the voice instruction triggering a control interface and a central control display screen being unable to display all display contents of the control interface, switch a physical key from a current control mode to a page turning control mode, and determine a triggering operation for the physical key;
a page turning display module configured to: control the central control display screen to perform page turning display of the control interface according to the triggering operation;
and a mode recovery module configured to: in response to receiving a selection confirmation instruction, switch the physical key from the page turning control mode to the current control mode.
9. An electronic device, comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the method according to any one of claims 1 to 7 when executing the program.
10. A non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the method of any one of claims 1 to 7.
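The page-turning behaviour recited in claims 1 to 7 can be summarised as a small state machine: the physical key enters the page turning control mode when the control interface overflows the central control display screen, a first type (short) operation turns one page, a second type (long) operation turns continuously with the page count derived from the press duration and a fixed time interval, a direction control operation flips the default backward direction, and a selection confirmation restores the key's original mode. The following Python sketch is illustrative only; all names, parameters, and the 0.5 s interval are invented for the example and are not part of the claimed apparatus.

```python
# Illustrative sketch of the claimed page-turning logic (hypothetical names,
# not the actual vehicle software). Directions follow claims 3-7: the default
# direction is backward (towards subsequent pages); a direction control
# operation flips it to forward (towards previous pages).

SHORT_PRESS = "first_type"   # single page turn
LONG_PRESS = "second_type"   # continuous page turning


class PageTurningController:
    def __init__(self, total_pages: int, continuous_interval_s: float = 0.5):
        self.total_pages = total_pages
        self.current_page = 0
        self.key_mode = "current"          # original function of the physical key
        self.interval = continuous_interval_s

    def on_voice_command(self, fits_on_screen: bool):
        # Only enter page turning mode when the central control display screen
        # cannot show all contents of the control interface.
        if not fits_on_screen:
            self.key_mode = "page_turning"

    def on_key(self, press_type: str, duration_s: float = 0.0,
               direction_control: bool = False):
        if self.key_mode != "page_turning":
            return
        step = -1 if direction_control else 1   # direction control flips the default
        if press_type == SHORT_PRESS:
            pages = 1
        else:
            # Continuous turning: number of pages = press duration / time interval.
            pages = max(1, int(duration_s // self.interval))
        self.current_page = min(max(self.current_page + step * pages, 0),
                                self.total_pages - 1)

    def on_confirm(self):
        # A selection confirmation restores the key's original control mode.
        self.key_mode = "current"


# Usage example: overflowing interface, one long press turns several pages backward.
if __name__ == "__main__":
    ctrl = PageTurningController(total_pages=10)
    ctrl.on_voice_command(fits_on_screen=False)
    ctrl.on_key(LONG_PRESS, duration_s=1.6)      # 1.6 s / 0.5 s -> 3 pages
    print(ctrl.current_page)                     # -> 3
    ctrl.on_confirm()
```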
CN202311691474.0A 2023-12-08 2023-12-08 Human-vehicle interaction method combining physical keys in voice scene and related equipment Pending CN117590998A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311691474.0A CN117590998A (en) 2023-12-08 2023-12-08 Human-vehicle interaction method combining physical keys in voice scene and related equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311691474.0A CN117590998A (en) 2023-12-08 2023-12-08 Human-vehicle interaction method combining physical keys in voice scene and related equipment

Publications (1)

Publication Number Publication Date
CN117590998A 2024-02-23

Family

ID=89915132

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311691474.0A Pending CN117590998A (en) 2023-12-08 2023-12-08 Human-vehicle interaction method combining physical keys in voice scene and related equipment

Country Status (1)

Country Link
CN (1) CN117590998A (en)

Similar Documents

Publication Publication Date Title
US10029723B2 (en) Input system disposed in steering wheel and vehicle including the same
CN109933388B (en) Vehicle-mounted terminal equipment and display processing method of application components thereof
US9787812B2 (en) Privacy management
US20190034048A1 (en) Unifying user-interface for multi-source media
US20140168130A1 (en) User interface device and information processing method
CN110096330B (en) Interface switching method and device and electronic equipment
CN110780783B (en) Interface element moving method, system, vehicle and storage medium
CN104572322A (en) Method for operating terminal screen
CN111497611A (en) Vehicle interaction method and device
KR20160099721A (en) Method and device for displaying information and for operating an electronic device
CN112445393A (en) Data processing method, device, equipment and machine readable medium
CN111913769A (en) Application display method, device and equipment
CN112644617B (en) Navigation prompting method and device for riding vehicle and storage medium
CN102141894B (en) User interface displaying method and device
CN112677973A (en) Vehicle driving state control method and device and electronic equipment
CN117590998A (en) Human-vehicle interaction method combining physical keys in voice scene and related equipment
CN112612388B (en) Multimedia control method, equipment and storage medium for riding vehicle
US20160286036A1 (en) Method for quick access to application functionalities
KR20120018636A (en) Apparatus and method for processing input data in avn system
RU2710309C2 (en) Method and apparatus for processing split screen and vehicle
CN111098796A (en) Method for controlling terminal based on steering wheel, vehicle machine and vehicle
CN116176432B (en) Vehicle-mounted device control method and device, vehicle and storage medium
Sandnes et al. An eyes-free in-car user interface interaction style based on visual and textual mnemonics, chording and speech
CN110618750A (en) Data processing method, device and machine readable medium
CN115268759B (en) Multiplexing control method and device for steering wheel keys, vehicle and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination