CN113495621A - Interactive mode switching method and device, electronic equipment and storage medium - Google Patents
- Publication number
- CN113495621A (application number CN202010261062.3A)
- Authority
- CN
- China
- Prior art keywords
- interaction
- screen
- area
- touch
- switching
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
- G06F3/0485—Scrolling or panning
- G10L15/22—Procedures used during a speech recognition process, e.g. man-machine dialogue
- G10L2015/223—Execution procedure of a spoken command
- G10L2015/225—Feedback of the input speech
Abstract
The application discloses an interaction mode switching method and apparatus, an electronic device, and a storage medium, relating to the technical field of intelligent interaction. The switching method is implemented as follows: when a voice interaction device with a screen is in a touch interaction mode, the device switches to multiple interaction modes according to a first switching instruction, the multiple interaction modes being a state that supports both the display of voice interaction information and touch interaction. In the multiple interaction modes, a first screen area of the device displays voice interaction information while a second screen area responds to touch instructions; in the touch interaction mode, both the first screen area and the second screen area respond to touch instructions. With this scheme, a multiple-interaction-mode mechanism satisfies the user's interaction requirements and improves the interaction experience.
Description
Technical Field
The application relates to the technical field of computers, in particular to the technical field of intelligent interaction.
Background
Existing intelligent voice interaction devices with screens cannot perform other operations while carrying out voice interaction with a user: on the voice interaction interface, only voice interaction is possible, and other forms of operation, such as the user's touches, are not supported. In other words, existing devices support only a single interaction mode at any given time.
Disclosure of Invention
Embodiments of the present application provide a method and an apparatus for switching an interaction mode, an electronic device, and a storage medium, so as to solve one or more technical problems in the prior art.
In a first aspect, the present application provides a method for switching an interactive mode, including:
when the voice interaction device with a screen is in a touch interaction mode, switching to multiple interaction modes according to a first switching instruction, the multiple interaction modes being a state that supports both the display of voice interaction information and touch interaction;
under the condition of multiple interaction modes, a first screen area of the voice interaction equipment with the screen is an area for displaying voice interaction information, and a second screen area is an area for responding to a touch instruction;
under the condition of a touch interaction mode, a first screen area and a second screen area of the voice interaction equipment with the screen are areas responding to touch instructions.
With this scheme, in the multiple interaction modes, the voice interaction device with a screen can display the user's voice information, or display feedback on the user's question or voice control instruction, while simultaneously executing the user's touch instructions. Because a multiple-interaction-mode mechanism is adopted, several of the user's interaction requirements can be satisfied at the same time, improving the interaction experience.
In a second aspect, the present application provides an apparatus for switching an interaction mode, including:
the first interaction mode switching module is used for switching to multiple interaction modes according to a first switching instruction when the voice interaction device with a screen is in a touch interaction mode, the multiple interaction modes being a state that supports both the display of voice interaction information and touch interaction;
the interactive mode display module is used for taking a first screen area of the voice interactive equipment with the screen as an area for displaying voice interactive information and taking a second screen area as an area for responding to a touch instruction under the condition of a plurality of interactive modes;
and under the condition of a touch interaction mode, taking a first screen area and a second screen area of the voice interaction equipment with the screen as areas responding to touch instructions.
In a third aspect, an embodiment of the present application provides an electronic device, including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to cause the at least one processor to perform a method provided by any one of the embodiments of the present application.
In a fourth aspect, embodiments of the present application provide a non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform a method provided by any one of the embodiments of the present application.
Other effects of the above-described alternative will be described below with reference to specific embodiments.
Drawings
The drawings are included to provide a better understanding of the present solution and are not intended to limit the present application. Wherein:
fig. 1 is a flowchart of a switching method of an interactive mode according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a plurality of interaction patterns according to an embodiment of the present application;
FIG. 3 is a schematic illustration of a plurality of interaction patterns according to another embodiment of the present application;
FIG. 4 is a schematic illustration of a plurality of interaction patterns according to another embodiment of the present application;
FIG. 5 is a schematic diagram of a touch interaction mode according to an embodiment of the present application;
FIG. 6 is a flow diagram of interaction mode switching according to an embodiment of the present application;
FIG. 7 is a flow diagram of interaction mode switching according to an embodiment of the present application;
FIG. 8 is a flow diagram of interaction mode switching according to another embodiment of the present application;
FIG. 9 is a schematic diagram of an apparatus for switching interaction modes according to an embodiment of the present application;
FIG. 10 is a schematic diagram of a first interaction mode switching module according to an embodiment of the present application;
FIG. 11 is a schematic diagram of a first region transformation submodule according to an embodiment of the present application;
FIG. 12 is a diagram of a second interaction mode switching module, according to an embodiment of the present application;
fig. 13 is a block diagram of an electronic device for implementing the method for switching the interaction mode according to the embodiment of the present application.
Detailed Description
The following description of exemplary embodiments of the present application, taken in conjunction with the accompanying drawings, includes various details of the embodiments to aid understanding; these details are to be considered exemplary only. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications may be made to the embodiments described herein without departing from the scope and spirit of the present application. Descriptions of well-known functions and constructions are omitted below for clarity and conciseness.
As shown in fig. 1, in an embodiment, a method for switching an interactive mode is provided, and the method can be applied to a voice interactive device with a screen, and includes the following steps:
s101: under the condition that the voice interaction equipment with the screen is in a touch interaction mode, switching to a plurality of interaction modes according to a first switching instruction; the multiple interaction modes are voice interaction information display supporting and touch interaction modes.
S102: under the condition of multiple interaction modes, a first screen area of the voice interaction equipment with the screen is an area for displaying voice interaction information, and a second screen area is an area for responding to a touch instruction;
under the condition of a touch interaction mode, a first screen area and a second screen area of the voice interaction equipment with the screen are areas responding to touch instructions.
In the embodiment of the application, the touch interaction mode includes receiving and responding to touch instructions. That is, the screen of the voice interaction device may receive a touch instruction such as a tap or a swipe, and responding to the instruction means executing the corresponding operation. For example, an application program may be opened or closed according to the user's tap instruction, and pages may be turned according to the user's swipe instruction.
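The mapping from touch instruction to operation described above can be sketched as follows. This is a minimal illustration, not an implementation from the patent; all function and field names are assumptions.

```python
def handle_touch(instruction):
    """Map a parsed touch instruction to a device operation string."""
    kind = instruction.get("kind")
    if kind == "tap":
        # A tap toggles the targeted application open or closed.
        action = "close" if instruction.get("open") else "open"
        return f"{action}:{instruction['target']}"
    if kind == "swipe":
        # A swipe turns the page in the swiped direction.
        return f"page:{instruction['direction']}"
    return "ignore"
```

For example, `handle_touch({"kind": "swipe", "direction": "left"})` would yield a page-turn operation.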
The multiple interaction modes may include receiving and executing touch instructions while displaying voice interaction information. In the multiple interaction modes, the display area of the voice interaction device with a screen may be divided into a first screen area and a second screen area.
The first screen area may serve as the display area for voice interaction information, for example displaying the user's speech "have you eaten today", or displaying feedback on the user's question or voice control instruction. For example, if the user asks "what's the weather like today", the query result, such as "the temperature in XX district of XX city today is XX ℃", may be shown in the voice interaction information display area (the first screen area). As another example, if the user's voice control instruction is "play the next episode", "switched to episode XX for you" may be displayed in that area.
The second screen area may be used to receive the user's tap or swipe operations. For example, while the first screen area displays the user's voice information, or feedback on the user's question or voice control instruction, the second screen area may simultaneously receive a swipe instruction; according to the analysis of that instruction, the screen may be turned to the next page, or the brightness or volume adjusted accordingly.
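The split into a display-only first region and a responsive second region amounts to routing each incoming event by where it lands. The sketch below assumes a simple horizontal split and pixel coordinates; both are illustrative, not from the patent.

```python
FIRST_REGION_HEIGHT = 200  # assumed height in pixels of the voice-info strip

def route_event(y, gesture):
    """Return how the device reacts to a gesture at vertical position y."""
    if y < FIRST_REGION_HEIGHT:
        return "no-response"   # first region: displays voice info only
    if gesture == "swipe":
        return "turn-page"     # e.g. page turn, brightness, or volume
    return "execute"           # taps etc. in the second region
```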
The multiple interaction modes can be active while the voice interaction device with a screen is in the desktop state or while an application program is open. For example, fig. 2 shows the desktop state, fig. 3 the state with music playing, and fig. 4 the state with video playing. In fig. 2 to 4, the voice interaction information display area shows "what's the weather like today" as an example.
In the touch interaction mode, the device can be switched to the multiple interaction modes according to a first switching instruction. The first switching instruction may be a line-of-sight switching instruction, a touch switching instruction, a gesture switching instruction, or a voice switching instruction. As shown in fig. 5, in the touch interaction mode the entire screen of the voice interaction device may receive and respond to the user's tap or swipe operations; that is, both the first screen area and the second screen area receive and respond to touch instructions.
In addition, an interaction mode switch may be arranged on the interface of the voice interaction device with a screen. When the user touches the switch, the interaction mode is toggled, and the form of the switch may change as the mode changes. Taking fig. 2 to 5 as an example, the icon at the lower left corner is the interaction mode switch. As shown in fig. 2 to 4, in the multiple interaction modes the switch may be presented in a first form; as shown in fig. 5, in the touch interaction mode it may be presented in a second form.
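The on-screen switch can be sketched as a small state machine: touching it flips the mode and re-renders its icon in the form matching the new mode. Class, method, and form names below are assumptions for illustration.

```python
class ModeSwitch:
    def __init__(self):
        self.mode = "touch"  # assume the device starts in touch interaction mode

    def toggle(self):
        """User touched the switch: flip the mode and return the new icon form."""
        self.mode = "multi" if self.mode == "touch" else "touch"
        return self.icon()

    def icon(self):
        # Rendered in a first form in the multiple interaction modes,
        # a second form in the touch interaction mode (cf. Figs. 2-5).
        return "first-form" if self.mode == "multi" else "second-form"
```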
With this scheme, in the multiple interaction modes, the voice interaction device with a screen can display the user's voice information, or feedback on the user's question or voice control instruction, while simultaneously executing the user's touch instructions. Because a multiple-interaction-mode mechanism is adopted, the user's interaction requirements can be satisfied and the interaction experience improved.
As shown in fig. 6, in one embodiment, step S101 includes:
s1011: and under the condition of receiving a first switching instruction, converting a first screen area of the voice interaction equipment with the screen into an area for displaying voice interaction information.
S1012: the second screen area is left unchanged.
The first screen area (the voice interaction information display area) is converted from an area responding to touch instructions into an area for displaying voice interaction information. The first screen area and the second screen area (the touch instruction receiving area) may partially overlap; for example, the first screen area may be overlaid on the second screen area as a covering layer. The portion of the second screen area covered by the first screen area is adjusted so as not to respond to received touch instructions, while the remaining portion of the second screen area continues to respond. Not responding to a received touch instruction means that no corresponding operation is performed after the user's touch instruction is received.
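The overlap rule above is essentially a hit test against the floating layer's rectangle: covered points receive touches but do not act on them, uncovered points keep responding. A minimal sketch, with the rectangle representation as an assumption:

```python
def responds_to_touch(x, y, overlay):
    """overlay = (left, top, width, height) of the floating first region."""
    ox, oy, ow, oh = overlay
    covered = ox <= x < ox + ow and oy <= y < oy + oh
    return not covered  # covered points receive but do not act on touches
```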
By the scheme, the voice interaction information can be displayed, and meanwhile, the touch instruction of a user can be received. The interaction requirements of the user are met, and the interaction experience of the user is improved.
As shown in fig. 7, in one embodiment, step S1011 includes:
s10111: and adjusting the first screen area to a state of not responding to the received touch instruction.
S10112: and displaying the voice interaction information in the first screen area.
As mentioned above, when the voice interaction device with a screen is in the touch interaction mode, the first screen area is an area that responds to touch instructions. Therefore, on switching, the response state of the first screen area is adjusted so that it does not respond to received touch instructions. With this adjustment, the first screen area does not respond to the user's touch instructions while displaying voice interaction information.
The voice interaction information may include the received voice information and/or feedback on the received voice information. After attributes such as font and font size are adjusted, the voice interaction information may be displayed in the first screen area in the form of a floating layer or an overlay image.
With this scheme, the first screen area does not respond to touch instructions while displaying information, so the display is not disturbed. The user's interaction requirements are satisfied and the interaction experience is improved.
In one embodiment, the first switching instruction includes one of a touch switching instruction, a voice switching instruction, a gesture switching instruction, and a line of sight switching instruction.
A touch switching instruction may be confirmed as received when it is detected that the user has tapped the interaction mode switch on the screen.
A voice switching instruction may include a specific wake-up word, such as "degree, degree", or a specific instruction such as "turn up the volume" or "play the next song". When the wake-up word or specific instruction is received, the voice switching instruction is confirmed as received.
A gesture switching instruction may include a preset specific gesture. When the user is detected making that gesture, the gesture switching instruction is confirmed as received.
A line-of-sight switching instruction may be confirmed as received when it is detected that the user's gaze has focused on the voice interaction device with a screen for longer than a certain time.
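The line-of-sight instruction can be sketched as a dwell check over gaze samples: the instruction fires once gaze has stayed on the device continuously past a threshold. The sampling model and the 2-second threshold are illustrative assumptions; the patent only says "a certain time".

```python
def gaze_switch_triggered(samples, threshold_s=2.0):
    """samples: ordered (timestamp_s, on_device) gaze samples."""
    dwell_start = None
    for t, on_device in samples:
        if not on_device:
            dwell_start = None  # gaze left the device: reset the dwell timer
            continue
        if dwell_start is None:
            dwell_start = t
        if t - dwell_start >= threshold_s:
            return True
    return False
```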
By the scheme, various switching modes can be supported, and switching means of interactive modes are enriched.
In one embodiment, the method further comprises:
and under the condition that the voice interaction equipment with the screen is in a plurality of interaction modes, switching to a touch interaction mode according to a second switching instruction.
Under the condition of multiple interaction modes, the touch interaction mode can be switched to according to the second switching instruction. The second switching instruction may include a touch switching instruction, a gesture switching instruction, a voice switching instruction, or the like. For example, switching can be performed by acquiring an interactive mode switch clicked by a user on a screen, acquiring a specific gesture of the user, or receiving a voice switching instruction of the user.
In addition, the second switching instruction may also be the expiry of a predetermined time period. For example, in video scenes such as a video conference, a movie, or a game, 90 seconds may be reserved as voice interaction time after entering the scene. Within those 90 seconds, control may be performed according to collected voice instructions and the control result displayed. For example, when voice instructions such as "turn up the volume" or "dim the brightness" are collected, the volume or brightness may be adjusted accordingly. Meanwhile, the first screen area may display the user's voice control instruction and its control result, for example "the volume has been turned up for you" or "the brightness has been adjusted for you".
After more than 90 seconds, the device may automatically switch to the touch interaction mode. Voice interaction information is no longer displayed on the screen, so the user can watch the video immersively.
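The timed fallback reduces to comparing elapsed time against the 90-second window. A sketch, with function and mode names as assumptions:

```python
VOICE_WINDOW_S = 90  # voice interaction time reserved after entering a video scene

def active_mode(now_s, scene_entered_s):
    """Return the interaction mode in effect at time now_s."""
    elapsed = now_s - scene_entered_s
    return "multi" if elapsed <= VOICE_WINDOW_S else "touch"
```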
Under the condition of switching to the touch interaction mode, an effective third switching instruction can be determined according to the time of receiving the touch instruction. And the third switching instruction is an instruction for switching the voice interaction equipment with the screen back to a plurality of interaction modes.
For example, within a first predetermined time period after switching from the multiple interaction modes to the touch interaction mode, whether the user still intends to switch can be judged from whether a touch instruction is received. If no touch instruction is received within the first predetermined time period (e.g., within 5 seconds), the user may have finished operating. In this case, the user's gaze may still rest on the display interface of the voice interaction device with a screen. For this reason, the line-of-sight switching instruction may be temporarily disabled during the first predetermined time period, and within that period only a touch switching instruction, gesture switching instruction, or voice switching instruction is determined to be a valid third switching instruction.
Once the first predetermined time period has elapsed, a touch, gesture, voice, or line-of-sight switching instruction is determined to be a valid third switching instruction.
In another case, if a touch instruction is received within the first predetermined time period, the moment at which a touch instruction was last received is determined. Taking that moment as the starting point, only a touch, gesture, or voice switching instruction is determined to be a valid third switching instruction during a second predetermined time period.
A touch instruction received within the first predetermined time period indicates that the user's intention was only to close the display of the voice interaction information. In this case, the user's touches may be continuously monitored until the last touch behavior is determined; the last touch behavior may be identified from the time interval between two consecutive touch instructions.
Once the second predetermined time period has also elapsed, a touch, gesture, voice, or line-of-sight switching instruction may be determined to be a valid third switching instruction.
The first predetermined time period may take any value from 5 to 10 seconds; the second predetermined time period may be 10 seconds.
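The two timing windows can be combined into one validity check: inside the active window the line-of-sight instruction is suppressed, outside it gaze becomes valid again. The defaults follow the stated values (first period 5 s, second period 10 s); everything else is an illustrative assumption.

```python
ALWAYS_VALID = {"touch", "gesture", "voice"}

def is_valid_third_switch(kind, now_s, switched_s, last_touch_s=None,
                          first_period_s=5.0, second_period_s=10.0):
    """Decide whether `kind` is a valid instruction to switch back
    to the multiple interaction modes."""
    if last_touch_s is not None:
        # A touch arrived: measure the second window from the last touch.
        in_window = now_s - last_touch_s <= second_period_s
    else:
        # No touch yet: measure the first window from the mode switch.
        in_window = now_s - switched_s <= first_period_s
    if kind in ALWAYS_VALID:
        return True
    return kind == "gaze" and not in_window
```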
By the scheme, the switching of the interaction mode can be realized. In some specific scenarios, automatic switching may be achieved using the second switching instruction, thereby simplifying the switching process.
As shown in fig. 8, in an embodiment, the switching to the touch interaction mode includes:
s801: and under the condition of receiving a second switching instruction, converting the first screen area of the voice interaction equipment with the screen into an area responding to the touch instruction.
S802: the second screen area is left unchanged.
In the touch interaction mode, the entire screen of the voice interaction device may receive and respond to the user's tap or swipe operations; that is, both the first screen area and the second screen area receive and respond to touch instructions. When the second switching instruction is received, the mode switch may be implemented by changing the response state of the first screen area.
With this scheme, on switching to the touch interaction mode, the full screen is restored and can respond to touch instructions.
In one embodiment, the second switching instruction includes one of a touch switching instruction, a voice switching instruction, and a gesture switching instruction.
Compared with the first switching instruction, the second switching instruction omits the line-of-sight switching instruction. Since the multiple interaction modes may be active during video playback, a line-of-sight instruction could cause accidental switching. For this reason, only the touch, voice, and gesture switching instructions, which express the user's subjective intention, are retained in the second switching instruction.
With this scheme, only switching instructions that express the user's subjective intention are treated as valid when switching from the multiple interaction modes to the touch interaction mode, so accidental switching can be avoided.
As shown in fig. 9, in one embodiment, there is provided an interactive mode switching apparatus, including the following components:
the first interaction mode switching module 901 is configured to switch to multiple interaction modes according to a first switching instruction when the voice interaction device with a screen is in the touch interaction mode; the multiple interaction modes are voice interaction information display supporting and touch interaction modes.
An interaction mode display module 902, configured to, in a case of multiple interaction modes, set a first screen area of the voice interaction device with a screen as an area for displaying voice interaction information, and set a second screen area as an area for responding to a touch instruction;
and under the condition of a touch interaction mode, taking a first screen area and a second screen area of the voice interaction equipment with the screen as areas responding to touch instructions.
As shown in fig. 10, in one embodiment, the first interaction mode switching module 901 includes:
the first area conversion sub-module 9011 is configured to, in a case that the first switching instruction is received, convert the first screen area of the on-screen voice interaction device into an area for displaying voice interaction information.
A first region holding sub-module 9012 for holding the second screen region unchanged.
As shown in fig. 11, in an embodiment, the first region conversion sub-module 9011 includes:
the state transition unit 90111 is configured to adjust the first screen area to a state that does not respond to the received touch instruction.
And an information display unit 90112, configured to display the voice interaction information in the first screen area.
In one embodiment, the first switching instruction includes one of a touch switching instruction, a voice switching instruction, a gesture switching instruction, and a line of sight switching instruction.
In one embodiment, the apparatus further includes:
and a second interaction mode switching module, configured to switch to the touch interaction mode according to a second switching instruction when the voice interaction device with a screen is in the multiple interaction modes.
As shown in fig. 12, in an embodiment, the second interaction mode switching module includes:
The second area conversion sub-module 1201 is configured to convert the first screen area of the voice interaction device with a screen into an area responding to touch instructions when the second switching instruction is received.
A second area holding sub-module 1202, configured to keep the second screen area unchanged.
In one embodiment, the second switching instruction includes one of a touch switching instruction, a voice switching instruction, and a gesture switching instruction.
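One way to read the two embodiments above is as two allow-lists of instruction types, where the gaze (line-of-sight) instruction is listed only for the first (to-multiple-modes) switch. A hedged sketch — the set names and the helper `is_valid_switch` are invented for illustration:

```python
# Instruction types that can trigger each switch direction, per the
# embodiments above; the second (back-to-touch) switch does not list gaze.
FIRST_SWITCH_TYPES = {"touch", "voice", "gesture", "gaze"}
SECOND_SWITCH_TYPES = {"touch", "voice", "gesture"}

def is_valid_switch(instruction_type: str, to_multi: bool) -> bool:
    """Return True if the instruction type can trigger the requested switch."""
    allowed = FIRST_SWITCH_TYPES if to_multi else SECOND_SWITCH_TYPES
    return instruction_type in allowed
```

This asymmetry is consistent with the earlier point about deliberate intent: a gaze alone is accepted for entering the multiple interaction modes, but not for leaving them.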
For the functions of the modules in the apparatuses of the embodiments of the present application, reference may be made to the corresponding descriptions in the method above; details are not repeated here.
According to an embodiment of the present application, an electronic device and a readable storage medium are also provided.
As shown in fig. 13, the electronic device includes: one or more processors 1310, a memory 1320, and interfaces for connecting the components, including a high-speed interface and a low-speed interface. The components are interconnected using different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions for execution within the electronic device, including instructions stored in or on the memory to display graphical information of a GUI on an external input/output apparatus (such as a display device coupled to the interface). In other embodiments, multiple processors and/or multiple buses may be used together with multiple memories, as desired. Likewise, multiple electronic devices may be connected, with each device providing part of the necessary operations (e.g., as a server array, a group of blade servers, or a multi-processor system). One processor 1310 is taken as an example in fig. 13.
The memory 1320, as a non-transitory computer-readable storage medium, may be used to store non-transitory software programs, non-transitory computer-executable programs, and modules, such as the program instructions/modules corresponding to the interaction mode switching method in the embodiments of the present application (for example, the first interaction mode switching module 901 and the interaction mode display module 902 shown in fig. 9). The processor 1310 executes various functional applications of the server and performs data processing, i.e., implements the interaction mode switching method of the above method embodiments, by running the non-transitory software programs, instructions, and modules stored in the memory 1320.
The memory 1320 may include a program storage area and a data storage area, where the program storage area may store an operating system and an application program required for at least one function, and the data storage area may store data created according to the use of the electronic device implementing the interaction mode switching method, and the like. Further, the memory 1320 may include high-speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid-state storage device. In some embodiments, the memory 1320 may optionally include memories located remotely from the processor 1310, and these remote memories may be connected over a network to the electronic device implementing the interaction mode switching method. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The electronic device implementing the interaction mode switching method may further include: an input device 1330 and an output device 1340. The processor 1310, the memory 1320, the input device 1330, and the output device 1340 may be connected by a bus or in other manners; in fig. 13, connection by a bus is taken as an example.
The input device 1330 may receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic device, and may be, for example, a touch screen, keypad, mouse, trackpad, touchpad, pointing stick, one or more mouse buttons, trackball, or joystick. The output device 1340 may include a display device, auxiliary lighting devices (e.g., LEDs), tactile feedback devices (e.g., vibration motors), and the like. The display device may include, but is not limited to, a liquid crystal display (LCD), a light-emitting diode (LED) display, and a plasma display. In some implementations, the display device may be a touch screen.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, ASICs (application-specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special-purpose or general-purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
These computer programs (also known as programs, software applications, or code) include machine instructions for a programmable processor, and may be implemented using high-level procedural and/or object-oriented programming languages, and/or assembly/machine languages. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
It should be understood that the various forms of flow shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present application may be executed in parallel, sequentially, or in a different order, as long as the desired results of the technical solutions disclosed in the present application can be achieved, which is not limited herein.
The above-described embodiments should not be construed as limiting the scope of the present application. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.
Claims (14)
1. An interaction mode switching method, applied to a voice interaction device with a screen, characterized by comprising:
under the condition that the voice interaction device with a screen is in a touch interaction mode, switching to multiple interaction modes according to a first switching instruction; wherein the multiple interaction modes support both voice interaction information display and touch interaction;
under the condition of the multiple interaction modes, a first screen area of the voice interaction device with a screen is an area for displaying voice interaction information, and a second screen area of the voice interaction device with a screen is an area for responding to touch instructions;
and under the condition of the touch interaction mode, the first screen area and the second screen area of the voice interaction device with a screen are areas responding to touch instructions.
2. The method according to claim 1, wherein the switching to the plurality of interaction modes according to the first switching instruction comprises:
under the condition that the first switching instruction is received, converting the first screen area of the voice interaction device with a screen into an area for displaying voice interaction information;
and keeping the second screen area unchanged.
3. The method of claim 2, wherein converting the first screen area of the voice interaction device with a screen into the area for displaying the voice interaction information comprises:
adjusting the first screen area to a state of not responding to the received touch instruction;
and displaying the voice interaction information in the first screen area.
4. The method of claim 1 or 2, wherein the first switching instruction comprises one of a touch switching instruction, a voice switching instruction, a gesture switching instruction, and a line of sight switching instruction.
5. The method of claim 1, further comprising:
and under the condition that the voice interaction device with a screen is in the multiple interaction modes, switching to the touch interaction mode according to a second switching instruction.
6. The method of claim 5, wherein switching to the touch interaction mode comprises:
under the condition that the second switching instruction is received, converting the first screen area of the voice interaction device with a screen into an area responding to touch instructions;
and keeping the second screen area unchanged.
7. The method of claim 5 or 6, wherein the second switching instruction comprises one of a touch switching instruction, a voice switching instruction, and a gesture switching instruction.
8. An interaction mode switching apparatus, applied to a voice interaction device with a screen, characterized by comprising:
a first interaction mode switching module, configured to switch to multiple interaction modes according to a first switching instruction under the condition that the voice interaction device with a screen is in a touch interaction mode; wherein the multiple interaction modes support both voice interaction information display and touch interaction;
and an interaction mode display module, configured to, under the condition of the multiple interaction modes, use a first screen area of the voice interaction device with a screen as an area for displaying voice interaction information and a second screen area as an area responding to touch instructions;
and, under the condition of the touch interaction mode, use the first screen area and the second screen area of the voice interaction device with a screen as areas responding to touch instructions.
9. The apparatus of claim 8, wherein the first interaction mode switching module comprises:
a first area conversion sub-module, configured to convert the first screen area of the voice interaction device with a screen into an area for displaying voice interaction information when the first switching instruction is received;
and a first area holding sub-module, configured to keep the second screen area unchanged.
10. The apparatus of claim 9, wherein the first area conversion sub-module comprises:
a state conversion unit, configured to adjust the first screen area to a state in which it does not respond to received touch instructions;
and an information display unit, configured to display the voice interaction information in the first screen area.
11. The apparatus of claim 8, further comprising:
and a second interaction mode switching module, configured to switch to the touch interaction mode according to a second switching instruction under the condition that the voice interaction device with a screen is in the multiple interaction modes.
12. The apparatus of claim 11, wherein the second interaction mode switching module comprises:
a second area conversion sub-module, configured to convert the first screen area of the voice interaction device with a screen into an area responding to touch instructions when the second switching instruction is received;
and a second area holding sub-module, configured to keep the second screen area unchanged.
13. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1 to 7.
14. A non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the method of any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010261062.3A CN113495621A (en) | 2020-04-03 | 2020-04-03 | Interactive mode switching method and device, electronic equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113495621A (en) | 2021-10-12
Family
ID=77995267
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010261062.3A Pending CN113495621A (en) | 2020-04-03 | 2020-04-03 | Interactive mode switching method and device, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113495621A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2023173888A1 (en) * | 2022-03-18 | 2023-09-21 | 上海瑾盛通信科技有限公司 | Interface interaction method and apparatus, and mobile terminal and storage medium |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103838487A (en) * | 2014-03-28 | 2014-06-04 | 联想(北京)有限公司 | Information processing method and electronic device |
US20150133197A1 (en) * | 2013-11-08 | 2015-05-14 | Samsung Electronics Co., Ltd. | Method and apparatus for processing an input of electronic device |
KR20160097467A (en) * | 2015-02-08 | 2016-08-18 | 박남태 | The method of voice control for display device and voice control display device |
CN108196675A (en) * | 2017-12-29 | 2018-06-22 | 珠海市君天电子科技有限公司 | For the exchange method, device and touch control terminal of touch control terminal |
CN108595085A (en) * | 2018-04-03 | 2018-09-28 | Oppo广东移动通信有限公司 | A kind of information processing method, electronic equipment and storage medium |
KR20180109214A (en) * | 2017-03-27 | 2018-10-08 | 삼성전자주식회사 | Touch input processing method and electronic device supporting the same |
CN108804010A (en) * | 2018-05-31 | 2018-11-13 | 北京小米移动软件有限公司 | Terminal control method, device and computer readable storage medium |
CN108920085A (en) * | 2018-06-29 | 2018-11-30 | 百度在线网络技术(北京)有限公司 | Information processing method and device for wearable device |
CN109712621A (en) * | 2018-12-27 | 2019-05-03 | 维沃移动通信有限公司 | A kind of interactive voice control method and terminal |
CN109830233A (en) * | 2019-01-22 | 2019-05-31 | Oppo广东移动通信有限公司 | Exchange method, device, storage medium and the terminal of voice assistant |
Non-Patent Citations (1)
Title |
---|
NIE Bo; WANG Xugang; WANG Hong'an; WANG Gang: "A General Development Framework for Multi-channel Interaction in Handheld Mobile Devices", Application Research of Computers, no. 09, 15 September 2007 (2007-09-15) *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7159358B2 (en) | Video access method, client, device, terminal, server and storage medium | |
US8756516B2 (en) | Methods, systems, and computer program products for interacting simultaneously with multiple application programs | |
US10042420B2 (en) | Gaze-aware control of multi-screen experience | |
US10394437B2 (en) | Custom widgets based on graphical user interfaces of applications | |
US20080196038A1 (en) | Utilizing a first managed process to host at least a second managed process | |
US9720567B2 (en) | Multitasking and full screen menu contexts | |
US20210149558A1 | Method and apparatus for controlling terminal device, and non-transitory computer-readable storage medium | |
KR102358012B1 (en) | Speech control method and apparatus, electronic device, and readable storage medium | |
CN111586459B (en) | Method and device for controlling video playing, electronic equipment and storage medium | |
CN112148160B (en) | Floating window display method and device, electronic equipment and computer readable storage medium | |
CN113014939A (en) | Display device and playing method | |
US20160021417A1 (en) | Interactive system and method for intelligent television | |
CN112055261A (en) | Subtitle display method and device, electronic equipment and storage medium | |
KR20210038278A (en) | Speech control method and apparatus, electronic device, and readable storage medium | |
CN113495621A (en) | Interactive mode switching method and device, electronic equipment and storage medium | |
CN112002321B (en) | Display device, server and voice interaction method | |
CN113495620A (en) | Interactive mode switching method and device, electronic equipment and storage medium | |
CN112584280A (en) | Control method, device, equipment and medium for intelligent equipment | |
CN112578962A (en) | Information flow display method, device, equipment and medium | |
CN113495622A (en) | Interactive mode switching method and device, electronic equipment and storage medium | |
CN113727165A (en) | Video live broadcast method and device, electronic equipment and storage medium | |
CN112199560A (en) | Setting item searching method and display device | |
CN113676744A (en) | Switching control method and device for live broadcast room, electronic equipment and storage medium | |
CN115079919A (en) | Multi-window control method and device, electronic equipment and storage medium | |
KR20210038277A (en) | Speech control method and apparatus, electronic device, and readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||