CN115421682A - Device control method, device, electronic device and storage medium - Google Patents
Device control method, device, electronic device and storage medium
- Publication number
- CN115421682A (application CN202110601591.8A / CN202110601591A)
- Authority
- CN
- China
- Prior art keywords
- instruction
- time period
- recommendation
- content
- electronic device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/167—Audio in a user interface, e.g. using voice commands for navigating, audio feedback
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/22—Procedures used during a speech recognition process, e.g. man-machine dialogue
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- General Physics & Mathematics (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Multimedia (AREA)
- General Health & Medical Sciences (AREA)
- Computational Linguistics (AREA)
- Acoustics & Sound (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The embodiments of the present application disclose a device control method and apparatus, an electronic device, and a storage medium. Instruction content of a recommendation instruction is first displayed, and the recommendation instruction is then executed by a digital voice assistant in response to a trigger operation acting on the instruction content. In this way, the electronic device actively presents the instruction content of the recommendation instruction to the user, and a trigger operation acting directly on the instruction content can trigger the digital voice assistant to execute the recommendation instruction. In scenarios where it is inconvenient for the user to speak, the trigger operation on the instruction content replaces actual voice input, so instructions can still be executed through the digital voice assistant, improving the user experience of the digital voice assistant.
Description
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a device control method and apparatus, an electronic device, and a storage medium.
Background
With the development of speech recognition technology, digital voice assistants are commonly provided in electronic devices to help users control them. For example, a user may control the electronic device by voice to start a desired application program. However, the user experience of the digital voice assistant in related electronic devices still needs to be improved.
Disclosure of Invention
In view of the foregoing, embodiments of the present application provide a device control method and apparatus, an electronic device, and a storage medium to address the problems described above.
In a first aspect, an embodiment of the present application provides a device control method, which is applied to an electronic device including a digital voice assistant, and the method includes: displaying instruction content of the recommendation instruction; and executing the recommendation instruction by the digital voice assistant in response to the trigger operation acting on the instruction content.
In a second aspect, an embodiment of the present application provides a device control apparatus configured to run in an electronic device that includes a digital voice assistant. The apparatus includes: an instruction display unit configured to display instruction content of a recommendation instruction; and an instruction execution unit configured to execute the recommendation instruction through the digital voice assistant in response to a trigger operation acting on the instruction content.
In a third aspect, an embodiment of the present application provides an electronic device, including one or more processors and a memory; one or more programs are stored in the memory and configured to be executed by the one or more processors to implement the methods described above.
In a fourth aspect, the present application provides a computer-readable storage medium, in which a program code is stored, wherein when the program code is executed by a processor, the method described above is performed.
According to the device control method and apparatus, the electronic device, and the storage medium, the instruction content of the recommendation instruction is displayed first, and the digital voice assistant then executes the recommendation instruction in response to a trigger operation acting on the instruction content. In this way, the electronic device actively presents the instruction content of the recommendation instruction to the user, and a trigger operation acting directly on the instruction content can trigger the digital voice assistant to execute the recommendation instruction. In scenarios where it is inconvenient for the user to speak, the trigger operation on the instruction content replaces actual voice input, so instructions can still be executed through the digital voice assistant, improving the user experience of the digital voice assistant.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings used in the description of the embodiments are briefly introduced below. The following drawings show only some embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart illustrating a device control method according to an embodiment of the present application;
FIG. 2 is a schematic diagram illustrating the display of a recommendation instruction in an embodiment of the application;
FIG. 3 is a schematic diagram illustrating a display area for displaying a recommendation instruction occupying the display position of an application icon in an embodiment of the application;
FIG. 4 is a schematic diagram illustrating another way of displaying a recommendation instruction in an embodiment of the present application;
FIG. 5 is a schematic diagram illustrating the triggering of a recommendation instruction in an embodiment of the present application;
FIG. 6 is a flow chart illustrating a method for controlling a device according to another embodiment of the present application;
FIG. 7 shows a flowchart of one embodiment of S210 in FIG. 6;
FIG. 8 is a schematic diagram showing corresponding dates in an embodiment of the present application;
FIG. 9 is a schematic diagram showing the recording of the time period in which an instruction is executed and the position of the electronic device in an embodiment of the application;
FIG. 10 is a schematic diagram illustrating one embodiment of determining a recommendation instruction in the present application;
fig. 11 is a flowchart illustrating a device control method according to still another embodiment of the present application;
FIG. 12 is a schematic diagram showing another example of determining a recommendation instruction in the embodiment of the present application;
fig. 13 is a flowchart illustrating a device control method according to still another embodiment of the present application;
fig. 14 is a flowchart illustrating a device control method according to still another embodiment of the present application;
FIG. 15 is a diagram showing the result of an instruction executed by the digital voice assistant in an embodiment of the present application;
FIG. 16 is a diagram showing recognized voice content in an embodiment of the present application;
FIG. 17 is a schematic diagram showing an interface for displaying the result of instruction execution in an embodiment of the present application;
FIG. 18 is a schematic diagram showing the addition of cards to the desktop in an embodiment of the application;
FIG. 19 shows a schematic view of a card moving from a desktop to minus one screen in an embodiment of the application;
FIG. 20 shows a schematic view of a card moving from minus one screen to a desktop in an embodiment of the application;
FIG. 21 shows a schematic view of a card being removed from a desktop in an embodiment of the application;
FIG. 22 shows a schematic view of a card being removed from minus one screen in an embodiment of the present application;
fig. 23 shows a block diagram of a device control apparatus according to an embodiment of the present application;
fig. 24 shows a block diagram of an electronic device for executing the device control method according to an embodiment of the present application;
fig. 25 shows a storage unit for storing or carrying program code that implements the device control method according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only some embodiments of the present application, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Voice assistants are increasingly used in electronic devices. When a voice assistant is installed in an electronic device, the user typically first operates the device to call out the voice assistant and then triggers it to control the device by voice input. For example, after the voice assistant is called out, the user says "start music software", and the voice assistant controls the electronic device to start the installed music software.
The inventor's research on related voice assistants found that their user experience still needs improvement. For example, the user must first operate the electronic device to call out the voice assistant before a voice command can be input, which results in a long interaction path. Furthermore, in some scenarios it may be inconvenient for the user to speak, and the voice assistant then cannot be used to control the electronic device.
Therefore, the inventor proposes the device control method and apparatus, electronic device, and storage medium of the embodiments of the present application. In the device control method, the electronic device first displays the instruction content of a recommendation instruction and then executes the recommendation instruction through the digital voice assistant in response to a trigger operation acting on the instruction content. The electronic device thus actively presents the instruction content of the recommendation instruction to the user, and a trigger operation acting directly on the instruction content can trigger the digital voice assistant to execute it. In scenarios where speaking is inconvenient (for example, in a meeting or a public place), the trigger operation on the instruction content replaces actual voice input, so instructions can still be executed through the digital voice assistant, improving the user experience of the digital voice assistant.
Embodiments included in the present application will be described with reference to the accompanying drawings.
Referring to fig. 1, fig. 1 is a flowchart illustrating a device control method according to an embodiment of the present disclosure, applied to an electronic device. The method includes:
S110: Display instruction content of a recommendation instruction, where the recommendation instruction is generated by the digital voice assistant.
A recommendation instruction can be understood as a control instruction that the electronic device actively presents to the user; triggering the instruction controls the electronic device. In the embodiment of the present application, the electronic device may determine the recommendation instruction in various ways.
As one approach, the electronic device may determine the recommendation instruction to be currently displayed based on historical control instructions. For example, the electronic device may infer the user's habits for the current day or week from the user's habits in the previous day or week, and thereby determine how the user is likely to control the electronic device within a certain time period, so as to determine the corresponding recommendation instruction to be displayed. For example, if today is the Saturday of week 18, the electronic device may estimate the recommendation instruction to be displayed today from the control instructions executed on the Saturday of week 17. For another example, if the current time is 10:15, the electronic device may estimate the recommendation instruction to be displayed now from the control instructions executed between 10:00 and 11:00 on each day of the previous week.
Alternatively, the electronic device may determine the recommendation instruction based on a display scene in which the electronic device is currently located. The electronic device may determine the current display scene according to an application program in the electronic device that is running in the foreground. Optionally, if the application program running in the foreground in the electronic device is a text content browsing program, the current scene is a text display scene.
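To make the scene-based approach concrete, the following minimal Kotlin sketch maps a foreground-application category to a display scene and then to a candidate recommendation instruction. The `ForegroundApp` type, the category strings, and the scene-to-instruction mapping are illustrative assumptions rather than part of the disclosed method.

```kotlin
// Illustrative sketch only: categories, types, and the mapping are assumed.
enum class DisplayScene { TEXT_DISPLAY, TRAVEL_CONTENT, UNKNOWN }

data class ForegroundApp(val packageName: String, val category: String)

// Derive the current display scene from the app running in the foreground.
fun sceneOf(app: ForegroundApp?): DisplayScene = when (app?.category) {
    "text_browser" -> DisplayScene.TEXT_DISPLAY   // text content browsing program
    "travel"       -> DisplayScene.TRAVEL_CONTENT // travel-related application
    else           -> DisplayScene.UNKNOWN
}

// Pick a recommendation instruction for the scene; returns null when no
// scene-specific recommendation is defined (an assumed fallback).
fun recommendationFor(scene: DisplayScene): String? = when (scene) {
    DisplayScene.TRAVEL_CONTENT -> "Query the weather of XX"
    else                        -> null
}
```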
In the embodiment of the present application, the display position of the recommendation instruction may be implemented in multiple ways.
As one way, as shown in fig. 2, the recommendation instruction may be displayed in a display area 11 in the desktop 10, where the "XX instruction" in the display area 11 is the instruction content of the recommendation instruction. In this manner, the display area 11 can be a display area at the same level as the application icons in the desktop 10. "Same level" means that, while being moved on the desktop, the display area 11 can occupy the display position of an application icon if it comes into contact with that position. For example, as shown in fig. 3, in the desktop editing mode, when the display area 11 is moved in the direction of the dotted arrow in the leftmost image of fig. 3 to the position shown in the middle image, it overlaps the application icon named "chat"; that icon is then moved in the direction of the dotted arrow in the middle image to the position shown in the right image, so that the display area 11 occupies the icon's original display position. Upon exiting the desktop editing mode from the situation shown in the right image of fig. 3, the display area 11 and the application icons on the desktop remain in their current display positions.
Alternatively, as shown in fig. 4, the recommendation instruction may be displayed in a floating pop-up window 12. In this way, when the electronic device determines that a recommendation instruction needs to be displayed, it displays the pop-up window 12 and shows the recommendation instruction in it. Optionally, if the electronic device detects a sliding operation on the pop-up window 12, the pop-up window 12 may be moved out of the display area of the electronic device along the sliding direction. The sliding operation may include a pressing operation and a sliding phase continuous with it, where "continuous" means that the object performing the pressing operation enters the sliding phase directly while remaining in contact with the screen. A sliding operation applied to the pop-up window 12 is understood to include a pressing operation applied directly to the pop-up window 12.
The timing for displaying the recommendation instruction can be determined in various ways.
As one mode, when the recommendation instruction is displayed in a display area provided in the desktop, the display of the recommendation instruction in the display area may start after the display area has been set up on the desktop and displayed.
As another mode, when the recommendation instruction is displayed through the pop-up window, a switch control may be configured in the electronic device to control whether the electronic device acquires and displays recommendation instructions. If the electronic device detects that the switch control is in the on state, it starts determining the recommendation instruction that currently needs to be displayed and shows it in a pop-up window. If the electronic device detects that the switch control is in the off state, it stops determining recommendation instructions.
S120: and executing the recommendation instruction by the digital voice assistant in response to the trigger operation acting on the instruction content.
By one approach, the trigger operation applied to the instruction content may include a touch operation applied to the instruction content, such as a click, a long press, or a double click. For example, if the electronic device detects a click on the instruction content, it may execute the recommendation instruction through the digital voice assistant. As shown in fig. 5, if the instruction content of the displayed recommendation instruction is "weather today" and a click acts on "weather today" as shown in the middle image of fig. 5, the digital voice assistant may control the electronic device to display the interface 13 shown in the right image of fig. 5, which presents the detailed weather for today.
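As a rough sketch of this trigger path, the snippet below forwards a tap on the displayed instruction content to the digital voice assistant for execution; the `DigitalVoiceAssistant` interface and the `onInstructionTapped` callback name are assumptions made for illustration.

```kotlin
// Assumed interface; the actual assistant component is not specified in the text.
interface DigitalVoiceAssistant {
    fun execute(instructionContent: String)
}

class RecommendationCard(
    private val assistant: DigitalVoiceAssistant,
    var instructionContent: String          // e.g. "weather today"
) {
    // Invoked by the UI layer when a click, long press, or double click
    // lands on the instruction content (the trigger operation of S120).
    fun onInstructionTapped() {
        // The tap stands in for actual voice input: the recommendation
        // instruction is handed to the digital voice assistant to execute.
        assistant.execute(instructionContent)
    }
}
```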
Alternatively, a correspondence between a physical key combination and the instruction content may be established. In this manner, if the physical key combination corresponding to the instruction content is detected to be pressed, it is determined that there is a trigger operation acting on the instruction content. For example, if the physical key combination is the power key plus the volume-up key, a trigger operation on the instruction content is determined when the power key and the volume-up key are detected to be pressed together. Similarly, if the combination is the volume-up key plus the volume-down key, a trigger operation is determined when those two keys are pressed together.
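A hedged sketch of the key-combination variant, treating key codes as plain integers rather than any particular platform's constants; the codes and the combination-to-instruction map below are assumptions.

```kotlin
// Hypothetical key codes; real platforms define their own constants and may
// restrict which keys an application can observe.
const val KEY_POWER = 1
const val KEY_VOLUME_UP = 2
const val KEY_VOLUME_DOWN = 3

// Assumed correspondence between key combinations and instruction content.
val keyComboToInstruction: Map<Set<Int>, String> = mapOf(
    setOf(KEY_POWER, KEY_VOLUME_UP) to "weather today",
    setOf(KEY_VOLUME_UP, KEY_VOLUME_DOWN) to "start music software"
)

// If the currently pressed keys match a registered combination, the bound
// instruction content is treated as having received a trigger operation.
fun triggeredInstruction(pressedKeys: Set<Int>): String? =
    keyComboToInstruction[pressedKeys]
```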
Executing the recommendation instruction through the digital voice assistant can be understood as the digital voice assistant itself carrying out the recommendation instruction. For example, if the instruction content of the recommendation instruction is "weather today", the digital voice assistant may query the weather over the network and then display the query result in an interface (e.g., interface 13 in fig. 5).
In this embodiment, S110 and S120 may be executed by the digital voice assistant itself; in that case the digital voice assistant displays the recommendation instruction, detects whether a trigger operation acts on the instruction content, and executes the recommendation instruction directly once it is triggered. Alternatively, S110 and S120 may be executed by a module of the electronic device other than the digital voice assistant; in that case, when the recommendation instruction needs to be executed by the digital voice assistant, the recommendation instruction is transmitted to the digital voice assistant for execution.
In the device control method provided by this embodiment, the instruction content of the recommendation instruction is displayed first, and the digital voice assistant then executes the recommendation instruction in response to the trigger operation acting on the instruction content. The electronic device thus actively presents the instruction content of the recommendation instruction to the user, and a trigger operation acting directly on the instruction content can trigger the digital voice assistant to execute it. In scenarios where it is inconvenient for the user to speak, the trigger operation on the instruction content replaces actual voice input, so instructions can still be executed through the digital voice assistant, improving the user experience of the digital voice assistant.
Referring to fig. 6, fig. 6 is a flowchart illustrating a device control method according to an embodiment of the present disclosure. The method includes:
S210: Obtain a recommendation instruction based on historical control instructions corresponding to the environment parameter in which the device is currently located, where the recommendation instruction is generated by the digital voice assistant.
Optionally, in this embodiment of the application, when the electronic device executes the control instruction, the electronic device may record an environment in which the control instruction is executed, so as to obtain an environment parameter of the device corresponding to the control instruction. That is, the environment parameters of the device may be used to record the environment in which the instructions are executed. In the embodiment of the present application, the environmental parameter of the device may include at least one of a time period and a location. Wherein the time period represents a time period in which the control instructions are executed, and the location represents a location at which the electronic device is located when the control instructions are executed.
As one mode, as shown in fig. 7, the environment parameter of the device includes a time period, and obtaining a recommendation instruction based on historical control instructions corresponding to the current environment parameter of the device includes:
S211: Acquire a historical time period corresponding to the current time period, and take the control instructions correspondingly executed in that historical time period as to-be-processed instructions.
The time period in the embodiment of the present application may be obtained by dividing the day into intervals. For example, dividing the day at 1-hour intervals yields 24 time periods, each 1 hour long; dividing it at 15-minute intervals yields 96 time periods, each 15 minutes long.
It should be noted that a user may have the same operating habits in the same time period on different days. For example, on weekdays the user may open a takeaway application between 11:00 and 12:00 to order takeout, which means the electronic device executes a control instruction for starting the takeaway application once between 11:00 and 12:00 on each weekday. By acquiring the historical time period corresponding to the current time period, the electronic device can obtain which control instructions were executed in that historical time period and then estimate, from them, the control instruction the user is most likely to need in the current time period.
In the present embodiment, the corresponding historical time period may be the same time period on the date in the previous period that corresponds to the date of the current time period, where the corresponding date is the date occupying the same ranked position within its period.
Illustratively, as shown in fig. 8, one week is taken as a period, and the previous period contains the seven dates from the 3rd to the 9th. If the current time period falls on the 12th, the date of the current time period is the 12th, which is ranked as the third day of the current period; the corresponding date in the previous period is therefore the 5th, which is also ranked third. If the current time period is 11:00 to 11:15, the corresponding historical time period is 11:00 to 11:15 on the 5th. For another example, if the current time period falls on the 16th, which is ranked as the seventh day of the current period, the corresponding date in the previous period is the 9th, also ranked seventh; if the current time period is 11:00 to 12:00, the corresponding historical time period is 11:00 to 12:00 on the 9th.
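The slot arithmetic and the date correspondence described above can be sketched as follows, using `java.time` and assuming, as in fig. 8, that the period length is one week; the data shapes are assumptions.

```kotlin
import java.time.LocalDate
import java.time.LocalDateTime

// Index of the time period containing `now`, for a day divided into
// slots of `slotMinutes` (e.g. 15-minute slots give indices 0..95).
fun slotIndex(now: LocalDateTime, slotMinutes: Int): Int =
    (now.hour * 60 + now.minute) / slotMinutes

data class HistoricalPeriod(val date: LocalDate, val slot: Int)

// The corresponding historical time period: the same slot on the date that
// occupies the same ranked position in the previous one-week period.
fun correspondingHistoricalPeriod(now: LocalDateTime, slotMinutes: Int): HistoricalPeriod =
    HistoricalPeriod(now.toLocalDate().minusWeeks(1), slotIndex(now, slotMinutes))
```

With 15-minute slots, 11:05 on the 12th maps to slot 44 on the 5th, which corresponds to the 11:00 to 11:15 example above.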
S212: and acquiring the execution times of the same instruction to be processed in the corresponding historical time period.
As described above, in the case of recording the time period during which each control instruction is executed, in the process of determining the recommended instruction, which control instructions are executed in the history time period and the number of times that the same control instruction is executed may be obtained. Illustratively, the control instructions for execution recorded in chronological order over the historical time period include instruction a, instruction B, instruction a, instruction C, and instruction a. Then instruction a executed 3 words, instruction B executed 1 word, and instruction C executed 1 time during the historical period.
S213: and acquiring the distance between the position of the same instruction to be processed, which is executed each time in the corresponding historical time period, and the current position to obtain a plurality of reference distances.
It should be noted that, the execution of some control commands may have a certain relationship with the current location of the electronic device. For example, if the electronic device is located at the work site of the user, the control command instructed by the electronic device may be more work-related commands, such as starting a takeaway application, or starting an information query program, or starting a taxi-taking program. For another example, if the electronic device is in the user's residence, the control instructions that the electronic device may execute are more life-related instructions, such as launching a shopping application, querying the weather, etc. Therefore, the position of the control instruction when being executed is recorded, so that the recommended instruction required by the user at a certain moment or time period can be more accurately determined. Correspondingly, in the embodiment of the application, when the control instruction is instructed, in addition to recording the time (or the time period) when the control instruction is executed, the position of the electronic device when the control instruction is instructed can be recorded. For example, as shown in fig. 9, the execution time and the execution position (i.e., the location 1 and the location 2 in the figure) of the control instruction executed by the electronic device every day may be recorded. For example, in fig. 9, the control commands are recorded for one time period of 7 days, and for example, the control commands executed in one day of the time period shown in the figure include command a, command B, command C, and command D. Wherein the location of the electronic device when the instruction a is executed comprises a location 1.
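A minimal shape for the per-execution records implied here might look like the following; the field names and the use of a latitude/longitude pair are assumptions, since the text only states that the execution time (or time period) and the device position are recorded.

```kotlin
import java.time.LocalDateTime

data class GeoPosition(val latitude: Double, val longitude: Double)

// One history entry: which instruction ran, when, and where the device was.
data class ExecutionRecord(
    val instruction: String,        // e.g. "instruction A"
    val executedAt: LocalDateTime,  // used to derive the time period
    val position: GeoPosition       // device location at execution time
)

// The execution history is an append-only list of such records.
val executionHistory = mutableListOf<ExecutionRecord>()

fun recordExecution(instruction: String, at: LocalDateTime, where: GeoPosition) {
    executionHistory.add(ExecutionRecord(instruction, at, where))
}
```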
S214: and determining a recommended instruction from the instructions to be processed based on the number of times corresponding to each instruction to be processed and the plurality of reference distances.
Optionally, the determining, based on the number of times corresponding to each instruction to be processed and the plurality of reference distances, a recommended instruction from the instructions to be processed includes:
and acquiring a recommended value corresponding to each instruction to be processed, wherein the recommended value is a ratio of a first value and a second value, the first value is a product of the times and a reference value, and the second value is an average value of the multiple reference distances. The reference value may be configured according to actual needs, for example, the reference value may be configured to be 1000.
And taking the instruction to be processed with the maximum corresponding recommendation value as a recommendation instruction.
The following explains the determination process of the recommended instruction again with reference to fig. 10.
As shown in fig. 10, one week is taken as one period in this example. This means that, when determining the recommendation instruction for a date in the current period, the determination is made based on the corresponding date in the previous period. For example, if the current time period falls on the 13th of the current period, the corresponding date in the previous period is the 6th. The electronic device divides each date into N time periods; if the current time period is the second time period of the 13th, the corresponding historical time period is the second time period of the 6th. The control instructions executed in the second time period of the 6th include instruction A, instruction B, and instruction D, executed 3 times, 2 times, and 1 time respectively. The positions corresponding to the 3 executions of instruction A are location 1, location 2, and location 3; the positions corresponding to the 2 executions of instruction B are location 2 and location 3; and the position corresponding to the single execution of instruction D is location 2.
Then, in obtaining the recommendation value for instruction A, if the reference value is 1000, the first value is 1000 × 3 = 3000. If the current position is location 4, and the distances from location 1, location 2, and location 3 to location 4 are d1, d2, and d3 respectively, the reference distances corresponding to instruction A are d1, d2, and d3, whose average is d4; the second value corresponding to instruction A is therefore d4, and the recommendation value of instruction A is 3000/d4. The recommendation values for instruction B and instruction D are calculated in the same manner. If instruction A has the largest recommendation value, instruction A is taken as the recommendation instruction, and the electronic device will use instruction A as the recommendation instruction at the start of the second time period of the 13th.
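The scoring just illustrated, recommendation value = (execution count × reference value) / average reference distance, with the highest-scoring to-be-processed instruction selected, can be sketched as below. The planar distance function and the default reference value of 1000 follow the example; the input shapes are assumptions.

```kotlin
import kotlin.math.sqrt

data class Position(val x: Double, val y: Double)

// Simplified planar distance; a real device would use a geodesic distance.
fun distance(a: Position, b: Position): Double =
    sqrt((a.x - b.x) * (a.x - b.x) + (a.y - b.y) * (a.y - b.y))

/**
 * positionsByInstruction: for each to-be-processed instruction, the positions
 * recorded at each of its executions in the corresponding historical time period.
 * Returns the instruction with the largest recommendation value, or null if empty.
 */
fun pickRecommendation(
    positionsByInstruction: Map<String, List<Position>>,
    currentPosition: Position,
    referenceValue: Double = 1000.0
): String? = positionsByInstruction.maxByOrNull { (_, positions) ->
    val count = positions.size                   // execution count
    val averageDistance = positions              // mean of the reference distances
        .map { distance(it, currentPosition) }
        .average()
    count * referenceValue / averageDistance     // recommendation value
}?.key
```

With the fig. 10 data, instruction A scores 3 × 1000 divided by the mean of d1, d2, and d3, i.e. 3000/d4, matching the calculation above.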
S220: and displaying the instruction content of the recommendation instruction.
S230: and executing the recommendation instruction by the digital voice assistant in response to a trigger operation acting on the instruction content.
With the device control method provided by this embodiment, the electronic device actively presents the instruction content of the recommendation instruction to the user, and a trigger operation acting directly on the instruction content can trigger the digital voice assistant to execute it, so that in scenarios where speaking is inconvenient the trigger operation replaces actual voice input and instructions can still be executed through the digital voice assistant, improving the user experience. In addition, in this embodiment the displayed recommendation instruction is obtained from the historical control instructions corresponding to the environment parameter in which the device is currently located, so the recommendation instruction better fits the usage habits of the user of the electronic device, the electronic device determines recommendation instructions more intelligently, and the user can control the electronic device more conveniently.
Referring to fig. 11, fig. 11 is a flowchart illustrating a device control method according to an embodiment of the present application. The method includes:
S310: Acquire a historical time period corresponding to the current time period.
S320: If control instructions were executed in the historical time period, take the control instructions correspondingly executed in that historical time period as the to-be-processed instructions.
S330: If no control instruction was executed in the historical time period, take the control instructions executed in the nearest subsequent historical time period in which control instructions were executed as the to-be-processed instructions.
It should be noted that the user may not use the electronic device during some time periods, so the electronic device may not execute any control instruction in those periods. To make it more likely that an instruction the user needs can still be recommended in every time period, the recommendation for the current time period may be determined from the control instructions executed in the nearest time period, following the corresponding historical time period, in which control instructions were executed.
For example, as shown in fig. 12, if the current time period is the second time period of the current day (the current date), the corresponding historical time period is the second time period of the corresponding date. If no control instruction was executed in the second time period of the corresponding date, the query starts from the next time period adjacent to it and continues until a time period in which control instructions were executed is found; that time period is then used as the historical time period. For instance, if the corresponding historical time period is the second time period of the corresponding date, the search starts from the adjacent third time period, and if control instructions were executed in the third time period of the corresponding date, the third time period is taken as the nearest subsequent historical time period in which control instructions were executed.
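A sketch of this forward search, scanning from the matched historical slot to the end of the day until a period that actually recorded executed instructions is found; `executedInstructionsIn` is a hypothetical lookup over the execution history.

```kotlin
import java.time.LocalDate

// Hypothetical lookup: the instructions recorded as executed in slot `slot`
// of `date`; returns an empty list when nothing was executed in that period.
typealias PeriodLookup = (date: LocalDate, slot: Int) -> List<String>

// Scan forward from the slot matching the current time period; the first
// non-empty period supplies the to-be-processed instructions (S320/S330).
fun toBeProcessedInstructions(
    correspondingDate: LocalDate,
    startSlot: Int,
    slotsPerDay: Int,
    executedInstructionsIn: PeriodLookup
): List<String> {
    for (slot in startSlot until slotsPerDay) {
        val executed = executedInstructionsIn(correspondingDate, slot)
        if (executed.isNotEmpty()) return executed
    }
    return emptyList()   // nothing executed for the rest of that day
}
```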
S340: and acquiring the execution times of the same instruction to be processed in the corresponding historical time period.
S350: and acquiring the distance between the position of the same instruction to be processed, which is executed each time in the corresponding historical time period, and the current position to obtain a plurality of reference distances.
S360: and determining a recommended instruction from the to-be-processed instructions based on the number of times corresponding to each to-be-processed instruction and the plurality of reference distances, wherein the recommended instruction is generated by the digital voice assistant.
For example, if the determination of the recommended instruction is made based on the control instruction in the third time period of the corresponding date shown in fig. 12, the determined recommended instruction may be instruction C.
S370: and displaying the instruction content of the recommendation instruction.
S380: and executing the recommendation instruction by the digital voice assistant in response to a trigger operation acting on the instruction content.
With the device control method provided by this embodiment, the electronic device actively presents the instruction content of the recommendation instruction to the user, and a trigger operation acting directly on the instruction content can trigger the digital voice assistant to execute it, so that in scenarios where speaking is inconvenient the trigger operation replaces actual voice input and instructions can still be executed through the digital voice assistant, improving the user experience. Moreover, in this embodiment, when no control instruction was executed in the historical time period corresponding to the current time period, the control instructions of the nearest subsequent historical time period in which control instructions were executed are used as the to-be-processed instructions, so the electronic device can display a recommendation instruction in every time period, improving the completeness of recommendation display.
Referring to fig. 13, fig. 13 is a flowchart illustrating a device control method according to an embodiment of the present application. The method includes:
S410: Determine a recommendation instruction based on the display scene in which the electronic device is located, where the recommendation instruction is generated by the digital voice assistant.
It should be noted that the control the user currently wants to perform on the electronic device may be related to the display scene in which the electronic device is currently located. For example, if the electronic device is currently running a travel-related application, the user may want to query the weather at a certain location or query a ticket to a certain destination. Accordingly, when the electronic device detects that it is currently in a travel content display scene, the determined recommendation instruction may be "query the weather of XX" or "query the train ticket to XX".
S420: and displaying the instruction content of the recommendation instruction.
S430: and executing the recommendation instruction by the digital voice assistant in response to the trigger operation acting on the instruction content.
It should be noted that the foregoing embodiments describe the case where, when the day is divided into a plurality of time periods, the historical time period corresponding to the current time period may contain no executed control instruction. As another way, in the approach of determining the recommendation instruction based on the historical time period corresponding to the current time period, when it is detected that no control instruction was executed in that historical time period, the recommendation instruction may instead be determined based on the display scene in which the electronic device is located. For example, if the current time period is the tenth time period of the current day, the corresponding historical time period is the tenth time period of the corresponding date. If control instructions were executed in the tenth time period of the corresponding date, the recommendation instruction is determined directly from those to-be-processed instructions; if not, the recommendation instruction may be determined based on the display scene in which the electronic device is located.
Furthermore, the foregoing embodiment takes the control instructions of the nearest subsequent time period in which control instructions were executed as the to-be-processed instructions when the historical time period corresponding to the current time period contains none. In one case, if the time difference between that subsequent time period and the historical time period is large, the displayed recommendation instruction may differ noticeably from the user's current needs. For example, the historical time period may be 7:00 to 7:15 while the nearest subsequent time period in which control instructions were executed is 17:00 to 17:15. Therefore, as one mode, when the difference between the nearest subsequent time period in which control instructions were executed and the historical time period is larger than a specified time difference, the recommendation instruction may be determined based on the display scene in which the electronic device is located. The time difference may be configured by the user, for example as 1 hour or 2 hours.
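The decision described here can be summarised as: fall back to the scene-based recommendation either when no later period that day contains executed instructions or when the matched fallback period is too far from the original historical period. A hedged sketch follows, with the gap measured in slots and the threshold assumed to be user-configurable.

```kotlin
enum class RecommendationSource { HISTORY, DISPLAY_SCENE }

/**
 * originalSlot: the historical slot matching the current time period.
 * fallbackSlot: the nearest later slot that day with executed instructions,
 *               or null if none was found.
 * maxSlotGap:   user-configurable threshold, e.g. 4 slots of 15 min ≈ 1 hour.
 */
fun chooseSource(originalSlot: Int, fallbackSlot: Int?, maxSlotGap: Int): RecommendationSource =
    if (fallbackSlot == null || fallbackSlot - originalSlot > maxSlotGap)
        RecommendationSource.DISPLAY_SCENE   // determine from the display scene
    else
        RecommendationSource.HISTORY         // use the fallback period's instructions
```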
With the device control method provided by this embodiment, the electronic device actively presents the instruction content of the recommendation instruction to the user, and a trigger operation acting directly on the instruction content can trigger the digital voice assistant to execute it, so that in scenarios where speaking is inconvenient the trigger operation replaces actual voice input and instructions can still be executed through the digital voice assistant, improving the user experience. In addition, in the embodiment of the present application the recommendation instruction can be determined according to the display scene of the electronic device, so the displayed recommendation instruction better matches the current use of the electronic device, which increases the probability that the recommendation instruction is triggered and raises the utilization rate of the digital voice assistant.
Referring to fig. 14, fig. 14 is a flowchart illustrating a device control method according to an embodiment of the present application. The method includes:
S510: Display instruction content of a recommendation instruction in a first area of a designated interface, where the recommendation instruction is generated by the digital voice assistant.
In one approach, the designated interface includes an interface of a card displayed in a screen of the electronic device. In this manner, referring back to fig. 2, the display area 11 shown in fig. 2 can be the interface of the card.
S520: and executing the recommendation instruction by the digital voice assistant in response to a trigger operation acting on the instruction content.
S530: and responding to the trigger operation of the first control in the second area of the designated interface, and performing voice content recognition.
It should be noted that, in some cases, if the user does not want to trigger the instruction content corresponding to the recommendation instruction but wants to input another instruction, the first control may be provided as an input interface in the designated interface, so that the user may conveniently input another control instruction in a voice manner.
S540: and executing an instruction corresponding to the identified voice content.
For example, taking the designated interface as the interface of a card, as shown in fig. 15, the instruction content "XX instruction" of the recommendation instruction is displayed in the first area of the card 20 and a first control 21 is displayed in the second area. If a trigger operation acting on the first control 21 is detected, the electronic device may start collecting voice and performing voice content recognition on it. For example, if the recognized voice content is "Saturday weather", the digital voice assistant may query Saturday's weather and display the result through interface 13. In this way, the two entrances, text instructions (instruction content of the recommendation instruction displayed as text) and voice interaction (recognized voice content), complement each other, which increases the utilization rate of the voice instruction card (such as card 20) and of the digital voice assistant.
As one mode, before executing the instruction corresponding to the recognized voice content, the method further includes: canceling the display of the instruction content of the recommendation instruction, and displaying the recognized voice content in the first area. For example, as shown in fig. 16, the recognized voice content "how is the weather on Sunday" may be displayed in the first area of the card 20 in the middle image of fig. 16, so that the instruction content of the original recommendation instruction is replaced by the recognized voice content.
As one mode, while executing the recommendation instruction or the instruction corresponding to the recognized voice content, the digital voice assistant may further display a second control, so that after a trigger operation acting on the second control is detected, voice content recognition continues and the instruction corresponding to the newly recognized voice content is executed. For example, as shown in fig. 17, the control 30 is the aforementioned second control; if a trigger operation acting on the control 30 is detected, recognition of voice content may continue and the instruction corresponding to the recognized voice content may be executed.
In the embodiment of the present application, the recommendation instruction may be displayed in the form of a card (for example, the aforementioned card 20), and the card for displaying recommendation instructions may be added via the desktop editing mode. For example, the left image in fig. 18 shows the desktop editing mode, in which a control named "add card" is displayed on the desktop. If a touch operation acting on the "add card" control is detected, the electronic device displays the style shown in the middle image of fig. 18, which lists the types of cards that can be added to the desktop. If a touch operation on the card named "voice assistant" in the middle image of fig. 18 is detected, the electronic device displays the style shown in the right image of fig. 18, in which an interface of the selected card is shown on the desktop together with controls for further operation, including a control named "cancel" and a control named "add". If a touch operation on the "add" control is detected, the electronic device adds the card corresponding to the voice assistant (e.g., the aforementioned card 20) to the desktop.
Moreover, a card added to the desktop can be moved to the minus one screen of the electronic device by dragging. Illustratively, as shown in fig. 19, the card 20 can be dragged in the direction of the dotted arrow in the left image of fig. 19 to the position shown in the middle image. While the card 20 is held at that position, the electronic device switches interfaces until the minus one screen interface 40 shown in the right image of fig. 19 is displayed. After the minus one screen interface 40 is displayed, the card 20 can be dragged further and placed at the position shown in the right image of fig. 19, so that the card 20 is dragged from the interface storing the application icons to the minus one screen.
Correspondingly, a card dragged to the minus one screen can also be moved back to the interface storing the application icons. For example, as shown in fig. 20, the card 20 on the minus one screen can be dragged in the direction of the dotted arrow in the left image of fig. 20 to the position shown in the middle image. While the card 20 is held at that position, the electronic device switches the displayed interface to the desktop 10 (the interface storing the application icons) shown in the right image of fig. 20; after the desktop 10 is displayed, the card 20 can be dragged further and placed at the desired position on the desktop 10, so that the card 20 is dragged from the minus one screen back to the desktop 10.
Furthermore, a card displayed for showing recommendation instructions can be removed from the desktop or the minus one screen. For example, a long press on the card may trigger a floating interface in which a control for removing the card is displayed. As shown in fig. 21, when the card 20 is displayed on the desktop 10, if a long-press operation on the card 20 is detected, a floating interface 41 may be displayed containing a control named "remove"; if a touch operation acting on the "remove" control is detected, the card 20 is removed from the desktop 10. Correspondingly, as shown in fig. 22, when the card 20 is displayed on the minus one screen interface 40, if a long-press operation on the card 20 is detected, the floating interface 41 with the "remove" control may be displayed, and if a touch operation acting on that control is detected, the card 20 is removed from the minus one screen interface 40.
With the device control method provided by this embodiment, the electronic device actively presents the instruction content of the recommendation instruction to the user, and a trigger operation acting directly on the instruction content can trigger the digital voice assistant to execute it, so that in scenarios where speaking is inconvenient the trigger operation replaces actual voice input and instructions can still be executed through the digital voice assistant, improving the user experience. In addition, in this embodiment, besides the instruction content of the recommendation instruction, a first control is displayed in the second area of the designated interface, so the user can trigger voice content recognition through the first control. The electronic device thus provides instruction recommendation and an entrance for receiving the user's own input at the same time, which helps increase the utilization rate of the digital voice assistant.
Referring to fig. 23, fig. 23 is a block diagram of a device control apparatus according to an embodiment of the present application, where the apparatus includes:
An instruction display unit 510, configured to display the instruction content of a recommendation instruction, where the recommendation instruction is generated by the digital voice assistant.
An instruction execution unit 520, configured to execute the recommendation instruction through the digital voice assistant in response to a trigger operation acting on the instruction content.
In one mode, the instruction display unit 510 is further configured to obtain the recommendation instruction based on historical control instructions corresponding to the current environment parameter of the device. Optionally, the environment parameter of the device includes a time period, and the instruction display unit 510 is specifically configured to obtain a historical time period corresponding to the current time period and take the control instructions correspondingly executed in that historical time period as instructions to be processed; obtain the number of times each same instruction to be processed was executed in the corresponding historical time period; obtain, for each execution of the same instruction to be processed in the corresponding historical time period, the distance between the position at which it was executed and the current position, to obtain a plurality of reference distances;
and determine the recommendation instruction from the instructions to be processed based on the number of times corresponding to each instruction to be processed and the plurality of reference distances.
Optionally, the instruction display unit 510 is specifically configured to obtain a recommendation value corresponding to each instruction to be processed, where the recommendation value is the ratio of a first value to a second value, the first value is the product of the number of times and a reference value, and the second value is the average of the plurality of reference distances;
and take the instruction to be processed with the largest recommendation value as the recommendation instruction.
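A minimal sketch of this scoring scheme, under the assumption of a Python-style representation (the names PendingInstruction and reference_value, and the use of Euclidean distance, are illustrative choices rather than definitions from this application), could look like this:

```python
from dataclasses import dataclass
from typing import List, Optional
import math


@dataclass
class PendingInstruction:
    """A control instruction executed in the matching historical time period."""
    name: str
    execution_count: int                # number of times executed in that period
    execution_positions: List[tuple]    # position of each execution, e.g. (x, y)


def distance(a: tuple, b: tuple) -> float:
    """Euclidean distance between two positions (one possible reading of 'distance')."""
    return math.hypot(a[0] - b[0], a[1] - b[1])


def recommend(instructions: List[PendingInstruction],
              current_position: tuple,
              reference_value: float = 1.0) -> Optional[PendingInstruction]:
    """Return the instruction to be processed with the largest recommendation value.

    recommendation value = (execution_count * reference_value)
                           / mean(reference distances to the current position)
    """
    best, best_score = None, float("-inf")
    for instr in instructions:
        if not instr.execution_positions:
            continue
        ref_distances = [distance(p, current_position) for p in instr.execution_positions]
        mean_distance = sum(ref_distances) / len(ref_distances)
        # Guard against a zero mean distance (every execution at the current position).
        score = (instr.execution_count * reference_value) / max(mean_distance, 1e-6)
        if score > best_score:
            best, best_score = instr, score
    return best
```

In this sketch, a larger execution count and a smaller average reference distance both raise the recommendation value, matching the ratio of the first value to the second value defined above.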
Optionally, the instruction display unit 510 is specifically configured to obtain the historical time period corresponding to the current time period; if a control instruction was executed in that historical time period, take the control instruction correspondingly executed in that historical time period as the instruction to be processed; and if no control instruction was executed in that historical time period, take the control instruction correspondingly executed in an adjacent historical time period in which a control instruction was executed as the instruction to be processed.
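The fallback to an adjacent historical time period can likewise be sketched as follows (the helper names and the daily time-period buckets are assumptions for illustration only):

```python
from typing import Dict, List

# Hypothetical history store: time-period label -> instructions executed in that
# period on previous days, e.g. {"07:00-08:00": ["play the news"], "08:00-09:00": []}
History = Dict[str, List[str]]


def pending_instructions(history: History,
                         periods: List[str],
                         current_period: str) -> List[str]:
    """Return the instructions to be processed for the current time period.

    Use the matching historical period if it has executed instructions;
    otherwise fall back to the nearest adjacent period that has any.
    """
    if history.get(current_period):
        return history[current_period]

    idx = periods.index(current_period)
    # Walk outward (1 period away, then 2, ...) until a period with
    # executed instructions is found.
    for offset in range(1, len(periods)):
        for neighbour in (idx - offset, idx + offset):
            if 0 <= neighbour < len(periods) and history.get(periods[neighbour]):
                return history[periods[neighbour]]
    return []
```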
Optionally, the environment parameter of the device includes at least one of a time period and a location.
In one mode, the instruction display unit 510 is specifically configured to display the instruction content of the recommendation instruction in a first area of a designated interface. In this mode, the instruction execution unit 520 is further specifically configured to perform voice content recognition in response to a trigger operation on a first control in a second area of the designated interface, and to execute the instruction corresponding to the recognized voice content. Optionally, the instruction display unit 510 is further specifically configured to cancel the display of the instruction content of the recommendation instruction before the instruction corresponding to the recognized voice content is executed, and to display the recognized voice content in the first area. The designated interface may include an interface of a card displayed in a screen of the electronic device.
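The division of the designated interface into a first area and a second area can be sketched as follows (a simplified model; class and method names such as RecommendationCard and recognize_speech are hypothetical and not defined in this application):

```python
class DigitalVoiceAssistant:
    """Stand-in for the assistant; real recognition and execution are out of scope here."""

    def recognize_speech(self) -> str:
        # On a real device this would run speech recognition on microphone input.
        return "turn on the flashlight"

    def execute(self, instruction: str) -> None:
        print(f"executing: {instruction}")


class RecommendationCard:
    """Models the designated interface: the first area shows text, the second area holds the first control."""

    def __init__(self, assistant: DigitalVoiceAssistant, recommendation: str):
        self.assistant = assistant
        self.first_area_text = recommendation   # instruction content of the recommendation instruction

    def on_touch_instruction_content(self) -> None:
        # A trigger operation on the instruction content replaces an actual voice input.
        self.assistant.execute(self.first_area_text)

    def on_touch_first_control(self) -> None:
        # The first control in the second area triggers voice content recognition.
        recognized = self.assistant.recognize_speech()
        # Cancel the display of the recommendation, show the recognized content in the
        # first area, then execute the corresponding instruction.
        self.first_area_text = recognized
        self.assistant.execute(recognized)


card = RecommendationCard(DigitalVoiceAssistant(), "play some music")
card.on_touch_instruction_content()   # executes the recommendation instruction directly
card.on_touch_first_control()         # recognizes speech, updates the card, executes
```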
With the device control apparatus described above, the instruction content of the recommendation instruction is displayed first, and the recommendation instruction is then executed by the digital voice assistant in response to a trigger operation acting on the instruction content. In this way, the electronic device can actively present the instruction content of the recommendation instruction to the user, and the digital voice assistant can be triggered to execute the recommendation instruction by a trigger operation acting directly on the instruction content. In a scene where it is inconvenient for the user to speak, the trigger operation on the instruction content can replace an actual voice input, so that instructions are still executed through the digital voice assistant and the user experience of the digital voice assistant is improved.
It should be noted that the apparatus embodiment in the present application corresponds to the foregoing method embodiment, and specific principles in the apparatus embodiment may refer to the contents in the foregoing method embodiment, which is not described herein again.
An electronic device provided by the present application will be described below with reference to fig. 24.
Referring to fig. 24, based on the foregoing device control method and apparatus, an embodiment of the present application further provides another electronic device 200 capable of executing the device control method. The electronic device 200 includes one or more processors 102 (only one is shown), a memory 104, and a network module 106, which are coupled to each other. The memory 104 stores a program that can execute the content of the foregoing embodiments, and the processor 102 can execute the program stored in the memory 104.
The memory 104 may include a random access memory (RAM) or a read-only memory (ROM). The memory 104 may be used to store instructions, programs, code sets, or instruction sets. The memory 104 may include a program storage area and a data storage area, where the program storage area may store instructions for implementing an operating system, instructions for implementing at least one function (such as a touch function, a sound playing function, or an image playing function), instructions for implementing the foregoing method embodiments, and the like. The data storage area may store data created by the electronic device 200 during use, such as a phonebook, audio and video data, and chat log data.
The network module 106 is configured to receive and transmit electromagnetic waves, and implement interconversion between the electromagnetic waves and electrical signals, so as to communicate with a communication network or other devices, for example, an audio playing device. The network module 106 may include various existing circuit elements for performing these functions, such as an antenna, a radio frequency transceiver, a digital signal processor, an encryption/decryption chip, a Subscriber Identity Module (SIM) card, memory, and so forth. The network module 106 may communicate with various networks, such as the internet, an intranet, a wireless network, or with other devices via a wireless network. The wireless network may comprise a cellular telephone network, a wireless local area network, or a metropolitan area network. For example, the network module 106 may interact with a base station.
The sensor module 108 may include at least one sensor. Specifically, the sensor module 108 may include, but is not limited to: light sensors, motion sensors, pressure sensors, infrared heat sensors, distance sensors, acceleration sensors, and other sensors.
The pressure sensor may detect pressure generated by pressing on the electronic device 200, that is, pressure generated by contact or pressing between the user and the electronic device, for example, contact or pressing between the user's ear and the electronic device 200. Accordingly, the pressure sensor may be used to determine whether contact or pressing has occurred between the user and the electronic device 200, as well as the magnitude of the pressure.
The acceleration sensor may detect the magnitude of acceleration in each direction (generally, three axes) and may detect the magnitude and direction of gravity when the device is stationary. It may be used for applications that recognize the attitude of the electronic device 200 (such as landscape/portrait switching, related games, and magnetometer attitude calibration) and for functions related to vibration recognition (such as a pedometer or tapping). In addition, the electronic device 200 may also be configured with other sensors such as a gyroscope, a barometer, a hygrometer, and a thermometer, which are not described herein again.
The camera 110 may include a color lens and an NIR lens. The color lens may be used to collect images in a specified color mode, and the NIR lens may be used to collect images in a near-infrared mode.
Referring to fig. 25, a block diagram of a computer-readable storage medium according to an embodiment of the present application is shown. The computer-readable medium 800 has stored therein a program code that can be called by a processor to execute the method described in the above-described method embodiments.
The computer-readable storage medium 800 may be an electronic memory such as a flash memory, an EEPROM (electrically erasable programmable read-only memory), an EPROM, a hard disk, or a ROM. Optionally, the computer-readable storage medium 800 includes a non-volatile computer-readable medium. The computer-readable storage medium 800 has storage space for program code 810 for performing any of the method steps described above. The program code can be read from or written into one or more computer program products. The program code 810 may, for example, be compressed in a suitable form.
According to the device control method, the device control apparatus, the electronic device, and the storage medium provided by the present application, the instruction content of the recommendation instruction is displayed first, and the recommendation instruction is then executed by the digital voice assistant in response to a trigger operation acting on the instruction content. In this way, the electronic device can actively present the instruction content of the recommendation instruction to the user, and the digital voice assistant can be triggered to execute the recommendation instruction by a trigger operation acting directly on the instruction content. In a scene where it is inconvenient for the user to speak, the trigger operation on the instruction content can replace an actual voice input, so that instructions are still executed through the digital voice assistant and the user experience of the digital voice assistant is improved.
In the description of this specification, reference to the terms "one embodiment," "some embodiments," "an example," "a specific example," or "some examples" means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, the schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. In addition, those skilled in the art may combine different embodiments or examples, and features of different embodiments or examples, described in this specification, provided that they do not contradict each other.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "plurality" means at least two, for example two or three, unless specifically limited otherwise.
Finally, it should be noted that the above embodiments are only used to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced, and such modifications and substitutions do not cause the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application.
Claims (12)
1. A device control method applied to an electronic device including a digital voice assistant, the method comprising:
displaying instruction content of a recommendation instruction, wherein the recommendation instruction is generated by the digital voice assistant;
and executing the recommendation instruction by the digital voice assistant in response to a trigger operation acting on the instruction content.
2. The method according to claim 1, wherein before the displaying of the instruction content of the recommendation instruction, the method further comprises:
obtaining the recommendation instruction based on a historical control instruction corresponding to a current environment parameter of the device.
3. The method according to claim 2, wherein the environment parameter of the device includes a time period, and the obtaining the recommendation instruction based on the historical control instruction corresponding to the current environment parameter of the device comprises:
acquiring a historical time period corresponding to the current time period, and taking a control instruction correspondingly executed in the historical time period as a to-be-processed instruction;
acquiring the number of times the same instruction to be processed was executed in the corresponding historical time period;
acquiring, for each execution of the same instruction to be processed in the corresponding historical time period, the distance between the position at which the instruction was executed and the current position, to obtain a plurality of reference distances;
and determining the recommendation instruction from the instructions to be processed based on the number of times corresponding to each instruction to be processed and the plurality of reference distances.
4. The method according to claim 3, wherein the determining the recommendation instruction from the instructions to be processed based on the number of times corresponding to each instruction to be processed and the plurality of reference distances comprises:
acquiring a recommendation value corresponding to each instruction to be processed, wherein the recommendation value is a ratio of a first value to a second value, the first value is a product of the number of times and a reference value, and the second value is an average value of the plurality of reference distances;
and taking the instruction to be processed with the largest recommendation value as the recommendation instruction.
5. The method according to claim 3, wherein the acquiring a historical time period corresponding to the current time period and taking a control instruction correspondingly executed in the historical time period as a to-be-processed instruction comprises:
acquiring a historical time period corresponding to the current time period;
if a control instruction was executed in the historical time period, taking the control instruction correspondingly executed in the historical time period as the instruction to be processed;
and if no control instruction was executed in the historical time period, taking a control instruction correspondingly executed in an adjacent historical time period in which a control instruction was executed as the instruction to be processed.
6. The method of claim 2, wherein the environmental parameter of the device comprises at least one of a time period and a location.
7. The method according to any one of claims 1 to 6, wherein the displaying of the instruction content of the recommendation instruction comprises: displaying the instruction content of the recommendation instruction in a first area of a designated interface; and the method further comprises:
performing voice content recognition in response to a trigger operation on a first control in a second area of the designated interface;
and executing the instruction corresponding to the recognized voice content.
8. The method according to claim 7, wherein before the executing of the instruction corresponding to the recognized voice content, the method further comprises:
canceling the display of the instruction content of the recommendation instruction, and displaying the recognized voice content in the first area.
9. The method according to claim 7, wherein the designated interface comprises an interface of a card displayed in a screen of the electronic device.
10. A device control apparatus, applied to an electronic device including a digital voice assistant, the apparatus comprising:
an instruction display unit, configured to display instruction content of a recommendation instruction, wherein the recommendation instruction is generated by the digital voice assistant;
and an instruction execution unit, configured to execute the recommendation instruction through the digital voice assistant in response to a trigger operation acting on the instruction content.
11. An electronic device, comprising one or more processors and a memory, wherein one or more programs are stored in the memory and configured to be executed by the one or more processors to perform the method of any one of claims 1-9.
12. A computer-readable storage medium, wherein program code is stored in the computer-readable storage medium, and the program code, when executed by a processor, performs the method of any one of claims 1-9.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110601591.8A CN115421682A (en) | 2021-05-31 | 2021-05-31 | Device control method, device, electronic device and storage medium |
PCT/CN2022/088704 WO2022252872A1 (en) | 2021-05-31 | 2022-04-24 | Device control method and apparatus, electronic device, and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110601591.8A CN115421682A (en) | 2021-05-31 | 2021-05-31 | Device control method, device, electronic device and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115421682A (en) | 2022-12-02 |
Family
ID=84230527
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110601591.8A | Device control method, device, electronic device and storage medium | 2021-05-31 | 2021-05-31 |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN115421682A (en) |
WO (1) | WO2022252872A1 (en) |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10380208B1 (en) * | 2015-12-28 | 2019-08-13 | Amazon Technologies, Inc. | Methods and systems for providing context-based recommendations |
CN110647274A (en) * | 2019-08-15 | 2020-01-03 | 华为技术有限公司 | Interface display method and equipment |
CN116564304A (en) * | 2019-09-30 | 2023-08-08 | 华为终端有限公司 | Voice interaction method and device |
Also Published As
Publication number | Publication date |
---|---|
WO2022252872A1 (en) | 2022-12-08 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||