CN112860169B - Interaction method and device, computer readable medium and electronic equipment


Info

Publication number
CN112860169B
Authority
CN
China
Prior art keywords
gesture control
control mode
acquisition unit
image acquisition
preset
Prior art date
Legal status
Active
Application number
CN202110190469.6A
Other languages
Chinese (zh)
Other versions
CN112860169A (en)
Inventor
姬晨曦
黎瀚
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202110190469.6A
Publication of CN112860169A
Application granted
Publication of CN112860169B


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/28Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00Reducing energy consumption in communication networks
    • Y02D30/70Reducing energy consumption in communication networks in wireless communication networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Social Psychology (AREA)
  • Psychiatry (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure provides an interaction method and device, a computer readable medium, and an electronic device, relating to the technical field of human-computer interaction. The method includes: in response to detecting the activation timing of a preset gesture control mode, starting an image acquisition unit matched with the preset gesture control mode; acquiring a detected gesture image through the image acquisition unit, and matching a corresponding gesture control instruction according to the gesture image; and executing a corresponding control function according to the gesture control instruction. With the method and device, a terminal device equipped with only an ordinary camera can support air-gesture (touch-free) interaction; at the same time, power consumption is effectively reduced for terminal devices that lack a low-power always-on camera, and battery life is improved.

Description

Interaction method and device, computer readable medium and electronic equipment
Technical Field
The disclosure relates to the technical field of human-computer interaction, and in particular to an interaction method, an interaction device, a computer readable medium, and an electronic device.
Background
Air-gesture interaction is an interaction mode that conveys information through hand movements or hand postures, without touching the device. With the development of gesture recognition technologies based on computer vision, sensors, myoelectric recognition, infrared sensing, and the like, gesture interaction is gradually entering everyday life. From object control in virtual or augmented reality, to interface interaction on computers, mobile phones, and wearable devices, to function control of smart products, smart TVs, and in-vehicle devices, gesture interaction has been woven into the interaction between people, products, and interfaces in complementary ways, owing to its advantages of being natural and contact-free.
At present, for a terminal device to support air-gesture interaction, a front-facing low-power AON (Always-On) camera must be installed in addition to the ordinary camera. However, most terminal devices have no AON camera, and installing one increases hardware cost; meanwhile, if air-gesture interaction is implemented with the ordinary camera alone, keeping that camera permanently on incurs very high power consumption on the terminal device.
Disclosure of Invention
The disclosure aims to provide an interaction method, an interaction device, a computer readable medium, and an electronic device, so as to overcome, at least to some extent, the problem in the related art that an AON camera must be installed to support air-gesture interaction, which raises the hardware cost of the terminal device.
According to a first aspect of the present disclosure, there is provided an interaction method comprising:
in response to detecting the activation timing of a preset gesture control mode, starting an image acquisition unit matched with the preset gesture control mode;
acquiring a detected gesture image through the image acquisition unit, and matching a corresponding gesture control instruction according to the gesture image;
and executing a corresponding control function according to the gesture control instruction.
According to a second aspect of the present disclosure, there is provided an interaction device comprising:
a gesture control starting module, configured to start an image acquisition unit matched with a preset gesture control mode in response to detecting the activation timing of the preset gesture control mode;
a gesture control instruction matching module, configured to acquire a detected gesture image through the image acquisition unit and match a corresponding gesture control instruction according to the gesture image;
and a control module, configured to execute a corresponding control function according to the gesture control instruction.
According to a third aspect of the present disclosure, there is provided a computer readable medium having stored thereon a computer program which, when executed by a processor, implements the method described above.
According to a fourth aspect of the present disclosure, there is provided an electronic device, comprising:
a processor; and
a memory for storing one or more programs which, when executed by the processor, cause the processor to implement the method described above.
According to the interaction method provided by the embodiments of the disclosure, an image acquisition unit matched with a preset gesture control mode is started in response to detecting the activation timing of the preset gesture control mode; a detected gesture image is acquired through the image acquisition unit, and a corresponding gesture control instruction is matched according to the gesture image; and a corresponding control function is executed according to the gesture control instruction. First, the corresponding image acquisition unit is started to acquire gesture images only when the activation timing of the preset gesture control mode is detected, which effectively avoids the high power consumption caused by keeping the image acquisition unit permanently on. Second, the gesture control function is implemented through the started image acquisition unit without additionally installing a low-power always-on camera, which effectively reduces the hardware cost of the terminal device. Third, the acquired gesture image is matched to a corresponding gesture control instruction, and the control function is then executed through the matched instruction, which improves both the response speed of gesture control and the accuracy of gesture recognition.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure. It will be apparent to those of ordinary skill in the art that the drawings in the following description are merely examples of the disclosure and that other drawings may be derived from them without undue effort. In the drawings:
FIG. 1 illustrates a schematic diagram of an exemplary system architecture to which embodiments of the present disclosure may be applied;
FIG. 2 shows a schematic diagram of an electronic device to which embodiments of the present disclosure may be applied;
FIG. 3 schematically illustrates a flow chart of one method of interaction in an exemplary embodiment of the present disclosure;
FIG. 4 schematically illustrates a flowchart of actively activating the image acquisition unit corresponding to the preset gesture control mode in an exemplary embodiment of the present disclosure;
FIG. 5 schematically illustrates another flowchart of actively activating the image acquisition unit corresponding to the preset gesture control mode in an exemplary embodiment of the present disclosure;
FIG. 6 schematically illustrates a flowchart of passively deactivating the image acquisition unit corresponding to the preset gesture control mode in an exemplary embodiment of the present disclosure;
FIG. 7 schematically illustrates a block diagram of an interaction device in an exemplary embodiment of the present disclosure;
FIG. 8 schematically illustrates a block diagram of another interaction device in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in software or in one or more hardware modules or integrated circuits or in different networks and/or processor devices and/or microcontroller devices.
FIG. 1 illustrates a schematic diagram of a system architecture of an exemplary application environment in which the interaction methods and apparatus of embodiments of the present disclosure may be applied.
As shown in fig. 1, the system architecture 100 may include one or more of the terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 is used as a medium to provide communication links between the terminal devices 101, 102, 103 and the server 105. The network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others. The terminal devices 101, 102, 103 may be various electronic devices with image acquisition units including, but not limited to, desktop computers, portable computers, smart phones, tablet computers, and the like. It should be understood that the number of terminal devices, networks and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation. For example, the server 105 may be a server cluster formed by a plurality of servers.
The interaction method provided by the embodiments of the present disclosure is generally performed on the terminal devices 101, 102, 103, and accordingly, the interaction device is generally disposed in the terminal devices 101, 102, 103. However, it will be readily understood by those skilled in the art that the interaction method may also be performed by the server 105, with the interaction device disposed in the server 105 accordingly, which is not specifically limited in this exemplary embodiment. For example, in an exemplary embodiment, the image acquisition unit started by the terminal device 101, 102, 103 may acquire a gesture image and upload it to the server 105; after generating the gesture control instruction through the interaction method provided by the embodiments of the present disclosure, the server transmits the instruction back to the terminal device 101, 102, 103, and so on.
The exemplary embodiments of the present disclosure provide an electronic device for implementing an interaction method, which may be the terminal device 101, 102, 103 or the server 105 in fig. 1. The electronic device comprises at least a processor and a memory for storing executable instructions of the processor, the processor being configured to perform the interaction method via execution of the executable instructions.
The configuration of the electronic device will be exemplarily described below taking the mobile terminal 200 of fig. 2 as an example. It will be appreciated by those skilled in the art that, apart from components intended specifically for mobile use, the configuration of fig. 2 can also be applied to stationary devices. In other embodiments, mobile terminal 200 may include more or fewer components than illustrated, combine certain components, split certain components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware. The interfacing relationship between the components is shown schematically only and does not constitute a structural limitation of the mobile terminal 200. In other embodiments, the mobile terminal 200 may also employ an interface arrangement different from that of fig. 2, or a combination of interface arrangements.
As shown in fig. 2, the mobile terminal 200 may specifically include: processor 210, internal memory 221, external memory interface 222, universal serial bus (Universal Serial Bus, USB) interface 230, charge management module 240, power management module 241, battery 242, antenna 1, antenna 2, mobile communication module 250, wireless communication module 260, audio module 270, speaker 271, receiver 272, microphone 273, headset interface 274, sensor module 280, display screen 290, camera module 291, indicator 292, motor 293, keys 294, and subscriber identity module (subscriber identification module, SIM) card interface 295, and the like. Wherein the sensor module 280 may include a depth sensor 2801, a pressure sensor 2802, a gyro sensor 2803, and the like.
Processor 210 may include one or more processing units such as, for example: the processor 210 may include an application processor (Application Processor, AP), a modem processor, a graphics processor (Graphics Processing Unit, GPU), an image signal processor (Image Signal Processor, ISP), a controller, a video codec, a digital signal processor (Digital Signal Processor, DSP), a baseband processor, and/or a Neural network processor (Neural-Network Processing Unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The NPU is a Neural-Network (NN) computing processor, and can rapidly process input information by referencing a biological Neural Network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent awareness of the mobile terminal 200 may be implemented by the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The processor 210 has a memory disposed therein. The memory may store instructions for implementing six modular functions: detection instructions, connection instructions, information management instructions, analysis instructions, data transfer instructions, and notification instructions, and are controlled to be executed by the processor 210.
The charge management module 240 is configured to receive a charge input from a charger. The power management module 241 is used for connecting the battery 242, the charge management module 240 and the processor 210. The power management module 241 receives input from the battery 242 and/or the charge management module 240 and provides power to the processor 210, the internal memory 221, the display 290, the camera module 291, the wireless communication module 260, and the like.
The wireless communication function of the mobile terminal 200 may be implemented by the antenna 1, the antenna 2, the mobile communication module 250, the wireless communication module 260, a modem processor, a baseband processor, and the like. Wherein the antenna 1 and the antenna 2 are used for transmitting and receiving electromagnetic wave signals; the mobile communication module 250 may provide a solution including 2G/3G/4G/5G wireless communication applied to the mobile terminal 200; the modem processor may include a modulator and a demodulator; the wireless communication module 260 may provide solutions for wireless communication including wireless local area network (Wireless Local Area Networks, WLAN) (e.g., wireless fidelity (Wireless Fidelity, wi-Fi) network), bluetooth (BT), etc., as applied on the mobile terminal 200. In some embodiments, antenna 1 and mobile communication module 250 of mobile terminal 200 are coupled, and antenna 2 and wireless communication module 260 are coupled, so that mobile terminal 200 may communicate with a network and other devices through wireless communication techniques.
The mobile terminal 200 implements display functions through a GPU, a display screen 290, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display screen 290 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 210 may include one or more GPUs that execute program instructions to generate or change display information.
The mobile terminal 200 may implement a photographing function through an ISP, a camera module 291, a video codec, a GPU, a display screen 290, an application processor, and the like. The ISP is used for processing the data fed back by the camera module 291; the camera module 291 is used for capturing still images or videos; the digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals; video codec is used to compress or decompress digital video, and the mobile terminal 200 may also support one or more video codecs.
The external memory interface 222 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the mobile terminal 200. The external memory card communicates with the processor 210 via an external memory interface 222 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 221 may be used to store computer executable program code that includes instructions. The internal memory 221 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data (e.g., audio data, phonebook, etc.) created during use of the mobile terminal 200, and the like. In addition, the internal memory 221 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (Universal Flash Storage, UFS), and the like. The processor 210 performs various functional applications of the mobile terminal 200 and data processing by executing instructions stored in the internal memory 221 and/or instructions stored in a memory provided in the processor.
The mobile terminal 200 may implement audio functions through an audio module 270, a speaker 271, a receiver 272, a microphone 273, an earphone interface 274, an application processor, and the like. Such as music playing, recording, etc.
The depth sensor 2801 is used to acquire depth information of a scene. In some embodiments, a depth sensor may be provided at the camera module 291.
The pressure sensor 2802 is used to sense a pressure signal, and may convert the pressure signal into an electrical signal. In some embodiments, pressure sensor 2802 may be disposed on display 290. The pressure sensor 2802 is of various types, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like.
The gyro sensor 2803 may be used to determine the motion posture of the mobile terminal 200. In some embodiments, the angular velocity of mobile terminal 200 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 2803. The gyro sensor 2803 can be used in scenarios such as photographic anti-shake, navigation, and motion-sensing games.
In addition, sensors for other functions, such as an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, etc., may be provided in the sensor module 280 according to actual needs.
Other devices that provide auxiliary functionality may also be included in mobile terminal 200. For example, the keys 294 include a power-on key, a volume key, etc., by which a user can generate key signal inputs related to user settings and function controls of the mobile terminal 200. As another example, indicator 292, motor 293, SIM card interface 295, and the like.
The following describes an interaction method and an interaction device according to exemplary embodiments of the present disclosure in detail.
Fig. 3 shows a flow of an interaction method in the present exemplary embodiment, including the following steps S310 to S330:
in step S310, in response to detecting the turn-on timing of the preset gesture control mode, an image acquisition unit matched with the preset gesture control mode is started.
In an exemplary embodiment, the preset gesture control mode refers to a gesture control configuration preset by the user. For example, it may include the mapping between different gestures and gesture control instructions set by the user, the designated image acquisition unit for acquiring gesture images (such as a front-facing or rear-facing image acquisition unit), the timeout after which the image acquisition unit is automatically (passively) closed, the different ways of activating the preset gesture control mode, and so on. In other words, the user may preset the various options of the gesture control function to form a customized preset gesture control mode, which is not limited in this exemplary embodiment; a configuration sketch is given below. When the activation timing of the preset gesture control mode is detected, the terminal device can be controlled according to the parameters set in the preset gesture control mode.
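As an illustration only, the user-configurable options enumerated above can be pictured as a small configuration object. The following Kotlin sketch is not from the patent; every name in it (GestureModeConfig, CaptureUnit, GestureInstruction, the sample mappings and default values) is an assumption made for clarity.

```kotlin
// Illustrative sketch of a user-defined preset gesture control mode.
// All names, fields, and default values are assumptions, not patent terms.

enum class CaptureUnit { FRONT_CAMERA, REAR_CAMERA, EXTERNAL_CAMERA }

enum class GestureInstruction { PAUSE, RETURN, SLIDE_DOWN, PAGE_DOWN, ANSWER_CALL, HANG_UP }

data class GestureModeConfig(
    // Mapping between gestures (here: recognized labels) and control instructions.
    val gestureToInstruction: Map<String, GestureInstruction> = mapOf(
        "palm_front" to GestureInstruction.PAUSE,
        "palm_swing_down" to GestureInstruction.SLIDE_DOWN
    ),
    // The designated image acquisition unit(s) for acquiring gesture images.
    val captureUnits: List<CaptureUnit> = listOf(CaptureUnit.FRONT_CAMERA),
    // Timeout after which the started unit is automatically (passively) closed.
    val idleTimeoutMillis: Long = 10L * 60 * 1000,
    // Keywords the voice assistant treats as an activation command.
    val activationKeywords: Set<String> = setOf("open", "air gesture")
)
```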
The image acquisition unit may be any image acquisition unit that is not a low-power always-on unit. For example, it may be an ordinary camera built into the terminal device (such as a front-facing or rear-facing camera), or an external ordinary camera communicatively connected to the terminal device (over a wired or wireless connection); of course, it may also be another non-always-on unit that assists the terminal device in acquiring gesture images, which is not limited in this exemplary embodiment.
When the activation timing of the preset gesture control mode is detected, the corresponding image acquisition unit is called according to the settings in the preset gesture control mode. For example, if the image acquisition unit set in the preset gesture control mode is the front-facing camera, the front-facing camera is started for image acquisition when the activation timing is detected; if the set unit is the rear-facing camera, the rear-facing camera is started instead. Of course, this is only a schematic illustration: the preset gesture control mode may also combine the front-facing, rear-facing, and external cameras for image acquisition, which is not specifically limited in this exemplary embodiment. A possible selection routine is sketched below.
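Continuing the illustrative GestureModeConfig sketch above, step S310 might then start whichever units the user designated. startCamera below is a hypothetical platform hook (on Android it could wrap the Camera2 or CameraX APIs), not something the patent defines.

```kotlin
// Hypothetical sketch of step S310: when the activation timing is detected,
// start every image acquisition unit designated in the preset mode.
fun onActivationTimingDetected(config: GestureModeConfig) {
    for (unit in config.captureUnits) {
        startCamera(unit)
    }
}

// Stub standing in for the platform camera API.
fun startCamera(unit: CaptureUnit) {
    println("starting $unit for gesture image acquisition")
}
```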
In step S320, a detected gesture image is acquired through the image acquisition unit, and a corresponding gesture control instruction is matched according to the gesture image.
In an exemplary embodiment, the gesture image may refer to an image containing a hand posture acquired by the image acquisition unit. For example, it may be an image containing the front or back of the palm (that is, the palm or the back of the hand facing the camera), an image set containing a gesture motion, or another image containing a hand posture or motion, which is not limited in this exemplary embodiment.
After the image acquisition unit is started, it can continuously monitor the current environment; when an image containing hand features is detected, the corresponding gesture image (a static gesture) or gesture image set (a dynamic gesture) is captured and written to a cache for subsequent processing.
The gesture control instruction may be a preset control instruction triggered by a gesture. For example, if a gesture image containing the front of the palm is detected, the corresponding instruction may be pause or return; if an image set containing the palm swinging up and down is detected, the corresponding instruction may be slide down or page down. Of course, this is merely illustrative: the mapping between a specific gesture image and a gesture control instruction differs according to the preset gesture control mode set by the user, and is not specifically limited here.
In step S330, a corresponding control function is executed according to the gesture control instruction.
In an exemplary embodiment, the control function may be the human-computer interaction function corresponding to the gesture control instruction. For example, if the instruction is a pause or return instruction, the pause or return function of the current interface is triggered accordingly; if the instruction is a slide-down or page-down instruction, the slide-down or page-down function is triggered in the current interface. Of course, this is only a schematic illustration and should not impose any special limitation on this exemplary embodiment; a combined sketch of steps S320 and S330 is given below.
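Steps S320 and S330 reduce to a lookup followed by a dispatch. In the sketch below, gestureLabel stands for whatever label the (unspecified) gesture recognizer assigns to the cached gesture image or image set; the recognizer itself, and all names here, are assumptions, not the patent's method.

```kotlin
// Hypothetical sketch of steps S320-S330: match the recognized gesture label
// against the user's mapping, then execute the matched control instruction.
fun onGestureImageCaptured(config: GestureModeConfig, gestureLabel: String) {
    val instruction = config.gestureToInstruction[gestureLabel] ?: return // no match: ignore
    execute(instruction)
}

// Each branch stands in for the actual control function of the current interface.
fun execute(instruction: GestureInstruction) = when (instruction) {
    GestureInstruction.PAUSE -> println("pausing in the current interface")
    GestureInstruction.RETURN -> println("navigating back")
    GestureInstruction.SLIDE_DOWN -> println("sliding the page down")
    GestureInstruction.PAGE_DOWN -> println("jumping to the next page")
    GestureInstruction.ANSWER_CALL -> println("answering the incoming call")
    GestureInstruction.HANG_UP -> println("rejecting the incoming call")
}
```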
Steps S310 to S330 are explained in detail below.
In an exemplary embodiment, the activation timing may include an active activation timing, which refers to the timing at which the user actively turns on the preset gesture control mode through a related operation. For example, it may be the timing at which the user turns on the preset gesture control mode through a voice command, or through a preset trigger control; of course, it may also be another timing at which the user actively turns on the preset gesture control mode, which is not specifically limited in this exemplary embodiment.
Specifically, whether the active activation timing occurs may be determined through the steps in fig. 4. Referring to fig. 4, the method may specifically include:
Step S410, acquiring a voice command received by a voice assistant;
Step S420, if the voice command is detected to contain a target activation keyword, determining that the active activation timing of the preset gesture control mode is detected, and starting the image acquisition unit matched with the preset gesture control mode.
The voice assistant is an intelligent application that interacts through intelligent dialogue and instant question answering. It receives the user's speech, translates it into a voice command the machine can recognize, and then implements the corresponding function through the recognized command. The voice assistant can be woken up by a voice command containing a wake-up keyword, after which it listens for voice commands.
The target activation keyword may be a keyword in a voice command that indicates turning on the preset gesture control mode. For example, if the input voice command is "please help me turn on air gestures", the target activation keywords may be "turn on" and "air gestures"; if such keywords are detected in the voice command, the command can be regarded as one for turning on the preset gesture control mode, and it is determined that the active activation timing of the preset gesture control mode is detected. Of course, the input command may also be "I want to interact through gestures", with the target activation keywords "gesture" and "interact". The specific voice commands for turning on the preset gesture control mode differ according to the preset lexicon or the user's customized commands, and this embodiment is not specifically limited in this respect; a keyword check is sketched below.
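A minimal keyword check along the lines of step S420 might look as follows; the substring test and the sample commands are illustrative assumptions (a real implementation would presumably use the assistant's intent recognition rather than raw substring matching).

```kotlin
// Minimal sketch of step S420: a transcribed voice command counts as an
// activation request when it contains any configured activation keyword.
fun isActivationCommand(transcript: String, keywords: Set<String>): Boolean =
    keywords.any { transcript.contains(it, ignoreCase = true) }

fun main() {
    val config = GestureModeConfig() // illustrative defaults from the sketch above
    println(isActivationCommand("please help me open air gestures", config.activationKeywords)) // true
    println(isActivationCommand("what is the weather today", config.activationKeywords)) // false
}
```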
In an exemplary embodiment, whether the active activation timing occurs may also be determined through the steps in fig. 5. Referring to fig. 5, the method may specifically include:
Step S510, generating a control instruction according to a trigger operation on a target control;
Step S520, if the control instruction is detected to be a gesture control activation instruction, determining that the active activation timing of the preset gesture control mode is detected, and starting the image acquisition unit matched with the preset gesture control mode.
The target control may be a preset control for triggering the preset gesture control mode on or off. For example, it may be a shortcut button in the navigation bar, a shortcut command button in the voice assistant that generates the corresponding voice command with one tap, or a preset combination of touch keys (such as the combination of the return key, menu key, and home key, or the combination of a volume key and the power key; the form of the key combination is not limited in this exemplary implementation). Of course, other controls for triggering the preset gesture control mode on or off may also be used.
If the control instruction generated by the trigger operation on the target control is detected to be a gesture control activation instruction for turning on the preset gesture control mode, the active activation timing of the preset gesture control mode is detected; the image acquisition unit matched with the preset gesture control mode is then started, and the gesture control function is implemented based on that unit.
In an exemplary embodiment, the activation timing includes a passive activation timing, at which the terminal device automatically turns on the preset gesture control mode because a related event is triggered. For example, the passive activation timing may be the timing at which the terminal device receives an incoming call or a message; of course, it may also be another timing at which the terminal device automatically turns on the preset gesture control mode due to a triggered event, which is not limited in this exemplary embodiment.
Specifically, if a passive trigger event that triggers a screen-on instruction is detected, the screen-on instruction is executed, and the screen-on instruction triggered by the passive trigger event serves as the passive activation timing of the preset gesture control mode.
A passive trigger event is an event triggered by the terminal device itself without the user's active participation. For example, it may be an incoming-call event, in which the current terminal device receives a communication request from another terminal device, or a message-receiving event, in which the current terminal device receives information sent by another terminal device; of course, it may also be another self-triggered event such as an alarm reminder event, and this exemplary embodiment is not specifically limited in this respect.
The screen-on instruction triggered by a passive trigger event is a passive screen-on instruction used to notify the user of that event without the user's active participation. For example, when an incoming-call event, a message-receiving event, or an alarm reminder event is detected, a passive screen-on instruction is triggered and serves as the passive activation timing of the preset gesture control mode; when this timing is detected, the image acquisition unit corresponding to the preset gesture control mode is started, implementing the gesture control function.
When a passive trigger event is detected, the user is probably not right next to the terminal device. Starting the gesture control function passively, without requiring the user to start it actively, can effectively improve interaction efficiency and the efficiency of handling the passive trigger event: on an incoming call, the user can quickly answer or hang up through an air gesture; on receiving a message, the user can quickly open and browse it the same way. A sketch of this event path is given below.
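The passive path can be pictured as follows, reusing onActivationTimingDetected from the earlier sketch; the event names and the wakeScreen stub are assumptions, not patent terminology.

```kotlin
// Hypothetical sketch of passive activation: a passive trigger event raises
// a screen-on instruction, which doubles as the passive activation timing.
enum class PassiveTriggerEvent { INCOMING_CALL, MESSAGE_RECEIVED, ALARM_REMINDER }

fun onPassiveTriggerEvent(event: PassiveTriggerEvent, config: GestureModeConfig) {
    println("passive trigger event: $event")
    wakeScreen() // execute the screen-on instruction
    // The same screen-on instruction serves as the passive activation timing,
    // so the designated image acquisition unit is started immediately.
    onActivationTimingDetected(config)
}

fun wakeScreen() = println("screen on") // stub for the platform screen-wake call
```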
In an exemplary embodiment, when an active or passive activation timing is detected, the image acquisition unit corresponding to the preset gesture control mode is started and the air-gesture function is implemented based on it. After the air-gesture function is turned on, an activation indicator corresponding to the preset gesture control mode may be generated, for example a preset icon in the status bar, which prompts the user with the on state of the preset gesture control mode. The user can thus clearly know whether the mode is on, decide whether to keep it on, and perform interactive control through air gestures while the mode is on (that is, while the indicator is present).
In an exemplary embodiment, the started image acquisition unit may be closed in response to detecting the deactivation timing of the preset gesture control mode.
Specifically, the deactivation timing may include a passive deactivation timing, at which the terminal device itself triggers turning off the preset gesture control mode. For example, it may be the timing at which the user is detected to have not used the air-gesture function for longer than a preset period, or the timing at which the terminal device is detected to shut down; of course, it may also be another timing at which the terminal device passively turns off the preset gesture control mode, which is not limited in this exemplary embodiment.
In an exemplary embodiment, whether the passive deactivation timing occurs may be determined through the steps in fig. 6. Referring to fig. 6, the method may specifically include:
Step S610, acquiring a preset time threshold;
Step S620, if the time during which the image acquisition unit detects no gesture image is greater than or equal to the time threshold, determining that the passive deactivation timing of the preset gesture control mode is detected, and closing the started image acquisition unit.
The time threshold may be a preset threshold for passively closing the image acquisition unit corresponding to the preset gesture control mode. For example, it may be 10 minutes or 60 minutes, and it may be updated dynamically according to actual usage or set by the user, which is not limited in this exemplary embodiment. When no gesture image has been detected for a period greater than or equal to the time threshold, the user can be considered to have stopped using, or to no longer need, the air-gesture function, so the image acquisition unit corresponding to the preset gesture control mode is closed and the air-gesture function ends.
By setting a preset time threshold and passively closing the image acquisition unit, without the user's active participation, once no gesture image has been detected for a period greater than or equal to that threshold, the power consumption of the terminal device is effectively reduced and its battery life is improved. One way to realize the idle check is sketched below.
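One way to realize the check of step S620 is a simple idle watchdog, sketched below; IdleWatchdog and stopCamera are illustrative names, and a real implementation would likely poll from a timer or handler rather than rely on the caller.

```kotlin
// Minimal sketch of the passive deactivation check (step S620): close the
// acquisition unit once no gesture image has been seen for the threshold.
class IdleWatchdog(private val config: GestureModeConfig) {
    private var lastGestureAtMillis: Long = System.currentTimeMillis()

    // Call whenever the image acquisition unit detects a gesture image.
    fun onGestureDetected() {
        lastGestureAtMillis = System.currentTimeMillis()
    }

    // Poll periodically; returns true once the passive deactivation timing
    // has occurred and the started unit has been closed.
    fun checkIdle(nowMillis: Long = System.currentTimeMillis()): Boolean {
        if (nowMillis - lastGestureAtMillis >= config.idleTimeoutMillis) {
            config.captureUnits.forEach(::stopCamera)
            return true
        }
        return false
    }
}

fun stopCamera(unit: CaptureUnit) = println("closing $unit") // platform stub
```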
Further, when the passive deactivation timing is detected and the started image acquisition unit is closed, a prompt for turning off the preset gesture control mode may be generated. For example, when no gesture image has been detected for a period greater than or equal to the time threshold, a popup message may inform the user that the air-gesture function has been idle too long and has been turned off, so that the user knows the current on/off state of the preset gesture control mode, can act accordingly, and carries less memory burden.
In an exemplary embodiment, the deactivation timing may include an active deactivation timing, at which the user actively turns off the preset gesture control mode through a related operation. For example, it may be the timing at which the user turns the mode off through a voice command or through a preset trigger control; of course, it may also be another timing at which the user actively turns off the preset gesture control mode, which is not limited in this exemplary embodiment.
Specifically, if the voice command received by the voice assistant contains a target deactivation keyword, or if the control instruction generated by a trigger operation on the target control is a gesture control deactivation instruction, or if an active trigger event triggering a screen-off instruction is detected, it may be determined that the deactivation timing of the preset gesture control mode is detected, and the started image acquisition unit is closed. These conditions are sketched below.
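The three active deactivation conditions can be folded into one predicate, sketched below; the instruction string, the default keywords, and the parameter names are assumptions for illustration only.

```kotlin
// Hypothetical sketch of the active deactivation conditions: a deactivation
// keyword in a voice command, a deactivation instruction from the target
// control, or an active trigger event that turns the screen off.
fun shouldDeactivate(
    voiceTranscript: String?,
    controlInstruction: String?,
    screenOffRequested: Boolean,
    deactivationKeywords: Set<String> = setOf("close", "turn off air gestures")
): Boolean =
    screenOffRequested ||
        controlInstruction == "gesture_control_off" ||
        (voiceTranscript != null &&
            deactivationKeywords.any { voiceTranscript.contains(it, ignoreCase = true) })
```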
In summary, in this exemplary embodiment, an image acquisition unit matched with a preset gesture control mode is started in response to detecting the activation timing of the preset gesture control mode; a detected gesture image is acquired through the image acquisition unit, and a corresponding gesture control instruction is matched according to the gesture image; and a corresponding control function is executed according to the gesture control instruction. First, the corresponding image acquisition unit is started to acquire gesture images only when the activation timing is detected, which effectively avoids the high power consumption caused by keeping the image acquisition unit permanently on. Second, the gesture control function is implemented through the started image acquisition unit without additionally installing a low-power always-on camera, which effectively reduces the hardware cost of the terminal device. Third, the acquired gesture image is matched to a corresponding gesture control instruction, and the control function is then executed through the matched instruction, which improves both the response speed of gesture control and the accuracy of gesture recognition.
It is noted that the above-described figures are merely schematic illustrations of processes involved in a method according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily appreciated that the processes shown in the above figures do not indicate or limit the temporal order of these processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, for example, among a plurality of modules.
Further, referring to fig. 7, in this exemplary embodiment, an interaction device 700 is further provided, which may include a gesture control starting module 710, a gesture control instruction matching module 720, and a control module 730. Wherein:
the gesture control starting module 710 may be configured to start an image acquisition unit matched with a preset gesture control mode in response to detecting the activation timing of the preset gesture control mode;
the gesture control instruction matching module 720 may be configured to acquire a detected gesture image through the image acquisition unit, and match a corresponding gesture control instruction according to the gesture image;
the control module 730 may be configured to execute a corresponding control function according to the gesture control instruction.
In an exemplary embodiment, the gesture control starting module 710 may include an active activation unit, which may be used to:
acquiring a voice command received by a voice assistant;
if the voice command is detected to contain a target activation keyword, determining that the active activation timing of the preset gesture control mode is detected, and starting the image acquisition unit matched with the preset gesture control mode.
In an exemplary embodiment, the active activation unit may also be used to:
generating a control instruction according to a trigger operation on a target control;
if the control instruction is detected to be a gesture control activation instruction, determining that the active activation timing of the preset gesture control mode is detected, and starting the image acquisition unit matched with the preset gesture control mode.
In an exemplary embodiment, the gesture control starting module 710 may include a passive activation unit, which may be used to:
if a passive trigger event triggering a screen-on instruction is detected, executing the screen-on instruction, and taking the screen-on instruction triggered by the passive trigger event as the passive activation timing of the preset gesture control mode;
wherein the passive trigger event includes an incoming-call event or a message-receiving event.
In an exemplary embodiment, the interaction device 700 may further include an activation indicator generating unit, which may be used to:
generating an activation indicator corresponding to the preset gesture control mode, and prompting the on state of the preset gesture control mode through the activation indicator.
In an exemplary embodiment, referring to fig. 8, the interaction device 700 may further include a gesture control closing module 840, and the gesture control closing module 840 may be used to:
closing the started image acquisition unit in response to detecting the deactivation timing of the preset gesture control mode.
In an exemplary embodiment, the gesture control closing module 840 may include a passive deactivation unit, which may be used to:
acquiring a preset time threshold;
if the time during which the image acquisition unit detects no gesture image is greater than or equal to the time threshold, determining that the passive deactivation timing of the preset gesture control mode is detected, and closing the started image acquisition unit.
In an exemplary embodiment, the gesture control closing module 840 may include a deactivation prompt unit, which may be used to:
generating a prompt for turning off the preset gesture control mode, so as to indicate the off state of the preset gesture control mode.
In an exemplary embodiment, the gesture control closing module 840 may include an active deactivation unit, which may be used to:
if the voice command received by the voice assistant contains a target deactivation keyword, or if the control instruction generated by a trigger operation on the target control is a gesture control deactivation instruction, or if an active trigger event triggering a screen-off instruction is detected, determining that the deactivation timing of the preset gesture control mode is detected, and closing the started image acquisition unit.
The specific details of each module in the above device have already been described in the corresponding method section; for details not disclosed here, refer to the method embodiments, and they will not be repeated.
Those skilled in the art will appreciate that the various aspects of the present disclosure may be implemented as a system, method, or program product. Accordingly, various aspects of the disclosure may be embodied in the following forms: an entirely hardware embodiment, an entirely software embodiment (including firmware, micro-code, etc.), or an embodiment combining hardware and software aspects, which may be referred to herein as a "circuit," "module," or "system."
Exemplary embodiments of the present disclosure also provide a computer-readable storage medium having stored thereon a program product capable of implementing the method described above in the present specification. In some possible implementations, various aspects of the disclosure may also be implemented in the form of a program product comprising program code for causing a terminal device to carry out the steps according to the various exemplary embodiments of the disclosure as described in the "exemplary methods" section of this specification, when the program product is run on the terminal device, e.g. any one or more of the steps of fig. 3 to 6 may be carried out.
It should be noted that the computer readable medium shown in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Furthermore, the program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device, partly on a remote computing device, or entirely on the remote computing device or server. In the case of remote computing devices, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., connected via the Internet using an Internet service provider).
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any adaptations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (11)

1. An interaction method, comprising:
in response to detecting the activation timing of a preset gesture control mode, starting an image acquisition unit matched with the preset gesture control mode; wherein the activation timing comprises an active activation timing or a passive activation timing, the image acquisition unit is a camera other than a low-power always-on camera, and the preset gesture control mode refers to a gesture control configuration preset by the user;
acquiring a detected gesture image through the image acquisition unit, and matching a corresponding gesture control instruction according to the gesture image;
executing a corresponding control function according to the gesture control instruction;
the response to monitoring of the opening time of the preset gesture control mode starts an image acquisition unit matched with the preset gesture control mode, and the method comprises the following steps:
if a passive trigger event triggering a screen-lighting instruction is detected, executing the screen-lighting instruction, and taking the screen-lighting instruction triggered by the passive trigger event as the passive start time of the preset gesture control mode;
the passive triggering event comprises an event which is automatically triggered by the terminal equipment under the condition that a user does not actively participate, and the passive triggering event comprises at least one of a communication incoming call event, an information receiving event and an alarm clock reminding event.
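For illustration only (this is not part of the claim language), a minimal Kotlin sketch of the claim 1 flow. Every name here (TurnOnOccasion, ImageAcquisitionUnit, GestureController, the gesture-to-instruction table) is a hypothetical stand-in; the claim prescribes only the sequence of steps, not an API.

    // Hypothetical sketch of claim 1: turn-on occasion -> camera on ->
    // gesture posture image -> matched instruction -> control function.
    enum class TurnOnOccasion { ACTIVE, PASSIVE }

    class ImageAcquisitionUnit {            // stand-in for the normally-open camera
        var isOn = false
            private set
        fun turnOn() { isOn = true }
        fun turnOff() { isOn = false }
    }

    class GestureController(private val camera: ImageAcquisitionUnit) {
        // Illustrative gesture-posture-image -> gesture-control-instruction table.
        private val instructionTable = mapOf("palm_up" to "scroll_up", "fist" to "answer_call")

        fun onTurnOnOccasion(occasion: TurnOnOccasion) {
            // A passive occasion (incoming call, message, alarm) first executes
            // the screen-lighting instruction, then counts as the turn-on occasion.
            println("turn-on occasion detected: $occasion")
            camera.turnOn()
        }

        fun onGestureImage(label: String) {
            if (!camera.isOn) return
            instructionTable[label]?.let { println("executing control function: $it") }
        }
    }

    fun main() {
        val controller = GestureController(ImageAcquisitionUnit())
        controller.onTurnOnOccasion(TurnOnOccasion.PASSIVE) // e.g. incoming-call event
        controller.onGestureImage("fist")                   // -> executing control function: answer_call
    }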
2. The method of claim 1, wherein the turning on, in response to detecting the turn-on occasion of the preset gesture control mode, the image acquisition unit matched with the preset gesture control mode comprises:
acquiring a voice instruction input through a voice assistant;
if the voice instruction is detected to contain a target turn-on keyword, determining that the active turn-on occasion of the preset gesture control mode is detected, and turning on the image acquisition unit matched with the preset gesture control mode.
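A hypothetical sketch of the claim 2 active turn-on path, reusing the GestureController and TurnOnOccasion stand-ins from the claim 1 sketch above: a voice instruction is scanned for a target turn-on keyword. The keyword wording is an assumption, not claim text.

    // Assumed target turn-on keywords; the claim does not fix their wording.
    val turnOnKeywords = listOf("turn on gesture control", "enable air gestures")

    fun onVoiceInstruction(text: String, controller: GestureController) {
        // A keyword hit counts as detecting the active turn-on occasion.
        if (turnOnKeywords.any { text.contains(it, ignoreCase = true) }) {
            controller.onTurnOnOccasion(TurnOnOccasion.ACTIVE)
        }
    }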
3. The method according to claim 2, wherein the turning on, in response to detecting the turn-on occasion of the preset gesture control mode, the image acquisition unit matched with the preset gesture control mode comprises:
generating a control instruction according to a triggering operation on a target control;
if the control instruction is detected to be a gesture control turn-on instruction, determining that the active turn-on occasion of the preset gesture control mode is detected, and turning on the image acquisition unit matched with the preset gesture control mode.
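The claim 3 path is analogous but driven by a control instruction generated when the user triggers a target control (for example, a settings toggle). A hypothetical sketch, again reusing the claim 1 stand-ins; the instruction identifier is assumed.

    const val GESTURE_CONTROL_TURN_ON = "gesture_control_on" // assumed identifier

    fun onControlInstruction(instruction: String, controller: GestureController) {
        // A matching control instruction counts as the active turn-on occasion.
        if (instruction == GESTURE_CONTROL_TURN_ON) {
            controller.onTurnOnOccasion(TurnOnOccasion.ACTIVE)
        }
    }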
4. The method according to any one of claims 1 to 3, further comprising:
generating a turn-on identifier corresponding to the preset gesture control mode, and indicating the turned-on state of the preset gesture control mode through the turn-on identifier.
5. The method according to claim 1, further comprising:
in response to detecting a turn-off occasion of the preset gesture control mode, turning off the turned-on image acquisition unit.
6. The method of claim 5, wherein the turn-off occasion comprises a passive turn-off occasion, and the turning off, in response to detecting the turn-off occasion of the preset gesture control mode, the turned-on image acquisition unit comprises:
acquiring a preset time threshold;
if the duration for which the image acquisition unit fails to detect a gesture posture image is greater than or equal to the time threshold, determining that the passive turn-off occasion of the preset gesture control mode is detected, and turning off the turned-on image acquisition unit.
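A hypothetical sketch of the claim 6 passive turn-off, reusing ImageAcquisitionUnit from the claim 1 sketch: a watchdog turns the camera off once no gesture posture image has been detected for at least the preset time threshold. The threshold value, clock source, and polling scheme are all assumptions.

    class IdleWatchdog(
        private val camera: ImageAcquisitionUnit,
        private val thresholdMs: Long = 10_000  // preset time threshold; value assumed
    ) {
        private var lastDetectionMs = System.currentTimeMillis()

        fun onGestureImageDetected() { lastDetectionMs = System.currentTimeMillis() }

        fun tick() {
            // No gesture posture image for >= thresholdMs: passive turn-off occasion.
            if (camera.isOn && System.currentTimeMillis() - lastDetectionMs >= thresholdMs) {
                camera.turnOff()
            }
        }
    }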
7. The method of claim 5, wherein after the passive turn-off occasion of the preset gesture control mode is determined to be detected and the turned-on image acquisition unit is turned off, the method further comprises:
generating prompt information on turning off the preset gesture control mode, so as to indicate the turned-off state of the preset gesture control mode.
8. The method of claim 5, wherein the turn-off occasion comprises an active turn-off occasion, and the turning off, in response to detecting the turn-off occasion of the preset gesture control mode, the turned-on image acquisition unit comprises:
if a voice instruction input through the voice assistant contains a target turn-off keyword, or if a control instruction generated by a triggering operation on a target control is a gesture control turn-off instruction, or if an active trigger event triggering a screen-off instruction is detected, determining that the turn-off occasion of the preset gesture control mode is detected, and turning off the turned-on image acquisition unit.
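A hypothetical sketch of the claim 8 active turn-off, modelling the three listed triggers as one event type (reuses ImageAcquisitionUnit from the claim 1 sketch); the keyword and identifier values are assumptions.

    sealed interface TurnOffEvent
    data class VoiceInstruction(val text: String) : TurnOffEvent
    data class ControlInstruction(val id: String) : TurnOffEvent
    object ActiveScreenOff : TurnOffEvent  // the user actively turns the screen off

    fun onTurnOffEvent(event: TurnOffEvent, camera: ImageAcquisitionUnit) {
        val turnOff = when (event) {
            is VoiceInstruction -> event.text.contains("turn off gesture control", ignoreCase = true)
            is ControlInstruction -> event.id == "gesture_control_off"
            ActiveScreenOff -> true
        }
        if (turnOff) camera.turnOff()  // turn-off occasion detected
    }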
9. An interaction apparatus, comprising:
a gesture control starting module, configured to turn on, in response to detecting a turn-on occasion of a preset gesture control mode, an image acquisition unit matched with the preset gesture control mode, wherein the turn-on occasion comprises an active turn-on occasion or a passive turn-on occasion, the image acquisition unit is a non-low-power-consumption normally-open camera, and the preset gesture control mode refers to a mode for gesture control corresponding to different settings preset by a user;
a gesture control instruction matching module, configured to match a corresponding gesture control instruction according to a gesture posture image detected and acquired by the image acquisition unit;
a control module, configured to execute a corresponding control function according to the gesture control instruction;
wherein the gesture control starting module comprises a passive starting unit configured to:
if a passive trigger event triggering a screen-lighting instruction is detected, execute the screen-lighting instruction, and treat the screen-lighting instruction triggered by the passive trigger event as the passive turn-on occasion of the preset gesture control mode;
wherein the passive trigger event is an event triggered automatically by the terminal device without the user's active participation, and comprises at least one of a communication incoming-call event, an information receiving event, and an alarm clock reminder event.
10. A computer-readable medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the method according to any one of claims 1 to 8.
11. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the method of any one of claims 1 to 8 via execution of the executable instructions.
CN202110190469.6A 2021-02-18 2021-02-18 Interaction method and device, computer readable medium and electronic equipment Active CN112860169B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110190469.6A CN112860169B (en) 2021-02-18 2021-02-18 Interaction method and device, computer readable medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110190469.6A CN112860169B (en) 2021-02-18 2021-02-18 Interaction method and device, computer readable medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN112860169A CN112860169A (en) 2021-05-28
CN112860169B (en) 2024-01-12

Family

ID=75989676

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110190469.6A Active CN112860169B (en) 2021-02-18 2021-02-18 Interaction method and device, computer readable medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN112860169B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113342251A (en) * 2021-06-10 2021-09-03 北京字节跳动网络技术有限公司 Control method, system and device based on gesture, electronic equipment and storage medium
CN117939287A (en) * 2021-06-16 2024-04-26 荣耀终端有限公司 Abnormality prompting method and electronic equipment
CN114115536A (en) * 2021-11-22 2022-03-01 北京字节跳动网络技术有限公司 Interaction method, interaction device, electronic equipment and storage medium
CN114327062A (en) * 2021-12-28 2022-04-12 深圳Tcl新技术有限公司 Man-machine interaction method, device, electronic equipment, storage medium and program product
CN117032447A (en) * 2022-05-31 2023-11-10 荣耀终端有限公司 Alternate gesture interaction method and device, electronic chip and electronic equipment

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102233728B1 (en) * 2013-10-31 2021-03-30 삼성전자주식회사 Method, apparatus and computer readable recording medium for controlling on an electronic device

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104484037A (en) * 2014-12-12 2015-04-01 三星电子(中国)研发中心 Method for intelligent control by virtue of wearable device and wearable device
CN106843729A (en) * 2017-01-20 2017-06-13 宇龙计算机通信科技(深圳)有限公司 A kind of terminal control method and terminal
CN110109537A (en) * 2019-04-01 2019-08-09 努比亚技术有限公司 A kind of wearable device and its gesture identification method and computer readable storage medium

Also Published As

Publication number Publication date
CN112860169A (en) 2021-05-28

Similar Documents

Publication Publication Date Title
CN112860169B (en) Interaction method and device, computer readable medium and electronic equipment
EP2879095B1 (en) Method, apparatus and terminal device for image processing
KR20180083587A (en) Electronic device and operating method thereof
EP3642838B1 (en) Method for operating speech recognition service and electronic device and server for supporting the same
CN104657057A (en) Terminal waking method and device
CN108135033B (en) Bluetooth connection method and mobile terminal
CN107870674B (en) Program starting method and mobile terminal
CN110908513B (en) Data processing method and electronic equipment
CN109101517B (en) Information processing method, information processing apparatus, and medium
WO2020108101A1 (en) Virtual data processing method and apparatus, and storage medium, and terminal
EP3933570A1 (en) Method and apparatus for controlling a voice assistant, and computer-readable storage medium
CN111968641B (en) Voice assistant awakening control method and device, storage medium and electronic equipment
WO2021212388A1 (en) Interactive communication implementation method and device, and storage medium
CN112364799A (en) Gesture recognition method and device
US9575538B2 (en) Mobile device
CN112634895A (en) Voice interaction wake-up-free method and device
CN108133708B (en) Voice assistant control method and device and mobile terminal
CN111681655A (en) Voice control method and device, electronic equipment and storage medium
CN113766127B (en) Mobile terminal control method and device, storage medium and electronic equipment
CN112995402A (en) Control method and device, computer readable medium and electronic equipment
WO2019156412A1 (en) Method for operating voice recognition service and electronic device supporting same
CN110944056A (en) Interaction method, mobile terminal and readable storage medium
CN108388399B (en) Virtual idol state management method and system
CN112764600B (en) Resource processing method, device, storage medium and computer equipment
CN113342170A (en) Gesture control method, device, terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant