CN112860169A - Interaction method and device, computer readable medium and electronic equipment

Interaction method and device, computer readable medium and electronic equipment

Info

Publication number
CN112860169A
CN112860169A
Authority
CN
China
Prior art keywords
gesture control
control mode
preset gesture
preset
starting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110190469.6A
Other languages
Chinese (zh)
Other versions
CN112860169B (en)
Inventor
姬晨曦
黎瀚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202110190469.6A
Publication of CN112860169A
Application granted
Publication of CN112860169B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G06V40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00 Reducing energy consumption in communication networks
    • Y02D30/70 Reducing energy consumption in communication networks in wireless communication networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure provides an interaction method and apparatus, a computer-readable medium, and an electronic device, and relates to the technical field of human-computer interaction. The method comprises the following steps: in response to detecting a turn-on timing of a preset gesture control mode, starting an image acquisition unit matched with the preset gesture control mode; acquiring a detected gesture posture image by the image acquisition unit, and matching a corresponding gesture control instruction according to the gesture posture image; and executing a corresponding control function according to the gesture control instruction. The disclosure enables a terminal device equipped with only an ordinary camera to implement an air gesture interaction function, while effectively reducing the power consumption of a terminal device on which no normally-open low-power camera is installed and improving the battery life of the terminal device.

Description

Interaction method and device, computer readable medium and electronic equipment
Technical Field
The present disclosure relates to the field of human-computer interaction technologies, and in particular, to an interaction method, an interaction apparatus, a computer-readable medium, and an electronic device.
Background
Air gesture interaction is an interaction mode in which information is conveyed through hand movements or hand shapes. With the development of gesture recognition technologies based on computer vision, sensors, myoelectric recognition, infrared sensing, and the like, gesture interaction is gradually entering people's lives. From object control in virtual or augmented reality, to interface interaction on computers, mobile phones, and wearable devices, to function control of smart products, smart televisions, and in-vehicle devices, gesture interaction, owing to its natural and contactless character, is being blended into the interaction among people, products, and interfaces in a complementary, multi-modal way.
At present, for a terminal device to implement an air gesture interaction function, a front-facing normally-open low-power AON (Always-On) camera must be installed in addition to the ordinary camera. However, most terminal devices are not equipped with an AON camera, and installing one increases hardware cost; meanwhile, if the air gesture interaction function is implemented with the ordinary camera alone, keeping that camera always on imposes very high power consumption on the terminal device.
Disclosure of Invention
The purpose of the present disclosure is to provide an interaction method, an interaction apparatus, a computer-readable medium, and an electronic device, so as to overcome, at least to some extent, the problem in the related art that an AON camera must be installed to implement an air gesture interaction function, which results in high hardware cost of a terminal device.
According to a first aspect of the present disclosure, there is provided an interaction method, comprising:
in response to detecting a turn-on timing of a preset gesture control mode, starting an image acquisition unit matched with the preset gesture control mode;
acquiring a detected gesture posture image by the image acquisition unit, and matching a corresponding gesture control instruction according to the gesture posture image;
and executing a corresponding control function according to the gesture control instruction.
According to a second aspect of the present disclosure, there is provided an interaction apparatus comprising:
a gesture control starting module, configured to start an image acquisition unit matched with a preset gesture control mode in response to detecting the turn-on timing of the preset gesture control mode;
a gesture control instruction matching module, configured to acquire a detected gesture posture image by the image acquisition unit and match a corresponding gesture control instruction according to the gesture posture image;
and a control module, configured to execute a corresponding control function according to the gesture control instruction.
According to a third aspect of the present disclosure, a computer-readable medium is provided, on which a computer program is stored, which computer program, when being executed by a processor, is adapted to carry out the above-mentioned method.
According to a fourth aspect of the present disclosure, there is provided an electronic apparatus, comprising:
a processor; and
a memory for storing one or more programs that, when executed by the processor, cause the processor to implement the above-described method.
According to the interaction method provided by the embodiments of the present disclosure, an image acquisition unit matched with a preset gesture control mode is started in response to detecting the turn-on timing of the preset gesture control mode; a detected gesture posture image is acquired by the image acquisition unit, and a corresponding gesture control instruction is matched according to the gesture posture image; and a corresponding control function is executed according to the gesture control instruction. First, the corresponding image acquisition unit is started to acquire the gesture posture image only when the turn-on timing of the preset gesture control mode is detected, which effectively avoids the high power consumption caused by keeping the image acquisition unit always on. Second, the gesture control function is implemented through the image acquisition unit that is started on demand, so no normally-open low-power camera needs to be additionally installed, which effectively reduces the hardware cost of the terminal device. Third, the corresponding gesture control instruction is matched from the acquired gesture posture image, and the control function is then executed according to the matched instruction, which improves the response speed of gesture control and the accuracy of gesture recognition.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty. In the drawings:
FIG. 1 illustrates a schematic diagram of an exemplary system architecture to which embodiments of the present disclosure may be applied;
FIG. 2 shows a schematic diagram of an electronic device to which embodiments of the present disclosure may be applied;
FIG. 3 schematically illustrates a flow chart of a method of interaction in an exemplary embodiment of the disclosure;
fig. 4 schematically illustrates a flowchart of actively turning on an image acquisition unit corresponding to a preset gesture control mode in an exemplary embodiment of the present disclosure;
fig. 5 schematically illustrates another flowchart of actively turning on an image acquisition unit corresponding to a preset gesture control mode in an exemplary embodiment of the present disclosure;
fig. 6 schematically illustrates a flowchart of passively turning off an image acquisition unit corresponding to a preset gesture control mode in an exemplary embodiment of the present disclosure;
FIG. 7 schematically illustrates a composition diagram of an interaction device in an exemplary embodiment of the disclosure;
fig. 8 schematically shows a composition diagram of another interactive device in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
Fig. 1 is a schematic diagram illustrating a system architecture of an exemplary application environment to which an interaction method and apparatus of the embodiments of the present disclosure may be applied.
As shown in fig. 1, the system architecture 100 may include one or more of terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few. The terminal devices 101, 102, 103 may be various electronic devices with image capturing units, including but not limited to desktop computers, portable computers, smart phones, tablet computers, and the like. It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation. For example, server 105 may be a server cluster comprised of multiple servers, or the like.
The interaction method provided by the embodiment of the present disclosure is generally executed by the terminal devices 101, 102, and 103, and accordingly, the interaction apparatus is generally disposed in the terminal devices 101, 102, and 103. However, it is easily understood by those skilled in the art that the interaction method provided by the embodiment of the present disclosure may also be executed by the server 105, and accordingly, the interaction apparatus may also be disposed in the server 105, which is not particularly limited in the exemplary embodiment. For example, in an exemplary embodiment, the gesture posture image may be acquired by an image acquisition unit started by the terminal devices 101, 102, and 103, and then the gesture posture image is uploaded to the server 105, and after the server generates a gesture control instruction by using the interaction method provided by the embodiment of the present disclosure, the gesture control instruction is transmitted to the terminal devices 101, 102, and 103, and so on.
The exemplary embodiments of the present disclosure provide an electronic device for implementing an interaction method, which may be the terminal device 101, 102, 103 or the server 105 in fig. 1. The electronic device comprises at least a processor and a memory for storing executable instructions of the processor, the processor being configured to perform the interaction method via execution of the executable instructions.
The configuration of the electronic device is exemplified below, taking the mobile terminal 200 in fig. 2 as an example. It will be appreciated by those skilled in the art that, apart from components specifically intended for mobile use, the configuration in fig. 2 can also be applied to devices of a fixed type. In other embodiments, mobile terminal 200 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware. The interfacing relationship between the components is only schematically illustrated and does not constitute a structural limitation of the mobile terminal 200. In other embodiments, the mobile terminal 200 may also adopt an interfacing arrangement different from that of fig. 2, or a combination of multiple interfacing arrangements.
As shown in fig. 2, the mobile terminal 200 may specifically include: a processor 210, an internal memory 221, an external memory interface 222, a Universal Serial Bus (USB) interface 230, a charging management module 240, a power management module 241, a battery 242, an antenna 1, an antenna 2, a mobile communication module 250, a wireless communication module 260, an audio module 270, a speaker 271, a receiver 272, a microphone 273, an earphone interface 274, a sensor module 280, a display 290, a camera module 291, an indicator 292, a motor 293, a button 294, and a Subscriber Identity Module (SIM) card interface 295. The sensor module 280 may include a depth sensor 2801, a pressure sensor 2802, a gyroscope sensor 2803, and the like.
Processor 210 may include one or more processing units, such as: the Processor 210 may include an Application Processor (AP), a modem Processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband Processor, and/or a Neural-Network Processing Unit (NPU), and the like. The different processing units may be separate devices or may be integrated into one or more processors.
The NPU is a Neural-Network (NN) computing processor, which processes input information quickly by using a biological Neural Network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU can implement applications such as intelligent recognition of the mobile terminal 200, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
A memory is provided in the processor 210. The memory may store instructions for implementing six modular functions: detection instructions, connection instructions, information management instructions, analysis instructions, data transmission instructions, and notification instructions, and execution is controlled by processor 210.
The charge management module 240 is configured to receive a charging input from a charger. The power management module 241 is used for connecting the battery 242, the charging management module 240 and the processor 210. The power management module 241 receives the input of the battery 242 and/or the charging management module 240, and supplies power to the processor 210, the internal memory 221, the display screen 290, the camera module 291, the wireless communication module 260, and the like.
The wireless communication function of the mobile terminal 200 may be implemented by the antenna 1, the antenna 2, the mobile communication module 250, the wireless communication module 260, a modem processor, a baseband processor, and the like. Wherein, the antenna 1 and the antenna 2 are used for transmitting and receiving electromagnetic wave signals; the mobile communication module 250 may provide a solution including wireless communication of 2G/3G/4G/5G, etc. applied to the mobile terminal 200; the modem processor may include a modulator and a demodulator; the Wireless communication module 260 may provide a solution for Wireless communication including a Wireless Local Area Network (WLAN) (e.g., a Wireless Fidelity (Wi-Fi) network), Bluetooth (BT), and the like, applied to the mobile terminal 200. In some embodiments, antenna 1 of the mobile terminal 200 is coupled to the mobile communication module 250 and antenna 2 is coupled to the wireless communication module 260, such that the mobile terminal 200 may communicate with networks and other devices via wireless communication techniques.
The mobile terminal 200 implements a display function through the GPU, the display screen 290, the application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display screen 290 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 210 may include one or more GPUs that execute program instructions to generate or alter display information.
The mobile terminal 200 may implement a photographing function through the ISP, the camera module 291, the video codec, the GPU, the display screen 290, the application processor, and the like. The ISP is used for processing data fed back by the camera module 291; the camera module 291 is used for capturing still images or videos; the digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals; the video codec is used to compress or decompress digital video, and the mobile terminal 200 may also support one or more video codecs.
The external memory interface 222 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the mobile terminal 200. The external memory card communicates with the processor 210 through the external memory interface 222 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
Internal memory 221 may be used to store computer-executable program code, which includes instructions. The internal memory 221 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (e.g., audio data, a phonebook, etc.) created during use of the mobile terminal 200, and the like. In addition, the internal memory 221 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk Storage device, a Flash memory device, a Universal Flash Storage (UFS), and the like. The processor 210 executes various functional applications of the mobile terminal 200 and data processing by executing instructions stored in the internal memory 221 and/or instructions stored in a memory provided in the processor.
The mobile terminal 200 may implement an audio function through the audio module 270, the speaker 271, the receiver 272, the microphone 273, the earphone interface 274, the application processor, and the like. Such as music playing, recording, etc.
The depth sensor 2801 is used to acquire depth information of a scene. In some embodiments, a depth sensor may be provided to the camera module 291.
The pressure sensor 2802 is used to sense a pressure signal and convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 2802 may be disposed on the display screen 290. Pressure sensor 2802 can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like.
The gyro sensor 2803 may be used to determine the motion posture of the mobile terminal 200. In some embodiments, the angular velocity of the mobile terminal 200 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 2803. The gyro sensor 2803 can be used for photographing anti-shake, navigation, motion-sensing gaming scenarios, and the like.
In addition, other functional sensors, such as an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, etc., may be provided in the sensor module 280 according to actual needs.
Other devices for providing auxiliary functions may also be included in mobile terminal 200. For example, the keys 294 include a power-on key, a volume key, and the like, and a user can generate key signal inputs related to user settings and function control of the mobile terminal 200 through key inputs. Further examples include indicator 292, motor 293, SIM card interface 295, etc.
The following describes an interaction method and an interaction apparatus in exemplary embodiments of the present disclosure in detail.
Fig. 3 shows a flow of an interaction method in the present exemplary embodiment, including the following steps S310 to S330:
in step S310, in response to monitoring the starting time of the preset gesture control mode, an image acquisition unit matched with the preset gesture control mode is started.
In an exemplary embodiment, the preset gesture control mode may refer to a mode corresponding to gesture control settings preconfigured by a user. For example, the preset gesture control mode may include a mapping relationship between different gesture postures set by the user and gesture control instructions, the image acquisition unit used to acquire gesture posture images (such as a front or rear image acquisition unit), a timeout period after which the image acquisition unit is automatically (passively) turned off, and the different manners in which the preset gesture control mode may be turned on. The user may preset these options for implementing the gesture control function to form a preset gesture control mode in a user-defined style, which is not particularly limited in this exemplary embodiment. When the turn-on timing of the preset gesture control mode is detected, the terminal device can be controlled according to the parameters set in the preset gesture control mode.
The image acquisition unit may be a unit for acquiring images that is not of the normally-open low-power type. For example, the image acquisition unit may be an ordinary camera built into the terminal device (such as a front camera or a rear camera), or an external ordinary camera communicatively connected to the terminal device (through a wired or wireless connection); of course, it may also be another non-normally-open unit that assists the terminal device in acquiring gesture posture images, which is not limited in this exemplary embodiment.
If the turn-on timing of the preset gesture control mode is detected, the corresponding image acquisition unit is invoked according to the settings in the preset gesture control mode. For example, if the image acquisition unit set in the preset gesture control mode is the front camera, the front camera is started for image acquisition when the turn-on timing is detected; if the image acquisition unit set in the preset gesture control mode is the rear camera, the rear camera is started for image acquisition when the turn-on timing is detected. This is merely an illustration; the preset gesture control mode may also specify acquiring images with a combination of front, rear, and external cameras, which is not particularly limited in this exemplary embodiment.
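For illustration only, the selection logic of step S310 can be sketched as follows. The Kotlin types (PresetGestureMode, ImageCaptureUnit, and the console-printing SimpleCaptureUnit) are hypothetical placeholders for this sketch, not part of any real device SDK or of the disclosed implementation.

```kotlin
// Hypothetical sketch of step S310: start the capture unit that the
// preset gesture control mode is configured to use. All names are
// illustrative placeholders.
enum class CameraFacing { FRONT, REAR, EXTERNAL }

data class PresetGestureMode(
    val cameraFacing: CameraFacing,  // capture unit bound to this mode
    val idleTimeoutMinutes: Long     // timeout for passive turn-off (see fig. 6)
)

interface ImageCaptureUnit {
    fun start()
    fun stop()
}

class SimpleCaptureUnit(private val name: String) : ImageCaptureUnit {
    override fun start() = println("$name camera started")
    override fun stop() = println("$name camera stopped")
}

// Invoke the image acquisition unit set in the preset gesture control mode.
fun startMatchedCaptureUnit(mode: PresetGestureMode): ImageCaptureUnit {
    val unit = when (mode.cameraFacing) {
        CameraFacing.FRONT -> SimpleCaptureUnit("front")
        CameraFacing.REAR -> SimpleCaptureUnit("rear")
        CameraFacing.EXTERNAL -> SimpleCaptureUnit("external")
    }
    unit.start()
    return unit
}

fun main() {
    // Turn-on timing detected for a mode bound to the front camera.
    startMatchedCaptureUnit(PresetGestureMode(CameraFacing.FRONT, 10L))
}
```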
In step S320, a detected gesture posture image is acquired by the image acquisition unit, and a corresponding gesture control instruction is matched according to the gesture posture image.
In an exemplary embodiment, the gesture posture image may refer to an image containing a hand posture acquired by the image acquisition unit. For example, the gesture posture image may be an image containing the front or the back of the palm (that is, the palm facing the camera or facing away from it), or an image set containing a gesture motion; of course, it may also be another image containing a gesture posture or gesture motion, which is not particularly limited in this exemplary embodiment.
After the image acquisition unit is started, the current environment can be continuously monitored; when an image containing hand features is detected, the corresponding hand posture image (static gesture) or set of hand posture images (dynamic gesture) is captured and written to a buffer for subsequent processing.
The gesture control instruction may be a preset control instruction triggered by a gesture posture. For example, if a gesture posture image containing the front of the palm is detected, the corresponding gesture control instruction may be pause or return; if a gesture posture image containing a palm swinging up and down is detected, the corresponding gesture control instruction may be scroll down or page down. Of course, these are only illustrative examples; the specific mapping relationship between gesture posture images and gesture control instructions depends on the preset gesture control mode set by the user, and is not limited to the example mappings in this exemplary embodiment.
In step S330, a corresponding control function is executed according to the gesture control instruction.
In an exemplary embodiment, the control function may be the human-computer interaction function corresponding to the gesture control instruction. For example, if the gesture control instruction is a pause instruction or a return instruction, the corresponding pause function or return function in the current interface is triggered according to the instruction; if the gesture control instruction is a scroll-down instruction or a page-down instruction, the scroll-down or page-down function of the page displayed in the current interface is triggered. Of course, this is only an illustrative example and should not impose any particular limitation on this exemplary embodiment.
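As an illustration of steps S320 and S330 together, the following minimal Kotlin sketch assumes a user-configured mapping from recognized gesture postures to control instructions; the posture and instruction names are invented for the example, and the recognition step itself is elided.

```kotlin
// Hypothetical sketch of steps S320/S330: match a recognized gesture
// posture to a control instruction, then execute the control function.
enum class GesturePosture { PALM_FRONT, PALM_BACK, PALM_SWING }

enum class ControlInstruction { PAUSE, RETURN, SCROLL_DOWN }

// Mapping configured in the preset gesture control mode (user-defined).
val instructionMap = mapOf(
    GesturePosture.PALM_FRONT to ControlInstruction.PAUSE,
    GesturePosture.PALM_SWING to ControlInstruction.SCROLL_DOWN
)

// Execute the human-computer interaction function for the instruction.
fun executeControlFunction(instruction: ControlInstruction) = when (instruction) {
    ControlInstruction.PAUSE -> println("pause playback in the current interface")
    ControlInstruction.RETURN -> println("return to the previous interface")
    ControlInstruction.SCROLL_DOWN -> println("scroll the displayed page down")
}

fun main() {
    // Posture assumed to have been recognized from the buffered image(s).
    val recognized = GesturePosture.PALM_SWING
    instructionMap[recognized]?.let { executeControlFunction(it) }
}
```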
The following explains step S310 to step S330 in detail.
In an exemplary embodiment, the turn-on timing may include an active turn-on timing. The active turn-on timing refers to the timing at which the user actively turns on the preset gesture control mode through a related turn-on operation. For example, the active turn-on timing may be the timing at which the user turns on the preset gesture control mode through a voice instruction, or the timing at which the user turns it on through a preset trigger control; of course, it may also be another timing at which the user actively turns on the preset gesture control mode on his or her own initiative, which is not particularly limited in this exemplary embodiment.
Specifically, whether the active turn-on timing is detected may be determined through the steps in fig. 4, which specifically include:
step S410, acquiring a voice instruction received through the voice assistant;
step S420, if the voice instruction is detected to contain a target turn-on keyword, determining that the active turn-on timing of the preset gesture control mode is detected, and starting the image acquisition unit matched with the preset gesture control mode.
The voice assistant is an intelligent application that implements interaction in the form of intelligent dialogue and instant question answering. It can receive the user's voice information, translate it into a voice instruction recognizable by the machine, and then implement the corresponding function through the recognized instruction. The voice assistant can be woken up by a voice instruction containing a wake-up keyword, after which the woken assistant listens for voice instructions.
The target turn-on keyword may be a keyword in the voice instruction that indicates turning on the preset gesture control mode. For example, if the input voice instruction is "please help me open the air gesture", the target turn-on keyword may be "open" or "air gesture"; if the target turn-on keyword is detected in the voice instruction, the instruction may be considered to be intended to turn on the preset gesture control mode, and it is determined that the active turn-on timing of the preset gesture control mode is detected. Of course, the input voice instruction may also be "I want to interact through gestures", with the target turn-on keyword being "gesture" or "interact"; the specific voice instructions for turning on the preset gesture control mode vary with the preset lexicon or the user-defined voice instructions, which is not particularly limited in this exemplary embodiment.
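For example, the keyword check of fig. 4 might reduce to a simple containment test over the transcribed voice instruction, as in the following sketch; the keyword list is an invented example, and a real implementation would depend on the preset lexicon.

```kotlin
// Hypothetical sketch of fig. 4: detect a target turn-on keyword in the
// voice instruction already transcribed by the voice assistant.
val targetTurnOnKeywords = listOf("open", "air gesture", "interact")

fun containsTargetTurnOnKeyword(voiceInstruction: String): Boolean =
    targetTurnOnKeywords.any { voiceInstruction.lowercase().contains(it) }

fun main() {
    val instruction = "please help me open the air gesture"
    if (containsTargetTurnOnKeyword(instruction)) {
        // Active turn-on timing detected: start the matched capture unit.
        println("active turn-on timing detected; starting image acquisition unit")
    }
}
```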
In an exemplary embodiment, it may also be determined whether the active turn-on timing is detected through the steps in fig. 5, which specifically include:
step S510, generating a control instruction according to a trigger operation on a target control;
step S520, if the control instruction is detected to be a gesture control turn-on instruction, determining that the active turn-on timing of the preset gesture control mode is detected, and starting the image acquisition unit matched with the preset gesture control mode.
The target control may be a preset control for triggering the turning on or off of the preset gesture control mode. For example, the target control may be a shortcut button in a navigation bar for turning the preset gesture control mode on or off, a shortcut button in the voice assistant that generates the corresponding voice instruction with one tap, or a preset combination of touch keys (for example, a combination of the return key, menu key, and home key, or a combination of a volume key and the power key; the form of the key combination is not specially limited in this example). Of course, other controls for triggering the turning on or off of the preset gesture control mode may also be used, and this example is not limited thereto.
If the control instruction generated by the trigger operation on the target control is detected to be a gesture control turn-on instruction for turning on the preset gesture control mode, the active turn-on timing of the preset gesture control mode is detected at that moment; the image acquisition unit matched with the preset gesture control mode is started, and the gesture control function is implemented based on that image acquisition unit.
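The control-instruction path of fig. 5 can be pictured as follows; the control identifiers and instruction names are hypothetical, and a real system would route the trigger operation through the platform's event mechanism.

```kotlin
// Hypothetical sketch of fig. 5: a trigger operation on a target control
// generates a control instruction; a turn-on instruction marks the active
// turn-on timing of the preset gesture control mode.
enum class TargetControl { NAV_BAR_SHORTCUT, VOICE_SHORTCUT, KEY_COMBINATION }

enum class ControlCommand { GESTURE_CONTROL_ON, GESTURE_CONTROL_OFF }

fun commandFor(control: TargetControl, modeCurrentlyOn: Boolean): ControlCommand {
    println("trigger operation detected on $control")
    return if (modeCurrentlyOn) ControlCommand.GESTURE_CONTROL_OFF
    else ControlCommand.GESTURE_CONTROL_ON
}

fun main() {
    val command = commandFor(TargetControl.NAV_BAR_SHORTCUT, modeCurrentlyOn = false)
    if (command == ControlCommand.GESTURE_CONTROL_ON) {
        println("active turn-on timing detected; starting image acquisition unit")
    }
}
```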
In an exemplary embodiment, the turn-on timing includes a passive turn-on timing. The passive turn-on timing refers to the timing at which the terminal device automatically turns on the preset gesture control mode because a related event is triggered. For example, the passive turn-on timing may be the timing at which the terminal device receives an incoming communication call, or the timing at which the terminal device receives information; of course, it may also be another timing at which the terminal device automatically turns on the preset gesture control mode due to a triggered event, which is not particularly limited in this exemplary embodiment.
Specifically, if a passive trigger event that triggers a screen-on instruction is detected, the screen-on instruction is executed, and the screen-on instruction triggered by the passive trigger event can serve as the passive turn-on timing of the preset gesture control mode.
The passive trigger event refers to an event triggered by the terminal device itself without the user's active participation. For example, the passive trigger event may be an incoming communication event, in which the current terminal device receives a communication request from another terminal device, or an information receiving event, in which the current terminal device receives information sent by another terminal device.
The screen-on instruction triggered by a passive trigger event refers to an instruction for passively lighting up the screen to notify the user of the passive trigger event without the user's active participation. For example, when an incoming communication event, an information receiving event, or an alarm reminder event is detected, the passively triggered screen-on instruction serves as the passive turn-on timing of the preset gesture control mode; when this passive turn-on timing is detected, the image acquisition unit corresponding to the preset gesture control mode is started to implement the gesture control function.
When a passive trigger event is detected, the user may not be near the terminal device; in this case the gesture control function is turned on passively rather than requiring the user to turn it on actively, which effectively improves the user's interaction efficiency and the efficiency with which the user handles the passive trigger event.
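A sketch of this passive path, under the assumption that the host system delivers incoming-call and message events to a callback; the event model below is invented for illustration.

```kotlin
// Hypothetical sketch of the passive turn-on path: a passive trigger event
// lights the screen, and that screen-on instruction doubles as the passive
// turn-on timing of the preset gesture control mode.
sealed class PassiveTriggerEvent {
    object IncomingCall : PassiveTriggerEvent()
    object MessageReceived : PassiveTriggerEvent()
}

fun onPassiveTriggerEvent(event: PassiveTriggerEvent) {
    println("executing screen-on instruction for ${event::class.simpleName}")
    // The screen-on instruction serves as the passive turn-on timing:
    println("passive turn-on timing detected; starting image acquisition unit")
}

fun main() {
    onPassiveTriggerEvent(PassiveTriggerEvent.IncomingCall)
}
```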
In an exemplary embodiment, when an active turn-on timing or a passive turn-on timing is detected, the image acquisition unit corresponding to the preset gesture control mode is started, and the air gesture function is implemented based on that unit. After the air gesture function is turned on, a turned-on identifier corresponding to the preset gesture control mode may be generated; for example, a turned-on icon corresponding to the gesture control mode may be presented in the status bar. The turned-on identifier then indicates the on state of the preset gesture control mode, so that the user knows the on/off state of the mode, can decide whether to switch it, and can perform interactive control through air gestures while the preset gesture control mode is on (the turned-on identifier is present).
In an exemplary embodiment, the started image acquisition unit may be turned off in response to detecting the turn-off timing of the preset gesture control mode.
Specifically, the turn-off timing may include a passive turn-off timing. The passive turn-off timing refers to the timing at which the terminal device is passively triggered to turn off the preset gesture control mode by itself. For example, the passive turn-off timing may be the timing at which the user has not used the air gesture control function for more than a preset period, or the timing at which the terminal device is detected to be turned off; of course, it may also be another timing at which the terminal device is passively triggered to turn off the preset gesture control mode, which is not particularly limited in this exemplary embodiment.
In an exemplary embodiment, whether the passive turn-off timing is detected may be determined through the steps in fig. 6, which specifically include:
step S610, acquiring a preset time threshold;
step S620, if the period during which the image acquisition unit detects no gesture posture image is greater than or equal to the time threshold, determining that the passive turn-off timing of the preset gesture control mode is detected, and turning off the started image acquisition unit.
The time threshold may be a preset threshold used for passively turning off the image acquisition unit corresponding to the preset gesture control mode. For example, the time threshold may be 10 minutes or 60 minutes; it may be dynamically updated according to actual usage or determined by a user-defined setting, which is not particularly limited in this exemplary embodiment. When the period during which the image acquisition unit detects no gesture posture image is greater than or equal to the time threshold, it can be considered that the user is not using, or will not need for a long time, the air gesture function, and the image acquisition unit corresponding to the preset gesture control mode is therefore turned off to end the air gesture function.
By setting the preset time threshold and passively turning off the image acquisition unit corresponding to the preset gesture control mode, without the user's active participation, once the period during which the unit detects no gesture posture image reaches the threshold, the power consumption of the terminal device is effectively reduced and its battery life is improved.
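The fig. 6 logic amounts to an idle timer keyed to the last detected gesture posture image; the following sketch assumes the host polls the monitor, and the clock source and threshold value are illustrative.

```kotlin
// Hypothetical sketch of fig. 6: passively turn off the capture unit once
// no gesture posture image has been detected for the preset time threshold.
class IdleCloseMonitor(private val thresholdMillis: Long) {
    private var lastGestureSeenAt: Long = System.currentTimeMillis()

    // Called whenever the image acquisition unit detects a gesture image.
    fun onGestureImageDetected() {
        lastGestureSeenAt = System.currentTimeMillis()
    }

    // True when the passive turn-off timing is detected, i.e. the idle
    // period has reached or exceeded the preset time threshold.
    fun shouldPassivelyClose(nowMillis: Long = System.currentTimeMillis()): Boolean =
        nowMillis - lastGestureSeenAt >= thresholdMillis
}

fun main() {
    val monitor = IdleCloseMonitor(thresholdMillis = 10 * 60 * 1000L) // 10 minutes
    val elevenMinutesLater = System.currentTimeMillis() + 11 * 60 * 1000L
    if (monitor.shouldPassivelyClose(elevenMinutesLater)) {
        println("passive turn-off timing detected; closing image acquisition unit")
    }
}
```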
Further, when the passive turn-off timing is detected and the started image acquisition unit is turned off, prompt information about turning off the preset gesture control mode may be generated. For example, when the period during which the image acquisition unit detects no gesture posture image is greater than or equal to the time threshold, a message pop-up may prompt the user that the air gesture function has been idle beyond the time limit, so that the user knows the current on/off state of the preset gesture control mode, can perform subsequent operations, and bears less memory burden.
In an exemplary embodiment, the turn-off timing may include an active turn-off timing. The active turn-off timing refers to the timing at which the user actively turns off the preset gesture control mode through a related turn-off operation. For example, the active turn-off timing may be the timing at which the user turns off the preset gesture control mode through a voice instruction, or the timing at which the user turns it off through a preset trigger control; of course, it may also be another timing at which the user actively turns off the preset gesture control mode on his or her own initiative, which is not particularly limited in this exemplary embodiment.
Specifically, if a voice instruction received through the voice assistant contains a target turn-off keyword, or if the control instruction generated by a trigger operation on the target control is a gesture control turn-off instruction, or if an active trigger event that triggers a screen-off instruction is detected, it may be determined that the turn-off timing of the preset gesture control mode is detected, and the started image acquisition unit is turned off.
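The three active turn-off conditions can be checked as a simple disjunction; as before, the names and keyword list below are assumptions made for the sketch, not the disclosed implementation.

```kotlin
// Hypothetical sketch: the active turn-off timing is detected when any of
// the three conditions described above holds.
val targetTurnOffKeywords = listOf("close", "turn off")

fun isActiveTurnOffTiming(
    voiceInstruction: String?,         // latest transcribed voice input, if any
    gestureControlOffCommand: Boolean, // target control generated a turn-off command
    activeScreenOffTriggered: Boolean  // user actively triggered a screen-off
): Boolean {
    val voiceAsksClose = voiceInstruction?.let { instr ->
        targetTurnOffKeywords.any { instr.lowercase().contains(it) }
    } ?: false
    return voiceAsksClose || gestureControlOffCommand || activeScreenOffTriggered
}

fun main() {
    if (isActiveTurnOffTiming("please close the air gesture", false, false)) {
        println("active turn-off timing detected; closing image acquisition unit")
    }
}
```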
In summary, in this exemplary embodiment, in response to detecting the turn-on timing of a preset gesture control mode, an image acquisition unit matched with the preset gesture control mode is started; a detected gesture posture image is acquired by the image acquisition unit, and a corresponding gesture control instruction is matched according to the gesture posture image; and a corresponding control function is executed according to the gesture control instruction. First, the corresponding image acquisition unit is started to acquire the gesture posture image only when the turn-on timing of the preset gesture control mode is detected, which effectively avoids the high power consumption caused by keeping the image acquisition unit always on. Second, the gesture control function is implemented through the image acquisition unit that is started on demand, so no normally-open low-power camera needs to be additionally installed, which effectively reduces the hardware cost of the terminal device. Third, the corresponding gesture control instruction is matched from the acquired gesture posture image, and the control function is then executed according to the matched instruction, which improves the response speed of gesture control and the accuracy of gesture recognition.
It is noted that the above-mentioned figures are merely schematic illustrations of processes involved in methods according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Further, referring to fig. 7, this example also provides an interaction apparatus 700, which may include a gesture control starting module 710, a gesture control instruction matching module 720, and a control module 730. Specifically:
the gesture control starting module 710 may be configured to start an image acquisition unit matched with a preset gesture control mode in response to detecting the turn-on timing of the preset gesture control mode;
the gesture control instruction matching module 720 may be configured to acquire a detected gesture posture image by the image acquisition unit and match a corresponding gesture control instruction according to the gesture posture image;
the control module 730 may be configured to execute a corresponding control function according to the gesture control instruction.
In an exemplary embodiment, the gesture control starting module 710 may include an active turn-on unit, which may be configured to:
acquiring a voice instruction received through the voice assistant;
and if the voice instruction is detected to contain a target turn-on keyword, determining that the active turn-on timing of the preset gesture control mode is detected, and starting the image acquisition unit matched with the preset gesture control mode.
In an exemplary embodiment, the active turn-on unit may be further configured to:
generating a control instruction according to a trigger operation on the target control;
and if the control instruction is detected to be a gesture control turn-on instruction, determining that the active turn-on timing of the preset gesture control mode is detected, and starting the image acquisition unit matched with the preset gesture control mode.
In an exemplary embodiment, the gesture control starting module 710 may include a passive turn-on unit, which may be configured to:
if a passive trigger event that triggers a screen-on instruction is detected, executing the screen-on instruction, and taking the screen-on instruction triggered by the passive trigger event as the passive turn-on timing of the preset gesture control mode;
wherein the passive trigger event comprises an incoming communication event or an information receiving event.
In an exemplary embodiment, the interaction apparatus 700 may further include a turned-on identifier generation unit, which may be configured to:
generating a turned-on identifier corresponding to the preset gesture control mode, so as to indicate the on state of the preset gesture control mode through the turned-on identifier.
In an exemplary embodiment, referring to fig. 8, the interaction apparatus 700 may further include a gesture control closing module 840, which may be configured to:
turning off the started image acquisition unit in response to detecting the turn-off timing of the preset gesture control mode.
In an exemplary embodiment, the gesture control closing module 840 may include a passive turn-off unit, which may be configured to:
acquiring a preset time threshold;
and if the period during which the image acquisition unit detects no gesture posture image is greater than or equal to the time threshold, determining that the passive turn-off timing of the preset gesture control mode is detected, and turning off the started image acquisition unit.
In an exemplary embodiment, the gesture control closing module 840 may include a turn-off prompt unit, which may be configured to:
generating prompt information about turning off the preset gesture control mode, so as to indicate the off state of the preset gesture control mode.
In an exemplary embodiment, the gesture control closing module 840 may include an active turn-off unit, which may be configured to:
if the voice command input by the voice assistant comprises a target closing keyword, or if the control command generated by triggering the target control is a gesture control closing command, or if an active triggering event triggering a screen-touching command is monitored, determining a closing time when the preset gesture control mode is monitored, and closing the started image acquisition unit.
The specific details of each module in the above apparatus have been described in detail in the corresponding method embodiments; details not disclosed here can be found there and are therefore not repeated.
As will be appreciated by one skilled in the art, aspects of the present disclosure may be embodied as a system, method, or program product. Accordingly, various aspects of the present disclosure may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, which may all generally be referred to herein as a "circuit," "module," or "system."
Exemplary embodiments of the present disclosure also provide a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, various aspects of the disclosure may also be implemented in the form of a program product including program code for causing a terminal device to perform the steps according to various exemplary embodiments of the disclosure described in the above-mentioned "exemplary methods" section of this specification, when the program product is run on the terminal device, for example, any one or more of the steps in fig. 3 to 6 may be performed.
It should be noted that the computer readable media shown in the present disclosure may be computer readable signal media or computer readable storage media or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Furthermore, program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++, or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the Internet using an Internet service provider).
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (12)

1. An interaction method, comprising:
in response to detecting a turn-on timing of a preset gesture control mode, starting an image acquisition unit matched with the preset gesture control mode;
acquiring a detected gesture posture image by the image acquisition unit, and matching a corresponding gesture control instruction according to the gesture posture image;
and executing a corresponding control function according to the gesture control instruction.
2. The method of claim 1, wherein the turn-on timing comprises an active turn-on timing, and wherein the starting of the image acquisition unit matched with the preset gesture control mode in response to detecting the turn-on timing of the preset gesture control mode comprises:
acquiring a voice instruction received through a voice assistant;
and if the voice instruction is detected to contain a target turn-on keyword, determining that the active turn-on timing of the preset gesture control mode is detected, and starting the image acquisition unit matched with the preset gesture control mode.
3. The method of claim 2, wherein the starting of the image acquisition unit matched with the preset gesture control mode in response to detecting the turn-on timing of the preset gesture control mode comprises:
generating a control instruction according to a triggering operation on a target control;
and if the control instruction is detected to be a gesture control turn-on instruction, determining that the active turn-on timing of the preset gesture control mode is monitored, and starting the image acquisition unit matched with the preset gesture control mode.
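Claims 2 and 3 describe two active turn-on paths: a voice instruction containing a target keyword, or a trigger on a target control. A hedged sketch follows; the keyword list and the instruction string are assumptions, not values from the patent:

```python
# Hypothetical sketch of the active turn-on timing of claims 2 and 3.
TARGET_TURN_ON_KEYWORDS = ("open gesture control", "start gesture mode")  # assumed

def is_active_turn_on(voice_instruction: str = "", control_instruction: str = "") -> bool:
    # Claim 2: the voice instruction contains a target turn-on keyword.
    if any(k in voice_instruction.lower() for k in TARGET_TURN_ON_KEYWORDS):
        return True
    # Claim 3: triggering the target control generated a gesture control turn-on instruction.
    return control_instruction == "gesture_control_on"

assert is_active_turn_on(voice_instruction="Please open gesture control")
assert is_active_turn_on(control_instruction="gesture_control_on")
assert not is_active_turn_on(voice_instruction="what time is it")
```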
4. The method of claim 1, wherein the turn-on timing comprises a passive turn-on timing, and wherein the starting an image acquisition unit matched with the preset gesture control mode in response to monitoring the turn-on timing of the preset gesture control mode comprises:
if a passive trigger event triggering a screen-on instruction is monitored, executing the screen-on instruction, and taking the screen-on instruction triggered by the passive trigger event as the passive turn-on timing of the preset gesture control mode;
wherein the passive trigger event comprises an incoming communication event or a message receiving event.
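A sketch of the passive turn-on timing of claim 4, with hypothetical event names: a passive trigger event lights the screen, and that screen-on instruction doubles as the turn-on timing for gesture control:

```python
# Hypothetical sketch of claim 4; the event names are illustrative only.
PASSIVE_TRIGGER_EVENTS = {"incoming_call", "message_received"}

def turn_screen_on():
    print("screen-on instruction executed")

def handle_event(event: str) -> bool:
    """Returns True when the event counts as the passive turn-on timing."""
    if event in PASSIVE_TRIGGER_EVENTS:
        turn_screen_on()   # execute the screen-on instruction
        return True        # the same instruction serves as the passive turn-on timing
    return False

assert handle_event("incoming_call")
assert not handle_event("alarm")
```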
5. The method of any one of claims 1 to 4, further comprising:
generating a turn-on identifier corresponding to the preset gesture control mode, so as to prompt the turned-on state of the preset gesture control mode through the turn-on identifier.
6. The method of claim 1, further comprising:
in response to monitoring a turn-off timing of the preset gesture control mode, closing the opened image acquisition unit.
7. The method of claim 6, wherein the turn-off timing comprises a passive turn-off timing, and wherein the closing the opened image acquisition unit in response to monitoring the turn-off timing of the preset gesture control mode comprises:
acquiring a preset time threshold;
and if the time during which the image acquisition unit detects no gesture image is greater than or equal to the time threshold, determining that the passive turn-off timing of the preset gesture control mode is monitored, and closing the opened image acquisition unit.
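A sketch of claim 7's timeout test; the 5-second threshold below is an assumed value, not one taken from the patent:

```python
# Hypothetical sketch of the passive turn-off timing of claim 7.
PRESET_TIME_THRESHOLD_S = 5.0  # assumed preset time threshold

def should_passively_close(seconds_without_gesture: float,
                           threshold: float = PRESET_TIME_THRESHOLD_S) -> bool:
    # Close the opened image acquisition unit once the no-detection time
    # reaches or exceeds the preset time threshold.
    return seconds_without_gesture >= threshold

assert should_passively_close(6.0)
assert not should_passively_close(2.0)
```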
8. The method according to claim 7, wherein after the determining that the passive turn-off timing of the preset gesture control mode is monitored and closing the opened image acquisition unit, the method further comprises:
generating prompt information indicating that the preset gesture control mode is closed, so as to prompt the turned-off state of the preset gesture control mode.
9. The method according to claim 6, wherein the turn-off timing comprises an active turn-off timing, and wherein the closing the opened image acquisition unit in response to monitoring the turn-off timing of the preset gesture control mode comprises:
if a voice instruction input through the voice assistant comprises a target turn-off keyword, or if a control instruction generated by triggering the target control is a gesture control turn-off instruction, or if an active trigger event triggering a screen-off instruction is monitored, determining that the active turn-off timing of the preset gesture control mode is monitored, and closing the opened image acquisition unit.
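Claim 9 names three active turn-off conditions. A hedged sketch, with assumed keyword and instruction values:

```python
# Hypothetical sketch of the active turn-off timing of claim 9; the keyword
# list and instruction string are assumptions, not from the patent.
TARGET_TURN_OFF_KEYWORDS = ("close gesture control", "stop gesture mode")

def is_active_turn_off(voice_instruction: str = "",
                       control_instruction: str = "",
                       screen_off_triggered: bool = False) -> bool:
    if any(k in voice_instruction.lower() for k in TARGET_TURN_OFF_KEYWORDS):
        return True               # voice instruction with a target turn-off keyword
    if control_instruction == "gesture_control_off":
        return True               # turn-off instruction from triggering the target control
    return screen_off_triggered   # active trigger event turning the screen off

assert is_active_turn_off(voice_instruction="close gesture control now")
assert is_active_turn_off(screen_off_triggered=True)
assert not is_active_turn_off()
```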
10. An interactive apparatus, comprising:
the gesture control starting module is used for starting an image acquisition unit matched with a preset gesture control mode in response to monitoring a turn-on timing of the preset gesture control mode;
the gesture control instruction matching module is used for acquiring a detected gesture image based on the image acquisition unit and matching a corresponding gesture control instruction according to the gesture image;
and the control module is used for executing a corresponding control function according to the gesture control instruction.
11. A computer-readable medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 9.
12. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the method of any of claims 1 to 9 via execution of the executable instructions.
CN202110190469.6A 2021-02-18 2021-02-18 Interaction method and device, computer readable medium and electronic equipment Active CN112860169B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110190469.6A CN112860169B (en) 2021-02-18 2021-02-18 Interaction method and device, computer readable medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110190469.6A CN112860169B (en) 2021-02-18 2021-02-18 Interaction method and device, computer readable medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN112860169A 2021-05-28
CN112860169B (en) 2024-01-12

Family

ID=75989676

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110190469.6A Active CN112860169B (en) 2021-02-18 2021-02-18 Interaction method and device, computer readable medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN112860169B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150121228A1 (en) * 2013-10-31 2015-04-30 Samsung Electronics Co., Ltd. Photographing image changes
CN104484037A (en) * 2014-12-12 2015-04-01 三星电子(中国)研发中心 Method for intelligent control by virtue of wearable device and wearable device
CN106843729A (en) * 2017-01-20 2017-06-13 宇龙计算机通信科技(深圳)有限公司 A kind of terminal control method and terminal
CN110109537A (en) * 2019-04-01 2019-08-09 努比亚技术有限公司 A kind of wearable device and its gesture identification method and computer readable storage medium

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113342251A (en) * 2021-06-10 2021-09-03 北京字节跳动网络技术有限公司 Control method, system and device based on gesture, electronic equipment and storage medium
CN115484393A (en) * 2021-06-16 2022-12-16 荣耀终端有限公司 Abnormity prompting method and electronic equipment
WO2022262453A1 (en) * 2021-06-16 2022-12-22 荣耀终端有限公司 Abnormality prompting method and electronic device
CN115484393B (en) * 2021-06-16 2023-11-17 荣耀终端有限公司 Abnormality prompting method and electronic equipment
CN114115536A (en) * 2021-11-22 2022-03-01 北京字节跳动网络技术有限公司 Interaction method, interaction device, electronic equipment and storage medium
CN114327062A (en) * 2021-12-28 2022-04-12 深圳Tcl新技术有限公司 Man-machine interaction method, device, electronic equipment, storage medium and program product
CN117032447A (en) * 2022-05-31 2023-11-10 荣耀终端有限公司 Alternate gesture interaction method and device, electronic chip and electronic equipment

Also Published As

Publication number Publication date
CN112860169B (en) 2024-01-12

Similar Documents

Publication Publication Date Title
CN112860169B (en) Interaction method and device, computer readable medium and electronic equipment
EP4030422B1 (en) Voice interaction method and device
CN104657057A (en) Terminal waking method and device
CN108135033B (en) Bluetooth connection method and mobile terminal
WO2019017687A1 (en) Method for operating speech recognition service and electronic device and server for supporting the same
CN110503959A (en) Voice recognition data distribution method, device, computer equipment and storage medium
CN111739517B (en) Speech recognition method, device, computer equipment and medium
CN107870674B (en) Program starting method and mobile terminal
CN110908513B (en) Data processing method and electronic equipment
CN107734170B (en) Notification message processing method, mobile terminal and wearable device
WO2020108101A1 (en) Virtual data processing method and apparatus, and storage medium, and terminal
CN108958587B (en) Split screen processing method and device, storage medium and electronic equipment
WO2019156412A1 (en) Method for operating voice recognition service and electronic device supporting same
CN111968641B (en) Voice assistant awakening control method and device, storage medium and electronic equipment
WO2020171548A1 (en) User input processing method and electronic device supporting same
CN112188461A (en) Control method and device for near field communication device, medium and electronic equipment
CN112995402A (en) Control method and device, computer readable medium and electronic equipment
CN113766127B (en) Mobile terminal control method and device, storage medium and electronic equipment
CN106330672B (en) Instant messaging method and system
CN113325948A (en) Air-isolated gesture adjusting method and terminal
CN110944056A (en) Interaction method, mobile terminal and readable storage medium
CN109040427B (en) Split screen processing method and device, storage medium and electronic equipment
CN111554314B (en) Noise detection method, device, terminal and storage medium
CN108089935B (en) Application program management method and mobile terminal
CN112764600B (en) Resource processing method, device, storage medium and computer equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant