CN112995402A - Control method and device, computer readable medium and electronic equipment


Info

Publication number
CN112995402A
Authority
CN
China
Prior art keywords
control instruction
wearable device
control
target data
state
Prior art date
Legal status
Pending
Application number
CN202110235742.2A
Other languages
Chinese (zh)
Inventor
冯东杰
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202110235742.2A
Publication of CN112995402A
Legal status: Pending

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W52/00: Power management, e.g. TPC [Transmission Power Control], power saving or power classes
    • H04W52/02: Power saving arrangements
    • H04W52/0209: Power saving arrangements in terminal devices
    • H04W52/0251: Power saving arrangements in terminal devices using monitoring of local events, e.g. events related to user activity
    • H04W52/0254: Power saving arrangements in terminal devices using monitoring of local events, detecting a user operation or a tactile contact or a motion of the device

Abstract

The disclosure provides a control method and apparatus, a computer-readable medium, and an electronic device, relating to the field of communications technology. The method comprises the following steps: after detecting that a communication connection is established with a wearable device, obtaining target data and transmitting it to the wearable device for processing; receiving a first control instruction sent by the wearable device based on detected distance data, and controlling the transmission state of the target data processed by the wearable device according to the first control instruction. The target data processed by the wearable device can thus be automatically paused or resumed when the user takes off or puts on the wearable device, effectively reducing the power consumption of the wearable device and improving the user experience.

Description

Control method and device, computer readable medium and electronic equipment
Technical Field
The present disclosure relates to the field of communications technologies, and in particular, to a control method, a control apparatus, a computer-readable medium, and an electronic device.
Background
As living standards continue to improve, wearable devices have become increasingly popular. Wearable devices typically take the form of portable accessories that have some computing capability and can connect to mobile phones and other terminals, such as smart glasses supporting Augmented Reality (AR) technology and smart watches supporting multimedia playback.
In the related art, when multimedia data such as audio or video from an application is played through a wearable device connected to a mobile phone, the audio or video cannot be paused or resumed when the user takes off or puts on the wearable device. The wearable device therefore remains in a working state even when not worn, which increases its power consumption and degrades the user experience.
Disclosure of Invention
The present disclosure aims to provide a control method, a control apparatus, a computer-readable medium, and an electronic device that avoid, at least to some extent, the problem in the related art that a wearable device cannot control the playing or pausing of target data transmitted by a terminal device, which leads to high power consumption when the wearable device continues playing while not worn.
According to a first aspect of the present disclosure, there is provided a control method applied to a terminal device, including:
after detecting that a communication connection is established with a wearable device, obtaining target data and transmitting the target data to the wearable device for processing;
receiving a first control instruction sent by the wearable device based on the detected distance data, and controlling a transmission state corresponding to the target data processed by the wearable device according to the first control instruction.
According to a second aspect of the present disclosure, there is provided a control method applied to a wearable device, including:
after detecting that a communication connection is established with a terminal device, acquiring distance data between the wearable device and a target object;
generating a first control instruction according to the distance data, and sending the first control instruction to the terminal device, so that the terminal device can control a transmission state corresponding to target data processed by the wearable device according to the first control instruction;
and receiving a second control instruction returned by the terminal device, so as to change the current working state of the wearable device according to the second control instruction.
According to a third aspect of the present disclosure, there is provided a control apparatus comprising:
a target data processing module, configured to obtain target data after detecting that a communication connection is established with a wearable device, and to transmit the target data to the wearable device for processing;
and a transmission state control module, configured to receive a first control instruction sent by the wearable device based on detected distance data, and to control a transmission state corresponding to the target data processed by the wearable device according to the first control instruction.
According to a fourth aspect of the present disclosure, there is provided a control apparatus comprising:
a distance data acquisition module, configured to acquire distance data between the wearable device and a target object after detecting that a communication connection is established with a terminal device;
a transmission state control module, configured to generate a first control instruction according to the distance data and to send the first control instruction to the terminal device, so that the terminal device can control a transmission state corresponding to target data processed by the wearable device according to the first control instruction;
and a working state changing module, configured to receive a second control instruction returned by the terminal device, so as to change the current working state of the wearable device according to the second control instruction.
According to a fifth aspect of the present disclosure, a computer-readable medium is provided, on which a computer program is stored, which computer program, when being executed by a processor, is adapted to carry out the above-mentioned method.
According to a sixth aspect of the present disclosure, there is provided an electronic apparatus, comprising:
a processor; and
a memory for storing one or more programs that, when executed by the processor, cause the processor to implement the above-described method.
According to the control method provided by an embodiment of the present disclosure, after a communication connection with the wearable device is detected, target data is transmitted to the wearable device for processing; a first control instruction sent by the wearable device based on detected distance data is received, and the transmission state of the target data processed by the wearable device is controlled according to the first control instruction; a second control instruction is then generated based on the first control instruction and sent to the wearable device, so that the wearable device changes its current working state accordingly. On the one hand, the first control instruction allows the terminal device to quickly determine whether the wearable device is currently worn, so that transmission of the target data can be suspended in time when the device is not worn, or resumed in time when it is worn again. The transmission state is thus controlled automatically according to the wearing state, effectively reducing the wearable device's power consumption while unworn and extending its usage time. On the other hand, after the transmission state is controlled according to the first control instruction, the second control instruction is generated and returned to the wearable device, so that the device can promptly enter a sleep state, further reducing its power consumption, or promptly enter a wake-up state when worn again, increasing its response speed, improving fluency, and ensuring a good user experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty. In the drawings:
FIG. 1 illustrates a schematic diagram of an exemplary system architecture to which embodiments of the present disclosure may be applied;
FIG. 2 shows a schematic diagram of an electronic device to which embodiments of the present disclosure may be applied;
FIG. 3 schematically illustrates a flow chart of a control method in an exemplary embodiment of the disclosure;
fig. 4 schematically illustrates a flow chart for controlling a transmission status of target data transmitted to a wearable device for processing in an exemplary embodiment of the disclosure;
fig. 5 schematically illustrates a flow chart for controlling a current operating state of a wearable device in an exemplary embodiment of the disclosure;
fig. 6 schematically illustrates a flow chart of a manner of altering a current operating state of a wearable device in an exemplary embodiment of the present disclosure;
fig. 7 schematically illustrates a flowchart for controlling switching of an audio playback master device in an exemplary embodiment of the present disclosure;
FIG. 8 schematically illustrates a flow chart of another control method in an exemplary embodiment of the disclosure;
fig. 9 schematically illustrates an interaction flow diagram of a control terminal device and a wearable device in an exemplary embodiment of the disclosure;
FIG. 10 schematically illustrates a schematic composition diagram of a control device in an exemplary embodiment of the present disclosure;
fig. 11 schematically shows a composition diagram of another control apparatus in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
Fig. 1 is a schematic diagram illustrating a system architecture of an exemplary application environment to which a control method and apparatus according to an embodiment of the present disclosure may be applied.
As shown in fig. 1, the system architecture 100 may include one or more of terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few. The terminal devices 101, 102, 103 may be various electronic devices having an image processing function, including but not limited to desktop computers, portable computers, smart phones, tablet computers, and the like. It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation. For example, server 105 may be a server cluster comprised of multiple servers, or the like.
The control method provided by the embodiment of the present disclosure is generally executed by the terminal devices 101, 102, 103, and accordingly, the control device is generally disposed in the terminal devices 101, 102, 103. However, it is easily understood by those skilled in the art that the control method provided in the embodiment of the present disclosure may also be executed by the server 105, and accordingly, the control device may also be disposed in the server 105, which is not particularly limited in the exemplary embodiment.
The exemplary embodiment of the present disclosure provides an electronic device for implementing a control method, which may be the terminal device 101, 102, 103 or the server 105 in fig. 1. The electronic device comprises at least a processor and a memory for storing executable instructions of the processor, the processor being configured to perform the control method via execution of the executable instructions.
The following takes the mobile terminal 200 in fig. 2 as an example, and exemplifies the configuration of the electronic device. It will be appreciated by those skilled in the art that the configuration of figure 2 can also be applied to fixed type devices, in addition to components specifically intended for mobile purposes. In other embodiments, mobile terminal 200 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware. The interfacing relationship between the components is only schematically illustrated and does not constitute a structural limitation of the mobile terminal 200. In other embodiments, the mobile terminal 200 may also interface differently than shown in fig. 2, or a combination of multiple interfaces.
As shown in fig. 2, the mobile terminal 200 may specifically include: a processor 210, an internal memory 221, an external memory interface 222, a Universal Serial Bus (USB) interface 230, a charging management module 240, a power management module 241, a battery 242, an antenna 1, an antenna 2, a mobile communication module 250, a wireless communication module 260, an audio module 270, a speaker 271, a receiver 272, a microphone 273, an earphone interface 274, a sensor module 280, a display 290, a camera module 291, an indicator 292, a motor 293, a button 294, and a Subscriber Identity Module (SIM) card interface 295. The sensor module 280 may include a depth sensor 2801, a pressure sensor 2802, a gyroscope sensor 2803, and the like.
Processor 210 may include one or more processing units, such as: the Processor 210 may include an Application Processor (AP), a modem Processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband Processor, and/or a Neural-Network Processing Unit (NPU), and the like. The different processing units may be separate devices or may be integrated into one or more processors.
The NPU is a Neural-Network (NN) computing processor that processes input information rapidly by drawing on the structure of biological neural networks, for example the transfer mode between neurons of the human brain, and is also capable of continuous self-learning. The NPU enables intelligent-recognition applications on the mobile terminal 200, such as image recognition, face recognition, speech recognition, and text understanding.
A memory is provided in the processor 210. The memory may store instructions for implementing six modular functions: detection instructions, connection instructions, information management instructions, analysis instructions, data transmission instructions, and notification instructions, and execution is controlled by processor 210.
The charge management module 240 is configured to receive a charging input from a charger. The power management module 241 is used for connecting the battery 242, the charging management module 240 and the processor 210. The power management module 241 receives the input of the battery 242 and/or the charging management module 240, and supplies power to the processor 210, the internal memory 221, the display screen 290, the camera module 291, the wireless communication module 260, and the like.
The wireless communication function of the mobile terminal 200 may be implemented by the antenna 1, the antenna 2, the mobile communication module 250, the wireless communication module 260, a modem processor, a baseband processor, and the like. Wherein, the antenna 1 and the antenna 2 are used for transmitting and receiving electromagnetic wave signals; the mobile communication module 250 may provide a solution including wireless communication of 2G/3G/4G/5G, etc. applied to the mobile terminal 200; the modem processor may include a modulator and a demodulator; the Wireless communication module 260 may provide a solution for Wireless communication including a Wireless Local Area Network (WLAN) (e.g., a Wireless Fidelity (Wi-Fi) network), Bluetooth (BT), and the like, applied to the mobile terminal 200. In some embodiments, antenna 1 of the mobile terminal 200 is coupled to the mobile communication module 250 and antenna 2 is coupled to the wireless communication module 260, such that the mobile terminal 200 may communicate with networks and other devices via wireless communication techniques.
The mobile terminal 200 implements a display function through the GPU, the display screen 290, the application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display screen 290 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 210 may include one or more GPUs that execute program instructions to generate or alter display information.
The mobile terminal 200 may implement a photographing function through the ISP, the camera module 291, the video codec, the GPU, the display screen 290, the application processor, and the like. The ISP is used for processing data fed back by the camera module 291; the camera module 291 is used for capturing still images or videos; the digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals; the video codec is used to compress or decompress digital video, and the mobile terminal 200 may also support one or more video codecs.
The external memory interface 222 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the mobile terminal 200. The external memory card communicates with the processor 210 through the external memory interface 222 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
Internal memory 221 may be used to store computer-executable program code, which includes instructions. The internal memory 221 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (e.g., audio data, a phonebook, etc.) created during use of the mobile terminal 200, and the like. In addition, the internal memory 221 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk Storage device, a Flash memory device, a Universal Flash Storage (UFS), and the like. The processor 210 executes various functional applications of the mobile terminal 200 and data processing by executing instructions stored in the internal memory 221 and/or instructions stored in a memory provided in the processor.
The mobile terminal 200 may implement an audio function through the audio module 270, the speaker 271, the receiver 272, the microphone 273, the earphone interface 274, the application processor, and the like. Such as music playing, recording, etc.
The depth sensor 2801 is used to acquire depth information of a scene. In some embodiments, a depth sensor may be provided to the camera module 291.
The pressure sensor 2802 is used to sense a pressure signal and convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 2802 may be disposed on the display screen 290. Pressure sensor 2802 can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like.
The gyro sensor 2803 may be used to determine a motion gesture of the mobile terminal 200. In some embodiments, the angular velocity of the mobile terminal 200 about three axes (i.e., x, y, and z axes) may be determined by the gyroscope sensor 2803. The gyro sensor 2803 can be used to photograph anti-shake, navigation, body-feel game scenes, and the like.
In addition, other functional sensors, such as an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, etc., may be provided in the sensor module 280 according to actual needs.
Other devices for providing auxiliary functions may also be included in mobile terminal 200. For example, the keys 294 include a power-on key, a volume key, and the like, and a user can generate key signal inputs related to user settings and function control of the mobile terminal 200 through key inputs. Further examples include indicator 292, motor 293, SIM card interface 295, etc.
In the related art, when a split-type wearable device such as split AR (Augmented Reality) glasses is used, after the user takes off the AR glasses, if the glasses have not entered a deep sleep state, the input and output devices for target data such as audio or video remain selected on the AR glasses and cannot be switched back to the mobile phone. In addition, when the user takes off or puts the AR glasses back on, the application (APP) playing the music or video cannot be paused or resumed; at best, only applications developed specifically for the AR glasses can be paused or resumed, not target data played by the mobile phone system or by third-party applications.
In addition, when the user takes off a wearable device such as split-type AR glasses, the audio input and output device cannot be selected on the mobile phone, and the AR glasses do not quickly enter a deep sleep state. If an incoming call or message notification arrives on the phone in the interval before deep sleep, the AR glasses, having already entered a shallow sleep state with their audio devices turned off, will not reproduce the sound. The user therefore cannot hear these events and may miss calls, discovering the event messages only by actively checking the phone, which causes great inconvenience when using the AR glasses. To solve this problem, the mobile phone, while connected to split-type AR glasses, needs to automatically select the audio input and output device according to whether the user is wearing the glasses.
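The phone-side audio routing just described can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's implementation; the names `AudioRoute` and `select_audio_route` are hypothetical.

```python
from enum import Enum

class AudioRoute(Enum):
    PHONE = "phone"        # the phone's own speaker and microphone
    GLASSES = "glasses"    # audio devices on the AR glasses

def select_audio_route(glasses_connected: bool, glasses_worn: bool) -> AudioRoute:
    """Route audio to the AR glasses only when they are both connected
    and actually being worn; otherwise fall back to the phone so calls
    and notifications remain audible."""
    if glasses_connected and glasses_worn:
        return AudioRoute.GLASSES
    return AudioRoute.PHONE

# While the glasses sit on the desk, an incoming call rings on the
# phone instead of on the glasses' (now closed) audio devices.
assert select_audio_route(True, False) is AudioRoute.PHONE
assert select_audio_route(True, True) is AudioRoute.GLASSES
```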
Meanwhile, when the user takes off or puts the AR glasses back on, the playing state of the music/video APP does not follow these actions. If the application cannot detect that the glasses have been removed, it continues playing, and when the user puts the glasses back on, playback cannot resume from the point reached when the glasses were taken off. From the standpoint of battery life, if the music/video APP cannot pause when the glasses are removed, the power consumption of the AR glasses remains high and their endurance time is reduced.
Based on one or more of the above problems, the present exemplary embodiment first provides a control method, and the following specifically describes the control method according to the exemplary embodiment of the present disclosure by taking the terminal device as an example to execute the method.
Fig. 3 shows a flow of a control method in the present exemplary embodiment, including the following steps S310 to S320:
in step S310, after detecting that a communication connection is established with a wearable device, target data is obtained and transmitted to the wearable device for processing.
In an exemplary embodiment, a wearable device is a portable mobile device that can be worn on the human body, has computing capability, and can process and play target data. For example, the wearable device may be AR glasses or a smart watch, or any other body-worn portable device capable of processing and playing target data; this exemplary embodiment is not limited in this respect.
The wearable device can communicate with the terminal device through a preset communication module, in either wired or wireless mode. For example, the communication module may be a Bluetooth module, or may use HID (Human Interface Device), a data-exchange specification in which exchanged data is organized into structures called reports; the device firmware must support the HID report format, and the host sends and requests reports over control and interrupt transfers. The report format is very flexible and can carry data of any type.
The communication connection may be any communication mode that connects the terminal device and the wearable device and carries data between them. For example, it may be a Bluetooth connection, a WiFi connection, or a connection over a 2G/3G/4G/5G mobile network; it may also be a wired connection using interface protocols such as Micro-USB, Type-C, or Thunderbolt 3. This example is not limited in this respect.
Target data is data provided by a third-party application on the terminal device for processing on the wearable device. For example, it may be audio data, video data, or data corresponding to a game application; this exemplary embodiment is not limited in this respect.
In step S320, a first control instruction sent by the wearable device based on the detected distance data is received, and a transmission state corresponding to the target data processed by the wearable device is controlled according to the first control instruction.
In an exemplary embodiment, the first control instruction is an instruction generated by the wearable device based on the detected distance data. For example, when the distance data is greater than or equal to a distance threshold, the wearable device can be considered not worn, and a "far" control instruction is generated; when the distance data is less than the distance threshold, the wearable device can be considered worn, and a "near" control instruction is generated.
Specifically, the distance data between the wearable device and the user may be detected by a proximity sensor (such as the P-Sensor in a mobile phone: when the user's face approaches the screen, the backlight is turned off and the screen locks to prevent accidental touches; when the face moves away, the backlight is turned on and the screen unlocks automatically). Alternatively, acceleration data may be obtained from an accelerometer built into the wearable device, and the distance data between the wearable device and the user derived from it. Other ways of detecting the distance data may also be used; this exemplary embodiment is not limited in this respect.
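The threshold comparison described above can be sketched in a few lines; the threshold value and the instruction names ("far"/"near") are illustrative assumptions, since the disclosure does not fix concrete values.

```python
DISTANCE_THRESHOLD = 3.0  # centimeters; illustrative value only

def first_control_instruction(distance_cm: float) -> str:
    """Generate the first control instruction from proximity data.

    Distance at or above the threshold -> device treated as not worn,
    emit a "far" instruction; below the threshold -> worn, emit "near".
    """
    return "far" if distance_cm >= DISTANCE_THRESHOLD else "near"

assert first_control_instruction(10.0) == "far"   # glasses taken off
assert first_control_instruction(0.5) == "near"   # glasses worn
```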
The transmission state is the state of data transmission for the target data being processed. For example, it may be a paused state in which the target data is paused, or a playing state in which the target data is played; this exemplary embodiment is not limited in this respect.
When the wearable device is detected to be in the non-wearing state, the target data provided by the application program in the terminal device is controlled to be in the pause state; when the wearable device is detected to have returned to the wearing state, the target data provided by the application program in the terminal device is controlled to be in the playing state. In this way, the user can resume playback from the node at which playback stopped when the wearable device was last taken off, which improves the user experience.
The control method in steps S310 to S320 will be further described below.
In an exemplary embodiment, after controlling the transmission state corresponding to the target data transmitted to the wearable device for processing according to the first control instruction, a second control instruction may be further generated based on the first control instruction and sent to the wearable device, so that the wearable device changes the current operating state according to the second control instruction.
The second control instruction is an instruction for controlling the working state of the wearable device. For example, when the first control instruction is a far-away control instruction, the wearable device may be considered to be in the non-wearing state and the user cannot browse the target data through it; in this case, the terminal device sets the target data provided by the application program to the pause state according to the first control instruction, and controls the wearable device to enter a sleep state through the generated second control instruction, thereby reducing the power consumption of the wearable device while it is idle. When the first control instruction is a proximity control instruction, the wearable device may be considered to be in the wearing state and the user needs to browse the target data through it; in this case, the terminal device sets the target data provided by the application program to the playing state according to the first control instruction, continues transmitting the target data to the wearable device for processing, and controls the wearable device to enter a wake-up state through the generated second control instruction. This increases the response speed of the wearable device when the user resumes using it, improves the fluency of operation, and improves the user experience.
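The pairing described above — the first control instruction driving both the transmission state of the target data and the second control instruction returned to the wearable device — can be sketched as follows; this is a hypothetical illustration, and all class and member names are assumptions:

```java
// Hypothetical sketch of the terminal device's handler: the first control
// instruction sets the transmission state and determines the second control
// instruction sent back to the wearable device.
public class TerminalController {
    public enum FirstInstruction { FAR_AWAY, PROXIMITY }
    public enum TransmissionState { PAUSED, PLAYING }
    public enum SecondInstruction { SLEEP, WAKE_UP }

    public TransmissionState transmissionState = TransmissionState.PLAYING;

    // Returns the second control instruction to return to the wearable device.
    public SecondInstruction handle(FirstInstruction first) {
        if (first == FirstInstruction.FAR_AWAY) {
            transmissionState = TransmissionState.PAUSED; // stop feeding target data
            return SecondInstruction.SLEEP;               // let the device save power
        } else {
            transmissionState = TransmissionState.PLAYING; // resume from the pause node
            return SecondInstruction.WAKE_UP;              // restore fast response
        }
    }
}
```

A far-away instruction thus pauses transmission and puts the device to sleep in one step, while a proximity instruction resumes transmission and wakes the device.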
In an exemplary embodiment, controlling the transmission state corresponding to the target data transmitted to the wearable device for processing according to the first control instruction may be implemented through the steps in fig. 4. Referring to fig. 4, the steps may specifically include:
step S410, in response to the first control instruction being a far-away control instruction, controlling the target data transmitted to the wearable device for processing to be in the pause state; or
step S420, in response to the first control instruction being a proximity control instruction, controlling the target data transmitted to the wearable device for processing to be in the playing state, so as to resume from the pause state.
The far-away control instruction refers to a control instruction generated when the distance data between the wearable device and the user is detected to be greater than or equal to a preset distance threshold; in that case the wearable device may be considered to be in the non-wearing state, and the control instruction is used to control the target data provided by the application program in the terminal device to be in the pause state. For example, the preset distance threshold may be 50 cm; when the distance data between the wearable device and the user is detected to be greater than or equal to 50 cm, the user may be considered to be no longer using the wearable device, the wearable device is determined to be in the non-wearing state, and the target data provided by the application program in the terminal device is controlled to be in the pause state. Of course, the preset distance threshold may also be 100 cm; the specific value may be customized according to the actual situation, which is not particularly limited in this example embodiment.
The proximity control instruction is a control instruction generated when the distance data between the wearable device and the user is detected to be smaller than the preset distance threshold; in that case the wearable device may be considered to be in the wearing state, and the proximity control instruction is used to control the target data provided by the application program in the terminal device to be in the playing state. For example, the preset distance threshold may be 50 cm; when the distance data between the wearable device and the user is detected to be smaller than 50 cm, the wearable device may be considered to be in use or about to be used, the wearable device is determined to be in the wearing state, and the target data provided by the application program in the terminal device is controlled to be in the playing state or to resume from the pause state. Of course, the preset distance threshold may also be 100 cm; the specific value may be customized according to the actual situation, which is not particularly limited in this example embodiment.
In an exemplary embodiment, generating the second control instruction for controlling the wearable device to change its current working state based on the first control instruction may be implemented through the steps in fig. 5. Referring to fig. 5, the steps may specifically include:
step S510, in response to the first control instruction being a far-away control instruction, generating a sleep control instruction, so that the wearable device changes the current working state to the sleep state according to the sleep control instruction; or
step S520, in response to the first control instruction being a proximity control instruction, generating a wake-up control instruction, so that the wearable device changes the current working state to the wake-up state according to the wake-up control instruction.
The sleep control instruction is an instruction for controlling the wearable device to enter the sleep state. After receiving the sleep control instruction, the wearable device turns off the modules that do not need to run for the time being, such as the audio playing unit and the display unit, and keeps only the necessary modules, such as the communication unit, running. Of course, which units the sleep state turns off or keeps running may be customized, which is not limited in this example embodiment.
The wake-up control instruction refers to an instruction for controlling the wearable device to enter the wake-up state. After receiving the wake-up control instruction, the wearable device resumes the operation of the modules that were turned off by the sleep control instruction, so that it can quickly respond to various operation instructions.
In an exemplary embodiment, the transmission state corresponding to the target data transmitted to the wearable device for processing may be changed through the steps in fig. 6. Referring to fig. 6, the steps specifically include:
step S610, in response to the first control instruction, acquiring the transmission state change control coordinates corresponding to the application program providing the target data;
step S620, executing a click event at the transmission state change control coordinates, so as to trigger the transmission state change control according to the click event and control the transmission state corresponding to the target data transmitted to the wearable device for processing.
The transmission state change control is a control provided in the application program for controlling the playing, pausing, fast-forwarding or rewinding of the target data, and the transmission state change control coordinates are the position coordinates of that control on the interactive interface of the application program.
The click event refers to an event that simulates the user manually clicking the touch screen of the terminal device, generated by calling an interface provided by a test framework in the operating system (such as the Android operating system), so as to trigger the transmission state change control.
When the target data is in a playing state, clicking operation can be executed at the coordinates of the transmission state change control through a clicking event so as to switch the transmission state of the target data to a pause state; when the target data is in the pause state, the click operation can be executed at the coordinate of the transmission state change control through the click event so as to switch the transmission state of the target data to the play state.
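A self-contained sketch of steps S610 to S620 under stated assumptions: the control coordinates are hard-coded example values, the application name is invented, and the injected click is stubbed as a plain method call rather than a real test-framework interface:

```java
// Illustrative sketch: look up the coordinates of the application's
// transmission state change control and dispatch a simulated click there;
// a click that hits the control toggles the transmission state.
import java.util.HashMap;
import java.util.Map;

public class ClickSimulator {
    public enum State { PLAYING, PAUSED }

    // App name -> (x, y) of its transmission state change control (assumed values).
    static final Map<String, int[]> CONTROL_COORDS = new HashMap<>();
    static { CONTROL_COORDS.put("music_app", new int[] {540, 1800}); }

    State state = State.PLAYING;

    // Simulated click event: only a click at the control's coordinates toggles.
    public State clickAt(String app, int x, int y) {
        int[] c = CONTROL_COORDS.get(app);
        if (c != null && c[0] == x && c[1] == y) {
            state = (state == State.PLAYING) ? State.PAUSED : State.PLAYING;
        }
        return state;
    }
}
```

Clicking the control while playing pauses playback, clicking again resumes it, and a click anywhere else leaves the state untouched, mirroring the toggle behavior described above.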
In an exemplary embodiment, the system broadcast data may be further generated in response to the first control instruction, and the application program providing the target data may be further controlled according to the system broadcast data, so as to change a transmission state corresponding to the target data transmitted to the wearable device for processing.
The system Broadcast data refers to a Broadcast message (Broadcast) used by the operating system background to control each application program, for example, the system Broadcast data may be a standard Broadcast (a Broadcast executed completely asynchronously) or an ordered Broadcast (a Broadcast executed synchronously), which is not limited in this exemplary embodiment.
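The difference between the two broadcast kinds can be illustrated with a minimal, non-Android simulation; the receiver behavior and method names here are invented for illustration. In an ordered broadcast, receivers are visited by priority and a receiver may consume the broadcast so that lower-priority receivers never see it:

```java
// Minimal simulation (not Android code) of ordered-broadcast delivery:
// receivers are visited in priority order, and a receiver returning true
// consumes the broadcast, stopping further delivery.
import java.util.ArrayList;
import java.util.List;

public class BroadcastDemo {
    interface Receiver { boolean onReceive(String action); } // true = consume

    // Returns the indices of the receivers that actually saw the broadcast.
    public static List<Integer> sendOrdered(List<Receiver> byPriority, String action) {
        List<Integer> delivered = new ArrayList<>();
        for (int i = 0; i < byPriority.size(); i++) {
            delivered.add(i);
            if (byPriority.get(i).onReceive(action)) break; // consumed: stop here
        }
        return delivered;
    }
}
```

A standard (completely asynchronous) broadcast would instead deliver to every receiver unconditionally; the ordered variant is what allows one application to intercept the state-change message before others.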
In an exemplary embodiment, the switching of the audio playing device may be implemented through the steps in fig. 7. Referring to fig. 7, the steps may specifically include:
step S710, in response to the first control instruction being a far-away control instruction, generating a first parameter, where the first parameter is used to set the terminal device as the audio playing master device; or
step S720, in response to the first control instruction being a proximity control instruction, generating a second parameter, where the second parameter is used to set the wearable device as the audio playing master device.
The first parameter is a parameter set through a preset interface and is used to set the terminal device as the audio playing master device; for example, when the terminal device detects that the received first control instruction is a far-away control instruction, it sets the first parameter through the setParameter interface in AudioManager.
The second parameter refers to a parameter set through a preset interface and is used to set the wearable device as the audio playing master device; for example, when the terminal device detects that the received first control instruction is a proximity control instruction, it sets the second parameter through the setParameter interface in AudioManager.
The audio playing master device is the device that plays all events carrying a prompt tone; for example, an event carrying a prompt tone may be an incoming-call prompt event, an alarm-clock prompt event, or another event carrying a prompt tone, which is not limited in this example embodiment.
Switching the audio playing master device used for playing all events carrying prompt tones through the first parameter and the second parameter effectively avoids the problem that the user misses a prompt tone when an event carrying a prompt tone is played through the wearable device while the wearable device is not being worn.
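A hedged sketch of the parameter-based switch in steps S710 and S720, modeling the parameter as a key=value string; the key name `audio_master` and the parsing logic are assumptions, not taken from the patent:

```java
// Illustrative sketch: the first/second parameter is modeled as a string
// of the form "audio_master=<device>", generated from the first control
// instruction and parsed to select the audio playing master device.
public class AudioMasterSwitch {
    public enum Device { TERMINAL, WEARABLE }

    // far-away -> first parameter (terminal plays prompt tones);
    // proximity -> second parameter (wearable plays prompt tones).
    public static String parameterFor(boolean farAway) {
        return farAway ? "audio_master=terminal" : "audio_master=wearable";
    }

    public static Device parse(String parameter) {
        return parameter.endsWith("terminal") ? Device.TERMINAL : Device.WEARABLE;
    }
}
```

In the real system the string would be handed to a setParameter-style interface and resolved further down the audio stack; here the round trip from instruction to selected device is kept in one place for clarity.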
The present exemplary embodiment further provides a control method, and another control method according to the exemplary embodiment of the present disclosure is specifically described below by taking the wearable device executing the method as an example.
Fig. 8 shows a flow of a control method in the present exemplary embodiment, including the following steps S810 to S830:
step S810, after detecting that the communication connection is established with the terminal equipment, acquiring distance data between the terminal equipment and a target object;
step S820, generating a first control instruction according to the distance data, and sending the first control instruction to the terminal device, wherein the terminal device can control a transmission state corresponding to target data processed by the wearable device according to the first control instruction;
and step S830, receiving a second control instruction returned by the terminal equipment, so as to change the current working state according to the second control instruction.
The target object is an object that has a relationship with the wearable device and is used to identify a wearing condition of the wearable device, for example, the target object may be a user, or may be a positioning device (such as a smart ring, a smart necklace, and the like having a low power consumption positioning module) worn by the user, and of course, the target object may also be another object used to identify the wearing condition of the wearable device, which is not particularly limited in this example embodiment.
The distance data between the wearable device and the target object (such as the user) may be detected by a distance sensor in the wearable device. Acceleration data of the wearable device may also be detected by a gyroscope sensor in the wearable device, and the distance data may then be calculated from the acceleration data. The distance data between the wearable device and a target object such as a positioning device worn by the user may also be measured by a positioning module in the wearable device, such as an Ultra Wide Band (UWB) module. This example embodiment is not particularly limited in this regard.
In an exemplary embodiment, the distance data to the target object may be detected and acquired by the distance sensor in the wearable device; or acceleration data of the wearable device may be acquired by the gyroscope sensor, and the distance data between the wearable device and the target object may be calculated from the acceleration data. Of course, the final distance data may also be determined from both the distance data detected by the distance sensor and the distance data calculated from the acceleration data, using a statistical characteristic of the two such as their average value or minimum value, so as to improve the accuracy of the distance data.
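The fusion described above can be sketched directly; the method names are illustrative, and the choice between mean and minimum is left to the caller:

```java
// Sketch of combining the distance measured by the distance sensor with the
// distance derived from acceleration data, using the mean or the minimum.
public class DistanceFusion {
    // Mean of the two readings: smooths out noise in either source.
    public static double fuseMean(double sensorCm, double accelDerivedCm) {
        return (sensorCm + accelDerivedCm) / 2.0;
    }

    // Minimum of the two readings: conservative, favors the wearing state.
    public static double fuseMin(double sensorCm, double accelDerivedCm) {
        return Math.min(sensorCm, accelDerivedCm);
    }
}
```

Taking the minimum biases the decision toward treating the device as worn (the proximity side of the threshold), while the mean treats both sources as equally trustworthy.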
In an exemplary embodiment, the first control instruction may be generated according to the detected touch operation, for example, the first control instruction may be generated according to a double-click operation of a user at a corresponding touch unit (TP) of the wearable device.
Alternatively, the first control instruction may be generated according to the wearing state of an associated device. For example, the first control instruction may be generated according to the wearing condition of a finger ring (an AR accessory paired with AR glasses) associated with the wearable device (e.g., AR glasses). When the user moves away from the AR glasses and takes them off, the associated finger ring, being paired with the AR glasses, also needs to be taken off; when the finger ring is in the non-wearing state, it is therefore determined that the AR glasses are also in the non-wearing state. Conversely, when the user approaches the AR glasses to use them, the associated finger ring needs to be worn; when the finger ring is in the wearing state, it indicates that the AR glasses are also in the wearing state.
Fig. 9 schematically shows an interaction flowchart of a control terminal device and a wearable device in an exemplary embodiment of the disclosure.
Referring to fig. 9, in step S901, detecting whether the terminal device is in communication connection with the wearable device, if a communication connection is established between the terminal device and the wearable device, executing step S902, otherwise, ending the current flow;
step S902, detecting whether the distance data acquired by the wearable device is equal to or greater than a distance threshold, if the distance data is equal to or greater than the distance threshold, executing step S903, otherwise executing step S907;
step S903, if the distance data is equal to or greater than the distance threshold, the wearable device may be considered to be in the non-wearing state; the wearable device then sends a far-away control instruction to the terminal device;
step S904, the terminal device responds to the far-away control instruction sent by the wearable device, generates a sleep control instruction, and returns the sleep control instruction to the wearable device;
step S905, judging whether target data are transmitted in the terminal equipment, if so, executing step S906, otherwise, ending the current process;
step S906, switching the transmission state of the target data to a pause state;
step S907, if the distance data is smaller than the distance threshold, the wearable device can be considered to be in a wearing state at present, and the wearable device sends a proximity control instruction to the terminal device;
step S908, the terminal device responds to the proximity control instruction sent by the wearable device, generates a wake-up control instruction, and returns the wake-up control instruction to the wearable device;
step S909, switching or restoring the transmission state of the target data to the play state;
step S910, based on the far-away control instruction or the proximity control instruction, the terminal device invokes a preset interface to generate the first parameter or the second parameter; for example, the terminal device sets the first parameter or the second parameter through the setParameter interface in AudioManager;
Step S911, the audio playing master device is switched according to the passed first parameter or second parameter; for example, based on the setParameter interface, the terminal device may call the setParameter method down to the native layer through Binder and JNI, and parse the first parameter or second parameter set in AudioManager;
Step S912, when the first parameter is detected, the terminal equipment is selected to be switched to the audio playing main equipment;
step S913, when the second parameter is detected, selecting the wearable device to switch to the audio playing main device;
step S914, the driver bottom layer completes the device switching; for example, the switching may be passed to the HAL layer library through a HIDL call and then down to the driver bottom layer, and the current flow ends.
in summary, in the exemplary embodiment, after detecting that the communication connection is established with the wearable device, the target data is transmitted to the wearable device for processing; receiving a first control instruction sent by the wearable device based on the detected distance data, and controlling a transmission state corresponding to target data processed by the wearable device according to the first control instruction; and generating a second control instruction based on the first control instruction, and sending the second control instruction to the wearable device, so that the wearable device changes the current working state according to the second control instruction. On one hand, according to a first control instruction sent by the wearable device based on the detected distance data, the current wearing condition of the wearable device can be quickly determined, and further, according to the transmission state of the target data of the first control instruction, the transmission of the target data can be timely suspended when the wearable device is not worn, or the transmission of the target data can be timely recovered when the wearable device is worn again, so that the transmission state of the target data can be automatically controlled according to the wearing condition, the power consumption of the wearable device under the non-wearing condition is effectively reduced, and the use duration is prolonged; on the other hand, after the transmission state corresponding to the target data is controlled according to the first control instruction, a second control instruction is generated and returned to the wearable device, so that the wearable device can timely enter a sleep state, the power consumption of the wearable device is further reduced, or the wearable device can timely enter a wake-up state when being worn again, the response speed of the wearable device is increased, the fluency is improved, and the use experience of a user 
is guaranteed.
It is noted that the above-mentioned figures are merely schematic illustrations of processes involved in methods according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Further, referring to fig. 10, the present exemplary embodiment further provides a control apparatus 1000, which includes a target data processing module 1010 and a transmission status control module 1020. Wherein:
the target data processing module 1010 is configured to, after detecting that communication connection is established with a wearable device, acquire target data and transmit the target data to the wearable device for processing;
the transmission state control module 1020 is configured to receive a first control instruction sent by the wearable device based on the detected distance data, and control a transmission state corresponding to the target data processed by being transmitted to the wearable device according to the first control instruction.
In an exemplary embodiment, the control device 1000 may further include an operating state control module, and the operating state control module may be configured to:
generating a second control instruction based on the first control instruction, and sending the second control instruction to the wearable device, so that the wearable device changes the current working state according to the second control instruction.
In an exemplary embodiment, the transmission state control module 1020 may further be configured to:
in response to the first control instruction being a far-away control instruction, controlling the target data transmitted to the wearable device for processing to be in the pause state.
In an exemplary embodiment, the transmission state control module 1020 may further be configured to:
in response to the first control instruction being a proximity control instruction, controlling the target data transmitted to the wearable device for processing to be in the playing state, so as to resume from the pause state.
In an exemplary embodiment, the operating state control module 1030 is further configured to:
in response to the first control instruction being a far-away control instruction, generating the sleep control instruction, so that the wearable device changes the current working state to the sleep state according to the sleep control instruction; or
in response to the first control instruction being a proximity control instruction, generating the wake-up control instruction, so that the wearable device changes the current working state to the wake-up state according to the wake-up control instruction.
In an exemplary embodiment, the transmission state control module 1020 may further be configured to:
in response to the first control instruction, acquiring the transmission state change control coordinates corresponding to the application program providing the target data;
executing a click event at the transmission state change control coordinates, so as to trigger the transmission state change control according to the click event and control the transmission state corresponding to the target data transmitted to the wearable device for processing.
In an exemplary embodiment, the transmission state control module 1020 may further be configured to:
in response to the first control instruction, generating system broadcast data;
controlling the application program providing the target data according to the system broadcast data, so as to change the transmission state corresponding to the target data transmitted to the wearable device for processing.
In an exemplary embodiment, the control apparatus 1000 further includes an audio playing master switching unit, and the audio playing master switching unit may be configured to:
in response to the first control instruction being a far-away control instruction, generating a first parameter; or
in response to the first control instruction being a proximity control instruction, generating a second parameter;
the first parameter is used to set the terminal device as the audio playing master device, and the second parameter is used to set the wearable device as the audio playing master device.
Referring to fig. 11, the exemplary embodiment further provides a control device 1100, which includes a distance data obtaining module 1110, a transmission state control module 1120, and an operating state changing module 1130. Wherein:
the distance data acquiring module 1110 is configured to acquire distance data between the terminal device and a target object after detecting that a communication connection is established with the terminal device;
the transmission state control module 1120 is configured to generate a first control instruction according to the distance data, and send the first control instruction to the terminal device, where the terminal device can control a transmission state corresponding to target data processed by being transmitted to the wearable device according to the first control instruction;
the working state changing module 1130 is configured to receive a second control instruction returned by the terminal device, so as to change the current working state according to the second control instruction.
In an exemplary embodiment, the distance data acquisition module 1110 may be further configured to:
detecting and acquiring the distance data to the target object through the distance sensor; and/or
acquiring acceleration data through the gyroscope sensor, and determining the distance data to the target object according to the acceleration data.
In an exemplary embodiment, the transmission status control module 1120 is further configured to:
acquiring a preset distance threshold;
generating the far-away control instruction in response to the distance data being equal to or greater than the distance threshold;
generating the proximity control instruction in response to the distance data being smaller than the distance threshold.
In an exemplary embodiment, the control device 1100 further comprises a control instruction generation unit, the control instruction generation unit is further configured to:
generating the first control instruction according to a detected touch operation or the wearing state of an associated device.
The specific details of each module in the above apparatus have been described in detail in the method section, and details that are not disclosed may refer to the method section, and thus are not described again.
As will be appreciated by one skilled in the art, aspects of the present disclosure may be embodied as a system, method or program product. Accordingly, various aspects of the present disclosure may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.) or an embodiment combining hardware and software aspects that may all generally be referred to herein as a "circuit," module "or" system.
Exemplary embodiments of the present disclosure also provide a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, various aspects of the disclosure may also be implemented in the form of a program product including program code for causing a terminal device to perform the steps according to various exemplary embodiments of the disclosure described in the above-mentioned "exemplary methods" section of this specification, when the program product is run on the terminal device, for example, any one or more of the steps in fig. 3 to 9 may be performed.
It should be noted that the computer readable media shown in the present disclosure may be computer readable signal media or computer readable storage media or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Furthermore, program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C + + or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (16)

1. A control method, applied to a terminal device, the method comprising:
obtaining target data after detecting that a communication connection is established with a wearable device, and transmitting the target data to the wearable device for processing;
receiving a first control instruction sent by the wearable device based on detected distance data, and controlling, according to the first control instruction, a transmission state corresponding to the target data processed by the wearable device.
2. The method of claim 1, further comprising:
generating a second control instruction based on the first control instruction, and sending the second control instruction to the wearable device, so that the wearable device changes a current working state according to the second control instruction.
3. The method of claim 1, wherein the first control instruction comprises a far control instruction, and the transmission state comprises a pause state;
wherein controlling, according to the first control instruction, the transmission state corresponding to the target data transmitted to the wearable device for processing comprises:
in response to the first control instruction being a far control instruction, controlling the target data transmitted to the wearable device for processing to be in the pause state.
4. The method of claim 3, wherein the first control instruction comprises an approach control instruction, and the transmission state comprises a playing state;
wherein, after controlling the target data transmitted to the wearable device for processing to be in the pause state in response to the first control instruction being a far control instruction, the method further comprises:
in response to the first control instruction being an approach control instruction, controlling the target data transmitted to the wearable device for processing to be in the playing state, so as to resume from the pause state.
5. The method of claim 2, wherein the second control instruction comprises a sleep control instruction and a wake-up control instruction, and wherein generating the second control instruction based on the first control instruction, so that the wearable device changes the current working state according to the second control instruction, comprises:
in response to the first control instruction being a far control instruction, generating the sleep control instruction, so that the wearable device changes the current working state to a sleep state according to the sleep control instruction; or
in response to the first control instruction being an approach control instruction, generating the wake-up control instruction, so that the wearable device changes the current working state to a wake-up state according to the wake-up control instruction.
6. The method according to claim 3 or 4, wherein controlling, according to the first control instruction, the transmission state corresponding to the target data transmitted to the wearable device for processing comprises:
in response to the first control instruction, acquiring coordinates of a transmission state change control of an application program providing the target data;
executing a click event at the coordinates of the transmission state change control, so as to trigger the transmission state change control according to the click event and control the transmission state corresponding to the target data transmitted to the wearable device for processing.
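A minimal sketch of the click-event technique in claim 6, under the assumption of a hypothetical coordinate table and a tap-injection callback; neither the package name, the coordinates, nor `inject_tap` comes from the patent.

```python
# Hypothetical table: app package name -> (x, y) of its play/pause control,
# in screen pixels. Illustrative values only.
CONTROL_COORDINATES = {
    "com.example.musicplayer": (540, 1820),
}

def trigger_state_change(app: str, inject_tap) -> bool:
    """Execute a click event at the app's transmission state change control."""
    coords = CONTROL_COORDINATES.get(app)
    if coords is None:
        return False                # no registered control for this app
    inject_tap(*coords)             # simulate a tap at those coordinates
    return True
```

The tap toggles the app's own play/pause control, so the app changes the transmission state without any app-specific integration on the terminal side.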
7. The method according to claim 3 or 4, wherein controlling, according to the first control instruction, the transmission state corresponding to the target data transmitted to the wearable device for processing comprises:
in response to the first control instruction, generating system broadcast data;
controlling, according to the system broadcast data, the application program providing the target data, so as to change the transmission state corresponding to the target data transmitted to the wearable device for processing.
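Claim 7's broadcast-based variant can be illustrated with a toy publish/subscribe dispatcher standing in for the operating system's broadcast mechanism; the function names and the "MEDIA_PAUSE" action are illustrative assumptions, not from the patent.

```python
# Toy stand-in for the OS broadcast mechanism.
_listeners = {}

def register_receiver(action, callback):
    """The app providing the target data registers for a broadcast action."""
    _listeners.setdefault(action, []).append(callback)

def send_broadcast(action, extras):
    """The terminal emits system broadcast data; registered apps react to it."""
    for callback in _listeners.get(action, []):
        callback(extras)
```

Here the app would register a receiver for a pause action, and the terminal would broadcast that action when a far control instruction arrives, rather than injecting a click event as in claim 6.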
8. The method of claim 2, further comprising:
in response to the first control instruction being a far control instruction, generating a first parameter; or
in response to the first control instruction being an approach control instruction, generating a second parameter;
wherein the first parameter is used for setting the terminal device as an audio playing main device, and the second parameter is used for setting the wearable device as the audio playing main device.
9. A control method, applied to a wearable device, the method comprising:
acquiring distance data between the wearable device and a target object after detecting that a communication connection is established with a terminal device;
generating a first control instruction according to the distance data, and sending the first control instruction to the terminal device, wherein the terminal device is capable of controlling, according to the first control instruction, a transmission state corresponding to target data processed by the wearable device;
receiving a second control instruction returned by the terminal device, so as to change a current working state according to the second control instruction.
10. The method of claim 9, wherein the wearable device comprises a distance sensor and a gyroscope sensor, and wherein acquiring the distance data between the wearable device and the target object comprises:
detecting and acquiring, by the distance sensor, distance data between the target object and the distance sensor; and/or
acquiring acceleration data by the gyroscope sensor, and determining distance data between the gyroscope sensor and the target object according to the acceleration data.
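The acceleration branch of claim 10 presumably derives a displacement by integrating acceleration over time. Below is a naive double-integration sketch (trapezoidal rule) under stated assumptions: fixed sampling interval, and no gravity compensation or drift correction, both of which a real implementation would need.

```python
def displacement_from_acceleration(samples, dt):
    """Double-integrate acceleration samples (m/s^2) taken every dt seconds.

    velocity = integral of acceleration; displacement = integral of velocity.
    Gravity compensation and sensor-drift correction are deliberately omitted.
    """
    velocity = 0.0
    displacement = 0.0
    prev_a = 0.0
    for a in samples:
        prev_v = velocity
        velocity += 0.5 * (prev_a + a) * dt            # v = integral of a dt
        displacement += 0.5 * (prev_v + velocity) * dt  # x = integral of v dt
        prev_a = a
    return displacement
```

The resulting displacement could then be combined with (or substituted for) the distance sensor reading as the distance data of claim 9.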
11. The method of claim 9, wherein the first control instruction comprises a far control instruction and an approach control instruction, and wherein generating the first control instruction according to the distance data comprises:
acquiring a preset distance threshold;
generating the far control instruction in response to the distance data being greater than or equal to the distance threshold; and
generating the approach control instruction in response to the distance data being smaller than the distance threshold.
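The threshold comparison in claim 11 reduces to a single predicate. In this sketch the function name, the string return values, and the 0.05 m default threshold are all illustrative assumptions; the patent does not specify a threshold value.

```python
def generate_first_control_instruction(distance_m, threshold_m=0.05):
    """Compare the detected distance against a preset threshold (per claim 11).

    Returns "far" (far control instruction) when the target object is at or
    beyond the threshold, otherwise "approach" (approach control instruction).
    """
    return "far" if distance_m >= threshold_m else "approach"
```

The wearable device would evaluate this on each distance reading and send the resulting instruction to the terminal device.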
12. The method of claim 9, further comprising:
generating the first control instruction according to a detected touch operation or a wearing state of an associated device.
13. A control device, comprising:
a target data processing module, configured to obtain target data after detecting that a communication connection is established with a wearable device, and transmit the target data to the wearable device for processing; and
a transmission state control module, configured to receive a first control instruction sent by the wearable device based on detected distance data, and control, according to the first control instruction, a transmission state corresponding to the target data processed by the wearable device.
14. A control device, comprising:
a distance data acquisition module, configured to acquire distance data between a wearable device and a target object after detecting that a communication connection is established with a terminal device;
a transmission state control module, configured to generate a first control instruction according to the distance data and send the first control instruction to the terminal device, wherein the terminal device is capable of controlling, according to the first control instruction, a transmission state corresponding to target data processed by the wearable device; and
a working state changing module, configured to receive a second control instruction returned by the terminal device, so as to change a current working state according to the second control instruction.
15. A computer-readable medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the method according to any one of claims 1 to 8, or the method according to any one of claims 9 to 12.
16. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the method of any of claims 1 to 8, or the method of any of claims 9 to 12, via execution of the executable instructions.
CN202110235742.2A 2021-03-03 2021-03-03 Control method and device, computer readable medium and electronic equipment Pending CN112995402A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110235742.2A CN112995402A (en) 2021-03-03 2021-03-03 Control method and device, computer readable medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110235742.2A CN112995402A (en) 2021-03-03 2021-03-03 Control method and device, computer readable medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN112995402A true CN112995402A (en) 2021-06-18

Family

ID=76352389

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110235742.2A Pending CN112995402A (en) 2021-03-03 2021-03-03 Control method and device, computer readable medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN112995402A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113741634A (en) * 2021-08-30 2021-12-03 海信视像科技股份有限公司 State control method based on wearable device and wearable device


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103024193A (en) * 2012-12-25 2013-04-03 北京百度网讯科技有限公司 Mobile terminal and play control method and play control device for same
US20160349519A1 (en) * 2015-05-29 2016-12-01 Shenzhen Royole Technologies Co. Ltd. Display module assembly and an electronic device using the same
CN107003516A (en) * 2015-05-29 2017-08-01 深圳市柔宇科技有限公司 Display module and the electronic installation with the display module
CN108646316A (en) * 2018-05-02 2018-10-12 四川斐讯信息技术有限公司 A kind of wearing state recognition methods of wearable device and wearable device
CN108769387A (en) * 2018-05-03 2018-11-06 Oppo广东移动通信有限公司 Application control method and relevant device
CN108966067A (en) * 2018-06-07 2018-12-07 Oppo广东移动通信有限公司 Control method for playing back and Related product


Similar Documents

Publication Publication Date Title
CN110022489B (en) Video playing method, device and storage medium
CN108093307B (en) Method and system for acquiring playing file
US20230010969A1 (en) Voice information processing method and electronic device
CN111968641B (en) Voice assistant awakening control method and device, storage medium and electronic equipment
CN112860169B (en) Interaction method and device, computer readable medium and electronic equipment
CN113778663A (en) Scheduling method of multi-core processor and electronic equipment
CN112860428A (en) High-energy-efficiency display processing method and equipment
CN112527174A (en) Information processing method and electronic equipment
CN113238727A (en) Screen switching method and device, computer readable medium and electronic equipment
CN112188461A (en) Control method and device for near field communication device, medium and electronic equipment
CN111432245B (en) Multimedia information playing control method, device, equipment and storage medium
CN111276122A (en) Audio generation method and device and storage medium
CN110493635B (en) Video playing method and device and terminal
US9749455B2 (en) Electronic device and method for sending messages using the same
CN113766127B (en) Mobile terminal control method and device, storage medium and electronic equipment
CN112770177B (en) Multimedia file generation method, multimedia file release method and device
CN112995402A (en) Control method and device, computer readable medium and electronic equipment
CN111104827A (en) Image processing method and device, electronic equipment and readable storage medium
CN109714628B (en) Method, device, equipment, storage medium and system for playing audio and video
CN116301290A (en) Screen state control method and device, electronic equipment and storage medium
CN113407318B (en) Operating system switching method and device, computer readable medium and electronic equipment
CN113325948B (en) Air-isolated gesture adjusting method and terminal
CN113763932A (en) Voice processing method and device, computer equipment and storage medium
CN111444289A (en) Incidence relation establishing method
KR102192027B1 (en) Method for providing contents based on inference engine and electronic device using the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination