CN117572966A - Device control method, device, electronic device and readable storage medium - Google Patents

Device control method, device, electronic device and readable storage medium

Info

Publication number
CN117572966A
Authority
CN
China
Prior art keywords
wearable device
target
gesture
module
recognition module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311514403.3A
Other languages
Chinese (zh)
Inventor
郭建珲
陈国强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN202311514403.3A
Publication of CN117572966A
Pending legal-status Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition

Abstract

The application discloses a device control method, an apparatus, an electronic device and a readable storage medium, applied to at least two communicatively connected wearable devices, where each wearable device includes a gesture recognition module; the application belongs to the technical field of communication. The method comprises the following steps: determining the working state of a first wearable device in a case where the first wearable device and a second wearable device are worn by a user, where the first wearable device is worn on the user's wrist; in a case where the at least two wearable devices detect a target gesture and the working state of the first wearable device is an awakened state, determining a target wearable device among the at least two wearable devices, where the target wearable device is the wearable device at which the user is gazing; starting the gesture recognition module of the target wearable device, and recognizing the target gesture through the gesture recognition module to obtain target gesture information; and controlling the target wearable device according to the target gesture information.

Description

Device control method, device, electronic device and readable storage medium
Technical Field
The application belongs to the technical field of communication, and particularly relates to a device control method, an apparatus, an electronic device and a readable storage medium.
Background
With the rapid development of new smart terminal devices, wearable devices are widely used, and in some scenarios a user wears at least two wearable devices. When the user wants to control one of the at least two wearable devices through a gesture, all of the at least two wearable devices may recognize the user's gesture, so misoperation easily occurs among them, which degrades the user experience.
Therefore, in a case where a user wears at least two wearable devices, misoperation easily occurs among the at least two wearable devices.
Disclosure of Invention
An object of the embodiments of the present application is to provide a device control method, an apparatus, an electronic device, and a readable storage medium, which can solve the problem that misoperation easily occurs among at least two wearable devices.
In a first aspect, an embodiment of the present application provides a device control method, applied to at least two communicatively connected wearable devices, where each wearable device includes a gesture recognition module, the method comprising:
determining the working state of a first wearable device in a case where it is detected that the first wearable device and a second wearable device are worn by a user, where the first wearable device is worn on the user's wrist;
in a case where the at least two wearable devices detect a target gesture and the working state of the first wearable device is an awakened state, determining a target wearable device among the at least two wearable devices, where the target wearable device is the wearable device at which the user is gazing;
starting the gesture recognition module of the target wearable device, and recognizing the target gesture through the gesture recognition module to obtain target gesture information;
and controlling the target wearable device according to the target gesture information.
In a second aspect, an embodiment of the present application provides a device control apparatus, applied to at least two communicatively connected wearable devices, where each wearable device includes a gesture recognition module, and the apparatus includes:
a determining module, configured to determine the working state of a first wearable device when it is detected that the first wearable device and a second wearable device are worn by a user, where the first wearable device is worn on the user's wrist;
the determining module is further configured to determine, in a case where the at least two wearable devices detect a target gesture and the working state of the first wearable device is an awakened state, a target wearable device, where the target wearable device is the wearable device at which the user is gazing;
a recognition module, configured to start the gesture recognition module of the target wearable device, and recognize the target gesture through the gesture recognition module to obtain target gesture information;
and a control module, configured to control the target wearable device according to the target gesture information.
In a third aspect, embodiments of the present application provide an electronic device comprising a processor and a memory storing a program or instructions executable on the processor, which when executed by the processor, implement the steps of the method as described in the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium having stored thereon a program or instructions which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, embodiments of the present application provide a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and where the processor is configured to execute a program or instructions to implement a method according to the first aspect.
In a sixth aspect, embodiments of the present application provide a computer program product stored in a storage medium, the program product being executable by at least one processor to implement the method according to the first aspect.
In the embodiments of the present application, in a case where a first wearable device and a second wearable device are worn by a user, the working state of the first wearable device, which is worn on the user's wrist, is determined. In a case where the at least two wearable devices detect a target gesture and the working state of the first wearable device is an awakened state, in order to accurately identify the device that the user's target gesture is intended to control, the wearable device at which the user is gazing is taken as the target wearable device among the at least two wearable devices, so that the intended device can be determined effectively and accurately. Then the gesture recognition module of the target wearable device is started, and the target gesture is recognized through that module to obtain target gesture information, so the target gesture can be recognized accurately based on the gesture recognition module of the target wearable device. Finally, the target wearable device at which the user is gazing is controlled according to the target gesture information, which prevents misoperation of the wearable devices other than the target wearable device and improves the accuracy of device control.
Drawings
Fig. 1 is a schematic view of a wearing scenario of a first wearable device and a second wearable device provided in an embodiment of the present application;
FIG. 2 is a schematic diagram of a method for controlling a device according to an embodiment of the present application;
fig. 3 is an interaction schematic diagram of a first wearable device and a second wearable device provided in an embodiment of the present application;
fig. 4 is a block diagram of an apparatus control device according to an embodiment of the present application;
fig. 5 is one of the hardware structural diagrams of the electronic device according to the embodiment of the present application;
fig. 6 is a second schematic diagram of a hardware structure of the electronic device according to the embodiment of the present application.
Detailed Description
Technical solutions of the embodiments of the present application will be described clearly below with reference to the accompanying drawings of the embodiments of the present application. It is apparent that the described embodiments are some, but not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application fall within the protection scope of the present application.
The terms "first", "second" and the like in the description and the claims are used to distinguish between similar objects and are not necessarily used to describe a particular order or sequence. It is to be understood that the terms so used are interchangeable where appropriate, so that the embodiments of the present application can be implemented in orders other than those illustrated or described herein. The objects distinguished by "first", "second", etc. are generally of one type, and the number of objects is not limited; for example, the first object may be one or more than one. In addition, in the description and the claims, "and/or" denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the associated objects.
The device control method provided by the embodiment of the application can be at least applied to the following application scenarios, and the following description is provided.
Currently, with the emergence of more and more consumer electronic products, consumers remain open to new forms and new technologies of smart terminal products. New smart terminal devices such as Virtual Reality (VR), Augmented Reality (AR) and Mixed Reality (MR) devices are developing very rapidly, and consumers are adapting to such products quickly, even regarding them as daily necessities. AR devices are expected to lead the development of the next generation of smart terminals and to further extend the product lines of electronic products to satisfy more users and usage scenarios.
AR is a technology that superimposes virtual information on the real world, and can display virtual information such as text, images, videos and 3D models for the user without affecting real life. There are two main implementations.
The first is based on video composition. Picture information of the real scene is captured by a camera, virtual information such as text, images, videos and 3D models is composited into the picture of the real scene through information processing, and the composited picture is finally displayed to the user on a display screen. The second is based on optical elements. The image of the real scene enters the human eye directly after certain dimming processing, while the information of the virtual channel enters the human eye after projection and reflection.
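As a purely illustrative aside (this sketch is not part of the embodiments; the camera index, the overlay image file and the blending weights are assumptions), the video-composition approach can be sketched in a few lines of Python with OpenCV: capture a real-scene frame, composite a virtual layer into it, and display the result.

    import cv2

    # Minimal sketch of video-composition AR, assuming a default camera (index 0)
    # and a hypothetical overlay image standing in for the virtual information.
    cap = cv2.VideoCapture(0)
    overlay = cv2.imread("overlay.png")  # hypothetical virtual layer

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Match the virtual layer to the real-scene frame, then alpha-blend.
        virtual = cv2.resize(overlay, (frame.shape[1], frame.shape[0]))
        composited = cv2.addWeighted(frame, 0.7, virtual, 0.3, 0)
        cv2.imshow("composited picture", composited)
        if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
            break

    cap.release()
    cv2.destroyAllWindows()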
Some AR devices can be connected to a host device such as a computer, and the AR device then creates a large virtual screen. An AR device is equipped with multiple cameras and can capture the user's gestures to control the screen, so no peripheral device is needed, which is more convenient for the user. At present, however, AR devices consume a large amount of power and therefore have poor battery life.
The following terms used in the present application are explained first:
AR is a technology that skillfully fuses virtual information with the real environment. It makes wide use of technical means such as multimedia, three-dimensional modeling, real-time tracking and registration, intelligent interaction and sensing, and applies computer-generated virtual information such as text, images, three-dimensional models, music and video to the real environment after simulation. The two kinds of information complement each other, thereby realizing the "enhancement" of the real environment.
A smart watch is a watch that has information processing capability and meets the basic technical requirements of a watch.
Besides indicating the time, a smart watch has one or more functions such as reminding, navigation, calibration, monitoring and interaction; its display forms include pointers, numbers, images, and the like. With the development of electronic technology, smart watches are becoming more and more popular, and some smart watches can also be operated by gestures, which is convenient for users.
In view of the problems in the related art, the embodiments of the present application provide a device control method and apparatus, which can solve the problem of misoperation of wearable devices in the related art.
The device control method provided by the embodiment of the application is described in detail below through specific embodiments and application scenes thereof with reference to the accompanying drawings.
The following describes, with reference to fig. 1, a scenario in which a user wears a first wearable device and a second wearable device:
As shown in fig. 1, taking the first wearable device being a smart watch and the second wearable device being an AR device as an example, the AR device 100 can present both virtual information and real information, can capture the real scene through its cameras, and has the functions of connecting to the Internet, wirelessly connecting to electronic devices, and performing eyeball tracking.
Virtual screen 110: the virtual screen, generated by the AR device, that the user sees.
The smart watch 120 has a gesture operation function and is wirelessly connected to the AR device 100.
The user's finger 130 can control the smart watch 120 or the virtual screen of the AR device 100.
The user's eyes 140 are not visible from the outside after the user wears the AR device 100.
The sensor module 150 in the smart watch is configured to recognize the user's gesture information, and the sensor module 150 may specifically include: an optical sensor, an acceleration sensor, a gyroscope, and the like.
The camera module 160 on the AR device is used to recognize the user's gesture information.
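For illustration of how a sensor module of this kind might detect a target gesture, a minimal sketch follows; the window of accelerometer samples, the threshold and the gesture labels are assumptions, not details disclosed by the embodiment.

    from statistics import mean

    # Hypothetical classifier: decide whether a window of accelerometer samples
    # (ax, ay, az) contains a simple wrist gesture. Threshold and labels assumed.
    def classify_gesture(samples, threshold=1.5):
        ax = mean(s[0] for s in samples)
        ay = mean(s[1] for s in samples)
        if ax > threshold:
            return "swipe_left"    # dominant motion along +x
        if ax < -threshold:
            return "swipe_right"   # dominant motion along -x
        if ay > threshold:
            return "swipe_up"      # dominant motion along +y
        return None                # no target gesture detected

    # Example: a burst of mostly +x acceleration reads as a left swipe.
    window = [(1.8, 0.1, 9.8), (2.0, 0.0, 9.7), (1.7, 0.2, 9.8)]
    print(classify_gesture(window))  # swipe_left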
Fig. 2 is a flowchart of a device control method according to an embodiment of the present application.
As shown in fig. 2, the device control method may include steps 210 to 240, is applied to at least two communicatively connected wearable devices, where each wearable device includes a gesture recognition module, and is performed by a device control apparatus. The method specifically includes the following steps:
Step 210, determining the working state of the first wearable device in a case where the first wearable device and the second wearable device are worn by the user, where the first wearable device is worn on the user's wrist.
The first wearable device may be a smart watch or a smart bracelet; the second wearable device may include an AR device, a VR device or an MR device.
The communication connection between the at least two wearable devices may specifically be a network connection between them. For example, the AR device and the smart watch are connected through a network and exchange information.
In a possible embodiment, the first wearable device includes a first gesture recognition module and a sensor module, and the second wearable device includes a second gesture recognition module and a camera module. After step 210, the method may further include the following steps:
in a case where the working state of the first wearable device is a non-awakened state, starting the second gesture recognition module, and recognizing the target gesture through the sensor module or the camera module to obtain target gesture information;
and controlling the second wearable device according to the target gesture information.
That is, in a case where the working state of the first wearable device is the non-awakened state, the target gesture can be recognized through the sensor module or the camera module to obtain the target gesture information.
The second gesture recognition module is started, and the target gesture is recognized through the second gesture recognition module of the second wearable device to obtain the target gesture information; alternatively, the target gesture can be recognized through the sensor module of the first wearable device to obtain the target gesture information.
In a case where the working state of the first wearable device is the non-awakened state, the first wearable device can report this non-awakened state to the second wearable device. Because the first wearable device is not awakened, it can be determined that the target gesture is used to control the second wearable device. To save the power consumed by the multiple cameras of the second wearable device, the camera module of the second wearable device may be kept idle at this time; the target gesture can be recognized through the sensor module to obtain the target gesture information, and the second wearable device can be controlled according to the target gesture information, thereby saving the power consumption of the second wearable device and improving its battery life.
Thus, in a case where the working state of the first wearable device is the non-awakened state, the target gesture is recognized through the sensor module or the camera module to obtain the target gesture information, and the second wearable device is controlled according to the target gesture information, so the second wearable device can be controlled accurately.
In a possible embodiment, the step of starting the second gesture recognition module and recognizing the target gesture through the sensor module or the camera module to obtain the target gesture information may specifically include the following steps:
determining the remaining battery level of the second wearable device;
in a case where the remaining battery level is lower than a preset level, recognizing the target gesture through the sensor module of the first wearable device to obtain the target gesture information;
and in a case where the remaining battery level is greater than or equal to the preset level, recognizing the target gesture through the camera module of the second wearable device to obtain the target gesture information.
Specifically, the remaining battery level of the second wearable device can be used to decide which of the first wearable device and the second wearable device recognizes the target gesture to obtain the target gesture information.
The preset level may be, for example, 50%, 60% or 70%.
In a case where the remaining battery level is lower than the preset level, the remaining battery of the second wearable device is not abundant, so the target gesture is recognized through the sensor module of the first wearable device to obtain the target gesture information, which saves the power consumption of the second wearable device.
In a case where the remaining battery level is greater than or equal to the preset level, the remaining battery of the second wearable device is relatively abundant, so the target gesture is recognized through the camera module of the second wearable device to obtain the target gesture information.
Thus, the remaining battery level of the second wearable device determines which of the first wearable device and the second wearable device recognizes the target gesture to obtain the target gesture information. When the remaining battery of the second wearable device is insufficient, its remaining battery can be saved and its battery life improved.
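The battery-based selection above can be summarized in a short sketch; the 50% threshold, the class names and the recognizer stubs are assumptions used only to make the control flow concrete.

    from dataclasses import dataclass

    PRESET_LEVEL = 0.5  # assumed preset level (50%); 60% or 70% also fit the text

    @dataclass
    class Recognizer:
        name: str
        def recognize(self) -> str:
            # Stand-in for real sensor- or camera-based gesture recognition.
            return f"target gesture information (via {self.name})"

    @dataclass
    class ARDevice:
        battery: float            # remaining battery level, 0.0 to 1.0
        camera_module: Recognizer

    @dataclass
    class SmartWatch:
        sensor_module: Recognizer

    def recognize_target_gesture(ar: ARDevice, watch: SmartWatch) -> str:
        """Select the recognizer from the AR device's remaining battery."""
        if ar.battery < PRESET_LEVEL:
            # Low battery: use the watch's sensor module to save the AR device's power.
            return watch.sensor_module.recognize()
        return ar.camera_module.recognize()  # enough battery: use the camera module

    # Example: a 30%-charged AR device defers recognition to the smart watch.
    ar = ARDevice(battery=0.3, camera_module=Recognizer("camera module"))
    watch = SmartWatch(sensor_module=Recognizer("sensor module"))
    print(recognize_target_gesture(ar, watch))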
Step 220, determining a target wearable device among the at least two wearable devices in a case where the at least two wearable devices detect the target gesture and the working state of the first wearable device is the awakened state, where the target wearable device is the wearable device at which the user is gazing.
That the at least two wearable devices detect the target gesture means that it is detected that the user has made a target gesture; at this point the meaning of the target gesture has not yet been parsed, that is, the target gesture information indicated by the target gesture has not been recognized, and the target gesture still needs to be recognized through the gesture recognition module of the target wearable device.
When the user gazes at the first wearable device, the user may specifically be gazing at the display screen of the first wearable device; when the user gazes at the second wearable device, the user may specifically be gazing at the virtual screen of the second wearable device.
Step 230, starting the gesture recognition module of the target wearable device, and recognizing the target gesture through the gesture recognition module to obtain the target gesture information.
Starting the gesture recognition module of the target wearable device means enabling the gesture recognition function of the target wearable device. With the gesture recognition module of the target wearable device started, the detected target gesture is parsed to obtain the target gesture information.
Step 240, controlling the target wearable device according to the target gesture information.
The target gesture information is the meaning indicated by the target gesture. For example, if the target gesture is a left swipe, the target gesture information is to swipe the current interface of the target wearable device to the left; if the target gesture is an upward swipe, the target gesture information is to exit the current interface of the target wearable device.
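To make the relation between a target gesture and the resulting control concrete, a small sketch follows; the gesture names and actions are the illustrative ones from the paragraph above, and the mapping itself is an assumption, not a disclosed data structure.

    # Hypothetical mapping from recognized target gestures to control actions.
    GESTURE_ACTIONS = {
        "swipe_left": "scroll the current interface to the left",
        "swipe_up": "exit the current interface",
    }

    def control_device(device_name: str, gesture: str) -> str:
        action = GESTURE_ACTIONS.get(gesture)
        if action is None:
            return f"{device_name}: gesture '{gesture}' not recognized, no action"
        return f"{device_name}: {action}"

    print(control_device("smart watch", "swipe_left"))
    print(control_device("AR device", "swipe_up"))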
In a possible embodiment, the first wearable device includes a first gesture recognition module and a sensor module, the second wearable device includes a second gesture recognition module and a camera module, and step 230 includes:
in a case where the target wearable device is the first wearable device, starting the first gesture recognition module, and recognizing the target gesture through the sensor module to obtain the target gesture information;
accordingly, step 240 includes: controlling the first wearable device according to the target gesture information.
When it is detected that the user is gazing at the first wearable device, the user's current target gesture is intended to control the first wearable device, so the target gesture is recognized through the sensor module of the first wearable device to obtain the target gesture information, and the first wearable device is controlled according to the target gesture information.
To avoid misoperation, the camera module of the second wearable device does not operate at this time, that is, the second gesture recognition module of the second wearable device is in a disabled state.
As shown in fig. 3, taking the first wearable device being a smart watch and the second wearable device being an AR device as an example, the sensor module of the first wearable device can recognize the user's gesture to obtain the target gesture information, and the first wearable device is then controlled according to the target gesture information.
Thus, in a case where the target wearable device is the first wearable device, the first gesture recognition module is started and the target gesture is recognized through the sensor module to obtain the target gesture information, and the first wearable device is then controlled according to the target gesture information. This prevents the second wearable device from being controlled by mistake, that is, prevents misoperation of the second wearable device.
In a possible embodiment, the first wearable device includes a first gesture recognition module and a sensor module, the second wearable device includes a second gesture recognition module and a camera module, and step 230 includes:
in a case where the target wearable device is the second wearable device, starting the second gesture recognition module, and recognizing the target gesture through the sensor module or the camera module to obtain the target gesture information;
accordingly, step 240 includes:
controlling the second wearable device according to the target gesture information.
When it is detected that the user is gazing at the virtual screen of the second wearable device, the user's current target gesture is intended to control the second wearable device, so the target gesture is recognized through the sensor module or the camera module to obtain the target gesture information.
To avoid misoperation, the first gesture recognition module of the first wearable device is turned off at this time, that is, the target gesture information is not used to control the first wearable device.
Thus, in a case where the target wearable device is the second wearable device, the second gesture recognition module is started and the target gesture is recognized through the sensor module or the camera module to obtain the target gesture information, and the second wearable device is then controlled according to the target gesture information. This prevents the first wearable device from being controlled by mistake, that is, prevents misoperation of the first wearable device.
In summary, when the working state of the first wearable device is the awakened state, eyeball tracking can be used to detect the target wearable device at which the user is gazing, and it can then be distinguished whether the user's target gesture is intended to control the first wearable device or the second wearable device. This prevents misoperation between the first wearable device and the second wearable device, improves the accuracy of device control, and improves the user experience.
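Pulling steps 210 to 240 together, the following sketch shows the overall dispatch described in this summary; the gaze source, the wake-state query and the method names are simplified assumptions rather than the actual device interfaces.

    class Wearable:
        """Minimal stand-in for a wearable device with a gesture recognizer."""
        def __init__(self, name: str, awake: bool = True):
            self.name, self.awake, self.recognizing = name, awake, False
        def is_awake(self) -> bool:
            return self.awake
        def enable_gesture_recognition(self):
            self.recognizing = True
        def disable_gesture_recognition(self):
            self.recognizing = False

    def dispatch(watch: Wearable, ar_device: Wearable, gaze_target: str) -> str:
        """Decide which device a detected target gesture should control.

        gaze_target is assumed to come from the AR device's eyeball tracking
        and to be either "watch" or "ar".
        """
        if not watch.is_awake():
            # Non-awakened watch: the gesture can only be meant for the AR device.
            ar_device.enable_gesture_recognition()
            return f"control {ar_device.name}"
        if gaze_target == "watch":
            # Gazing at the watch display: recognize on the watch and keep the
            # AR device's recognizer off to avoid misoperation.
            watch.enable_gesture_recognition()
            ar_device.disable_gesture_recognition()
            return f"control {watch.name}"
        # Gazing at the virtual screen: the gesture targets the AR device.
        ar_device.enable_gesture_recognition()
        watch.disable_gesture_recognition()
        return f"control {ar_device.name}"

    watch = Wearable("smart watch", awake=True)
    ar = Wearable("AR device")
    print(dispatch(watch, ar, gaze_target="watch"))  # control smart watch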
In the embodiments of the present application, in a case where a first wearable device and a second wearable device are worn by a user, the working state of the first wearable device, which is worn on the user's wrist, is determined. In a case where the at least two wearable devices detect a target gesture and the working state of the first wearable device is an awakened state, in order to accurately identify the device that the user's target gesture is intended to control, the wearable device at which the user is gazing is taken as the target wearable device among the at least two wearable devices, so that the intended device can be determined effectively and accurately. Then the gesture recognition module of the target wearable device is started, and the target gesture is recognized through that module to obtain target gesture information, so the target gesture can be recognized accurately based on the gesture recognition module of the target wearable device. Finally, the target wearable device at which the user is gazing is controlled according to the target gesture information, which prevents misoperation of the wearable devices other than the target wearable device and improves the accuracy of device control.
In the device control method provided by the embodiments of the present application, the execution body may be a device control apparatus. In the embodiments of the present application, the device control apparatus provided by the embodiments of the present application is described by taking the device control apparatus executing the device control method as an example.
Fig. 4 is a block diagram of a device control apparatus provided in an embodiment of the present application. The apparatus is applied to at least two communicatively connected wearable devices, where each wearable device includes a gesture recognition module, and the apparatus 400 includes:
a determining module 410, configured to determine the working state of the first wearable device when it is detected that the first wearable device and the second wearable device are worn by the user, where the first wearable device is worn on the user's wrist;
the determining module 410 is further configured to determine, in a case where the at least two wearable devices detect the target gesture and the working state of the first wearable device is the awakened state, a target wearable device, where the target wearable device is the wearable device at which the user is gazing;
a recognition module 420, configured to start the gesture recognition module of the target wearable device, and recognize the target gesture through the gesture recognition module to obtain the target gesture information;
and a control module 430, configured to control the target wearable device according to the target gesture information.
In a possible embodiment, the first wearable device includes a first gesture recognition module and a sensor module, the second wearable device includes a second gesture recognition module and a camera module, and the recognition module 420 is specifically configured to:
start the first gesture recognition module in a case where the target wearable device is the first wearable device, and recognize the target gesture through the sensor module to obtain the target gesture information;
and the control module 430 is specifically configured to:
control the first wearable device according to the target gesture information.
In a possible embodiment, the first wearable device includes a first gesture recognition module and a sensor module, the second wearable device includes a second gesture recognition module and a camera module, and the recognition module 420 is specifically configured to:
start the second gesture recognition module in a case where the target wearable device is the second wearable device, and recognize the target gesture through the sensor module or the camera module to obtain the target gesture information;
and the control module 430 is specifically configured to:
control the second wearable device according to the target gesture information.
In a possible embodiment, the first wearable device includes a first gesture recognition module and a sensor module, the second wearable device includes a second gesture recognition module and a camera module, and the recognition module 420 is further configured to:
start the second gesture recognition module in a case where the working state of the first wearable device is a non-awakened state, and recognize the target gesture through the sensor module or the camera module to obtain the target gesture information;
and the control module 430 is further configured to: control the second wearable device according to the target gesture information.
In a possible embodiment, the recognition module 420 is specifically configured to:
determine the remaining battery level of the second wearable device;
recognize the target gesture through the sensor module of the first wearable device to obtain the target gesture information in a case where the remaining battery level is lower than a preset level;
and recognize the target gesture through the camera module of the second wearable device to obtain the target gesture information in a case where the remaining battery level is greater than or equal to the preset level.
In the embodiments of the present application, in a case where a first wearable device and a second wearable device are worn by a user, the working state of the first wearable device, which is worn on the user's wrist, is determined. In a case where the at least two wearable devices detect a target gesture and the working state of the first wearable device is an awakened state, in order to accurately identify the device that the user's target gesture is intended to control, the wearable device at which the user is gazing is taken as the target wearable device among the at least two wearable devices, so that the intended device can be determined effectively and accurately. Then the gesture recognition module of the target wearable device is started, and the target gesture is recognized through that module to obtain target gesture information, so the target gesture can be recognized accurately based on the gesture recognition module of the target wearable device. Finally, the target wearable device at which the user is gazing is controlled according to the target gesture information, which prevents misoperation of the wearable devices other than the target wearable device and improves the accuracy of device control.
The device control apparatus in the embodiments of the present application may be an electronic device, or may be a component in an electronic device, such as an integrated circuit or a chip. The electronic device may be a terminal, or a device other than a terminal. By way of example, the electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a Mobile Internet Device (MID), an augmented reality (AR)/virtual reality (VR) device, a robot, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a personal digital assistant (PDA), and may also be a server, a network attached storage (NAS), a personal computer (PC), a television (TV), a teller machine or a self-service machine, which is not specifically limited in the embodiments of the present application.
The device control apparatus of the embodiments of the present application may be an apparatus having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
The device control apparatus provided in the embodiments of the present application can implement each process implemented by the foregoing method embodiments; to avoid repetition, details are not repeated here.
Optionally, as shown in fig. 5, an embodiment of the present application further provides an electronic device 510, including a processor 511, a memory 512, and a program or instruction stored in the memory 512 and executable on the processor 511. When executed by the processor 511, the program or instruction implements each step of any of the foregoing device control method embodiments and achieves the same technical effects; to avoid repetition, details are not repeated here.
The electronic devices in the embodiments of the present application include the mobile electronic devices and the non-mobile electronic devices described above.
Fig. 6 is a schematic hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 600 includes, but is not limited to: a radio frequency unit 601, a network module 602, an audio output unit 603, an input unit 604, a sensor 605, a display unit 606, a user input unit 607, an interface unit 608, a memory 609, and a processor 610.
Those skilled in the art will appreciate that the electronic device 600 may further include a power supply (such as a battery) for powering the various components. The power supply may be logically connected to the processor 610 through a power management system, so that functions such as managing charging, discharging and power consumption are implemented through the power management system. The structure of the electronic device shown in fig. 6 does not constitute a limitation on the electronic device; the electronic device may include more or fewer components than shown, combine certain components, or arrange the components differently, which is not described in detail here.
The processor 610 is configured to determine the working state of the first wearable device when it is detected that the first wearable device and the second wearable device are worn by the user, where the first wearable device is worn on the user's wrist;
the processor 610 is further configured to determine, in a case where the at least two wearable devices detect the target gesture and the working state of the first wearable device is the awakened state, a target wearable device, where the target wearable device is the wearable device at which the user is gazing;
the processor 610 is further configured to start the gesture recognition module of the target wearable device, and recognize the target gesture through the gesture recognition module to obtain the target gesture information;
the processor 610 is further configured to control the target wearable device according to the target gesture information.
Optionally, the first wearable device includes a first gesture recognition module and a sensor module, the second wearable device includes a second gesture recognition module and a camera module, and the processor 610 is further configured to start the first gesture recognition module in a case where the target wearable device is the first wearable device, and recognize the target gesture through the sensor module to obtain the target gesture information;
the processor 610 is further configured to control the first wearable device according to the target gesture information.
Optionally, the first wearable device includes a first gesture recognition module and a sensor module, the second wearable device includes a second gesture recognition module and a camera module, and the processor 610 is further configured to start the second gesture recognition module in a case where the target wearable device is the second wearable device, and recognize the target gesture through the sensor module or the camera module to obtain the target gesture information;
the processor 610 is further configured to control the second wearable device according to the target gesture information.
Optionally, the first wearable device includes a first gesture recognition module and a sensor module, the second wearable device includes a second gesture recognition module and a camera module, and the processor 610 is further configured to start the second gesture recognition module in a case where the working state of the first wearable device is a non-awakened state, and recognize the target gesture through the sensor module or the camera module to obtain the target gesture information;
the processor 610 is further configured to control the second wearable device according to the target gesture information.
Optionally, the processor 610 is further configured to determine the remaining battery level of the second wearable device;
the processor 610 is further configured to recognize the target gesture through the sensor module of the first wearable device to obtain the target gesture information in a case where the remaining battery level is lower than a preset level;
the processor 610 is further configured to recognize the target gesture through the camera module of the second wearable device to obtain the target gesture information in a case where the remaining battery level is greater than or equal to the preset level.
In the embodiments of the present application, in a case where a first wearable device and a second wearable device are worn by a user, the working state of the first wearable device, which is worn on the user's wrist, is determined. In a case where the at least two wearable devices detect a target gesture and the working state of the first wearable device is an awakened state, in order to accurately identify the device that the user's target gesture is intended to control, the wearable device at which the user is gazing is taken as the target wearable device among the at least two wearable devices, so that the intended device can be determined effectively and accurately. Then the gesture recognition module of the target wearable device is started, and the target gesture is recognized through that module to obtain target gesture information, so the target gesture can be recognized accurately based on the gesture recognition module of the target wearable device. Finally, the target wearable device at which the user is gazing is controlled according to the target gesture information, which prevents misoperation of the wearable devices other than the target wearable device and improves the accuracy of device control.
It should be understood that, in the embodiments of the present application, the input unit 604 may include a graphics processing unit (Graphics Processing Unit, GPU) 6041 and a microphone 6042, and the graphics processing unit 6041 processes image data of still pictures or videos obtained by an image capture apparatus (such as a camera) in a video capture mode or an image capture mode. The display unit 606 may include a display panel 6061, which may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 607 includes at least one of a touch panel 6071 and other input devices 6072. The touch panel 6071 is also referred to as a touch screen and may include two parts: a touch detection apparatus and a touch controller. The other input devices 6072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not described in detail here.
The memory 609 may be used to store software programs and various data. The memory 609 may mainly include a first storage area for storing programs or instructions and a second storage area for storing data, where the first storage area may store an operating system and the application programs or instructions required for at least one function (such as a sound playing function and an image playing function). In addition, the memory 609 may include a volatile memory or a non-volatile memory, or the memory 609 may include both volatile and non-volatile memories. The non-volatile memory may be a read-only memory (Read-Only Memory, ROM), a programmable ROM (Programmable ROM, PROM), an erasable PROM (Erasable PROM, EPROM), an electrically erasable PROM (Electrically EPROM, EEPROM), or a flash memory. The volatile memory may be a random access memory (Random Access Memory, RAM), a static RAM (SRAM), a dynamic RAM (DRAM), a synchronous DRAM (SDRAM), a double data rate SDRAM (DDR SDRAM), an enhanced SDRAM (ESDRAM), a synchlink DRAM (SLDRAM), or a direct rambus RAM (DRRAM). The memory 609 in the embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
The processor 610 may include one or more processing units. Optionally, the processor 610 integrates an application processor and a modem processor, where the application processor mainly handles operations involving the operating system, the user interface, application programs, and the like, and the modem processor mainly handles wireless communication signals, such as a baseband processor. It can be appreciated that the modem processor may alternatively not be integrated into the processor 610.
An embodiment of the present application further provides a readable storage medium storing a program or instruction. When the program or instruction is executed by a processor, each process of the foregoing device control method embodiments is implemented, and the same technical effects can be achieved; to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the foregoing embodiments. The readable storage medium includes a computer-readable storage medium, such as a computer read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.
An embodiment of the present application further provides a chip. The chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or instruction to implement each process of the foregoing device control method embodiments and achieve the same technical effects; to avoid repetition, details are not repeated here.
It should be understood that the chip mentioned in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system, a system-on-chip, or the like.
An embodiment of the present application provides a computer program product stored in a storage medium. The program product is executed by at least one processor to implement each process of the foregoing device control method embodiments and achieve the same technical effects; to avoid repetition, details are not repeated here.
It should be noted that, in this document, the terms "comprise", "include" or any other variants thereof are intended to cover a non-exclusive inclusion, so that a process, method, article or apparatus that includes a list of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article or apparatus. Without further limitation, an element preceded by "comprising a ..." does not preclude the presence of other identical elements in the process, method, article or apparatus that includes the element. Furthermore, it should be noted that the scope of the methods and apparatuses in the embodiments of the present application is not limited to performing the functions in the order shown or discussed, and may also include performing the functions in a substantially simultaneous manner or in a reverse order depending on the functions involved; for example, the described methods may be performed in an order different from that described, and various steps may also be added, omitted or combined. In addition, features described with reference to certain examples may be combined in other examples.
From the description of the foregoing embodiments, those skilled in the art can clearly understand that the methods of the foregoing embodiments can be implemented by software plus a necessary general-purpose hardware platform, and certainly can also be implemented by hardware, but in many cases the former is a preferred implementation. Based on such an understanding, the technical solutions of the present application essentially, or the part contributing to the prior art, can be embodied in the form of a computer software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk or an optical disc) and includes several instructions for causing a terminal (which may be a mobile phone, a computer, a server, a network device, or the like) to perform the methods described in the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the foregoing specific implementations, which are merely illustrative and not restrictive. Inspired by the present application, those of ordinary skill in the art can devise many other forms without departing from the spirit of the present application and the protection scope of the claims, all of which fall within the protection of the present application.

Claims (10)

1. A device control method, applied to at least two communicatively connected wearable devices, wherein each wearable device comprises a gesture recognition module, the method comprising:
determining a working state of a first wearable device in a case where it is detected that the first wearable device and a second wearable device are worn by a user, wherein the first wearable device is worn on the user's wrist;
determining a target wearable device among the at least two wearable devices in a case where the at least two wearable devices detect a target gesture and the working state of the first wearable device is an awakened state, wherein the target wearable device is a wearable device at which the user is gazing;
starting the gesture recognition module of the target wearable device, and recognizing the target gesture through the gesture recognition module to obtain target gesture information;
and controlling the target wearable device according to the target gesture information.
2. The method according to claim 1, wherein the first wearable device comprises a first gesture recognition module and a sensor module, the second wearable device comprises a second gesture recognition module and a camera module, and the starting the gesture recognition module of the target wearable device and recognizing the target gesture through the gesture recognition module to obtain target gesture information comprises:
starting the first gesture recognition module in a case where the target wearable device is the first wearable device, and recognizing the target gesture through the sensor module to obtain the target gesture information; and the controlling the target wearable device according to the target gesture information comprises:
controlling the first wearable device according to the target gesture information.
3. The method according to claim 1, wherein the first wearable device comprises a first gesture recognition module and a sensor module, the second wearable device comprises a second gesture recognition module and a camera module, and the starting the gesture recognition module of the target wearable device and recognizing the target gesture through the gesture recognition module to obtain target gesture information comprises:
starting the second gesture recognition module in a case where the target wearable device is the second wearable device, and recognizing the target gesture through the sensor module or the camera module to obtain the target gesture information;
and the controlling the target wearable device according to the target gesture information comprises:
controlling the second wearable device according to the target gesture information.
4. The method according to claim 1, wherein the first wearable device comprises a first gesture recognition module and a sensor module, the second wearable device comprises a second gesture recognition module and a camera module, and after the determining the working state of the first wearable device in the case where it is detected that the first wearable device and the second wearable device are worn by the user, the method further comprises:
starting the second gesture recognition module in a case where the working state of the first wearable device is a non-awakened state, and recognizing the target gesture through the sensor module or the camera module to obtain target gesture information;
and controlling the second wearable device according to the target gesture information.
5. The method according to claim 4, wherein the starting the second gesture recognition module in the case where the working state of the first wearable device is the non-awakened state and recognizing the target gesture through the sensor module or the camera module to obtain target gesture information comprises:
determining a remaining battery level of the second wearable device;
recognizing the target gesture through the sensor module of the first wearable device to obtain the target gesture information in a case where the remaining battery level is lower than a preset level;
and recognizing the target gesture through the camera module of the second wearable device to obtain the target gesture information in a case where the remaining battery level is greater than or equal to the preset level.
6. A device control apparatus, applied to at least two communicatively connected wearable devices, wherein each wearable device comprises a gesture recognition module, the apparatus comprising:
a determining module, configured to determine a working state of a first wearable device in a case where it is detected that the first wearable device and a second wearable device are worn by a user, wherein the first wearable device is worn on the user's wrist;
wherein the determining module is further configured to determine, in a case where the at least two wearable devices detect a target gesture and the working state of the first wearable device is an awakened state, a target wearable device, wherein the target wearable device is a wearable device at which the user is gazing;
a recognition module, configured to start the gesture recognition module of the target wearable device, and recognize the target gesture through the gesture recognition module to obtain target gesture information;
and a control module, configured to control the target wearable device according to the target gesture information.
7. The apparatus according to claim 6, wherein the first wearable device comprises a first gesture recognition module and a sensor module, the second wearable device comprises a second gesture recognition module and a camera module, and the recognition module is specifically configured to:
start the first gesture recognition module in a case where the target wearable device is the first wearable device, and recognize the target gesture through the sensor module to obtain the target gesture information;
and the control module is specifically configured to:
control the first wearable device according to the target gesture information.
8. The apparatus according to claim 6, wherein the first wearable device comprises a first gesture recognition module and a sensor module, the second wearable device comprises a second gesture recognition module and a camera module, and the recognition module is specifically configured to:
start the second gesture recognition module in a case where the target wearable device is the second wearable device, and recognize the target gesture through the sensor module or the camera module to obtain the target gesture information;
and the control module is specifically configured to:
control the second wearable device according to the target gesture information.
9. An electronic device comprising a processor and a memory storing a program or instructions executable on the processor, which when executed by the processor, implement the steps of the method of any one of claims 1 to 5.
10. A readable storage medium, characterized in that it stores thereon a program or instructions, which when executed by a processor, implement the steps of the method according to any of claims 1-5.
CN202311514403.3A 2023-11-13 2023-11-13 Device control method, device, electronic device and readable storage medium Pending CN117572966A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311514403.3A CN117572966A (en) 2023-11-13 2023-11-13 Device control method, device, electronic device and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311514403.3A CN117572966A (en) 2023-11-13 2023-11-13 Device control method, device, electronic device and readable storage medium

Publications (1)

Publication Number Publication Date
CN117572966A 2024-02-20

Family

ID=89863619

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311514403.3A Pending CN117572966A (en) 2023-11-13 2023-11-13 Device control method, device, electronic device and readable storage medium

Country Status (1)

Country Link
CN (1) CN117572966A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination