CN112818825B - Working state determining method and device - Google Patents

Working state determining method and device Download PDF

Info

Publication number
CN112818825B
Authority
CN
China
Prior art keywords
user
hand
working state
control device
association information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110120545.6A
Other languages
Chinese (zh)
Other versions
CN112818825A (en)
Inventor
汪铭扬 (Wang Mingyang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202110120545.6A priority Critical patent/CN112818825B/en
Publication of CN112818825A publication Critical patent/CN112818825A/en
Priority to PCT/CN2022/073568 priority patent/WO2022161324A1/en
Application granted granted Critical
Publication of CN112818825B publication Critical patent/CN112818825B/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 - Movements or behaviour, e.g. gesture recognition
    • G06V40/28 - Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/21 - Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 - Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/24 - Classification techniques
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/20 - Image preprocessing
    • G06V10/22 - Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/20 - Image preprocessing
    • G06V10/26 - Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V10/267 - Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion by performing operations on regions, e.g. growing, shrinking or watersheds
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/107 - Static hand or arm

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Human Computer Interaction (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Social Psychology (AREA)
  • Psychiatry (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the present application disclose a working state determining method and device. The method includes: acquiring association information between a user's hand and a control device; and determining the working state of the control device according to the association information, where the control device is in a first working state when the association information meets a first preset condition, and in a second working state when the association information meets a second preset condition. The embodiments of the present application improve the convenience with which a user controls the control device.

Description

Working state determining method and device
Technical Field
The embodiments of the present application relate to the field of information processing, and in particular to a working state determining method and device.
Background
As electronic devices become more widespread, their functions keep growing. To make the many kinds of operations on an electronic device convenient, control devices that support multiple operating modes have emerged. For example, a control device may offer a handle (joystick) mode in addition to a mouse mode that controls the movement and positioning of a cursor. In actual use, however, the user has to switch between the different modes of the control device manually, which is very inconvenient.
In the course of implementing the present application, the applicant found at least the following problem in the related art:
the user must manually switch between the different modes of the control device, which makes operation cumbersome.
Disclosure of Invention
The embodiments of the present application provide a working state determining method and device, to solve the problem that a user must manually switch between different modes of a control device, making operation cumbersome.
In order to solve the above technical problem, the present application is implemented as follows:
in a first aspect, an embodiment of the present application provides a method for determining an operating state, where the method may include:
acquiring association information between a user's hand and a control device;
determining a working state of the control device according to the association information;
the control device is in a first working state when the association information meets a first preset condition, and is in a second working state when the association information meets a second preset condition.
In a second aspect, an embodiment of the present application provides an operating state determining device, which may include:
an acquisition module, configured to acquire association information between a user's hand and a control device;
a determining module, configured to determine a working state of the control device according to the association information;
the control device is in a first working state when the association information meets a first preset condition, and is in a second working state when the association information meets a second preset condition.
In a third aspect, embodiments of the present application provide an electronic device comprising a processor, a memory and a program or instruction stored on the memory and executable on the processor, the program or instruction implementing the steps of the method according to the first aspect when executed by the processor.
In a fourth aspect, embodiments of the present application provide a readable storage medium having stored thereon a program or instructions which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, embodiments of the present application provide a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and where the processor is configured to execute a program or instructions to implement a method according to the first aspect.
In the embodiments of the present application, association information between the user's hand and the control device is acquired, and the working state of the control device is determined according to the association information, the control device being in a first working state when the association information meets a first preset condition and in a second working state when the association information meets a second preset condition. The user therefore does not need to manually switch between the modes of the control device: the working state can be determined directly from the association information between the user's hand and the control device, which improves the convenience with which the user controls the control device.
Drawings
The present application will be better understood from the following description of specific embodiments thereof, taken in conjunction with the accompanying drawings, in which like or similar reference characters designate like or similar features.
Fig. 1 is a schematic diagram of an application scenario of a working state determining method provided in an embodiment of the present application;
fig. 2 is a flowchart of a method for determining a working state according to an embodiment of the present application;
fig. 3 is a schematic view of a depth of field effect of an image according to an embodiment of the present application;
fig. 4 is a schematic diagram of a user's hand according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a working state determining device according to an embodiment of the present application;
fig. 6 is a schematic hardware structure of an electronic device according to an embodiment of the present application;
fig. 7 is a schematic hardware structure of another electronic device according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
The terms "first", "second", and the like in the description and claims are used to distinguish between similar objects and do not necessarily describe a particular order or sequence. It should be understood that the data so used are interchangeable where appropriate, so that the embodiments of the present application can be implemented in orders other than those illustrated or described herein. Furthermore, in the description and claims, "and/or" denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the associated objects.
The working state determining method provided in the embodiments of the present application can be applied at least to the following application scenario, described below.
As shown in fig. 1, to control the electronic device 30 with the control device 20, the user 10 may switch the control device 20 to a mouse mode, for example to control the electronic device 30 to play a video; the user may also switch the control device 20 to a handle mode, for example to control the electronic device 30 to play a game. At present, if the user wants to change the working state of the control device, the switch must be made manually, which is cumbersome.
To address the problems of the related art, the embodiments of the present application provide a working state determining method, device, electronic device, and storage medium, so as to solve the problem that a user needs to manually switch between different modes of a control device, making operation cumbersome.
The method provided in the embodiments of the present application can be applied to the above application scenario, and also to any scenario in which a user would otherwise need to manually switch between different modes of a control device.
According to the method provided in the embodiments of the present application, association information between the user's hand and the control device is acquired, and the working state of the control device is determined according to the association information, the control device being in a first working state when the association information meets a first preset condition and in a second working state when it meets a second preset condition. The user therefore does not need to manually switch between the modes of the control device: the working state can be determined directly from the association information, which improves the convenience with which the user controls the control device.
Based on the above application scenario, the working state determining method provided in the embodiment of the present application is described in detail below.
Fig. 2 is a flowchart of a method for determining a working state according to an embodiment of the present application.
As shown in fig. 2, the working state determining method may include steps 210 to 220. The method is applied to a working state determining device, as follows:
step 210, obtaining association information between the hand of the user and the control device.
Step 220, determining the working state of the control equipment according to the association information.
The control device is in a first working state when the association information meets a first preset condition, and is in a second working state when the association information meets a second preset condition.
According to the working state determining method described above, association information between the user's hand and the control device is acquired, and the working state of the control device is determined according to that information, the control device being in a first working state when the association information meets a first preset condition and in a second working state when it meets a second preset condition. The user therefore does not need to manually switch between the modes of the control device: the working state can be determined directly from the association information, which improves the convenience with which the user controls the control device.
The contents of steps 210 to 220 are described below:
first, step 210 is referred to.
It will be appreciated that the control device may be provided with a plurality of modes of operation. For example, the control device may have a handle mode in addition to a mouse mode that controls movement and positioning of a cursor.
The association information includes: relative position information of the user's hand and the control device, and/or contact area information of the user's hand and the control device.
In a possible embodiment, the above-mentioned step 210 may specifically include the following steps:
collecting a target image; identifying the user's hand and the control device in the target image; and determining the relative position information of the user's hand and the control device.
The target image may be acquired by an electronic device with an image acquisition function (such as smart glasses).
Specifically, the user's hand and the control device may be identified from the target image based on an image segmentation algorithm, and the relative positional relationship between the user's hand and the control device may then be determined based on a depth-of-field algorithm.
First, the user's hand and the control device are identified from the target image based on the image segmentation algorithm.
An image segmentation algorithm is a classification algorithm at the pixel level: pixels belonging to the same class are grouped into one class, so the image is segmented at the pixel level.
For example, if the target image contains the user's hand and the control device, the pixels belonging to the user's hand are classified into one class and the pixels belonging to the control device into another; the background pixels may form a third class.
In this way, the user's hand and the control device can be identified from the target image based on the image segmentation algorithm.
Specifically, the image region corresponding to the user's hand and the image region corresponding to the control device can be determined from the physical characteristics of each (such as light refraction, light reflection, and color), so that the two distinct physical objects, the user's hand and the control device, can be separated into different classes.
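For illustration, a minimal Python sketch of such a pixel-level separation might look as follows. The skin-color thresholds and the assumption of a dark controller body are illustrative stand-ins for a trained segmentation model, not the method claimed in the patent:

    import cv2
    import numpy as np

    def segment_hand_and_device(image_bgr: np.ndarray):
        """Return boolean masks (hand_mask, device_mask) for one frame.

        A stand-in for a trained pixel-level classifier: skin pixels are
        compact in YCrCb space, and the controller body is assumed to be
        dark plastic (an illustrative assumption).
        """
        ycrcb = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2YCrCb)
        hand_mask = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127)) > 0

        hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
        device_mask = (hsv[..., 2] < 60) & ~hand_mask  # dark, non-skin pixels
        return hand_mask, device_mask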
Then, the relative positional relationship between the user's hand and the control device can be determined based on a depth-of-field algorithm, and the association information determined from that relationship.
Depth of field refers to the range of distances in front of and behind a subject within which imaging remains sharp, as measured in front of a camera lens or other imager. The aperture, the lens, and the distance from the focal plane to the subject are the main factors affecting the depth of field. After focusing, the range before and behind the focus within which the image remains sharp is called the depth of field.
In general, the farther away the subject is, the greater the corresponding depth value in the image; the closer the subject, the smaller the corresponding depth value. As shown in fig. 3, the "flower" closer to the camera corresponds to a smaller depth value, while the "grass" farther from the camera corresponds to a larger one.
Thus, the relative positional relationship between the user's hand and the control device (such as the front-to-back relationship) can be detected through the depth-of-field algorithm, and the association information determined accordingly. If the user's hand is in a covering posture above the control device, a preliminary judgment can set the control device to work in the mouse state, and the contact area between the user's hand and the control device can be used for further confirmation; if the user's hand is in a holding posture behind the control device, a preliminary judgment can set the control device to work in the handle state, again with the contact area used for further confirmation.
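A compact sketch of that preliminary, depth-based judgment is given below. It assumes an aligned per-pixel depth map in which nearer pixels have smaller values, non-empty masks from the segmentation step, and an arbitrary margin value:

    import numpy as np

    def hand_pose_relative_to_device(depth_map, hand_mask, device_mask,
                                     margin: float = 15.0) -> str:
        """Compare median depths of the hand and device regions.

        'covering' (hand nearer the camera, above the device) suggests
        the mouse state; 'holding' (hand farther, behind the device)
        suggests the handle state. The margin filters near-equal depths.
        """
        hand_depth = float(np.median(depth_map[hand_mask]))
        device_depth = float(np.median(depth_map[device_mask]))
        if hand_depth + margin < device_depth:
            return "covering"
        if hand_depth > device_depth + margin:
            return "holding"
        return "ambiguous"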
In another possible embodiment, step 210 may specifically include: acquiring depth data from a sensor, and determining the relative position information of the user's hand and the control device from the depth data.
It should be understood that the relative position information of the user's hand and the control device may at least cover the following cases: the first preset condition corresponding to the first working state (such as the mouse mode) may be that the user's hand contacts the control device on a first surface; the second preset condition corresponding to the second working state (such as the handle mode) may be that the user's hand contacts the control device on a second surface, the first surface being opposite the second surface.
After the step of identifying the user's hand and the control device in the target image, the method may further include: determining the contact area between the user's hand and the control device.
First, the overlapping portion between the image region corresponding to the user's hand and the image region corresponding to the control device may be determined based on the depth-of-field algorithm or on sensor data; the area of that overlapping portion is then taken as the contact area referred to above.
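The overlap-based contact area might be computed as in the sketch below: the hand mask is dilated slightly so that pixels adjacent to the device also count as contact, and the overlapping pixel count serves as the contact area information. The dilation radius is an assumption:

    import cv2
    import numpy as np

    def contact_area_pixels(hand_mask, device_mask, dilate_px: int = 3) -> int:
        """Approximate contact area as the overlap between the slightly
        dilated hand mask and the device mask, measured in pixels."""
        kernel = np.ones((2 * dilate_px + 1, 2 * dilate_px + 1), np.uint8)
        grown = cv2.dilate(hand_mask.astype(np.uint8), kernel) > 0
        return int(np.count_nonzero(grown & device_mask))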
The target image referred to above may be a top view of the user's hand and the target device. After the step of identifying the user's hand and the control device in the target image, the method may further include: identifying, based on a hand recognition model, the fingers of the user's hand, the number of those fingers, and the contact area between the user's hand and the target device from the target image.
The hand recognition model may be trained on first images that include a user's hand, labeled with the fingers in each image; it may be trained on first images labeled with the number of fingers; and it may further be trained on first images that include both the user's hand and the target device, labeled with the contact area between them.
As shown in fig. 4, the user's hand may be modeled from its key feature points, so that even when the image contains many elements, the image region corresponding to the user's hand can be quickly identified in the target image. The fingers, their number, and the contact area between the user's hand and the target device can then be determined from that hand region with the hand recognition model.
In this way, based on the fingers and finger count recognized from the target image by the hand recognition model, together with the contact area between the user's hand and the target device, the association information between the user's hand and the control device can be determined quickly and accurately.
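As one possible stand-in for the hand recognition model, the sketch below uses the MediaPipe Hands landmark detector to list the extended fingers in a top-view frame. The extension tests (fingertip above the proximal joint, thumb spread along the x-axis) are crude heuristics assumed for illustration, not the trained model the patent describes:

    import cv2
    import mediapipe as mp

    def detect_extended_fingers(image_bgr) -> list:
        """Return names of fingers judged extended in a top-view frame."""
        with mp.solutions.hands.Hands(static_image_mode=True,
                                      max_num_hands=1) as hands:
            result = hands.process(cv2.cvtColor(image_bgr, cv2.COLOR_BGR2RGB))
        if not result.multi_hand_landmarks:
            return []
        lm = result.multi_hand_landmarks[0].landmark
        # (tip, pip) landmark index pairs for the index..pinky fingers.
        pairs = {"index": (8, 6), "middle": (12, 10),
                 "ring": (16, 14), "pinky": (20, 18)}
        fingers = [name for name, (tip, pip) in pairs.items()
                   if lm[tip].y < lm[pip].y]      # tip above the joint
        if abs(lm[4].x - lm[2].x) > 0.05:         # crude thumb-spread test
            fingers.append("thumb")
        return fingers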
Next, step 220 is described.
The working state of the control device is determined according to the association information: the control device is in the first working state when the association information meets the first preset condition, and in the second working state when the association information meets the second preset condition.
When the association information meets the first preset condition, the control device is determined to be in the first working state, and it can be controlled to work in the first working state; when the association information meets the second preset condition, the control device is determined to be in the second working state, and it can be controlled to work in the second working state.
The association information meets the first preset condition when:
the contact area information is greater than a first threshold, or the user's hand covers the control device, or the number of fingers of the user's hand manipulating the control device is greater than or equal to a preset number and includes a first finger.
A specific finger (e.g., the index finger and/or the middle finger) is typically used when the control device is in the first working state. In addition, when using the control device in mouse mode, the user generally covers the mouse with the hand, so the number of recognizable fingers should be greater than or equal to the preset number; and when the control device is used in mouse mode, the contact area between the user's hand and the control device is larger than the first threshold.
By way of example, the fingers of the user's hand in the target image and their number may be determined by the hand recognition model, and the control device may be controlled to operate in the first working state when any two of the following three conditions are satisfied: the number of recognized fingers is more than two, the recognized fingers include the middle finger, and the contact area is greater than the first threshold.
Thus, when the contact area information is greater than the first threshold, or the user's hand covers the control device, or the number of fingers of the user's hand manipulating the control device is greater than or equal to the preset number and includes the first finger, the control device can be controlled to work in the first working state, so the corresponding working state of the control device can be determined quickly.
The association information meets the second preset condition when:
the contact area information is smaller than the first threshold, or the user's hand is holding the control device, or the number of fingers of the user's hand manipulating the control device is smaller than a preset number and includes a second finger.
A specific finger (e.g., the thumb) is typically used when the control device is in the second working state (handle mode). In addition, when using the control device in handle mode, the user generally grips the handle, so the number of fingers recognizable in the top view should be less than or equal to the preset number; and when the control device is used in handle mode, the contact area between the user's hand and the control device is smaller than or equal to the first threshold.
By way of example, the fingers and their number may be determined by a finger detection algorithm, and the control device may be controlled to operate in the second working state when any two of the following three conditions are satisfied: the number of recognized fingers is less than or equal to two, the recognized fingers include the thumb, and the contact area is less than or equal to the first threshold.
Thus, when the contact area information is smaller than the first threshold, or the user's hand is holding the control device, or the number of fingers of the user's hand manipulating the control device is smaller than the preset number and includes the second finger, the control device can be controlled to work in the second working state, so the corresponding working state of the control device can be determined quickly.
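Putting the two preset conditions together, the decision step might be sketched as a two-of-three vote over the contact area, the pose, and the finger evidence, mirroring the examples above. The area threshold and preset finger count are assumed values:

    def decide_working_state(contact_px: int, pose: str, fingers: list,
                             area_threshold: int = 5000,
                             preset_count: int = 2) -> str:
        """Return 'mouse' (first working state), 'handle' (second working
        state), or 'unchanged' when neither condition is met clearly."""
        mouse_votes = sum([
            contact_px > area_threshold,
            pose == "covering",
            len(fingers) >= preset_count and
                ("index" in fingers or "middle" in fingers),
        ])
        handle_votes = sum([
            contact_px <= area_threshold,
            pose == "holding",
            len(fingers) <= preset_count and "thumb" in fingers,
        ])
        if mouse_votes >= 2 and mouse_votes > handle_votes:
            return "mouse"
        if handle_votes >= 2:
            return "handle"
        return "unchanged"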
In addition, when the mode switch of the target device is completed, a prompt message may be output on the electronic device controlled by the control device; the control device may also vibrate and/or change the color of its indicator light, to feed back to the user that the mode switch of the control device has been completed.
In one possible embodiment, the control device is controlled to operate in the second working state when it is detected that the user is wearing smart glasses.
Smart glasses are a general term for glasses that, like smartphones, run an independent operating system.
A distance sensor may be installed in the smart glasses and used to detect whether the user is wearing them. Detecting that the user has put on the smart glasses indicates that the user may be about to play a game or use a remote-control function, so the control device may be made to operate in the second working state (e.g., handle mode).
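A minimal sketch of that wear check, assuming the distance sensor delivers periodic readings in millimetres and that a short run of near readings is debounced before the mode switch fires (both the threshold and the hold count are assumptions):

    class WearMonitor:
        """Debounced wear detection from the glasses' distance sensor."""

        def __init__(self, threshold_mm: float = 30.0, hold: int = 5):
            self.threshold_mm = threshold_mm  # 'near' reading means worn
            self.hold = hold                  # consecutive readings required
            self._near_count = 0

        def update(self, distance_mm: float) -> bool:
            """Feed one reading; return True once the glasses count as worn."""
            if distance_mm < self.threshold_mm:
                self._near_count += 1
            else:
                self._near_count = 0
            return self._near_count >= self.hold

When update() first returns True, the working state determining device would switch the control device into the second working state.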
Thus, by controlling the control device to work in the second working state when it is detected that the user is wearing smart glasses, the control device can be quickly switched to the working state the user needs.
In summary, in the embodiments of the present application, association information between the user's hand and the control device is acquired, and the working state of the control device is determined according to the association information, the control device being in a first working state when the association information meets a first preset condition and in a second working state when it meets a second preset condition. The user therefore does not need to manually switch between the modes of the control device: the working state can be determined directly from the association information, which improves the convenience with which the user controls the control device.
It should be noted that the execution body of the working state determining method provided in the embodiments of the present application may be a working state determining device, or a control module in the working state determining device for executing the working state determining method. In the embodiments of the present application, the method is described taking a working state determining device that executes the working state determining method as an example.
In addition, based on the above working state determining method, an embodiment of the present application further provides a working state determining device, which is described in detail below with reference to fig. 5.
Fig. 5 is a schematic structural diagram of a working state determining device according to an embodiment of the present application.
As shown in fig. 5, the operation state determining apparatus 500 may include:
the acquiring module 510 is configured to acquire association information between a hand of a user and a control device.
The determining module 520 is configured to determine an operating state of the control device according to the association information.
The control device is in a first working state when the association information meets a first preset condition, and is in a second working state when the association information meets a second preset condition.
In one possible embodiment, the association information includes: relative position information of the user's hand and the control device, and/or contact area information of the user's hand and the control device.
In one possible embodiment, the obtaining module 510 includes:
and the acquisition module is used for acquiring the target image.
And the identification module is used for identifying the hands of the user in the target image and the control equipment.
The determining module 520 is further configured to determine relative position information of the user's hand and the control device.
In a possible embodiment, the association information meets the first preset condition when:
the contact area information is greater than a first threshold, or the user's hand covers the control device, or the number of fingers of the user's hand manipulating the control device is greater than or equal to a preset number and includes a first finger.
In a possible embodiment, the association information meets the second preset condition when:
the contact area information is smaller than the first threshold, or the user's hand is holding the control device, or the number of fingers of the user's hand manipulating the control device is smaller than a preset number and includes a second finger.
In summary, the working state determining device provided in the embodiments of the present application acquires association information between the user's hand and the control device and determines the working state of the control device according to that information, the control device being in a first working state when the association information meets a first preset condition and in a second working state when it meets a second preset condition. The user therefore does not need to manually switch between the modes of the control device: the working state can be determined directly from the association information, which improves the convenience with which the user controls the control device.
The working state determining device in the embodiments of the present application may be a device, or a component, an integrated circuit, or a chip in a terminal. The device may be a mobile electronic device or a non-mobile electronic device. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), and the non-mobile electronic device may be a server, network attached storage (NAS), a personal computer (PC), a television (TV), a teller machine, or a self-service machine; the embodiments of the present application are not specifically limited in this respect.
The working state determining device in the embodiments of the present application may be a device having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
The working state determining device provided in the embodiment of the present application can implement each process implemented by the working state determining device in the method embodiment of fig. 2 to fig. 4, and in order to avoid repetition, a detailed description is omitted here.
Optionally, as shown in fig. 6, an embodiment of the present application further provides an electronic device 600, including a processor 601, a memory 602, and a program or instruction stored in the memory 602 and executable on the processor 601, where the program or instruction, when executed by the processor 601, implements each process of the above working state determining method embodiment with the same technical effects; to avoid repetition, details are not repeated here.
It should be noted that, the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 7 is a schematic hardware structure of another electronic device according to an embodiment of the present application.
The electronic device 700 includes, but is not limited to: radio frequency unit 701, network module 702, audio output unit 703, input unit 704, sensor 705, display unit 706, user input unit 707, interface unit 708, memory 709, processor 710, and power supply 711. Among other things, the input unit 704 may include a graphics processor 7041 and a microphone 7042; the display unit 706 may include a display panel 7061; the user input unit 707 may include a touch panel 7071 and other input devices 7072; memory 709 may include application programs and an operating system.
Those skilled in the art will appreciate that the electronic device 700 may further include a power supply 711 (e.g., a battery) for powering the various components; the power supply 711 may be logically coupled to the processor 710 via a power management system, which manages charging, discharging, and power consumption. The structure shown in fig. 7 does not constitute a limitation on the electronic device, which may include more or fewer components than shown, combine certain components, or arrange components differently; details are not repeated here.
And a processor 710 for acquiring association information between the user's hand and the control device.
The processor 710 is further configured to determine an operating state of the control device according to the association information.
Optionally, the network module 702 is configured to acquire a target image.
The processor 710 is further configured to identify a user's hand and control device in the target image.
The processor 710 is further configured to determine relative position information of the user's hand and the control device.
In the embodiments of the present application, association information between the user's hand and the control device is acquired, and the working state of the control device is determined according to the association information, the control device being in a first working state when the association information meets a first preset condition and in a second working state when it meets a second preset condition. The user therefore does not need to manually switch between the modes of the control device: the working state can be determined directly from the association information, which improves the convenience with which the user controls the control device.
An embodiment of the present application further provides a readable storage medium storing a program or instruction that, when executed by a processor, implements each process of the above working state determining method embodiment with the same technical effects; to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer-readable storage medium, such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
An embodiment of the present application further provides a chip including a processor and a communication interface coupled to the processor, the processor being configured to run a program or instruction to implement each process of the above working state determining method embodiment with the same technical effects; to avoid repetition, details are not repeated here.
It should be understood that the chip referred to in the embodiments of the present application may also be called a system-on-chip, a chip system, or a system-on-a-chip.
It should be noted that, in this document, the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element. Furthermore, it should be noted that the scope of the methods and devices in the embodiments of the present application is not limited to performing the functions in the order shown or discussed; depending on the functions involved, they may also be performed substantially simultaneously or in the reverse order. For example, the described methods may be performed in an order different from that described, and steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by software plus the necessary general-purpose hardware platform, or by hardware, though in many cases the former is preferred. Based on this understanding, the technical solution of the present application, or the part of it contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disk), including several instructions for causing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods described in the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above specific embodiments, which are merely illustrative and not restrictive. In light of the present application, those of ordinary skill in the art may make many further variations without departing from the spirit of the present application and the scope of the claims, all of which fall within the protection of the present application.

Claims (6)

1. A working state determining method, comprising:
collecting a target image;
identifying a user's hand and a control device in the target image;
determining association information between the user's hand and the control device, the association information comprising: relative position information of the user's hand and the control device and/or contact area information of the user's hand and the control device;
determining a working state of the control device according to the association information;
wherein the control device is in a first working state when the association information meets a first preset condition, and in a second working state when the association information meets a second preset condition, the first working state being a mouse mode and the second working state being a handle mode.
2. The method of claim 1, wherein the association information meeting the first preset condition comprises:
the contact area information is greater than a first threshold, or the user's hand covers the control device, or the number of fingers of the user's hand manipulating the control device is greater than or equal to a preset number and includes a first finger.
3. The method of claim 1, wherein the association information meeting the second preset condition comprises:
the contact area information is smaller than a first threshold, or the user's hand is holding the control device, or the number of fingers of the user's hand manipulating the control device is smaller than a preset number and includes a second finger.
4. A working state determining device, comprising:
the acquisition module is used for acquiring a target image;
the identification module is used for identifying the user's hand and the control device in the target image;
a determining module, configured to determine association information between the user's hand and the control device, the association information comprising: relative position information of the user's hand and the control device and/or contact area information of the user's hand and the control device;
the determining module is further used for determining the working state of the control device according to the association information;
wherein the control device is in a first working state when the association information meets a first preset condition, and in a second working state when the association information meets a second preset condition, the first working state being a mouse mode and the second working state being a handle mode.
5. The device of claim 4, wherein the association information meeting the first preset condition comprises:
the contact area information is greater than a first threshold, or the user's hand covers the control device, or the number of fingers of the user's hand manipulating the control device is greater than or equal to a preset number and includes a first finger.
6. The device of claim 4, wherein the association information meeting the second preset condition comprises:
the contact area information is smaller than a first threshold, or the user's hand is holding the control device, or the number of fingers of the user's hand manipulating the control device is smaller than a preset number and includes a second finger.
CN202110120545.6A 2021-01-28 2021-01-28 Working state determining method and device Active CN112818825B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110120545.6A CN112818825B (en) 2021-01-28 2021-01-28 Working state determining method and device
PCT/CN2022/073568 WO2022161324A1 (en) 2021-01-28 2022-01-24 Working state determination method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110120545.6A CN112818825B (en) 2021-01-28 2021-01-28 Working state determining method and device

Publications (2)

Publication Number Publication Date
CN112818825A CN112818825A (en) 2021-05-18
CN112818825B (en) 2024-02-23

Family

ID=75859895

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110120545.6A Active CN112818825B (en) 2021-01-28 2021-01-28 Working state determining method and device

Country Status (2)

Country Link
CN (1) CN112818825B (en)
WO (1) WO2022161324A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112818825B (en) * 2021-01-28 2024-02-23 维沃移动通信有限公司 Working state determining method and device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105117016A (en) * 2015-09-07 2015-12-02 众景视界(北京)科技有限公司 Interaction handle used in interaction control of virtual reality and augmented reality
CN107831921A (en) * 2017-11-24 2018-03-23 深圳多哚新技术有限责任公司 A kind of handle space position and the determination method, apparatus and system of coding corresponding relation
CN108536273A (en) * 2017-03-01 2018-09-14 天津锋时互动科技有限公司深圳分公司 Man-machine menu mutual method and system based on gesture
CN111459317A (en) * 2020-04-24 2020-07-28 维沃移动通信有限公司 Input device and electronic device input system
CN211653611U (en) * 2020-03-23 2020-10-09 北京凌宇智控科技有限公司 Multifunctional input device
WO2020215565A1 (en) * 2019-04-26 2020-10-29 平安科技(深圳)有限公司 Hand image segmentation method and apparatus, and computer device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080276172A1 (en) * 2007-05-03 2008-11-06 International Business Machines Corporation Dynamic mouse over handling for tightly packed user interface components
CN107193366A (en) * 2017-04-01 2017-09-22 邓伟娜 A kind of keyboard/mouse automatic switching method and device, a kind of computer system
CN109908575A (en) * 2019-02-14 2019-06-21 深圳威尔视觉传媒有限公司 The method and relevant apparatus of input pattern are judged based on images match
CN112818825B (en) * 2021-01-28 2024-02-23 维沃移动通信有限公司 Working state determining method and device

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105117016A (en) * 2015-09-07 2015-12-02 众景视界(北京)科技有限公司 Interaction handle used in interaction control of virtual reality and augmented reality
CN108536273A (en) * 2017-03-01 2018-09-14 天津锋时互动科技有限公司深圳分公司 Man-machine menu mutual method and system based on gesture
CN107831921A (en) * 2017-11-24 2018-03-23 深圳多哚新技术有限责任公司 A kind of handle space position and the determination method, apparatus and system of coding corresponding relation
WO2020215565A1 (en) * 2019-04-26 2020-10-29 平安科技(深圳)有限公司 Hand image segmentation method and apparatus, and computer device
CN211653611U (en) * 2020-03-23 2020-10-09 北京凌宇智控科技有限公司 Multifunctional input device
CN111459317A (en) * 2020-04-24 2020-07-28 维沃移动通信有限公司 Input device and electronic device input system

Also Published As

Publication number Publication date
CN112818825A (en) 2021-05-18
WO2022161324A1 (en) 2022-08-04

Similar Documents

Publication Publication Date Title
CN113132618B (en) Auxiliary photographing method and device, terminal equipment and storage medium
WO2021035646A1 (en) Wearable device and control method therefor, gesture recognition method, and control system
JP2015521312A (en) User input processing by target tracking
US20220383523A1 (en) Hand tracking method, device and system
CN104298340A (en) Control method and electronic equipment
CN114138121B (en) User gesture recognition method, device and system, storage medium and computing equipment
CN111881703A (en) Graphic code identification method and device and electronic equipment
CN110866940A (en) Virtual picture control method and device, terminal equipment and storage medium
WO2023016372A1 (en) Control method and apparatus, and electronic device and storage medium
CN111625157A (en) Fingertip key point detection method, device, equipment and readable storage medium
CN112818825B (en) Working state determining method and device
CN111651106A (en) Unread message prompting method, unread message prompting device, unread message prompting equipment and readable storage medium
CN106569716B (en) Single-hand control method and control system
CN114125226A (en) Image shooting method and device, electronic equipment and readable storage medium
CN114217754A (en) Screen projection control method and device, electronic equipment and storage medium
CN111597009A (en) Application program display method and device and terminal equipment
CN112333439A (en) Face cleaning equipment control method and device and electronic equipment
CN113282164A (en) Processing method and device
JP5558899B2 (en) Information processing apparatus, processing method thereof, and program
CN111007942A (en) Wearable device and input method thereof
CN112328164B (en) Control method and electronic equipment
CN110944084B (en) Single-hand mode control method, terminal and computer storage medium
CN113867863A (en) Graphic code processing method and device
CN103873769A (en) Image search systems and methods
CN103793053B (en) Gesture projection method and device for mobile terminals

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant