WO2022161324A1 - Working state determination method and device - Google Patents
Working state determination method and device
- Publication number
- WO2022161324A1 (PCT/CN2022/073568)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- control device
- user
- hand
- working state
- preset condition
- Prior art date
Classifications
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
- G06F18/24—Classification techniques
- G06V10/22—Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
- G06V10/267—Segmentation of patterns in the image field by performing operations on regions, e.g. growing, shrinking or watersheds
- G06V40/107—Static hand or arm
Definitions
- the embodiments of the present application relate to the field of information processing, and in particular, to a method and device for determining a working state.
- With the development of control technology, control devices that support various control modes have emerged.
- the control device may have a handle mode in addition to a mouse mode for controlling the movement and positioning of the cursor.
- In the related art, the user is required to manually switch between different modes of the control device, which is cumbersome and inconvenient to operate.
- the embodiments of the present application provide a working state determination method and apparatus, so as to solve the problem that the user needs to manually switch between different modes of the control device, and the operation is complicated.
- an embodiment of the present application provides a method for determining a working state, and the method may include: acquiring association information between the user's hand and the control device, and determining the working state of the control device according to the association information;
- wherein, when the association information satisfies the first preset condition, the control device is in the first working state, and when the association information satisfies the second preset condition, the control device is in the second working state.
- an embodiment of the present application provides an apparatus for determining a working state, and the apparatus may include:
- an acquisition module used to acquire the association information between the user's hand and the control device
- a determination module used to determine the working state of the control device according to the associated information
- wherein, when the associated information satisfies the first preset condition, the control device is in the first working state, and when the associated information satisfies the second preset condition, the control device is in the second working state.
- embodiments of the present application provide an electronic device, the electronic device includes a processor, a memory, and a program or instruction stored on the memory and executable on the processor, where the program or instruction, when executed by the processor, implements the steps of the method according to the first aspect.
- an embodiment of the present application provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or instruction is executed by a processor, the steps of the method according to the first aspect are implemented.
- an embodiment of the present application provides a chip, the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement the method described in the first aspect.
- In the embodiments of the present application, the association information between the user's hand and the control device is obtained, and the working state of the control device is determined according to the association information; wherein, when the association information satisfies the first preset condition, the control device is in the first working state, and when the association information satisfies the second preset condition, the control device is in the second working state. Therefore, the user does not need to manually switch between different modes of the control device, and the working state of the control device can be directly determined according to the association information between the user's hand and the control device, thereby improving the convenience for the user to operate the control device.
- FIG. 1 is a schematic diagram of an application scenario of the method for determining a working state provided by an embodiment of the present application
- FIG. 2 is a flowchart of a method for determining a working state provided by an embodiment of the present application
- FIG. 3 is a schematic diagram of a depth of field effect of an image provided by an embodiment of the present application.
- FIG. 4 is a schematic diagram of a user's hand provided by an embodiment of the present application.
- FIG. 5 is a schematic structural diagram of an apparatus for determining a working state according to an embodiment of the present application
- FIG. 6 is a schematic diagram of a hardware structure of an electronic device provided by an embodiment of the present application.
- FIG. 7 is a schematic diagram of a hardware structure of another electronic device according to an embodiment of the present application.
- the working state determination method provided by the embodiment of the present application can be applied to at least the following application scenarios, which will be described below.
- the user 10 wants to use the control device 20 to control the electronic device 30. The user 10 can switch the control device 20 to the mouse mode to control the electronic device 30, for example, using the control device 20 to control the electronic device 30 to play a video; or switch the control device 20 to the handle mode, for example, using the control device 20 to control the electronic device 30 to play games.
- When the user wants to change the working state of the control device, it needs to be switched manually, and the operation is cumbersome.
- the embodiments of the present application provide a working state determination method, device, electronic device and storage medium, so as to solve the problem in the related art that the user needs to manually switch between different modes of the control device and the operation is tedious.
- the methods provided in the embodiments of the present application can also be applied to any scenarios where the user needs to manually switch between different modes of the control device.
- In the embodiments of the present application, the association information between the user's hand and the control device is obtained, and the working state of the control device is determined according to the association information; wherein, when the association information satisfies the first preset condition, the control device is in the first working state, and when the association information satisfies the second preset condition, the control device is in the second working state. Therefore, it is not necessary for the user to manually switch between different modes of the control device, and the working state of the control device can be directly determined according to the association information between the user's hand and the control device, thereby improving the convenience for the user to operate the control device.
- FIG. 2 is a flowchart of a method for determining a working state provided by an embodiment of the present application.
- the working state determination method may include steps 210 to 220 , and the method is applied to the working state determination device, and is specifically as follows:
- Step 210: Acquire association information between the user's hand and the control device.
- Step 220: Determine the working state of the control device according to the associated information.
- wherein, when the associated information satisfies the first preset condition, the control device is in the first working state, and when the associated information satisfies the second preset condition, the control device is in the second working state.
- The working state determination method obtains the association information between the user's hand and the control device and determines the working state of the control device according to the association information; wherein, when the association information satisfies the first preset condition, the control device is in the first working state, and when the association information satisfies the second preset condition, the control device is in the second working state. Therefore, it is not necessary for the user to manually switch between different modes of the control device, and the working state of the control device can be directly determined according to the association information between the user's hand and the control device, thereby improving the convenience for the user to operate the control device.
- First, step 210 is described.
- control device may have various control modes.
- control device may have a handle mode in addition to a mouse mode for controlling the movement and positioning of the cursor.
- the above-mentioned related information includes: relative position information of the user's hand and the control device, and/or contact area information of the user's hand and the control device.
- the above-mentioned step 210 may specifically include the following steps: collect the target image; identify the user's hand and the control device in the target image; determine the relative position information of the user's hand and the control device.
- the target image may be collected through an electronic device with an image collection function (e.g., smart glasses).
- the user's hand and the control device can be identified from the target image based on the image segmentation algorithm; the relative positional relationship between the user's hand and the control device can be determined based on the depth-of-field algorithm.
- First, the user's hand and the control device are identified from the target image based on an image segmentation algorithm.
- the image segmentation algorithm is a pixel-level classification algorithm that assigns pixels belonging to the same object to one class; in other words, image segmentation understands the image at the pixel level.
- For example, when the target image includes the user's hand and the control device, the pixels belonging to the user's hand are divided into one category, the pixels belonging to the control device are divided into another category, and the background pixels can also be divided into a third category.
- In this way, the user's hand and the control device can be identified from the target image based on the image segmentation algorithm.
- the physical characteristics of the user's hand and the control device can be used to determine the image area corresponding to the user's hand and the image area corresponding to the control device.
- In this way, the two different objects, the user's hand and the control device, are separated into their respective categories, as illustrated in the sketch below.
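- The pixel-level separation described above can be illustrated with a minimal sketch. The class labels and the function name are illustrative assumptions; the patent does not specify a particular segmentation network or labeling scheme.

```python
import numpy as np

# Class labels assumed for illustration; the patent does not fix a labeling scheme.
BACKGROUND, HAND, DEVICE = 0, 1, 2

def split_regions(seg_map: np.ndarray):
    """Given a per-pixel class map (H x W) produced by a segmentation model,
    return boolean masks for the user's hand and the control device."""
    hand_mask = seg_map == HAND
    device_mask = seg_map == DEVICE
    return hand_mask, device_mask
```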
- the relative positional relationship between the user's hand and the control device may be determined based on the depth-of-field algorithm, and the associated information may be determined according to the relative positional relationship.
- Depth of field refers to the range of distances in front of and behind the subject within which a camera lens or other imager can form a sharp image. Aperture, focal length, and the distance from the focal plane to the subject are the main factors that affect depth of field: after focusing is completed, the range in front of and behind the focus within which the image remains sharp is called the depth of field.
- The relative positional relationship between the user's hand and the control device can be detected by the depth-of-field algorithm, so that the associated information can be determined according to the relative positional relationship. If the user's hand is wrapped over the top of the control device, it can be preliminarily judged that the control device should work in the mouse state, and further confirmation can be made in combination with the contact area between the user's hand and the control device; if the user's hand is behind the control device in a holding posture, it can be preliminarily judged that the control device should work in the handle state, and further confirmation can likewise be made in combination with the contact area between the user's hand and the control device.
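- A minimal sketch of this depth-based judgment follows, assuming a per-pixel depth map (from a depth-of-field algorithm or, as described next, from a depth sensor). The 2 cm margin is an illustrative assumption, not a value taken from the patent.

```python
import numpy as np

def relative_position(depth: np.ndarray, hand_mask: np.ndarray,
                      device_mask: np.ndarray) -> str:
    """Compare mean depths (smaller = closer to the camera) to judge whether
    the hand covers the device or holds it from behind."""
    hand_depth = float(depth[hand_mask].mean())
    device_depth = float(depth[device_mask].mean())
    if hand_depth < device_depth - 0.02:   # hand closer: covering posture
        return "hand_over_device"          # candidate for mouse mode
    if hand_depth > device_depth + 0.02:   # hand farther: holding posture
        return "hand_behind_device"        # candidate for handle mode
    return "ambiguous"
```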
- Alternatively, the above-mentioned step 210 may specifically include the following steps: acquiring depth data from a sensor, and determining the relative position information of the user's hand and the control device according to the depth data.
- the above-mentioned relative position information of the user's hand and the control device may at least include the following situations: the first preset condition corresponding to the first working state (e.g., mouse mode) may be that the user's hand is in contact with the first surface of the control device; the second preset condition corresponding to the second working state (e.g., handle mode) may be that the user's hand is in contact with the second surface of the control device, where the first surface is opposite to the second surface.
- the following step may be further included: determining the contact area between the user's hand and the control device.
- Specifically, the overlapping part between the image area corresponding to the user's hand and the image area corresponding to the control device can be determined based on a depth-of-field algorithm or based on sensor data; then the area corresponding to the overlapping part can be determined and taken as the above-mentioned contact area.
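- A minimal sketch of this overlap-based estimate, assuming the boolean masks from the segmentation step and a per-pixel area obtained from camera calibration (the calibration value is an assumption for illustration):

```python
import numpy as np

def contact_area(hand_mask: np.ndarray, device_mask: np.ndarray,
                 pixel_area_cm2: float) -> float:
    """Approximate the contact area as the overlap between the hand region
    and the device region, scaled by the real-world area of one pixel."""
    overlap = np.logical_and(hand_mask, device_mask)
    return float(overlap.sum()) * pixel_area_cm2
```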
- the target image mentioned above may be a top view of the user's hand and the target device.
- the following steps may be included: identifying, from the target image based on the hand recognition model, the fingers of the user's hand, the number of fingers, and the contact area between the user's hand and the target device.
- The hand recognition model can be obtained by training on a first image including the user's hand together with the finger annotations in the first image; it can also be obtained by training on a first image including the user's hand together with the number of fingers in the first image; it can also be obtained by training on a first image including the user's hand and the target device together with the contact area between the user's hand and the target device.
- The user's hand can be modeled based on the key feature points of the user's hand, so that even when the image includes many elements, the image area corresponding to the user's hand can be quickly identified from the target image. Further, according to the hand recognition model, the fingers of the user's hand, the number of fingers, and the contact area between the user's hand and the target device can be determined from the image area corresponding to the hand, which can then be used for judgment.
- In this way, the fingers of the user's hand, the number of fingers, and the contact area between the user's hand and the target device can be quickly and accurately identified from the target image based on the hand recognition model, and can then be used to determine the association information between the user's hand and the control device.
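- As a sketch of finger identification from key feature points, the snippet below uses MediaPipe Hands as a stand-in for the patent's unspecified hand recognition model (an assumption, not the patent's actual model); the extended-finger heuristic is likewise illustrative.

```python
import mediapipe as mp

FINGER_TIPS = [8, 12, 16, 20]   # index, middle, ring, little fingertips
FINGER_PIPS = [6, 10, 14, 18]   # corresponding middle joints

def count_extended_fingers(rgb_image) -> int:
    """Count fingers whose tip lies above its middle joint (smaller y in
    normalized image coordinates), a crude stand-in for the patent's
    finger-number judgment."""
    with mp.solutions.hands.Hands(static_image_mode=True) as hands:
        results = hands.process(rgb_image)
        if not results.multi_hand_landmarks:
            return 0
        lm = results.multi_hand_landmarks[0].landmark
        return sum(1 for tip, pip in zip(FINGER_TIPS, FINGER_PIPS)
                   if lm[tip].y < lm[pip].y)
```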
- Next, step 220 is described.
- According to the associated information, the working state of the control device is determined: when the associated information satisfies the first preset condition, the control device is in the first working state; when the associated information satisfies the second preset condition, the control device is in the second working state.
- That is, when the associated information satisfies the first preset condition, it is determined that the control device is in the first working state, so that the control device can be manipulated to work in the first working state; when the associated information satisfies the second preset condition, it is determined that the control device is in the second working state, so that the control device can be manipulated to work in the second working state.
- the above-mentioned related information satisfies the first preset condition, including:
- the contact area information is greater than the first threshold, or the user's hand covers the control device, or the number of fingers of the user's hand that manipulates the control device is greater than or equal to a preset number and includes the first finger (a specific finger such as the index finger and/or the middle finger).
- When the user uses the control device in the mouse mode, the user usually needs to cover the mouse with the hand, so the number of identifiable fingers needs to be greater than or equal to the preset number; and when the user uses the control device in the mouse mode, the contact area between the user's hand and the control device is greater than the first threshold.
- The number of fingers of the user's hand in the target image and the specific fingers detected can be determined by the hand recognition model; when the number of identified fingers is more than two, the fingers include the middle finger, and the contact area is larger than the first threshold, the control device can be manipulated to work in the first working state.
- In this way, the working state corresponding to the control device can be quickly determined.
- the above-mentioned related information satisfies the second preset condition, including:
- the contact area information is less than the first threshold, or the user's hand is behind the control device, or the number of fingers of the user's hand that manipulates the control device is less than a preset number and includes a second finger (a specific finger such as the thumb); in this case, the control device is in the second working state (handle mode).
- When the user uses the control device in the handle mode, it is usually necessary to hold the handle, so the number of identifiable fingers in the top view needs to be less than or equal to the preset number; and when the user uses the control device in the handle mode, the contact area between the user's hand and the control device is less than or equal to the first threshold.
- The number of fingers and the specific fingers detected are determined by a finger detection algorithm; when the number of identified fingers is less than or equal to two, the fingers include the thumb, and the contact area is less than or equal to the first threshold, the control device can be manipulated to work in the second working state.
- In the case where the contact area information is less than the first threshold, or the user's hand is holding the control device, or the number of fingers of the user's hand that manipulates the control device is less than the preset number and includes the second finger, the target device is controlled to work in the second working state, and the working state corresponding to the control device can be quickly determined.
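- The two preset conditions can be summarized in a short decision sketch. The threshold value, the preset finger number, and the mode names are illustrative assumptions; the patent leaves the concrete values open.

```python
AREA_THRESHOLD_CM2 = 40.0  # illustrative first threshold, not from the patent

def determine_working_state(contact_area_cm2: float, finger_count: int,
                            fingers: set, position: str) -> str:
    """Return 'mouse' (first working state) or 'handle' (second working
    state), mirroring the or-joined preset conditions described above."""
    if (contact_area_cm2 > AREA_THRESHOLD_CM2
            or position == "hand_over_device"
            or (finger_count >= 3 and "middle" in fingers)):
        return "mouse"    # first preset condition satisfied
    if (contact_area_cm2 <= AREA_THRESHOLD_CM2
            or position == "hand_behind_device"
            or (finger_count <= 2 and "thumb" in fingers)):
        return "handle"   # second preset condition satisfied
    return "unchanged"
```

For example, `determine_working_state(55.0, 4, {"index", "middle"}, "hand_over_device")` returns "mouse".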
- prompt information can be output on the electronic device controlled by the control device.
- It is also possible to vibrate the control device and/or switch the color of the indicator lights on the control device, in order to feed back to the user that the mode switching of the control device has been completed.
- the control device when it is detected that the user is wearing the smart glasses, the control device is operated in the second working state.
- Smart glasses are a general term for glasses that, like a smartphone, have an independent operating system.
- a distance sensor can be installed in the smart glasses.
- The distance sensor can be used to detect whether the user is wearing the smart glasses. When it is detected that the user is wearing the smart glasses, it means that the user may want to play a game or use a remote control function, so the control device can be manipulated to work in the second working state (e.g., handle mode).
- In this way, the working state of the control device can be quickly switched to the working state that the user needs the control device to enter.
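- A sketch of this wear-detection trigger is shown below. The sensor callback and the `set_working_state` method are hypothetical interfaces invented for illustration; an actual device would expose its own driver API.

```python
WEAR_DISTANCE_MM = 30.0  # illustrative threshold for "glasses on the face"

def on_distance_reading(distance_mm: float, controller) -> None:
    """Hypothetical callback wired to the glasses' distance sensor: a nearby
    surface (the user's face) means the glasses are being worn, so the
    control device is switched to the second working state (handle mode)."""
    if distance_mm < WEAR_DISTANCE_MM:
        controller.set_working_state("handle")
```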
- The association information between the user's hand and the control device is obtained, and the working state of the control device is determined according to the association information; wherein, when the association information satisfies the first preset condition, the control device is in the first working state, and when the association information satisfies the second preset condition, the control device is in the second working state. Therefore, the user does not need to manually switch between different modes of the control device, and the working state of the control device can be directly determined according to the association information between the user's hand and the control device, thereby improving the convenience for the user to operate the control device.
- the execution body may be a working state determining device, or a control module in the working state determining device for executing the working state determination method.
- the method for determining the working state provided by the embodiments of the present application is described by taking the working state determination method performed by the working state determining device as an example.
- an embodiment of the present application further provides a working state determination device, which will be described in detail with reference to FIG. 5 .
- FIG. 5 is a schematic structural diagram of an apparatus for determining a working state according to an embodiment of the present application.
- the working state determining apparatus 500 may include:
- the obtaining module 510 is configured to obtain the association information between the user's hand and the control device.
- the determining module 520 is configured to determine the working state of the control device according to the associated information.
- wherein, when the associated information satisfies the first preset condition, the control device is in the first working state, and when the associated information satisfies the second preset condition, the control device is in the second working state.
- the associated information includes: relative position information of the user's hand and the control device, and/or contact area information of the user's hand and the control device.
- the obtaining module 510 includes:
- the acquisition module is used to acquire target images.
- the recognition module is used to recognize the user's hand and the control device in the target image.
- the determining module 520 is further configured to determine the relative position information of the user's hand and the control device.
- the associated information satisfies the first preset condition, including:
- the contact area information is greater than the first threshold, or the user's hand covers the control device, or the number of fingers in the user's hand that manipulates the control device is greater than or equal to a preset number, including the first finger.
- the associated information satisfies the second preset condition, including:
- the contact area information is less than the first threshold, or the user's hand is behind the control device, or the number of fingers in the user's hand that manipulates the control device is less than a preset number, including a second finger.
- The device for determining the working state obtains the association information between the user's hand and the control device and determines the working state of the control device according to the association information; wherein, when the association information satisfies the first preset condition, the control device is in the first working state, and when the association information satisfies the second preset condition, the control device is in the second working state. Therefore, the user does not need to manually switch between different modes of the control device, and the working state of the control device can be directly determined according to the association information between the user's hand and the control device, thereby improving the convenience for the user to operate the control device.
- the apparatus for determining the working state in the embodiment of the present application may be an apparatus, or may be a component, an integrated circuit, or a chip in a terminal.
- the apparatus may be a mobile electronic device or a non-mobile electronic device.
- the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, an in-vehicle electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), etc.
- the non-mobile electronic device may be a server, a network attached storage (NAS), a personal computer (PC), a television (TV), a teller machine, or a self-service machine, etc.
- the device for determining the working state in the embodiment of the present application may be a device having an operating system.
- the operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
- the working state determining apparatus provided in the embodiment of the present application can implement each process implemented by the working state determining apparatus in the method embodiments of FIG. 2 to FIG. 4 . To avoid repetition, details are not repeated here.
- an embodiment of the present application further provides an electronic device 600, including a processor 601, a memory 602, and a program or instruction stored in the memory 602 and executable on the processor 601; when the program or instruction is executed by the processor 601, each process of the above working state determination method embodiment can be implemented, with the same technical effect. To avoid repetition, details are not repeated here.
- the electronic devices in the embodiments of the present application include the aforementioned mobile electronic devices and non-mobile electronic devices.
- FIG. 7 is a schematic diagram of a hardware structure of another electronic device according to an embodiment of the present application.
- the electronic device 700 includes but is not limited to: a radio frequency unit 701, a network module 702, an audio output unit 703, an input unit 704, a sensor 705, a display unit 706, a user input unit 707, an interface unit 708, a memory 709, a processor 710 and a power supply 711 and other components.
- the input unit 704 may include a graphics processor 7041 and a microphone 7042; the display unit 706 may include a display panel 7061; the user input unit 707 may include a touch panel 7071 and other input devices 7072; and the memory 709 may include application programs and operating systems.
- the electronic device 700 may also include a power supply 711 (such as a battery) for supplying power to various components, and the power supply 711 may be logically connected to the processor 710 through a power management system, so as to implement functions such as managing charging, discharging, and power consumption.
- the structure of the electronic device shown in FIG. 7 does not constitute a limitation on the electronic device.
- the electronic device may include more or fewer components than those shown, or combine some components, or arrange components differently, which will not be repeated here.
- the processor 710 is configured to acquire association information between the user's hand and the control device.
- the processor 710 is further configured to determine the working state of the control device according to the associated information.
- the network module 702 is used to collect target images.
- the processor 710 is further configured to identify the user's hand and the control device in the target image.
- the processor 710 is further configured to determine the relative position information of the user's hand and the control device.
- In this way, the association information between the user's hand and the control device is obtained, and the working state of the control device is determined according to the association information; wherein, when the association information satisfies the first preset condition, the control device is in the first working state, and when the association information satisfies the second preset condition, the control device is in the second working state. Therefore, the user does not need to manually switch between different modes of the control device, and the working state of the control device can be directly determined according to the association information between the user's hand and the control device, thereby improving the convenience for the user to operate the control device.
- Embodiments of the present application further provide a readable storage medium, where a program or an instruction is stored on the readable storage medium; when the program or instruction is executed by a processor, each process of the above working state determination method embodiment can be implemented, with the same technical effect. To avoid repetition, details are not repeated here.
- the processor is the processor in the electronic device described in the foregoing embodiments.
- the readable storage medium includes a computer-readable storage medium, and examples of the computer storage medium include tangible (non-transitory) computer-readable storage media, such as computer read-only memory (Read-Only Memory, ROM), random access memory (Random Access Memory, RAM), magnetic disk, optical disk, electronic circuit, semiconductor memory device, flash memory, erasable ROM (EROM), or floppy disk, etc.
- An embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each process of the above embodiment of the method for determining a working state, with the same technical effect. To avoid repetition, details are not repeated here.
- the chip mentioned in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system, or a system-on-a-chip, or the like.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Engineering & Computer Science (AREA)
- Evolutionary Computation (AREA)
- Evolutionary Biology (AREA)
- Bioinformatics & Computational Biology (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Artificial Intelligence (AREA)
- Human Computer Interaction (AREA)
- Life Sciences & Earth Sciences (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Psychiatry (AREA)
- Social Psychology (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Association information between a user's hand and a control device is obtained, and the working state of the control device is determined according to the association information; when the association information satisfies a first preset condition, the control device is in a first working state, and when the association information satisfies a second preset condition, the control device is in a second working state.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110120545.6A CN112818825B (zh) | 2021-01-28 | 2021-01-28 | Working state determination method and device (工作状态确定方法和装置) |
CN202110120545.6 | 2021-01-28 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022161324A1 true WO2022161324A1 (fr) | 2022-08-04 |
Family
ID=75859895
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2022/073568 WO2022161324A1 (fr) | 2022-01-24 | Working state determination method and device |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN112818825B (fr) |
WO (1) | WO2022161324A1 (fr) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112818825B (zh) * | 2021-01-28 | 2024-02-23 | 维沃移动通信有限公司 | Working state determination method and device |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080276172A1 (en) * | 2007-05-03 | 2008-11-06 | International Business Machines Corporation | Dynamic mouse over handling for tightly packed user interface components |
CN107193366A (zh) * | 2017-04-01 | 2017-09-22 | 邓伟娜 | Keyboard/mouse automatic switching method and device, and computer system |
CN109908575A (zh) * | 2019-02-14 | 2019-06-21 | 深圳威尔视觉传媒有限公司 | Method for judging input mode based on image matching and related device |
CN111459317A (zh) * | 2020-04-24 | 2020-07-28 | 维沃移动通信有限公司 | Input device and electronic device input system |
CN211653611U (zh) * | 2020-03-23 | 2020-10-09 | 北京凌宇智控科技有限公司 | Multifunctional input device |
CN112818825A (zh) * | 2021-01-28 | 2021-05-18 | 维沃移动通信有限公司 | Working state determination method and device |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105117016A (zh) * | 2015-09-07 | 2015-12-02 | 众景视界(北京)科技有限公司 | Interactive handle for interaction control in virtual reality and augmented reality |
CN108536273B (zh) * | 2017-03-01 | 2024-10-18 | 深圳巧牛科技有限公司 | Gesture-based human-machine menu interaction method and system |
CN107831921B (zh) * | 2017-11-24 | 2020-01-10 | 深圳多哚新技术有限责任公司 | Method, device and system for determining correspondence between handle spatial position and code |
CN110232311B (zh) * | 2019-04-26 | 2023-11-14 | 平安科技(深圳)有限公司 | Hand image segmentation method and device, and computer equipment |
- 2021-01-28: CN application CN202110120545.6A filed; published as CN112818825B (status: active)
- 2022-01-24: PCT application PCT/CN2022/073568 filed as WO2022161324A1 (status: application filing)
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080276172A1 (en) * | 2007-05-03 | 2008-11-06 | International Business Machines Corporation | Dynamic mouse over handling for tightly packed user interface components |
CN107193366A (zh) * | 2017-04-01 | 2017-09-22 | 邓伟娜 | Keyboard/mouse automatic switching method and device, and computer system |
CN109908575A (zh) * | 2019-02-14 | 2019-06-21 | 深圳威尔视觉传媒有限公司 | Method for judging input mode based on image matching and related device |
CN211653611U (zh) * | 2020-03-23 | 2020-10-09 | 北京凌宇智控科技有限公司 | Multifunctional input device |
CN111459317A (zh) * | 2020-04-24 | 2020-07-28 | 维沃移动通信有限公司 | Input device and electronic device input system |
CN112818825A (zh) * | 2021-01-28 | 2021-05-18 | 维沃移动通信有限公司 | Working state determination method and device |
Also Published As
Publication number | Publication date |
---|---|
CN112818825A (zh) | 2021-05-18 |
CN112818825B (zh) | 2024-02-23 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 22745199; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: PCT application non-entry in European phase | Ref document number: 22745199; Country of ref document: EP; Kind code of ref document: A1 |
| 32PN | Ep: public notification in the EP bulletin as address of the addressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 18-01-2024) |