CN116009702A - Head wearable device, hand wearable device and gesture recognition method - Google Patents
- Publication number
- CN116009702A (application number CN202310127005.XA)
- Authority
- CN
- China
- Prior art keywords
- gesture recognition
- wearable device
- hand
- trigger instruction
- recognition result
- Prior art date
- Legal status: Pending
Classifications
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Abstract
The embodiments of the present application provide a head wearable device, a hand wearable device and a gesture recognition method, relating to the field of computer technology. The interactive gesture recognition process between the head wearable device and the hand wearable device includes: the hand wearable device sends a trigger instruction to the head wearable device when determining, based on inertial data collected by an inertial measurement unit, that it is in a preset motion state. The head wearable device responds to the trigger instruction by starting a gesture recognition module, which performs gesture recognition based on the media information collected by an image sensor to obtain a gesture recognition result. Because the gesture recognition module in the head wearable device does not need to remain in a working state for a long time, the power consumption of the head wearable device is effectively reduced and its battery life is improved.
Description
Technical Field
The embodiments of the present application relate to the field of computer technology, and in particular to a head wearable device, a hand wearable device and a gesture recognition method.
Background
With the development of science and technology, the market for smart glasses such as Augmented Reality (AR) glasses and Virtual Reality (VR) glasses has gradually grown. A Head Mounted Display (HMD) such as AR or VR glasses can present images to the user through a display screen to provide an immersive experience.
In the related art, the head wearable device recognizes the user's gestures to interact with the user. However, the gesture recognition module in a head wearable device consumes considerable power. If the gesture recognition module remains in a working state for a long time, the power consumption of the head wearable device becomes excessive, which in turn reduces its battery life.
Disclosure of Invention
The embodiments of the present application provide a head wearable device, a hand wearable device and a gesture recognition method, which reduce the power consumption of the head wearable device and improve its battery life.
In one aspect, embodiments of the present application provide a head wearable device, including:
a processor, a first wireless module, an image sensor and a first gesture recognition module;
the first wireless module is used for receiving a trigger instruction sent by the hand wearable device and transmitting the trigger instruction to the processor, wherein the trigger instruction is sent when the hand wearable device determines that the hand wearable device is in a preset motion state based on acquired inertial data;
the processor is used for starting the first gesture recognition module after receiving the trigger instruction;
the first gesture recognition module is used for performing gesture recognition according to the media information acquired by the image sensor to obtain a gesture recognition result.
Optionally, the processor is further configured to:
after receiving the trigger instruction and before starting the first gesture recognition module, start the image sensor, so that the image sensor in the head wearable device does not need to remain in a working state for a long time, further reducing the power consumption of the head wearable device.
Optionally, the processor is further configured to:
activate the image sensor before receiving the trigger instruction.
Optionally, the media information includes:
n frames of images acquired by the image sensor before receiving the trigger instruction and M frames of images acquired by the image sensor after receiving the trigger instruction, wherein N and M are positive integers.
In the embodiments of the present application, gesture recognition combines images of targets such as the hand or wrist acquired by the image sensor before the trigger instruction is received with images acquired after the trigger instruction is received, which improves the accuracy of gesture recognition by the first gesture recognition module.
Optionally, the first wireless module is further configured to receive the inertial data sent by the hand wearable device;
the first gesture recognition module is further configured to perform gesture recognition according to the inertial data and the media information acquired by the image sensor, so as to obtain a gesture recognition result.
Because the inertial data collected by the inertial measurement unit of the hand wearable device correlates with the interactive gesture performed by the user, performing gesture recognition based on both the inertial data and the media information collected by the image sensor effectively improves the accuracy of gesture recognition by the head wearable device.
Optionally, the first gesture recognition module is further configured to transmit the gesture recognition result to the processor;
the processor is further configured to execute a response action according to the gesture recognition result, or to send a control instruction to another smart device according to the gesture recognition result to trigger that device to execute a response action.
In one aspect, embodiments of the present application provide a hand wearable device, including:
a second wireless module, an inertial measurement unit and a second gesture recognition module;
the inertial measurement unit is used for acquiring inertial data of the hand wearable device;
the second gesture recognition module is configured to send a trigger instruction to the head wearable device through the second wireless module when determining, according to the inertial data, that the hand wearable device is in a preset motion state, wherein the trigger instruction is used to trigger the head wearable device to perform gesture recognition based on collected media information to obtain a gesture recognition result.
In one aspect, an embodiment of the present application provides a gesture recognition method, applied to a head wearable device, including:
receiving a trigger instruction sent by a hand wearable device, wherein the trigger instruction is sent when the hand wearable device determines, based on collected inertial data, that it is in a preset motion state;
and responding to the trigger instruction by performing gesture recognition based on the collected media information to obtain a gesture recognition result.
Optionally, after responding to the trigger instruction and performing gesture recognition based on the collected media information to obtain a gesture recognition result, the method further includes:
executing a response action according to the gesture recognition result; or sending a control instruction to another smart device according to the gesture recognition result to trigger that device to execute a response action.
In one aspect, an embodiment of the present application provides a gesture recognition method, applied to a hand wearable device, including:
acquiring inertial data of the hand wearable device;
when determining, according to the inertial data, that the hand wearable device is in a preset motion state, sending a trigger instruction to the head wearable device, wherein the trigger instruction is used to trigger the head wearable device to perform gesture recognition based on collected media information to obtain a gesture recognition result.
In one aspect, embodiments of the present application provide a computer device including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the gesture recognition method described above when executing the program.
In one aspect, embodiments of the present application provide a computer-readable storage medium storing a computer program executable by a computer device, which when run on the computer device, causes the computer device to perform the steps of the gesture recognition method described above.
In the embodiments of the present application, the hand wearable device sends a trigger instruction to the head wearable device when determining, based on the collected inertial data, that it is in a preset motion state. Only then does the head wearable device start the first gesture recognition module to perform gesture recognition and obtain a gesture recognition result. The first gesture recognition module therefore does not need to remain in a working state for a long time, which effectively reduces the power consumption of the head wearable device and improves its battery life.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present invention, and that other drawings can be derived from them by a person skilled in the art without inventive effort.
Fig. 1A is a schematic structural diagram of a system architecture according to an embodiment of the present application;
fig. 1B is a schematic structural diagram of a head wearable device according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of a hand wearable device according to an embodiment of the present application;
fig. 3 is a flowchart of a gesture recognition method according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantageous effects of the present invention more apparent, the present invention will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
Referring to fig. 1A, which is a schematic structural diagram of a system architecture provided in an embodiment of the present application, the system architecture includes a head wearable device 100 and a hand wearable device 200. In practice, the head wearable device 100 may be smart glasses, such as AR glasses or VR glasses. The hand wearable device 200 may be a smart watch, a smart bracelet, a smart ring, or the like. The head wearable device 100 and the hand wearable device 200 may be directly or indirectly connected through wired or wireless communication, which is not limited herein.
Referring to fig. 1B, which is a schematic structural diagram of a head wearable device provided in an embodiment of the present application, the head wearable device 100 includes a processor 101, a first wireless module 102, an image sensor 103, and a first gesture recognition module 104.
The first wireless module 102 is configured to receive a trigger instruction sent by the hand wearable device 200, and transmit the trigger instruction to the processor 101, where the trigger instruction is sent when the hand wearable device 200 determines that the hand wearable device is in a preset motion state based on the acquired inertial data; the processor 101 is configured to start the first gesture recognition module 104 after receiving the trigger instruction; the first gesture recognition module 104 is configured to perform gesture recognition according to the media information collected by the image sensor 103, and obtain a gesture recognition result.
The first wireless module 102 may be a Wi-Fi module, a Classic Bluetooth module, a Bluetooth Low Energy (BLE) module, an LE Audio module, a Zigbee module, a Near Field Communication (NFC) module, an Ultra-Wideband (UWB) module, or the like. The image sensor 103 may be a camera, a depth sensor, an infrared sensor, or the like, and the media information it collects may be images, video, etc. In addition, the head wearable device 100 further includes a memory for storing the media information collected by the image sensor 103. The processor 101 may use a Reduced Instruction Set Computer (RISC), a Digital Signal Processor (DSP), a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), or other processor hardware circuitry as its processing unit, and may include one or more such processing units.
When the hand wearable device 200 determines, based on the collected inertial data, that it is in the preset motion state, this indicates that the user may be performing an interactive gesture, so it sends a trigger instruction to the first wireless module 102 of the head wearable device 100, causing the head wearable device 100 to start the first gesture recognition module 104 for gesture recognition. The preset motion state may be a displacement exceeding a preset value within a period of time, a rotation of the pose exceeding a preset angle within a period of time, a velocity or acceleration exceeding a preset value, or the like.
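By way of illustration only, the following minimal sketch shows one way such a preset-motion-state check could be implemented from inertial samples; the `ImuSample` structure and the threshold values are assumptions for this sketch, not values prescribed by the embodiments.

```python
import math
from dataclasses import dataclass

@dataclass
class ImuSample:
    ax: float  # accelerometer x, m/s^2 (gravity removed)
    ay: float
    az: float
    gx: float  # gyroscope x, rad/s
    gy: float
    gz: float
    dt: float  # seconds since the previous sample

# Illustrative preset values; a real device would tune these empirically.
ACCEL_THRESHOLD = 3.0      # m/s^2
ROTATION_THRESHOLD = 0.8   # rad, accumulated over the window

def in_preset_motion_state(window: list[ImuSample]) -> bool:
    """True if the sample window suggests an interactive gesture: peak
    acceleration or accumulated rotation exceeds its preset value."""
    peak_accel = max(math.sqrt(s.ax**2 + s.ay**2 + s.az**2) for s in window)
    total_rotation = sum(
        math.sqrt(s.gx**2 + s.gy**2 + s.gz**2) * s.dt for s in window)
    return peak_accel > ACCEL_THRESHOLD or total_rotation > ROTATION_THRESHOLD
```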
After receiving the trigger instruction, the head wearable device 100 performs gesture recognition based on the media information collected by its own image sensor 103 to obtain the gesture recognition result. Therefore, even if the hand wearable device 200 makes an error when detecting whether it is in the preset motion state, the accuracy of gesture recognition by the head wearable device 100 is not affected.
In some embodiments, the first gesture recognition module 104 performs gesture recognition on the media information in a model-driven manner or a data-driven manner to obtain the gesture recognition result.
When gesture recognition is performed in a model-driven manner, the process is as follows: a series of gesture geometry models is generated in advance from hand pose parameters or joint positions, and a search space containing all possible gesture geometry models is established. During recognition, the gesture geometry model matching the image to be recognized is retrieved from the search space, and the hand pose parameters of the matching model are taken as the gesture recognition result.
When gesture recognition is performed in a data-driven manner, the process is as follows: first, training samples and corresponding labels are acquired, and a machine learning algorithm is used to learn the mapping from the training samples to the labels, yielding a gesture recognition model; suitable algorithms include, but are not limited to, random forests, support vector machines, and neural networks. During recognition, the image to be recognized is input into the gesture recognition model, which outputs the predicted label corresponding to the image; this predicted label is the gesture recognition result.
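As a rough sketch of the data-driven approach, assuming scikit-learn is available and that an upstream step has already extracted hand-keypoint features from each image; the feature layout, gesture labels, and training data below are placeholders, not part of the embodiments.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Assume an upstream hand-detection step turns each image into a flattened
# vector of 21 hand keypoints (x, y, z); this layout is an assumption.
NUM_FEATURES = 21 * 3
GESTURES = ["pinch", "swipe", "open_palm"]   # illustrative label set

# Placeholder training data standing in for real samples and labels.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(300, NUM_FEATURES))
y_train = rng.integers(0, len(GESTURES), size=300)

# Learn the mapping from training samples to labels, as described above.
model = RandomForestClassifier(n_estimators=100).fit(X_train, y_train)

def recognize(keypoints: np.ndarray) -> str:
    """Input: the feature vector of the image to be recognized.
    Output: the predicted label, i.e. the gesture recognition result."""
    return GESTURES[int(model.predict(keypoints.reshape(1, -1))[0])]
```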
In the embodiments of the present application, the hand wearable device sends a trigger instruction to the head wearable device when determining, based on the collected inertial data, that it is in a preset motion state. Only then does the head wearable device start the first gesture recognition module to perform gesture recognition and obtain a gesture recognition result, so the first gesture recognition module does not need to remain in a working state for a long time, which effectively reduces the power consumption of the head wearable device and improves its battery life.
Referring to fig. 2, which is a schematic structural diagram of a hand wearable device provided in an embodiment of the present application, the hand wearable device 200 includes a second wireless module 201, an inertial measurement unit 202, and a second gesture recognition module 203.
The inertial measurement unit 202 is configured to collect inertial data of the hand wearable device 200. The second gesture recognition module 203 is configured to send a trigger instruction to the head wearable device 100 through the second wireless module 201 when determining, according to the inertial data, that the hand wearable device 200 is in a preset motion state, where the trigger instruction is used to trigger the head wearable device 100 to perform gesture recognition based on collected media information to obtain a gesture recognition result.
Specifically, the Inertial Measurement Unit (IMU) 202 may comprise sensor elements such as a gyroscope and/or an accelerometer, which tend to have low power consumption. The inertial data of the hand wearable device 200 collected by the inertial measurement unit 202 includes acceleration and/or angular velocity, etc. The second gesture recognition module 203 recognizes the motion state of the hand wearable device 200 based on the inertial data collected by the inertial measurement unit 202. When the motion state is the preset motion state or a preset gesture, a trigger instruction is sent to the first wireless module 102 of the head wearable device 100 through the second wireless module 201. Like the first wireless module, the second wireless module 201 may be a Wi-Fi module, a Classic Bluetooth module, a Bluetooth Low Energy (BLE) module, an LE Audio module, a Zigbee module, a Near Field Communication (NFC) module, an Ultra-Wideband (UWB) module, or the like.
When the head wearable device 100 conducts gesture interaction with the user, the hand wearable device 200 often exhibits a corresponding gesture or motion (i.e., a change in motion state). That is, the preset motion state of the hand wearable device 200, namely the motion state produced when the user performs an interactive gesture, can be set in advance based on the interactive gestures between the head wearable device 100 and the user. In this way, the user does not need to perform any additional gesture action just so the hand wearable device 200 can take inertial measurements, which improves convenience for the user.
In addition, because the hand wearable device 200 often moves correspondingly (i.e., its motion state changes) when the head wearable device 100 conducts gesture interaction with the user, the user is likely performing an interactive gesture when the hand wearable device 200 determines, based on the collected inertial data, that it is in a preset motion state. The hand wearable device therefore sends a trigger instruction to the head wearable device, triggering it to start the first gesture recognition module to perform gesture recognition and obtain a gesture recognition result. As a result, the first gesture recognition module does not need to remain in a working state for a long time, which effectively reduces the power consumption of the head wearable device and improves its battery life.
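A minimal hand-side sketch of this trigger flow, assuming hypothetical `imu.read()` and `wireless.send()` driver interfaces; the window size, threshold, and cooldown are illustrative assumptions.

```python
import collections
import math
import time

SAMPLE_WINDOW = 50        # assumed ~0.5 s of IMU samples at 100 Hz
ACCEL_THRESHOLD = 3.0     # m/s^2, illustrative preset value
COOLDOWN_SECONDS = 1.0    # avoid flooding the head device with triggers

def hand_device_loop(imu, wireless):
    """Sketch of the second gesture recognition module's main loop.
    imu.read() returns (ax, ay, az) in m/s^2 with gravity removed and
    wireless.send() transmits a message; both are hypothetical interfaces."""
    window = collections.deque(maxlen=SAMPLE_WINDOW)
    last_trigger = 0.0
    while True:
        ax, ay, az = imu.read()
        window.append(math.sqrt(ax * ax + ay * ay + az * az))
        now = time.monotonic()
        in_preset_state = (len(window) == SAMPLE_WINDOW
                           and max(window) > ACCEL_THRESHOLD)
        if in_preset_state and now - last_trigger > COOLDOWN_SECONDS:
            wireless.send({"type": "trigger"})  # trigger instruction
            last_trigger = now
```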
In some embodiments, the processor 101 is further configured to activate the image sensor 103 before the trigger instruction is received.
Specifically, in the head wearable device 100, the image sensor 103 is already on before the trigger instruction is received; it acquires images in real time and retains the most recently acquired N frames. After the trigger instruction is received, the image sensor 103 continues to acquire images. Once the first gesture recognition module is started, it performs gesture recognition using the N frames acquired before the trigger instruction was received together with the M frames acquired after, where N and M are positive integers, to obtain the gesture recognition result.
In the embodiments of the present application, combining images of targets such as the hand or wrist acquired before the trigger instruction is received with images acquired after it is received improves the accuracy of gesture recognition by the first gesture recognition module.
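One way such buffering could look, as a sketch only; the class and method names are illustrative and not taken from the embodiments.

```python
from collections import deque

class TriggeredFrameBuffer:
    """Keeps the newest N frames at all times; after a trigger, collects M
    more frames and hands all N + M frames to the gesture recognizer."""

    def __init__(self, n: int, m: int):
        self.pre = deque(maxlen=n)   # rolling pre-trigger frames
        self.post = []               # frames captured after the trigger
        self.m = m
        self.triggered = False

    def on_frame(self, frame):
        if not self.triggered:
            self.pre.append(frame)   # oldest frame beyond N is discarded
        elif len(self.post) < self.m:
            self.post.append(frame)

    def on_trigger(self):
        self.triggered = True

    def ready(self) -> bool:
        return self.triggered and len(self.post) == self.m

    def frames(self):
        return list(self.pre) + self.post   # N + M frames for recognition
```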
In some embodiments, the processor 101 is further configured to activate the image sensor 103 after receiving the trigger instruction and before starting the first gesture recognition module 104.
Specifically, after receiving the trigger instruction, the head wearable device 100 starts the image sensor 103 to acquire images of a target such as the hand or wrist, then starts the first gesture recognition module, which performs gesture recognition based on the images collected by the image sensor 103 to obtain the gesture recognition result.
Because the head wearable device 100 keeps the image sensor 103 off until the trigger instruction is received, the power consumption of the image sensor is effectively reduced.
It should be noted that the image sensor activation times described above are only examples; the embodiments of the present application do not specifically limit when the image sensor is activated.
In some embodiments, the first wireless module 102 is further configured to receive inertial data sent by the hand wearable device. The first gesture recognition module 104 is further configured to perform gesture recognition according to the inertial data and the media information acquired by the image sensor, so as to obtain a gesture recognition result.
Specifically, when the inertial measurement unit 202 in the hand wearable device 200 collects inertial data and the hand wearable device 200 determines, based on that data, that it is in a preset motion state, a trigger instruction is sent to the head wearable device 100 through the second wireless module 201. The inertial data collected by the inertial measurement unit 202 may be transmitted together with the trigger instruction, or after it.
The first wireless module 102 in the head wearable device 100 receives the trigger instruction and the inertial data. The first gesture recognition module 104 performs gesture recognition by combining the media information collected by the image sensor 103 with the received inertial data to obtain the gesture recognition result, thereby improving the accuracy of gesture recognition by the head wearable device.
In some embodiments, the hand wearable device 200 determines a first motion state of the hand or wrist wearing it based on the inertial data collected by the inertial measurement unit 202. After the first wireless module 102 in the head wearable device 100 receives the trigger instruction, the first gesture recognition module 104 determines a second motion state of the target within the field of view of the image sensor 103 based on the media information collected by that sensor.
When the difference between the first motion state and the second motion state is smaller than a preset threshold, the target within the field of view of the image sensor 103 is the hand or wrist wearing the hand wearable device 200, so gesture recognition proceeds based on the media information collected by the image sensor 103 to obtain the gesture recognition result. When the difference is greater than or equal to the preset threshold, the target within the field of view is not that hand or wrist, and gesture recognition is not performed. This prevents interfering targets, such as another person's hand or wrist, from disrupting gesture recognition and interaction, improving the security of the user's gesture interaction.
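Sketched below with a single scalar motion statistic (estimated mean speed) and an assumed threshold; the embodiments do not prescribe how the two motion states are parameterized or compared, so this is an illustrative reading only.

```python
MATCH_THRESHOLD = 0.5   # assumed units: m/s difference in estimated hand speed

def states_match(imu_speed: float, visual_speed: float) -> bool:
    """Compare a motion statistic derived from the hand device's inertial
    data with one derived from the head device's image sequence."""
    return abs(imu_speed - visual_speed) < MATCH_THRESHOLD

def maybe_recognize(imu_speed, visual_speed, frames, recognizer):
    if states_match(imu_speed, visual_speed):
        return recognizer(frames)   # target is the wearer's own hand or wrist
    return None                     # likely an interfering target; skip recognition
```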
In some embodiments, the first gesture recognition module 104 is further configured to transmit the gesture recognition result to the processor. The processor 101 is further configured to execute a response action according to the gesture recognition result, or to send a control instruction to another smart device according to the gesture recognition result to trigger that device to execute a response action.
Specifically, the head wearable device 100 may execute a response action directly based on the gesture recognition result. For example, when smart glasses recognize a mode-switching gesture, the smart glasses switch from the current mode to the target mode.
The head wearable device 100 may also control another smart device to execute a response action based on the gesture recognition result; the other smart device may be the hand wearable device 200, or a device such as a smart television or smart speaker. For example, when the gesture recognition result is determined to be the gesture for turning on the smart speaker, the smart glasses send a power-on instruction to the smart speaker.
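A sketch of how such a dispatch from gesture recognition results to response actions might look; the gesture names and the `device` interface are hypothetical, not defined by the embodiments.

```python
def switch_mode(device, arg):
    device.mode = arg   # execute a response action locally

def power_on_speaker(device, arg):
    # send a control instruction to another smart device
    device.wireless.send({"to": "smart_speaker", "cmd": "power_on"})

# Illustrative mapping from gesture recognition results to response actions.
GESTURE_ACTIONS = {
    "mode_switch": switch_mode,
    "speaker_on": power_on_speaker,
}

def handle_result(device, result, arg=None):
    action = GESTURE_ACTIONS.get(result)
    if action is not None:
        action(device, arg)
```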
In the embodiments of the present application, the hand wearable device sends a trigger instruction to the head wearable device when it determines, based on the collected inertial data, that it is in a preset motion state. The head wearable device then starts the first gesture recognition module to perform gesture recognition, obtains a gesture recognition result, and either executes a response action based on the result or controls another smart device to do so. The first gesture recognition module thus does not need to remain in a working state for a long time, which effectively reduces the power consumption of the head wearable device and improves its battery life.
Referring to fig. 3, which is a flowchart of a gesture recognition method provided in an embodiment of the present application and performed interactively by a hand wearable device and a head wearable device, the method includes the following steps:
In step S301, the hand wearable device collects its own inertial data.
Specifically, the hand wearable device collects its inertial data through an inertial measurement unit, which may comprise sensor elements such as a gyroscope and/or an accelerometer; the inertial data includes information such as acceleration and/or angular velocity.
In step S302, when determining, according to the inertial data, that it is in a preset motion state, the hand wearable device sends a trigger instruction to the head wearable device.
Specifically, the second gesture recognition module determines the motion state of the hand wearable device from the inertial data. When the hand wearable device is in the preset motion state, a trigger instruction is sent to the head wearable device through the second wireless module; the trigger instruction indicates to the head wearable device that the user is performing, or may be performing, an interactive gesture.
In step S303, the head wearable device responds to the trigger instruction by performing gesture recognition based on the collected media information to obtain a gesture recognition result.
Specifically, the head wearable device receives the trigger instruction sent by the hand wearable device through the first wireless module, and then starts the first gesture recognition module. The first gesture recognition module performs gesture recognition according to the media information collected by the image sensor to obtain the gesture recognition result; the head wearable device may activate the image sensor either before or after receiving the trigger instruction.
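Tying the pieces together, a head-side sketch of step S303 that reuses the `TriggeredFrameBuffer` from the earlier sketch; `wireless.poll()`, `camera.capture()`, and `processor.handle_result()` are hypothetical interfaces, and the frame counts are illustrative.

```python
def head_device_on_trigger(wireless, camera, recognizer, processor, n=8, m=8):
    """Sketch of step S303 on the head device; all parameters are
    hypothetical driver/module interfaces, not named in the embodiments."""
    buffer = TriggeredFrameBuffer(n, m)
    while True:
        buffer.on_frame(camera.capture())   # image sensor already active here
        msg = wireless.poll()               # first wireless module, non-blocking
        if msg and msg.get("type") == "trigger":
            buffer.on_trigger()
            while not buffer.ready():
                buffer.on_frame(camera.capture())
            result = recognizer(buffer.frames())   # first gesture recognition module
            processor.handle_result(result)        # execute or forward the response
            return result
```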
In the embodiments of the present application, the hand wearable device sends a trigger instruction to the head wearable device when determining, based on the collected inertial data, that it is in a preset motion state. Only then does the head wearable device start the gesture recognition module to perform gesture recognition and obtain a gesture recognition result, so the gesture recognition module does not need to remain in a working state for a long time, which effectively reduces the power consumption of the head wearable device and improves its battery life.
In some embodiments, the hand wearable device also transmits the inertial data collected by its inertial measurement unit to the first wireless module of the head wearable device. The head wearable device then performs gesture recognition based on the inertial data together with the collected media information to obtain the gesture recognition result.
Because the inertial data collected by the inertial measurement unit of the hand wearable device correlates with the interactive gesture performed by the user, performing gesture recognition based on both the inertial data and the media information collected by the image sensor effectively improves the accuracy of gesture recognition by the head wearable device.
In some embodiments, after the head wearable device responds to the trigger instruction and performs gesture recognition based on the collected media information to obtain a gesture recognition result, it executes a response action according to the gesture recognition result, or sends a control instruction to another smart device according to the gesture recognition result to trigger that device to execute a response action.
Specifically, the head wearable device can directly execute the corresponding response action based on the gesture recognition result, or control another smart device to execute it, thereby meeting gesture interaction requirements in different scenarios and offering a wide range of application.
Based on the same technical concept, an embodiment of the present application provides a computer device, which may be the head wearable device and/or the hand wearable device shown in fig. 1A. As shown in fig. 4, the computer device includes at least one processor 401 and a memory 402 connected to the at least one processor. The embodiment of the present application does not limit the specific connection medium between the processor 401 and the memory 402; in fig. 4, for example, they are connected by a bus. Buses may be divided into address buses, data buses, control buses, and so on.
In the embodiment of the present application, the memory 402 stores instructions executable by the at least one processor 401, and the at least one processor 401 may perform the steps of the gesture recognition method by executing the instructions stored in the memory 402.
The processor 401 is the control center of the computer device and may use various interfaces and lines to connect the various parts of the computer device; it implements gesture recognition by executing the instructions stored in the memory 402 and invoking the data stored in the memory 402. Optionally, the processor 401 may include one or more processing units and may integrate an application processor, which mainly handles the operating system, user interface, and application programs, and a modem processor, which mainly handles wireless communication. It will be appreciated that the modem processor need not be integrated into the processor 401. In some embodiments, the processor 401 and the memory 402 may be implemented on the same chip; in other embodiments, they may be implemented on separate chips.
The processor 401 may be a general-purpose processor, such as a Central Processing Unit (CPU), a digital signal processor, an Application-Specific Integrated Circuit (ASIC), a field-programmable gate array or other programmable logic device, discrete gate or transistor logic, or discrete hardware components, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor or any conventional processor. The steps of the methods disclosed in the embodiments of the present application may be performed directly by a hardware processor, or by a combination of hardware and software modules within the processor.
Based on the same inventive concept, embodiments of the present application provide a computer-readable storage medium storing a computer program executable by a computer device, which when run on the computer device, causes the computer device to perform the steps of the gesture recognition method described above.
It will be appreciated by those skilled in the art that embodiments of the present invention may be provided as a method, or as a computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer device or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer device or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer device or other programmable apparatus to produce a computer device implemented process such that the instructions which execute on the computer device or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiments and all such alterations and modifications as fall within the scope of the invention.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.
Claims (10)
1. A head wearable device, comprising:
processor, first wireless module, image sensor and first gesture recognition module
The first wireless module is used for receiving a trigger instruction sent by the hand wearable device and transmitting the trigger instruction to the processor, wherein the trigger instruction is sent when the hand wearable device determines that the hand wearable device is in a preset motion state based on acquired inertial data;
the processor is used for starting the first gesture recognition module after receiving the trigger instruction;
the first gesture recognition module is used for performing gesture recognition according to the media information acquired by the image sensor to obtain a gesture recognition result.
2. The device of claim 1, wherein the processor is further configured to:
after receiving the trigger instruction and before starting the first gesture recognition module, start the image sensor.
3. The device of claim 1, wherein the processor is further configured to:
activate the image sensor before receiving the trigger instruction.
4. The device of claim 3, wherein the media information comprises:
n frames of images acquired by the image sensor before receiving the trigger instruction and M frames of images acquired by the image sensor after receiving the trigger instruction, wherein N and M are positive integers.
5. The device of claim 1, wherein the first wireless module is further configured to receive the inertial data sent by the hand wearable device;
the first gesture recognition module is further configured to perform gesture recognition according to the inertial data and the media information acquired by the image sensor, so as to obtain a gesture recognition result.
6. The device of claim 1, wherein the first gesture recognition module is further configured to transmit the gesture recognition result to the processor;
the processor is further configured to execute a response action according to the gesture recognition result, or to send a control instruction to another smart device according to the gesture recognition result to trigger that device to execute a response action.
7. A hand wearable device, comprising:
a second wireless module, an inertial measurement unit and a second gesture recognition module;
the inertial measurement unit is used for acquiring inertial data of the hand wearable device;
the second gesture recognition module is configured to send a trigger instruction to the head wearable device through the second wireless module when determining, according to the inertial data, that the hand wearable device is in a preset motion state, wherein the trigger instruction is used to trigger the head wearable device to perform gesture recognition based on collected media information to obtain a gesture recognition result.
8. A gesture recognition method, applied to a head wearable device, comprising:
receiving a trigger instruction sent by a hand wearable device, wherein the trigger instruction is sent when the hand wearable device determines, based on collected inertial data, that it is in a preset motion state;
and responding to the triggering instruction, and carrying out gesture recognition based on the acquired media information to obtain a gesture recognition result.
9. The method of claim 8, wherein after responding to the trigger instruction and performing gesture recognition based on the collected media information to obtain the gesture recognition result, the method further comprises:
executing a response action according to the gesture recognition result; or sending a control instruction to another smart device according to the gesture recognition result to trigger that device to execute a response action.
10. A gesture recognition method applied to a hand wearable device, comprising:
acquiring inertial data of the hand wearable device;
when determining, according to the inertial data, that the hand wearable device is in a preset motion state, sending a trigger instruction to the head wearable device, wherein the trigger instruction is used to trigger the head wearable device to perform gesture recognition based on collected media information to obtain a gesture recognition result.
Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN202310127005.XA | 2023-02-06 | 2023-02-06 | Head wearable device, hand wearable device and gesture recognition method
Publications (1)
Publication Number | Publication Date
---|---
CN116009702A | 2023-04-25
Family
ID=86026905
Family Applications (1)
Application Number | Title | Priority Date | Filing Date
---|---|---|---
CN202310127005.XA (pending) | Head wearable device, hand wearable device and gesture recognition method | 2023-02-06 | 2023-02-06
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116009702A (en) |
Legal Events
Code | Title
---|---
PB01 | Publication
SE01 | Entry into force of request for substantive examination