CN116466820A - Method for identifying human hand position and motion and flying-mark gesture control microsystem thereof - Google Patents

Method for identifying human hand position and motion and flying-mark gesture control microsystem thereof

Info

Publication number
CN116466820A
CN116466820A (application CN202310381105.5A)
Authority
CN
China
Prior art keywords
human hand
point
distance
hand
movement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310381105.5A
Other languages
Chinese (zh)
Inventor
怯肇乾
王立娟
徐鹏
孙建言
陶晓霞
翟悦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dalian University of Technology
Original Assignee
Dalian University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dalian University of Technology
Priority to CN202310381105.5A
Publication of CN116466820A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/08Systems determining position data of a target for measuring distance only
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/08Systems determining position data of a target for measuring distance only
    • G01S17/10Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G06F3/0383Signal control means within the pointing device
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

The invention discloses a method for identifying the position and motion of a human hand, and a flying-mark gesture control microsystem using it, comprising the following steps: three reference detection points with a known positional relation on the same plane define a coordinate system, and the distance measurements from the three reference detection points to the measured human hand yield the spatial position coordinates of the hand in that coordinate system; based on the hand position coordinates, the hand movement is then recognized according to criteria. The coordinate positions of the three reference detection points satisfy the following relation: the three points comprise point A, point B and point C; the distance AB between point A and point B equals the distance AC between point A and point C, denoted d; AB lies in the horizontal direction, AC is perpendicular to AB, and the three points form the observation plane XAY. The invention adopts TOF gesture recognition and, to reduce action delay, simplifies the algorithm so that it is easier to realize in hardware and software: low-cost laser femtosecond TOF ranging data is used to calculate the position (x, y, z) of the pointing finger and its continuous motion.

Description

Method for identifying human hand position and motion and flying-mark gesture control microsystem thereof
Technical Field
The invention belongs to the field of automation products, and relates to a method for identifying the position and motion of a human hand and a flying-mark gesture control microsystem.
Background
There are three main classes of related gesture recognition products: camera-based recognition, infrared recognition, and TOF (time-of-flight) recognition. Camera-based recognition is costly and is mainly used at high-end conferences and forums. Infrared recognition is cost-effective, but its recognition distance is typically only 15 cm. TOF recognition, proposed by the applicants in the previous year, performs laser femtosecond TOF ranging and relies mainly on a back-propagation neural network (BP-NN) algorithm with intelligent fuzzy judgment; its recognition distance is 0-60 cm, and it is economical and practical, but it introduces delay when the hand moves quickly.
Disclosure of Invention
In order to solve the above problems, the technical scheme provided by the invention is as follows: a method of identifying the position and motion of a human hand, comprising the following steps:
three reference detection points with a known positional relation on the same plane define a coordinate system, and the distance measurements from the three reference detection points to the measured human hand yield the spatial position coordinates of the hand in the coordinate system;
and further, based on the hand position coordinates, the motion of the hand is recognized according to criteria.
Further: the coordinate positions of the three reference detection points satisfy the following relation:
the three points comprise a point A, a point B and a point C; the distance AB between point A and point B is equal to the distance AC between point A and point C, and is denoted d;
AB lies in the horizontal direction, AC is perpendicular to AB, and the three points form the observation plane XAY.
Further: the movement of the human hand includes clicking, double clicking, up-and-down movement and left-and-right movement.
Further: the hand spatial position coordinates are determined as follows:
the spatial position point M of the human hand lies in a measurement plane X'A'Y' parallel to the observation plane XAY;
the points A, B and C have projection position coordinates A'(0, 0, z), B'(d, 0, z) and C'(0, d, z) on the measurement plane X'A'Y', where z is the perpendicular distance between the measurement plane X'A'Y' and the observation plane XAY;
the distances from the hand position point M to the points A, B and C are l_A, l_B and l_C, respectively;
the projections of l_A, l_B and l_C onto the measurement plane X'A'Y' are the observation radii r_A, r_B and r_C;
translating the measurement plane coordinate system X'A'Y' to the point B' gives the hand position coordinate M as M_B(x_B, y, z) under X'B'Y';
translating the measurement plane coordinate system X'A'Y' to the point C' gives the hand position coordinate M as M_C(x, y_C, z) under X'C'Y';
the translation coordinate relations are:
x_B = x - d   (1)
y_C = y - d   (2)
the observation distances l_A, l_B and l_C relate to the hand position in the measurement plane coordinate systems X'A'Y', X'B'Y' and X'C'Y' as follows:
l_A² = x² + y² + z²   (3)
l_B² = x_B² + y² + z²   (4)
l_C² = x² + y_C² + z²   (5)
From equations (1) to (5), with d, l_A, l_B and l_C known, it follows that:
x = (l_A² - l_B² + d²) / (2d)
y = (l_A² - l_C² + d²) / (2d)
z = √(l_A² - x² - y²)
further: based on the hand space position coordinates, the process of recognizing the hand movement is as follows according to criteria:
based on the position coordinates (x, y, z) of the human hand, the current z value z is calculated within 100ms i And a maximum z value z mx In comparison, z mx Half the arm length, when the current z value z i Change to z max And then from z max Current z value z before changing back i And the hand is kept for 300ms, judging that the gesture motion is clicking;
when the hand movement generates two continuous clicking actions, judging that the gesture movement is double clicking;
when the position coordinates (x, y, z) of the human hand are within 100ms, the coordinate change range of the x value and the z value is within 100mm, the y coordinate size is changed to be more than half of the arm length, then the human hand is kept for 300ms, and then the human hand is judged to move upwards or downwards;
when the position coordinates (x, y, z) of the human hand are within 100ms, the coordinate change range of the y value and the z value is within 100mm, the x coordinate size is changed to be more than half of the arm length, then the human hand is kept for 300ms, and the human hand is judged to move leftwards or rightwards.
A flying-mark gesture control microsystem comprises an acquisition-and-transmission terminal, a receiving-and-conversion terminal, and gesture follow-up window control software.
The acquisition-and-transmission terminal comprises three TOF sensors for measuring the distance to the human hand, a first MCU controller for calculating and recognizing the real-time spatial position and movement-trend data of the hand, and a first wireless module for transmitting those data;
each TOF sensor is connected to one end of the first MCU controller, and the other end of the first MCU controller is connected to the first wireless module.
The receiving-and-conversion terminal comprises a second wireless module for receiving the real-time spatial position and movement-trend data transmitted by the first wireless module;
a second MCU controller for receiving those data from the second wireless module and converting them into conventional mouse and keyboard operation instruction data;
and a USB drive interface for receiving the conventional mouse and keyboard operation instruction data from the second MCU controller and forwarding them to the computer.
The computer operating system receives the conventional mouse and keyboard operation instruction data from the USB drive interface of the receiving-and-conversion terminal, interprets the hand position and movement-trend data, completes conventional mouse and keyboard interactive control operations, and realizes gesture follow-up window switching by means of window control display application software.
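On the receiving-and-conversion side, the second MCU controller must express each recognized gesture as conventional mouse instruction data. A standard 3-byte USB HID boot-protocol mouse report (buttons, relative x, relative y) is one natural encoding; the sketch below assumes that format and illustrative event names, since the patent does not fix the packet layout:

```python
import struct

def mouse_report(left=False, dx=0, dy=0):
    """Pack a 3-byte HID boot-protocol mouse report: buttons, dx, dy."""
    clamp = lambda v: max(-127, min(127, v))
    return struct.pack('<Bbb', 1 if left else 0, clamp(dx), clamp(dy))

def gesture_to_reports(event, step=10):
    """Translate a recognized gesture event into HID mouse reports.
    The event names and the cursor step size are assumptions."""
    if event == 'click':
        return [mouse_report(left=True), mouse_report()]  # press, release
    if event == 'double_click':
        return gesture_to_reports('click') * 2
    if event == 'left':
        return [mouse_report(dx=-step)]
    if event == 'right':
        return [mouse_report(dx=step)]
    if event == 'up':
        return [mouse_report(dy=-step)]
    if event == 'down':
        return [mouse_report(dy=step)]
    return []
```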
The method for identifying hand position and motion and its flying-mark gesture control microsystem adopt TOF gesture recognition and simplify the algorithm to reduce action delay, making the scheme easier to realize in hardware and software; in particular, instantaneous positions (x, y, z) are output directly, and the various gestures (click, double click, up-down and left-right movement) are judged directly from consecutive positions by fuzzy logic. The scheme has the following advantages: the position (x, y, z) of the pointing finger and its actions (click, double click, up-down and left-right movement) are calculated from low-cost laser femtosecond TOF ranging data and transmitted wirelessly to a computer as extended mouse data packets, realizing the human-computer interaction of a conventional desktop mouse and extending it with centralized gesture-controlled functions (page turning, document switching, compiling, deployment, etc.) for common office and special-purpose software. The flying mark differs from a conventional desktop mouse in that a single gesture can express a continuous operation, or the function of additional expansion buttons, a trackball and the like, forming the corresponding centralized computer operation. Its main application occasions are classes, meetings, forums, exhibitions and the like, where it is well received.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the embodiments or the description of the prior art will be briefly described, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained according to the drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram I of the process of the present invention;
FIG. 2 is a schematic diagram II of the process of the present invention;
FIG. 3 is a schematic diagram III of the process of the present invention;
FIG. 4 is a block diagram of the flying-mark gesture control microsystem;
FIG. 5 is a schematic diagram of the flying-mark gesture control microsystem.
Detailed Description
It should be noted that, without conflict, the embodiments of the present invention and features in the embodiments may be combined with each other, and the present invention will be described in detail below with reference to the drawings and the embodiments.
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are only some embodiments of the present invention, but not all embodiments of the present invention. The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the invention, its application, or uses. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of exemplary embodiments according to the present invention. As used herein, the singular is also intended to include the plural unless the context clearly indicates otherwise, and furthermore, it is to be understood that the terms "comprises" and/or "comprising" when used in this specification are taken to specify the presence of stated features, steps, operations, devices, components, and/or combinations thereof.
The relative arrangement of the parts and steps, numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present invention unless it is specifically stated otherwise. Meanwhile, it should be clear that the dimensions of the respective parts shown in the drawings are not drawn in actual scale for convenience of description. Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail, but are intended to be part of the specification where appropriate. In all examples shown and discussed herein, any specific values should be construed as merely illustrative, and not a limitation. Thus, other examples of the exemplary embodiments may have different values. It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further discussion thereof is necessary in subsequent figures.
In the description of the present invention, it should be understood that orientation terms such as "front, rear, upper, lower, left, right", "lateral, vertical, horizontal" and "top, bottom" generally indicate orientations or positional relationships based on those shown in the drawings, merely to facilitate and simplify the description of the present invention; these terms do not indicate or imply that the apparatus or elements referred to must have a specific orientation or be constructed and operated in a specific orientation, and thus should not be construed as limiting the scope of protection of the present invention. The orientation words "inner" and "outer" refer to inner and outer relative to the contour of the respective component itself.
Spatially relative terms, such as "above", "over", "on the upper surface of" and "atop", may be used herein for ease of description to describe the spatial position of one device or feature relative to another device or feature as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as "above" or "over" other devices or structures would then be oriented "below" or "beneath" the other devices or structures. Thus, the exemplary term "above" may include both the "above" and "below" orientations. The device may also be positioned in other ways (rotated 90 degrees or in other orientations), and the spatially relative descriptors used herein are interpreted accordingly.
In addition, the terms "first", "second", etc. are used to define the components, and are only for convenience of distinguishing the corresponding components, and the terms have no special meaning unless otherwise stated, and therefore should not be construed as limiting the scope of the present invention.
FIG. 1 is a schematic diagram I of the process of the present invention;
FIG. 2 is a schematic diagram II of the process of the present invention;
FIG. 3 is a schematic diagram III of the process of the present invention;
a method of identifying the position and motion of a human hand, comprising the steps of:
S1: three reference detection points with a known positional relation on the same plane define a coordinate system, and the distance measurements from the three reference detection points to the measured human hand yield the spatial position coordinates of the hand in the coordinate system;
S2: based on the hand position coordinates, the hand movement is recognized according to criteria; the hand movement includes click, double click, up-down movement and left-right movement.
Steps S1 and S2 are executed in sequence.
further: the coordinate positions of the three reference detection points have the following relation:
the three points comprise a point A, a point B and a point C; the distance AB between the point A and the point B is equal to the distance AC between the point A and the point C, and is marked as d;
AB is in the horizontal direction, AC is vertical AB, and three points constitute the observation plane XAY.
Further: the hand spatial position coordinates are determined as follows:
the spatial position point M of the human hand lies in a measurement plane X'A'Y' parallel to the observation plane XAY;
the points A, B and C have projection position coordinates A'(0, 0, z), B'(d, 0, z) and C'(0, d, z) on the measurement plane X'A'Y', where z is the perpendicular distance between the measurement plane X'A'Y' and the observation plane XAY;
the distances from the hand position point M to the points A, B and C are l_A, l_B and l_C, respectively;
the projections of l_A, l_B and l_C onto the measurement plane X'A'Y' are the observation radii r_A, r_B and r_C;
translating the measurement plane coordinate system X'A'Y' to the point B' gives the hand position coordinate M as M_B(x_B, y, z) under X'B'Y';
translating the measurement plane coordinate system X'A'Y' to the point C' gives the hand position coordinate M as M_C(x, y_C, z) under X'C'Y';
the translation coordinate relations are:
x_B = x - d   (1)
y_C = y - d   (2)
the observation distances l_A, l_B and l_C relate to the hand position in the measurement plane coordinate systems X'A'Y', X'B'Y' and X'C'Y' as follows:
l_A² = x² + y² + z²   (3)
l_B² = x_B² + y² + z²   (4)
l_C² = x² + y_C² + z²   (5)
From equations (1) to (5), with d, l_A, l_B and l_C known, it follows that:
x = (l_A² - l_B² + d²) / (2d)
y = (l_A² - l_C² + d²) / (2d)
z = √(l_A² - x² - y²)
further, the process of recognizing the hand movement based on the hand space position coordinates according to the criterion is as follows:
based on the position coordinates (x, y, z) of the human hand, the current z value z is calculated within 100ms i And a maximum z value z mx In comparison, z mx Half the arm length, when the current z value z i Change to z max And then from z max Current z value z before changing back i And the hand is kept for 300ms, judging that the gesture motion is clicking;
when the hand movement generates two continuous clicking actions, judging that the gesture movement is double clicking;
when the position coordinates (x, y, z) of the human hand are within 100ms, the coordinate change range of the x value and the z value is within 100mm, the y coordinate size is changed to be more than half of the arm length, then the human hand is kept for 300ms, and then the human hand is judged to move upwards or downwards;
when the position coordinates (x, y, z) of the human hand are within 100ms, the coordinate change range of the y value and the z value is within 100mm, the x coordinate size is changed to be more than half of the arm length, then the human hand is kept for 300ms, and the human hand is judged to move leftwards or rightwards.
FIG. 4 is a block diagram of the flying-mark gesture control microsystem;
FIG. 5 is a schematic diagram of the flying-mark gesture control microsystem.
A flying-mark gesture control microsystem, characterized in that it comprises an acquisition-and-transmission terminal, a receiving-and-conversion terminal, and gesture follow-up window control software.
The acquisition-and-transmission terminal comprises three TOF sensors for measuring the distance to the human hand, a first MCU controller for calculating and recognizing the real-time spatial position and movement-trend data of the hand, and a first wireless module for transmitting those data;
each TOF sensor is connected to one end of the first MCU controller, and the other end of the first MCU controller is connected to the first wireless module.
The first MCU controller runs the above method for identifying hand position and motion to calculate and recognize the real-time spatial position and movement-trend data of the hand.
The receiving-and-conversion terminal comprises a second wireless module for receiving the real-time spatial position and movement-trend data transmitted by the first wireless module;
a second MCU controller for receiving those data from the second wireless module and converting them into conventional mouse and keyboard operation instruction data;
and a USB drive interface for receiving the conventional mouse and keyboard operation instruction data from the second MCU controller and forwarding them to the computer.
The computer operating system receives the conventional mouse and keyboard operation instruction data from the USB drive interface of the receiving-and-conversion terminal, interprets the hand position and movement-trend data, completes conventional mouse and keyboard interactive control operations, and realizes gesture follow-up window switching by means of window control display application software.
The USB drive module of the computer operating system receives the data controlling conventional mouse movement or keyboard operation, so that the operating system performs the corresponding human-computer interaction and realizes gesture follow-up switching of each window, with a faint translucent dynamic indicator shown on the accompanying window.
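The patent leaves the window control display software unspecified. As a rough illustration of gesture follow-up control on the host, recognized gestures could be mapped to keyboard shortcuts; the mapping below, and the use of the pyautogui library, are assumptions made for the sketch, not part of the invention:

```python
import pyautogui  # host-side automation library, assumed for this sketch

# Illustrative gesture-to-shortcut mapping for centralized control.
GESTURE_HOTKEYS = {
    'left_right':   ('alt', 'tab'),   # switch to another window
    'up_down':      ('pgdn',),        # next page / next slide
    'double_click': ('f5',),          # e.g. start PPT playback
}

def on_gesture(name):
    """Fire the shortcut associated with a recognized gesture, if any."""
    keys = GESTURE_HOTKEYS.get(name)
    if keys:
        pyautogui.hotkey(*keys)
```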
The ranging TOF sensor operates at a frequency of 0.1-20 Hz, with a field of view of 25°/27°, a blind zone of 0-40 mm, a measuring range of 400-16000 mm and a resolution of 10 mm;
the TOF sensors may use models such as the VL53L0X or VL53L1X;
the first MCU controller and the second MCU controller adopt Cortex-M33 devices;
the wireless communication modes of the acquisition-and-transmission terminal include LoRa, BT, WiFi and the like;
the computer USB driver in the figure is typically included with the operating system. The short-range wireless communication may take the form of LoRa (Long Range Radio) communication, which offers strong interference resistance, long transmission distance, strong wall penetration and low power consumption (maximum transmit current of 35 mA), or the form of BT, WiFi and the like.
The computer operating system includes window control display software. Alternatively, without the window control display software, the system provides ordinary mouse interactive control; with the method, gestures can be recognized and extended centralized control realized; with gesture follow-up window switching, the system can complete basic PPT slide selection and playback, remote page-turning for common office software including Office (Word, Excel, PPT, etc.) and Acrobat (PDF documents), and remote R&D demonstrations covering integrated development environments such as Eclipse and Notepad, with a faint translucent dynamic indicator shown on the window.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present invention, and not for limiting the same; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the invention.

Claims (6)

1. A method for identifying the position and motion of a human hand, characterized by comprising the following steps:
three reference detection points with a known positional relation on the same plane define a coordinate system, and the distance measurements from the three reference detection points to the measured human hand yield the spatial position coordinates of the hand in the coordinate system;
and further, based on the hand position coordinates, recognizing the motion of the hand according to criteria.
2. A method of identifying the position and motion of a human hand as claimed in claim 1, wherein the coordinate positions of the three reference detection points satisfy the following relation:
the three points comprise a point A, a point B and a point C; the distance AB between point A and point B is equal to the distance AC between point A and point C, and is denoted d;
AB lies in the horizontal direction, AC is perpendicular to AB, and the three points form the observation plane XAY.
3. A method of identifying the position and motion of a human hand as claimed in claim 1 wherein: the movement of the human hand includes clicking, double clicking, up-and-down movement and left-and-right movement.
4. A method of identifying the position and motion of a human hand as claimed in claim 2, wherein the hand spatial position coordinates are determined as follows:
the spatial position point M of the human hand lies in a measurement plane X'A'Y' parallel to the observation plane XAY;
the points A, B and C have projection position coordinates A'(0, 0, z), B'(d, 0, z) and C'(0, d, z) on the measurement plane X'A'Y', where z is the perpendicular distance between the measurement plane X'A'Y' and the observation plane XAY;
the distances from the hand position point M to the points A, B and C are l_A, l_B and l_C, respectively;
the projections of l_A, l_B and l_C onto the measurement plane X'A'Y' are the observation radii r_A, r_B and r_C;
translating the measurement plane coordinate system X'A'Y' to the point B' gives the hand position coordinate M as M_B(x_B, y, z) under X'B'Y';
translating the measurement plane coordinate system X'A'Y' to the point C' gives the hand position coordinate M as M_C(x, y_C, z) under X'C'Y';
the translation coordinate relations are:
x_B = x - d   (1)
y_C = y - d   (2)
the observation distances l_A, l_B and l_C relate to the hand position in the measurement plane coordinate systems X'A'Y', X'B'Y' and X'C'Y' as follows:
l_A² = x² + y² + z²   (3)
l_B² = x_B² + y² + z²   (4)
l_C² = x² + y_C² + z²   (5)
from equations (1) to (5), with d, l_A, l_B and l_C known, it follows that:
x = (l_A² - l_B² + d²) / (2d)
y = (l_A² - l_C² + d²) / (2d)
z = √(l_A² - x² - y²).
5. A method of identifying the position and motion of a human hand as claimed in claim 1, wherein the hand movement is recognized from continuous hand spatial position coordinates as follows:
based on the hand position coordinates (x, y, z), the current z value z_i is compared within 100 ms with the maximum z value z_max, where z_max is half the arm length; when the current z value changes to z_max and then returns from z_max to the previous value z_i, and the hand is then held for 300 ms, the gesture is judged to be a click;
when the hand produces two such click actions in succession, the gesture is judged to be a double click;
when, within 100 ms, the x and z values of the hand position coordinates (x, y, z) vary within 100 mm while the y coordinate changes by more than half the arm length, and the hand is then held for 300 ms, the hand is judged to have moved up or down;
when, within 100 ms, the y and z values of the hand position coordinates (x, y, z) vary within 100 mm while the x coordinate changes by more than half the arm length, and the hand is then held for 300 ms, the hand is judged to have moved left or right.
6. A flying-mark gesture control microsystem, characterized in that it comprises an acquisition-and-transmission terminal, a receiving-and-conversion terminal, and gesture follow-up window control software;
the acquisition-and-transmission terminal comprises three TOF sensors for measuring the distance to the human hand, a first MCU controller for calculating and recognizing the real-time spatial position and movement-trend data of the hand, and a first wireless module for transmitting those data;
each TOF sensor is connected to one end of the first MCU controller, and the other end of the first MCU controller is connected to the first wireless module;
the receiving-and-conversion terminal comprises a second wireless module for receiving the real-time spatial position and movement-trend data transmitted by the first wireless module;
a second MCU controller for receiving those data from the second wireless module and converting them into conventional mouse and keyboard operation instruction data;
a USB drive interface for receiving the conventional mouse and keyboard operation instruction data from the second MCU controller and forwarding them;
and a computer operating system for receiving the conventional mouse and keyboard operation instruction data from the USB drive interface of the receiving-and-conversion terminal, interpreting the hand position and movement-trend data, completing conventional mouse and keyboard interactive control operations, and realizing gesture follow-up window switching by means of window control display application software.
CN202310381105.5A 2023-04-11 2023-04-11 Method for identifying human hand position and motion and flying-mark gesture control microsystem thereof Pending CN116466820A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310381105.5A 2023-04-11 2023-04-11 Method for identifying human hand position and motion and flying-mark gesture control microsystem thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310381105.5A 2023-04-11 2023-04-11 Method for identifying human hand position and motion and flying-mark gesture control microsystem thereof

Publications (1)

Publication Number Publication Date
CN116466820A (en) 2023-07-21

Family

ID=87174617

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310381105.5A 2023-04-11 2023-04-11 Method for identifying human hand position and motion and flying-mark gesture control microsystem thereof

Country Status (1)

Country Link
CN (1) CN116466820A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination