CN102402290A - Method and system for identifying posture of body - Google Patents

Method and system for identifying posture of body

Info

Publication number
CN102402290A
CN102402290A, CN2011104033689A, CN201110403368A
Authority
CN
China
Prior art keywords
limbs
freedom
degree
sensor
hardware device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2011104033689A
Other languages
Chinese (zh)
Inventor
詹姆斯·刘 (James Liu)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BEIJING INSENTEK TECHNOLOGY CO., LTD.
Original Assignee
BEIJING INSENTEK TECHNOLOGY Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BEIJING INSENTEK TECHNOLOGY Co Ltd filed Critical BEIJING INSENTEK TECHNOLOGY Co Ltd
Priority to CN2011104033689A priority Critical patent/CN102402290A/en
Publication of CN102402290A publication Critical patent/CN102402290A/en
Pending legal-status Critical Current

Abstract

The invention relates to a method and a system for identifying the posture of a body. The method comprises the following steps: A, separately acquiring motion parameters from a plurality of parts of the body and sensed parameters from a hardware device; B, obtaining position-change information for the body from the angle differences found by comparing the motion parameters of the body parts; C, determining the motion of the body relative to the hardware device from the body's position-change information and the angle difference with respect to the sensed parameters of the hardware device; and D, identifying the motion of the body relative to the hardware device and outputting it as monitoring information. In the method provided by the invention, a sensor assembly is used to identify the body posture, and an optical device may additionally be connected to assist in monitoring the position of the body relative to the hardware device. The method and system improve the speed of posture recognition, allow different parts of the body to be identified simultaneously, and provide a degree of flexibility.

Description

Method and system for identifying limb posture
Technical field
The present invention relates to a method for recognizing limb posture and position, and more specifically to a method and system for sensing limb posture and identifying it.
Background art
In recent years, with the popularization and development of multimedia technology, people have continually explored novel human-machine interaction techniques. Operating a computer through intuitive means such as limb and hand gestures has become a hot technology. The human limbs, however, form a complex actuating mechanism: they are highly flexible, richly expressive, and capable of fine operations, and these very characteristics make recognizing and tracking their posture a significant challenge in computer research.
Convenient, advanced, and reliable human-machine interaction systems realized by various high-tech means have therefore emerged as the times require, and many best-selling electronic products owe enormous economic benefit to outstanding human-machine interaction. The Nintendo Wii game console, for example, adopted an interaction mode in which an acceleration (tilt) sensor inside the game controller mediates between the player and the game, thereby outperforming competing technologies and highlighting Nintendo's technical lead. The success of Sony's PlayStation 3, Microsoft's Xbox, and Apple's iPhone and iPad is likewise due in large part to the advanced interaction means of those products, such as the capacitive sensor of the touch-screen interface and the acceleration (tilt) sensor that switches the display between portrait and landscape.
At present, gesture recognition is applied, as a means of communication between humans and computers, in fields such as intelligent robots, computers, game consoles, mobile phones, displays, automatic control systems, and production processes. US20100199228A1 from Microsoft (published August 5, 2010) provides a scheme that uses a depth camera to capture and analyze the user's body posture and interpret it as computer commands. US20080291160A1 from Nintendo (published November 27, 2008) provides a scheme that uses an infrared sensor and an acceleration sensor to capture the position of the user's hand. The prior art also includes schemes that use a data glove to assist in recognizing hand posture. These schemes achieve recognition of hand motion, but they have various shortcomings, and their prices are high. CN1276572A from Matsushita Electric Industrial Co., Ltd. (Panasonic) photographs the hand with a camera, performs a normalized analysis of the image, spatially projects the normalized image, and compares the resulting projection coordinates with the projection coordinates of images stored in advance. This method is intuitive, but it requires a complex mathematical computation process and cannot recognize or track the spatial position of the hand. A new-generation body-sensing sensor recently under development at Microsoft, reported in Britain's Daily Telegraph (October 20, 2011), works by turning a wall, a car, or even the palm of the hand into a human-machine interaction touch screen: the new-generation Kinect body-sensing sensor developed by Microsoft researchers can track motion from a person's arm to a wall, its principle being a projector worn on the person's shoulder that lets the hand be used as a virtual keyboard on a planar image on the wall. OmniTouch from Carnegie Mellon University in the United States uses a similar principle.
These apparently advanced means all work on the same principle: an optical device such as a video camera films the activity of the human hand to obtain a moving image of it; a host computer then performs image processing to recognize a series of specific movements of specific parts of the hand; and the host executes various operations based on the gesture virtual cursor recognized by the gesture recognition device. In application, however, they have many limitations and are inconvenient in many situations: for example, when the two hands overlap with respect to the light source, when the two hands are not in the same plane or are perpendicular to the light source, or, as is most common in practice, when no flat surface is available, the above "advanced" technologies cannot be used. Moreover, recognizing a posture from images often takes a relatively long time, at least ten-odd seconds, and recognizing a dynamic posture takes even longer, which is undoubtedly a torment for efficient, fast-paced modern users.
Summary of the invention
The technical problem to be solved by the present invention is to provide a method and system for identifying limb posture; the system provided by the invention can sense limb posture and identify it, thereby realizing human-machine interaction.
To solve the above technical problem, the present invention discloses a limb posture identification method comprising the following steps:
Step A: separately acquire motion parameters from a plurality of parts of the limbs and sensed parameters from a hardware device;
Step B: obtain position-change information for the limbs from the angle differences found by comparing the motion parameters of the limb parts;
Step C: determine the motion of the limbs relative to the hardware device from the limbs' position-change information and the angle difference with respect to the sensed parameters of the hardware device;
Step D: identify the motion of the limbs relative to the hardware device and output it as monitoring information.
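Steps A through D can be sketched as a short processing function. This is a purely illustrative sketch under assumed conditions, not the patented implementation: all names (`recognize_posture`, `angle_diff`) are hypothetical, and it compares only one shared degree of freedom expressed as an angle in degrees.

```python
def angle_diff(a, b):
    """Smallest signed difference between two angles, in degrees."""
    return (a - b + 180.0) % 360.0 - 180.0

def recognize_posture(limb_readings, device_angle):
    """Sketch of steps A-D: limb_readings maps a limb part to its angle
    (degrees) on one shared degree of freedom; device_angle is the
    hardware device's sensed angle on the same degree of freedom."""
    # Step B: compare each part against a reference part on the limb
    reference = limb_readings["back_of_hand"]
    position_change = {part: angle_diff(angle, reference)
                       for part, angle in limb_readings.items()}
    # Step C: relate each part's change to the hardware device
    motion_vs_device = {part: angle_diff(delta, device_angle)
                        for part, delta in position_change.items()}
    # Step D: the result is output as monitoring information
    return motion_vs_device

readings = {"back_of_hand": 10.0, "index_finger": 15.0, "middle_finger": 200.0}
monitor = recognize_posture(readings, 0.0)
```

Here the back of the hand serves as the reference for the other parts, mirroring the comparisons described in the embodiments below.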
Further, before Step A the method also includes using an optical device to assist in monitoring the position of the limbs relative to the hardware device.
Further, the position-change information of the limbs is the spatial position information of each limb part while in motion.
Further, the sensed parameter of the hardware device is the position information of the trigger point when the hardware device is externally triggered.
Further, the motion parameters of the plurality of limb parts are the spatial motion parameters of each limb part with 3, 6, or 9 degrees of freedom.
Further, the angle difference between the motion parameters of the limb parts is the angle difference between the same single degree of freedom of each part's motion parameters.
Further, the angle difference between the limbs' position-change information and the sensed parameters of the hardware device is likewise the angle difference on the same single degree of freedom.
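The "same single degree of freedom" comparison above can be illustrated by comparing two multi-degree-of-freedom parameter vectors axis by axis. This is a minimal sketch assuming the parameters are stored as dictionaries keyed by axis name; the axis names are assumptions for illustration.

```python
def per_axis_difference(params_a, params_b):
    """Compare two motion-parameter vectors axis by axis, i.e. on one
    single degree of freedom at a time, returning the angular
    difference (degrees) for each shared axis."""
    assert params_a.keys() == params_b.keys(), "vectors must share axes"
    return {axis: params_a[axis] - params_b[axis] for axis in params_a}

# Two hypothetical 3-degree-of-freedom rotational readings
hand = {"pitch": 30.0, "yaw": 5.0, "torsion": 0.0}
arm = {"pitch": 10.0, "yaw": 5.0, "torsion": -2.0}
diff = per_axis_difference(hand, arm)
```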
The invention also discloses a limb posture recognition system comprising: a microprocessor, a data transmission module, a plurality of first sensor modules located at different parts of the limbs, and a second sensor module located on a hardware device, wherein:
the plurality of first sensor modules are used to separately acquire the motion parameters of the different limb parts;
the second sensor module is used to acquire the sensed parameters of the hardware device;
the microprocessor is used to compute the angle differences between the motion parameters acquired by the first sensor modules at the different limb parts, obtain the position-change information of the limbs from the computed angle differences, and then compute the angle difference between the limbs' position-change information and the sensed parameters of the hardware device acquired by the second sensor module, thereby determining the motion of the limbs relative to the hardware device;
the data transmission module is used to output, as monitoring information, the motion of the limbs relative to the hardware device computed by the microprocessor.
Further, the system also includes an optical device for assisting in monitoring the position of the limbs relative to the hardware device.
Further, the plurality of first sensor modules are placed respectively at the fingers, the back of the hand or wrist, the arm, and the neck or chest.
Further, the plurality of first sensor modules and the second sensor module are connected to the microprocessor by wires.
Further, the plurality of first sensor modules and the second sensor module each comprise a multi-degree-of-freedom sensor for detecting spatial motion parameters with 3, 6, or 9 degrees of freedom.
Further, each of the plurality of first sensor modules comprises a first sensor for acquiring the limb motion parameters, a wireless data transmission unit for transmitting the motion parameters acquired by the first sensor to the microprocessor, and an energy unit for supplying working voltage to the first sensor and the wireless data transmission unit.
Further, the energy unit is a battery or rechargeable battery.
Further, the battery or rechargeable battery is made of a material capable of harvesting energy from the external environment, the material comprising one or more of piezoelectric materials, magnetostrictive materials, photosensitive materials, heat-sensitive materials, and thermoelectric conversion materials.
Further, the second sensor module comprises a second sensor for acquiring the position information of the trigger point when the hardware device is externally triggered.
Further, the first sensor and the second sensor are each multi-degree-of-freedom sensors for detecting spatial motion parameters with 3, 6, or 9 degrees of freedom.
Further, the sensor for detecting spatial motion parameters with 3 degrees of freedom is a 3-degree-of-freedom magnetic field sensor or a 3-degree-of-freedom acceleration sensor.
Further, the sensor for detecting spatial motion parameters with 6 degrees of freedom is the combination of a 3-degree-of-freedom magnetic field sensor and a 3-degree-of-freedom acceleration sensor.
Further, the sensor for detecting spatial motion parameters with 9 degrees of freedom is the combination of a 3-degree-of-freedom magnetic field sensor, a 3-degree-of-freedom acceleration sensor, and a 3-degree-of-freedom rotation sensor.
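The 3/6/9-degree-of-freedom sensor combinations above can be sketched as a simple composition. The field names below are illustrative assumptions, not a real sensor driver API; the sketch only demonstrates how the claimed combinations add up to the stated degree-of-freedom counts.

```python
from dataclasses import dataclass

@dataclass
class SensorModule:
    """Illustrative composition of the 3-, 6-, and 9-degree-of-freedom
    sensor combinations described in the claims above."""
    magnetometer_3dof: bool = False   # 3-degree-of-freedom magnetic field sensor
    accelerometer_3dof: bool = False  # 3-degree-of-freedom acceleration sensor
    gyroscope_3dof: bool = False      # 3-degree-of-freedom rotation sensor

    @property
    def degrees_of_freedom(self) -> int:
        # Each constituent sensor contributes three degrees of freedom
        return 3 * sum([self.magnetometer_3dof,
                        self.accelerometer_3dof,
                        self.gyroscope_3dof])

dof3 = SensorModule(magnetometer_3dof=True)
dof6 = SensorModule(magnetometer_3dof=True, accelerometer_3dof=True)
dof9 = SensorModule(magnetometer_3dof=True, accelerometer_3dof=True,
                    gyroscope_3dof=True)
```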
The beneficial effects of the above technical solution are as follows: the limb posture recognition system provided by the invention uses a sensor assembly to recognize limb posture, and an external optical device can also be connected to assist in monitoring the position of the limbs relative to the hardware device; by sensing the limb posture with sensors and identifying it, human-machine interaction is realized. The limb posture recognition technique realized by the method and system provided by the invention increases recognition speed, shortening recognition time to the millisecond level, and can recognize different limb parts simultaneously, giving it a degree of flexibility.
Description of drawings
Fig. 1 is a logical block diagram of the limb posture recognition system in an embodiment of the invention;
Fig. 2 is a diagram of the internal logical structure of a first sensor module in the second embodiment of the limb posture recognition system;
Fig. 3 is a flowchart of the limb posture identification method in an embodiment of the invention;
Fig. 4 is a schematic diagram of a specific application of the limb posture recognition system of the invention.
Embodiment
The principles and features of the invention are described below with reference to the accompanying drawings; the examples given are intended only to explain the invention, not to limit its scope.
Fig. 1 is a logical block diagram of the limb posture recognition system in an embodiment of the invention. As shown in Fig. 1, the system comprises: a microprocessor 101, a data transmission module 102, a plurality of first sensor modules at different parts of the limbs (first sensor module A 103A, first sensor module B 103B, ..., first sensor module N 103N), and a second sensor module 104 on a hardware device. In this embodiment, the first sensor modules can be placed respectively at the fingers, the back of the hand or wrist, the arm (e.g. over the biceps), and the neck or chest, each acquiring the motion parameters of its limb part. The second sensor module 104 acquires the sensed parameters of the hardware device; in this embodiment, the sensed parameter is the coordinate position of the trigger point when the hardware device is externally triggered. The microprocessor 101 computes the angle differences between the motion parameters acquired by the first sensor modules at the different limb parts, obtains the position-change information of the limbs from those angle differences, and then computes the angle difference between the limbs' position-change information and the sensed parameters acquired by the second sensor module 104, thereby determining the motion of the limbs relative to the hardware device and identifying it as monitoring information. The data transmission module 102 outputs, as monitoring information, the motion of the limbs relative to the hardware device computed by the microprocessor 101.
In embodiments of the invention, the hardware device includes but is not limited to a large display, an electronic sand table, a piano, and similar equipment. In this embodiment, each first sensor module and the second sensor module comprise a multi-degree-of-freedom sensor for detecting spatial motion parameters with 3, 6, or 9 degrees of freedom: the 3-degree-of-freedom sensor is a 3-degree-of-freedom magnetic field sensor or a 3-degree-of-freedom acceleration sensor; the 6-degree-of-freedom sensor is the combination of a 3-degree-of-freedom magnetic field sensor and a 3-degree-of-freedom acceleration sensor; and the 9-degree-of-freedom sensor is the combination of a 3-degree-of-freedom magnetic field sensor, a 3-degree-of-freedom acceleration sensor, and a 3-degree-of-freedom rotation sensor.
In the first embodiment of the limb posture recognition system of the invention, the first sensor modules (first sensor module A 103A, first sensor module B 103B, ..., first sensor module N 103N) and the second sensor module are connected to the microprocessor by wires. In this embodiment, the limb-part motion parameters acquired by the first sensor modules and the hardware-device sensed parameters acquired by the second sensor module are each transmitted over wires to the microprocessor for processing.
In the second embodiment of the limb posture recognition system of the invention, each first sensor module transmits its data to the microprocessor wirelessly. The internal logical structure of the first sensor module in this embodiment is shown in Fig. 2: each first sensor module comprises a first sensor 201 for acquiring the motion parameters of a limb part, a wireless data transmission unit 202 for transmitting the motion parameters acquired by the first sensor 201 to the microprocessor, and an energy unit 203 for supplying working voltage to the first sensor 201 and the wireless data transmission unit 202. In this embodiment, the energy unit 203 can be a battery or rechargeable battery, which may be made of a material capable of harvesting energy from the external environment, the material being one or more of piezoelectric materials, magnetostrictive materials, photosensitive materials, heat-sensitive materials, and thermoelectric conversion materials, or may consist of an RLC oscillator and an antenna. Through such materials or structures, potential energy, mechanical energy, light energy, heat energy, thermal-gradient energy, radio-frequency energy, and the like can be harvested from the surrounding space: for example, energy can be obtained from the motion of the limbs, from the temperature difference between the limbs and the environment, or from ambient light; the wireless data transmission unit 202 can obtain energy from messages exchanged with the microprocessor; or an antenna can absorb radio-wave energy, exciting the RLC circuit into oscillation and producing a current. Energy can also be drawn from an internal battery, and the internal battery can itself be charged in the ways described above.
In a preferred embodiment of the limb posture recognition system of the invention, the system can also include an optical device for assisting in monitoring the position of the limbs relative to the hardware device. In this embodiment, the optical device helps determine the approximate position of the limbs relative to a hardware device such as the display screen of a game console or an electronic sand table; the first sensor modules at the limb parts and the second sensor module on the hardware device then detect the corresponding motion or position parameters; the microprocessor computes and determines the motion of the limbs relative to the hardware device; and finally the data transmission module outputs this information as monitoring information, thereby realizing human-machine interaction. The optical device includes but is not limited to a camera, a video camera, a scanner, and similar equipment.
Fig. 3 is a flowchart of the limb posture identification method in an embodiment of the invention. As shown in Fig. 3, the method comprises the following steps:
Step 301: separately acquire motion parameters from a plurality of limb parts and sensed parameters from the hardware device.
In this embodiment, the plurality of limb parts include but are not limited to the fingers, the back of the hand or wrist, the arm (e.g. over the biceps), and the neck or chest, or parts such as the feet, calves, thighs, and chest or back. The motion parameters of the plurality of limb parts are the spatial motion parameters of each of the above parts with 3, 6, or 9 degrees of freedom. The sensed parameter of the hardware device is the coordinate position of the trigger point when the hardware device is externally triggered.
Step 302: obtain the position-change information of the limbs from the angle differences found by comparing the motion parameters of the limb parts.
In this embodiment, the motion parameter of each limb part is that part's spatial position information while in motion, and the angle difference between the motion parameters of the limb parts is the angle difference between the same single degree of freedom of each part's motion parameters. For example, comparing the angle difference on the same single degree of freedom between the motion parameters at a finger and at the back of the hand gives the finger's position change relative to the back of the hand; further comparing the back of the hand with the arm gives the back of the hand's position change relative to the arm; and comparing the arm with the neck or chest gives the arm's position change relative to the neck or chest.
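The chained comparison just described (finger against back of hand, back of hand against arm, arm against neck or chest) can be sketched as follows. This is an illustrative sketch only, assuming all angles are on the same single degree of freedom; the function name and sample values are assumptions.

```python
def chain_relative_changes(chain):
    """chain: ordered (part, angle) pairs from the extremity inward,
    e.g. finger -> back of hand -> arm -> neck, all on the same single
    degree of freedom. Returns each part's angle relative to the next
    part in the chain, as in step 302's pairwise comparisons."""
    relative = {}
    for (part, angle), (_, ref) in zip(chain, chain[1:]):
        relative[part] = angle - ref  # position change vs. the next part inward
    return relative

chain = [("finger", 95.0), ("back_of_hand", 20.0),
         ("arm", 15.0), ("neck", 0.0)]
rel = chain_relative_changes(chain)
```

The innermost part (here the neck) has no further reference, so it produces no relative entry.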
Step 303: determine the motion of the limbs relative to the hardware device from the angle difference between the limbs' position-change information and the sensed parameters of the hardware device.
In this embodiment, the angle difference between the limbs' position-change information and the sensed parameters of the hardware device is likewise the angle difference on the same single degree of freedom. By computing the angle difference between the limbs' position-change information and the coordinate position of the trigger point when the hardware device is externally triggered, the motion of the limbs relative to the hardware device is determined, such as a hand making a grabbing motion in the air above an electronic sand table.
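Step 303 can be sketched as a single comparison followed by a label. This is a hedged illustration, not the patented logic: the 90-degree "grab" threshold and both labels are invented for the example.

```python
def motion_relative_to_device(limb_change, trigger_angle, threshold=90.0):
    """Step 303 sketch: angle difference between the limb's position
    change and the device trigger point's angle, on the same single
    degree of freedom. The 90-degree threshold is an illustrative
    assumption, not a value from the patent."""
    diff = limb_change - trigger_angle
    label = "grab" if abs(diff) >= threshold else "hover"
    return diff, label

diff, label = motion_relative_to_device(120.0, 10.0)
```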
Step 304: identify the motion of the limbs relative to the hardware device and output it as monitoring information.
In a preferred embodiment of the invention, before step 301 the method also includes the step of using an optical device to assist in monitoring the position of the limbs relative to the hardware device.
Obviously, those skilled in the art will appreciate that the modules and steps of the invention described above can be realized on a general-purpose computing device; they can be concentrated on a single computing device or distributed over a network formed by multiple computing devices, and optionally they can be realized as program code executable by a computing device, so that they can be stored in a storage device and executed by the computing device, or made into individual integrated-circuit modules, or multiple of the modules or steps can be made into a single integrated-circuit module. The invention is therefore not restricted to any specific combination of hardware and software.
The principles of the invention are described in detail below through specific application examples, in combination with the system architecture and method steps above. In the first specific application example of the invention, limb posture recognition is performed by first sensor modules arranged at different limb parts that detect the motion parameters of the corresponding part in the three spatial degrees of freedom X, Y, Z (i.e. the spatial coordinates). Of course, to obtain more accurate limb motion parameters, the motion parameters of the three rotational degrees of freedom about the X, Y, and Z axes (i.e. the pitch, yaw, and torsion angles of each limb part) and the motion parameters of the three directional degrees of freedom A, B, C determined by the geomagnetic direction (e.g. A pointing true north, B pointing straight up, C pointing true east) can also be detected. The first sensor modules transmit the acquired motion parameters to the microprocessor through wires or a wireless data transmission unit; the microprocessor compares the motion parameters acquired by the sensor modules, computing the angle difference on the same single degree of freedom of each motion parameter, and obtains the position-change information of the limbs. In this embodiment, the principle by which the first sensor modules of the system acquire motion parameters is explained through a concrete example. As shown in Fig. 4, when first sensor modules are placed on the back of a human hand and on each finger, the three-dimensional motion of any finger can be monitored in real time by the first sensor module at that finger (first sensor module 103B, 103C, 103D, 103E, or 103F) and transmitted to the microprocessor 101 through wires 404 or wirelessly, and the microprocessor 101 computes it against the data monitored by the first sensor module 103A placed on the back of the hand. It should be noted that the position of the microprocessor 101 shown in Fig. 4 only serves to explain the principle of the invention; in an actual application the microprocessor 101 can be placed anywhere, for example integrated into the hardware device carrying the second sensor module, or realized as an independent processor module.
In this embodiment, the microprocessor 101 computes the angle difference, on the same single degree of freedom (or the same axis), between the motion parameters acquired by each first sensor module on a finger (first sensor modules 103B, 103C, 103D, 103E, 103F) and the motion parameters acquired by the first sensor module 103A on the back of the hand, thereby capturing the change of position and the posture of each finger. For example, when the finger gesture for the Chinese numeral "1" is made, only the six parameters X, Y, Z and A, B, C and the position acquired by the first sensor module 103C on the index finger are basically identical to the corresponding six parameters acquired by the first sensor module 103A on the back of the hand or wrist; the X, Y, Z and A, B, C of the middle finger and the two fingers behind it (three fingers in all) are basically identical to one another, but one of their parameters among X, Y, Z and A, B, C differs from the corresponding parameter on the back of the hand by an angle of 150 to 270 degrees. Likewise, when the finger gesture Americans use as an insult is made, i.e. only the middle finger is extended, the six parameters X, Y, Z and A, B, C and the position acquired by the first sensor module 103D on the middle finger are basically identical to the corresponding six parameters acquired by the first sensor module 103A on the back of the hand or wrist, while the X, Y, Z and A, B, C of the thumb, the index finger, and the two fingers behind the middle finger are basically identical to one another, but one of their parameters among X, Y, Z and A, B, C differs from the corresponding parameter on the back of the hand by an angle of 150 to 270 degrees. Because the data are compared between the fingers and the back of the hand, the present invention needs no fixed or virtual plane whatsoever: the plurality of first sensor modules described in the embodiment can record in real time the motion and variation of the ten fingers of two hands anywhere in space. If the first sensor modules use sensors with 9 degrees of freedom, parameters such as the speed, acceleration, and angular acceleration of changes in hand and finger posture can also be recorded.
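The gesture-"1" example above can be sketched as a per-finger classifier. The 150-270 degree curled range comes from the example; the 30-degree "basically identical" tolerance, the function name, and the sample angles are illustrative assumptions, and only one parameter axis is considered.

```python
EXTENDED_TOLERANCE = 30.0      # assumed tolerance for "basically identical"
CURLED_RANGE = (150.0, 270.0)  # curled-finger range given in the example

def classify_fingers(finger_angles, back_of_hand_angle):
    """Sketch of the gesture-'1' example: a finger whose parameter is
    basically identical to the back of the hand is extended; a finger
    whose difference falls in the 150-270 degree range is curled."""
    states = {}
    for finger, angle in finger_angles.items():
        diff = abs(angle - back_of_hand_angle) % 360.0
        if diff <= EXTENDED_TOLERANCE:
            states[finger] = "extended"
        elif CURLED_RANGE[0] <= diff <= CURLED_RANGE[1]:
            states[finger] = "curled"
        else:
            states[finger] = "indeterminate"
    return states

fingers = {"thumb": 195.0, "index": 12.0, "middle": 200.0,
           "ring": 210.0, "little": 205.0}
states = classify_fingers(fingers, 10.0)
```

With these sample readings only the index finger matches the back of the hand, corresponding to the "1" gesture.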
Based on the above principle, in one embodiment of the invention a second sensor module is placed on the hardware device of an electronic sand table, and the second sensor module is likewise connected to the microprocessor by wired or wireless means; any gesture change of a human hand with respect to the electronic sand table can then be monitored in real time by the limb posture recognition system of the embodiment. For example, when the system needs to recognize a hand making a grabbing motion in the air above the electronic sand table, it applies the above principle: comparing the motion parameters monitored by the first sensor modules on the fingers with the motion parameters monitored by the first sensor module on the back of the hand yields the angle differences and hence the motion changes of the fingers, the back-of-hand motion parameters serving as the reference data for the finger parameters; then, using the coordinate position of the trigger point sensed by the second sensor module on the electronic sand table when the sand table is externally triggered, the angle difference between the back-of-hand motion parameters and the trigger-point coordinates yields the motion of the hand over the electronic sand table. Actions such as the hand manipulating the sand table, or grabbing in the air above it, can thus all be computed and identified by the microprocessor, thereby realizing control of the electronic sand table.
In the above embodiment, an optical device such as a camera or scanner may also be added to assist in monitoring the position of the human hand relative to the electronic sand table. When the system runs, the optical device first determines the approximate position of the hand relative to the sand table; the motion information of the hand on the sand table is then obtained through the principle of the present invention, and control of the electronic sand table is realized.
In a second concrete application example of the present invention, the second sensor module of the system is arranged on the display screen of an electronic game machine, while first sensor modules are placed on the fingers, the back of the hand or wrist, the arm, and the neck or chest of the user. All manner of three-dimensional (and higher-dimensional) posture changes of the hand, arm, and body can then be recorded in real time and transmitted to the processing system of the game machine. Following the principle of the foregoing embodiments, the position changes and posture information of the fingers are obtained from the angle differences, within the same single degree of freedom, between the motion parameters of the finger sensor modules and the back-of-hand sensor module. The motion of the whole back of the hand or wrist is likewise expressed by the angle differences, within the same single degree of freedom, between the back-of-hand or wrist sensor and the sensor on the arm, yielding the direction of motion, speed, posture, rotation direction, and similar parameters of the hand or wrist relative to the arm. In the same way, the motion of the entire arm is obtained by comparing the arm sensor against the first sensor module on the neck or chest, yielding the direction of motion, speed, posture, and rotation direction of the arm relative to the torso. The microprocessor then compares, within each single degree of freedom, these parameters of the fingers, back of the hand or wrist, and arm against the coordinate position of the external trigger point sensed by the second sensor module on the game machine's display screen, and analyzes them together, thereby obtaining the interactive motion information between the user and the game shown on the screen; the game machine's processing system uses this information to control the continuation of play.
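The hand-relative-to-arm and arm-relative-to-torso comparisons above form a chain: each sensor's reading is expressed relative to the sensor one segment closer to the torso. For a single degree of freedom, the absolute posture of each segment is then simply a running sum of the relative angles. A simplified, illustrative sketch (the function name is assumed; real motion involves all degrees of freedom, not one):

```python
def chain_pose(relative_angles):
    # Absolute angle of each segment, given each segment's angle measured
    # relative to the previous sensor in the chain
    # (torso -> arm -> wrist -> finger).
    absolute, total = [], 0.0
    for angle in relative_angles:
        total += angle
        absolute.append(total)
    return absolute

# torso->arm 20 deg, arm->wrist 15 deg, wrist->finger 10 deg
print(chain_pose([20.0, 15.0, 10.0]))  # -> [20.0, 35.0, 45.0]
```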
In the above embodiment, an optical device such as a camera or scanner may likewise be added as an auxiliary device to monitor the position of the human body relative to the game machine's display screen. When the system runs, the optical device first determines the approximate position of the body relative to the screen; the interactive motion information between the body and the game is then obtained through the principle of the present invention, making the electronic game more vivid and closer to reality.
In a third concrete application example of the present invention, the principle of limb posture recognition is essentially the same as in the first and second examples; the difference lies in the field of application, the purpose, and the effect achieved. In this example the limb posture recognition system is applied to education, for instance in learning to play the piano: while the learner practices, it monitors information such as the actions, positions, and movement speeds of the fingers, together with keystroke force and angle, so that learners can grasp their own progress more intuitively and correct their own problems, thereby improving the speed and quality of learning.
In the above embodiment, the first sensor modules of the system are placed on the learner's fingers and the back of the hand, and the second sensor module is arranged on the piano. The microprocessor compares the motion parameters monitored by the first sensor modules on the fingers with those monitored by the module on the back of the hand; the angle differences yield the motion change information of the fingers, the back-of-hand readings again serving as the reference data for the finger readings. The back-of-hand motion parameters are then compared against the coordinate position, monitored by the second sensor module on the piano, of the hand at the moment a key is pressed. From this the system calculates information such as the actions, positions, movement speed, keystroke force, and angle of the learner's fingers, for the learner's own reference and improvement.
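To make the piano example concrete: once fingertip positions over time are available from the sensor comparisons, keystroke speed and an acceleration-based proxy for keystroke force follow from finite differences. This is an illustrative sketch only; the sampling interval, units, and function name are assumptions, not part of the described system:

```python
def keystroke_metrics(heights, dt):
    # Finite-difference estimates from successive fingertip heights
    # sampled every dt time units: returns the fastest downward velocity
    # and the peak magnitude of acceleration during the key press.
    velocities = [(b - a) / dt for a, b in zip(heights, heights[1:])]
    accelerations = [(b - a) / dt for a, b in zip(velocities, velocities[1:])]
    return min(velocities), max(abs(a) for a in accelerations)

# Four height samples of a finger descending onto a key.
print(keystroke_metrics([4.0, 3.0, 1.0, 0.0], 1.0))  # -> (-2.0, 1.0)
```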
The various embodiments above show that the limb posture recognition system of the present invention can replace all kinds of human-machine input devices, such as computer keyboards, mice, and touch screens, as well as game controllers (joysticks), remote controls, and voice recognition systems. The limb posture information output by the system provided in the embodiments of the invention can be used to operate machines such as computers, smart phones, televisions, iPads, game devices, machine control equipment, motion adjudication equipment, vehicles, aircraft display devices, office equipment, printing devices, display devices, and three-dimensional operational sand tables.
The above are merely preferred embodiments of the present invention and are not intended to limit it; any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall fall within its scope of protection.

Claims (23)

1. A limb posture recognition method, characterized in that the method comprises the following steps:
Step A: separately acquiring the motion parameters of a plurality of parts of a limb and the sensed parameters of a hardware device;
Step B: obtaining position change information of the limb according to the angle differences obtained by comparing the motion parameters of the respective limb parts;
Step C: determining the motion information of the limb relative to the hardware device according to the angle difference between the position change information of the limb and the sensed parameters of the hardware device;
Step D: recognizing the motion information of the limb relative to the hardware device and outputting it as monitoring information.
2. The limb posture recognition method according to claim 1, characterized in that the method further comprises, before step A, using an optical device to assist in monitoring the position of the limb relative to the hardware device.
3. The limb posture recognition method according to claim 1, characterized in that the position change information of the limb is the spatial position information of each limb part in a state of motion.
4. The limb posture recognition method according to claim 1, characterized in that the sensed parameter of the hardware device is the position information of a trigger point at the moment the hardware device is externally triggered.
5. The limb posture recognition method according to claim 1, characterized in that the motion parameters of the plurality of limb parts are the spatial motion parameters of each limb part with 3 degrees of freedom, 6 degrees of freedom, or 9 degrees of freedom.
6. The limb posture recognition method according to claim 5, characterized in that the angle difference between the motion parameters of the respective limb parts is the angle difference within the same single degree of freedom of those motion parameters.
7. The limb posture recognition method according to claim 5, characterized in that the angle difference between the position change information of the limb and the sensed parameters of the hardware device is the angle difference within the same single degree of freedom.
8. A limb posture recognition system, characterized in that the system comprises: a microprocessor, a data transmission module, a plurality of first sensor modules located at different parts of a limb, and a second sensor module located on a hardware device, wherein:
the plurality of first sensor modules are used to respectively acquire the motion parameters of the different parts of the limb;
the second sensor module is used to acquire the sensed parameters of the hardware device;
the microprocessor is used to calculate the angle differences between the motion parameters acquired by the first sensor modules at the different limb parts, to obtain the position change information of the limb from the calculated angle differences, and to calculate the angle difference between the position change information of the limb and the sensed parameters of the hardware device acquired by the second sensor module, thereby determining the motion information of the limb relative to the hardware device;
the data transmission module is used to output, as monitoring information, the motion information of the limb relative to the hardware device calculated by the microprocessor.
9. The limb posture recognition system according to claim 8, characterized in that the system further comprises an optical device for assisting in monitoring the position of the limb relative to the hardware device.
10. The limb posture recognition system according to claim 8 or 9, characterized in that the plurality of first sensor modules are respectively placed at the fingers, the back of the hand or the wrist, the arm, and the neck or chest of the limb.
11. The limb posture recognition system according to claim 10, characterized in that the plurality of first sensor modules and the second sensor module are connected to the microprocessor by wires.
12. The limb posture recognition system according to claim 11, characterized in that the plurality of first sensor modules and the second sensor module each comprise a multi-degree-of-freedom sensor for detecting spatial motion parameters with 3 degrees of freedom, 6 degrees of freedom, or 9 degrees of freedom.
13. The limb posture recognition system according to claim 12, characterized in that the sensor for detecting spatial motion parameters with 3 degrees of freedom is a 3-degree-of-freedom magnetic field sensor or a 3-degree-of-freedom acceleration sensor.
14. The limb posture recognition system according to claim 12, characterized in that the sensor for detecting spatial motion parameters with 6 degrees of freedom is the combination of a 3-degree-of-freedom magnetic field sensor and a 3-degree-of-freedom acceleration sensor.
15. The limb posture recognition system according to claim 12, characterized in that the sensor for detecting spatial motion parameters with 9 degrees of freedom is the combination of a 3-degree-of-freedom magnetic field sensor, a 3-degree-of-freedom acceleration sensor, and a 3-degree-of-freedom rotation sensor.
16. The limb posture recognition system according to claim 8 or 9, characterized in that each of the plurality of first sensor modules comprises a first sensor for acquiring limb motion parameters, a wireless data transmission unit for transmitting the motion parameters acquired by the first sensor to the microprocessor, and an energy unit for supplying working voltage to the first sensor and the wireless data transmission unit.
17. The limb posture recognition system according to claim 8 or 9, characterized in that the second sensor module comprises a second sensor for acquiring the position information of a trigger point at the moment the hardware device is externally triggered.
18. The limb posture recognition system according to claim 16, characterized in that the energy unit is a battery or a rechargeable battery.
19. The limb posture recognition system according to claim 18, characterized in that the battery or rechargeable battery is made of a material capable of harvesting energy from the external environment, the material comprising one or more of piezoelectric materials, magnetostrictive materials, photosensitive materials, heat-sensitive materials, and thermoelectric conversion materials.
20. The limb posture recognition system according to claim 16 or 17, characterized in that the first sensor and the second sensor are each a multi-degree-of-freedom sensor for detecting spatial motion parameters with 3 degrees of freedom, 6 degrees of freedom, or 9 degrees of freedom.
21. The limb posture recognition system according to claim 20, characterized in that the sensor for detecting spatial motion parameters with 3 degrees of freedom is a 3-degree-of-freedom magnetic field sensor or a 3-degree-of-freedom acceleration sensor.
22. The limb posture recognition system according to claim 20, characterized in that the sensor for detecting spatial motion parameters with 6 degrees of freedom is the combination of a 3-degree-of-freedom magnetic field sensor and a 3-degree-of-freedom acceleration sensor.
23. The limb posture recognition system according to claim 20, characterized in that the sensor for detecting spatial motion parameters with 9 degrees of freedom is the combination of a 3-degree-of-freedom magnetic field sensor, a 3-degree-of-freedom acceleration sensor, and a 3-degree-of-freedom rotation sensor.
CN2011104033689A 2011-12-07 2011-12-07 Method and system for identifying posture of body Pending CN102402290A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2011104033689A CN102402290A (en) 2011-12-07 2011-12-07 Method and system for identifying posture of body

Publications (1)

Publication Number Publication Date
CN102402290A true CN102402290A (en) 2012-04-04

Family

ID=45884574

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2011104033689A Pending CN102402290A (en) 2011-12-07 2011-12-07 Method and system for identifying posture of body

Country Status (1)

Country Link
CN (1) CN102402290A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004042548A1 (en) * 2002-11-07 2004-05-21 Olympus Corporation Movement detection device
CN1689513A (en) * 2004-04-27 2005-11-02 清华大学 Device for measuring joint movement posture of human body
CN1696874A (en) * 2005-06-28 2005-11-16 中国海洋大学 Attitude measurement device and attitude measurement method based on skeleton model
CN101730874A (en) * 2006-06-28 2010-06-09 诺基亚公司 Touchless gesture based input
CN202512510U (en) * 2011-12-07 2012-10-31 北京盈胜泰科技术有限公司 Limb gesture identification system

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103377554A (en) * 2012-04-27 2013-10-30 卢颖 Induction type pedestrian crossing controlling facility optimization design based on kinect
CN102830799B (en) * 2012-08-02 2015-06-10 晨星软件研发(深圳)有限公司 Method for carrying out induction on induction device based on intelligent terminal and intelligent terminal
CN102830799A (en) * 2012-08-02 2012-12-19 晨星软件研发(深圳)有限公司 Method for carrying out induction on induction device based on intelligent terminal and intelligent terminal
US10042510B2 (en) 2013-01-15 2018-08-07 Leap Motion, Inc. Dynamic user interactions for display control and measuring degree of completeness of user gestures
CN105308536A (en) * 2013-01-15 2016-02-03 厉动公司 Dynamic user interactions for display control and customized gesture interpretation
CN113568506A (en) * 2013-01-15 2021-10-29 超级触觉资讯处理有限公司 Dynamic user interaction for display control and customized gesture interpretation
US10817130B2 (en) 2013-01-15 2020-10-27 Ultrahaptics IP Two Limited Dynamic user interactions for display control and measuring degree of completeness of user gestures
US10782847B2 (en) 2013-01-15 2020-09-22 Ultrahaptics IP Two Limited Dynamic user interactions for display control and scaling responsiveness of display objects
US10241639B2 (en) 2013-01-15 2019-03-26 Leap Motion, Inc. Dynamic user interactions for display control and manipulation of display objects
US11269481B2 (en) 2013-01-15 2022-03-08 Ultrahaptics IP Two Limited Dynamic user interactions for display control and measuring degree of completeness of user gestures
CN104077559A (en) * 2013-03-29 2014-10-01 现代自动车株式会社 Vehicle having gesture detection system and method
US11347317B2 (en) 2013-04-05 2022-05-31 Ultrahaptics IP Two Limited Customized gesture interpretation
CN106485984A (en) * 2015-08-27 2017-03-08 中国移动通信集团公司 A kind of intelligent tutoring method and apparatus of piano
CN105677036B (en) * 2016-01-29 2018-04-10 清华大学 A kind of interactive data gloves
CN105677036A (en) * 2016-01-29 2016-06-15 清华大学 Interactive type data glove
CN106125936B (en) * 2016-06-30 2019-03-08 联想(北京)有限公司 A kind of motion sensing control method and electronic device
CN106125936A (en) * 2016-06-30 2016-11-16 联想(北京)有限公司 A kind of motion sensing control method and electronic installation
CN107115653A (en) * 2016-11-03 2017-09-01 京东方科技集团股份有限公司 Adjust device, stroke information processing system, the stroke information processing method of stroke
CN108874119A (en) * 2017-05-16 2018-11-23 芬奇科技有限公司 The mobile input to generate computer system of tracking arm
CN108874119B (en) * 2017-05-16 2021-08-06 芬奇科技有限公司 System and method for tracking arm movement to generate input for a computer system
CN111465978A (en) * 2017-10-12 2020-07-28 W·戴维 Electronic body percussion instrument
CN108108709A (en) * 2017-12-29 2018-06-01 纳恩博(北京)科技有限公司 A kind of recognition methods and device, computer storage media
US11875012B2 (en) 2018-05-25 2024-01-16 Ultrahaptics IP Two Limited Throwable interface for augmented reality and virtual reality environments

Similar Documents

Publication Publication Date Title
CN102402290A (en) Method and system for identifying posture of body
CN102402291A (en) Body posture identifying method and device
CN107221223B (en) Virtual reality cockpit system with force/tactile feedback
CN102520791A (en) Wireless gesture recognition device
US9360944B2 (en) System and method for enhanced gesture-based interaction
Perng et al. Acceleration sensing glove (ASG)
CN110096131B (en) Touch interaction method and device and touch wearable equipment
CN202512510U (en) Limb gesture identification system
CN102707799B (en) A kind of gesture identification method and gesture identifying device
CN102622083A (en) Hand gesture recognition method and hand gesture recognition device
CN103529944B (en) A kind of human motion recognition method based on Kinect
Fang et al. A robotic hand-arm teleoperation system using human arm/hand with a novel data glove
Sun et al. Augmented reality based educational design for children
CN107896508A (en) Multiple target/end points can be used as(Equipment)" method and apparatus of the super UI " architectures of equipment, and correlation technique/system of the gesture input with dynamic context consciousness virtualized towards " modularization " general purpose controller platform and input equipment focusing on people of the integration points of sum
Fang et al. Robotic teleoperation systems using a wearable multimodal fusion device
CN106326881B (en) Gesture recognition method and gesture recognition device for realizing man-machine interaction
US20200026354A1 (en) Adaptive haptic effect rendering based on dynamic system identification
Sanfilippo et al. A low-cost multi-modal auditory-visual-tactile framework for remote touch
KR100934391B1 (en) Hand-based Grabbing Interaction System Using 6-DOF Haptic Devices
Shao et al. A natural interaction method of multi-sensory channels for virtual assembly system of power transformer control cabinet
JP2018151950A (en) Information processing apparatus, information processing system and program
Streli et al. Hoov: Hand out-of-view tracking for proprioceptive interaction using inertial sensing
Mishra et al. Design of hand glove for wireless gesture control of robot
Cannan et al. A Multi-sensor armband based on muscle and motion measurements
CN202694258U (en) Limb posture recognition device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
ASS Succession or assignment of patent right

Owner name: BEIJING AISIBO TECHNOLOGY CO., LTD.

Free format text: FORMER OWNER: BEIJING INSENTEK TECHNOLOGY CO., LTD.

Effective date: 20141210

C41 Transfer of patent application or patent right or utility model
COR Change of bibliographic data

Free format text: CORRECT: ADDRESS; FROM: 100080 HAIDIAN, BEIJING TO: 101102 TONGZHOU, BEIJING

TA01 Transfer of patent application right

Effective date of registration: 20141210

Address after: 101102, No. 2, government road, Tongzhou District light industrial park, Zhongguancun science and Technology Park, Tongzhou, Beijing, China (A-3)

Applicant after: BEIJING INSENTEK TECHNOLOGY CO., LTD.

Address before: 100080, building 1220, building A, 19 Zhongguancun Avenue, Beijing, Haidian District

Applicant before: Beijing Insentek Technology Co., Ltd.

C12 Rejection of a patent application after its publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20120404