CN106125928A - Kinect-based PPT presentation aid system - Google Patents

Kinect-based PPT presentation aid system

Info

Publication number
CN106125928A
CN106125928A (application CN201610472100.3A)
Authority
CN
China
Prior art keywords
gesture
right hand
ppt
kinect
picture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610472100.3A
Other languages
Chinese (zh)
Inventor
张冬冬
虞世泽
忻成杰
赵儒
赵一儒
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tongji University
Original Assignee
Tongji University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tongji University
Priority to CN201610472100.3A
Publication of CN106125928A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

To overcome the shortcomings of existing PPT presentation aids, the present invention provides a Kinect-based PPT presentation aid system and method. The user can lecture without a traditional mouse or laser pointer, which enlarges the user's range of movement while providing a better mode of human-computer interaction. In the Kinect-based PPT presentation aid system, the main modules are a data acquisition module, a gesture recognition module, a position computation module, and a PPT function control module. The data acquisition module includes a Kinect sensor mounted on a pan-tilt platform whose rotation angle is controlled by a motor; the Kinect collects depth information, used to compute the position of the user (the speaker), and skeleton information, used for gesture recognition. The gesture recognition module identifies the user's (speaker's) gestures from the collected data and then carries out the operations the user requires. The position computation module uses the collected data to determine the user's (speaker's) head position and drives the platform's rotation accordingly, achieving real-time tracking.

Description

Kinect-based PPT presentation aid system
Technical field
The present invention relates to the field of application development for the Kinect somatosensory camera, and specifically to a Kinect-based PPT presentation aid system and method.
Background technology
With the rapid development of science and technology, more and more teachers use PPT to assist their teaching, and more and more enterprises present their products or summarize their recent situation in the form of PPT presentations. PPT is very widely used both in teaching and in business. However, a teacher using PPT today usually needs a mouse to control it, which confines the speaker's range of movement to the side of the computer. Although laser pointers and wireless mice have become popular, PPT's powerful built-in annotation feature has meanwhile faded from view: a laser pointer can direct the audience's attention to the key points of a lecture, but it cannot leave annotation marks on the screen.
Summary of the invention
To overcome the shortcomings of existing PPT presentation aids, the present invention provides a Kinect-based PPT presentation aid system and method. The user can lecture without a traditional mouse or laser pointer, which enlarges the user's range of movement while providing a better mode of human-computer interaction.
The system for which protection is sought is summarized as follows:
A Kinect-based PPT presentation aid system, characterized in that its main modules are a data acquisition module, a gesture recognition module, a position computation module, and a PPT function control module. The data acquisition module includes a Kinect sensor mounted on a pan-tilt platform whose rotation angle is controlled by a motor; the Kinect mainly collects depth information, used to compute the position of the user (the speaker), and skeleton information, used for gesture recognition. The gesture recognition module identifies the user's (speaker's) gestures from the collected data and then carries out the operations the user requires. The position computation module uses the collected data to determine the user's (speaker's) head position and drives the platform's rotation accordingly, achieving real-time tracking.
The PPT function control module provides a set of services to the user. Based on controls in the C# programming language and the public Microsoft Office PowerPoint API, it lets the speaker's gestures control the PPT presentation, including brush annotation, (rapid) page turning, picture display, and video playback. The system breaks with the convention that a presentation can only be controlled by clicking: it enlarges the user's activity space by rotating the platform based on head recognition, combines the advantages of the conventional mouse and the laser pointer to make the annotation pen easier to use, and lets the speaker instantly view pictures and films from a library while the PPT is playing, so that picture details can be shown and movie playback controlled more conveniently.
The data acquisition module collects information directly through the depth and skeleton acquisition functions of the Kinect SDK.
The gesture recognition module uses the depth and skeleton information collected by the Kinect, together with a predefined gesture library, to identify the gestures the speaker makes during the PPT presentation. It comprises submodules for recognizing the basic gestures of page turning, summoning, pausing, and waving, as well as gesture recognition submodules for picture dragging and scaling, video playback, the annotation pen, and rapid paging.
The page-turning gesture recognition module uses the two gestures SwipeLeftDetected and SwipeRightDetected defined in the Microsoft.Samples.Kinect.SwipeGestureRecognizer library. The SwipeLeftDetected gesture is defined as the left hand raised naturally to chest height and swung horizontally rightward past the body's centre line, simulating the left mouse button ("←"). The SwipeRightDetected gesture is defined as the right hand raised naturally to chest height and swung horizontally leftward past the body's centre line, simulating the right mouse button ("→").
The summoning gesture recognition submodule in turn comprises submodules for recognizing the summoning of the annotation pen, the left and right sidebars, and the upper sidebar.
In the annotation pen summoning recognition submodule, after the user clenches and then releases the right hand for the first time, a crosshair for annotation appears on the screen and follows the movement of the speaker's right hand. The speaker can pick the place to annotate and then clench the right hand, at which point the annotation can be added. To avoid the annotation pen being summoned by mistake, the system restricts the summoning region of the annotation pen: 1) dis1 - dis2 < -0.2, where dis1 is the distance from the neck to the right hand and dis2 is the distance from the head to the hip; 2) RightHand.Position.Y - RightShoulder.Position.Y >= -0.1, where RightHand.Position.Y is the Y coordinate of the centre of the right hand and RightShoulder.Position.Y is the Y coordinate of the centre of the right shoulder.
In the left/right sidebar summoning recognition submodule, the user summons the left or right sidebar by raising the left or right hand naturally. To allow for the user's normal body language, the system restricts how the left and right bars are summoned: 1) dis1 - dis2 > 0, where dis1 is the distance from the neck to the right hand and dis2 is the distance from the head to the hip; 2) the angle between the left (right) hand-to-shoulder direction and the horizontal must be less than 26°; 3) the left or right sidebar is summoned only when the hand has been raised for more than 1 second.
In the upper sidebar summoning recognition submodule, the user only needs to raise the right hand slightly above the head to summon the upper sidebar.
The pause gesture recognition submodule defines two vectors: the direction vector (x1, y1) of the left forearm and the direction vector (x2, y2) of the right forearm.
x1 = LeftHand.Position.X - LeftElbow.Position.X
x2 = RightHand.Position.X - RightElbow.Position.X
y1 = LeftHand.Position.Y - LeftElbow.Position.Y
y2 = RightHand.Position.Y - RightElbow.Position.Y
where LeftHand.Position.X is the X coordinate of the centre of the left hand, LeftElbow.Position.X is the X coordinate of the centre of the left elbow, RightHand.Position.X is the X coordinate of the centre of the right hand, RightElbow.Position.X is the X coordinate of the centre of the right elbow, and the corresponding .Y values are the Y coordinates of the same points. During video playback, when |x1*x2 + y1*y2| < 0.008, that is, when the two forearms are almost perpendicular, the system regards the pose as a pause gesture. To reduce the false-trigger rate, the system imposes the following restrictions on pause gesture recognition:
1) The horizontal midpoint of the two hands should be near the horizontal midpoint of the head: |(LeftHand.Position.X + RightHand.Position.X)/2 - head.Position.X| < 0.08, where head.Position.X is the X coordinate of the centre of the head.
2) The two hands should lie between the two shoulders: (LeftShoulder.Position.X < LeftHand.Position.X) && (RightShoulder.Position.X > RightHand.Position.X), where LeftShoulder.Position.X and RightShoulder.Position.X are the X coordinates of the centres of the left and right shoulders respectively.
The waving gesture recognition submodule uses a wave to exit a function. The maximum duration of one wave recognition is 1 second: if the Kinect counts at least 3 left-right swings within 1 second, the motion is judged to be a wave. Recognition starts when the hand is higher than the elbow. Each time the hand crosses the centre line to the opposite side, a counter is incremented by 1; when the counter exceeds the minimum swing count, the motion is judged to be a wave.
The picture drag-and-scale gesture recognition submodule summons the right-hand picture bar during a PPT presentation. It can display all pictures with extensions such as jpg, png, and bmp in the image folder under the project directory, and these can be dragged into the PPT for display. Extending the right hand to the right for 1 second summons the picture bar on the right; withdrawing the right hand dismisses it. Swinging the right hand up and down scrolls the bar and selects a picture: when the speaker's right hand moves up and down, a selection pattern appears; clenching the right hand and pulling it to the left drags the selected picture into the PPT. In that state, clenching both hands and moving them outward or inward scales the picture about its centre, and clenching the right hand and moving it moves the picture; moving and scaling can be repeated, and waving returns to the normal state. The upper edge of the right hand's selection region is level with the right shoulder, and the region's height is about 0.2 in skeleton units, which maps to about 0.4 m in reality. Raising the hand within the selection region scrolls downward to show the pictures higher in the list; lowering the hand does the opposite; scrolling stops at the head or tail of the list. When the right hand is within the selection region, its height relative to the right shoulder is mapped proportionally onto the screen:
Y = (RightShoulder.Y - RightHand.Y) / 0.2 * ScreenHeight
where Y is the on-screen height corresponding to the hand's position within the selection region, RightShoulder.Y and RightHand.Y are the Y coordinates of the centres of the right shoulder and the right hand respectively, and ScreenHeight is the screen height.
The video playback gesture recognition submodule can summon any video from the video library for playback at any time during the speech. Summoning and scrolling the video bar work like the picture bar, except that video selection is done with the left hand. During playback, the "pause" gesture controls play/pause. The system uses the same gesture for pausing and resuming the video; after each pause or resume it waits 1 s before detecting the next gesture.
In the annotation gesture recognition submodule, when the speaker clenches the right hand, the system enters the line-drawing state and a cursor appears on the screen, its position determined by the position of the speaker's right hand relative to the shoulder. With the right hand in the grasping posture, a red line is drawn at the corresponding position on the screen; releasing the right hand moves the cursor so that the next line segment can be drawn. Finally, the "wave" posture exits the line-drawing state and the drawn lines are erased. The coordinate mapping during drawing uses the position of the right hand relative to the left shoulder, linearly mapping its position within a rectangle onto the screen. The mapping equations are:
X = (RightHand.X - LeftShoulder.X - 0.32) / 0.35 * ScreenWidth
Y = (RightHand.Y - LeftShoulder.Y + 0.08) / 0.16 * ScreenHeight
where RightHand is the coordinate of the right hand, LeftShoulder is the coordinate of the left shoulder, and ScreenWidth and ScreenHeight are the width and height of the current screen, so the mapping adapts to screens of different resolutions.
After the rapid paging gesture recognition submodule summons thumbnails of all slides of the current PPT, the user can slide left and right through the summoned thumbnails and choose one to jump directly to the corresponding slide; this works both in presentation mode and outside it. The speaker raises the right hand above the head to summon the upper sidebar, which shows thumbnails of all slides in the currently displayed PPT. In this state, with the right hand at or above shoulder height, the summoned upper sidebar can be scrolled or a slide thumbnail chosen. When the angle between the forearm and the vertical exceeds 45 degrees, the upper sidebar scrolls: forward when the forearm leans left of the vertical, backward when it leans right. When the angle is less than 45 degrees, a selection pattern appears over the thumbnails currently shown in the upper sidebar; the system divides the middle region into 5 small zones, each corresponding to one of thumbnails 1 to 5, and the choice follows the position of the hand. Clenching the right hand then enters the page-turn state, in which the position of the selection pattern is fixed and no longer changes. In the page-turn state, pulling the right hand down to the abdomen or below jumps to the selected slide, the upper sidebar disappears, and the state returns to normal. Releasing the right hand in the page-turn state returns to the upper-sidebar state; waving in the upper-sidebar state returns to the normal state.
The position computation module analyses the data collected by the Kinect to compute the speaker's offset from the centre of the camera's field of view, and sends a signal over a serial link to the Galileo development board, which controls a motor that rotates the platform horizontally, achieving real-time tracking of the speaker.
In the PPT function control module, annotation pen control uses the Graphics class of the C# programming language to draw the red annotation lines; (rapid) page-turning control is based mainly on the public Microsoft Office PowerPoint API, selecting the assemblies required for the system's interactive control of PPT and manipulating the PPT according to the gesture recognition results; picture display is implemented with the C# PictureBox control; and video playback uses the C# AxWindowsMediaPlayer control to play videos.
Brief description of the drawings
Fig. 1 Hardware composition
Fig. 2 System initialization flow
Fig. 3 System software architecture
Fig. 4 Page-turning gesture
Fig. 5 Left/right sidebar summoning gesture
Fig. 6 Pause gesture
Fig. 7 Waving gesture
Fig. 8 Right-hand picture bar gesture
Fig. 9 Annotation pen drawing-region schematic
Fig. 10 Rapid paging gesture recognition flow chart
Fig. 11 Overall gesture recognition state transition diagram
Fig. 12 Kinect position detection schematic
Fig. 13 Galileo state transition diagram
Fig. 14 Line-drawing control flow chart
Fig. 15 PPT control model
Detailed description of the invention
The system uses the depth information and skeleton information synchronously acquired by the Kinect camera, is developed on the Kinect SDK, and runs on a Bay Trail embedded processor. A motor controls the rotation of the pan-tilt platform: based on the speaker position information reported by the Kinect, the system determines the motor's rotation state and drives the motor through a Galileo development board and a ULN2003 chip, so that the Kinect can track the speaker in real time. Based on the skeleton information, gestures with high distinguishability and low learning cost are designed, allowing the speaker to use gestures to perform a series of practical functions such as (rapid) page turning, using the annotation pen, viewing pictures, and playing videos.
The system hardware architecture takes the Bay Trail embedded processor as its core, connected to the peripherals, where it processes data and sends signals. The Kinect collects depth and skeleton information; the speaker's gesture information is then recognized and, based on C# controls and the public Microsoft Office PowerPoint application programming interface (API), the speaker's gestures control the PPT presentation. The Galileo development board receives motor rotation information and controls the motor; an external display screen shows the projected PPT, pictures, and videos; and the motor follows the speaker's movement in real time to ensure the accuracy of the information the Kinect collects.
The technical solution of the present invention is further described below with reference to an embodiment and the accompanying drawings. The embodiment's system comprises: Kinect + Bay Trail + Galileo development board + motor + screen; the hardware composition is shown in Figure 1.
The principle of the whole system is as follows. The hardware architecture takes the Bay Trail embedded processor as its core, connected to the peripherals, where it processes data and sends signals. The Kinect collects depth and skeleton information (this technique belongs to the prior art and is not part of the present contribution); the system then recognizes the speaker's gestures against the predefined gestures and, by calling the public Microsoft Office PowerPoint API, lets the speaker's gestures control the PPT presentation. The Galileo development board receives motor rotation commands and controls the motor; an external display screen shows the projected PPT, pictures, and videos; and the motor follows the speaker's movement in real time to ensure the accuracy of the information the Kinect collects. The Galileo controls the motor directly: its output port is connected through a ULN2003 chip to the motor's driver module, which controls the motor's rotation.
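As an illustration of this serial control path, the following sketch maps the speaker's horizontal offset from the camera centre to a one-byte pan command. The command bytes, dead-zone width, and port settings are assumptions for illustration only; the patent does not specify the serial protocol.

```python
def rotation_command(offset_x: float, dead_zone: float = 0.15) -> bytes:
    """Map the speaker's horizontal offset from the camera centre
    (normalised to [-1, 1]) to a one-byte pan command for the motor
    controller. Command bytes and dead-zone width are illustrative
    assumptions, not taken from the patent."""
    if offset_x < -dead_zone:
        return b"L"   # pan left toward the speaker
    if offset_x > dead_zone:
        return b"R"   # pan right toward the speaker
    return b"S"       # inside the dead zone: hold position

# The byte would then be written to the Galileo over a serial link,
# e.g. with pySerial: serial.Serial("COM3", 9600).write(rotation_command(dx))
print(rotation_command(-0.4), rotation_command(0.02), rotation_command(0.5))
# → b'L' b'S' b'R'
```

A dead zone around the centre keeps the motor from oscillating when the speaker stands still.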
At start-up the system must first be initialized; the initialization flow is shown in Figure 2.
The system software architecture is shown in Figure 3. The main modules are the data acquisition module, gesture recognition module, position computation module, and PPT function control module; the gesture recognition, position computation, and PPT function control modules all run on the Bay Trail embedded processor. The data acquisition module mainly uses the Kinect to collect the depth and skeleton information required for gesture recognition (already prior art, not part of the present contribution), providing data for gesture recognition and position computation. The gesture control module is the system's control module: it recognizes the user's gestures from the collected data and then carries out the operations the user requires. The position computation module is the interface to the hardware: it rotates the platform according to the computed position, achieving real-time tracking. The PPT function control module provides a set of services to the user, including brush annotation, (rapid) page turning, picture display, and video playback.
1. Data acquisition module
Information is collected using the depth and skeleton acquisition functions of the Kinect SDK. Depth information is collected at a resolution of 640 × 480 at 30 frames per second.
2. Gesture control module
(1) Page-turning gesture recognition
The system uses the two gestures SwipeLeftDetected and SwipeRightDetected defined in the Microsoft.Samples.Kinect.SwipeGestureRecognizer library. The SwipeLeftDetected gesture is defined as the left hand raised naturally to chest height and swung horizontally rightward past the body's centre line, simulating the left mouse button ("←"). The SwipeRightDetected gesture is defined as the right hand raised naturally to chest height and swung horizontally leftward past the body's centre line, simulating the right mouse button ("→"). See Figure 4.
(2) Summoning gesture recognition
This submodule in turn comprises submodules for recognizing the summoning of the annotation pen, the left and right sidebars, and the upper sidebar.
A. Annotation pen summoning recognition
After the user clenches and then releases the right hand for the first time, a crosshair for annotation appears on the screen and follows the movement of the speaker's right hand. The speaker can pick the place to annotate and then clench the right hand, at which point the annotation can be added. To avoid the annotation pen being summoned by mistake, the system restricts the summoning region of the annotation pen: 1) dis1 - dis2 < -0.2, where dis1 is the distance from the neck to the right hand and dis2 is the distance from the head to the hip; 2) RightHand.Position.Y - RightShoulder.Position.Y >= -0.1, where RightHand.Position.Y is the Y coordinate of the centre of the right hand and RightShoulder.Position.Y is the Y coordinate of the centre of the right shoulder.
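The two gating conditions above can be sketched as a small predicate. This is an illustrative Python version: the function name and joint layout are ours, while the thresholds -0.2 and -0.1 come from the text.

```python
import math

def annotation_pen_summoned(neck, right_hand, head, hip, right_shoulder):
    """Check the two conditions the text places on summoning the
    annotation pen. Joints are (x, y, z) tuples in skeleton space
    (metres, y up); the helper name is an illustration, not the
    patent's API."""
    dis1 = math.dist(neck, right_hand)   # neck-to-right-hand distance
    dis2 = math.dist(head, hip)          # head-to-hip distance
    held_close = dis1 - dis2 < -0.2      # hand held close to the body
    hand_high = right_hand[1] - right_shoulder[1] >= -0.1  # near shoulder height
    return held_close and hand_high

# Hand at chest height near the body: both conditions hold.
print(annotation_pen_summoned(
    (0, 1.4, 2), (0.1, 1.35, 2), (0, 1.6, 2), (0, 0.9, 2), (0.2, 1.4, 2)))
# → True
```

The second condition rules out accidental triggers while the hand hangs at the speaker's side.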
B. Left/right sidebar summoning recognition
The user summons the left or right sidebar by raising the left or right hand naturally. To allow for the user's normal body language, the system restricts how the left and right bars are summoned: 1) dis1 - dis2 > 0, where dis1 is the distance from the neck to the right hand and dis2 is the distance from the head to the hip; 2) the angle between the left (right) hand-to-shoulder direction and the horizontal must be less than 26°; 3) the left or right sidebar is summoned only when the hand has been raised for more than 1 second. See Figure 5.
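A minimal sketch of the three conditions, assuming (x, y, z) joint tuples in metres; the function name and the way the hold time is supplied are our assumptions, while the thresholds (0, 26°, 1 s) are from the text.

```python
import math

def sidebar_summoned(neck, hand, head, hip, shoulder, held_seconds):
    """Illustrative check of the three sidebar conditions: the arm is
    extended (dis1 - dis2 > 0), the shoulder-to-hand direction is within
    26 degrees of horizontal, and the pose has been held for over 1 s."""
    dis1 = math.dist(neck, hand)   # neck-to-hand distance
    dis2 = math.dist(head, hip)    # head-to-hip distance
    dx = hand[0] - shoulder[0]
    dy = hand[1] - shoulder[1]
    angle = math.degrees(math.atan2(abs(dy), abs(dx)))  # angle from horizontal
    return dis1 - dis2 > 0 and angle < 26.0 and held_seconds > 1.0

# Arm stretched out horizontally for 1.5 s: summoned.
print(sidebar_summoned((0, 1.4, 0), (0.8, 1.4, 0),
                       (0, 1.6, 0), (0, 0.9, 0), (0.2, 1.4, 0), 1.5))
# → True
```

The angle bound distinguishes a deliberate sideways reach from ordinary gesturing while speaking.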
C. Upper sidebar summoning recognition
The user only needs to raise the right hand slightly above the head to summon the upper sidebar.
(3) Pause gesture recognition
Two vectors are defined: the direction vector (x1, y1) of the left forearm and the direction vector (x2, y2) of the right forearm.
x1 = LeftHand.Position.X - LeftElbow.Position.X
x2 = RightHand.Position.X - RightElbow.Position.X
y1 = LeftHand.Position.Y - LeftElbow.Position.Y
y2 = RightHand.Position.Y - RightElbow.Position.Y
where LeftHand.Position.X is the X coordinate of the centre of the left hand, LeftElbow.Position.X is the X coordinate of the centre of the left elbow, RightHand.Position.X is the X coordinate of the centre of the right hand, RightElbow.Position.X is the X coordinate of the centre of the right elbow, and the corresponding .Y values are the Y coordinates of the same points. During video playback, when |x1*x2 + y1*y2| < 0.008, that is, when the two forearms are almost perpendicular, the system regards the pose as a pause gesture. See Figure 6.
To reduce the false-trigger rate, the system imposes the following restrictions on pause gesture recognition:
1) The horizontal midpoint of the two hands should be near the horizontal midpoint of the head: |(LeftHand.Position.X + RightHand.Position.X)/2 - head.Position.X| < 0.08, where head.Position.X is the X coordinate of the centre of the head.
2) The two hands should lie between the two shoulders: (LeftShoulder.Position.X < LeftHand.Position.X) && (RightShoulder.Position.X > RightHand.Position.X), where LeftShoulder.Position.X and RightShoulder.Position.X are the X coordinates of the centres of the left and right shoulders respectively.
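Putting the dot-product test and the two restrictions together, an illustrative Python sketch using (x, y) joint positions (only the thresholds 0.008, 0.08, and the between-shoulders ordering come from the text; the function name is ours):

```python
def is_pause_gesture(lh, le, rh, re, head, ls, rs):
    """lh/le = left hand/elbow, rh/re = right hand/elbow,
    ls/rs = left/right shoulder, all as (x, y) positions in metres.
    Returns True when the two forearms cross almost perpendicularly
    in front of the chest, per the conditions in the text."""
    x1, y1 = lh[0] - le[0], lh[1] - le[1]   # left forearm direction
    x2, y2 = rh[0] - re[0], rh[1] - re[1]   # right forearm direction
    crossed = abs(x1 * x2 + y1 * y2) < 0.008          # nearly perpendicular
    centred = abs((lh[0] + rh[0]) / 2 - head[0]) < 0.08  # midpoint near head
    between = ls[0] < lh[0] and rh[0] < rs[0]         # hands between shoulders
    return crossed and centred and between

# Forearms crossed in an X in front of the chest: pause detected.
print(is_pause_gesture((0.05, 1.3), (-0.15, 1.1), (-0.05, 1.3), (0.15, 1.1),
                       (0, 1.6), (-0.2, 1.4), (0.2, 1.4)))
# → True
```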
(4) Waving gesture recognition
The maximum duration of one wave recognition is 1 second: if the Kinect counts at least 3 left-right swings within 1 second, the motion is judged to be a wave. Recognition starts when the hand is higher than the elbow. Each time the hand crosses the centre line to the opposite side, a counter is incremented by 1; when the counter exceeds the minimum swing count, the motion is judged to be a wave. The "wave" gesture is used to exit a function; the exit gesture and its usage scene are shown in Figure 7.
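The crossing counter can be sketched as follows, assuming the hand's x-position is sampled over the one-second window (the sampling scheme and function name are our assumptions; the threshold of 3 crossings is from the text):

```python
def is_wave(hand_xs, centre_x, min_crossings=3):
    """Count how many times the hand crosses the body's centre line in a
    sampled sequence of hand x-positions; at least min_crossings within
    the one-second window counts as a wave."""
    crossings = 0
    side = None
    for x in hand_xs:
        s = "left" if x < centre_x else "right"
        if side is not None and s != side:
            crossings += 1   # hand moved to the other side of the centre line
        side = s
    return crossings >= min_crossings

# Three crossings of the centre line within the window: a wave.
print(is_wave([-0.2, 0.2, -0.2, 0.2], 0.0))
# → True
```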
(5) Picture drag-and-scale gesture recognition
During a PPT presentation, summoning the right-hand picture bar displays all pictures with extensions such as jpg, png, and bmp in the image folder under the project directory; these can be dragged into the PPT for display. Extending the right hand to the right for 1 second summons the picture bar on the right; withdrawing the right hand dismisses it. The bar is scrolled and pictures are selected by swinging the right hand up and down. The upper edge of the right hand's selection region is level with the right shoulder, and the region's height is about 0.2 in skeleton units, about 0.4 m in reality. Raising the hand within the selection region scrolls downward to show the pictures higher in the list; lowering the hand does the opposite; scrolling stops at the head or tail of the list. When the right hand is within the selection region, its height relative to the right shoulder is mapped proportionally onto the screen:
Y = (RightShoulder.Y - RightHand.Y) / 0.2 * ScreenHeight
where Y is the on-screen height corresponding to the hand's position within the selection region, RightShoulder.Y and RightHand.Y are the Y coordinates of the centres of the right shoulder and the right hand respectively, and ScreenHeight is the screen height.
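A sketch of this proportional mapping; the formula is the one given above, while the clamping to the screen bounds is our addition, not in the text.

```python
def picture_bar_y(right_shoulder_y, right_hand_y, screen_height):
    """Map the right hand's height below the right shoulder (a 0.2-unit
    selection band in skeleton space) proportionally onto the screen."""
    y = (right_shoulder_y - right_hand_y) / 0.2 * screen_height
    return max(0, min(screen_height, y))  # keep the cursor on screen

# Hand halfway down the 0.2-unit band maps to mid-screen on a 1080-pixel display.
print(picture_bar_y(1.4, 1.3, 1080))
```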
When the speaker's right hand moves up and down, a selection pattern appears over the pictures; clenching the right hand and pulling it to the left drags the selected picture into the PPT. In that state, clenching both hands and moving them outward or inward scales the picture about its centre, and clenching the right hand and moving it moves the picture; moving and scaling can be repeated, and waving returns to the normal state. See Figure 8.
(6) Video playback gesture recognition
Summoning and scrolling the video bar work like the picture bar, except that video selection is done with the left hand. During playback, the "pause" gesture controls play/pause. The system uses the same gesture for pausing and resuming the video; after each pause or resume it waits 1 second before detecting the next gesture.
(7) Annotation gesture recognition
When the speaker clenches the right hand, the system enters the line-drawing state and a cursor appears on the screen, its position determined by the position of the speaker's right hand relative to the shoulder. With the right hand in the grasping posture, a red line is drawn at the corresponding position on the screen; releasing the right hand moves the cursor so that the next line segment can be drawn. Finally, the "wave" posture exits the line-drawing state and the drawn lines are erased. The coordinate mapping during drawing uses the position of the right hand relative to the left shoulder, linearly mapping its position within a rectangle onto the screen. The mapping equations are:
X = (RightHand.X - LeftShoulder.X - 0.32) / 0.35 * ScreenWidth
Y = (RightHand.Y - LeftShoulder.Y + 0.08) / 0.16 * ScreenHeight
where RightHand is the coordinate of the right hand, LeftShoulder is the coordinate of the left shoulder, and ScreenWidth and ScreenHeight are the width and height of the current screen, so the mapping adapts to screens of different resolutions. See Figure 9.
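The mapping can be sketched as follows. The offsets and the 0.35 × 0.16 rectangle come from the equations in the text (read with subtraction restored where operators were lost); the clamping is our addition so the cursor cannot leave the screen.

```python
def annotation_cursor(right_hand, left_shoulder, screen_w, screen_h):
    """Linearly map the right hand's (x, y) position relative to the
    left shoulder onto the screen, using the rectangle and offsets
    given in the text."""
    x = (right_hand[0] - left_shoulder[0] - 0.32) / 0.35 * screen_w
    y = (right_hand[1] - left_shoulder[1] + 0.08) / 0.16 * screen_h
    clamp = lambda v, hi: max(0.0, min(float(hi), v))  # stay on screen
    return clamp(x, screen_w), clamp(y, screen_h)

# Hand at the centre of the mapping rectangle lands at screen centre.
print(annotation_cursor((0.495, 0.0), (0.0, 0.0), 1920, 1080))
```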
(8) Rapid paging gesture recognition
After this submodule summons thumbnails of all slides of the current PPT, the user can slide left and right through the summoned thumbnails and choose one to jump directly to the corresponding slide; this works both in presentation mode and outside it. The speaker raises the right hand above the head to summon the upper sidebar, which shows thumbnails of all slides in the currently displayed PPT. In this state, with the right hand at or above shoulder height, the summoned upper sidebar can be scrolled or a slide thumbnail chosen. When the angle between the forearm and the vertical exceeds 45 degrees, the upper sidebar scrolls: forward when the forearm leans left of the vertical, backward when it leans right. When the angle is less than 45 degrees, a selection pattern appears over the thumbnails currently shown in the upper sidebar; the system divides the middle region into 5 small zones, each corresponding to one of thumbnails 1 to 5, and the choice follows the position of the hand. Clenching the right hand then enters the page-turn state, in which the position of the selection pattern is fixed and no longer changes. In the page-turn state, pulling the right hand down to the abdomen or below jumps to the selected slide, the upper sidebar disappears, and the state returns to normal. Releasing the right hand in the page-turn state returns to the upper-sidebar state; waving in the upper-sidebar state returns to the normal state. The rapid paging gesture recognition flow chart is shown in Figure 10.
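The paging states and transitions described above can be summarized as a small transition table; the state and event names are ours, chosen to mirror the prose.

```python
# Transition table for the rapid-paging gesture states described above.
TRANSITIONS = {
    ("normal", "raise_right_hand_overhead"): "upper_sidebar",
    ("upper_sidebar", "grip_right_hand"): "page_turn",
    ("upper_sidebar", "wave"): "normal",
    ("page_turn", "release_right_hand"): "upper_sidebar",
    ("page_turn", "pull_down_to_abdomen"): "normal",  # jumps to the chosen slide
}

def step(state, event):
    """Advance the paging state machine; unrecognized events leave it alone."""
    return TRANSITIONS.get((state, event), state)

# Summon the sidebar, grip a thumbnail, pull down to jump: back to normal.
s = step("normal", "raise_right_hand_overhead")   # → "upper_sidebar"
s = step(s, "grip_right_hand")                    # → "page_turn"
s = step(s, "pull_down_to_abdomen")               # → "normal"
print(s)
# → normal
```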
The state transition diagram for all gesture recognition is shown in Figure 11. After system initialization the state is normal. In the normal state: if the recognized gesture is the right hand raised above the head, the system moves to the top-bar call-up state; if it is the right hand stretched out a certain distance for more than 1 second, it moves to the right-bar call-up state; if it is the right hand clenched and then released for the first time, it moves to the pre-annotation state; if it is the left hand stretched out a certain distance for more than 1 second, it moves to the left-bar call-up state. In the top-bar call-up state, a left-hand wave returns to the normal state, and a right-hand clench moves to the page-turn state. In the page-turn state, releasing the right hand returns to the top-bar call-up state. In the right-bar call-up state, lowering the right hand returns to the normal state, while clenching the right hand and pulling a picture in moves to the picture display state. In the picture display state, a wave returns to the normal state. In the left-bar call-up state, lowering the left hand returns to the normal state, while clenching the left hand moves to the video display state. In the video display state, a wave returns to the normal state, and releasing the left hand moves to the state of playing the video shown in the left bar. In the video playback state, a wave returns to the normal state, and the pause gesture returns to the video display state. In the pre-annotation state, clenching the right hand moves to the line-drawing state; in the line-drawing state, releasing the right hand moves back to the pre-annotation state.
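The transition diagram described above is naturally expressed as a lookup table. The sketch below is illustrative only; the state and gesture names are our own shorthand for the states in the text, not identifiers from the system:

```python
# Table-driven sketch of the gesture state machine (names are shorthand).
TRANSITIONS = {
    ('normal', 'right_hand_overhead'): 'top_bar',
    ('normal', 'right_hand_out_1s'): 'right_bar',
    ('normal', 'right_grip_release'): 'pre_annotate',
    ('normal', 'left_hand_out_1s'): 'left_bar',
    ('top_bar', 'left_wave'): 'normal',
    ('top_bar', 'right_grip'): 'page_turn',
    ('page_turn', 'right_release'): 'top_bar',
    ('right_bar', 'right_hand_down'): 'normal',
    ('right_bar', 'grip_and_pull'): 'picture_display',
    ('picture_display', 'wave'): 'normal',
    ('left_bar', 'left_hand_down'): 'normal',
    ('left_bar', 'left_grip'): 'video_display',
    ('video_display', 'wave'): 'normal',
    ('video_display', 'left_release'): 'video_playing',
    ('video_playing', 'wave'): 'normal',
    ('video_playing', 'pause_gesture'): 'video_display',
    ('pre_annotate', 'right_grip'): 'drawing',
    ('drawing', 'right_release'): 'pre_annotate',
}

def step(state, gesture):
    """Advance the recognizer; unrecognized gestures leave the state unchanged."""
    return TRANSITIONS.get((state, gesture), state)
```

Keeping the transitions in one table makes the diagram of Figure 11 directly auditable against the prose.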
3. Position calculation module
The speaker's position is collected by the Kinect. From the X coordinate of the head joint in the skeleton stream, the system judges which direction the platform currently needs to rotate and sends a control string ("l", "r" or "s") through a USB-to-serial converter to the Galileo board; the Galileo board controls the rotation of the motor by setting the high/low levels of its D8-D11 output ports. The Kinect's detection of the speaker's X coordinate is shown in Figure 12:
The horizontal center of the screen is the coordinate origin; with the user facing the Kinect, the left side is negative and the right side positive. On initialization the system stays in the Stay state. Once the absolute value of X exceeds the threshold 0.05 (here ±0.05 means an offset of 0.05 m from the horizontal center), a control string is sent over the serial port and the current state changes to rotating. For example, when X < -0.05 an "l" is sent to the Galileo board, making the pan-tilt platform rotate clockwise to track the speaker's movement, until the absolute value of X falls below 0.02 and the system returns to the rest state. This return threshold is set smaller to ensure that the person is roughly centered when the rest state is re-entered. Whatever the current running state, receiving an "l" character keeps the platform rotating clockwise, receiving an "r" keeps it rotating counter-clockwise, and receiving an "s" keeps it still. The benefit of this design is that the Galileo board never receives the same instruction repeatedly; otherwise instructions would be sent faster than they could be consumed, pile up in the buffer, and delay the Galileo board's reaction. The state transition diagram is shown in Figure 14:
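The start/stop thresholds above form a hysteresis controller, which can be sketched as follows. The thresholds and the send-only-on-change behavior come from the text; the class shape and names are illustrative:

```python
class PanTracker:
    """Hysteresis controller for the pan-tilt platform: start rotating when
    |X| exceeds 0.05 m, stop once |X| falls below 0.02 m, and emit a serial
    character only when the command changes, so the Galileo board never
    receives the same instruction twice in a row."""
    def __init__(self, start=0.05, stop=0.02):
        self.start, self.stop = start, stop
        self.cmd = 's'  # rest state

    def update(self, x):
        """Return the character to send for head X offset `x`, or None."""
        if self.cmd == 's':
            if x < -self.start:
                new = 'l'
            elif x > self.start:
                new = 'r'
            else:
                new = 's'
        else:
            # keep rotating until the person is nearly centered again
            new = 's' if abs(x) < self.stop else self.cmd
        out = new if new != self.cmd else None  # send only on change
        self.cmd = new
        return out
```

The asymmetric thresholds prevent oscillation: rotation begins only on a clear offset and ends only once the speaker is well centered.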
4. PPT function control module
(1) Annotation pen control
Red lines are drawn with the Graphics class in C#. Graphics is a GDI+ drawing class packaged in System.Drawing that can draw points, lines, rectangles, circles and other simple shapes on an application window or a layer. Corresponding to the right-hand position within the virtual rectangle, a black cross mark represents the current cursor position on the screen; the mark is composed of two elongated black rectangles and moves on the screen in real time following the position of the speaker's right hand. When the cross mark moves across the window, the red lines it passes over would be erased. To solve this, a separate layer is used for drawing the red lines: a bitmap DrawBmp of the same size as the current screen is generated, together with a Graphics object that can draw onto this bitmap. In the line-drawing state, every frame of skeleton data yields the coordinate of one point mapped onto the screen. Because the Kinect samples at 30 frames per second, drawing only one isolated point per frame would produce discrete dots that do not join into a line, so each pair of consecutive points is connected by a line segment. A flag records whether the current point is the first one; if so, its position is stored as prePoint. In each subsequent frame, the new point's coordinate is stored as nowPoint, and the C# call DrawGra.DrawLine(RedPen, prePoint, nowPoint) draws a straight line between the two points onto DrawBmp (RedPen is an instance of the C# Pen class, used to set the color and thickness of the line). The Graphics object of the full-screen window then draws DrawBmp into the application window with G.DrawImage(DrawBmp, 0, 0); the two zeros place the top-left corner of the drawn picture at the top-left corner of the screen so that it exactly covers the whole window. Because the red lines and the cross mark are on different layers, the lines are not erased. Finally nowPoint is assigned to prePoint and the system waits for the next frame's data. Repeating this cycle completes the line-drawing function. The complete flow is shown in Figure 14:
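The prePoint/nowPoint pairing is the core of the smoothing logic and can be shown language-independently. This sketch is illustrative, standing in for the C# DrawLine calls; the function name is an assumption:

```python
def points_to_segments(points):
    """Turn the per-frame cursor points (30 fps from the skeleton stream)
    into line segments, pairing each point with its predecessor exactly as
    the prePoint/nowPoint logic does; the first point only primes
    prePoint and draws nothing."""
    segments = []
    pre = None  # flag: no previous point seen yet
    for now in points:
        if pre is not None:
            segments.append((pre, now))  # stands in for DrawLine(pre, now)
        pre = now
    return segments
```

Connecting consecutive samples rather than plotting isolated dots is what turns the 30 fps point stream into a continuous stroke.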
(2) Quick page-turn control module
To turn PPT pages, the object of the slideshow application must first be obtained. This module is a secondary development based on the public APIs of Microsoft Office PowerPoint, using the primary interop assembly (PIA). A primary interop assembly is a unique assembly provided by the vendor; it contains the type definitions (as metadata) of types implemented in COM. Only one primary interop assembly may exist, and it must be signed with a strong name by the publisher of the COM type library. A single primary interop assembly can wrap multiple versions of the same type library. In the PIA, Microsoft.Office.Interop.PowerPoint.Application represents the PPT application. The object model is shown in Figure 15:
After the slideshow application object is obtained, the slide object, i.e. Microsoft.Office.Interop.PowerPoint.Slide, must be obtained. Slide objects live inside a presentation object, so the Microsoft.Office.Interop.PowerPoint.Presentation object must be obtained first. Once a slide object has been obtained, its Select method can be used to perform the page-turn operation.
Most of the time, however, the PPT is in slide-show mode, in which pages cannot be turned by operating on slide objects. The Application.SlideShowWindows.View object must then be obtained; the view object represents the slide show being played, and its First, Last, Previous, Next and GotoSlide methods turn pages during projection.
(3) Picture display
Pictures are displayed with the PictureBox class in C#. With the display mode of the PictureBox set to stretch, the picture follows changes to the PictureBox's size and position attributes, so the picture's state can be changed simply by changing the attributes of the PictureBox control. The Top and Left attributes of the PictureBox determine where it is displayed, and these two attributes are determined by mapping the hand position to a screen position. Picture zooming is implemented by recording the distance between the two hands at the instant both hands clench, and then resizing the PictureBox in real time according to the ratio between the current two-hand distance and that initial distance. The picture bar on the right is generated when the whole system initializes. First, the statement String path = Environment.CurrentDirectory + "image" opens the image folder under the current program directory; all files with jpg, png and bmp extensions are traversed and their number is recorded as imageFileNum. Because every picture in the bar is displayed with width and height no larger than 150 pixels and a 30-pixel gap is left between two pictures, the height of the whole right-bar picture RightBmp should be (30+imageFileNum*180) pixels, with the width set to 200 pixels. After a picture of suitable size is generated, the thumbnail of each picture is drawn in turn at its corresponding position in RightBmp, and an array records the name of each picture for later use when a picture is selected. RightBmp is placed into a PictureBox, RightArea, whose Visible attribute is initially set to false until the bar is called up.
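The bar geometry described above is simple arithmetic and can be checked with a short sketch. The 150 px thumbnails, 30 px gaps, 200 px width and the (30+n*180) height come from the text; the per-thumbnail vertical positions are our illustrative reading of "drawn in turn at its corresponding position":

```python
def right_bar_layout(image_file_num, thumb=150, gap=30, width=200):
    """Compute the right picture-bar geometry: each thumbnail occupies a
    150 px square with a 30 px gap, giving a bar (30 + n*180) px tall.
    Returns (width, height, top_y_of_each_thumbnail)."""
    height = gap + image_file_num * (thumb + gap)
    tops = [gap + i * (thumb + gap) for i in range(image_file_num)]
    return width, height, tops
```

For 3 images this yields a 200x570 bar with thumbnails at y = 30, 210 and 390, matching the (30+imageFileNum*180) formula.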
(4) Video playback function
Video is played with the AxWindowsMediaPlayer control in C#. MediaPlayer (wmp) is a COM component provided by Microsoft that makes it easy to operate on streaming media files such as video and audio. After a video is selected in the video bar, its path is assigned as the Url of wmp, wmp.Visible is set to true, and wmp.Ctlcontrols.play() is called to start playback. To pause, wmp.Ctlcontrols.pause() is called. Note that, for simplicity of operation, the same gesture (similar to a T shape) controls both pausing and playing, so after each pause or play the system waits 1 second before detecting the gesture again.
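The shared play/pause gesture with a 1-second dead time can be sketched as a small toggle. This is illustrative only; the class name is an assumption, and time is passed in explicitly to keep the sketch testable:

```python
class PlayPauseToggle:
    """One 'T' gesture toggles between play and pause; after each toggle,
    gesture detection is suppressed for 1 second so a single held pose is
    not read as several toggles."""
    def __init__(self, cooldown=1.0):
        self.cooldown = cooldown
        self.playing = False
        self.last = -float('inf')

    def on_gesture(self, t):
        """Handle a detected T gesture at time t (seconds); return the
        resulting playing state."""
        if t - self.last < self.cooldown:
            return self.playing  # still inside the 1 s dead time: ignore
        self.playing = not self.playing
        self.last = t
        return self.playing
```

Without the cooldown, the recognizer would see the same held pose on many consecutive frames and rapidly flip between play and pause.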

Claims (10)

1. A Kinect-based PPT presentation aid system, characterized by comprising a data acquisition module, a gesture recognition module, a position calculation module and a PPT function control module;
the data acquisition module comprises a Kinect mounted on a pan-tilt platform whose rotation angle is controlled by a motor; the Kinect is mainly used to sample depth information and skeleton information, the depth information being used to calculate the position of the user (speaker) and the skeleton information being used for gesture recognition;
the gesture recognition module recognizes the user's (speaker's) gestures from the collected data and then performs the operations the user requires;
the position calculation module determines the user's (speaker's) head position from the collected data and steers the rotation of the pan-tilt platform to achieve real-time tracking;
the PPT function control module provides a series of services to the user, including functions such as annotation pen, (quick) PPT page turning, picture display and video playback.
2. The Kinect-based PPT presentation aid system of claim 1, characterized in that,
the gesture recognition module recognizes, from the depth and skeleton information collected by the Kinect and according to a predefined gesture library, the gestures the speaker makes during the PPT presentation, including submodules for recognizing basic gestures such as the page-turn gesture, the call-up gesture, the pause gesture and the wave gesture, as well as gesture recognition submodules for picture dragging and zooming, video playback, the annotation pen and the quick page-turn function.
3. The Kinect-based PPT presentation aid system of claim 2, characterized in that the page-turn gesture recognition module uses the two gestures SwipeLeftDetected and SwipeRightDetected defined in the Microsoft.Samples.Kinect.SwipeGestureRecognizer library. SwipeLeftDetected is defined as the left hand naturally raised to chest height and swung horizontally to the right across the body centerline, simulating the mouse left button "←". SwipeRightDetected is defined as the right hand naturally raised to chest height and swung horizontally to the left across the body centerline, simulating the mouse right button "→".
4. The Kinect-based PPT presentation aid system of claim 2, characterized in that, in the annotation pen call-up recognition submodule, after the user clenches and releases the right hand for the first time, a tracking cross for annotating appears on the screen and follows the movement of the speaker's right hand; the region in which the annotation pen can be called up is limited by:
1) dis1-dis2 < -0.2, where dis1 is the distance from the neck to the right hand and dis2 is the distance from the head to the hip;
2) RightHand.Position.Y-RightShoulder.Position.Y >= -0.1,
where RightHand.Position.Y represents the Y coordinate of the right-hand center point and RightShoulder.Position.Y represents the Y coordinate of the right-shoulder center point.
5. The Kinect-based PPT presentation aid system of claim 2, characterized in that, in the left/right sidebar call-up recognition submodule, the user calls up the left or right sidebar by naturally raising the left or right hand; taking the user's normal body language into account, the call-up is limited by:
1) dis1-dis2 > 0, where dis1 is the distance from the neck to the right hand and dis2 is the distance from the head to the hip;
2) the angle between the left (right) hand-to-shoulder direction and the horizontal must be less than 26°;
3) the left or right sidebar is called up only when the hand has been raised for more than 1 second.
6. The Kinect-based PPT presentation aid system of claim 2, characterized in that the pause gesture recognition submodule sets two vectors: the direction vector (x1, y1) of the left forearm and the direction vector (x2, y2) of the right forearm:
x1=LeftHand.Position.X-LeftElbow.Position.X
x2=RightHand.Position.X-RightElbow.Position.X
y1=LeftHand.Position.Y-LeftElbow.Position.Y
y2=RightHand.Position.Y-RightElbow.Position.Y
Wherein, LeftHand.Position.X represents the X-axis coordinate of left hand central point,
LeftElbow.Position.X represents the X-axis coordinate of left elbow central point,
RightHand.Position.X represents the X-axis coordinate of right hand central point,
RightElbow.Position.X represents the X-axis coordinate of right elbow central point,
LeftHand.Position.Y represents the Y-axis coordinate of left hand central point,
LeftElbow.Position.Y represents the Y-axis coordinate of left elbow central point,
RightHand.Position.Y represents the Y-axis coordinate of right hand central point,
RightElbow.Position.Y represents the Y-axis coordinate of right elbow central point;
During video playback, when |x1*x2+y1*y2| < 0.008, i.e. when the two forearms are almost perpendicular, the system considers that a pause gesture has been formed;
the pause gesture recognition is subject to the following constraints:
1) the horizontal midpoint of the two hands should be near the horizontal midpoint of the head: |(LeftHand.Position.X+RightHand.Position.X)/2-head.Position.X| < 0.08, where head.Position.X represents the X coordinate of the head center point;
2) both hands should be between the two shoulders: (LeftShoulder.Position.X < LeftHand.Position.X) and (RightShoulder.Position.X > RightHand.Position.X), where LeftShoulder.Position.X and RightShoulder.Position.X represent the X coordinates of the left-shoulder and right-shoulder center points respectively.
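The pause-gesture condition of claim 6 can be sketched directly from its formulas. This is an illustration under stated assumptions: joints are passed as (x, y) tuples, and the function name is ours:

```python
def is_pause_gesture(lh, le, rh, re, head, ls, rs):
    """Check the pause ('T') gesture: forearm direction vectors almost
    perpendicular (|x1*x2 + y1*y2| < 0.008), hands centered on the head,
    and both hands kept between the shoulders.  Arguments are (x, y)
    joint positions: left/right hand, left/right elbow, head, shoulders."""
    x1, y1 = lh[0] - le[0], lh[1] - le[1]   # left forearm vector
    x2, y2 = rh[0] - re[0], rh[1] - re[1]   # right forearm vector
    perpendicular = abs(x1 * x2 + y1 * y2) < 0.008
    centered = abs((lh[0] + rh[0]) / 2 - head[0]) < 0.08
    between = ls[0] < lh[0] and rs[0] > rh[0]
    return perpendicular and centered and between
```

The near-zero dot product captures the "T" shape; the two extra constraints reject incidental poses made away from the body midline.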
7. The Kinect-based PPT presentation aid system of claim 2, characterized in that the wave gesture recognition submodule uses a "waving" gesture to exit functions; the maximum duration of wave recognition is 1 second, and if the Kinect recognizes at least 3 left-right swings within 1 second, the gesture is judged to be a wave;
recognition starts when the hand is higher than the elbow;
when the hand crosses the centerline to the other side, a counter is incremented by 1;
when the counter exceeds the minimum number of swings, the gesture is judged to be a wave.
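The counting rule of claim 7 can be sketched over a one-second window of samples. The above-elbow gate, the centerline-crossing counter and the threshold of 3 come from the claim; the windowing and data layout are assumptions:

```python
def is_wave(hand_xs, hand_ys, elbow_y, center_x=0.0, min_swings=3):
    """Count centerline crossings within one second of hand samples: the
    hand must be above the elbow, and each crossing from one side of the
    centerline to the other increments a counter; reaching min_swings
    crossings is judged a wave."""
    count = 0
    side = None
    for x, y in zip(hand_xs, hand_ys):
        if y <= elbow_y:
            side = None  # recognition only runs while hand is above elbow
            continue
        s = 'left' if x < center_x else 'right'
        if side is not None and s != side:
            count += 1  # hand crossed the centerline to the other side
        side = s
    return count >= min_swings
```

Counting crossings rather than raw motion makes the detector robust to slow drift: only genuine back-and-forth swings accumulate.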
8. The Kinect-based PPT presentation aid system of claim 2, characterized in that the picture dragging and zooming gesture recognition submodule calls up the right picture bar during the PPT presentation and displays every picture with a jpg, png or bmp extension in the image folder under the project directory so that it can be dragged into the PPT for display; stretching the right hand out to the right for 1 second calls up the right picture bar, and withdrawing the right hand makes the bar disappear;
the right hand swung up and down scrolls the picture bar and selects pictures: as the speaker's right hand moves up and down, a selection pattern follows it to choose a picture; clenching the right hand and pulling it to the left drags the picture into the PPT; while holding this pose, clenching both hands and moving them outward or inward zooms the picture about its center, and clenching the right hand and moving it moves the picture; moving and zooming can be repeated, and waving in this pose returns to the normal state;
the upper edge of the right hand's selection region is about level with the right shoulder, and the region's height is 0.2, mapped to 0.4 m in reality; raising the hand slides the selection region downward to show the pictures higher in the list, lowering the hand does the opposite, and sliding stops at the head or tail of the list;
when the right hand falls within the selection region, it is mapped onto the screen in proportion to its height relative to the right shoulder:
Y=(RightShoulder.Y-RightHand.Y)/0.2*ScreenHeight
where Y is the height within the right hand's selection region, RightShoulder.Y and RightHand.Y are the Y coordinates of the right-shoulder and right-hand center points respectively, and ScreenHeight is the screen height.
9. The Kinect-based PPT presentation aid system of claim 2, characterized in that, in the annotation gesture recognition submodule, when the speaker clenches the right hand the system enters the line-drawing state and a cursor appears on the screen, its position determined by the position of the speaker's right hand relative to the shoulder; making a grasping pose with the right hand draws a red line at the corresponding position on the screen, while releasing the right hand moves the cursor so that the next segment of line can be drawn; finally, the "waving" pose exits the line-drawing state and the drawn lines are erased; the coordinate mapping during drawing uses the relative position between the right hand and the left shoulder, linearly mapping the position within a corresponding rectangular frame onto the screen with the mapping equations:
X=(RightHand.X-LeftShouder.X-0.32)/0.35*ScreenWidth
Y=(RightHand.Y-LeftShouder.Y+0.08)/0.16*ScreenHeight
where RightHand is the right-hand coordinate, LeftShouder is the left-shoulder coordinate, and ScreenWidth and ScreenHeight are the width and height of the current screen, so the mapping adapts to screens of different resolutions.
10. The Kinect-based PPT presentation aid system of claim 2, characterized in that, after the quick page-turn gesture recognition submodule calls up the thumbnails of all slides of the current PPT, the user can slide left and right to browse all the called-up thumbnails, select one, and jump directly to the corresponding slide; this quick page-turn works in both slide-show mode and normal mode;
the speaker raises the right hand above the head to call up the top sidebar, which shows the thumbnails of all slides in the currently displayed PPT; while the right hand is held at or above shoulder level, the top sidebar can be scrolled or a slide thumbnail selected; when the angle between the forearm and the normal direction exceeds 45 degrees, the top sidebar scrolls, sliding forward when the forearm is to the left of the normal and backward when it is to the right; when the angle is less than 45 degrees, a selection pattern appears for choosing one of the thumbnails shown in the top sidebar; the middle region is divided into 5 sub-regions, each corresponding to thumbnail 1 to 5, and the selection follows the position of the hand; clenching the right hand then enters the page-turn state, in which the pattern's position is fixed and no longer changes; in the page-turn state, pulling the right hand down to the abdomen or below jumps directly to the selected slide, the top sidebar disappears and the state changes to normal; if the right hand is released in the page-turn state, the state changes to the top-sidebar state; waving in the top-sidebar state changes to the normal state.
CN201610472100.3A 2016-06-24 2016-06-24 PPT based on Kinect demonstrates aid system Pending CN106125928A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610472100.3A CN106125928A (en) 2016-06-24 2016-06-24 PPT based on Kinect demonstrates aid system


Publications (1)

Publication Number Publication Date
CN106125928A true CN106125928A (en) 2016-11-16

Family

ID=57269466

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610472100.3A Pending CN106125928A (en) 2016-06-24 2016-06-24 PPT based on Kinect demonstrates aid system

Country Status (1)

Country Link
CN (1) CN106125928A (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106886284A (en) * 2017-01-20 2017-06-23 西安电子科技大学 A kind of Cultural relics in museum interactive system based on Kinect
CN107678425A (en) * 2017-08-29 2018-02-09 南京理工大学 A kind of car controller based on Kinect gesture identifications
CN108089715A (en) * 2018-01-19 2018-05-29 赵然 A kind of demonstration auxiliary system based on depth camera
CN109313485A (en) * 2017-02-18 2019-02-05 广州艾若博机器人科技有限公司 Robot control method, device and robot based on gesture identification
CN110069133A (en) * 2019-03-29 2019-07-30 湖北民族大学 Demo system control method and control system based on gesture identification
CN112684895A (en) * 2020-12-31 2021-04-20 安徽鸿程光电有限公司 Marking method, device, equipment and computer storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101814242A (en) * 2010-04-13 2010-08-25 天津师范大学 Moving object real-time tracking recording device of classes
CN102520793A (en) * 2011-11-30 2012-06-27 苏州奇可思信息科技有限公司 Gesture identification-based conference presentation interaction method
CN103268153A (en) * 2013-05-31 2013-08-28 南京大学 Human-computer interactive system and man-machine interactive method based on computer vision in demonstration environment
CN103713741A (en) * 2014-01-08 2014-04-09 北京航空航天大学 Method for controlling display wall through gestures on basis of Kinect


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
鲍峰 (Bao Feng): "基于Kinect的人机交互演示系统" [A Kinect-based human-computer interaction presentation system], 《计算机与现代化》 (Computer and Modernization) *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106886284A (en) * 2017-01-20 2017-06-23 西安电子科技大学 A kind of Cultural relics in museum interactive system based on Kinect
CN109313485A (en) * 2017-02-18 2019-02-05 广州艾若博机器人科技有限公司 Robot control method, device and robot based on gesture identification
CN109313485B (en) * 2017-02-18 2019-10-11 广州艾若博机器人科技有限公司 Robot control method, device and robot based on gesture identification
CN107678425A (en) * 2017-08-29 2018-02-09 南京理工大学 A kind of car controller based on Kinect gesture identifications
CN108089715A (en) * 2018-01-19 2018-05-29 赵然 A kind of demonstration auxiliary system based on depth camera
CN110069133A (en) * 2019-03-29 2019-07-30 湖北民族大学 Demo system control method and control system based on gesture identification
CN112684895A (en) * 2020-12-31 2021-04-20 安徽鸿程光电有限公司 Marking method, device, equipment and computer storage medium


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20161116
