CN103809733B - Man-machine interactive system and method - Google Patents

Man-machine interactive system and method

Info

Publication number
CN103809733B
CN103809733B
Authority
CN
China
Prior art keywords
user
man
hand
posture
interactive operation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201210440197.1A
Other languages
Chinese (zh)
Other versions
CN103809733A (en)
Inventor
孙迅
陈茂林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Samsung Telecommunications Technology Research Co Ltd
Samsung Electronics Co Ltd
Original Assignee
Beijing Samsung Telecommunications Technology Research Co Ltd
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Samsung Telecommunications Technology Research Co Ltd, Samsung Electronics Co Ltd filed Critical Beijing Samsung Telecommunications Technology Research Co Ltd
Priority to CN201810619648.5A priority Critical patent/CN108845668B/en
Priority to CN201210440197.1A priority patent/CN103809733B/en
Priority to KR1020130050237A priority patent/KR102110811B1/en
Priority to US14/071,180 priority patent/US9684372B2/en
Publication of CN103809733A publication Critical patent/CN103809733A/en
Application granted granted Critical
Publication of CN103809733B publication Critical patent/CN103809733B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/0304 Detection arrangements using opto-electronic means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A human-computer interaction system and a human-computer interaction method are provided. The system includes: an image acquisition device for obtaining image data; a human-computer interaction processing device that determines the interactive operation the user intends to perform according to multiple types of user motions and postures detected from the image data; and a display device that shows a display screen corresponding to the result of the interactive operation. The invention performs human-computer interaction by combining multiple motion-detection modes, so that, without requiring an additional input device, the ambiguity of interaction recognition is reduced and the accuracy of the interactive operation is improved.

Description

Man-machine interactive system and method
Technical field
The present invention relates to the fields of computer vision and pattern recognition, and more particularly to a non-contact, natural, long-range human-computer interaction (HCI) system and method.
Background technology
Human-computer interaction based on computer vision can obtain user input visually through various imaging and processing methods. It has become a hot topic in next-generation human-computer interaction technology and is widely used, in particular, in entertainment applications. Under this interaction mode, the user can interact with a computer through body posture, head pose, gaze, or body motion, freeing the user from traditional input devices such as keyboards and mice and providing an unprecedented interaction experience.
A variety of computer-vision-based interaction modes have been proposed. In one existing mode, 3D objects can be generated, modified, and manipulated by using touch input together with three-dimensional (3D) gesture input. In another method, the user can interact with a visual user interface through body-posture detection.
However, existing human-computer interaction devices and methods use rather limited types of motion detection; they usually need a touch-based input device and require the user to memorize a large number of prescribed actions to perform interaction. Because of gesture, posture, and depth sensing-range constraints, pre-processing or various manual operations are usually required, for example calibrating sensors or pre-defining an interaction space, which is inconvenient for the user. Therefore, an interaction mode is needed that exploits multiple motion-detection modes and does not depend on an additional input device.
Summary of the invention
According to an aspect of the present invention, a human-computer interaction system is provided, including: an image acquisition device for obtaining image data; a human-computer interaction processing device that determines the interactive operation the user intends to perform according to multiple types of user motions and postures detected from the image data; and a display device that shows a display screen corresponding to the result of the interactive operation.
According to an aspect of the present invention, the human-computer interaction processing device includes: a motion detection module that detects multiple types of user motions and postures from the image data; an interaction determining module that determines, according to the motions and postures detected by the motion detection module, the interactive operation the user intends to perform and issues a corresponding display instruction to a display control module; and a display control module that controls, according to the instruction determined by the interaction determining module, the display device to present the corresponding interactive operation on the display screen.
According to an aspect of the present invention, the motion detection module includes: a gaze capture module for detecting the user's gaze direction from the image data; and a posture tracking module for tracking and recognizing, in the image data, the postures and motions of the parts of the user's body.
According to an aspect of the present invention, the gaze capture module determines the user's gaze direction by detecting the pitch direction and the yaw direction of the user's head from the image data.
According to an aspect of the present invention, the posture tracking module tracks and detects the nodes of the user's hand in the image data to determine the movement and gesture of the hand, and detects the user's body skeleton nodes to determine the posture and motion of each part of the user's body.
According to an aspect of the present invention, the interaction determining module determines whether to start an interactive operation according to the user's gaze direction detected by the gaze capture module and the hand posture of the user recognized by the posture tracking module.
According to an aspect of the present invention, if it is determined that the user's gaze direction and the pointing direction of the user's hand are directed at a display item on the display screen for more than a predetermined time, the interaction determining module determines to start an interactive operation on that display item.
According to an aspect of the present invention, if it is determined that neither the user's gaze direction nor the pointing direction of the user's hand is directed at the display item, the interaction determining module determines to stop the interactive operation on that display item.
According to an aspect of the present invention, when the user is close to the image acquisition device, the posture tracking module tracks and recognizes the user's finger movements to recognize the user's gesture, and when the user is far from the image acquisition device, the posture tracking module tracks and recognizes the motion of the user's arm.
According to an aspect of the present invention, the human-computer interaction processing device further includes: a custom posture registration module for registering interaction commands corresponding to user-defined gesture actions.
According to another aspect of the present invention, a human-computer interaction method is provided, including: obtaining image data; determining, according to multiple types of user motions and postures detected from the image data, the interactive operation the user intends to perform; and displaying a display screen corresponding to the result of the interactive operation.
According to another aspect of the present invention, the step of determining the interactive operation includes: detecting multiple types of user motions and postures from the image data; determining, according to the detected motions and postures, the interactive operation to be performed and issuing a display instruction corresponding to the interactive operation; and controlling, according to the determined instruction, the display device to present the corresponding interactive operation on the display screen.
According to another aspect of the present invention, the step of detecting multiple types of user motions and postures includes: detecting the user's gaze direction from the image data; and tracking and recognizing the postures and motions of the parts of the user's body.
According to another aspect of the present invention, the user's gaze direction is determined by detecting the pitch direction and the yaw direction of the user's head from the image data.
According to another aspect of the present invention, the movement and gesture of the user's hand are determined by tracking and detecting the nodes of the user's hand in the image data, and the posture and motion of each part of the user's body are determined by detecting the user's body skeleton nodes from the image data.
According to another aspect of the present invention, whether to start an interactive operation is determined according to the detected gaze direction of the user and the hand posture of the user recognized by the posture tracking module.
According to another aspect of the present invention, if it is determined that the user's gaze direction and the pointing direction of the user's hand are directed at a display item on the display screen for more than a predetermined time, it is determined to start an interactive operation on that display item.
According to another aspect of the present invention, if it is determined that neither the user's gaze direction nor the pointing direction of the user's hand is directed at the display item, it is determined to stop the interactive operation on that display item.
According to another aspect of the present invention, when the user is close to the image acquisition device, the user's finger movements are tracked and recognized to recognize the user's gesture, and when the user is far from the image acquisition device, the motion of the user's arm is recognized.
According to another aspect of the present invention, the step of determining the interactive operation further includes: determining the interactive operation corresponding to a registered user-defined gesture action.
Description of the drawings
The above and other objects and features of the present invention will become more apparent from the following description of embodiments, taken in conjunction with the accompanying drawings, in which:
Fig. 1 is a schematic diagram showing a human-computer interaction system according to an embodiment of the present invention interacting with a user;
Fig. 2 is a block diagram showing the structure of the human-computer interaction processing device of the human-computer interaction system according to an embodiment of the present invention;
Fig. 3 is a schematic diagram showing postures for starting or stopping an interactive operation according to another embodiment of the present invention;
Fig. 4 is a flowchart showing a human-computer interaction method according to an embodiment of the present invention;
Fig. 5 is a flowchart showing a menu operation performed with the human-computer interaction method according to an embodiment of the present invention;
Fig. 6 is a flowchart showing an interactive operation on a 3D display target performed with the human-computer interaction method according to an embodiment of the present invention;
Fig. 7 is a flowchart showing a handwriting operation performed with the human-computer interaction method according to an embodiment of the present invention.
Detailed description of the embodiments
Embodiments of the present invention will now be described in detail, examples of which are shown in the accompanying drawings, where like reference numerals refer to like components throughout. The embodiments are described below with reference to the drawings in order to explain the present invention.
Fig. 1 is a schematic diagram showing a human-computer interaction system according to an embodiment of the present invention interacting with a user.
As shown in Fig. 1, the human-computer interaction system according to an embodiment of the present invention includes an image acquisition device 100, a human-computer interaction processing device 200, and a display device 300. The image acquisition device 100 obtains image data, which may have both depth and color features. The image acquisition device 100 may be a device capable of capturing depth images, for example a depth camera.
The human-computer interaction processing device 200 analyzes the image data obtained by the image acquisition device 100 in order to recognize and parse the user's postures and motions. The human-computer interaction processing device 200 then controls the display device 300 to perform the corresponding display according to the result of this parsing. The display device 300 may be, for example, a television set (TV) or a projector.
Here, as shown in Fig. 1, the human-computer interaction processing device 200 can determine the interactive operation the user intends to perform according to multiple types of detected user motions and postures. For example, while looking at a particular object (OBJ2) among the objects shown by the display device 300 (for example OBJ1, OBJ2, and OBJ3 in Fig. 1), the user may point at that object with a hand to start an interactive operation. That is, the human-computer interaction processing device 200 can detect the user's gaze direction, gestures, and the motions and postures of body parts. The user can also operate on a particular displayed object by moving a finger, for example to change the object's display position. The user may likewise move a body part (for example, an arm) or the whole body as input to the interactive operation. It should be understood that, although the image acquisition device 100, the human-computer interaction processing device 200, and the display device 300 are shown as separate devices, they can be combined arbitrarily into one or two devices; for example, the image acquisition device 100 and the human-computer interaction processing device 200 may be implemented in a single device.
The structure of the human-computer interaction processing device 200 of the human-computer interaction system according to an embodiment of the present invention is described in detail below with reference to Fig. 2.
As shown in Fig. 2, the human-computer interaction processing device 200 according to an embodiment of the present invention includes a motion detection module 210, an interaction determining module 220, and a display control module 230.
The motion detection module 210 detects multiple types of user motions and determines the user's postures. For example, the motion detection module 210 can detect and determine the user's gaze direction, the movement of body parts, gesture motions, and body posture motions. The interaction determining module 220 can determine the interactive operation to be performed according to the multiple types of motions and postures detected by the motion detection module 210. The operation of the motion detection module 210 is described in detail below.
According to an embodiment of the present invention, the motion detection module 210 may include a gaze capture module 211 and a posture tracking module 213.
The gaze capture module 211 obtains the user's gaze direction from the image data. The gaze direction can be obtained by detecting the user's head pose from the image data. The head pose is mainly characterized by head pitch and head yaw. Accordingly, the pitch angle and yaw angle of the head can be estimated from the head region in the depth image, and the corresponding head pose can be synthesized from these angles to obtain the user's gaze direction.
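As an illustrative sketch only (not part of the original disclosure), the estimated head pitch and yaw angles can be combined into a 3D gaze direction vector; the coordinate convention and function name below are assumptions:

```python
import numpy as np

def gaze_direction(pitch_deg: float, yaw_deg: float) -> np.ndarray:
    """Convert estimated head pitch/yaw (degrees) into a unit gaze vector.

    Assumed convention: the camera looks along +Z, pitch rotates about X,
    yaw rotates about Y, and pitch = yaw = 0 means looking straight ahead.
    """
    pitch, yaw = np.radians(pitch_deg), np.radians(yaw_deg)
    direction = np.array([
        np.cos(pitch) * np.sin(yaw),   # left/right component
        -np.sin(pitch),                # up/down component
        np.cos(pitch) * np.cos(yaw),   # toward the display
    ])
    return direction / np.linalg.norm(direction)
```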
The posture tracking module 213 tracks and recognizes the postures and motions of the parts of the user's body. For example, the posture tracking module 213 can track and recognize the user's pointing direction and finger motions from the acquired image data, and can track the movement trajectory and speed of the hand. In addition, the posture tracking module 213 can track and recognize the motions of other body parts (for example, an arm). Preferably, when the user is close to the image acquisition device 100, the posture tracking module 213 can track the nodes of the user's hand in the dense, reliable image data, thereby determining the pointing direction and motion (that is, the gesture) of the user's fingers. When the user is far from the image acquisition device 100, the acquired image data is coarse and noisy and the hand region is small, so the posture tracking module 213 can instead track the user's arm (that is, the bone between the wrist node and the elbow node) by tracking the human skeleton nodes, thereby tracking and recognizing the pointing direction and motion of the user's arm.
To this end, according to an embodiment of the present invention, the posture tracking module 213 can recognize and track the movement of the user's hand based on skin color features and/or 3D features. Specifically, the posture tracking module 213 may include a classifier trained on skin color or 3D features. When a skin-color classifier is used, a probabilistic model (for example, a Gaussian mixture model (GMM)) models the color distribution of hand skin in order to decide whether a candidate pixel belongs to the hand. For depth features, depth comparison features can be generated in the manner introduced in "Real-Time Human Pose Recognition in Parts from Single Depth Images", Jamie Shotton et al., CVPR 2011, or local depth patches (small rectangular blocks) can be compared with patches of a known hand model to measure similarity. Then, combining the different color and depth features, a general-purpose classifier (such as a Random Forest or AdaBoost decision trees) can perform the classification task of locating the hand in the image data. By detecting the hand frame by frame, the posture tracking module 213 can track and compute the movement trajectory/speed of the hand, thereby localizing the hand both in the 2D image and in the 3D spatial domain. In particular, by comparing the depth data with a 3D hand model, the positions of the hand joints can be tracked. However, if the hand is far from the image acquisition device 100 and the hand region in the image data is smaller than a predetermined threshold, then, considering data reliability, the movement of the arm can be determined by tracking the user's body skeleton.
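A minimal sketch of the skin-color part of this pipeline, assuming scikit-learn is available; the function names and the log-likelihood threshold are illustrative assumptions, not values from the patent:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def train_skin_model(skin_pixels: np.ndarray, n_components: int = 3) -> GaussianMixture:
    """Fit a GMM to hand-skin color samples (an N x 3 array, e.g. in YCrCb space)."""
    gmm = GaussianMixture(n_components=n_components, covariance_type="full")
    gmm.fit(skin_pixels)
    return gmm

def hand_mask(image: np.ndarray, gmm: GaussianMixture, log_lik_threshold: float = -8.0) -> np.ndarray:
    """Return a boolean mask of pixels whose color is likely hand skin."""
    scores = gmm.score_samples(image.reshape(-1, image.shape[-1]))
    return (scores > log_lik_threshold).reshape(image.shape[:-1])
```

In a full system this color mask would be combined with the depth comparison features and a Random Forest or boosted classifier, as described above.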
The interaction determining module 220 can determine the interactive operation to be performed according to the various user motions detected by the motion detection module 210. For example, the interaction determining module 220 can determine whether to enter the interactive-operation state according to the user's gaze direction determined by the gaze capture module 211 and the user's pointing direction determined by the posture tracking module 213, and can determine the interactive operation to be executed according to the user's subsequent posture motions and gaze direction. That is, the interaction determining module 220 can determine the start or end of an interactive operation according to the user's gaze direction and pointing direction. Specifically, when the gaze direction determined by the gaze capture module 211 and the pointing direction determined by the posture tracking module 213 are directed at a target displayed by the display device 300 (that is, the gaze direction and the finger's pointing direction intersect at a specific display target) for more than a predetermined time, the interaction determining module 220 can determine that the user wants to start interacting with that display target. While the display target is being operated on, the interaction determining module 220 determines whether at least one of the user's gaze and pointing direction remains on the display target. When neither the gaze nor the pointing direction remains on the target, the interaction determining module 220 can determine that the user has stopped the interactive operation on that display target. In this way, whether to start or end an interactive operation can be determined more accurately, improving the accuracy of the interaction.
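The start/keep/stop rule described in this paragraph can be sketched as a small state machine; the class name, dwell time, and item representation below are assumptions for illustration only:

```python
import time

class InteractionDeterminer:
    """Sketch of the gaze + pointing dwell logic described above."""

    def __init__(self, dwell_seconds: float = 1.0):
        self.dwell_seconds = dwell_seconds
        self.candidate = None       # item currently gazed at and pointed at
        self.candidate_since = None
        self.active = None          # item currently being interacted with

    def update(self, gazed_item, pointed_item):
        """gazed_item / pointed_item: the display item each ray hits, or None."""
        if self.active is not None:
            # Keep the interaction while gaze OR pointing stays on the item.
            if self.active not in (gazed_item, pointed_item):
                self.active = None                      # stop the interactive operation
            return self.active

        # Not interacting yet: require gaze AND pointing on the same item
        # for longer than the predetermined dwell time.
        if gazed_item is not None and gazed_item == pointed_item:
            if self.candidate != gazed_item:
                self.candidate, self.candidate_since = gazed_item, time.time()
            elif time.time() - self.candidate_since >= self.dwell_seconds:
                self.active = gazed_item                # start the interactive operation
        else:
            self.candidate = self.candidate_since = None
        return self.active
```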
It should be understood that the above is only one example of determining whether to start or end the interactive-operation state according to the user's detected motions and postures. Whether to start or end the interactive-operation state can also be determined in other preset ways; for example, the interactive-operation state may be entered according to the user's gaze direction and a predetermined gesture. As shown in Fig. 3, when the motion detection module 210 determines from the image data that the user's fingers are open and the gaze direction points at a particular item on the screen of the display device 300, the interaction determining module 220 can determine that the user wants to interact with that item. Next, when the motion detection module 210 determines that the user's fingers close together and the hand starts to move, the interaction determining module 220 can determine that the user wants to drag the item. If the motion detection module 210 determines that the user's hand closes into a fist, the interaction determining module 220 can determine that the user wants to stop the interactive operation.
After entering the interactive-operation state, the interaction determining module 220 further determines the interactive operation the user intends to perform according to the user's motions and postures. According to an embodiment of the present invention, the interaction determining module 220 can determine a pointer-moving operation according to the pointing direction of the user's hand. From the pointing direction of the user's hand determined by the posture tracking module 213, the interaction determining module 220 can compute the intersection of that pointing direction with the display screen to obtain the position of the pointer on the screen. When the user's hand moves, the interaction determining module 220 issues corresponding commands, and the display control module 230 instructs the display device 300 so that the pointer moves on the screen along with the hand.
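Computing the pointer position amounts to intersecting the hand's pointing ray with the plane of the display; a sketch under the assumption that the screen pose is known in the camera coordinate frame:

```python
import numpy as np

def pointer_position(hand_pos, pointing_dir, screen_origin, screen_normal):
    """Intersect the pointing ray with the display plane.

    Returns the 3D intersection point, or None if the ray is parallel to the
    plane or points away from it. Mapping the point to pixel coordinates would
    additionally use the screen's in-plane axes.
    """
    hand_pos = np.asarray(hand_pos, float)
    pointing_dir = np.asarray(pointing_dir, float)
    screen_origin = np.asarray(screen_origin, float)
    screen_normal = np.asarray(screen_normal, float)

    denom = pointing_dir @ screen_normal
    if abs(denom) < 1e-6:
        return None                      # pointing parallel to the screen
    t = ((screen_origin - hand_pos) @ screen_normal) / denom
    if t < 0:
        return None                      # pointing away from the screen
    return hand_pos + t * pointing_dir
```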
According to an embodiment of the present invention, the interaction determining module 220 can also determine a button-press operation according to the hand motion of the user determined by the posture tracking module 213. From the pointing direction of the user's hand determined by the posture tracking module 213, the interaction determining module 220 can compute the intersection of the pointing direction with the display screen; if a display item such as a button exists at that position, the interaction determining module 220 can determine that the user is pressing the button. Alternatively, if the posture tracking module 213 determines that the user's finger/fist moves quickly along its pointing direction, the interaction determining module 220 determines that the button is pressed.
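The "fast move along the pointing direction" criterion for a button press could be checked roughly as follows; the window length and threshold are illustrative assumptions:

```python
import numpy as np

def is_button_push(hand_positions, pointing_dir, window: int = 5, push_threshold: float = 0.05):
    """Detect a push: recent hand displacement mostly along the pointing direction.

    hand_positions: recent 3D hand positions, one per frame.
    push_threshold: minimum displacement (in metres, over the window) along the
    pointing direction to count as a press.
    """
    if len(hand_positions) < window:
        return False
    recent = np.asarray(hand_positions[-window:], float)
    unit_dir = np.asarray(pointing_dir, float)
    unit_dir = unit_dir / np.linalg.norm(unit_dir)
    displacement = recent[-1] - recent[0]
    return (displacement @ unit_dir) > push_threshold
```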
It should be understood that only a few examples have been given here of how the interaction determining module 220 determines the interactive operation the user intends to perform from the gaze direction determined by the gaze capture module 211 and the user's posture motions determined by the posture tracking module 213. Those skilled in the art will understand that the interactive operations of the present invention are not limited to these. More interactive operations can be performed according to the user's posture motions and/or gaze direction; for example, a display target can be dragged or rotated by moving the hand, or clicked or double-clicked by finger movements.
In addition, according to an embodiment of the present invention, the user can also customize interactive operations corresponding to specific actions. To this end, the human-computer interaction processing device 200 may further include a custom posture registration module (not shown) for registering interaction commands corresponding to user-defined gesture actions. The custom posture registration module may maintain a database that maps the recorded postures and actions to the corresponding interaction commands. For example, when 2D or 3D targets are displayed, a 2D or 3D display target can be zoomed in or out by tracking the motion directions of the two hands. In particular, in order to register a new gesture action, the custom posture registration module tests the reproducibility and ambiguity of the user-defined gesture action and returns a reliability score indicating whether the user-defined interaction command is valid.
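A sketch of what such a registration database could look like; the recognizer interface, score combination, and threshold are assumptions rather than the patent's method:

```python
class CustomGestureRegistry:
    """Maps registered gesture labels to interaction commands.

    `recognizer` is an assumed helper exposing self_similarity(samples) for
    reproducibility and similarity(samples, label) for confusion with an
    already-registered gesture, both returning scores in [0, 1].
    """

    def __init__(self, recognizer, min_reliability: float = 0.8):
        self.recognizer = recognizer
        self.min_reliability = min_reliability
        self.commands = {}   # gesture label -> interaction command

    def register(self, label: str, samples, command) -> float:
        reproducibility = self.recognizer.self_similarity(samples)
        ambiguity = max((self.recognizer.similarity(samples, existing)
                         for existing in self.commands), default=0.0)
        reliability = reproducibility * (1.0 - ambiguity)
        if reliability >= self.min_reliability:
            self.commands[label] = command
        return reliability   # reported back to the user as the reliability score
```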
After the interaction determining module 220 determines the interactive operation the user intends to perform, it issues corresponding instructions to the display control module 230, and the display control module 230 controls the display device 300, according to those instructions, to present the corresponding interactive operation on the display screen, for example moving the displayed pointer, moving the corresponding display items, or showing the screen after a button is pressed.
The detailed flow of the human-computer interaction method according to an embodiment of the present invention is described below with reference to Fig. 4.
As shown in Fig. 4, in step S410, image data is first obtained by the image acquisition device 100.
Next, in step S420, the human-computer interaction processing device 200 analyzes the multiple types of user postures and motions in the image data obtained by the image acquisition device 100 to determine whether to enter the interactive-operation state and which interactive operation the user intends to perform. Here, for example, the human-computer interaction processing device 200 can detect and recognize the user's gaze direction and the motions and postures of the parts of the body from the image data in order to determine the interactive operation the user intends to perform. According to this embodiment, the human-computer interaction processing device 200 can determine whether to enter the interactive-operation state according to the detected gaze direction and the user's pointing direction. Specifically, when the human-computer interaction processing device 200 determines from the image data that the user's gaze direction and the pointing direction of the hand are directed at a display item shown on the screen of the display device 300 for more than a predetermined time, it enters the interactive-operation state and determines the interactive operation to be executed on the display target according to the user's subsequent posture motions.
Then, in step S430, the display device 300 is controlled according to the determined interactive operation to show the corresponding display screen or to update the display screen. For example, according to the pointing direction of the user's hand, it can be determined that the user wants to move the displayed pointer, drag a display item, click a display item, double-click a display item, and so on.
In step S420, if, while an interactive operation is being executed, the human-computer interaction processing device 200 determines that the user's pointing direction and gaze direction have left the display target, it determines that the user wants to stop the interactive operation on that display target, and a display screen is shown in which the operation on the display target has stopped. It should be noted that whether the user wants to stop the interactive operation can also be determined in another way, for example according to a specific gesture of the user (such as clenching a fist, as described above).
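Steps S410-S430 can be summarized as a simple processing loop; the device objects and their methods below are placeholders, not APIs defined by the patent:

```python
def interaction_loop(camera, processor, display):
    """Illustrative main loop for steps S410-S430."""
    while True:
        frame = camera.capture()                                # S410: obtain image data
        gaze = processor.detect_gaze(frame)                     # gaze capture (module 211)
        hand = processor.track_hand(frame)                      # posture tracking (module 213)
        operation = processor.determine_interaction(gaze, hand) # S420: decide the operation
        if operation is not None:
            display.apply(operation)                            # S430: show/update the screen
```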
Exemplary flows of performing various interactive operations with the human-computer interaction method according to the present invention are described below with reference to Figs. 5-7.
Fig. 5 shows a flowchart of a menu operation performed with the human-computer interaction method according to an embodiment of the present invention.
In the embodiment of Fig. 5, it is assumed that a preset menu is displayed on the screen of the display device 300 and that the menu contains several items for the user to interact with.
In step S510, when the human body posture detected from the captured image data shows that the pointing direction of the user's hand and the gaze direction are directed at a specific menu item on the display screen, it is determined that the interactive-operation state for the menu has been entered.
Next, in step S520, the movement trajectory and speed of the user's hand are tracked to determine the hand's motion and gesture, and the interactive operation the user wishes to perform is determined from that motion and gesture. For example, mouse interaction can be simulated from the motion of the user's hand: when the user's index finger makes a clicking motion, the menu item in the finger's pointing direction can be selected; when the user's middle finger makes a clicking motion, content corresponding to a right-button mouse click can be shown, for example an additional menu of options related to the item. Then, in step S530, the display device is controlled to show or update the menu content corresponding to the determined interactive operation.
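The mouse-style mapping described for step S520 could look like the following sketch; the finger labels and action names are assumptions:

```python
def menu_action(clicking_finger, pointed_item):
    """Map a detected finger click to a mouse-like menu action."""
    if pointed_item is None:
        return None
    if clicking_finger == "index":
        return ("select", pointed_item)        # like a left click: choose the item
    if clicking_finger == "middle":
        return ("context_menu", pointed_item)  # like a right click: show related options
    return None
```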
Fig. 6 is a flowchart of an operation on a 3D display target performed with the human-computer interaction method according to an embodiment of the present invention. Here, the display device 300 is a display device capable of showing 3D content.
First, in step S610, when the human body posture detected from the captured image data shows that the pointing direction of the user's hand and the gaze direction are directed at a specific 3D display target on the display screen, it is determined that the interactive-operation state for that 3D display target has been entered. Next, in step S620, the movement trajectory and speed of the user's hand are tracked to determine the hand's motion and gesture, and the interactive operation the user wishes to perform is determined from that motion and gesture. For example, the 3D display target at which the pointing direction of the hand and the gaze direction intersect can be picked up, and the 3D display target can be moved according to the movement of the hand. In addition, the selected 3D display target can be dragged, zoomed in, or zoomed out according to the motion of the hand. Finally, in step S630, the display device is controlled to re-render the 3D display target after the interactive operation according to the determined interactive operation.
Fig. 7 is a flowchart of a text input operation performed with the human-computer interaction method according to an embodiment of the present invention. Here, it is assumed that a predetermined area of the display screen shown by the display device 300 can be used as a text input area.
First, in step S710, when the human body posture detected from the captured image data shows that the pointing direction of the user's hand and the gaze direction are directed at the handwriting input area on the display screen, it is determined that the interactive-operation state for handwriting input has been entered. Next, in step S720, the movement trajectory and speed of the user's hand are tracked, and the text the user wants to input is determined from the trajectory of the hand. The text can be determined with a learning-based recognition method, and the text is interpreted as a corresponding interaction command. Finally, in step S730, the display device is controlled to show the display screen with the result after the interaction command is executed.
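For step S720, the tracked pointer trajectory would be normalized and handed to a learning-based recognizer; a sketch in which the recognizer object and its predict method are assumed:

```python
def recognize_handwriting(trajectory, recognizer):
    """Turn a tracked 2D hand trajectory into text.

    trajectory: list of (x, y) pointer positions sampled while the user writes
    in the input area. `recognizer` is an assumed learning-based model with a
    predict(points) method returning the recognized text.
    """
    if len(trajectory) < 2:
        return ""
    xs = [p[0] for p in trajectory]
    ys = [p[1] for p in trajectory]
    width = (max(xs) - min(xs)) or 1.0
    height = (max(ys) - min(ys)) or 1.0
    # Normalize so recognition does not depend on where or how large the user wrote.
    normalized = [((x - min(xs)) / width, (y - min(ys)) / height) for x, y in trajectory]
    return recognizer.predict(normalized)
```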
It should be understood that, although the above examples determine whether to start or end an interactive operation, and the user's subsequent interactive operation, from the gaze direction and the pointing direction of the hand, the invention is not limited to this. Whether to start or end an interactive operation, and the subsequent interactive operation, can also be determined from combinations of other types of motion detection.
According to the present invention, human-computer interaction can be performed by combining multiple motion-detection modes, so that, without requiring an additional input device (for example, a touch-screen input device), the ambiguity of interaction recognition is reduced and the accuracy of the interactive operation is improved. For example, zoom-in and zoom-out operations on a display target can be realized without using a touch-screen input device. In this way, the motion-detection capabilities of computer vision technology are fully exploited, giving the user a better interaction experience.
Although the present invention has been shown and described with reference to several exemplary embodiments thereof, those skilled in the art will understand that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the claims and their equivalents.

Claims (14)

1. A human-computer interaction system, comprising:
an image acquisition device for obtaining image data;
a gaze capture module that determines the user's gaze direction by detecting the pitch direction and the yaw direction of the user's head from the image data;
a posture tracking module for tracking and recognizing, in the image data, the pointing direction of the user's hand; and
an interaction determining module that determines the start of an interactive operation based on the user's gaze direction and the pointing direction of the user's hand being directed at a display item;
wherein the interaction determining module:
maintains the interactive operation in response to determining that at least one of the user's gaze direction and the pointing direction of the user's hand still points at the display item, and
stops the interactive operation in response to determining that neither the user's gaze direction nor the pointing direction of the user's hand points at the display item.
2. The human-computer interaction system of claim 1, wherein the posture tracking module is further configured to track and recognize, in the image data, the postures and motions of the parts of the user's body.
3. The human-computer interaction system of claim 2, wherein the posture tracking module tracks and detects the nodes of the user's hand in the image data to determine the movement and gesture of the user's hand, and detects the user's body skeleton nodes to determine the posture and motion of each part of the user's body.
4. The human-computer interaction system of claim 3, wherein the interaction determining module further determines the start of the interactive operation according to the postures and motions of the parts of the user's body.
5. The human-computer interaction system of claim 4, wherein the interaction determining module determines whether to start the interactive operation according to the user's gaze direction detected by the gaze capture module and the motion of the user's hand recognized by the posture tracking module.
6. The human-computer interaction system of claim 2, wherein, when the user is close to the image acquisition device, the posture tracking module tracks and recognizes the user's finger movements to recognize the user's gesture, and when the user is far from the image acquisition device, the posture tracking module tracks and recognizes the motion of the user's arm.
7. The human-computer interaction system of claim 1, further comprising:
a display device that shows a display screen corresponding to the result of the interactive operation,
wherein the interactive operation is started if it is determined that the user's gaze direction and the pointing direction of the user's hand are directed at a display item on the display screen for more than a predetermined time.
8. The human-computer interaction system of claim 1, further comprising:
a custom posture registration module for registering interaction commands corresponding to user-defined gesture actions.
9. A human-computer interaction method, comprising:
obtaining image data;
determining the user's gaze direction by detecting the pitch direction and the yaw direction of the user's head from the image data;
tracking and recognizing, in the image data, the pointing direction of the user's hand;
determining the start of an interactive operation based on the user's gaze direction and the pointing direction of the user's hand being directed at a display item;
wherein the interactive operation is maintained in response to determining that at least one of the user's gaze direction and the pointing direction of the user's hand still points at the display item, and the interactive operation is stopped in response to determining that neither the user's gaze direction nor the pointing direction of the user's hand points at the display item.
10. The human-computer interaction method of claim 9, wherein the movement and gesture of the user's hand are determined by tracking and detecting the nodes of the user's hand in the image data, and the posture and motion of each part of the user's body are determined by detecting the user's body skeleton nodes from the image data.
11. The human-computer interaction method of claim 9, wherein whether to start the interactive operation is determined according to the detected gaze direction of the user and the motion of the user's hand recognized by a posture tracking module.
12. The human-computer interaction method of claim 9, wherein the interactive operation is started if it is determined that the user's gaze direction and the pointing direction of the user's hand are directed at a display item on the display screen for more than a predetermined time.
13. The human-computer interaction method of claim 9, wherein, when the user is close to the image acquisition device, the user's finger movements are tracked and recognized to recognize the user's gesture, and when the user is far from the image acquisition device, the motion of the user's arm is recognized.
14. The human-computer interaction method of claim 9, wherein determining the interactive operation further comprises: determining the interactive operation corresponding to a registered user-defined gesture action.
CN201210440197.1A 2012-11-07 2012-11-07 Man-machine interactive system and method Active CN103809733B (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN201810619648.5A CN108845668B (en) 2012-11-07 2012-11-07 Man-machine interaction system and method
CN201210440197.1A CN103809733B (en) 2012-11-07 2012-11-07 Man-machine interactive system and method
KR1020130050237A KR102110811B1 (en) 2012-11-07 2013-05-03 System and method for human computer interaction
US14/071,180 US9684372B2 (en) 2012-11-07 2013-11-04 System and method for human computer interaction

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201210440197.1A CN103809733B (en) 2012-11-07 2012-11-07 Man-machine interactive system and method

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN201810619648.5A Division CN108845668B (en) 2012-11-07 2012-11-07 Man-machine interaction system and method

Publications (2)

Publication Number Publication Date
CN103809733A CN103809733A (en) 2014-05-21
CN103809733B true CN103809733B (en) 2018-07-20

Family

ID=50706630

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201210440197.1A Active CN103809733B (en) 2012-11-07 2012-11-07 Man-machine interactive system and method
CN201810619648.5A Active CN108845668B (en) 2012-11-07 2012-11-07 Man-machine interaction system and method

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201810619648.5A Active CN108845668B (en) 2012-11-07 2012-11-07 Man-machine interaction system and method

Country Status (2)

Country Link
KR (1) KR102110811B1 (en)
CN (2) CN103809733B (en)

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20030093065A (en) * 2002-05-31 2003-12-06 주식회사 유니온금속 Heat Exchanger using Fin Plate having plural burring tubes and Method for manufacturing the same
CN104391578B (en) * 2014-12-05 2018-08-17 重庆蓝岸通讯技术有限公司 A kind of real-time gesture control method of 3-dimensional image
CN104740869B (en) * 2015-03-26 2018-04-03 北京小小牛创意科技有限公司 The exchange method and system that a kind of actual situation for merging true environment combines
CN105005779A (en) * 2015-08-25 2015-10-28 湖北文理学院 Face verification anti-counterfeit recognition method and system thereof based on interactive action
CN105740948B (en) * 2016-02-04 2019-05-21 北京光年无限科技有限公司 A kind of exchange method and device towards intelligent robot
CN105759973A (en) * 2016-03-09 2016-07-13 电子科技大学 Far-near distance man-machine interactive system based on 3D sight estimation and far-near distance man-machine interactive method based on 3D sight estimation
CN107743257B (en) * 2017-02-22 2018-09-28 合肥龙图腾信息技术有限公司 Human posture's identification device
CN109426498B (en) * 2017-08-24 2023-11-17 北京迪文科技有限公司 Background development method and device for man-machine interaction system
CN107678545A (en) * 2017-09-26 2018-02-09 深圳市维冠视界科技股份有限公司 A kind of information interactive terminal and method
CN107944376A (en) * 2017-11-20 2018-04-20 北京奇虎科技有限公司 The recognition methods of video data real-time attitude and device, computing device
CN107895161B (en) * 2017-12-22 2020-12-11 北京奇虎科技有限公司 Real-time attitude identification method and device based on video data and computing equipment
JP7091983B2 (en) * 2018-10-01 2022-06-28 トヨタ自動車株式会社 Equipment control device
CN110442243A (en) * 2019-08-14 2019-11-12 深圳市智微智能软件开发有限公司 A kind of man-machine interaction method and system
CN111527468A (en) * 2019-11-18 2020-08-11 华为技术有限公司 Air-to-air interaction method, device and equipment
KR102375947B1 (en) * 2020-03-19 2022-03-18 주식회사 메이아이 Method, system and non-transitory computer-readable recording medium for estimatiing interaction information between person and product based on image
CN112051746B (en) * 2020-08-05 2023-02-07 华为技术有限公司 Method and device for acquiring service
CN112099623A (en) * 2020-08-20 2020-12-18 昆山火灵网络科技有限公司 Man-machine interaction system and method
KR102524016B1 (en) * 2020-08-21 2023-04-21 김덕규 System for interaction with object of content
US11693482B2 (en) * 2021-05-28 2023-07-04 Huawei Technologies Co., Ltd. Systems and methods for controlling virtual widgets in a gesture-controlled device
CN113849065A (en) * 2021-09-17 2021-12-28 支付宝(杭州)信息技术有限公司 Method and device for triggering client operation instruction by using body-building action
US20230168736A1 (en) * 2021-11-29 2023-06-01 Sony Interactive Entertainment LLC Input prediction for pre-loading of rendering data

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1694045A (en) * 2005-06-02 2005-11-09 北京中星微电子有限公司 Non-contact type visual control operation system and method
CN102270035A (en) * 2010-06-04 2011-12-07 三星电子株式会社 Apparatus and method for selecting and operating object in non-touch mode
CN102749990A (en) * 2011-04-08 2012-10-24 索尼电脑娱乐公司 Systems and methods for providing feedback by tracking user gaze and gestures

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002259989A (en) * 2001-03-02 2002-09-13 Gifu Prefecture Pointing gesture detecting method and its device
KR100520050B1 (en) * 2003-05-12 2005-10-11 한국과학기술원 Head mounted computer interfacing device and method using eye-gaze direction
JP5207513B2 (en) * 2007-08-02 2013-06-12 公立大学法人首都大学東京 Control device operation gesture recognition device, control device operation gesture recognition system, and control device operation gesture recognition program
CN101344816B (en) * 2008-08-15 2010-08-11 华南理工大学 Human-machine interaction method and device based on sight tracing and gesture discriminating
JP5483899B2 (en) * 2009-02-19 2014-05-07 株式会社ソニー・コンピュータエンタテインメント Information processing apparatus and information processing method
KR101596890B1 (en) * 2009-07-29 2016-03-07 삼성전자주식회사 Apparatus and method for navigation digital object using gaze information of user
US8418237B2 (en) * 2009-10-20 2013-04-09 Microsoft Corporation Resource access based on multiple credentials
US9244533B2 (en) * 2009-12-17 2016-01-26 Microsoft Technology Licensing, Llc Camera navigation for presentations
US8659658B2 (en) * 2010-02-09 2014-02-25 Microsoft Corporation Physical interaction zone for gesture-based user interfaces
JP2012098771A (en) * 2010-10-29 2012-05-24 Sony Corp Image forming apparatus and image forming method, and program
US20130154913A1 (en) * 2010-12-16 2013-06-20 Siemens Corporation Systems and methods for a gaze and gesture interface
CN106125921B (en) * 2011-02-09 2019-01-15 苹果公司 Gaze detection in 3D map environment
WO2012144666A1 (en) * 2011-04-19 2012-10-26 Lg Electronics Inc. Display device and control method therof
CN202142050U (en) * 2011-06-29 2012-02-08 由田新技股份有限公司 Interactive customer reception system
US9201500B2 (en) * 2012-09-28 2015-12-01 Intel Corporation Multi-modal touch screen emulator

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1694045A (en) * 2005-06-02 2005-11-09 北京中星微电子有限公司 Non-contact type visual control operation system and method
CN102270035A (en) * 2010-06-04 2011-12-07 三星电子株式会社 Apparatus and method for selecting and operating object in non-touch mode
CN102749990A (en) * 2011-04-08 2012-10-24 索尼电脑娱乐公司 Systems and methods for providing feedback by tracking user gaze and gestures

Also Published As

Publication number Publication date
CN108845668B (en) 2022-06-03
CN103809733A (en) 2014-05-21
KR20140059109A (en) 2014-05-15
KR102110811B1 (en) 2020-05-15
CN108845668A (en) 2018-11-20

Similar Documents

Publication Publication Date Title
CN103809733B (en) Man-machine interactive system and method
US11567578B2 (en) Systems and methods of free-space gestural interaction
US20210181857A1 (en) Systems and Methods of Creating a Realistic Displacement of a Virtual Object in Virtual Reality/Augmented Reality Environments
US11720181B2 (en) Cursor mode switching
US20170024017A1 (en) Gesture processing
US8166421B2 (en) Three-dimensional user interface
US9684372B2 (en) System and method for human computer interaction
EP2049976B1 (en) Virtual controller for visual displays
JP4323180B2 (en) Interface method, apparatus, and program using self-image display
US20120202569A1 (en) Three-Dimensional User Interface for Game Applications
US20120204133A1 (en) Gesture-Based User Interface
US9063573B2 (en) Method and system for touch-free control of devices
CN105980965A (en) Systems, devices, and methods for touch-free typing
US20140139429A1 (en) System and method for computer vision based hand gesture identification
KR20130001176A (en) System and method for close-range movement tracking
US20130120250A1 (en) Gesture recognition system and method
CN109753154A (en) There are the gestural control method and device of screen equipment
Choondal et al. Design and implementation of a natural user interface using hand gesture recognition method
Caputo Gestural interaction in virtual environments: User studies and applications
Feng et al. FM: Flexible mapping from one gesture to multiple semantics
Hartmann et al. A virtual touchscreen with depth recognition
Yoon et al. Vision-Based bare-hand gesture interface for interactive augmented reality applications
US20230031200A1 (en) Touchless, Gesture-Based Human Interface Device
Susantok et al. Android-based Touch Screen Projector Design Using a 3D Camera
Cong Research on fingertip positioning and human-computer interaction technology based on stereo vision

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant