CN101515198A - Human-computer interaction method for grapping and throwing dummy object and system thereof - Google Patents


Info

Publication number
CN101515198A
CN101515198A · CNA2009100473593A · CN200910047359A
Authority
CN
China
Prior art keywords
throwing
dummy object
data
hand
magnetic force
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CNA2009100473593A
Other languages
Chinese (zh)
Inventor
陈一民
姚争为
陈明
邹一波
陆意骏
黄诗华
陈伟
李启明
谭志鹏
刘燕
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Shanghai for Science and Technology
Original Assignee
University of Shanghai for Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Shanghai for Science and Technology filed Critical University of Shanghai for Science and Technology
Priority to CNA2009100473593A priority Critical patent/CN101515198A/en
Publication of CN101515198A publication Critical patent/CN101515198A/en
Pending legal-status Critical Current


Abstract

The invention relates to a human-computer interaction method and system for grasping and throwing a virtual object. The method comprises the steps of: (1) acquiring head and hand data; (2) grasp recognition based on visual perception; (3) throw recognition based on magnetic tracking; (4) real-time rotation of the projectile. The system comprises a grasp recognition subsystem based on visual perception, a throw recognition subsystem based on magnetic tracking, and a trajectory display subsystem with real-time rotation. The invention combines grasping and throwing into one continuous movement used as an interaction means, and applies augmented reality technology so that the virtual object serves as the target to be grasped and thrown. To realize these functions the user needs to wear only an optical see-through helmet and a magnetic tracker (no camera is used), and can therefore directly perceive the impressive effect of the scene. The invention has the advantages of natural operation, strong real-time performance, and economy and practicality.

Description

Human-computer interaction method and system for grasping and throwing a virtual object
Technical field
The present invention relates to an interaction technique, and in particular to a human-computer interaction method and system for grasping and throwing a virtual object.
Background technology
Augmented reality is also referred to as mixed reality. It uses computer graphics and visualization techniques to generate virtual objects that do not exist in the real environment, "places" those virtual objects accurately in the real environment by means of sensing technology, and merges the virtual objects with the real environment through a display device. Human-computer interaction technology refers to the technology of realizing a dialogue between humans and computers in an effective manner through computer input and output devices. Its emergence and growth are determined by the development of computing itself: from scientific computing, to ubiquitous computing, to human-machine fusion, each step improving the efficiency of interaction. The ideal interaction mode realizes "natural interaction" and "user freedom".
Interaction techniques based on augmented reality apply AR technology to human-computer interaction: virtual objects replace some real objects, which lightens the burden on both the user and the system, and allows the user to interact in real time and in three dimensions, in a more natural way, with the real and virtual objects in the environment. At present there are few applications combining augmented reality technology with interaction techniques, and even fewer of these incorporate hand activity. For example, Chinese patent "Augmented reality natural interactive helmet with eye tracking function", application No. 200710022978.8, uses only eye activity for interaction; Chinese patent "Augmented reality flight simulator based on a plurality of fixed cameras", application No. 200710303603.9, interacts through the head movement of the user. Motivated by the diversity of human-computer interaction and the authenticity of the experience, we propose an AR-based interaction mode that combines grasping with throwing, and introduce 3D sound to supplement the visual experience.
There are also applications that take throwing alone as the main interaction means. For example, Chinese patent "Image game system for throwing an object", application No. 97111486.2: although this system uses throwing as its main interaction means, the throwing is not realistic enough, since the user throws by using a controller to steer a virtual human on a monitor. Another example is Chinese patent "Indoor integrated motion simulation system based on virtual reality technology", application No. 03153203.9: the motion process is realistic, but the thrown objects are all real objects, which makes the system inconvenient to use and raises its hardware cost. When simulating golf, for instance, a real club and ball must be provided, and a net is needed to keep the ball from flying away. Moreover, the thrown objects of that system are limited to balls, and the real-time rotation of the projectile in flight is ignored.
In addition, although there has been considerable research in recent years on grasping virtual objects with a dexterous hand, it is mainly applied in the field of virtual assembly, and mostly adopts a video see-through helmet, which eases image capture and the overlay of virtual objects. However, the image played back on a display device is not as natural and realistic as what the naked eye sees. Even where an optical see-through helmet is adopted, one or more cameras must be placed at specific locations to resolve occlusion and realize grasping. This greatly increases the expense and complexity of the system, and the interaction effect is easily affected by lighting.
Following the principles of "natural interaction" and "user freedom", we adopt an optical see-through helmet carrying only a single magnetic receiver, which is very light. The thrown object is generated by computer rendering, presented through the see-through lens, and superimposed on the real environment, so the user can grasp and throw freely. The objects are not limited to spheres; besides embodying parameters such as throwing angle, speed and acceleration, the system can display the real-time rotation of the object throughout the flight, and can trigger corresponding events according to the landing point of the virtual object.
Summary of the invention
In view of the problems and deficiencies of the prior art described above, the object of the present invention is to provide a human-computer interaction method and system for grasping and throwing a virtual object, in which a virtual object substitutes for a real object and grasping and throwing are combined into one continuous action, thereby realizing virtual-real fusion and free interaction.
To achieve the above object, the present invention adopts the following technical concept:
According to the principles of robotics and rigid-body kinematics, and using computer technology, optical technology and artificial-intelligence technology, the motion data of the head and hand acquired in real time are analyzed; augmented reality technology is used to register the virtual and real objects accurately; and the grasping and throwing of the virtual object are judged and responded to in real time.
The present invention is realized by the following technical solution:
A human-computer interaction method for grasping and throwing a virtual object, characterized in that the operating steps are as follows:
1) Acquisition of head and hand data (an illustrative code sketch follows this list), with the following steps:
1. obtain palm-center position data from the hand magnetic receiver;
2. transform the data obtained from the head magnetic receiver by a coordinate transformation to obtain the eye position;
3. obtain the finger flexion from the data glove;
2) Grasp recognition based on visual perception, with the following steps:
1. infer the positions of the five fingertips following robotics principles;
2. detect collisions between the grasping fingers and the virtual object by the ray-intersection principle;
3. design three grasp rules (a grasp-result-state rule, a grasp-executing-state rule and a grasp-ready-state rule) and compute the certainty of the evidence in each rule;
4. fuse the rules to reach the grasp judgement;
3) Throw recognition based on magnetic tracking, with the following steps:
1. apply convolution smoothing to the acquired hand magnetic-tracker data;
2. establish a standard throwing feature set, compute the throwing membership degree of each newly acquired data packet, and recognize the throwing action by threshold judgement;
3. obtain the release point from the characteristics of the throwing curve, and differentiate at this point to obtain the initial elevation and deflection angles of the projectile;
4) Real-time rotation of the projectile, with the following steps:
1. obtain the initial angular velocity from the Euler angles at the release point and at a point before it, and compute the moment of inertia of the thrown virtual object;
2. substitute the air-resistance equation into the moment-of-momentum equation produced by the moment of inertia, integrate this equation to obtain angular velocity as a function of time, and transform again to obtain angle as a function of time, thereby realizing the real-time rotation of the projectile.
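For illustration only, and not part of the claimed method, the following minimal Python sketch shows one way step 1) could be organized. The receiver-reading layer and the head-to-eye offset EYE_OFFSET are hypothetical assumptions introduced here; a real magnetic receiver reports position and Euler angles in the transmitter frame.

    import numpy as np

    # Hypothetical sketch of step 1): turning raw receiver readings into the
    # quantities the method needs. EYE_OFFSET is an assumed fixed offset from
    # the head receiver to the midpoint of the eyes, in the receiver frame.
    EYE_OFFSET = np.array([0.0, 0.08, 0.05])  # metres; illustrative values

    def euler_to_matrix(yaw, pitch, roll):
        """Rotation matrix from Z-Y-X Euler angles (radians)."""
        cz, sz = np.cos(yaw), np.sin(yaw)
        cy, sy = np.cos(pitch), np.sin(pitch)
        cx, sx = np.cos(roll), np.sin(roll)
        Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
        Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
        Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
        return Rz @ Ry @ Rx

    def eye_position(head_pos, head_euler):
        """Step 1)-2: coordinate-transform the head-receiver pose to the eyes."""
        return np.asarray(head_pos) + euler_to_matrix(*head_euler) @ EYE_OFFSET

    # Palm-centre position comes straight from the hand receiver (step 1)-1),
    # and finger flexion comes straight from the data glove (step 1)-3).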
A human-computer interaction system for grasping and throwing a virtual object, used for the human-computer interaction method for grasping and throwing a virtual object according to claim 1, comprising a head tracking subsystem 1., a hand tracking subsystem 2., and a virtual-real fusion display subsystem 3., characterized in that:
1. the head tracking subsystem mainly acquires head motion information, providing accurate data for the visual-perception-based collision detection and for the display of the virtual object, and consists of a magnetic receiver fixed on the optical see-through helmet and connected to a microcomputer;
2. the hand tracking subsystem mainly acquires hand motion information, providing accurate data for computing the certainty of the grasping evidence, and consists of a magnetic receiver fixed on the data glove and connected to a microcomputer;
3. the virtual-real fusion display subsystem mainly realizes the fusion of the virtual scene with the real scene and presents the flight path of the virtual object after it is thrown, and consists of a personal-computer network connecting the above microcomputers to a main control computer, a projector and a PLC lighting control module.
The technical solution of the invention is described in detail below:
The method provided by the invention presents a virtual object in the real environment, lets the real hand grasp the virtual object, links the virtual and the real, throws the virtual object, displays the projectile trajectory, and triggers corresponding events.
Presenting the virtual object in the real environment. Superimposing and fusing the virtual object with real objects is an augmented reality technique, and the accuracy of virtual-real registration directly influences the authenticity of the effect. We use magnetic tracking equipment to track and locate the eyes and the hand, but the tracker has a fixed valid measuring range. Before interacting in this mode we therefore first measure the working region of the hand and head, which is 60 cm × 40 cm × 50 cm, and apply a genetic neural-network method to correct the distortion in this region. A genetic algorithm first rapidly optimizes the weights and thresholds of the neural network, locating a promising search region in the solution space; these then serve as the initial weights and thresholds for the subsequent neural-network search, whose local search ability finds the optimum within this small region. According to the mapping theorem of BP networks, and considering the nature of the problem, we adopt a 3-layer BP network. This method effectively improves the learning efficiency of the network weight coefficients, reaches global convergence quickly, and guarantees the registration accuracy of the AR system and the realism of the scene.
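As a minimal illustrative sketch (not taken from the patent text), the Python code below shows genetic initialization of a small 3-layer BP network of the kind described above. The synthetic distortion model, network size, population size and mutation parameters are all invented for illustration; the real system would fit measured tracker/reference position pairs from the 60 cm × 40 cm × 50 cm region and then refine the result with ordinary back-propagation.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy calibration pairs: distorted tracker readings -> reference positions
    # inside the assumed working volume (distortion model is illustrative).
    X = rng.uniform([-0.3, -0.2, -0.25], [0.3, 0.2, 0.25], size=(200, 3))
    Y = X + 0.02 * np.sin(5 * X)          # stand-in for magnetic-field distortion

    H = 8                                  # hidden units of the 3-layer BP net
    N_W = 3 * H + H + H * 3 + 3            # total weights and biases

    def unpack(w):
        i = 0
        W1 = w[i:i + 3 * H].reshape(3, H); i += 3 * H
        b1 = w[i:i + H];                   i += H
        W2 = w[i:i + H * 3].reshape(H, 3); i += H * 3
        return W1, b1, W2, w[i:]

    def mse(w):
        W1, b1, W2, b2 = unpack(w)
        pred = np.tanh(X @ W1 + b1) @ W2 + b2
        return np.mean((pred - Y) ** 2)

    # Genetic search for good initial weights: elitist selection,
    # uniform crossover, Gaussian mutation.
    pop = rng.normal(0, 0.5, size=(60, N_W))
    for gen in range(100):
        fit = np.array([mse(w) for w in pop])
        elite = pop[np.argsort(fit)[:10]]
        children = []
        while len(children) < len(pop) - len(elite):
            a, b = elite[rng.integers(10, size=2)]
            mask = rng.random(N_W) < 0.5
            children.append(np.where(mask, a, b) + rng.normal(0, 0.05, N_W))
        pop = np.vstack([elite, np.array(children)])

    best = pop[np.argmin([mse(w) for w in pop])]
    # 'best' would then seed ordinary BP gradient descent for local refinement.
    print("GA-initialized calibration MSE:", mse(best))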
The method by which the real hand grasps the virtual object. The user puts on the optical see-through helmet equipped with a tracker, and a data glove, and grasps. Analysis shows that the human grasping action has inherent features. We decompose the whole movement into three stages: ready to grasp, grasping, and grasped, each with its own features. The ready-to-grasp stage is characterized by the hand being essentially stationary (A2) and the translation speed of the hand slowing markedly (A3). The grasping stage is characterized by the finger flexion increasing from small to large (A4) and the palm facing the user (A5). The grasped stage is characterized by the lines of sight through the fingertips intersecting the cola can, i.e. the virtual object of this example (A1).
From these three stages we derive the following three knowledge rules, all of which support the same conclusion set B, where element b1 represents "grasped" and b2 represents "not grasped":
A2 ∧ A3 → B{b1(0.3), b2(0.1)}
A4 ∧ A5 → B{b1(0.5), b2(0.1)}
A1 → B{b1(0.5), b2(0.2)}
We first establish a membership function for each of the features A1, A2, A3, A4 and A5. The certainties of A2, A3 and A5 can be computed directly from the most recent group of magnetic tracking data, and the certainty of A4 directly from the most recent data-glove data. A1 is comparatively complex: the bending angles of the finger joints are first obtained from the output signals of the data glove, and the position of each finger relative to the palm is determined following robotics theory, fixing the coordinates of the fingers in the world coordinate system. The viewpoint is then imagined as a light source and the five straight lines from the viewpoint through the fingertips as rays, so that the collision-detection problem is solved with ray-geometry principles. Collision detection has two levels: the bounding box outside, and the virtual object itself inside. We divide the whole space into three regions (inside the virtual object, between the bounding box and the virtual object, and outside the bounding box) and assign a weight to each region; at the same time, weights are set for each finger according to its role in grasping. Combining these data establishes the membership function of A1, from which the certainty of this feature is computed. Finally, reasoning is carried out with the three knowledge rules, and the individual reasoning results are fused according to evidence theory to reach the grasp judgement.
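The patent does not spell out which evidence-combination rule is used; as one common choice, the sketch below fuses the three rules' certainties with Dempster's rule over the frame {b1 (grasped), b2 (not grasped)}, treating the unassigned remainder of each rule as mass on the whole frame. In the actual method each rule's certainties would first be scaled by the membership degrees of its antecedent features A1..A5; that scaling is omitted here.

    def combine(m1, m2):
        """Dempster's rule of combination over the frame {b1, b2}.
        'theta' holds the mass not assigned to either conclusion."""
        keys = ("b1", "b2", "theta")
        m = {k: 0.0 for k in keys}
        conflict = 0.0
        for a in keys:
            for b in keys:
                p = m1[a] * m2[b]
                if a == b or b == "theta":
                    m[a] += p
                elif a == "theta":
                    m[b] += p
                else:
                    conflict += p      # b1 against b2: contradictory evidence
        return {k: v / (1.0 - conflict) for k, v in m.items()}

    # Certainties asserted by the three knowledge rules (from the text);
    # the remainder of each rule's belief stays uncommitted ('theta').
    rules = [
        {"b1": 0.3, "b2": 0.1},   # A2 ∧ A3  (grasp-ready state)
        {"b1": 0.5, "b2": 0.1},   # A4 ∧ A5  (grasp-executing state)
        {"b1": 0.5, "b2": 0.2},   # A1       (grasp-result state)
    ]
    for r in rules:
        r["theta"] = 1.0 - r["b1"] - r["b2"]

    fused = rules[0]
    for r in rules[1:]:
        fused = combine(fused, r)
    print("grasped" if fused["b1"] > fused["b2"] else "not grasped", fused)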
The method of virtual-real linkage. After we grasp the virtual object, it should merge naturally with the hand and move as the hand moves. The main problem to solve here is letting the virtual object move smoothly into the user's hand, without the user feeling any abruptness. We solve it as follows.
The pose of the virtual object in its stationary state is designated the source matrix, and the pose of the user's hand at the moment of grasping the target matrix; position matrices on the straight-line path between the two are obtained by quaternion interpolation and assigned to the virtual object. The smoothing function must also be switched off in time, otherwise a lagging sensation is produced when the user picks the virtual object up and moves it quickly. Verified by repeated experiments, the smoothing process of this system is set to 50 frames.
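A minimal sketch of this interpolation stage, assuming poses are represented as a unit quaternion plus a position vector rather than matrices. The 50-frame window is the value stated above; everything else is illustrative.

    import numpy as np

    def slerp(q0, q1, t):
        """Spherical linear interpolation between unit quaternions."""
        q0, q1 = np.asarray(q0, float), np.asarray(q1, float)
        dot = np.dot(q0, q1)
        if dot < 0.0:                      # take the shorter arc
            q1, dot = -q1, -dot
        if dot > 0.9995:                   # nearly parallel: lerp, renormalise
            q = q0 + t * (q1 - q0)
            return q / np.linalg.norm(q)
        ang = np.arccos(dot)
        return (np.sin((1 - t) * ang) * q0 + np.sin(t * ang) * q1) / np.sin(ang)

    SMOOTH_FRAMES = 50                     # smoothing window stated in the text

    def held_object_pose(q_src, p_src, q_hand, p_hand, frame):
        """Pose assigned to the virtual object on each frame after the grasp;
        after SMOOTH_FRAMES frames the smoothing is switched off and the
        object locks to the hand, avoiding the lagging sensation."""
        t = min(frame / SMOOTH_FRAMES, 1.0)
        p = (1 - t) * np.asarray(p_src) + t * np.asarray(p_hand)
        return slerp(q_src, q_hand, t), p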
To avoid or reduce abruptness, after being grasped the virtual object must also maintain its attitude relative to the hand. Because the posture of the hand at the moment of grasping is indeterminate, there tends to be a certain rotation; keeping the virtual object in its original attitude requires cancelling this rotation, which is implemented as follows.
After the grasp is judged, the rotation matrix of the hand at that moment is recorded and its inverse computed. The position matrix of the hand is right-multiplied by this inverse and used as the parent matrix of the virtual object while it is in the hand. The virtual object thus merges naturally with the hand and moves as the hand moves.
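A sketch of this rotation-cancelling parent matrix, assuming 4x4 homogeneous transforms; hand_T_grasp and hand_T_now are hypothetical names for the recorded and current hand poses.

    import numpy as np

    def parent_matrix(hand_T_grasp, hand_T_now):
        """4x4 parent transform for the held virtual object.
        Right-multiplying the current hand transform by the inverse of the
        rotation recorded at the moment of grasping cancels that initial
        rotation, so the object keeps its original attitude while still
        following the hand."""
        R = np.eye(4)
        R[:3, :3] = hand_T_grasp[:3, :3]        # rotation part only
        return hand_T_now @ np.linalg.inv(R)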
The method of throwing the virtual object. The coefficients of a binomial expansion determine the optimal approximation of a Gaussian function, which is used as the impulse-response function for convolution filtering; a rule set for the ascending trend is also established. The k most recent smoothed data points (k determined by the tracker frequency) are combined into one group in time order, this group is converted into a set according to a transformation formula, the Hamming approach degree between this set and the ascending-trend rule set is computed and taken as the membership degree, and a judgement is made against a pre-set threshold. The judgement is carried out recursively until a throwing action is recognized. Once the throwing action is recognized, the top speed (the initial speed of the projectile) is obtained from the velocity curve of the throw; this moment is the release moment, giving the initial coordinates of the projectile. The initial elevation and deflection angles of the projectile are found from the formulas asin(dz/ds) and atan(dx/dy), and the initial attitude of the virtual object (its Euler angles) is obtained from the magnetic receiver.
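A sketch of the smoothing and release-parameter extraction, under the assumption that the release sample is the one of peak speed, as stated above. The binomial kernel length and the use of atan2 for the deflection angle are illustrative choices; the membership computation against the ascending-trend rule set is omitted.

    import numpy as np

    def binomial_gaussian_kernel(n=7):
        """Row n-1 of Pascal's triangle, normalised: the binomial
        approximation to a Gaussian impulse response used for smoothing."""
        row = np.array([1.0])
        for _ in range(n - 1):
            row = np.convolve(row, [1.0, 1.0])
        return row / row.sum()

    def release_parameters(positions, times):
        """positions: (k, 3) palm positions around the release; times: (k,).
        Returns speed, elevation, deflection and position at the fastest
        sample, taken here as the release point."""
        kernel = binomial_gaussian_kernel()
        smooth = np.vstack([np.convolve(positions[:, i], kernel, mode="same")
                            for i in range(3)]).T
        d = np.diff(smooth, axis=0)                # dx, dy, dz per step
        ds = np.linalg.norm(d, axis=1)
        speeds = ds / np.diff(times)
        i = np.argmax(speeds)                      # release = peak speed
        dx, dy, dz = d[i]
        elevation = np.arcsin(dz / ds[i])          # asin(dz/ds) from the text
        deflection = np.arctan2(dx, dy)            # atan(dx/dy), quadrant-safe
        return speeds[i], elevation, deflection, smooth[i]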
The method of displaying the projectile trajectory. The preceding calculations yield the initial parameters of the projectile, but these parameters alone give only the particle trajectory of the virtual projectile. For realism, the real-time rotation of the projectile must be incorporated, which requires the relation between the throwing action and the projectile's real-time rotation. According to Newton's first law of particle kinematics, if a particle originally in uniform circular motion with angular velocity ω is released at some moment, then, disregarding any force, it moves in a straight line at the tangential velocity vc (vc = ωr). Likewise, by conservation, the object continues to rotate about its center of mass with angular velocity ω while it translates.
The release point and a point shortly before it are each rotated about the x, y and z axes, yielding two line segments. The angle between these two segments, divided by the time interval, gives the initial angular velocity. The moment of inertia of the thrown virtual object is then computed: for a regular object it can be looked up in a table, while an irregular object is first decomposed into several parts that are solved separately and finally combined. Substituting the air-resistance equation into the moment-of-momentum equation produced by the moment of inertia and integrating gives angular velocity as a function of time:
ω2 = ω1 e^(-(K/J)t)
from which the angle turned through is derived as a function of time:
θ2 = θ1 + ω1(J/K)(1 - e^(-(K/J)t))
From the above equations, the angle the object has turned through at any time t can be obtained. Incorporating this real-time rotation into the particle trajectory of the virtual object yields its complete flight path.
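A direct transcription of the two equations into code, under the assumption that the air-resistance moment is -K·ω and J is the moment of inertia; the numeric values in the example are illustrative only.

    import numpy as np

    def projectile_rotation(t, omega1, theta1, K, J):
        """Angle turned through and angular velocity at time t after release.
        Solves J*domega/dt = -K*omega (resistance moment proportional to omega):
            omega(t) = omega1 * exp(-(K/J) * t)
            theta(t) = theta1 + omega1 * (J/K) * (1 - exp(-(K/J) * t))"""
        decay = np.exp(-(K / J) * t)
        return theta1 + omega1 * (J / K) * (1.0 - decay), omega1 * decay

    # Example with illustrative values: released spinning at 10 rad/s.
    for t in (0.0, 0.5, 1.0, 2.0):
        theta, omega = projectile_rotation(t, omega1=10.0, theta1=0.0,
                                           K=0.02, J=0.01)
        print(t, theta, omega)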
The position of the virtual object in the world coordinate system is tracked in real time. While the virtual projectile is within the visual range of the see-through lens, the optical see-through helmet is responsible for display; once it leaves this range, the projection equipment takes over the display, and corresponding events are triggered according to the landing point of the virtual object.
Compared with the prior art, the present invention has the following conspicuous features and remarkable advantages: grasping and throwing are treated as one continuous movement and used as an interaction means; augmented reality technology makes the virtual object the target of grasping and throwing; and optical see-through glasses combined with magnetic tracking (no camera is used) realize the grasping and throwing of the virtual object. The whole operation is natural, real-time, economical and practical.
Description of drawings
Fig. 1 is a flowchart of the human-computer interaction method for grasping and throwing a virtual object according to the present invention.
Fig. 2 is a structural block diagram of the human-computer interaction system for grasping and throwing a virtual object according to the present invention.
Fig. 3 is a detailed flow block diagram of the embodiment.
Embodiment
A preferred embodiment of the present invention is as follows:
With reference to Fig. 1, the human-computer interaction method for grasping and throwing a virtual object comprises four steps: 1) acquisition of head and hand data; 2) grasp recognition based on visual perception; 3) throw recognition based on magnetic tracking; 4) real-time rotation of the projectile.
With reference to Fig. 2, the two magnetic receivers used in the embodiment of the invention are mounted on the data glove and on the helmet respectively, and acquire head and hand motion data. The human-computer interaction system for grasping and throwing a virtual object comprises a head tracking subsystem 1., a hand tracking subsystem 2., and a virtual-real fusion display subsystem 3.
As shown in the detailed flow diagram of the embodiment in Fig. 3, the steps of the whole working method are:
1) Acquisition of head and hand data
Palm-center position data are obtained from the hand magnetic receiver. The data from the head magnetic receiver are coordinate-transformed to obtain the eye position. The finger flexion is obtained from the data glove.
2) Grasp recognition based on visual perception
Acquisition of the grasp-related evidence. Three threads respectively compute the grasp-ready-state evidence certainty 7, the grasp-executing-state evidence certainty 8 and the grasp-result-state evidence certainty 9. The grasp-ready-state evidence certainty 7 is computed from the hand's magnetic tracking equipment together with the corresponding membership function; the grasp-executing-state evidence certainty 8 is computed from the finger-movement trend obtained through the data glove and the palm orientation obtained from the tracker; the grasp-result-state evidence certainty 9 takes data from both the data glove and the tracker, computes the orientation 5 of each finger, and then combines the line-of-sight intersection detection 6 to compute the certainty.
Fusion of the grasp rules. The computations for the three states above yield five pieces of relevant evidence, which form three inference rules; finally the grasp rules are fused 10, realizing the grasp judgement 11.
3) Throw recognition based on magnetic tracking
Convolution smoothing 12 is applied to the magnetic tracking data, and the latest data are grouped 13. The throwing membership degree 14 of the action is then computed from the throwing features, and finally the throwing action parameters 15 are computed. The throwing membership degree 14 is used to recognize the action and decide whether the throwing-parameter computation should proceed. The throwing action parameters include: the position of the projectile at release, its initial elevation and deflection angles, its initial velocity, its Euler angles, and so on.
4) Trajectory display with real-time projectile rotation
Real-time rotation of the virtual projectile: two threads respectively compute the initial angular velocity 16 and the moment of inertia 17 of the virtual projectile. The initial angular velocity is defined as the mean angular velocity between a point before the release and the release point, computed from the two sets of Euler angles acquired by the hand tracker. The moment of inertia varies with the thrown object; the moments of inertia of these objects are all computed in advance and called up according to what is actually grasped. The outputs of these two modules feed the angular-acceleration computation module 18, which uses air-resistance theory and the moment-of-inertia principle to obtain the equation giving angle as a function of time, realizing the real-time rotation of the projectile.
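A sketch of the initial-angular-velocity computation from two orientation samples, assuming the Euler angles have already been converted to rotation matrices (for example with the euler_to_matrix helper sketched earlier); the axis-angle extraction is a standard formula, not a quotation from the patent.

    import numpy as np

    def initial_angular_velocity(R_before, R_release, dt):
        """Mean angular velocity (rad/s, as an axis-scaled vector) between two
        hand orientations sampled dt seconds apart. R_before and R_release
        are 3x3 rotation matrices built from the tracker's Euler angles."""
        R = R_release @ R_before.T                       # relative rotation
        cos_a = np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)
        angle = np.arccos(cos_a)
        if angle < 1e-8:
            return np.zeros(3)                           # no measurable rotation
        axis = np.array([R[2, 1] - R[1, 2],
                         R[0, 2] - R[2, 0],
                         R[1, 0] - R[0, 1]]) / (2.0 * np.sin(angle))
        return (angle / dt) * axis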
Particle trajectory of the virtual projectile: the throw recognition subsystem based on magnetic tracking has already produced the throwing action parameters 15; from these parameters, combined with the site conditions, the whole particle trajectory 19 of the virtual projectile is computed. Trajectory display: the real-time rotation and the particle trajectory of the virtual projectile are combined through the time element, and the display analysis module 20 judges in real time and issues display instructions to the display devices. While the trajectory of the virtual projectile is within the visual range of the see-through lens, the optical see-through helmet is responsible for display; once it leaves this range, the projection equipment takes over.

Claims (2)

1. A human-computer interaction method for grasping and throwing a virtual object, characterized in that the operating steps are as follows:
1) acquisition of head and hand data, with the following steps:
1. obtain palm-center position data from the hand magnetic receiver;
2. transform the data obtained from the head magnetic receiver by a coordinate transformation to obtain the eye position;
3. obtain the finger flexion from the data glove;
2) grasp recognition based on visual perception, with the following steps:
1. infer the positions of the five fingertips following robotics principles;
2. detect collisions between the grasping fingers and the virtual object by the ray-intersection principle;
3. design three grasp rules (a grasp-result-state rule, a grasp-executing-state rule and a grasp-ready-state rule) and compute the certainty of the evidence in each rule;
4. fuse the rules to reach the grasp judgement;
3) throw recognition based on magnetic tracking, with the following steps:
1. apply convolution smoothing to the acquired hand magnetic-tracker data;
2. establish a standard throwing feature set, compute the throwing membership degree of each newly acquired data packet, and recognize the throwing action by threshold judgement;
3. obtain the release point from the characteristics of the throwing curve, and differentiate at this point to obtain the initial elevation and deflection angles of the projectile;
4) real-time rotation of the projectile, with the following steps:
1. obtain the initial angular velocity from the Euler angles at the release point and at a point before it, and compute the moment of inertia of the thrown virtual object;
2. substitute the air-resistance equation into the moment-of-momentum equation produced by the moment of inertia, integrate this equation to obtain angular velocity as a function of time, and transform again to obtain angle as a function of time, thereby realizing the real-time rotation of the projectile.
2. A human-computer interaction system for grasping and throwing a virtual object, used for the human-computer interaction method for grasping and throwing a virtual object according to claim 1, comprising a head tracking subsystem 1., a hand tracking subsystem 2., and a virtual-real fusion display subsystem 3., characterized in that:
1. the head tracking subsystem mainly acquires head motion information, providing accurate data for the visual-perception-based collision detection and for the display of the virtual object, and consists of a magnetic receiver fixed on the optical see-through helmet and connected to a microcomputer;
2. the hand tracking subsystem mainly acquires hand motion information, providing accurate data for computing the certainty of the grasping evidence, and consists of a magnetic receiver fixed on the data glove and connected to a microcomputer;
3. the virtual-real fusion display subsystem mainly realizes the fusion of the virtual scene with the real scene and presents the flight path of the virtual object after it is thrown, and consists of a personal-computer network connecting said microcomputers to a main control computer, a projector and a PLC lighting control module.
CNA2009100473593A 2009-03-11 2009-03-11 Human-computer interaction method for grapping and throwing dummy object and system thereof Pending CN101515198A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CNA2009100473593A CN101515198A (en) 2009-03-11 2009-03-11 Human-computer interaction method for grapping and throwing dummy object and system thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CNA2009100473593A CN101515198A (en) 2009-03-11 2009-03-11 Human-computer interaction method for grapping and throwing dummy object and system thereof

Publications (1)

Publication Number Publication Date
CN101515198A true CN101515198A (en) 2009-08-26

Family

ID=41039670

Family Applications (1)

Application Number Title Priority Date Filing Date
CNA2009100473593A Pending CN101515198A (en) 2009-03-11 2009-03-11 Human-computer interaction method for grapping and throwing dummy object and system thereof

Country Status (1)

Country Link
CN (1) CN101515198A (en)

Cited By (42)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10956775B2 (en) 2008-03-05 2021-03-23 Ebay Inc. Identification of items depicted in images
US11694427B2 (en) 2008-03-05 2023-07-04 Ebay Inc. Identification of items depicted in images
US11727054B2 (en) 2008-03-05 2023-08-15 Ebay Inc. Method and apparatus for image recognition services
CN104656901B (en) * 2009-12-22 2017-12-22 电子湾有限公司 For showing the augmented reality systems approach and device of images of items in situational context
US10210659B2 (en) 2009-12-22 2019-02-19 Ebay Inc. Augmented reality system, method, and apparatus for displaying an item image in a contextual environment
CN104656901A (en) * 2009-12-22 2015-05-27 电子湾有限公司 Augmented reality system, method and apparatus for displaying an item image in a contextual environment
CN101853319A (en) * 2010-05-14 2010-10-06 中国人民解放军军械工程学院 Method for establishing maintenance therblig set supporting virtual maintenance simulation
US10127606B2 (en) 2010-10-13 2018-11-13 Ebay Inc. Augmented reality system and method for visualizing an item
US10878489B2 (en) 2010-10-13 2020-12-29 Ebay Inc. Augmented reality system and method for visualizing an item
CN102728060A (en) * 2011-03-30 2012-10-17 廖礼士 Interactive device and operation method thereof
CN102270067A (en) * 2011-06-17 2011-12-07 清华大学 Contact track fusion method of multiple hierarchical cameras on interactive surface
US9268406B2 (en) 2011-09-30 2016-02-23 Microsoft Technology Licensing, Llc Virtual spectator experience with a personal audio/visual apparatus
CN103186922B (en) * 2011-09-30 2016-08-17 微软技术许可有限责任公司 Use Augmented Reality display represent before the time period time place method and individual audiovisual (A/V) device
US9286711B2 (en) 2011-09-30 2016-03-15 Microsoft Technology Licensing, Llc Representing a location at a previous time period using an augmented reality display
CN103186922A (en) * 2011-09-30 2013-07-03 微软公司 Representing a location at a previous time period using an augmented reality display
US10628877B2 (en) 2011-10-27 2020-04-21 Ebay Inc. System and method for visualization of items in an environment using augmented reality
US11475509B2 (en) 2011-10-27 2022-10-18 Ebay Inc. System and method for visualization of items in an environment using augmented reality
US11113755B2 (en) 2011-10-27 2021-09-07 Ebay Inc. System and method for visualization of items in an environment using augmented reality
US10147134B2 (en) 2011-10-27 2018-12-04 Ebay Inc. System and method for visualization of items in an environment using augmented reality
US11651398B2 (en) 2012-06-29 2023-05-16 Ebay Inc. Contextual menus based on image recognition
CN103677495B (en) * 2012-09-07 2017-07-18 腾讯科技(深圳)有限公司 icon selection device and method
CN103677495A (en) * 2012-09-07 2014-03-26 腾讯科技(深圳)有限公司 Icon selection device and method
CN104732864A (en) * 2015-03-20 2015-06-24 武汉湾流新技术有限公司 Spraying simulation method based on augmented reality and simulation system
CN105056500B (en) * 2015-07-22 2017-08-29 陈飞 A kind of situation simulation training/games system
CN105056500A (en) * 2015-07-22 2015-11-18 陈飞 Situation simulation training/game system
CN105690386A (en) * 2016-03-23 2016-06-22 北京轩宇智能科技有限公司 Teleoperation system and teleoperation method for novel mechanical arm
CN105690386B (en) * 2016-03-23 2019-01-08 北京轩宇智能科技有限公司 A kind of mechanical arm remote control system and teleoperation method
CN107331220A (en) * 2017-09-01 2017-11-07 国网辽宁省电力有限公司锦州供电公司 Transformer O&M simulation training system and method based on augmented reality
CN109683700A (en) * 2017-10-18 2019-04-26 深圳市掌网科技股份有限公司 The human-computer interaction implementation method and device of view-based access control model perception
CN109840011A (en) * 2017-11-29 2019-06-04 深圳市掌网科技股份有限公司 A kind of crawl recognition methods of dummy object and device
CN108037827A (en) * 2017-12-08 2018-05-15 北京凌宇智控科技有限公司 The virtual objects throwing emulation mode and its system of Virtual actual environment
CN108536298A (en) * 2018-03-30 2018-09-14 广东工业大学 A kind of human body mapping appearance body interacts constrained procedure with the binding of virtual rotary body
CN109766005A (en) * 2018-12-29 2019-05-17 北京诺亦腾科技有限公司 The method and device of taking and placing object in a kind of VR scene
CN110721473A (en) * 2019-10-10 2020-01-24 深圳市瑞立视多媒体科技有限公司 Object throwing method, device, equipment and computer readable storage medium
CN111225554B (en) * 2020-02-19 2021-10-29 鲁班嫡系机器人(深圳)有限公司 Bulk object grabbing and assembling method, device, controller and system
CN111225554A (en) * 2020-02-19 2020-06-02 鲁班嫡系机器人(深圳)有限公司 Bulk object grabbing and assembling method, device, controller and system
CN112346564B (en) * 2020-10-26 2021-12-03 江南大学 Method for grabbing and releasing virtual object by hand
CN112346564A (en) * 2020-10-26 2021-02-09 江南大学 Method for grabbing and releasing virtual object by hand
CN113797525A (en) * 2020-12-23 2021-12-17 广州富港生活智能科技有限公司 Novel game system
CN113797525B (en) * 2020-12-23 2024-03-22 广州富港生活智能科技有限公司 Novel game system
CN112847374A (en) * 2021-01-20 2021-05-28 湖北师范大学 Parabolic-object receiving robot system
WO2024037559A1 (en) * 2022-08-18 2024-02-22 北京字跳网络技术有限公司 Information interaction method and apparatus, and human-computer interaction method and apparatus, and electronic device and storage medium

Similar Documents

Publication Publication Date Title
CN101515198A (en) Human-computer interaction method for grapping and throwing dummy object and system thereof
CN111260762B (en) Animation implementation method and device, electronic equipment and storage medium
US10657696B2 (en) Virtual reality system using multiple force arrays for a solver
Bideau et al. Real handball goalkeeper vs. virtual handball thrower
EP2391988B1 (en) Visual target tracking
US8565485B2 (en) Pose tracking pipeline
US8451278B2 (en) Determine intended motions
CN103019024B (en) Real-time accurate surveying and analysis table tennis rotary system and system operation method
Yeo et al. Eyecatch: Simulating visuomotor coordination for object interception
Riley et al. Robot catching: Towards engaging human-humanoid interaction
CN105252532A (en) Method of cooperative flexible attitude control for motion capture robot
US20100303289A1 (en) Device for identifying and tracking multiple humans over time
CN101991949B (en) Computer based control method and system of motion of virtual table tennis
CN103038727A (en) Skeletal joint recognition and tracking system
JP2008307640A (en) Motion control system, motion control method, and motion control program
Thalmann et al. Autonomous virtual actors based on virtual sensors
CN102004840A (en) Method and system for realizing virtual boxing based on computer
CN103207667A (en) Man-machine interaction control method and application thereof
CN106390409A (en) Ball-hitting method and device for table tennis robot
Serra et al. An optimal trajectory planner for a robotic batting task: the table tennis example
Dhawan et al. Development of a novel immersive interactive virtual reality cricket simulator for cricket batting
He et al. Mathematical modeling and simulation of table tennis trajectory based on digital video image processing
Liu et al. An action recognition technology for badminton players using deep learning
CN116785683A (en) AR multi-person interaction system, head display device, method and storage medium
CN102004552A (en) Tracking point identification based method and system for increasing on-site sport experience of users

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C12 Rejection of a patent application after its publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20090826