CN101539804A - Real time human-machine interaction method and system based on augmented virtual reality and anomalous screen - Google Patents

Real time human-machine interaction method and system based on augmented virtual reality and anomalous screen

Info

Publication number
CN101539804A
CN101539804A
Authority
CN
China
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN200910047360A
Other languages
Chinese (zh)
Inventor
许永顺
陈一民
陈明
姚争为
胡俊
俞晓明
陆涛
邹一波
陆意骏
黄诗华
陈伟
李启明
刘燕
谭志鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Shanghai for Science and Technology
Original Assignee
University of Shanghai for Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Shanghai for Science and Technology
Priority to CN200910047360A
Publication of CN101539804A
Legal status: Pending

Abstract

The invention provides a real-time human-machine interaction method and system based on augmented virtual reality and an anomalous (special-shaped) screen. The method comprises the following operation steps: 1) modeling the virtual objects to be interacted with and a statistically averaged hand model; 2) capturing video of the hand with N cameras to obtain original images; 3) processing in a computer. The system comprises a system realizing real-time optical see-through augmented display, an algorithm and system for real-time virtual-real grasping, parabolic (throw) motion detection and real-time parabolic animation generation, and a real-time multi-anomalous-screen interactive system. A complete large-scale special effect is constructed by various virtual-real transformations in a large anomalous-screen scene, by three-dimensional animation or three-dimensional images of the interactive content presented through stereo glasses, and additionally by video/audio devices, light-control devices and the like.

Description

Real-time human-computer interaction method and system based on augmented virtual reality and a special-shaped screen
Technical field
The present invention relates to a real-time human-computer interaction method and system for virtual reality, and in particular to a real-time human-computer interaction method and system based on augmented virtual reality and a special-shaped screen.
Background technology
Augmented reality (AR) technology places virtual reality content into the real environment for viewing. Judging from current trends, this kind of virtual experience combined with the real environment is one of the most effective applications of virtual reality technology. AR research and application abroad still lead the field and hold a number of intellectual property rights and technical standards. The United States, Europe, Japan and other countries have achieved many breakthroughs in the key technologies of AR, possess design and manufacturing capability for key equipment, and have obtained the expected results. For example, in 2000 Simon et al. studied tracking based on coplanar physical feature points and applied it to architectural design, fusing the three-dimensional models of new buildings or architectural decorations with video of the real scene; this provides an intuitive and important criterion for judging whether a new building harmonizes with its surroundings. The Fraunhofer Institute has also developed an AR system suitable for urban planning, allowing architectural designers to preview design schemes on site. At the 2005 World Expo in Aichi, Japan, Hitachi used an AR system in its pavilion to demonstrate the close relationship between people, nature and animals. Domestically, AR technology is still at the starting stage. For example, the Institute of Information Science of Northern Jiaotong University proposed an algorithm integrating computer vision with a virtual reality system in a distributed augmented reality system. Zhou Jianlong et al. of Xi'an Jiaotong University proposed a three-dimensional matching method for augmented reality systems based on the theory of affine projection transformation. Shanghai University began AR research on the PC platform in the first half of 2001, designed a PC-based AR system architecture, and implemented a simple, automatic, high-precision and robust camera calibration method suitable for AR systems.
So-called special-shaped screen technology projects images onto non-planar, specially shaped screens. Such screens can combine model and scene: with special graphics-processing techniques, multi-projector stitching and image-composition control, varied scenes can be presented and changed on one irregularly shaped screen, creating a multi-viewpoint, multi-screen display environment. Because several small screens are used to form one large screen, the control software must keep the content played on each screen synchronized, which requires a reliable synchronized-playback technique at the control end. Current synchronization techniques fall into two kinds. One uses a unified hardware clock as the synchronization clock of each unit; it achieves good synchronization and is fast, but requires additional hardware and is relatively expensive. The other uses software control: with multithreading and an asynchronous-feedback technique, the synchronization effect can also be achieved. The present invention adopts software-controlled synchronization. To realize seamless stitching of a large-scene projection on a special-shaped screen, keystone correction in the vertical and horizontal directions alone is not enough, so geometric correction (straightening) of the picture is one of the key techniques of seamless multi-screen stitching. Here a software method is adopted: the computer corrects the image, independent of any particular projector.
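As an illustration of the software-side geometric correction described above, the following sketch pre-warps a rendered frame with a projective transform so that it lands on the desired rectangular area of the screen. It is only a minimal sketch: the patent does not specify the correction algorithm, and the calibration corner coordinates used here are hypothetical.

```python
# Minimal sketch of software geometric ("straightening") correction for one projector,
# assuming a calibration step has found the quadrilateral, in projector pixel coordinates,
# that lands exactly on the desired rectangular area of the screen (values are hypothetical).
import cv2
import numpy as np

def prewarp(frame, target_quad):
    """Warp the rendered frame into the calibrated quadrilateral before sending it out."""
    h, w = frame.shape[:2]
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])   # corners of the rendered frame
    H = cv2.getPerspectiveTransform(src, np.float32(target_quad))
    return cv2.warpPerspective(frame, H, (w, h))

if __name__ == "__main__":
    frame = np.full((768, 1024, 3), 200, dtype=np.uint8)          # stand-in rendered frame
    quad = [[30, 10], [1000, 40], [990, 750], [20, 730]]          # hypothetical calibration result
    cv2.imwrite("prewarped.png", prewarp(frame, quad))
```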
Small-scale desktop AR applications have made remarkable progress, but there is still no example of a large, theater-style multi-projection application, nor a virtual network projection system and method supporting multiple projection sources. In terms of AR patents: patent 200710304245.3 proposes a video augmented-reality welding helmet for assisted operation in machining and manufacturing. Patent 200710303603.9 proposes an augmented-reality flight simulator based on several fixed cameras, used for flight-training simulation. Patent 200680029773.6 proposes a system, apparatus and method for end-user-programmable augmented-reality glasses. Patent 200610101229.X proposes a method and system for three-dimensional augmented reality that captures video frames containing two-dimensional visual coded markers in the real environment and performs AR processing to synthesize video frames; however, there is no application in exhibition and display. In terms of multi-projection systems, patent application 200710064616.5 turns an ordinary display device connected to a computer into a projection device usable over a network, so that content from multiple projection sources can be shown, or switched, on the same screen over the network; but it only uses buttons and similar means as the human-computer interaction mode, the user has almost no sense of immersion, and the advantages of a multi-projection system are not well exploited. Patent application 200710071105.6 discloses a turntable-based multi-projection large-screen stitching method that can complete geometric correction and geometric alignment while locating the overlap regions in preparation for brightness unification, but it does not consider interaction with the content of the multiple projected images and the problems that may arise from it.
Current AR research mainly studies application systems, combining AR with other technologies to build new research applications. No report has yet been seen that integrates augmented virtual reality technology with special-shaped screen technology, and research results in exhibition and display applications are still rare. According to our investigation and literature search, there is no precedent for organically integrating augmented virtual reality technology with special-shaped screen technology, especially for exhibition and display use.
Summary of the invention
In view of the problems and shortcomings of the prior art, the object of the present invention is to provide a real-time human-computer interaction method and system based on augmented virtual reality, in which AR technology and a multi-projection system interact. By controlling the AR equipment and the multi-projection system over a network, the cooperative interaction work of multiple users, AR and the multi-projection system can be accomplished, giving the user a deep sense of immersion.
To achieve the above object, the present invention adopts the following technical conception:
According to ergonomic principles, computer vision, network communication and multi-projection interaction techniques are used to fuse the sensor information provided by the cameras and tracking devices, to recognize and respond to actions, and to use network synchronization mechanisms to accomplish multi-user cooperative interaction. At start-up the system first performs initialization: it calibrates the cameras, locates the user's initial viewpoint, and loads the multimedia images of the presentation and the scene model in which virtual content and real objects are combined. While the system runs, the tracker computes the position and orientation of the viewpoint from the video information and the magnetic sensor, and the graphics system transforms and renders the scene model according to the viewpoint changes, with the computation automatically distributed across the computing nodes. The compositing display system combines the pictures of the virtual scene and the real scene into a stereo image pair and sends the two pictures to the left and right screens of the helmet, so the user experiences an immersive world in which the virtual and the real are combined. During operation the system also accepts the user's interactive actions and changes the attributes of the interaction objects (virtual objects), such as shape, position, color and velocity, letting the user experience richer content and wonderful changes that cannot be experienced in reality.
The present invention is realized by the following technical solution:
A real-time human-computer interaction method based on augmented virtual reality and a special-shaped screen, characterized in that the operation steps are as follows:
1) Model the virtual objects to be interacted with, and model a statistically averaged hand model;
2) Capture video of the user's hand with N cameras to obtain original images;
3) Process in the computer, in the following steps:
1. Realize the real-time optical see-through augmented display: the captured hand images are segmented, reconstructed in three dimensions and transformed into the same coordinate system as the virtual objects, and collision detection is performed. A bounding-box method is used first for coarse detection; if a collision exists, fine detection follows, using the artificial immune system method, so that rapid virtual-real collision detection is achieved. The virtual model after collision is deformed and, after registration, displayed on the optical see-through head-mounted display.
2. Real-time virtual-real grasping, parabolic (throw) motion detection and real-time parabolic animation generation: the captured images are segmented by skin-color statistics, the segmented images are skeletonized, and a pre-built grasping classification model is used to screen them and determine the action type. Once grasping succeeds, a frame-difference algorithm computes coefficients such as acceleration and direction change for parabolic motion detection; if they exceed the statistical test mean, the throw is judged successful. A corresponding parabolic animation is then generated in real time.
3. Real-time multi-screen interaction: the generated real-time parabolic animation is handled across screens, and image fusion is performed when the object flies out of one projection screen onto another. Distributed pre-processing is implemented on the network platform.
A system for the above real-time human-computer interaction method based on augmented virtual reality and a special-shaped screen comprises a real-time optical see-through augmented display system (4), a real-time virtual-real grasping, parabolic-motion-detection and real-time parabolic animation generation algorithm and system (5), and a real-time multi-special-shaped-screen interactive system (6), connected to one another through a network switch, as shown in Fig. 2. It is characterized in that:
1) The real-time optical see-through augmented display system (4) consists of two independent, overlapping sub-systems. Each set consists of one OLED optical see-through head-mounted display, two cameras and one processing PC; each set uses its processing PC to complete the processing of the augmented optical see-through display method.
2) The real-time virtual-real grasping, parabolic-motion-detection and real-time parabolic animation generation system (5) is a multi-sensor system fusing vision and magnetic devices, composed of a magnetic tracking device together with the equipment of system (4). System (5) adds magnetic devices on the basis of system (4) and fuses their information to realize virtual-real grasping and throw-action processing.
3) The real-time multi-special-shaped-screen interactive system (6) adds 5 PCs and 8 projectors on the basis of system (5) to realize multi-screen interaction. Every two projectors are rendered and controlled by one PC, and one additional PC handles timing, script control and audio. Communication among the PCs of systems (4), (5) and (6) is completed by a LAN switch. The method provided by the invention is a method of real-time augmented-reality display, real-time human-computer interaction action recognition and multi-projection interaction.
The technical scheme of the invention is described in detail below. The system realizing the real-time optical see-through augmented display segments the hand image by the skin-color statistics method, builds an averaged hand model with 3DS MAX, transforms it into the same coordinate system as the virtual objects, and performs real-time three-dimensional reconstruction of the hand. A bounding-box method is used first for coarse detection; if a collision exists, fine detection follows, using the artificial immune system method (see the embodiment section for details).
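The coarse phase of this virtual-real collision detection can be pictured as a simple axis-aligned bounding-box overlap test between the reconstructed hand points and a virtual object; only pairs that pass this cheap test would be handed to the fine, artificial-immune-system stage. The sketch below is a minimal illustration of that idea, not the patent's implementation; the class and variable names are assumptions.

```python
# Minimal sketch of the coarse collision phase: axis-aligned bounding-box overlap
# between the reconstructed hand points and a virtual object's box. Names are hypothetical.
from dataclasses import dataclass
import numpy as np

@dataclass
class AABB:
    lo: np.ndarray  # (3,) minimum corner
    hi: np.ndarray  # (3,) maximum corner

    @classmethod
    def from_points(cls, pts: np.ndarray) -> "AABB":
        return cls(pts.min(axis=0), pts.max(axis=0))

    def overlaps(self, other: "AABB") -> bool:
        # Boxes overlap iff they overlap on every axis.
        return bool(np.all(self.lo <= other.hi) and np.all(other.lo <= self.hi))

def coarse_collision(hand_points: np.ndarray, object_box: AABB) -> bool:
    """Return True if the fine (artificial-immune-system) stage should run."""
    return AABB.from_points(hand_points).overlaps(object_box)

if __name__ == "__main__":
    hand = np.random.rand(500, 3) * 0.2 + np.array([0.5, 0.9, 0.3])   # reconstructed hand points
    cup = AABB(np.array([0.55, 0.95, 0.25]), np.array([0.75, 1.15, 0.45]))
    print("run fine detection:", coarse_collision(hand, cup))
```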
After the collision set is precisely determined, the virtual object responds according to the degree of collision by deforming its surface texture. Concretely, 64 possible deformation animations are preset, and the appropriate response animation is called and played according to the collision situation. Meanwhile the 3D hand model is made invisible so that it does not appear on the HMD display, and average interpolation is used to present the response animation frame by frame on the HMD display.
The real-time human-computer interaction action recognition system uses the images already segmented by the real-time optical see-through augmented display system, performs frame differencing on the dynamic images and, after skeletonization, computes the motion features they contain, including finger flexion and palm-center angle; these are compared against a pre-stored index database to determine the action success rate. Coefficients such as acceleration and direction change are then computed for parabolic motion detection; if they exceed the statistical test mean, the throw is judged successful.
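The throw judgment described here reduces to differencing successive hand positions to estimate acceleration and direction change and comparing them against statistical thresholds. A minimal sketch follows; the threshold values and the choice of the hand centroid as the tracked feature are assumptions, not figures from the patent.

```python
# Minimal sketch of the frame-difference throw test: estimate acceleration and direction
# change from successive hand-centroid positions and compare with thresholds.
# The thresholds (statistical test means) below are hypothetical placeholders.
import numpy as np

def throw_detected(centroids, dt, accel_mean=4.0, dir_change_mean=0.3):
    """centroids: (N,3) hand positions from the last N frames; dt: frame interval in seconds."""
    p = np.asarray(centroids, dtype=float)
    v = np.diff(p, axis=0) / dt                       # first difference -> velocity
    a = np.diff(v, axis=0) / dt                       # second difference -> acceleration
    accel = float(np.linalg.norm(a, axis=1).mean())
    # Direction change: angle between consecutive velocity vectors.
    u = v / (np.linalg.norm(v, axis=1, keepdims=True) + 1e-9)
    cosang = np.clip((u[:-1] * u[1:]).sum(axis=1), -1.0, 1.0)
    dir_change = float(np.arccos(cosang).mean())
    return accel > accel_mean and dir_change > dir_change_mean

if __name__ == "__main__":
    track = [[0, 0, 0], [0.02, 0.01, 0], [0.08, 0.05, 0], [0.20, 0.12, 0], [0.40, 0.20, 0]]
    print("throw:", throw_detected(track, dt=1 / 30))
```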
Capturing the motion of the real hand and determining the pose of the virtual hand requires the magnetic tracker and the data glove to work together. First the magnetic tracking receiver is placed at the wrist to obtain the position and Euler angles of the wrist in the world coordinate system, from which the homogeneous transformation matrix at that point is obtained. Then, following the DH (Denavit-Hartenberg) algorithm, a local coordinate system is established at each joint, the transformation matrix from coordinate system i to coordinate system i-1 is derived, and by multiplying the matrices in sequence the coordinates of each joint and fingertip of the hand in the world coordinate system are obtained.
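The chained homogeneous-transform computation described above, from the wrist pose given by the magnetic tracker out to each fingertip via Denavit-Hartenberg transforms, can be sketched as repeated 4x4 matrix multiplication. The DH parameters of the finger segments below are hypothetical values chosen only to illustrate the composition.

```python
# Minimal sketch of the kinematic chain: the magnetic tracker gives the wrist pose,
# the data glove gives joint angles, and Denavit-Hartenberg transforms are chained by
# matrix multiplication to place every joint and fingertip in world coordinates.
# The DH parameters of the finger segments are hypothetical values for illustration.
import numpy as np

def dh_matrix(theta, d, a, alpha):
    """Homogeneous transform from frame i to frame i-1 using standard DH parameters."""
    ct, st, ca, sa = np.cos(theta), np.sin(theta), np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [0.0,      sa,       ca,      d],
                     [0.0,     0.0,      0.0,    1.0]])

def wrist_pose(position, euler_zyx):
    """Wrist pose in world coordinates from tracker position and (yaw, pitch, roll) Euler angles."""
    z, y, x = euler_zyx
    Rz = np.array([[np.cos(z), -np.sin(z), 0], [np.sin(z), np.cos(z), 0], [0, 0, 1]])
    Ry = np.array([[np.cos(y), 0, np.sin(y)], [0, 1, 0], [-np.sin(y), 0, np.cos(y)]])
    Rx = np.array([[1, 0, 0], [0, np.cos(x), -np.sin(x)], [0, np.sin(x), np.cos(x)]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx
    T[:3, 3] = position
    return T

if __name__ == "__main__":
    T = wrist_pose([0.3, 1.1, 0.4], [0.2, -0.1, 0.05])            # from the magnetic tracker
    finger_dh = [(0.4, 0.0, 0.045, 0.0), (0.3, 0.0, 0.025, 0.0), (0.2, 0.0, 0.020, 0.0)]
    for theta, d, a, alpha in finger_dh:                          # glove joint angles -> segments
        T = T @ dh_matrix(theta, d, a, alpha)
    print("fingertip in world coordinates:", np.round(T[:3, 3], 3))
```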
The real-time multi-special-shaped-screen interactive system divides the content to be displayed: the background is shown on the fixed projection screens, the interactive content is handled by script control, multiple PCs render in a distributed fashion, and the stitched parts are pre-processed so that cross-screen animations can be fused and rendered in temporal order.
Several small screens are combined into one large screen; the control software keeps the content played on each screen synchronized, and the control end has a reliable synchronized-playback technique. Synchronization is software-controlled: with multithreading and the asynchronous-feedback technique, the synchronization effect can be achieved. The asynchronous-feedback technique means the control end collects in real time the information fed back by the playback ends in order to control their synchronized playback. The control end only needs to judge whether the playback states fed back by the playback ends are consistent, and decides whether to send a control command according to that consistency. Because feedback from the playback ends is available, synchronization can be maintained in a LAN environment. Picture straightening likewise uses a software method: the computer corrects the image, independent of any particular projector. The real-time deformation correction adopted by the system is nonlinear distortion correction, which can correct images projected onto special-shaped screens such as dome or ring screens.
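The asynchronous-feedback synchronization can be pictured as a small control loop: each playback end reports its current frame, and the control end issues the next advance command only when all reports agree. The sketch below models this with in-process queues standing in for LAN messages; the message format and names are assumptions.

```python
# Minimal sketch of asynchronous-feedback playback synchronization: every playback end
# reports its current frame; the control end sends the next "advance" command only when
# all reported frames are consistent. Queues stand in for LAN messages; names are hypothetical.
import queue
import threading

NUM_SCREENS = 3
feedback = queue.Queue()                                  # playback ends -> control end
commands = [queue.Queue() for _ in range(NUM_SCREENS)]    # control end -> each playback end

def playback_end(idx, frames=5):
    frame = 0
    while frame < frames:
        feedback.put((idx, frame))             # report current playback state
        frame = commands[idx].get()            # block until the control end allows the next frame

def control_end(frames=5):
    for frame in range(frames):
        reports = {}
        while len(reports) < NUM_SCREENS:      # collect feedback from every screen
            idx, f = feedback.get()
            reports[idx] = f
        assert all(f == frame for f in reports.values()), "screens out of step"
        for q in commands:                     # states consistent -> issue the advance command
            q.put(frame + 1)

if __name__ == "__main__":
    threads = [threading.Thread(target=playback_end, args=(i,)) for i in range(NUM_SCREENS)]
    threads.append(threading.Thread(target=control_end))
    for t in threads: t.start()
    for t in threads: t.join()
    print("all screens finished in lockstep")
```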
The projected pictures of several different projectors are stitched seamlessly, eliminating bright bands or shadows at the edges so that the whole projection screen achieves an integrated panoramic effect. Common stitching methods include hard (butt) joining, simple overlap and edge blending; this work adopts the third, edge blending. Image-stitching edge-blending technology eliminates the edge shadows so that the whole screen achieves an integrated stereoscopic panoramic effect. Compared with simple overlap, the brightness of the right overlapping part of the left projector is attenuated linearly while the brightness of the left overlapping part of the right projector increases linearly, so on the display the brightness of the whole picture appears completely uniform.
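The linear brightness ramp used for edge blending can be illustrated with a simple weight function: across the overlap region the left projector's contribution falls linearly from 1 to 0 while the right projector's rises from 0 to 1, so the summed brightness stays constant. A minimal sketch, with a hypothetical overlap width, follows.

```python
# Minimal sketch of edge blending: over the overlap columns the left projector's brightness
# is attenuated linearly while the right projector's increases linearly, so that the sum is
# uniform across the seam. The overlap width below is a hypothetical value.
import numpy as np

def blend_masks(width_px: int, overlap_px: int):
    """Per-column brightness weights for the left and right projectors of one seam."""
    left = np.ones(width_px)
    right = np.ones(width_px)
    ramp = np.linspace(1.0, 0.0, overlap_px)
    left[-overlap_px:] = ramp          # left projector fades out over its right overlap
    right[:overlap_px] = 1.0 - ramp    # right projector fades in over its left overlap
    return left, right

if __name__ == "__main__":
    left_w, right_w = blend_masks(width_px=1024, overlap_px=128)
    seam_sum = left_w[-128:] + right_w[:128]
    print("brightness across the seam is uniform:", np.allclose(seam_sum, 1.0))
```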
Compared with the prior art, the present invention has the following obvious outstanding features and remarkable advantages: it adopts the above method combining virtual-real fusion with real-time augmented-reality display, real-time human-computer interaction action recognition and multi-projection interaction, so that real-time interactive AR display is achieved; the invention therefore has the advantages of being real-time, general-purpose, economical and practical.
The communication module is the basis for guaranteeing synchronized operation of the whole system; its LAN-based real-time communication protocol mainly carries multi-machine synchronization information, hardware control command information, sensor information, rendering command information and so on.
When the communication module is called it first checks whether it has been initialized. During initialization the communication module collects the information of its sub-modules and flexibly determines the size of the send buffer. If the socket contained in this module has not yet connected to its corresponding remote socket, a connection is attempted. Once connected, the operating mode (server/client) is decided by checking the socket map container, a buffer is allocated, and a communication thread is established. The communication body then executes: the server sends first, the client sends after receiving, and the server receives last, completing one communication round; multithread synchronization is then achieved by waiting on the event of the rendering main thread.
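The server-sends-first, client-replies, server-receives-last round described above can be sketched with ordinary TCP sockets. The port, buffer size and payload below are assumptions, not values from the patent.

```python
# Minimal sketch of one communication round of the kind described above: the server sends
# first, the client replies after receiving, and the server's final receive closes the round.
# Port, buffer size and payload are hypothetical.
import socket
import threading

HOST, PORT, BUF = "127.0.0.1", 45000, 4096
ready = threading.Event()

def server():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        ready.set()                                  # let the client know it may connect
        conn, _ = srv.accept()
        with conn:
            conn.sendall(b"frame-sync:42")           # server sends first
            reply = conn.recv(BUF)                   # server receives last -> one round complete
            print("server got:", reply.decode())

def client():
    ready.wait()
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        data = cli.recv(BUF)                         # client receives the server's message first
        cli.sendall(b"ack:" + data)                  # ...then sends its own data back

if __name__ == "__main__":
    t = threading.Thread(target=server)
    t.start()
    client()
    t.join()
```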
The timing of the network module is shown in the figure. Using a blocking model together with an event synchronization mechanism, this timing guarantees that each frame sends only one group of data while the scene is rendered; it not only reconciles the inconsistent sending and receiving caused by differences in rendering speed among the machines, but also keeps the network traffic to a minimum.
Description of drawings
Fig. 1 is the flow chart of the processing in the computer according to the method of the invention.
Fig. 2 is the system architecture diagram of one embodiment of the invention.
Fig. 3 is the detailed flow chart of the example of Fig. 1.
Fig. 4 is the structural diagram of the projection interactive system.
Fig. 5 is the structural diagram of sensor fusion.
Fig. 6 is the AR augmented display rendering control module diagram.
Embodiment
A preferred embodiment of the present invention is as follows. With reference to Fig. 1, the operation steps of this real-time human-computer interaction method based on augmented virtual reality and a special-shaped screen are:
1 Realize the real-time optical see-through augmented display: the captured hand images are segmented, reconstructed in three dimensions and transformed into the same coordinate system as the virtual objects, and collision detection is performed; a bounding-box method is used first for coarse detection, and if a collision exists, fine detection follows using the artificial immune system method, so that rapid virtual-real collision detection is achieved; the virtual model after collision is deformed and, after registration, displayed on the optical see-through head-mounted display;
2 Real-time virtual-real grasping, parabolic motion detection and real-time parabolic animation generation: the captured images are segmented by skin-color statistics, the segmented images are skeletonized, and a pre-built grasping classification model is used to screen them and determine the action type; once grasping succeeds, a frame-difference algorithm computes coefficients such as acceleration and direction change for parabolic motion detection, and if they exceed the statistical test mean the throw is judged successful; a corresponding parabolic animation is then generated in real time;
3 Real-time multi-screen interaction: the generated real-time parabolic animation is handled across screens, and image fusion is performed when the object flies out of one projection screen onto another; distributed pre-processing is implemented on the network platform.
Referring to Fig. 2, the system used by the real-time human-computer interaction method based on augmented virtual reality and a special-shaped screen comprises a real-time optical see-through augmented display system (4), a real-time virtual-real grasping, parabolic-motion-detection and real-time parabolic animation generation algorithm and system (5), and a real-time multi-special-shaped-screen interactive system (6), connected with network cables through a network switch. It is characterized in that:
The real-time optical see-through augmented display system (4) consists of two independent, overlapping sub-systems; each set consists of one OLED optical see-through head-mounted display, two cameras and one processing PC, and each set uses its processing PC to complete the processing of the augmented optical see-through display method. The real-time virtual-real grasping, parabolic-motion-detection and real-time parabolic animation generation system (5) is a multi-sensor system fusing vision and magnetic devices, composed of a magnetic tracking device together with the real-time optical see-through augmented display system (4); magnetic devices are added on the basis of system (4) and their information is fused to realize virtual-real grasping and throw-action processing.
The real-time multi-special-shaped-screen interactive system (6) adds 5 PCs and 8 projectors on the basis of the real-time virtual-real grasping, parabolic-motion-detection and real-time parabolic animation generation system (5) to realize multi-screen interaction; every two projectors are rendered and controlled by one PC, and one additional PC handles timing, script control and audio. Communication among the PCs of systems (4), (5) and (6) is completed by a LAN switch.
The process of the immune virtual-real collision detection is as follows: (a) antigen recognition: the data of the problem to be solved are input. (b) Initial antibody generation: memory cells are activated to produce the initial antibodies. (c) Antibody evaluation: affinity and concentration are calculated. (d) Concentration-based selection. (e) Memory cell generation: the antibodies with the highest affinity to the antigen are all retained in the memory cell pool as memory cells; because the number of memory cells is limited, newly produced antibodies with higher affinity to the antigen replace antibodies with lower affinity. (f) Promotion and suppression of antibody generation: antibodies with higher affinity to the antigen are promoted, and antibodies with higher concentration are suppressed; this step is important for maintaining the diversity of the antibody population, avoiding premature convergence and achieving global optimization. The selection, crossover and mutation operators of the genetic algorithm are used to produce new antibodies. In the above procedure, steps (c) through (f) are iterated until the termination condition (convergence criterion) is satisfied.
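The six immune steps above amount to an evolutionary loop: evaluate antibody affinity and concentration, keep the best antibodies in a limited memory pool, promote high-affinity and suppress high-concentration antibodies, and create new antibodies with genetic operators until convergence. The sketch below applies that loop to a toy fitness function; the population size, rates and the fitness itself are hypothetical, chosen only for illustration (a real system would score candidate hand-object contact configurations).

```python
# Minimal sketch of the immune loop (steps c-f iterated): affinity and concentration are
# evaluated, high-affinity antibodies enter a limited memory pool, high-concentration ones
# are suppressed, and genetic selection/crossover/mutation produce new antibodies.
# The toy fitness, population size and rates are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

def affinity(x):                       # toy "antigen": closeness to a target point
    target = np.array([0.7, 0.2, 0.9])
    return 1.0 / (1.0 + np.linalg.norm(x - target, axis=1))

def concentration(pop, radius=0.1):    # fraction of similar antibodies around each one
    d = np.linalg.norm(pop[:, None, :] - pop[None, :, :], axis=2)
    return (d < radius).mean(axis=1)

def immune_search(pop_size=40, dim=3, memory_size=5, generations=50, mutation=0.1):
    pop = rng.random((pop_size, dim))
    memory = pop[:memory_size].copy()
    for _ in range(generations):
        fit = affinity(pop)
        conc = concentration(pop)
        score = fit * (1.0 - 0.5 * conc)                      # promote affinity, suppress density
        best = pop[np.argsort(score)[-memory_size:]]
        memory = np.vstack([memory, best])                    # memory pool keeps only the best
        memory = memory[np.argsort(affinity(memory))[-memory_size:]]
        # Genetic operators: roulette selection, one-point crossover, Gaussian mutation.
        probs = score / score.sum()
        parents = pop[rng.choice(pop_size, size=pop_size, p=probs)]
        cut = rng.integers(1, dim, size=pop_size)
        children = np.where(np.arange(dim) < cut[:, None], parents, np.roll(parents, 1, axis=0))
        pop = np.clip(children + rng.normal(0, mutation, children.shape), 0, 1)
    return memory[np.argmax(affinity(memory))]

if __name__ == "__main__":
    print("best antibody found:", np.round(immune_search(), 3))
```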
The virtual model after collision is deformed and, after registration, displayed on the optical see-through head-mounted display. The magnetic tracking processing module mainly completes the collection and smoothing of the raw tracking information; the sensor-information fusion module completes the fusion and selection of the information from computer vision and from the magnetic tracking processing module. As shown in Fig. 4, each user is equipped with one set of sensor devices, including N/2 cameras, two 6-DOF magnetic tracking devices, an optical see-through HMD and one fusion-processing PC. The action processing module of the real-time human-computer interaction action recognition and real-time parabolic animation generation algorithm and system (5) uses the already segmented images, performs frame differencing on the dynamic images and, after skeletonization, computes the motion features they contain, including finger flexion and palm-center angle; these are compared against the pre-stored index database to determine the action success rate. Coefficients such as acceleration and direction change are computed for parabolic motion detection; if they exceed the statistical test mean, the throw is judged successful. Interactive rendering control and multi-user interaction synchronization control complete the multi-user script control functions of the system and guarantee that the timing of the special-shaped screens and the multi-user interaction is realized correctly. The multi-projector control of the real-time multi-special-shaped-screen interactive system (6) completes projector image fusion and brightness adjustment; the CG communication platform completes the communication between the AR system and the special-shaped-screen system, adopting frame-traffic-controlled timing.
The interaction process between the real-time human-computer interaction action recognition and real-time parabolic animation generation algorithm and system (5) and the real-time multi-special-shaped-screen interactive system (6) is shown in Fig. 3: 7. the interaction start time arrives; 8. the equipment self-check passes; 9. the equipment calibration program for the particular user completes the customized height and sensitivity; 10. the attention program is called to fine-tune the calibration of step 9; 11. the first synchronization of the interaction process is performed and the synchronization flag is set; 12. the object models to be rendered are loaded into memory and the object type flags are set; 13. motion-detection loop; 14. information is generated dynamically; 15. grasping recognition system scope; 16. combined-pose display and tracking; 17. the parabolic (throw) motion preparation stage is entered; 18. multi-user progress is synchronized; 19. the multi-projection special-shaped-screen system responds; 20. parabolic recognition system scope; 21. HMD registration and fusion; 22. projection-system registration and fusion; 23. synchronization control of the HMD and the multiple projections; 24. multiple virtual-object loop; 25. synchronization control ends; 26. the projection-system interaction response is displayed; 27. the synchronization end procedure.
Referring to Fig. 4, the projection system is structured so that adjacent projections have a 3% overlap. Projectors 1 and 2 are used to display the virtual scene of the real-time human-computer interaction action recognition system; projector 2 also serves as the transition-scene display to the large screen. Projectors 5 and 6 are the main-screen projectors. Projector synchronization is completed by the cross-control program. The user's visual window size is 4 x 1.5 m.
Referring to Fig. 5, Fig. 5 describes the installation positions of the AR augmented display system. Several cameras are installed on the user's left and right sides as required, and a chair of fixed height is used to arrange the user's position of use.
Referring to Fig. 6, the AR rendering platform structure of the present invention first collects data from the external devices, including the data glove and the magnetic tracker; these data receive the necessary processing, including filtering of the magnetic tracking data, interpolation and coordinate changes, and on this basis physical simulation, motion capture, collision detection and so on. The scene is then rendered, in the order camera, scenery, illumination, animation prompting and so on. Finally comes the network communication thread, which runs in parallel with the rendering main thread.

Claims (2)

1. A real-time human-computer interaction method based on augmented virtual reality and a special-shaped screen, characterized in that the operation steps are as follows:
A. model the virtual objects to be interacted with, and model a statistically averaged hand model;
B. capture video of the user's hand with N cameras to obtain original images;
C. process in the computer, in the following steps:
a. realize the real-time optical see-through augmented display: the captured hand images are segmented, reconstructed in three dimensions and transformed into the same coordinate system as the virtual objects, and collision detection is performed; a bounding-box method is used first for coarse detection, and if a collision exists, fine detection follows using the artificial immune system method, so that rapid virtual-real collision detection is achieved; the virtual model after collision is deformed and, after registration, displayed on the optical see-through head-mounted display;
b. real-time virtual-real grasping, parabolic motion detection and real-time parabolic animation generation: the captured images are segmented by skin-color statistics, the segmented images are skeletonized, and a pre-built grasping classification model is used to screen them and determine the action type; once grasping succeeds, a frame-difference algorithm computes coefficients such as acceleration and direction change for parabolic motion detection, and if they exceed the statistical test mean the throw is judged successful; a corresponding parabolic animation is then generated in real time;
c. real-time multi-screen interaction: the generated real-time parabolic animation is handled across screens, and image fusion is performed when the object flies out of one projection screen onto another; distributed pre-processing is implemented on the network platform.
2. A system for the real-time human-computer interaction method based on augmented virtual reality and a special-shaped screen according to claim 1, comprising a real-time optical see-through augmented display system (4), a real-time virtual-real grasping, parabolic-motion-detection and real-time parabolic animation generation algorithm and system (5), and a real-time multi-special-shaped-screen interactive system (6), connected with network cables through a network switch, characterized in that:
1) the real-time optical see-through augmented display system (4) consists of two independent, overlapping sub-systems; each set consists of one OLED optical see-through head-mounted display, two cameras and one processing PC, and each set uses its processing PC to complete the processing of the augmented optical see-through display method;
2) the real-time virtual-real grasping, parabolic-motion-detection and real-time parabolic animation generation system (5) is a multi-sensor system fusing vision and magnetic devices, composed of a magnetic tracking device together with the real-time optical see-through augmented display system (4); magnetic devices are added on the basis of system (4) and their information is fused to realize virtual-real grasping and throw-action processing;
3) the real-time multi-special-shaped-screen interactive system (6) adds 5 PCs and 8 projectors on the basis of the real-time virtual-real grasping, parabolic-motion-detection and real-time parabolic animation generation system (5) to realize multi-screen interaction; every two projectors are rendered and controlled by one PC, and one additional PC handles timing, script control and audio; communication among the PCs of systems (4), (5) and (6) is completed by a LAN switch.
CN200910047360A 2009-03-11 2009-03-11 Real time human-machine interaction method and system based on augmented virtual reality and anomalous screen Pending CN101539804A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN200910047360A CN101539804A (en) 2009-03-11 2009-03-11 Real time human-machine interaction method and system based on augmented virtual reality and anomalous screen

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN200910047360A CN101539804A (en) 2009-03-11 2009-03-11 Real time human-machine interaction method and system based on augmented virtual reality and anomalous screen

Publications (1)

Publication Number Publication Date
CN101539804A true CN101539804A (en) 2009-09-23

Family

ID=41123015

Family Applications (1)

Application Number Title Priority Date Filing Date
CN200910047360A Pending CN101539804A (en) 2009-03-11 2009-03-11 Real time human-machine interaction method and system based on augmented virtual reality and anomalous screen

Country Status (1)

Country Link
CN (1) CN101539804A (en)

Cited By (50)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101990346A (en) * 2010-08-24 2011-03-23 北京水晶石数字科技有限公司 System for controlling light by three-dimensional software
CN101990345A (en) * 2010-08-24 2011-03-23 北京水晶石数字科技有限公司 Method for controlling lighting through three-dimensional software
CN102184342A (en) * 2011-06-15 2011-09-14 青岛科技大学 Virtual-real fused hand function rehabilitation training system and method
CN102222333A (en) * 2011-05-20 2011-10-19 同济大学 Method and device of mobile augmented reality of underground engineering based on mixed registration
CN102375325A (en) * 2010-08-10 2012-03-14 西安费斯达自动化工程有限公司 True three-dimensional simulation angle description and direct projection display method
CN102385761A (en) * 2010-09-03 2012-03-21 中国航天员科研训练中心 Logical reasoning-based rapid collision detection method in virtual operation simulation
CN102622850A (en) * 2011-01-28 2012-08-01 索尼公司 Information processing device, alarm method, and program
CN103034550A (en) * 2012-12-07 2013-04-10 上海电机学院 Virtual-real interaction collision detection system and method based on artificial immune system
CN103314344A (en) * 2010-12-10 2013-09-18 索尼爱立信移动通讯有限公司 Touch sensitive haptic display
CN103479138A (en) * 2013-08-08 2014-01-01 罗轶 Interactive virtual reality car show platform
CN104011788A (en) * 2011-10-28 2014-08-27 奇跃公司 System And Method For Augmented And Virtual Reality
CN104035760A (en) * 2014-03-04 2014-09-10 苏州天魂网络科技有限公司 System capable of realizing immersive virtual reality over mobile platforms
CN104408760A (en) * 2014-10-28 2015-03-11 燕山大学 Binocular-vision-based high-precision virtual assembling system algorithm
CN104423627A (en) * 2013-09-02 2015-03-18 联想(北京)有限公司 Information processing method and electronic equipment
CN104679222A (en) * 2013-11-26 2015-06-03 深圳先进技术研究院 Medical office system based on human-computer interaction, medical information sharing system and method
CN104777907A (en) * 2015-04-17 2015-07-15 中国科学院计算技术研究所 System for group human-computer interaction
CN105046710A (en) * 2015-07-23 2015-11-11 北京林业大学 Depth image partitioning and agent geometry based virtual and real collision interaction method and apparatus
CN105518574A (en) * 2013-07-31 2016-04-20 微软技术许可有限责任公司 Mixed reality graduated information delivery
CN105519104A (en) * 2013-09-03 2016-04-20 Cjcgv株式会社 Simulated-image management system and method for providing simulated image of multi-projection system
CN105938541A (en) * 2015-03-02 2016-09-14 卡雷风险投资有限责任公司 System and method for enhancing live performances with digital content
CN106019592A (en) * 2016-07-15 2016-10-12 中国人民解放军63908部队 Augmented reality optical transmission-type helmet mounted display pre-circuit and control method thereof
CN106062862A (en) * 2014-10-24 2016-10-26 何安莉 System and method for immersive and interactive multimedia generation
CN106110627A (en) * 2016-06-20 2016-11-16 曲大方 Physical culture and Wushu action correction equipment and method
CN106375751A (en) * 2016-08-30 2017-02-01 广州视爱电子科技有限公司 Method for efficiently screening and organizing high-level exhibition
CN106373142A (en) * 2016-12-07 2017-02-01 西安蒜泥电子科技有限责任公司 Virtual character on-site interaction performance system and method
CN106445137A (en) * 2016-09-21 2017-02-22 上海电机学院 Augmented reality system
CN103955267B (en) * 2013-11-13 2017-03-15 上海大学 Both hands man-machine interaction method in x ray fluoroscopy x augmented reality system
CN106648071A (en) * 2016-11-21 2017-05-10 捷开通讯科技(上海)有限公司 Social implementation system for virtual reality
CN106648263A (en) * 2016-11-11 2017-05-10 珠海格力电器股份有限公司 Terminal equipment and control system, control method and device thereof
CN106851253A (en) * 2017-01-23 2017-06-13 合肥安达创展科技股份有限公司 Stereo image system is built based on model of place and full-length special-shaped intelligent connecting technology
CN106980378A (en) * 2017-03-29 2017-07-25 联想(北京)有限公司 Virtual display methods and system
CN107844196A (en) * 2012-06-29 2018-03-27 索尼电脑娱乐公司 Video processing equipment, method for processing video frequency and processing system for video
CN107885311A (en) * 2016-09-29 2018-04-06 深圳纬目信息技术有限公司 A kind of confirmation method of visual interactive, system and equipment
CN108010128A (en) * 2017-11-23 2018-05-08 华中科技大学 The analysis method of infrared detection auxiliary welding intelligent helmet device based on AR
CN108305316A (en) * 2018-03-08 2018-07-20 网易(杭州)网络有限公司 Rendering intent, device, medium based on AR scenes and computing device
CN108428376A (en) * 2017-02-15 2018-08-21 安徽大学 A kind of vision immersion stereoprojection teaching method
CN108969864A (en) * 2018-06-06 2018-12-11 中国人民解放军第四军医大学 Depression recovery therapeutic equipment and its application method based on VR technology
CN108986232A (en) * 2018-07-27 2018-12-11 广州汉智网络科技有限公司 A method of it is shown in VR and AR environment picture is presented in equipment
CN109033535A (en) * 2018-06-29 2018-12-18 中国航空规划设计研究总院有限公司 A kind of Design of Production Line visualization system based on VR technology
WO2019034142A1 (en) * 2017-08-17 2019-02-21 腾讯科技(深圳)有限公司 Three-dimensional image display method and device, terminal, and storage medium
CN109690448A (en) * 2016-08-22 2019-04-26 姜头焕 Virtual reality amusement equipment control method and system
TWI659279B (en) * 2018-02-02 2019-05-11 國立清華大學 Process planning apparatus based on augmented reality
CN109801379A (en) * 2019-01-21 2019-05-24 视辰信息科技(上海)有限公司 General augmented reality glasses and its scaling method
CN110865708A (en) * 2019-11-14 2020-03-06 杭州网易云音乐科技有限公司 Interaction method, medium, device and computing equipment of virtual content carrier
CN111028603A (en) * 2019-12-27 2020-04-17 广东电网有限责任公司培训与评价中心 Live-line work training method and system for transformer substation based on dynamic capture and virtual reality
CN112268506A (en) * 2020-08-31 2021-01-26 中国航发南方工业有限公司 Data splicing method based on non-contact optical scanning detection
US11003917B2 (en) 2013-10-17 2021-05-11 Drägerwerk AG & Co. KGaA Method for monitoring a patient within a medical monitoring area
CN112862976A (en) * 2019-11-12 2021-05-28 北京超图软件股份有限公司 Image generation method and device and electronic equipment
CN115100666A (en) * 2022-05-18 2022-09-23 东北大学 AR conference system based on significance detection and super-resolution reconstruction and construction method
CN115202485A (en) * 2022-09-15 2022-10-18 深圳飞蝶虚拟现实科技有限公司 XR (X-ray fluorescence) technology-based gesture synchronous interactive exhibition hall display system

Cited By (74)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102375325A (en) * 2010-08-10 2012-03-14 西安费斯达自动化工程有限公司 True three-dimensional simulation angle description and direct projection display method
CN102375325B (en) * 2010-08-10 2014-11-12 西安费斯达自动化工程有限公司 True three-dimensional simulation angle description and direct projection display method
CN101990345A (en) * 2010-08-24 2011-03-23 北京水晶石数字科技有限公司 Method for controlling lighting through three-dimensional software
CN101990346A (en) * 2010-08-24 2011-03-23 北京水晶石数字科技有限公司 System for controlling light by three-dimensional software
CN102385761A (en) * 2010-09-03 2012-03-21 中国航天员科研训练中心 Logical reasoning-based rapid collision detection method in virtual operation simulation
CN102385761B (en) * 2010-09-03 2013-12-25 中国航天员科研训练中心 Logical reasoning-based rapid collision detection method in virtual operation simulation
CN103314344A (en) * 2010-12-10 2013-09-18 索尼爱立信移动通讯有限公司 Touch sensitive haptic display
CN103314344B (en) * 2010-12-10 2015-11-25 索尼爱立信移动通讯有限公司 Touch sensitive haptic display
CN102622850A (en) * 2011-01-28 2012-08-01 索尼公司 Information processing device, alarm method, and program
CN102222333B (en) * 2011-05-20 2013-01-02 同济大学 Method and device of mobile augmented reality of underground engineering based on mixed registration
CN102222333A (en) * 2011-05-20 2011-10-19 同济大学 Method and device of mobile augmented reality of underground engineering based on mixed registration
CN102184342B (en) * 2011-06-15 2013-11-20 青岛科技大学 Virtual-real fused hand function rehabilitation training system and method
CN102184342A (en) * 2011-06-15 2011-09-14 青岛科技大学 Virtual-real fused hand function rehabilitation training system and method
CN104011788A (en) * 2011-10-28 2014-08-27 奇跃公司 System And Method For Augmented And Virtual Reality
CN107844196B (en) * 2012-06-29 2020-12-01 索尼电脑娱乐公司 Video processing apparatus, video processing method, and video processing system
CN107844196A (en) * 2012-06-29 2018-03-27 索尼电脑娱乐公司 Video processing equipment, method for processing video frequency and processing system for video
CN103034550A (en) * 2012-12-07 2013-04-10 上海电机学院 Virtual-real interaction collision detection system and method based on artificial immune system
CN105518574B (en) * 2013-07-31 2018-10-16 微软技术许可有限责任公司 Method and system for the delivering of mixed reality rating information
CN105518574A (en) * 2013-07-31 2016-04-20 微软技术许可有限责任公司 Mixed reality graduated information delivery
CN103479138A (en) * 2013-08-08 2014-01-01 罗轶 Interactive virtual reality car show platform
CN104423627A (en) * 2013-09-02 2015-03-18 联想(北京)有限公司 Information processing method and electronic equipment
CN105519104B (en) * 2013-09-03 2017-09-01 Cj Cgv 株式会社 Analog image management system and for the method for the analog image for providing many optical projection systems
CN105519104A (en) * 2013-09-03 2016-04-20 Cjcgv株式会社 Simulated-image management system and method for providing simulated image of multi-projection system
US11003917B2 (en) 2013-10-17 2021-05-11 Drägerwerk AG & Co. KGaA Method for monitoring a patient within a medical monitoring area
CN103955267B (en) * 2013-11-13 2017-03-15 上海大学 Both hands man-machine interaction method in x ray fluoroscopy x augmented reality system
CN104679222B (en) * 2013-11-26 2018-02-06 深圳先进技术研究院 Medical office system, medical information sharing system and method based on man-machine interaction
CN104679222A (en) * 2013-11-26 2015-06-03 深圳先进技术研究院 Medical office system based on human-computer interaction, medical information sharing system and method
CN104035760A (en) * 2014-03-04 2014-09-10 苏州天魂网络科技有限公司 System capable of realizing immersive virtual reality over mobile platforms
CN106062862B (en) * 2014-10-24 2020-04-21 杭州凌感科技有限公司 System and method for immersive and interactive multimedia generation
CN106062862A (en) * 2014-10-24 2016-10-26 何安莉 System and method for immersive and interactive multimedia generation
CN104408760B (en) * 2014-10-28 2017-12-29 燕山大学 A kind of high-precision virtual assembly system algorithm based on binocular vision
CN104408760A (en) * 2014-10-28 2015-03-11 燕山大学 Binocular-vision-based high-precision virtual assembling system algorithm
CN105938541A (en) * 2015-03-02 2016-09-14 卡雷风险投资有限责任公司 System and method for enhancing live performances with digital content
CN104777907A (en) * 2015-04-17 2015-07-15 中国科学院计算技术研究所 System for group human-computer interaction
CN104777907B (en) * 2015-04-17 2018-05-25 中国科学院计算技术研究所 A kind of system for group's human-computer interaction
CN105046710A (en) * 2015-07-23 2015-11-11 北京林业大学 Depth image partitioning and agent geometry based virtual and real collision interaction method and apparatus
CN106110627A (en) * 2016-06-20 2016-11-16 曲大方 Physical culture and Wushu action correction equipment and method
CN106019592A (en) * 2016-07-15 2016-10-12 中国人民解放军63908部队 Augmented reality optical transmission-type helmet mounted display pre-circuit and control method thereof
CN109690448A (en) * 2016-08-22 2019-04-26 姜头焕 Virtual reality amusement equipment control method and system
CN106375751A (en) * 2016-08-30 2017-02-01 广州视爱电子科技有限公司 Method for efficiently screening and organizing high-level exhibition
CN106445137A (en) * 2016-09-21 2017-02-22 上海电机学院 Augmented reality system
CN107885311A (en) * 2016-09-29 2018-04-06 深圳纬目信息技术有限公司 A kind of confirmation method of visual interactive, system and equipment
CN106648263A (en) * 2016-11-11 2017-05-10 珠海格力电器股份有限公司 Terminal equipment and control system, control method and device thereof
CN106648071B (en) * 2016-11-21 2019-08-20 捷开通讯科技(上海)有限公司 System is realized in virtual reality social activity
CN106648071A (en) * 2016-11-21 2017-05-10 捷开通讯科技(上海)有限公司 Social implementation system for virtual reality
CN106373142A (en) * 2016-12-07 2017-02-01 西安蒜泥电子科技有限责任公司 Virtual character on-site interaction performance system and method
CN106851253A (en) * 2017-01-23 2017-06-13 合肥安达创展科技股份有限公司 Stereo image system is built based on model of place and full-length special-shaped intelligent connecting technology
CN108428376A (en) * 2017-02-15 2018-08-21 安徽大学 A kind of vision immersion stereoprojection teaching method
CN106980378B (en) * 2017-03-29 2021-05-18 联想(北京)有限公司 Virtual display method and system
CN106980378A (en) * 2017-03-29 2017-07-25 联想(北京)有限公司 Virtual display methods and system
US10854017B2 (en) 2017-08-17 2020-12-01 Tencent Technology (Shenzhen) Company Limited Three-dimensional virtual image display method and apparatus, terminal, and storage medium
WO2019034142A1 (en) * 2017-08-17 2019-02-21 腾讯科技(深圳)有限公司 Three-dimensional image display method and device, terminal, and storage medium
CN109427083A (en) * 2017-08-17 2019-03-05 腾讯科技(深圳)有限公司 Display methods, device, terminal and the storage medium of three-dimensional avatars
CN109427083B (en) * 2017-08-17 2022-02-01 腾讯科技(深圳)有限公司 Method, device, terminal and storage medium for displaying three-dimensional virtual image
CN108010128A (en) * 2017-11-23 2018-05-08 华中科技大学 The analysis method of infrared detection auxiliary welding intelligent helmet device based on AR
TWI659279B (en) * 2018-02-02 2019-05-11 國立清華大學 Process planning apparatus based on augmented reality
US10606241B2 (en) 2018-02-02 2020-03-31 National Tsing Hua University Process planning apparatus based on augmented reality
CN108305316A (en) * 2018-03-08 2018-07-20 网易(杭州)网络有限公司 Rendering intent, device, medium based on AR scenes and computing device
CN108969864A (en) * 2018-06-06 2018-12-11 中国人民解放军第四军医大学 Depression recovery therapeutic equipment and its application method based on VR technology
CN109033535A (en) * 2018-06-29 2018-12-18 中国航空规划设计研究总院有限公司 A kind of Design of Production Line visualization system based on VR technology
CN108986232A (en) * 2018-07-27 2018-12-11 广州汉智网络科技有限公司 A method of it is shown in VR and AR environment picture is presented in equipment
CN108986232B (en) * 2018-07-27 2023-11-10 江苏洪旭德生科技发展集团有限公司 Method for presenting AR environment picture in VR display device
CN109801379B (en) * 2019-01-21 2023-02-17 视辰信息科技(上海)有限公司 Universal augmented reality glasses and calibration method thereof
CN109801379A (en) * 2019-01-21 2019-05-24 视辰信息科技(上海)有限公司 General augmented reality glasses and its scaling method
CN112862976A (en) * 2019-11-12 2021-05-28 北京超图软件股份有限公司 Image generation method and device and electronic equipment
CN112862976B (en) * 2019-11-12 2023-09-08 北京超图软件股份有限公司 Data processing method and device and electronic equipment
CN110865708B (en) * 2019-11-14 2024-03-15 杭州网易云音乐科技有限公司 Interaction method, medium, device and computing equipment of virtual content carrier
CN110865708A (en) * 2019-11-14 2020-03-06 杭州网易云音乐科技有限公司 Interaction method, medium, device and computing equipment of virtual content carrier
CN111028603A (en) * 2019-12-27 2020-04-17 广东电网有限责任公司培训与评价中心 Live-line work training method and system for transformer substation based on dynamic capture and virtual reality
CN112268506B (en) * 2020-08-31 2022-06-07 中国航发南方工业有限公司 Data splicing method based on non-contact optical scanning detection
CN112268506A (en) * 2020-08-31 2021-01-26 中国航发南方工业有限公司 Data splicing method based on non-contact optical scanning detection
CN115100666A (en) * 2022-05-18 2022-09-23 东北大学 AR conference system based on significance detection and super-resolution reconstruction and construction method
CN115202485B (en) * 2022-09-15 2023-01-06 深圳飞蝶虚拟现实科技有限公司 XR (X-ray fluorescence) technology-based gesture synchronous interactive exhibition hall display system
CN115202485A (en) * 2022-09-15 2022-10-18 深圳飞蝶虚拟现实科技有限公司 XR (X-ray fluorescence) technology-based gesture synchronous interactive exhibition hall display system

Similar Documents

Publication Publication Date Title
CN101539804A (en) Real time human-machine interaction method and system based on augmented virtual reality and anomalous screen
CN103472909B (en) Realistic occlusion for a head mounted augmented reality display
US10474336B2 (en) Providing a user experience with virtual reality content and user-selected, real world objects
CN102540464B (en) Head-mounted display device which provides surround video
CN103793060B (en) A kind of user interactive system and method
KR101698847B1 (en) System and method for combining data from multiple depth cameras
Azuma Overview of augmented reality
Klinker et al. Fata morgana-a presentation system for product design
US20170372449A1 (en) Smart capturing of whiteboard contents for remote conferencing
CN107507243A (en) A kind of camera parameters method of adjustment, instructor in broadcasting's video camera and system
CN106648098B (en) AR projection method and system for user-defined scene
CN102945564A (en) True 3D modeling system and method based on video perspective type augmented reality
CN102253711A (en) Enhancing presentations using depth sensing cameras
US20200209951A1 (en) Information processing system, information processing method, and program
KR20110107692A (en) Spatial multi interaction-based 3d stereo interactive vision system and method of the same
CN111897431A (en) Display method and device, display equipment and computer readable storage medium
EP3172721B1 (en) Method and system for augmenting television watching experience
US9329679B1 (en) Projection system with multi-surface projection screen
US20230179756A1 (en) Information processing device, information processing method, and program
CN110554556B (en) Method and system for spatial holographic interactive control of multiple screens
CN112684893A (en) Information display method and device, electronic equipment and storage medium
CN108346183B (en) Method and system for AR reference positioning
CN109389538A (en) A kind of Intelligent campus management system based on AR technology
Zoellner et al. Reality Filtering: A Visual Time Machine in Augmented Reality.
CN113066189B (en) Augmented reality equipment and virtual and real object shielding display method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Open date: 20090923