CN101510074B - High present sensation intelligent perception interactive motor system and implementing method - Google Patents
High present sensation intelligent perception interactive motor system and implementing method
- Publication number
- CN101510074B CN101510074B CN2009100738293A CN200910073829A CN101510074B CN 101510074 B CN101510074 B CN 101510074B CN 2009100738293 A CN2009100738293 A CN 2009100738293A CN 200910073829 A CN200910073829 A CN 200910073829A CN 101510074 B CN101510074 B CN 101510074B
- Authority
- CN
- China
- Prior art keywords
- motion
- computer
- scene
- subsystem
- platform
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Landscapes
- Processing Or Creating Images (AREA)
Abstract
The invention relates to a high-telepresence intelligent-perception interactive motion system and an implementation method thereof. The system consists of a motion platform subsystem, a virtual reality subsystem and a projection display subsystem. The motion platform subsystem consists of an N-degree-of-freedom parallel kinematic platform, servo motor drivers, a motion controller and sensors; the virtual reality subsystem consists of a motion control end computer, a network service end computer and scene rendering end computers; the projection display subsystem consists of an edge blending device, projectors, audio equipment and a circular screen. The beneficial effects of the invention are that an immersive virtual reality environment is provided for the motion platform while, at the same time, the virtual reality environment feeds control information back to the motion platform to realize interaction between the motion and the environment, enhancing the sense of presence and giving participants a strong sense of immersion, thereby improving the fun of exercise and achieving a fitness effect.
Description
Technical field
The present invention relates to a high-telepresence intelligent-perception interactive motion system and an implementation method thereof, and belongs to the technical fields of robot applications and distributed virtual reality.
Background technology
Chinese patent "Non-running type multiple-dimension motion imitating exercising horse" (patent No. 200610048379.9) discloses a six-degree-of-freedom bionic robot horse. The patent adopts a 3-2-1 type six-degree-of-freedom structure: an intermediate support, a right-front support and a right-back support are mounted on the base; three linear drive branches are mounted on the intermediate support, two linear drive branches on the right-front support, and one linear drive branch on the right-back support. Driven by the six linear drive branches, the model can move in multidimensional space and realistically simulate the motion states of a horse's back during movement.
Panasonic Minas A4 series AC servo drivers and matching motors are installed on all six linear drive branches.
The control system used with the above patent, "Control System Design of a Novel Bionic Robot Horse" (published in the Proceedings of the 19th Chinese Process Control Conference, vol. 1, pp. 1211-1214, September 20, 2008), adopts the PCI-7356 motion control card of National Instruments (NI, USA) as the motion controller and, combined with a novel orthogonal six-degree-of-freedom parallel mechanism and exploiting the simplicity and ease of use of LabVIEW, implements a convenient, multifunctional bionic robot horse control system.
Although the above motion platform and control system can simulate the walking and running gaits of a horse, and participants can adjust the motion mode to their own needs to obtain an exercise effect, the motion platform is not provided with a corresponding virtual motion environment and no feedback control of the motion platform is performed. Considering that most fitness activities take place indoors, participants lack an immersive, on-the-spot experience and derive less enjoyment from the exercise.
Summary of the invention
The technical problem to be solved by the present invention is to overcome the above shortcomings of the prior art and to provide a high-telepresence intelligent-perception interactive motion system, and an implementation method thereof, that gives participants an immersive, on-the-spot experience, improves the fun of exercise and achieves a fitness effect.
The technical solution adopted by the present invention to solve the technical problem is as follows:
Technical scheme one:
The system of the present invention consists of a motion platform subsystem, a virtual reality subsystem and a projection display subsystem. The motion platform subsystem consists of an N-degree-of-freedom parallel kinematic platform, servo motor drivers, a motion controller and sensors; the output of the motion controller is connected to the input of the servo motor drivers, the output of the servo motor drivers is connected to the input of the N-degree-of-freedom parallel kinematic platform, and the sensors consist of a speed sensor and a direction sensor;
The virtual reality subsystem consists of a motion control end computer, a network service end computer and scene rendering end computers; the output of the motion control end computer is connected to the input of the motion controller, the corresponding inputs of the motion control end computer are connected to the outputs of the sensors, and the network service end computer is connected bidirectionally to the motion control end computer and to the scene rendering end computers;
The projection display subsystem consists of an edge blending device, projectors, audio equipment and a circular screen; the input of the edge blending device is connected to the output of the scene rendering end computer, the three outputs of the edge blending device are connected to the inputs of the respective projectors, the output of the projectors is projected onto the circular screen, and the audio output of the scene rendering end computer is connected to the corresponding audio equipment.
Technical scheme two:
The technical scheme of the method of the invention is as follows:
(1) Implementation method of the control system of the motion platform subsystem:
The control system of the motion platform subsystem is developed with the Motion motion-control module under the graphical programming language LabVIEW (the Motion module is not part of the motion controller; it is a component provided by the LabVIEW software tool, in the same way that a word processor provides a drawing module; LabVIEW is a commonly used control software tool), and adopts a layered architecture;
At first, the motion trajectories of the motion platform are obtained: the motion of a real horse is captured as images or video in three gaits (walk, trot and gallop); three typical motion trajectories are then derived by image calibration and feature-point extraction; the host computer writes the three typical trajectories into the Buffer of the motion controller; the LabVIEW Notifier synchronous communication technique keeps the motion control of the motion platform and its timing synchronized; and the Breakpoints high-speed capture function of the motion controller monitors the feed of each axis in real time and controls the motion platform to complete the motion process;
(2) Implementation method of the control part of the virtual reality subsystem:
The control part of the virtual reality subsystem is built on an open-source graphics rendering engine and integrates cooperative interaction technology in a distributed architecture comprising a motion control end, a network service end and a scene rendering end:
The network service end is based on the network engine RAKNET; it defines XML-based dynamic data structures and the communication protocols between the layers, and uses IDENTIFY as the interconnection identifier to complete signal transmission between multiple motion control ends and multiple scene rendering ends, so as to achieve real-time and accurate communication. On the one hand the network service end receives the control information of a motion control end and forwards it to the scene rendering ends that carry the same interconnection identifier; on the other hand it feeds the real-time information of the scene rendering end back to the motion control end;
The motion control end is connected to the motion platform through a serial port and reads the serial data in real time in an interrupt-triggered manner; it converts the serial information into operation information for scene rendering, such as speed, direction and position, and sends it to the network service end. The motion control end also receives the scene feedback information sent by the scene rendering end through the network service end, writes it into the motion controller through the serial port, and completes the adjustment of the motion mode of the motion platform;
The scene rendering end uses the open-source object-oriented graphics rendering engine OGRE and the three-dimensional graphics tool 3DMAX to develop a highly immersive virtual reality scene, and introduces the physics engine PhysX to realize physical effects and enhance scene realism;
(3) Implementation method of the projection display subsystem:
The projection display subsystem adopts a three-channel circular-screen projection method and uses the edge blending device to realize projection display with a stereoscopic visual effect.
The present invention can set up a corresponding virtual reality scene according to the features and motion state of the motion platform. During motion, the motion state of the platform, such as speed and direction, is transmitted to the virtual environment in real time so that the environment and the motion platform change synchronously; meanwhile, feedback quantities in the environment, such as collisions and terrain changes, are transmitted to the motion platform in real time to adjust its motion state, giving the participant an immersive, on-the-spot psychological experience.
The present invention also provides a distributed, multi-interaction virtual reality subsystem that accomplishes intelligent perception and interaction between the controlled devices and the different objects in the same virtual environment.
The beneficial effects of the invention are that, by introducing virtual reality technology, an immersive virtual reality environment is provided for the motion platform; at the same time the virtual reality environment feeds control information back to the motion platform to realize interaction between the motion and the environment, enhancing telepresence and giving participants a strong sense of immersion, thereby improving the fun of exercise and achieving a fitness effect.
Description of drawings
Fig. 1 is a functional block diagram of the system of the present invention;
Fig. 2 is a schematic diagram of the hardware configuration of the virtual reality subsystem;
Fig. 3 is a schematic diagram of the hardware configuration of the projection display subsystem;
Fig. 4 is a flowchart of the motion control of the motion platform subsystem;
Fig. 5 is a software structure diagram of the virtual reality subsystem;
Fig. 6 is a flowchart of virtual scene rendering;
Fig. 7 is a flowchart of collision detection;
Fig. 8 is a flowchart of interactive control.
In Figs. 2-3: 1 motion control end computer, 2 network service end computer, 3 scene rendering end computer, 4 multimode optical fiber local area network, 5 edge blending device, 6 projector, 7 audio equipment, 8 circular screen.
Embodiment
1. An embodiment of the system of the present invention is as follows (see Figs. 1-3):
The present embodiment consists of a motion platform subsystem, a virtual reality subsystem and a projection display subsystem. The motion platform subsystem consists of an N-degree-of-freedom parallel kinematic platform, servo motor drivers, a motion controller and sensors; the output of the motion controller is connected to the input of the servo motor drivers, the output of the servo motor drivers is connected to the input of the N-degree-of-freedom parallel kinematic platform, and the sensors consist of a speed sensor and a direction sensor;
The virtual reality subsystem consists of a motion control end computer, a network service end computer and scene rendering end computers; the output of the motion control end computer is connected to the input of the motion controller, the corresponding inputs of the motion control end computer are connected to the outputs of the sensors, and the network service end computer is connected bidirectionally to the motion control end computer and to the scene rendering end computers;
The projection display subsystem consists of an edge blending device, projectors, audio equipment and a circular screen; the input of the edge blending device is connected to the output of the scene rendering end computer, the three outputs of the edge blending device are connected to the inputs of the respective projectors, the output of the projectors is projected onto the circular screen, and the audio output of the scene rendering end computer is connected to the corresponding audio equipment.
The N-degree-of-freedom parallel kinematic platform adopts a six-degree-of-freedom parallel robot horse (see ZL200610048379.9 in the Background section above).
The N-degree-of-freedom parallel kinematic platform may also be a treadmill, an exercise bicycle, or the like.
The motion controller adopts the PCI-7356 motion control card of National Instruments (USA); its numerical control system is an open distributed control structure of "host computer + motion control card".
The model of the servo motor driver is MADDT1205.
The model of the speed sensor is Hall JK80020, and the speed sensor is mounted on the motion platform at the lower front of the body of the six-degree-of-freedom parallel robot horse; the direction sensor adopts a rotary torque sensor, model ORT-8036, and is mounted at the head of the six-degree-of-freedom parallel robot horse.
The scene rendering end computer serves as a graphics workstation and has to process a large amount of graphics information, so a relatively high memory configuration is required. Taking a common HP graphics workstation as an example:
CPU frequency: 2000 MHz
Processor type: AMD Opteron 2212
CPU: dual-core, 64-bit computing, providing outstanding computing and visualization capability
Full-duplex PCIe x16 ports: support for 2 high-end graphics cards, usable for 4 3D graphics displays
Graphics card model: NVIDIA Quadro FX1500
Graphics chip: Quadro FX1500
Video memory: 256 MB
Hard disk capacity: 160 GB.
The hardware of the projection display subsystem is as follows:
Circular screen: metal screen, 210° arc; height 2200 mm; diameter 6000 mm
Projectors: 3, brightness 4000 lumens; resolution 1280 × 768; lens 1:1.8
Edge blending device: for VGA image signals from the computer, an external hardware blending device must be used. Blending device model: Anheng AH640
Audio equipment: a BOSE AM15 5.1-channel speaker system.
2. An embodiment of the method of the present invention is as follows (see Figs. 1-8):
(1) Implementation method of the control system of the motion platform subsystem (see Fig. 4):
The control system of the motion platform subsystem is developed with the Motion motion-control module under the graphical programming language LabVIEW and adopts a layered architecture (see "Control System Design of a Novel Bionic Robot Horse" in the Background section above);
At first, the motion trajectories of the motion platform are obtained: the motion of a real horse is captured as images or video in three gaits (walk, trot and gallop); three typical motion trajectories are then derived by image calibration and feature-point extraction; the host computer writes the three typical trajectories into the Buffer of the motion controller; the LabVIEW Notifier synchronous communication technique keeps the high-precision motion control of the motion platform and its timing synchronized; and the Breakpoints high-speed capture function of the motion controller monitors the feed of each axis in real time and controls the motion platform to complete the motion process;
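Because the actual implementation is a graphical LabVIEW program built on the NI Motion module, Notifier synchronization and the breakpoint capture of the PCI-7356, it cannot be reproduced as text here; the following C++ sketch only mirrors the control flow described above, and every name in it (TrajectoryPoint, ControllerBuffer, runGait, etc.) is a hypothetical placeholder rather than part of the NI API.

```cpp
#include <array>
#include <cstddef>
#include <vector>

// Hypothetical stand-ins for the motion controller's Buffer and breakpoint capture.
struct TrajectoryPoint { double axisTarget[6]; };          // one pose of the 6-DOF platform
using Trajectory = std::vector<TrajectoryPoint>;

struct ControllerBuffer {
    Trajectory points;
    std::size_t fed = 0;
    void load(const Trajectory& t) { points = t; fed = 0; }    // host writes a calibrated trajectory
    bool breakpointReached() const { return true; }            // placeholder for breakpoint capture
    bool feedNext() {                                          // advance the monitored feed by one point
        if (fed >= points.size()) return false;
        ++fed;
        return true;
    }
};

enum Gait { WALK = 0, TROT = 1, GALLOP = 2 };

// Host side: pick one of the three calibrated trajectories and step through it,
// waiting on the breakpoint signal so motion and timing stay synchronized.
void runGait(ControllerBuffer& card, const std::array<Trajectory, 3>& typical, Gait g) {
    card.load(typical[g]);
    while (true) {
        while (!card.breakpointReached()) { /* wait for the capture event */ }
        if (!card.feedNext()) break;
    }
}
```

The three calibrated trajectories are loaded once and stepped through point by point, with the breakpoint check standing in for the synchronization that the Notifier provides in the real system.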
(2) Implementation method of the control part of the virtual reality subsystem (see Fig. 5):
The control part of the virtual reality subsystem is built on an open-source graphics rendering engine and integrates cooperative interaction technology in a distributed architecture comprising a motion control end, a network service end and a scene rendering end:
The network service end is based on the network engine RAKNET; it defines XML-based dynamic data structures and the communication protocols between the layers, and uses IDENTIFY as the interconnection identifier to complete signal transmission between multiple motion control ends and multiple scene rendering ends, so as to achieve real-time and accurate communication. On the one hand the network service end receives the control information of a motion control end and forwards it to the scene rendering ends that carry the same interconnection identifier; on the other hand it feeds the real-time information of the scene rendering end back to the motion control end;
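The forwarding rule of the network service end, matching packets to endpoints by the interconnection identifier, can be sketched as follows. This is a simplified illustration only: the RakNet transport and the XML protocol are omitted, and Packet, Endpoint and NetworkServiceEnd are hypothetical names, not types from the patent or from RakNet.

```cpp
#include <map>
#include <string>
#include <vector>

// IDENTIFY is modeled here as a plain string key carried in every packet.
struct Packet { std::string identify; std::string payload; };
struct Endpoint { std::vector<Packet> inbox; void send(const Packet& p) { inbox.push_back(p); } };

class NetworkServiceEnd {
public:
    void registerRenderEnd(const std::string& identify, Endpoint* e) { renderEnds_.emplace(identify, e); }
    void registerControlEnd(const std::string& identify, Endpoint* e) { controlEnds_[identify] = e; }

    // Control information from a motion control end is forwarded to every
    // scene rendering end registered under the same interconnection identifier.
    void onControlPacket(const Packet& p) {
        auto range = renderEnds_.equal_range(p.identify);
        for (auto it = range.first; it != range.second; ++it) it->second->send(p);
    }
    // Real-time scene information goes back to the matching motion control end.
    void onRenderFeedback(const Packet& p) {
        auto it = controlEnds_.find(p.identify);
        if (it != controlEnds_.end()) it->second->send(p);
    }
private:
    std::multimap<std::string, Endpoint*> renderEnds_;   // several rendering ends may share one identifier
    std::map<std::string, Endpoint*> controlEnds_;
};
```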
The motion control end is connected to the motion platform through a serial port and reads the serial data in real time in an interrupt-triggered manner; it converts the serial information into operation information for scene rendering, such as speed, direction and position, and sends it to the network service end. The motion control end also receives the scene feedback information sent by the scene rendering end through the network service end, writes it into the motion controller through the serial port, and completes the adjustment of the motion mode of the motion platform;
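How the serial information might be turned into the speed, direction and position fields of the operation information is illustrated below. The frame layout (a header byte, three 16-bit fields and the scaling factors) is an assumption made purely for illustration; the patent does not disclose its serial protocol, and readByte() stands in for the interrupt-triggered serial read on the motion control end computer.

```cpp
#include <cstdint>
#include <functional>

struct OperationInfo { float speed; float direction; float position; };

// Parses one assumed serial frame: 0xAA header, then three big-endian 16-bit fields.
OperationInfo parseFrame(const std::function<uint8_t()>& readByte) {
    while (readByte() != 0xAA) { /* resynchronize on the assumed header byte */ }
    auto read16 = [&]() { uint16_t hi = readByte(), lo = readByte(); return uint16_t((hi << 8) | lo); };
    OperationInfo info;
    info.speed     = read16() / 100.0f;   // assumed scaling: hundredths of a unit per second
    info.direction = read16() / 100.0f;   // assumed scaling: hundredths of a degree
    info.position  = read16() / 100.0f;   // assumed scaling: hundredths of a metre
    return info;                          // forwarded to the network service end
}
```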
The scene rendering end resides on a graphics server and can be connected to display devices such as monitors, projection systems and head-mounted displays. The scene rendering end uses the open-source object-oriented graphics rendering engine OGRE and the three-dimensional graphics tool 3DMAX to develop a highly immersive virtual reality scene, and introduces the physics engine PhysX to realize physical effects and enhance scene realism;
The detailed process is shown in Fig. 6. Initialization loads the scene resource (XML) files and the third-party model data (including static models, transaction objects and skeletal animation), sets the lighting, creates the camera, and creates the frame listener class. In the loop-listening procedure, externally updated data trigger the frame response processing; the frame response mainly performs collision detection and scene rendering updates. Scene simulation first updates the render target, then updates the viewports associated with it and produces the FPS statistics; each viewport calls the render-scene method of its associated camera to render, and the finally updated scene realizes the simulation through the rendering subsystem class. To guarantee a stereoscopic field of view, six camera viewports (up, down, left, right, front and back) are created. Because the motion of entity objects is highly random, in order to achieve real-time interaction while remaining visually natural and smooth, a logical update timer of 33 ms is set, and each logical update triggers rendering.
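A minimal sketch of the frame loop described above, assuming OGRE 1.x: rendering runs every frame, while a 33 ms accumulator drives the logical updates, which is one way to couple the 33 ms logical update timer to rendering. The listener would be registered with Ogre::Root::addFrameListener before the render loop is started; logicalUpdate() is a placeholder for the collision-detection and scene-update hooks.

```cpp
#include <OgreFrameListener.h>   // assumes the OGRE 1.x SDK include path

// Fixed-step logical updates inside OGRE's per-frame callback.
class SimulationFrameListener : public Ogre::FrameListener {
public:
    bool frameStarted(const Ogre::FrameEvent& evt) override {
        mAccum += evt.timeSinceLastFrame;          // seconds since the previous frame
        while (mAccum >= kLogicStep) {             // run logic at the 33 ms logical timer rate
            logicalUpdate(kLogicStep);             // collision detection + scene update hooks
            mAccum -= kLogicStep;
        }
        return true;                               // returning false would stop the render loop
    }
private:
    static constexpr float kLogicStep = 0.033f;    // the 33 ms logical update interval
    float mAccum = 0.0f;
    void logicalUpdate(float /*dt*/) { /* placeholder: collision detection, viewport/FPS updates */ }
};
```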
(3) Implementation method of the projection display subsystem:
The projection display subsystem adopts a three-channel circular-screen projection method and uses the edge blending device to realize projection display with a stereoscopic visual effect.
The speed and direction of scene change in the virtual reality subsystem are realized as follows:
At first, the motion control end reads parameters such as speed and direction through the serial-port program; the server logic then forwards the information to the scene rendering ends whose port identifiers match that of the motion control end. The scene rendering end uses OGRE's rendering update class: in each frame update, the speed information is processed as uniform motion to obtain the possible position at the next frame under the current speed, and the visible scene information is rendered according to the obtained position. To keep the visuals natural and smooth, a 33 ms frame timer is set, and each logical update triggers rendering.
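The uniform-motion prediction amounts to advancing the viewpoint along the current heading at constant speed for one 33 ms step, roughly as in the sketch below; the Pose structure and the planar x/z parameterization are illustrative assumptions, not the coordinate convention of the patent.

```cpp
#include <cmath>

// speed and heading come from the motion platform; dt is the 33 ms logical step.
struct Pose { float x; float z; float headingRad; };

Pose nextFramePose(const Pose& current, float speed, float headingRad, float dt = 0.033f) {
    Pose p = current;
    p.headingRad = headingRad;                    // direction read from the direction sensor
    p.x += speed * dt * std::sin(headingRad);     // advance along the heading at constant speed
    p.z += speed * dt * std::cos(headingRad);
    return p;                                     // position used to render the visible scene
}
```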
Collision and terrain detection in the virtual reality subsystem are realized as follows (see Fig. 7):
Collision detection feedback and terrain detection feedback are realized by means of the physics engine PhysX, i.e. the collision effect of objects and the concave-convex effect of the terrain are reproduced, so as to achieve a highly immersive scene;
An object-oriented bounding-box-plus-ray collision detection method is adopted: axis-aligned bounding boxes (AABB) describe the collision models of stationary objects, and, in order to reduce the computation of intersection tests and shorten the collision time, box-box intersection tests are reduced to ray tests cast from the objects that have motion characteristics. That is, the objects in the virtual environment are represented in an object-oriented manner, but only the objects with motion characteristics act as collision-detection agents, while the other objects are treated as static boxes; whether a collision has occurred is detected, and the corresponding collision response is made, only when the position or direction of a moving object changes. The concrete collision detection is implemented with the physics engine PhysX: when the scene is generated, a corresponding physics world is created so that every solid model in the virtual scene has a corresponding physics-world object; a pointer to the physics object is obtained by code, and the collision ray-detection function provided by the PhysX engine is called through this pointer to perform collision detection; the return information of the function is then used to update the scene.
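The ray-versus-AABB test that the physics engine performs can be illustrated with the standard slab method; the sketch below is a self-contained stand-in for that library call, with assumed Vec3/AABB/Ray types, and is not the PhysX API itself.

```cpp
#include <algorithm>
#include <cmath>
#include <utility>

struct Vec3 { float x, y, z; };
struct AABB { Vec3 min, max; };     // axis-aligned bounding box of a static scene object
struct Ray  { Vec3 origin, dir; };  // cast from the moving object along its direction of motion

// Slab test: returns true (and the hit distance) if the ray intersects the box within maxDist.
bool rayHitsAABB(const Ray& r, const AABB& b, float maxDist, float& hitDist) {
    float tNear = 0.0f, tFar = maxDist;
    const float o[3]  = { r.origin.x, r.origin.y, r.origin.z };
    const float d[3]  = { r.dir.x,    r.dir.y,    r.dir.z    };
    const float lo[3] = { b.min.x,    b.min.y,    b.min.z    };
    const float hi[3] = { b.max.x,    b.max.y,    b.max.z    };
    for (int i = 0; i < 3; ++i) {
        if (std::fabs(d[i]) < 1e-6f) {                 // ray parallel to this pair of slabs
            if (o[i] < lo[i] || o[i] > hi[i]) return false;
        } else {
            float t1 = (lo[i] - o[i]) / d[i];
            float t2 = (hi[i] - o[i]) / d[i];
            if (t1 > t2) std::swap(t1, t2);
            tNear = std::max(tNear, t1);
            tFar  = std::min(tFar,  t2);
            if (tNear > tFar) return false;            // slab intervals do not overlap: no hit
        }
    }
    hitDist = tNear;
    return true;
}
```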
The interaction between the virtual reality subsystem and the motion platform is realized as follows (see Fig. 8):
The interaction between the virtual reality subsystem and the motion platform means that, on the one hand, the virtual reality subsystem regulates the scene picture to be displayed in real time according to the motion speed and direction information produced by the motion platform, and on the other hand the virtual reality subsystem also feeds scene information back to the control system of the motion platform;
When the motion platform moves at a higher speed, the rate of change of the presented scene is correspondingly faster, and the direction stays consistent with that of the motion platform. If an obstacle is encountered in the scene during motion and a collision occurs, the virtual reality subsystem feeds the collision information back to the motion platform, which brakes or decelerates accordingly; likewise, if a pit or a slope in the terrain is encountered in the scene, the motion platform adjusts its pose according to the feedback information.
Forward control is realized as follows: the speed and direction sensors detect parameters such as the speed and direction of the motion platform; the detected data are converted into digital signals by A/D conversion and sent to the motion control end computer through the RS-232 serial port. The motion control end computer reads the relevant information, packs the data according to the transmission protocol and sends it to the network service end computer; the network service end computer sends the packet, according to the multi-interaction port logic, to the scene rendering end computer whose port matches that of the motion control end computer. After receiving the packet, the scene rendering end computer unpacks it to obtain the corresponding speed and direction parameter information and renders the associated scene according to this information;
Feedback control is based on collision detection in the virtual scene. In the scene rendering end computer, the object-oriented bounding-box-plus-ray collision detection algorithm performs collision detection in real time; the information that a collision may produce and the corresponding operations, analysed by the collision feedback part, are likewise sent to the network service end computer in the form of packets; the network service end computer uses the port logic to feed them back to the corresponding motion control end computer, which uses the PCI-7356 motion control card to implement the interrupt handling of the motion platform.
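The two packet directions described above, motion state flowing forward to the rendering end and collision or terrain feedback flowing back to the motion control end, might be represented as in the following sketch. The field names, the message-type enum and the handler signature are assumptions made for illustration; the actual transmission protocol and the PCI-7356 interrupt handling are not disclosed at this level of detail.

```cpp
#include <cstdint>

enum class MsgType : uint8_t { MotionState, CollisionFeedback, TerrainFeedback };

// Forward direction: packed sensor state sent from the motion control end computer.
struct StatePacket    { MsgType type; uint16_t port; float speed; float direction; };
// Feedback direction: scene event returned to the matching motion control end computer.
struct FeedbackPacket { MsgType type; uint16_t port; float severity; };   // e.g. impact strength or slope

// Hypothetical handler on the motion control end: collision feedback triggers braking
// or deceleration, terrain feedback triggers a pose adjustment of the platform.
void handleFeedback(const FeedbackPacket& fb,
                    void (*brake)(float), void (*adjustPose)(float)) {
    if (fb.type == MsgType::CollisionFeedback)    brake(fb.severity);
    else if (fb.type == MsgType::TerrainFeedback) adjustPose(fb.severity);
}
```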
Claims (8)
1. A high-telepresence intelligent-perception interactive motion system, characterized in that it consists of a motion platform subsystem, a virtual reality subsystem and a projection display subsystem; the motion platform subsystem consists of an N-degree-of-freedom parallel kinematic platform, servo motor drivers, a motion controller and sensors; the output of the motion controller is connected to the input of the servo motor drivers, the output of the servo motor drivers is connected to the input of the N-degree-of-freedom parallel kinematic platform, and the sensors consist of a speed sensor and a direction sensor;
The virtual reality subsystem consists of a motion control end computer, a network service end computer and scene rendering end computers; the output of the motion control end computer is connected to the input of said motion controller, the corresponding inputs of said motion control end computer are connected to the outputs of said speed sensor and direction sensor, and the network service end computer is connected bidirectionally to the motion control end computer and to the scene rendering end computers;
The projection display subsystem consists of an edge blending device, projectors, audio equipment and a circular screen; the input of the edge blending device is connected to the output of said scene rendering end computer, the three outputs of the edge blending device are connected to the inputs of the corresponding projectors, the output of said projectors is projected onto the circular screen, and the audio output of the scene rendering end computer is connected to the corresponding audio equipment;
Said N-degree-of-freedom parallel kinematic platform adopts a six-degree-of-freedom parallel robot horse.
2. The high-telepresence intelligent-perception interactive motion system according to claim 1, characterized in that the motion controller adopts a PCI-7356 motion control card, and the numerical control system of the motion platform subsystem is an open distributed control structure of "host computer + motion control card".
3. The high-telepresence intelligent-perception interactive motion system according to claim 2, characterized in that the model of the servo motor driver is MADDT1205.
4. The high-telepresence intelligent-perception interactive motion system according to claim 3, characterized in that the model of the speed sensor is Hall JK80020 and the speed sensor is mounted on the motion platform at the lower front of the body of the six-degree-of-freedom parallel robot horse; the direction sensor adopts a rotary torque sensor, model ORT-8036, and is mounted at the head of the six-degree-of-freedom parallel robot horse.
5. The implementation method of the high-telepresence intelligent-perception interactive motion system according to claim 1, characterized in that:
(1) Implementation method of the control system of the motion platform subsystem:
The control system of the motion platform subsystem is developed with the Motion motion-control module under the graphical programming language LabVIEW and adopts a layered architecture;
At first, the motion trajectories of the motion platform are obtained: the motion of a real horse is captured as images or video in three gaits (walk, trot and gallop); three typical motion trajectories are then derived by image calibration and feature-point extraction; the host computer writes the three typical trajectories into the Buffer of the motion controller; the LabVIEW Notifier synchronous communication technique keeps the motion control of the motion platform and its timing synchronized; and the Breakpoints high-speed capture function of the motion controller monitors the feed of each axis in real time and controls the motion platform to complete the motion process;
(2) Implementation method of the control part of the virtual reality subsystem:
The control part of the virtual reality subsystem is built on an open-source graphics rendering engine and integrates cooperative interaction technology in a distributed architecture comprising a motion control end, a network service end and a scene rendering end:
The network service end is based on the network engine RAKNET; it defines XML-based dynamic data structures and the communication protocols between the layers, and uses IDENTIFY as the interconnection identifier to complete signal transmission between multiple motion control ends and multiple scene rendering ends, so as to achieve real-time and accurate communication. On the one hand the network service end receives the control information of a motion control end and forwards it to the scene rendering ends that carry the same interconnection identifier; on the other hand it feeds the real-time information of the scene rendering end back to the motion control end;
The motion control end is connected to the motion platform through a serial port and reads the serial data in real time in an interrupt-triggered manner; it converts the serial information into operation information for scene rendering and sends it to the network service end. The motion control end also receives the scene feedback information sent by the scene rendering end through the network service end, writes it into the motion controller through the serial port, and completes the adjustment of the motion mode of the motion platform;
The scene rendering end uses the open-source object-oriented graphics rendering engine OGRE and the three-dimensional graphics tool 3DMAX to develop a highly immersive virtual reality scene, and introduces the physics engine PhysX to realize physical effects and enhance scene realism;
(3) Implementation method of the projection display subsystem:
The projection display subsystem adopts a three-channel circular-screen projection method and uses the edge blending device to realize projection display with a stereoscopic visual effect.
6. The implementation method of the high-telepresence intelligent-perception interactive motion system according to claim 5, characterized in that the speed and direction of scene change in the virtual reality subsystem are realized as follows:
At first, the motion control end reads the speed and direction parameters through the serial-port program; the server logic then forwards the information to the scene rendering ends whose port identifiers match that of the motion control end. The scene rendering end uses the rendering update class of the graphics rendering engine OGRE: in each frame update, the speed information is processed as uniform motion to obtain the possible position at the next frame under the current speed, and the visible scene information is rendered according to the obtained position. To keep the visuals natural and smooth, a 33 ms frame timer is set, and each logical update triggers rendering.
7. The implementation method of the high-telepresence intelligent-perception interactive motion system according to claim 6, characterized in that collision and terrain detection in the virtual reality subsystem are realized as follows:
Collision detection feedback and terrain detection feedback are realized by means of the physics engine PhysX, i.e. the collision effect of objects and the concave-convex effect of the terrain are reproduced, so as to achieve a highly immersive scene;
An object-oriented bounding-box-plus-ray collision detection method is adopted: bounding boxes aligned with the coordinate axes describe the collision models of stationary objects, and box-box intersection tests are reduced to ray tests cast from the objects that have motion characteristics. That is, the objects in the virtual environment are represented in an object-oriented manner, but only the objects with motion characteristics act as collision-detection agents, while the other objects are treated as static boxes; whether a collision has occurred is detected, and the corresponding collision response is made, only when the position or direction of a moving object changes. The concrete collision detection is implemented with the physics engine PhysX: when the scene is generated, a corresponding physics world is created so that every solid model in the virtual scene has a corresponding physics-world object; a pointer to the physics object is obtained by code, and the collision ray-detection function provided by the physics engine PhysX is called through this pointer to perform collision detection; the return information of the function is then used to update the scene.
8. The implementation method of the high-telepresence intelligent-perception interactive motion system according to claim 7, characterized in that the interaction between the virtual reality subsystem and the motion platform is realized as follows:
Forward control is realized as follows: the speed and direction sensors detect the speed and direction parameters of the motion platform; the detected data are converted into digital signals by A/D conversion and sent to the motion control end computer through the RS-232 serial port. The motion control end computer reads the relevant information, packs the data according to the transmission protocol and sends it to the network service end computer; the network service end computer sends the packet, according to the multi-interaction port logic, to the scene rendering end computer whose port matches that of the motion control end computer. After receiving the packet, the scene rendering end computer unpacks it to obtain the corresponding speed and direction parameters and renders the associated scene according to this information;
Feedback control is based on collision detection in the virtual scene. In the scene rendering end computer, the object-oriented bounding-box-plus-ray collision detection algorithm performs collision detection in real time; the information that a collision may produce and the corresponding operations, analysed by the collision feedback part, are likewise sent to the network service end computer in the form of packets; the network service end computer uses the port logic to feed them back to the corresponding motion control end computer, which uses the PCI-7356 motion control card to implement the interrupt handling of the motion platform.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2009100738293A CN101510074B (en) | 2009-02-27 | 2009-02-27 | High present sensation intelligent perception interactive motor system and implementing method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2009100738293A CN101510074B (en) | 2009-02-27 | 2009-02-27 | High present sensation intelligent perception interactive motor system and implementing method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN101510074A CN101510074A (en) | 2009-08-19 |
CN101510074B true CN101510074B (en) | 2010-12-08 |
Family
ID=41002491
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN2009100738293A Expired - Fee Related CN101510074B (en) | 2009-02-27 | 2009-02-27 | High present sensation intelligent perception interactive motor system and implementing method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN101510074B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108399008A (en) * | 2018-02-12 | 2018-08-14 | 张殿礼 | A kind of synchronous method of virtual scene and sports equipment |
Families Citing this family (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102122204B (en) * | 2011-01-17 | 2012-07-18 | 北京邮电大学 | Distributed force sense synchronous sensing method and system |
CN102654804B (en) * | 2011-05-18 | 2016-03-09 | 上海华博信息服务有限公司 | A kind of human-computer interaction system based on irregular screen body multiple point touching |
CN102580328B (en) * | 2012-01-10 | 2014-02-19 | 上海恒润数码影像科技有限公司 | Control device of 4D (four-dimensional) audio and video all-in-one machine and control method of 4D audio and video all-in-one machine |
JP5949234B2 (en) * | 2012-07-06 | 2016-07-06 | ソニー株式会社 | Server, client terminal, and program |
CN103700297B (en) * | 2014-01-03 | 2016-01-06 | 周亚军 | Full-automatic two dimension writes body-building tutoring system |
CN103885465A (en) * | 2014-04-02 | 2014-06-25 | 中国电影器材有限责任公司 | Method for generating dynamic data of dynamic seat based on video processing |
CN105396260B (en) * | 2014-09-10 | 2018-03-27 | 乔山健身器材(上海)有限公司 | Promote the sports equipment of control panel function using portable electron device |
CN104916182B (en) * | 2015-05-27 | 2017-07-28 | 北京宇航系统工程研究所 | A kind of immersive VR maintenance and Training Simulation System |
CN105183005A (en) * | 2015-08-25 | 2015-12-23 | 李尔 | Scene virtualization device and control method |
CN106492455B (en) * | 2016-09-30 | 2019-12-27 | 深圳前海万动体育智能科技有限公司 | Football electronic interaction system |
WO2018171196A1 (en) | 2017-03-21 | 2018-09-27 | 华为技术有限公司 | Control method, terminal and system |
CN107632703A (en) * | 2017-09-01 | 2018-01-26 | 广州励丰文化科技股份有限公司 | Mixed reality audio control method and service equipment based on binocular camera |
CN108010079B (en) * | 2017-10-19 | 2021-11-02 | 中国船舶工业系统工程研究院 | State information remote monitoring system and method based on projection fusion and image recognition |
CN107967060A (en) * | 2017-12-14 | 2018-04-27 | 齐乐无穷(北京)文化传媒有限公司 | Multi-freedom parallel connection vivid platform data real-time collecting system and method |
US11633673B2 (en) | 2018-05-17 | 2023-04-25 | Universal City Studios Llc | Modular amusement park systems and methods |
US10445943B1 (en) * | 2018-09-19 | 2019-10-15 | Royal Caribbean Cruises Ltd. | Virtual reality bungee trampoline |
CN110478916A (en) * | 2019-08-19 | 2019-11-22 | 上海恒润文化科技有限公司 | A kind of the scenario triggered system and triggering method of special screne |
CN110602471A (en) * | 2019-10-10 | 2019-12-20 | 上海迪东实业有限公司 | Projection equipment cascade module and method and projection signal cascade method |
CN111010561A (en) * | 2019-12-20 | 2020-04-14 | 上海沃咨信息科技有限公司 | Virtual reality projection system based on VR technique |
CN114296541A (en) * | 2021-04-26 | 2022-04-08 | 苏州酷约网络科技有限公司 | Motion interaction system based on cloud rendering and application thereof |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN2170136Y (en) * | 1993-04-30 | 1994-06-29 | 胡耀宗 | Body-building horse of entertainment walking type |
CN1634626A (en) * | 2003-12-30 | 2005-07-06 | 上海科技馆 | Method and apparatus for realizing virtual carriage drive |
CN2877812Y (en) * | 2005-11-27 | 2007-03-14 | 梁晖 | Electric robot horse |
CN101002988A (en) * | 2006-09-29 | 2007-07-25 | 燕山大学 | Non-running type multiple-dimension motion imitating exercising horse |
CN200953191Y (en) * | 2006-09-30 | 2007-09-26 | 吉林大学 | Parallel six freedom driving simulator |
Also Published As
Publication number | Publication date |
---|---|
CN101510074A (en) | 2009-08-19 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
C17 | Cessation of patent right | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20101208 Termination date: 20130227 |