CN107333121B - Immersive stereoscopic rendering and projection system and method for a moving viewpoint on a curved screen


Info

Publication number
CN107333121B
Authority
CN
China
Prior art keywords
user
plane
projection
curved screen
eyes
Prior art date
Legal status
Active
Application number
CN201710501792.4A
Other languages
Chinese (zh)
Other versions
CN107333121A
Inventor
赵思伟
杨承磊
李韩超
刘娟
刘士军
赵陆
周念梅
王玉超
孟祥旭
卞玉龙
Current Assignee
Shandong University
Original Assignee
Shandong University
Priority date
Filing date
Publication date
Application filed by Shandong University
Priority to CN201710501792.4A
Publication of CN107333121A
Application granted
Publication of CN107333121B
Status: Active
Anticipated expiration


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/363 Image reproducers using image projection screens
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 Head tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/239 Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/332 Displays for viewing with the aid of special glasses or head-mounted displays [HMD]

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The present invention relates to an immersive stereoscopic rendering and projection system for a moving viewpoint on a curved screen, and to a corresponding method. The method is based on such a system and comprises the following steps: acquiring user information: tracking the position and posture of the user's head and estimating the positions of the user's eyes and the gaze direction; setting stereoscopic rendering parameters: adjusting the parameters of the stereoscopic camera pair in real time according to the estimated eye positions and gaze direction, and performing moving-viewpoint stereoscopic rendering; dynamic parallax adjustment: post-processing the parallax of the rendered stereoscopic image pair according to the relative position of the user and the screen and the geometric shape of the projection screen, so that the user sees a comfortable and correct stereoscopic image at his or her current position; user-perception optimization: further optimizing the parallax-adjusted images with respect to head posture and eye movement; and projecting the stereoscopic scene, which the user observes and roams through with stereoscopic glasses.

Description

Immersive stereoscopic rendering and projection system and method for a moving viewpoint on a curved screen
Technical field
The invention belongs to the technical field of stereoscopic projection, and more particularly to an immersive stereoscopic rendering and projection system for a moving viewpoint on a curved screen, and to a corresponding method.
Background art
With the arrival of the information age of the 21st century and the maturation of digital technology, virtual reality and augmented reality have developed rapidly, and the entertainment industry has undergone enormous changes through roughly two decades of continuous exploration. The content, means, and carriers of entertainment have all absorbed new ideas and new technology. Participatory recreational facilities that integrate many state-of-the-art technologies are therefore constantly appearing in exhibition centers, science and technology museums, and similar venues in many Chinese cities. As the foundation of virtual reality and augmented reality, stereoscopic projection technology has become increasingly mature and its applications increasingly broad, playing an ever more important role in fields such as energy development, education, entertainment, architectural visualization and urban planning, and virtual medical, chemical, and biological engineering.
There are many forms of stereoscopic projection: head-mounted virtual reality devices (such as the HTC Vive), head-mounted augmented reality devices (such as HoloLens), and the most familiar form, projector-and-large-screen stereoscopic projection (for example cinema 3D projection systems, in which a binocular projector and stereoscopic glasses together let the viewer enjoy an immersive three-dimensional viewing experience). Head-mounted virtual reality and augmented reality devices are not yet widespread because of their high price, so the most widely used form of stereoscopic projection is still the conventional projector-and-large-screen stereoscopic projection system. People enjoy watching three-dimensional films in this way, but as equipment and technology keep improving, so do viewers' expectations. Plain conventional stereoscopic projection can no longer satisfy them, and continued exploration has revealed several shortcomings of conventional stereoscopic projection technology:
First, wearing 3D glasses restricts the viewer's movement, and watching a film for a long time causes eye strain and other discomfort. Some medical literature indicates that watching stereoscopic 3D films may pose safety concerns such as visually induced sickness. Prolonged viewing of stereoscopic films can cause visual discomfort and fatigue, a common drawback of parallax-based stereoscopic projection systems.
Second, with a conventional stereoscopic projection system the audience cannot interact with the film, and the sense of presence is weak. Presence is a kind of illusion and can be summarized in six forms: realistic presence, immersive presence, social-role presence, socially rich presence within a medium, the shared "we are together" presence of information exchange, and presence as a social-role medium. In other words, if the audience is immersed in the film and can no longer distinguish the boundary between the virtual and the real, the playback effect of the film is optimal. Surveys of viewers after watching 3D films show that people do find the films visually striking, but they still cannot take part in the virtual film world, and the sense of presence is weak.
The second problem arises because existing conventional stereoscopic projection systems show the same pair of images to the user's left and right eyes regardless of where the user stands. The principle of stereoscopic projection is to exploit the parallax between the two eyes: the left and right eyes each see the projected left-eye and right-eye image, and the brain fuses them into a stereoscopic image. In a traditional stereoscopic projection system, users at different positions see the same left-eye and right-eye images; but because each user's position relative to the screen differs, the virtual stereoscopic image fused from those images is deformed and displaced relative to the real world. That is, as the user's position changes the left-eye and right-eye images stay the same, while the position and shape of the fused virtual image keep changing. The cause is that in a traditional stereoscopic projection system the virtual cameras render the scene from a single, fixed viewpoint. The user observes a correct virtual image only within a comfort zone; outside that zone, or while moving, the stereoscopic experience degrades considerably. Projection schemes of the prior art therefore cannot offer the user a comfortable roaming and interactive experience within the virtual scene.
In summary, existing stereoscopic projection systems cannot render the scene in real time from the user's own viewpoint, and therefore cannot let the user observe the virtual image of a scene or object from different angles by moving around, which is what creates the illusion that the scene or object really exists in the physical world. The sense of presence is poor, the user experience suffers, and no effective solution has yet been provided.
Summary of the invention
To solve the above problems, the present invention overcomes the inability of prior-art stereoscopic projection systems to render the scene in real time from the user's viewpoint and to let the user observe the virtual image of a scene or object from different angles by moving around, which creates the illusion that the scene or object really exists in the physical world, as well as the resulting poor sense of presence. It provides an immersive stereoscopic rendering and projection system, and a corresponding method, for a moving viewpoint on a curved screen. By tracking the positions of the user's eyes and using the relative position of the user and the projection screen together with the shape of the projection screen itself, the virtual scene is stereoscopically rendered and projected in real time, so that the user can roam naturally through the projected virtual scene simply by walking about and can observe it from different positions and angles. This removes the drawback that large-screen stereoscopic imaging can only be viewed from a fixed position, greatly enhances the user's sense of participation, lets the user truly merge into the virtual world, and delivers a striking, lifelike 3D stereoscopic effect.
To achieve the above goals, the present invention adopts the following technical solution:
An immersive stereoscopic rendering and projection system for a moving viewpoint on a curved screen, the system comprising:
a tracking module, configured to track the position and posture of the user's head and to estimate the positions of the user's eyes and the gaze direction;
a rendering module, configured to set the parameters of the stereoscopic camera pair according to the eye positions and the position and shape of the curved screen and, after rendering is complete, to post-process the images generated by the left and right cameras according to the shape of the screen;
a projection module, configured to project the images processed by the rendering module onto the curved screen.
Further, the tracking module comprises a head posture acquisition device, a head position acquisition device, and a binocular position estimation unit.
The head posture acquisition device acquires the user's head posture information and transmits it to the binocular position estimation unit.
The head position acquisition device acquires the user's head position information and transmits it to the binocular position estimation unit.
From the received head posture and head position information, the binocular position estimation unit estimates the positions of the user's eyes relative to the curved screen and the gaze direction, and transmits them to the rendering module.
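As a purely illustrative sketch of how such a binocular position estimation unit can combine the tracked head position with an average interpupillary distance (all names, the numpy dependency, and the coordinate convention below are assumptions of this illustration, not part of the patent), the basic estimate places the two eyes half an interpupillary distance to either side of the head position along a baseline parallel to the screen:

    import numpy as np

    IPD = 0.065  # assumed average interpupillary distance in meters

    def estimate_eyes(head_pos, ipd=IPD):
        """Estimate left/right eye positions from the tracked head position alone,
        assuming the interocular baseline is parallel to the screen (the x axis)."""
        head_pos = np.asarray(head_pos, dtype=float)
        offset = np.array([0.5 * ipd, 0.0, 0.0])
        return head_pos - offset, head_pos + offset  # left eye, right eye

The user-perception optimization step described later refines this estimate using the head posture reported by the inertial sensor.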
Further, the projection module comprises the stereoscopic camera pair and the curved screen; the projected images of the stereoscopic camera pair processed by the rendering module are pasted onto the curved screen, a mapping between projected-image space and curved-screen space is established, and the projection is completed.
To achieve the above goals, the present invention also adopts the following alternative technical solution:
An immersive stereoscopic rendering and projection method for a moving viewpoint on a curved screen, based on the above system, the method comprising the following steps:
(1) acquiring user information: tracking the position and posture of the user's head, and estimating the positions of the user's eyes and the gaze direction;
(2) setting stereoscopic rendering parameters: adjusting the parameters of the stereoscopic camera pair in real time according to the eye positions and gaze direction estimated in step (1), and performing moving-viewpoint stereoscopic rendering;
(3) dynamic parallax adjustment: post-processing the parallax of the rendered stereoscopic image pair according to the relative position of the user and the screen and the geometric shape of the projection screen, so that the user sees a comfortable and correct stereoscopic image at his or her current position;
(4) user-perception optimization: further optimizing the parallax-adjusted images of step (3) with respect to head posture and eye movement, respectively;
(5) projecting the stereoscopic scene, which the user observes and roams through with stereoscopic glasses.
Further, in step (1), the head posture acquisition device acquires the user's head posture information and transmits it to the binocular position estimation unit; the head position acquisition device acquires the user's head position information and transmits it to the binocular position estimation unit; the binocular position estimation unit estimates, from the received head posture and position information, the positions of the user's eyes relative to the curved screen and the gaze direction, transmits them to the rendering module, and the method proceeds to step (2).
Further, in step (2), before the stereoscopic camera parameters are adjusted in real time for moving-viewpoint rendering, a correspondence between the convergence plane and the projection plane is established. The correspondence is constrained by the aspect ratio, and the ratio of the side lengths of the convergence plane to those of the projection plane is K.
The convergence plane is the plane in which the asymmetric view frustums of the left and right cameras of the stereoscopic camera pair intersect; it is a rectangular plane that is set in advance, and its position and shape remain unchanged.
The projection plane is the rectangular plane, corresponding to the convergence plane, on which the user observes the projected picture. For a curved projection screen, the projection plane is the imaginary rectangular plane joining the left and right ends of the imaged region of the curved screen.
Further, in step (2), the stereoscopic camera parameters comprise the position of the stereoscopic camera pair and the projection matrices of the stereoscopic camera pair.
The position of the stereoscopic camera pair relative to the convergence plane is set according to the position of the user's eyes relative to the projection plane; the ratio between the offset of the camera pair from the convergence plane and the offset of the user's eyes from the projection plane is K.
The projection matrix of each camera of the stereoscopic pair is computed in real time from the geometry of the convergence plane and the position of the camera pair; it is the asymmetric-frustum perspective projection matrix determined by n and f, the near and far clipping planes in the camera coordinate system, and by l, r, t, b, the left, right, top, and bottom extents of the near clipping plane of the view frustum (the matrix is written out in Embodiment 2).
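A minimal sketch of such an asymmetric (off-axis) frustum projection matrix, written in Python with numpy and following the common OpenGL-style convention; the convention and the function name are assumptions of this illustration, since the patent does not name a graphics API:

    import numpy as np

    def off_axis_projection(l, r, b, t, n, f):
        """Asymmetric view-frustum (off-axis) perspective projection matrix.

        l, r, b, t : left/right/bottom/top extents of the near clipping plane
        n, f       : near and far clipping distances
        """
        return np.array([
            [2*n/(r-l), 0.0,        (r+l)/(r-l),  0.0],
            [0.0,       2*n/(t-b),  (t+b)/(t-b),  0.0],
            [0.0,       0.0,       -(f+n)/(f-n), -2*f*n/(f-n)],
            [0.0,       0.0,       -1.0,          0.0],
        ])

In the moving-viewpoint setting, l, r, t, b change every frame as the camera positions are updated from the tracked eye positions, while the matrix form itself stays the same.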
Further, in step (3), let the parallax-adjusted image I2(u2, v2) be the output image and let the image I1(u1, v1) rendered in step (2) be the input image.
The dynamic parallax adjustment proceeds as follows:
(3-1) for each pixel of the output image I2(u2, v2), compute the corresponding three-dimensional point S2(x2, y2, z2) on the curved screen;
(3-2) compute the three-dimensional point S1(x1, y1, z1) on the virtual flat screen corresponding to S2(x2, y2, z2);
(3-3) compute the pixel of the input image I1(u1, v1) corresponding to S1(x1, y1, z1) and copy it into the output image I2(u2, v2), which gives

    (u2, v2) = M_I←C(M_C←P(M_P←I(u1, v1)))        (2)

where M_P←I denotes the mapping from (u1, v1) to (s1, t1); M_C←P the mapping from (s1, t1) to (s2, t2); M_I←C the mapping from (s2, t2) to (u2, v2); (u1, v1) is the image space rendered by the stereoscopic camera; (u2, v2) is the image space after dynamic parallax adjustment; (s1, t1) is the parameterized flat-screen space; and (s2, t2) is the parameterized curved-screen space.
Further, in step (3-1), the three-dimensional point S2(x2, y2, z2) on the curved screen corresponding to each pixel of the output image I2(u2, v2) is computed as

    S2(x2, y2, z2) = M2(M_I←C^{-1}(u2, v2))

where (u, v) denotes image space; (x2, y2, z2) is the three-dimensional coordinate on the curved screen corresponding to (s2, t2); M_I←C is the mapping between curved-screen space (s2, t2) and image space (u, v); and M2 is the mapping between curved-screen space (s2, t2) and the three-dimensional curved-screen point (x2, y2, z2), which can be obtained by parameterizing the curved screen.
Further, in step (3-2), the flat-screen point S1(x1, y1, z1) corresponding to S2(x2, y2, z2) is computed from the perspective-projection relation

    M · T · (x1, y1, z1, 1)^T  ≅  M · T · (x2, y2, z2, 1)^T,

i.e. S1 and S2 project to the same image point of the left camera (equality up to the homogeneous scale factor), with S1 constrained to lie in the plane of the virtual flat screen. Here (x1, y1, z1) are the spatial coordinates of S1; (x2, y2, z2) are the spatial coordinates of S2; M is the perspective projection matrix of the left camera corresponding to the left eye; and T is the transformation from the world coordinate system to the left-camera coordinate system.
Further, in step (3-3), the pixel of the input image I1(u1, v1) corresponding to S1(x1, y1, z1) is computed as

    (u1, v1) = M_P←I^{-1}(M1^{-1}(x1, y1, z1))

where (x1, y1, z1) is the three-dimensional coordinate on the flat screen corresponding to (s1, t1); M_P←I is the mapping between flat-screen space (s1, t1) and image space (u, v); and M1 is the mapping between flat-screen space (s1, t1) and the three-dimensional flat-screen point (x1, y1, z1), which can be obtained by parameterizing the flat screen.
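Taken together, steps (3-1) to (3-3) amount to a per-pixel backward warp from the output image to the rendered input image. The following schematic Python sketch is only an illustration: the three mapping helpers passed in stand for the screen parameterizations M2, M1, the screen/image mappings, and the left-camera projection, and all names are assumptions rather than the patent's own implementation.

    import numpy as np

    def adjust_parallax(I1, curved_point, flat_point, image_point):
        """Backward warp implementing steps (3-1) to (3-3).

        I1           : rendered input image, shape (H, W, 3)
        curved_point : (u2, v2) -> S2, 3D point on the curved screen
        flat_point   : S2 -> S1, corresponding point on the virtual flat screen
                       (uses the tracked left-eye position internally)
        image_point  : S1 -> (u1, v1), pixel in the rendered input image
        """
        H, W = I1.shape[:2]
        I2 = np.zeros_like(I1)
        for v2 in range(H):
            for u2 in range(W):
                S2 = curved_point(u2, v2)          # step (3-1)
                S1 = flat_point(S2)                # step (3-2)
                u1, v1 = image_point(S1)           # step (3-3)
                if 0 <= u1 < W and 0 <= v1 < H:
                    I2[v2, u2] = I1[int(v1), int(u1)]
        return I2

In a running system this per-pixel loop would typically be vectorized or executed on the GPU.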
Further, in step (4), the images adjusted by the dynamic parallax step (3) are further optimized with respect to the user's head posture as follows:
the head posture is measured with an inertial sensor and used to refine and perfect the estimate of the eye positions. In the refined estimate, e is the average interpupillary distance; k is the tangent of the angle between the interocular line AB and the flat screen; b is the distance along the z direction from the point O' to the line A''B'', where O' is the point of interest of the user's gaze estimated from the head position and posture, i.e. the coordinate point on the first virtual object met along the user's gaze direction; and A'', B'' are the re-estimated positions of the left and right eyes.
Further, in step (4), the images adjusted by the dynamic parallax step (3) are further optimized with respect to eye movement as follows:
the user's behavioral state is detected; while the user is in a smooth walking state, the head-height value y in the head-position detection system is locked, and when the user leaves the smooth walking state and enters the next state, the head-height value y is released.
Beneficial effects of the present invention:
1. The immersive stereoscopic rendering and projection system and method for a moving viewpoint on a curved screen creatively extend moving-viewpoint stereoscopic rendering and projection to arbitrary curved screens. Moving-viewpoint projection on an arbitrary curved screen solves problems of flat-screen moving-viewpoint projection such as the narrow viewing angle and the conspicuous zero-parallax plane, greatly increases stereoscopic immersion, and broadens the application scenarios of moving-viewpoint stereoscopic projection. With the algorithm optimizations of the invention, high-resolution stereoscopic pictures can be rendered in real time without demanding hardware; while maintaining accuracy and frame rate, the performance requirements on the whole system are reduced, so the system can be applied in practice.
2. To address the deformation and displacement of the virtual stereoscopic image caused by the complex relative geometry of the user's viewpoint and the curved screen, a dynamic parallax adjustment method is proposed: according to the shape of the curved screen and the relative position of the user and the screen, the parallax of the projected virtual scene is adjusted by a dynamic mapping, i.e. the left-eye and right-eye images are post-processed after stereoscopic rendering, so that the user always sees a comfortable and correct virtual stereoscopic image while walking about in the virtual scene.
3. The system uses three-channel, large arc-screen stereoscopic projection. The rendered stereoscopic picture is high-resolution, clear, and lifelike; the field-of-view coverage is large and the immersion strong; and thanks to the pronounced out-of-screen stereoscopic effect the sense of reality is stronger than with a small flat screen, giving an advantage in stereoscopy and immersion. By tracking the user's head posture and exploiting the regularities of eye movement, the left-right drift and the vertical jitter of the virtual image are successfully eliminated, giving the user a better experience.
4. The system lets the user observe a virtual three-dimensional object from different sides, changing the drawback of cinema systems that can only be watched passively from a fixed position, and truly realizing the experience of being completely immersed in a virtual scene, providing a new approach for interactive cinema systems. The invention has good scalability, can provide a better experience, and can be combined with many other technologies for application in fields such as entertainment, education, training, and demonstration.
Detailed description of the invention
Fig. 1 is the system structure diagram of the invention;
Fig. 2 illustrates the principle of moving-viewpoint stereoscopic rendering and projection of the invention;
Fig. 3 is a schematic diagram of the parallax change caused by the screen shape;
Fig. 4 is a schematic diagram of the parallax analysis and adjustment;
Fig. 5 is a schematic diagram of the user's head posture;
Fig. 6 shows the actual scene of the system;
Fig. 7 shows the projection effect on a flat screen;
Fig. 8 shows the projection effect on an arc-shaped screen.
Specific embodiments:
It should be noted that the following detailed description is illustrative and is intended to provide further explanation of the application. Unless otherwise indicated, all technical and scientific terms used herein have the same meaning as commonly understood by a person of ordinary skill in the technical field to which the application belongs.
It should be noted that the terminology used herein is only for describing specific embodiments and is not intended to limit the exemplary embodiments of the application. As used herein, unless the context clearly indicates otherwise, the singular forms are also intended to include the plural forms; in addition, it should be understood that when the terms "comprising" and/or "including" are used in this specification, they indicate the presence of features, steps, operations, devices, components, and/or combinations thereof.
The embodiments of the application and the features of the embodiments may be combined with one another in the absence of conflict. The invention is further described below with reference to the accompanying drawings and embodiments.
Embodiment 1:
As described in the background section, stereoscopic projection systems in the prior art cannot render the scene image in real time from the user's own viewpoint and cannot let the user observe the virtual image of a scene or object from different angles by moving around, which is what would create the illusion that the scene or object really exists in the physical world; the sense of presence and the user experience are therefore poor. This embodiment provides an immersive stereoscopic rendering and projection system for a moving viewpoint on a curved screen. By tracking the positions of the user's eyes and using the relative position of the user and the projection screen together with the shape of the projection screen itself, the virtual scene is stereoscopically rendered and projected in real time, so that the user can roam naturally through the projected virtual scene by walking about and can observe it from different positions and angles. This removes the drawback that large-screen stereoscopic imaging can only be viewed from a fixed position, greatly enhances the user's sense of participation, lets the user truly merge into the virtual world, and delivers a striking, lifelike 3D stereoscopic effect.
To achieve the above goals, the present invention adopts the following technical solution:
An immersive stereoscopic rendering and projection system for a moving viewpoint on a curved screen, as shown in Fig. 1, the system comprising:
a tracking module, configured to track the position and posture of the user's head and to estimate the positions of the user's eyes and the gaze direction;
a rendering module, configured to set the parameters of the stereoscopic camera pair according to the eye positions and the position and shape of the curved screen and, after rendering is complete, to post-process the images generated by the left and right cameras according to the shape of the screen;
a projection module, configured to project the images processed by the rendering module onto the curved screen.
The tracking module comprises a head posture acquisition device, a head position acquisition device, and a binocular position estimation unit.
The head posture acquisition device acquires the user's head posture information and transmits it to the binocular position estimation unit. In this embodiment, the head posture acquisition device is an IMU unit fixed on the user's head and aligned with the head's viewing direction, which ensures that an accurate head posture is available.
The head position acquisition device acquires the user's head position information and transmits it to the binocular position estimation unit. In this embodiment, the head position acquisition device is a Kinect camera that tracks the position of the user's head relative to the screen. The Kinect camera must be calibrated in advance so that the position data it produces in its own camera coordinate system can be transformed into the world coordinate system defined by the system; this calibration is completed offline when the environment is set up and stored in a configuration file, and does not need to be repeated while the immersive stereoscopic rendering and projection system is running.
From the head posture information acquired by the IMU and the head position information acquired by the Kinect camera, the binocular position estimation unit estimates the positions of the user's eyes relative to the curved screen and the gaze direction, and transmits them to the rendering module.
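A minimal sketch of applying such a stored calibration to the Kinect measurements (the 4x4 homogeneous matrix, the JSON file layout, and all names are assumptions of this illustration; the patent only states that the calibration is computed offline and stored in a configuration file):

    import json
    import numpy as np

    def load_calibration(path="calibration.json"):
        # Hypothetical file layout: it stores a 4x4 Kinect-to-world matrix.
        with open(path) as fh:
            return np.array(json.load(fh)["kinect_to_world"], dtype=float)

    def head_in_world(head_in_kinect, kinect_to_world):
        """Map a head position from Kinect camera coordinates to the world frame."""
        p = np.append(np.asarray(head_in_kinect, dtype=float), 1.0)
        return (kinect_to_world @ p)[:3]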
The projection module comprises the stereoscopic camera pair and the curved screen; the projected images of the stereoscopic camera pair processed by the rendering module are pasted onto the curved screen, a mapping between projected-image space and curved-screen space is established, and the projection is completed.
In this system, three BenQ MS3081 3D projectors form a projection array. Each projector outputs an 800 x 600 image at 120 Hz, and after splicing and blending the array delivers a 2000 x 600 image at 120 Hz. As shown in Fig. 6, the projection screen is an arc-shaped screen whose largest longitudinal section is a rectangle 1.75 m high and 5.6 m long, with a distance of 0.85 m from this section to the top of the arc.
Embodiment 2:
As described in the background section, stereoscopic projection systems in the prior art cannot render the scene image in real time from the user's own viewpoint and cannot let the user observe the virtual image of a scene or object from different angles by moving around, which is what would create the illusion that the scene or object really exists in the physical world; the sense of presence and the user experience are therefore poor. This embodiment provides an immersive stereoscopic rendering and projection method for a moving viewpoint on a curved screen. By tracking the positions of the user's eyes and using the relative position of the user and the projection screen together with the shape of the projection screen itself, the virtual scene is stereoscopically rendered and projected in real time, so that the user can roam naturally through the projected virtual scene by walking about and can observe it from different positions and angles. This removes the drawback that large-screen stereoscopic imaging can only be viewed from a fixed position, greatly enhances the user's sense of participation, lets the user truly merge into the virtual world, and delivers a striking, lifelike 3D stereoscopic effect.
To achieve the above goals, this embodiment adopts the following alternative technical solution:
An immersive stereoscopic rendering and projection method for a moving viewpoint on a curved screen, based on the system of Embodiment 1, the method comprising the following steps:
(1) acquiring user information: tracking the position and posture of the user's head, and estimating the positions of the user's eyes and the gaze direction;
(2) setting stereoscopic rendering parameters: adjusting the parameters of the stereoscopic camera pair in real time according to the eye positions and gaze direction estimated in step (1), and performing moving-viewpoint stereoscopic rendering;
(3) dynamic parallax adjustment: post-processing the parallax of the rendered stereoscopic image pair according to the relative position of the user and the screen and the geometric shape of the projection screen, so that the user sees a comfortable and correct stereoscopic image at his or her current position;
(4) user-perception optimization: further optimizing the parallax-adjusted images of step (3) with respect to head posture and eye movement, respectively;
(5) projecting the stereoscopic scene, which the user observes and roams through with stereoscopic glasses.
In this embodiment, step (1) acquires the user information: the position and posture of the user's head are tracked, and the positions of the user's eyes and the gaze direction are estimated.
In step (1), the head posture acquisition device acquires the user's head posture information and transmits it to the binocular position estimation unit; the head position acquisition device acquires the user's head position information and transmits it to the binocular position estimation unit; the binocular position estimation unit estimates, from the received head posture and position information, the positions of the user's eyes relative to the curved screen and the gaze direction, transmits them to the rendering module, and the method proceeds to step (2).
In this embodiment, step (2) sets the stereoscopic rendering parameters: according to the eye positions and gaze direction estimated in step (1), the parameters of the stereoscopic camera pair are adjusted in real time and moving-viewpoint stereoscopic rendering is performed.
In step (2), the convergence plane is the plane in which the asymmetric view frustums of the left and right cameras of the stereoscopic camera pair intersect. The view frustums of the stereoscopic camera pair are illustrated in Fig. 2(a): the asymmetric frustums of the two cameras meet at the convergence plane, which is a rectangular plane in the virtual scene, set in advance, whose position and shape remain unchanged.
The projection plane is the rectangular plane, corresponding to the convergence plane, on which the user observes the projected picture; the view frustums of the user's eyes are illustrated in Fig. 2(b). If the projection screen is flat, the projection plane is the imaged rectangular region on the screen; for a curved projection screen, the projection plane is the imaginary rectangular plane joining the left and right ends of the imaged region of the curved screen.
Before the stereoscopic camera parameters are adjusted in real time for moving-viewpoint rendering, the correspondence between the convergence plane and the projection plane is established. The position (height above the ground) and the aspect ratio of the convergence plane in Fig. 2(a) are both determined by the real-world projection plane of Fig. 2(b): the aspect ratio of the projection plane (obtained by measurement) determines the aspect ratio of the convergence plane, and the position of the convergence plane in the virtual scene determines the region of the virtual scene that the user can see. The correspondence between the convergence plane and the projection plane is constrained by the aspect ratio, and the ratio of the side lengths of the convergence plane to those of the projection plane is K.
In step (2), the stereoscopic camera parameters comprise the position of the stereoscopic camera pair and the projection matrices of the stereoscopic camera pair.
The position of the stereoscopic camera pair relative to the convergence plane is set according to the position of the user's eyes relative to the projection plane; the ratio between the offset of the camera pair from the convergence plane and the offset of the user's eyes from the projection plane is K.
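A minimal sketch of this placement rule (frame origins, names, and the numpy usage are assumptions of the illustration): the offset of each eye from the center of the physical projection plane is scaled by the ratio K and applied around the center of the virtual convergence plane.

    import numpy as np

    def place_stereo_cameras(eye_left, eye_right, projection_center,
                             convergence_center, K):
        """Scale the eye offsets (relative to the projection plane) by K to obtain
        the left/right camera positions relative to the convergence plane."""
        eye_left = np.asarray(eye_left, dtype=float)
        eye_right = np.asarray(eye_right, dtype=float)
        projection_center = np.asarray(projection_center, dtype=float)
        convergence_center = np.asarray(convergence_center, dtype=float)
        cam_left = convergence_center + K * (eye_left - projection_center)
        cam_right = convergence_center + K * (eye_right - projection_center)
        return cam_left, cam_right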
The projection matrices of the stereoscopic camera pair are computed in real time from the geometry of the convergence plane and the positions of the cameras. Corresponding to the projection screen, the geometry of the convergence plane (a rectangle) in the virtual-scene coordinate system, i.e. the coordinates of its four vertices, is kept constant. Let the four vertices of the convergence plane in the virtual-scene coordinate system be LU, LD, RU, RD; then, given the position and orientation of the stereoscopic camera pair, the vertices are transformed into the camera coordinate system, and together with the near clipping plane n and the far clipping plane f the view frustum of each camera can be computed. Let l, r, t, b be the left, right, top, and bottom extents of the near clipping plane of the frustum. The projection matrix of each camera of the stereoscopic pair is then the asymmetric-frustum perspective projection matrix

    P = | 2n/(r-l)    0           (r+l)/(r-l)    0          |
        | 0           2n/(t-b)    (t+b)/(t-b)    0          |
        | 0           0          -(f+n)/(f-n)   -2fn/(f-n)  |
        | 0           0          -1              0          |

where n and f are the near and far clipping planes in the camera coordinate system, and l, r, t, b are the left, right, top, and bottom extents of the near clipping plane of the view frustum.
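A sketch of how the near-plane extents l, r, t, b might be derived from the convergence-plane corners LU, LD, RU, RD once they have been transformed into the camera coordinate system; the frame convention below (camera at the origin looking down the -z axis) and the function name are assumptions of this illustration.

    import numpy as np

    def frustum_extents(corners_cam, n):
        """corners_cam: 4x3 array of LU, LD, RU, RD in camera coordinates
        (camera at the origin, looking down the -z axis); n: near distance.
        Returns (l, r, t, b), the near-plane extents of the asymmetric frustum."""
        corners_cam = np.asarray(corners_cam, dtype=float)
        depth = -corners_cam[:, 2]          # positive distances to the plane
        scale = n / depth                   # project each corner onto the near plane
        xs = corners_cam[:, 0] * scale
        ys = corners_cam[:, 1] * scale
        return xs.min(), xs.max(), ys.max(), ys.min()  # l, r, t, b

The resulting extents, together with n and f, feed the asymmetric projection matrix given above.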
In this embodiment, step (3) performs the dynamic parallax adjustment: according to the relative position of the user and the screen and the geometric shape of the projection screen, the parallax of the rendered stereoscopic image pair is post-processed so that the user sees a comfortable and correct stereoscopic image at his or her current position.
Compared with a flat screen, a curved screen offers the user a wider viewing angle and a more comfortable experience, but stereoscopic projection onto a curved screen is also more complex. On the one hand, a large curved screen needs several projectors working together to display a high-resolution picture. On the other hand, the screen shape changes the parallax of the picture received by the user's eyes, which displaces and distorts the stereoscopic image. As shown in Fig. 3(a) (the view frustums of the stereoscopic camera pair) and Fig. 3(b), under the influence of the arc-shaped screen the image points S2l, S2r projected onto the arc screen form a virtual image point P' in the user's eyes that deviates from the virtual image point P formed by the image points S1l, S1r on a flat screen. Yet the virtual image point P in Fig. 3(b) is the correct image that the user should see.
To analyze and correct the parallax change caused by the curved screen, a virtual flat screen is defined in front of the curved screen. The mapping relations between curved-screen space, flat-screen space, and projected-image space are as follows:
where:
(u, v) denotes the image space rendered by the stereoscopic camera, i.e. the projected-image space;
(s2, t2) denotes the parameterized curved-screen space;
(x2, y2, z2) denotes the three-dimensional coordinate on the curved screen corresponding to (s2, t2);
M_I←C denotes the mapping between curved-screen space (s2, t2) and image space (u, v);
M2 denotes the mapping between curved-screen space (s2, t2) and the three-dimensional curved-screen point (x2, y2, z2), which can be obtained by parameterizing the curved screen;
(s1, t1) denotes the parameterized flat-screen space;
(x1, y1, z1) denotes the three-dimensional coordinate on the flat screen corresponding to (s1, t1);
M_P←I denotes the mapping between flat-screen space (s1, t1) and image space (u, v);
M1 denotes the mapping between flat-screen space (s1, t1) and the three-dimensional flat-screen point (x1, y1, z1), which can be obtained by parameterizing the flat screen.
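As a concrete illustration of what the parameterization M2 can look like, the sketch below models the arc-shaped screen of Embodiment 1 (chord 5.6 m, sagitta 0.85 m, height 1.75 m) as a section of a circular cylinder; the coordinate frame, the normalization of (s2, t2) to the unit square, and the numpy usage are assumptions of this illustration.

    import numpy as np

    CHORD, SAGITTA, HEIGHT = 5.6, 0.85, 1.75   # screen dimensions from Embodiment 1

    # Radius of the circular arc through a chord of length CHORD with sagitta SAGITTA.
    RADIUS = (CHORD**2 / 4.0 + SAGITTA**2) / (2.0 * SAGITTA)
    HALF_ANGLE = np.arcsin(CHORD / (2.0 * RADIUS))

    def arc_screen_point(s2, t2):
        """Realizes the mapping M2: (s2, t2) in [0, 1]^2 -> (x2, y2, z2) on the arc
        screen.  The chord of the arc lies in the plane z = 0, the arc bulges
        towards -z, and y is the vertical direction."""
        theta = (s2 - 0.5) * 2.0 * HALF_ANGLE
        x2 = RADIUS * np.sin(theta)
        z2 = -(RADIUS * np.cos(theta) - (RADIUS - SAGITTA))
        y2 = (t2 - 0.5) * HEIGHT
        return np.array([x2, y2, z2])

With these assumptions the virtual flat screen coincides with the chord plane z = 0, and M1 reduces to an affine mapping of (s1, t1) over that rectangle.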
As shown in Fig. 3(a), the stereoscopic camera pair renders the image points Sl, Sr of a scene point P. These are mapped, wallpaper-fashion, onto the screens of Fig. 3(b): S1l, S1r on the flat screen of Fig. 4(a) and S2l, S2r on the curved screen. The projection behaves as if the projected image were pasted onto the curved screen like a sheet of wallpaper. The virtual image point P' fused in the eyes from S2l, S2r on the curved screen is displaced relative to the virtual image point P formed by S1l, S1r on the flat screen. Moreover, as the user moves, the position of the virtual image point P' changes with the user's position instead of remaining fixed in space as the virtual image point P does. To make moving-viewpoint stereoscopic rendering and projection on a curved screen behave exactly as on a flat screen, i.e. to keep the virtual image point P' coincident with the virtual image point P at all times, a method is proposed here that dynamically adjusts the parallax of the stereoscopic images rendered by the camera pair.
As shown in Fig. 4(b), let I1(u1, v1) be the image rendered by the left camera and let I2(u2, v2) be the result of applying the dynamic parallax adjustment to I1(u1, v1). In the figure, the image point S1 is the point at which the image I1(u1, v1) projects the point P onto the flat screen, and the image point S2 is the point at which the image I2(u2, v2) projects the point P onto the curved screen. After the dynamic parallax adjustment, the image points S1 and S2 must lie on the same perspective projection line through the position of the user's left eye; this corrects the positional deviation between the image points S1l, S2l and the left-eye position shown in Fig. 4(a). The perspective-projection relation between S1 and S2 can be written as

    M · T · (x1, y1, z1, 1)^T  ≅  M · T · (x2, y2, z2, 1)^T,

i.e. the two points project to the same image point of the left camera (equality up to the homogeneous scale factor), where:
(x1, y1, z1) are the spatial coordinates of the image point S1;
(x2, y2, z2) are the spatial coordinates of the image point S2;
M is the perspective projection matrix of the left camera corresponding to the left eye;
T is the transformation from the world coordinate system to the left-camera coordinate system.
The dynamic parallax adjustment is a post-processing step applied to the images rendered by the stereoscopic camera pair. From the analysis above, the image I1(u1, v1) obtained by rendering can be projected directly onto a flat screen and achieves the moving-viewpoint projection effect; projecting it onto a curved screen, however, displaces and deforms the virtual image, so the image I1(u1, v1) must undergo the parallax adjustment before it is projected.
In step (3), let the parallax-adjusted image I2(u2, v2) be the output image and let the image I1(u1, v1) rendered in step (2) be the input image.
The dynamic parallax adjustment proceeds as follows:
(3-1) for each pixel of the output image I2(u2, v2), compute the corresponding three-dimensional point S2(x2, y2, z2) on the curved screen.
In step (3-1), the three-dimensional point S2(x2, y2, z2) on the curved screen corresponding to each pixel of the output image I2(u2, v2) is computed as

    S2(x2, y2, z2) = M2(M_I←C^{-1}(u2, v2))

where (u, v) denotes image space; (x2, y2, z2) is the three-dimensional coordinate on the curved screen corresponding to (s2, t2); M_I←C is the mapping between curved-screen space (s2, t2) and image space (u, v); and M2 is the mapping between curved-screen space (s2, t2) and the three-dimensional curved-screen point (x2, y2, z2), which can be obtained by parameterizing the curved screen.
(3-2) compute the three-dimensional point S1(x1, y1, z1) on the virtual flat screen corresponding to S2(x2, y2, z2).
In step (3-2), S1(x1, y1, z1) is obtained from the perspective-projection relation above: S1 is the point of the virtual flat screen that projects to the same left-camera image point as S2, i.e.

    M · T · (x1, y1, z1, 1)^T  ≅  M · T · (x2, y2, z2, 1)^T,

where (x1, y1, z1) are the spatial coordinates of the image point S1; (x2, y2, z2) are the spatial coordinates of the image point S2; M is the perspective projection matrix of the left camera corresponding to the left eye; and T is the transformation from the world coordinate system to the left-camera coordinate system.
(3-3) compute the pixel of the input image I1(u1, v1) corresponding to S1(x1, y1, z1) and copy it into the output image I2(u2, v2), which gives

    (u2, v2) = M_I←C(M_C←P(M_P←I(u1, v1)))        (9)

where M_P←I denotes the mapping from (u1, v1) to (s1, t1); M_C←P the mapping from (s1, t1) to (s2, t2); M_I←C the mapping from (s2, t2) to (u2, v2); (u1, v1) is the image space rendered by the stereoscopic camera; (u2, v2) is the image space after dynamic parallax adjustment; (s1, t1) is the parameterized flat-screen space; and (s2, t2) is the parameterized curved-screen space.
In step (3-3), the pixel of the input image I1(u1, v1) corresponding to S1(x1, y1, z1) is computed as

    (u1, v1) = M_P←I^{-1}(M1^{-1}(x1, y1, z1))

where (x1, y1, z1) is the three-dimensional coordinate on the flat screen corresponding to (s1, t1); M_P←I is the mapping between flat-screen space (s1, t1) and image space (u, v); and M1 is the mapping between flat-screen space (s1, t1) and the three-dimensional flat-screen point (x1, y1, z1), which can be obtained by parameterizing the flat screen.
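A sketch of the geometric core of steps (3-2) and (3-3): the flat-screen point S1 is obtained by intersecting the ray from the left eye through S2 with the plane of the virtual flat screen, and is then projected by the left camera to recover the source pixel. The plane convention (virtual flat screen in the z = 0 plane, matching the arc parameterization sketched earlier), the omission of image row-flipping conventions, and the helper names are assumptions of this illustration.

    import numpy as np

    def flat_screen_point(eye, S2, screen_z=0.0):
        """Intersect the ray eye -> S2 with the virtual flat-screen plane z = screen_z."""
        eye, S2 = np.asarray(eye, dtype=float), np.asarray(S2, dtype=float)
        d = S2 - eye
        lam = (screen_z - eye[2]) / d[2]
        return eye + lam * d                      # S1

    def source_pixel(S1, M, T, width, height):
        """Project S1 with the left camera (projection M, world-to-camera T) and
        convert the normalized device coordinates to pixel coordinates (u1, v1)."""
        p = M @ T @ np.append(S1, 1.0)
        ndc = p[:3] / p[3]
        u1 = (ndc[0] * 0.5 + 0.5) * width
        v1 = (ndc[1] * 0.5 + 0.5) * height
        return u1, v1

In the running system this per-pixel computation would typically be vectorized or moved to the GPU.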
In this embodiment, step (4) performs the user-perception optimization: the parallax-adjusted images of step (3) are further optimized with respect to head posture and eye movement, respectively.
In step (4), regarding head posture: in a moving-viewpoint projection system, the positions of the user's eyes must be tracked in order to place the left and right cameras of the rendering system. In ordinary tracking, however, the eye positions are not tracked directly; instead the head position is tracked, the center point between the eyes is estimated from it, and the left-eye and right-eye positions are then estimated from the average interpupillary distance. This way of estimating the eye positions ignores the influence of head posture. It guarantees a correct virtual image only while the user directly faces the screen; when the user observes the projected virtual image while moving, the gaze is often directed sideways toward the screen, because the gaze focus lies on the virtual image being observed. This slightly affects the position and shape of the virtual image that the left and right images form in the user's eyes.
As shown in Fig. 5(a), let A and B be the actual positions of the left and right eyes, let P be the eye center estimated from the head position, and let e be the average interpupillary distance. Then A', B' are the estimated eye positions, centered on P and separated by e, and i1, i2 are the image points of the scene point O' rendered by the left and right cameras corresponding to A', B'. The virtual image point that i1, i2 form in the user's eyes, however, is O rather than the intended point O'. As a result, while moving and observing the virtual stereoscopic image the user perceives that the virtual image is not fixed at one position in space but drifts left and right, and a slight twisting deformation of the virtual image can also be observed.
The parallax-adjusted images of step (3) are further optimized as follows:
the head posture is measured with an inertial sensor and used to refine and perfect the estimate of the eye positions. In the refined estimate, e is the average interpupillary distance; k is the tangent of the angle between the interocular line AB and the flat screen; b is the distance along the z direction from the point O' to the line A''B'', where O' is the point of interest of the user's gaze estimated from the head position and posture, i.e. the coordinate point on the first virtual object met along the user's gaze direction; and A'', B'' are the re-estimated positions of the left and right eyes.
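A sketch of the idea behind this head-posture correction (the patent's exact formula is not reproduced here): instead of placing the eyes on a baseline parallel to the screen, the baseline is rotated by the head yaw reported by the inertial sensor, so that the re-estimated positions A'', B'' stay perpendicular to the actual gaze direction. All names, and the use of yaw only, are assumptions of this illustration.

    import numpy as np

    def corrected_eyes(head_pos, head_yaw_rad, ipd=0.065):
        """Re-estimate the left/right eye positions using the IMU head yaw.

        With yaw = 0 the interocular baseline is parallel to the screen; a
        non-zero yaw rotates it so it stays perpendicular to the gaze direction."""
        head_pos = np.asarray(head_pos, dtype=float)
        baseline = np.array([np.cos(head_yaw_rad), 0.0, np.sin(head_yaw_rad)])
        return head_pos - 0.5 * ipd * baseline, head_pos + 0.5 * ipd * baseline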
Tests of the invention show that after the eye-position estimate is corrected with the user's head posture, the left-right drift of the virtual image perceived while moving and observing the virtual stereoscopic image is greatly reduced.
Regarding eye movement in step (4): in tests of the actual user experience, another problem stood out. When the user observes the virtual image while moving, the virtual image appears to jitter up and down, and the faster the movement, the more obvious the jitter. In the real world, however, a person who stares at an object while moving does not perceive the object as jittering up and down. Repeated experiments and analysis revealed the key difference between observing a real object and observing a projected virtual object: while a person walks, the head moves along a wave-like, up-and-down trajectory, yet when observing a real object the person is unaware of this oscillation, because the eyes stay converged on the observed object and the rotation of the eyeballs compensates for the deviation caused by the vertical head motion, keeping the image of the object in the central region of the retina. When observing a projected virtual stereoscopic image, by contrast, the eyes converge on the projection screen rather than on the virtual image floating in front of it; during walking, the eyeball rotation keeps the image of the screen stable in the eye, while the wave-like head trajectory makes the position of the stereoscopic camera pair in the rendering system oscillate up and down as well, producing the perceived vertical jitter of the virtual image, and the further the object pops out of the screen, the stronger the jitter. In summary, while the user moves, eye movement compensates for the apparent jitter caused by head motion when observing a real object; but when observing a virtual object, because the convergence plane of the eyes does not lie on the projected virtual image, eye movement cannot compensate for the jitter of the virtual image caused by the head motion.
The parallax-adjusted images of step (3) are therefore further optimized as follows:
the user's behavioral state is detected; while the user is in a smooth walking state, the head-height value y in the head-position detection system is locked, and when the user leaves the smooth walking state and enters the next state, the head-height value y is released.
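A minimal sketch of the described height lock (the class, the state names, and how "smooth walking" is detected are assumptions of this illustration): while the user is judged to be walking smoothly, the vertical component of the tracked head position is frozen, and when that state ends the live value is used again.

    class HeadHeightStabilizer:
        """Suppress vertical image jitter by freezing the head height y while the
        user is in a smooth walking state."""

        def __init__(self):
            self.locked_y = None

        def filter(self, head_pos, is_smooth_walking):
            x, y, z = head_pos
            if is_smooth_walking:
                if self.locked_y is None:
                    self.locked_y = y      # lock the height on entering the state
                return (x, self.locked_y, z)
            self.locked_y = None           # release the lock when the state ends
            return (x, y, z)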
Later comparative user tests and analysis show that with this optimization the user is essentially unaware of the previous vertical jitter, and the on-the-spot sense of reality and immersion are greatly strengthened.
In this embodiment, step (5) projects the stereoscopic scene, and the user observes and roams through the virtual stereoscopic scene with stereoscopic glasses.
In step (5), the projected images of the stereoscopic camera pair produced by steps (1) to (4) are pasted onto the curved screen, the mapping between projected-image space and curved-screen space is established, and the projection is completed.
This embodiment uses three-channel, large arc-screen stereoscopic projection. The rendered stereoscopic picture is high-resolution, clear, and lifelike; the field-of-view coverage is large and the immersion strong; and thanks to the pronounced out-of-screen stereoscopic effect the sense of reality is stronger than with a small flat screen, giving an advantage in stereoscopy and immersion. By tracking the user's head posture and exploiting the regularities of eye movement, the left-right drift and the vertical jitter of the virtual image are successfully eliminated, giving the user a better experience.
Fig. 7 and Fig. 8 show the projection effect on a flat screen and on an arc-shaped screen, respectively; the left images of Fig. 7 and Fig. 8 show the scene seen by a user standing on the left, and the right images the scene seen by a user standing on the right. The cube in the foreground is the out-of-screen part of the stereoscopic projection and appears to float just in front of the user; in user tests, many users tried to touch it with their hands. Comparative tests of moving-viewpoint stereoscopic rendering and projection on the flat screen and on the arc-shaped screen show that the stereoscopic effect of the arc-shaped screen is better: the drift and jitter of the virtual image are smaller, the sense of reality and immersion are stronger, and users prefer it. Because the viewing angle of the arc-shaped screen is wider, the user's range of movement is larger than with the flat screen, so the user can explore the virtual scene more freely.
In addition, on a flat screen the zero-parallax plane of the left-eye and right-eye images coincides with the projection plane (the flat screen itself), so the user notices the zero-parallax plane more easily while observing, which reduces immersion. In the moving-viewpoint stereoscopic projection onto the arc-shaped screen described here, however, the zero-parallax plane is a rectangular section in front of the projection surface (the arc-shaped screen) and does not coincide with the physical screen, so the user can hardly perceive it, and immersion is greatly strengthened.
Beneficial effects of the present invention:
1. the immersion solid rendering optical projection system and its method of moving view point, are created on a kind of curve screens of the invention Property the three-dimensional rendering and project extended to any curve screens by moving view point on, arbitrary surface screen moving view point has projected U.S.A solves the problems such as narrow viewing angle in the projection of flat screen moving view point, parallax free face is obvious, substantially increases three-dimensional feeling of immersion, And make the application scenarios of moving view point stereoprojection more extensive;The present invention, can be with real-time rendering high-resolution to algorithm optimization The stereoscopic picture plane of rate, and be not to have very high requirement to machine, while the basic accuracy of guarantee and frame number, and reduction pair The requirement of the performance of whole system practical can be applied in practice.
2. For the deformation and displacement of the stereoscopic virtual image caused by the complex relative relationship between the user's viewpoint and the curved screen, the immersive moving-viewpoint stereoscopic rendering and projection system and method on a curved screen of the present invention propose a dynamic parallax adjustment method: according to the shape of the curved screen and the relative position of the user and the screen, the parallax of the projected virtual scene is dynamically adjusted by mapping, that is, the left- and right-eye images after stereoscopic rendering are post-processed, so that the user always sees a comfortable and correct stereoscopic virtual image while walking about in the virtual scene.
3. The immersive moving-viewpoint stereoscopic rendering and projection system and method on a curved screen of the present invention use a large three-channel arc-screen stereoscopic projection technique. The rendered stereoscopic picture has high resolution, is clear and lifelike, covers a wide field of view, and gives a strong sense of immersion; because of the pronounced out-of-screen stereoscopic effect, the sense of reality is also stronger, giving advantages in stereoscopy and immersion over a small flat screen. By tracking the user's head pose and exploiting regularities of eye movement, the left-right drift and up-down jitter of the virtual image are successfully eliminated, making the user experience better.
4. The immersive moving-viewpoint stereoscopic rendering and projection system and method on a curved screen of the present invention allow the user to observe a virtual three-dimensional object from different sides, overcoming the drawback of cinema systems in which viewers can only watch passively from a fixed position, truly realizing the experience of being fully immersed in the virtual scene, and offering a new approach for interactive cinema systems. The present invention has good scalability, can provide a better experience, and can be combined with many other technologies for applications in entertainment, education, training, demonstration, and other fields.
The foregoing is merely a preferred embodiment of the present application and is not intended to limit the present application; for those skilled in the art, various changes and modifications to the present application are possible. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present application shall be included within the scope of protection of the present application.

Claims (10)

1. An immersive moving-viewpoint stereoscopic rendering and projection system on a curved screen, characterized in that the system comprises:
a tracking module, configured to track the user's head position and pose information and to estimate the positions of the user's eyes and their gaze direction;
a rendering module, configured to set the parameters of the stereoscopic camera pair according to the eye position information of the user and the position and shape information of the curved screen, and, after rendering is completed, to post-process the images generated by the left and right cameras according to the shape information of the screen; before the stereoscopic camera pair parameters are adjusted in real time for moving-viewpoint stereoscopic rendering, a correspondence between a convergence plane and a projection plane is established, the correspondence being constrained by the aspect ratio, and the ratio of the lengths of the convergence plane and the projection plane being K; the convergence plane is the plane where the asymmetric view frusta of the left and right cameras of the stereoscopic camera pair intersect, the convergence plane is a rectangular plane, and the convergence plane is a preset plane whose position and shape remain unchanged; the projection plane is the rectangular plane, corresponding to the convergence plane, on which the user observes the projected picture; for a curved projection screen, the projection plane is the hypothetical rectangular plane connecting the left and right ends of the imaging region on the curved projection screen; and
a projection module, configured to project the images rendered by the rendering module onto the curved screen.
2. The immersive moving-viewpoint stereoscopic rendering and projection system on a curved screen according to claim 1, characterized in that the tracking module comprises a head pose acquisition device, a head position acquisition device, and an eye position estimation unit; the head pose acquisition device acquires the user's head pose information and transmits it to the eye position estimation unit; the head position acquisition device acquires the user's head position information and transmits it to the eye position estimation unit; the eye position estimation unit, based on the received head pose information and head position information, estimates the positions of the user's eyes relative to the curved screen and the gaze direction of the user's eyes, and transmits them to the rendering module;
the projection module comprises the stereoscopic camera pair and the curved screen; after rendering by the rendering module, the projected images of the stereoscopic camera pair are mapped onto the curved screen, a mapping between the projected-image space and the curved-screen space is established, and the projection is completed.
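Claim 2 leaves the eye-position estimation itself open. The following is a minimal sketch of one plausible implementation, assuming the tracker reports a head-centre position and a rotation matrix and that the eyes sit half an interpupillary distance to either side of the head centre along the head's right axis; the names estimate_eyes, head_pos, head_rot and the default ipd value are illustrative, not taken from the patent.

```python
import numpy as np

def estimate_eyes(head_pos, head_rot, ipd=0.065):
    """Estimate left/right eye positions and gaze direction from the head pose.

    head_pos : (3,) head centre in screen/world coordinates
    head_rot : (3, 3) head rotation matrix (columns assumed: right, up, forward)
    ipd      : interpupillary distance in metres (illustrative default)
    """
    right = head_rot[:, 0]            # head's local +x axis
    forward = head_rot[:, 2]          # head's local +z axis, taken as the gaze direction
    left_eye = head_pos - 0.5 * ipd * right
    right_eye = head_pos + 0.5 * ipd * right
    return left_eye, right_eye, forward
```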
3. An immersive moving-viewpoint stereoscopic rendering and projection method on a curved screen, based on the immersive moving-viewpoint stereoscopic rendering and projection system on a curved screen according to any one of claims 1-2, characterized in that the method comprises the following steps:
(1) acquiring user information: tracking the user's head position and pose information, and estimating the positions of the user's eyes and their gaze direction;
(2) setting stereoscopic rendering parameters: according to the eye positions and gaze direction estimated in step (1), adjusting the stereoscopic camera pair parameters in real time and performing moving-viewpoint stereoscopic rendering;
(3) dynamic parallax adjustment: according to the relative positional relationship between the user and the screen and the geometric shape information of the projection screen, post-processing the parallax of the rendered stereoscopic image pair so that the user sees a comfortable and correct stereoscopic image at his or her position;
(4) user-perception optimization: further optimizing the dynamically parallax-adjusted images of step (3) with respect to the user's head pose and eyeball movement, respectively;
(5) projecting the stereoscopic scene, the user wearing stereoscopic glasses to observe and roam the virtual stereoscopic scene;
in step (2), before the stereoscopic camera pair parameters are adjusted in real time for moving-viewpoint stereoscopic rendering, a correspondence between a convergence plane and a projection plane is established, the correspondence being constrained by the aspect ratio, and the ratio of the lengths of the convergence plane and the projection plane being K;
the convergence plane is the plane where the asymmetric view frusta of the left and right cameras of the stereoscopic camera pair intersect; the convergence plane is a rectangular plane, and the convergence plane is a preset plane whose position and shape remain unchanged;
the projection plane is the rectangular plane, corresponding to the convergence plane, on which the user observes the projected picture; for a curved projection screen, the projection plane is the hypothetical rectangular plane connecting the left and right ends of the imaging region on the curved projection screen.
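For orientation only, the following is a minimal per-frame sketch of how steps (1)-(5) of claim 3 could be chained. Every callable passed in (track_head, estimate_eyes, render_pair, parallax_adjust, perception_optimize, project) is a placeholder supplied by the caller, not an interface defined in the patent.

```python
def frame_step(track_head, estimate_eyes, render_pair,
               parallax_adjust, perception_optimize, project):
    """One pass of steps (1)-(5); all arguments are caller-supplied callables."""
    head_pos, head_rot = track_head()                       # step (1): head position and pose
    left_eye, right_eye, gaze = estimate_eyes(head_pos, head_rot)
    img_l, img_r = render_pair(left_eye, right_eye, gaze)   # step (2): moving-viewpoint stereo rendering
    img_l, img_r = parallax_adjust(img_l, img_r)            # step (3): dynamic parallax adjustment
    img_l, img_r = perception_optimize(img_l, img_r)        # step (4): user-perception optimization
    project(img_l, img_r)                                   # step (5): projection onto the curved screen
```

In a running system this function would be called once per rendered frame, with the tracking and rendering callables bound to the concrete tracking, rendering, and projection modules of claim 1.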
4. The immersive moving-viewpoint stereoscopic rendering and projection method on a curved screen according to claim 3, characterized in that in step (1), the user's head pose information is acquired with a head pose acquisition device and transmitted to the eye position estimation unit; the user's head position information is acquired with a head position acquisition device and transmitted to the eye position estimation unit; the eye position estimation unit, based on the received head pose information and head position information, estimates the positions of the user's eyes relative to the curved screen and the gaze direction of the user's eyes, transmits them to the rendering module, and the method proceeds to step (2).
5. The immersive moving-viewpoint stereoscopic rendering and projection method on a curved screen according to claim 3, characterized in that:
in step (2), the stereoscopic camera pair parameters comprise the position of the stereoscopic camera pair and the projection matrices of the stereoscopic camera pair;
the position of the stereoscopic camera pair is the position of the stereoscopic camera pair relative to the convergence plane, set according to the relative position of the user's eyes and the projection plane; the ratio of the offset distance of the stereoscopic camera pair relative to the convergence plane to the offset distance of the user's eyes relative to the projection plane is K;
the projection matrices of the stereoscopic camera pair are computed in real time from the geometric information of the convergence plane and the position information of the stereoscopic camera pair,
where n is the near clipping plane in the stereoscopic camera pair's coordinate system, f is the far clipping plane in the stereoscopic camera pair's coordinate system, and l, r, t, b are respectively the left, right, top, and bottom edges of the near clipping plane of the stereoscopic camera's view frustum.
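The projection-matrix formula referred to in claim 5 is reproduced in the original only as an image and is not available here. For reference, the standard asymmetric (off-axis) frustum projection matrix consistent with the parameter definitions above reads, under the common OpenGL-style convention (the exact form and sign conventions used in the patent figure may differ):

```latex
P =
\begin{pmatrix}
\dfrac{2n}{r-l} & 0 & \dfrac{r+l}{r-l} & 0 \\[6pt]
0 & \dfrac{2n}{t-b} & \dfrac{t+b}{t-b} & 0 \\[6pt]
0 & 0 & -\dfrac{f+n}{f-n} & -\dfrac{2fn}{f-n} \\[6pt]
0 & 0 & -1 & 0
\end{pmatrix}
```

The off-axis terms (r+l)/(r-l) and (t+b)/(t-b) are what let the left and right cameras share the fixed convergence plane while their positions shift with the user's eyes.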
6. The immersive moving-viewpoint stereoscopic rendering and projection method on a curved screen according to claim 3, characterized in that in step (3), the dynamically parallax-adjusted image I2(u2, v2) is set as the output image, and the image I1(u1, v1) rendered in step (2) is set as the input image;
the specific steps of the dynamic parallax adjustment are as follows:
(3-1) for each pixel of the output image I2(u2, v2), calculating its corresponding three-dimensional coordinate point S2(x2, y2, z2) on the curved screen;
(3-2) calculating the three-dimensional coordinate point S1(x1, y1, z1) on the flat screen corresponding to S2(x2, y2, z2);
(3-3) calculating the pixel in the input image I1(u1, v1) corresponding to S1(x1, y1, z1), and filling it into the output image I2(u2, v2), obtaining:
(u2, v2) = M_{I←C}(M_{C←P}(M_{P←I}(u1, v1)))    (2)
where M_{P←I} denotes the mapping from (u1, v1) to (s1, t1); M_{C←P} denotes the mapping from (s1, t1) to (s2, t2); M_{I←C} denotes the mapping from (s2, t2) to (u2, v2); (u1, v1) denotes the image space rendered by the stereoscopic camera; (u2, v2) denotes the image space after dynamic parallax adjustment; (s1, t1) denotes the parameterized flat-screen space; and (s2, t2) denotes the parameterized curved-screen space.
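The following is a minimal sketch of steps (3-1)-(3-3) as a per-pixel backward warp, assuming the three mappings composed in equation (2) are available as callables; image_to_curved, curved_to_flat and flat_to_image are illustrative names, not interfaces defined in the claims.

```python
import numpy as np

def dynamic_parallax_adjust(input_img, image_to_curved, curved_to_flat, flat_to_image):
    """Backward warp implementing steps (3-1)-(3-3) for one eye's image.

    input_img       : H x W x 3 image rendered by one camera of the stereo pair
    image_to_curved : (u2, v2) -> S2, 3D point on the curved screen        (step 3-1)
    curved_to_flat  : S2 -> S1, corresponding 3D point on the flat plane   (step 3-2)
    flat_to_image   : S1 -> (u1, v1), pixel of the rendered input image    (step 3-3)
    """
    h, w = input_img.shape[:2]
    output_img = np.zeros_like(input_img)
    for v2 in range(h):
        for u2 in range(w):
            s2 = image_to_curved(u2, v2)        # point on the curved screen
            s1 = curved_to_flat(s2)             # point on the flat projection plane
            u1, v1 = flat_to_image(s1)          # source pixel in the rendered image
            if 0 <= int(v1) < h and 0 <= int(u1) < w:
                output_img[v2, u2] = input_img[int(v1), int(u1)]
    return output_img
```

In practice the two loops would be vectorized or the mapping precomputed into a per-pixel lookup table, since it depends only on the screen geometry and the current eye position.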
7. The immersive moving-viewpoint stereoscopic rendering and projection method on a curved screen according to claim 6, characterized in that in step (3-1), the three-dimensional coordinate point S2(x2, y2, z2) on the curved screen corresponding to each pixel of the output image I2(u2, v2) is calculated according to the following equation:
where (u, v) denotes the image space; (x2, y2, z2) denotes the three-dimensional coordinate corresponding to (s2, t2) on the curved screen; a mapping relationship exists between the curved-screen space (s2, t2) and the image space (u, v); and M2 denotes the mapping relationship between the curved-screen space (s2, t2) and the three-dimensional coordinate points (x2, y2, z2) of the curved screen, which can be obtained by parameterizing the curved screen.
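The equation and the screen-space mapping symbol of claim 7 are reproduced in the original as images. Claim 7 only requires that M2 be obtainable by parameterizing the curved screen; as one hedged example, a cylindrical arc screen could be parameterized as follows, where radius, arc_angle and height are illustrative dimensions, not values from the patent.

```python
import numpy as np

def cylindrical_m2(s2, t2, radius=3.0, arc_angle=np.deg2rad(120), height=2.4):
    """One plausible M2: map normalized curved-screen coordinates (s2, t2) in [0, 1]^2
    to a 3D point (x2, y2, z2) on a cylindrical arc screen centred on the z axis."""
    theta = (s2 - 0.5) * arc_angle           # horizontal position along the arc
    x2 = radius * np.sin(theta)
    z2 = radius * np.cos(theta)
    y2 = (t2 - 0.5) * height                 # vertical position on the screen
    return np.array([x2, y2, z2])
```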
8. The immersive moving-viewpoint stereoscopic rendering and projection method on a curved screen according to claim 6, characterized in that in step (3-2), the three-dimensional coordinate point S1(x1, y1, z1) on the flat screen corresponding to S2(x2, y2, z2) is calculated according to the following equation:
where (x1, y1, z1) denotes the spatial coordinates of point S1; (x2, y2, z2) denotes the spatial coordinates of point S2; M denotes the perspective projection matrix of the left camera corresponding to the left eye; and T denotes the transformation from the world coordinate system to the left camera coordinate system.
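The equation of claim 8 is likewise reproduced as an image in the original. One plausible geometric reading, consistent with the roles of M and T above, is that S1 is the point where the ray from the left camera's optical centre through S2 meets the flat projection plane; the sketch below follows that assumption, with cam_center, plane_point and plane_normal as illustrative parameters.

```python
import numpy as np

def curved_to_flat(s2, cam_center, plane_point, plane_normal):
    """Intersect the ray from the camera centre through S2 with the flat projection plane.

    s2           : (3,) point on the curved screen
    cam_center   : (3,) optical centre of the left camera (derived from T)
    plane_point  : (3,) any point on the flat projection plane
    plane_normal : (3,) unit normal of the flat projection plane
    """
    d = s2 - cam_center                                   # ray direction
    denom = np.dot(plane_normal, d)
    if abs(denom) < 1e-9:
        raise ValueError("ray is parallel to the projection plane")
    lam = np.dot(plane_normal, plane_point - cam_center) / denom
    return cam_center + lam * d                           # S1 on the flat plane
```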
9. The immersive moving-viewpoint stereoscopic rendering and projection method on a curved screen according to claim 6, characterized in that in step (3-3), the pixel in the input image I1(u1, v1) corresponding to S1(x1, y1, z1) is calculated according to the following equation:
where (x1, y1, z1) denotes the three-dimensional coordinate corresponding to (s1, t1) on the flat screen; a mapping relationship exists between the flat-screen space (s1, t1) and the image space (u, v); and M1 denotes the mapping relationship between the flat-screen space (s1, t1) and the three-dimensional coordinate points (x1, y1, z1) of the flat screen, which can be obtained by parameterizing the flat screen.
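The equation of claim 9 is also not reproduced here. As a hedged sketch of one plausible inverse of M1, a 3D point on the rectangular projection plane can be expressed in the plane's own edge axes to obtain normalized (s1, t1) and then scaled to the pixel grid of the rendered input image; all parameter names are illustrative.

```python
import numpy as np

def flat_to_image(s1_point, plane_origin, plane_u_axis, plane_v_axis, width, height):
    """Map a 3D point on the rectangular projection plane to pixel coordinates (u1, v1).

    plane_origin               : (3,) lower-left corner of the rectangular plane
    plane_u_axis, plane_v_axis : (3,) edge vectors spanning the plane (full length, not unit)
    width, height              : resolution of the rendered input image in pixels
    """
    rel = s1_point - plane_origin
    s1 = np.dot(rel, plane_u_axis) / np.dot(plane_u_axis, plane_u_axis)
    t1 = np.dot(rel, plane_v_axis) / np.dot(plane_v_axis, plane_v_axis)
    u1 = s1 * (width - 1)                     # normalized -> pixel column
    v1 = t1 * (height - 1)                    # normalized -> pixel row
    return u1, v1
```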
10. The immersive moving-viewpoint stereoscopic rendering and projection method on a curved screen according to claim 3, characterized in that in step (4), the specific steps of further optimizing the dynamically parallax-adjusted images of step (3) with respect to the user's head pose are as follows:
detecting the head pose information with an inertial sensor, and further refining the estimation of the user's eye positions; the optimized estimation formula is as follows:
where e is the average interpupillary distance; k is the tangent of the angle between AB and the flat screen; b is the distance from point O' to A''B'' in the z direction, point O' being the user's gaze point of interest estimated from the head position and pose, i.e., the coordinate point where the first virtual object is encountered along the user's gaze direction; and A'', B'' are the re-estimated positions of the left and right eyes;
the specific steps of further optimizing the dynamically parallax-adjusted images of step (3) with respect to the user's eyeball movement are as follows:
detecting the user's behavioral state in the head position detection system; when the user is in a gentle moving state, locking the head position height value y; and when the user leaves the gentle moving state and enters the next state, releasing the head position height value y.
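Claim 10 does not specify how the gentle moving state is detected. The following is a minimal sketch of the height-locking behaviour, assuming a simple speed threshold on the tracked head position decides the state; the class name and the threshold value are illustrative, not taken from the patent.

```python
class HeadHeightStabilizer:
    """Lock the head height y while the user moves gently, release it otherwise."""

    def __init__(self, speed_threshold=0.15):
        self.speed_threshold = speed_threshold   # m/s, illustrative choice
        self.locked_y = None

    def update(self, head_pos, head_speed):
        """head_pos: (x, y, z) tuple from the tracker; head_speed: scalar speed in m/s."""
        x, y, z = head_pos
        if head_speed < self.speed_threshold:    # gentle moving state
            if self.locked_y is None:
                self.locked_y = y                # lock the height on entering the state
            return (x, self.locked_y, z)
        self.locked_y = None                     # release the height when the state ends
        return (x, y, z)
```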
CN201710501792.4A 2017-06-27 2017-06-27 The immersion solid rendering optical projection system and its method of moving view point on curve screens Active CN107333121B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710501792.4A CN107333121B (en) 2017-06-27 2017-06-27 The immersion solid rendering optical projection system and its method of moving view point on curve screens

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710501792.4A CN107333121B (en) 2017-06-27 2017-06-27 The immersion solid rendering optical projection system and its method of moving view point on curve screens

Publications (2)

Publication Number Publication Date
CN107333121A CN107333121A (en) 2017-11-07
CN107333121B true CN107333121B (en) 2019-02-26

Family

ID=60197870

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710501792.4A Active CN107333121B (en) 2017-06-27 2017-06-27 The immersion solid rendering optical projection system and its method of moving view point on curve screens

Country Status (1)

Country Link
CN (1) CN107333121B (en)

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102018130770A1 (en) * 2017-12-13 2019-06-13 Apple Inc. Stereoscopic rendering of virtual 3D objects
CN108133189B (en) * 2017-12-22 2020-05-01 苏州大学 Hospital waiting information display method
CN110134222A (en) * 2018-02-02 2019-08-16 上海集鹰科技有限公司 A kind of VR shows positioning sighting system and its positioning method of sight
CN110392251B (en) * 2018-04-18 2021-07-16 广景视睿科技(深圳)有限公司 Dynamic projection method and system based on virtual reality
CN109714588A (en) * 2019-02-16 2019-05-03 深圳市未来感知科技有限公司 Multi-viewpoint stereo image positions output method, device, equipment and storage medium
CN109901713B (en) * 2019-02-25 2020-07-17 山东大学 Multi-person cooperative assembly system and method
CN110008835B (en) * 2019-03-05 2021-07-09 成都旷视金智科技有限公司 Sight line prediction method, device, system and readable storage medium
CN110133958A (en) * 2019-05-21 2019-08-16 广州悦享环球文化科技有限公司 A kind of tracking system and method for three-dimensional film
CN110332932A (en) * 2019-06-05 2019-10-15 南昌大学 A kind of interior unmanned plane positioning system
CN110275599A (en) * 2019-06-20 2019-09-24 维沃移动通信有限公司 A kind of information display method and terminal device
EP3789816A1 (en) * 2019-09-05 2021-03-10 Vivior AG Device and method for mapping of visual scene onto projection surface
CN111275803B (en) * 2020-02-25 2023-06-02 北京百度网讯科技有限公司 3D model rendering method, device, equipment and storage medium
CN114415826A (en) * 2020-05-15 2022-04-29 华为技术有限公司 Data processing method and equipment thereof
CN113038116B (en) * 2021-03-09 2022-06-28 中国人民解放军海军航空大学航空作战勤务学院 Simulation training visual system for oil adding and receiving in air
CN113347402A (en) * 2021-06-28 2021-09-03 筑友建筑装饰装修工程有限公司 Improved method, device and storage medium for rendering immersive content based on Unity
CN115202485B (en) * 2022-09-15 2023-01-06 深圳飞蝶虚拟现实科技有限公司 XR (X-ray fluorescence) technology-based gesture synchronous interactive exhibition hall display system
WO2024065332A1 (en) * 2022-09-28 2024-04-04 华为技术有限公司 Display module, optical display system, terminal device and image display method
CN115934020B (en) * 2023-01-05 2023-05-30 南方科技大学 Naked eye 3D display method and terminal based on arc screen
CN117689791B (en) * 2024-02-02 2024-05-17 山东再起数据科技有限公司 Three-dimensional visual multi-scene rendering application integration method

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050275915A1 (en) * 2004-06-01 2005-12-15 Vesely Michael A Multi-plane horizontal perspective display
CN103402106B (en) * 2013-07-25 2016-01-06 青岛海信电器股份有限公司 three-dimensional image display method and device
CN105704468B (en) * 2015-08-31 2017-07-18 深圳超多维光电子有限公司 Stereo display method, device and electronic equipment for virtual and reality scene
CN105704475B (en) * 2016-01-14 2017-11-10 深圳前海达闼云端智能科技有限公司 The 3 D stereo display processing method and device of a kind of curved surface two-dimensional screen
CN106251403B (en) * 2016-06-12 2018-02-16 深圳超多维光电子有限公司 A kind of methods, devices and systems of virtual three-dimensional Scene realization

Also Published As

Publication number Publication date
CN107333121A (en) 2017-11-07

Similar Documents

Publication Publication Date Title
CN107333121B (en) The immersion solid rendering optical projection system and its method of moving view point on curve screens
CN106131530B (en) A kind of bore hole 3D virtual reality display system and its methods of exhibiting
CN106251403B (en) A kind of methods, devices and systems of virtual three-dimensional Scene realization
CN104699247B (en) A kind of virtual reality interactive system and method based on machine vision
CN101072366B (en) Free stereo display system based on light field and binocular vision technology
CN104811681B (en) Display and method for adjusting 3D rendering
JP2022168029A (en) System and method for augmented reality
CN101587386B (en) Method, device and system for processing cursor
CN106131536A (en) A kind of bore hole 3D augmented reality interactive exhibition system and methods of exhibiting thereof
US20080278569A1 (en) Automatic Conversion from Monoscopic Video to Stereoscopic Video
CN106444023A (en) Super-large field angle binocular stereoscopic display transmission type augmented reality system
CN104869389B (en) Off-axis formula virtual video camera parameter determination method and system
WO2010119852A1 (en) Arbitrary viewpoint image synthesizing device
CN102156810A (en) Augmented reality real-time virtual fitting system and method thereof
CN107105333A (en) A kind of VR net casts exchange method and device based on Eye Tracking Technique
JP6384940B2 (en) 3D image display method and head mounted device
CN107005687B (en) Unmanned plane during flying experiential method, device, system and unmanned plane
JP2013513833A (en) System and method for providing stereoscopic images
CN107240147A (en) Image rendering method and system
CN102929091A (en) Method for manufacturing digital spherical curtain three-dimensional film
CN206350095U (en) A kind of three-dimensional filming system dynamically tracked based on human body
CN107545537A (en) A kind of method from dense point cloud generation 3D panoramic pictures
CN106507090A (en) A kind of principal and subordinate's remote viewing system
WO2022127747A1 (en) Method and system for real social using virtual scene
CN105721855B (en) A kind of three-dimensional data method for drafting and its application, three-dimensional image display method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant