CN107248193A - Method, system and device for switching between a two-dimensional plane and a virtual reality scene - Google Patents

Method, system and device for switching between a two-dimensional plane and a virtual reality scene

Info

Publication number
CN107248193A
Authority
CN
China
Prior art keywords
virtual reality
sphere
sphere model
model
coordinate system
Prior art date
Legal status
Pending
Application number
CN201710364281.2A
Other languages
Chinese (zh)
Inventor
张涛
马进
Current Assignee
BEIJING HONGMA MEDIA CULTURE DEVELOPMENT CO LTD
Original Assignee
BEIJING HONGMA MEDIA CULTURE DEVELOPMENT CO LTD
Priority date
Filing date
Publication date
Application filed by BEIJING HONGMA MEDIA CULTURE DEVELOPMENT CO LTD
Priority to CN201710364281.2A
Publication of CN107248193A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • G06T3/08
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265Mixing

Abstract

The invention provides a method, system and device for switching between a two-dimensional plane and a virtual reality scene. The method includes: creating a sphere model and setting up a vision camera inside the sphere model; attaching a sphere texture to the sphere model, and attaching a UV expanded view to the sphere texture; establishing a mapping relation between the image coordinate system of the UV expanded view and the world coordinate system of the virtual reality three-dimensional space, and, based on the mapping relation, moving the vision camera from the UV expanded view coordinate system into the virtual reality three-dimensional space to display the picture on the sphere model. The invention realizes switching between a two-dimensional plane and a virtual reality scene and allows the user to experience the visual effect of different positions.

Description

Method, system and device for switching between a two-dimensional plane and a virtual reality scene
Technical field
The present invention relates to the technical field of scene switching, and in particular to a method, system and device for switching between a two-dimensional plane and a virtual reality scene.
Background art
Most business systems currently on the market are based on two-dimensional plane operation, such as the online seat-selection system shown in Fig. 2. The interaction is a purely two-dimensional experience: when operating the system, the user cannot experience the visual effect and feel of actually watching from the seat, which causes confusion when selecting a seat.
VR (Virtual Reality) was proposed by Jaron Lanier, founder of the US company VPL, in the early 1980s. Its specific meaning is: a technology that comprehensively uses computer graphics systems and various display and control interface devices to provide an interactive, immersive experience in a three-dimensional environment generated on a computer. The computer-generated, interactive three-dimensional environment is called the virtual environment (Virtual Environment, VE for short). The carrier on which virtual reality technology is realized is the virtual reality simulation platform (Virtual Reality Platform, VRP for short).
If VR technology is introduced into a two-dimensionally operated business system, it can bring a three-dimensional sense of experience to that system, allowing the user to experience the 3D visual effect of different positions during online operation.
However, the inventors found in the course of research that VR technology cannot simply be dropped into a two-dimensionally operated business system. Introducing VR technology into such a system requires solving the following difficulties:
1. Online display of the VR scene;
2. Mapping from the two-dimensional plane to the three-dimensional space.
The operation part of a conventional two-dimensionally operated business system is based on seat selection on a two-dimensional plane. How to map the two-dimensional plane coordinates to vertex coordinates in three-dimensional space is a difficulty, and is the technical problem to be solved in this field.
Summary of the invention
The main object of the present invention is to provide a method, system and device for switching between a two-dimensional plane and a virtual reality scene, to solve the technical problem of how to map two-dimensional plane coordinates to vertex coordinates in three-dimensional space, so that, before operating the business system, the user can be placed in the scene and experience the visual effect of different positions.
One aspect of the present invention provides a method for switching between a two-dimensional plane and a virtual reality scene, including:
creating a sphere model, and setting up a vision camera inside the sphere model;
attaching a sphere texture to the sphere model, and attaching a UV expanded view to the sphere texture;
establishing a mapping relation between the image coordinate system of the UV expanded view and the world coordinate system of the virtual reality three-dimensional space, and, based on the mapping relation, moving the vision camera from the UV expanded view coordinate system into the virtual reality three-dimensional space to display the picture on the sphere model.
Further, before creating the sphere model and setting up the vision camera inside the sphere model, the method also includes:
creating and initializing the virtual reality three-dimensional space to obtain the world coordinate system of the initialized virtual reality three-dimensional space.
Further, creating the sphere model and setting up the vision camera inside the sphere model includes:
creating the sphere model in server memory to obtain a camera coordinate system, and setting a vision camera at the center of the sphere model, the vision camera being the first-person view of the virtual reality scene.
Further, attaching the UV expanded view to the sphere texture includes:
stitching the captured UV expanded views into a panorama texture;
loading the panorama texture onto the sphere texture, and taking the vision camera set inside the sphere model as the vision origin.
Further, establishing the mapping relation between the image coordinate system of the UV expanded view and the world coordinate system of the virtual reality three-dimensional space, and, based on the mapping relation, moving the vision camera from the UV expanded view coordinate system into the virtual reality three-dimensional space to display the picture on the sphere model, includes:
adding an operational control behavior in the image coordinate system of the UV expanded view;
after receiving the control behavior request, interacting with the control core of the sphere model, inversely calculating the world coordinates of the current position through the mapping relation, and changing the view matrix value of the world coordinate system of the virtual reality three-dimensional space of the sphere model to change the viewing angle range, thereby adjusting the displayed range of the texture inside the sphere model;
based on the mapping relation, moving the vision camera from the UV expanded view coordinate system into the virtual reality three-dimensional space, so as to realize the switching of the viewing angle and display the picture on the sphere model.
Another aspect of the present invention provides a system for switching between a two-dimensional plane and a virtual reality scene, including:
a first creation module, configured to create a sphere model and set up a vision camera inside the sphere model;
an attaching module, configured to attach a sphere texture to the sphere model and attach a UV expanded view to the sphere texture;
an establishing module, configured to establish the mapping relation between the image coordinate system of the UV expanded view and the world coordinate system of the virtual reality three-dimensional space, and, based on the mapping relation, move the vision camera from the UV expanded view coordinate system into the virtual reality three-dimensional space to display the picture on the sphere model.
Further, the system also includes:
a second creation module, configured to create and initialize the virtual reality three-dimensional space to obtain the world coordinate system of the initialized virtual reality three-dimensional space.
Further, the first creation module includes:
a first creating unit, configured to create the sphere model in server memory to obtain a camera coordinate system, and set a vision camera at the center of the sphere model, the vision camera being the first-person view of the virtual reality scene.
Further, the attaching module includes:
a stitching unit, configured to stitch the captured UV expanded views into a panorama texture;
a loading unit, configured to load the panorama texture onto the sphere texture, and take the vision camera set inside the sphere model as the vision origin.
Further, the establishing module includes:
an adding unit, configured to add an operational control behavior in the image coordinate system of the UV expanded view;
an interaction unit, configured to, after the control behavior request is received, interact with the control core of the sphere model, inversely calculate the world coordinates of the current position through the mapping relation, and change the view matrix value of the world coordinate system of the virtual reality three-dimensional space of the sphere model to change the viewing angle range, thereby adjusting the displayed range of the texture inside the sphere model;
a display unit, configured to move the vision camera from the UV expanded view coordinate system into the virtual reality three-dimensional space based on the mapping relation, so as to realize the switching of the viewing angle and display the picture on the sphere model.
Another aspect of the present invention further provides a device for switching between a two-dimensional plane and a virtual reality scene, including the system described in any one of the foregoing.
The invention provides a method, system and device for switching between a two-dimensional plane and a virtual reality scene. By creating a sphere model and setting up a vision camera inside the sphere model, attaching a sphere texture to the sphere model and attaching a UV expanded view to the sphere texture, establishing the mapping relation between the image coordinate system of the UV expanded view and the world coordinate system of the virtual reality three-dimensional space, and, based on the mapping relation, moving the vision camera from the UV expanded view coordinate system into the virtual reality three-dimensional space to display the picture on the sphere model, the technical solution solves the technical problem of mapping two-dimensional plane coordinates to vertex coordinates in three-dimensional space, and allows the user to be placed in the scene before operating the business system and to experience the visual effect of different positions.
Brief description of the drawings
Fig. 1 is a flow chart of embodiment one of a method for switching between a two-dimensional plane and a virtual reality scene according to the present invention;
Fig. 2 is a schematic diagram of an online seat-selection system according to the present invention;
Fig. 3 is the first schematic diagram of an application example according to the present invention;
Fig. 4 is the second schematic diagram of an application example according to the present invention;
Fig. 5 is the third schematic diagram of an application example according to the present invention;
Fig. 6 is the fourth schematic diagram of an application example according to the present invention;
Fig. 7 is the fifth schematic diagram of an application example according to the present invention;
Fig. 8 is the sixth schematic diagram of an application example according to the present invention;
Fig. 9 is a structural block diagram of embodiment two of a system for switching between a two-dimensional plane and a virtual reality scene according to the present invention;
Fig. 10 is a structural block diagram of embodiment three of a device for switching between a two-dimensional plane and a virtual reality scene according to the present invention.
Detailed description of the embodiments
In order that those skilled in the art may better understand the solution of the present invention, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative work shall fall within the protection scope of the present invention.
It should be noted that the terms "first", "second", etc. in the description, claims and accompanying drawings are used to distinguish similar objects and are not necessarily used to describe a specific order or sequence. It should be understood that data so used may be interchanged where appropriate, so that the embodiments of the invention described herein can be implemented in an order other than those illustrated or described herein. In addition, the terms "comprising" and "having" and any variations thereof are intended to cover non-exclusive inclusion; for example, a process, method, system, product or device that contains a series of steps or units is not necessarily limited to the steps or units explicitly listed, but may include other steps or units not explicitly listed or inherent to such process, method, product or device.
Embodiment one
Referring to Fig. 1, Fig. 1 shows a flow chart of an embodiment of the method for switching between a two-dimensional plane and a virtual reality scene provided by the present invention, which includes steps S110 to S130.
In step S110, a sphere model is created, and a vision camera is set up inside the sphere model.
In step S120, a sphere texture is attached to the sphere model, and a UV expanded view is attached to the sphere texture.
In step S130, the mapping relation between the image coordinate system of the UV expanded view and the world coordinate system of the virtual reality three-dimensional space is established, and, based on the mapping relation, the vision camera is moved from the UV expanded view coordinate system into the virtual reality three-dimensional space to display the picture on the sphere model.
Preferably, as shown in Fig. 3, a sphere buffer is set up in server memory, the UV expanded view of the venue is used as the material texture and attached to the sphere texture of the sphere buffer, and the sphere texture is then attached to the sphere model. As indicated by the red area in the figure, when the picture is viewed on a mobile phone or computer screen, rotating the phone or dragging the picture rotates the sphere in the opposite direction, so that the picture on the sphere can be seen. The position at the center of the sphere model is called the vision camera.
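A minimal sketch of this step, assuming the Three.js framework that the later preferred embodiment uses ('panorama.jpg' is a placeholder for the stitched UV expanded view of the venue, not a name from the original):

```javascript
// Sketch of the sphere buffer, panorama material and vision camera described above.
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1100);
camera.position.set(0, 0, 0); // vision camera at the center of the sphere model

const geometry = new THREE.SphereBufferGeometry(500, 60, 40); // sphere buffer
geometry.scale(-1, 1, 1); // flip the faces so the texture is visible from inside the sphere

new THREE.TextureLoader().load('panorama.jpg', (imageTexture) => {
  const material = new THREE.MeshBasicMaterial({ map: imageTexture }); // sphere texture
  scene.add(new THREE.Mesh(geometry, material)); // sphere model carrying the venue panorama
});
```

Dragging the picture (or rotating the phone) then amounts to rotating the view inside this sphere, which is what makes the picture on the sphere visible from different directions.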
Further, before creating the sphere model and setting up the vision camera inside the sphere model, the method also includes:
creating and initializing the virtual reality three-dimensional space to obtain the world coordinate system of the initialized virtual reality three-dimensional space.
Further, creating the sphere model and setting up the vision camera inside the sphere model includes:
creating the sphere model in server memory to obtain a camera coordinate system, and setting a vision camera at the center of the sphere model, the vision camera being the first-person view of the virtual reality scene.
Further, attaching the UV expanded view to the sphere texture includes:
stitching the captured UV expanded views into a panorama texture;
loading the panorama texture onto the sphere texture, and taking the vision camera set inside the sphere model as the vision origin.
Further, establishing the mapping relation between the image coordinate system of the UV expanded view and the world coordinate system of the virtual reality three-dimensional space, and, based on the mapping relation, moving the vision camera from the UV expanded view coordinate system into the virtual reality three-dimensional space to display the picture on the sphere model, includes:
adding an operational control behavior in the image coordinate system of the UV expanded view;
after receiving the control behavior request, interacting with the control core of the sphere model, inversely calculating the world coordinates of the current position through the mapping relation, and changing the view matrix value of the world coordinate system of the virtual reality three-dimensional space of the sphere model to change the viewing angle range, thereby adjusting the displayed range of the texture inside the sphere model;
based on the mapping relation, moving the vision camera from the UV expanded view coordinate system into the virtual reality three-dimensional space, so as to realize the switching of the viewing angle and display the picture on the sphere model.
Taking an online seat-selection application example, as shown in Fig. 3 and Fig. 4, the three-dimensional space in the sphere model is rendered from the panorama picture of the theatre; it is not a real three-dimensional space. The coordinate of a seat position in the theatre is therefore a point (u, v) in the two-dimensional space (the image coordinate system). When the user selects this seat (u, v), the system obtains the three-dimensional space (world coordinate system) coordinate (x, y, z) of the current position, and then moves the vision camera to this position (x, y, z), thereby forming a new visual effect.
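As an illustration of this (u, v) to (x, y, z) step (a sketch under the same Three.js assumption as above; onSeatSelected and the coordinate normalization are helpers introduced here, not names from the patent):

```javascript
// Sketch: map a selected seat's (u, v) image coordinates to world coordinates
// and move the vision camera there.
function onSeatSelected(u, v, camera) {
  // Convert the (u, v) position on the UV expanded view to normalized device coordinates.
  const ndcX = u * 2 - 1;
  const ndcY = -(v * 2 - 1);
  // Inverse mapping: unproject the 2D point into the world coordinate system (x, y, z).
  const worldPos = new THREE.Vector3(ndcX, ndcY, -1).unproject(camera);
  // Move the vision camera to the computed position to form the new visual effect.
  camera.position.copy(worldPos);
}
```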
In another online seat-selection application embodiment:
(1) The scene is created and initialized to obtain the world coordinates of the initialized scene;
(2) The sphere model is set up (establishing the camera coordinate system):
(a) the sphere model is created;
(b) a camera (first-person view) is set up at the center of the sphere model;
(c) the panorama texture stitched from the captured images is loaded, with the sphere center as the vision origin (the origin of the camera coordinate system);
(d) through the interaction of input devices (such as flat-panel touch and mouse) and sensing devices such as gyroscopes with the control core, the view matrix value of the sphere model is changed to change the viewing angle range, thereby adjusting the displayed range of the texture inside the sphere (this can be a flat 3D mode for PC, or a VR mode); a minimal sketch of this interaction is given after this list;
(3) The sphere texture is expanded into a plane expanded view, and plane coordinates are set on it;
(4) When the user selects a seat, the world coordinates of the current position are inversely calculated through the mapping relation from the position the user clicks on the plane expanded view;
(5) The system moves the user's viewpoint to the coordinates of the current position, thereby realizing the switching of the viewing angle and thus the switching and selection of different seats.
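The interaction of step (2)(d) can be realized in different ways; the preferred embodiment below uses THREE.VRControls. Purely as an illustrative sketch (renderer and camera are assumed to exist already, and the 0.1 drag sensitivity is arbitrary), a pointer-drag handler that changes the viewing direction, and hence the displayed range of the inner texture, could look like this:

```javascript
// Illustrative drag-to-look handler for step (2)(d); not the patent's exact code.
let lon = 0, lat = 0, dragging = false, lastX = 0, lastY = 0;

renderer.domElement.addEventListener('pointerdown', (e) => {
  dragging = true; lastX = e.clientX; lastY = e.clientY;
});
renderer.domElement.addEventListener('pointerup', () => { dragging = false; });
renderer.domElement.addEventListener('pointermove', (e) => {
  if (!dragging) return;
  lon -= (e.clientX - lastX) * 0.1;                                    // horizontal drag: rotate around the vertical axis
  lat = Math.max(-85, Math.min(85, lat + (e.clientY - lastY) * 0.1));  // vertical drag: tilt the view
  lastX = e.clientX; lastY = e.clientY;

  const phi = (90 - lat) * Math.PI / 180;
  const theta = lon * Math.PI / 180;
  // Changing the camera's viewing direction changes its view matrix,
  // i.e. the angular range of the texture shown inside the sphere.
  camera.lookAt(new THREE.Vector3(
    Math.sin(phi) * Math.cos(theta),
    Math.cos(phi),
    Math.sin(phi) * Math.sin(theta)
  ));
});
```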
The three coordinate systems referred to in embodiment one of the present invention (image coordinates, camera coordinates and world coordinates) are explained as follows:
(a) Image coordinate system
The image coordinate system, as shown in Fig. 5, is the rectangular coordinate system u-v established in units of pixels with the top-left corner of the image as the origin. The abscissa u and the ordinate v of a pixel are, respectively, the column number and the row number of that pixel in the image array.
(b) Camera coordinate system
The geometric relation of the camera coordinates can be represented by Fig. 6. Point O is the optical center of the camera (the projection center); the Xc and Yc axes are parallel to the x and y axes of the imaging plane coordinate system; the Zc axis is the optical axis of the camera and is perpendicular to the image plane. The intersection of the optical axis and the image plane is the principal point O1 of the image, and the rectangular coordinate system formed by point O and the Xc, Yc, Zc axes is called the camera coordinate system. OO1 is the focal length of the camera.
(c) World coordinate system
The world coordinate system is the absolute coordinate system of the virtual world created by the system: its X axis is the horizontal axis, its Y axis is the vertical axis, its Z axis is perpendicular to the XY plane, and the origin is the intersection (0, 0, 0) of the X, Y and Z axes at the lower-left corner of the graphics area. The world coordinates are also the three-dimensional space coordinates described herein.
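As a summary of the three systems just described (the notation anticipates the subsections below, where R and T are the rotation and translation between world and camera coordinates, f is the focal length, and dx, dy, (u0, v0) are the pixel-size and principal-point parameters), the chain of transformations is:

$$(X_w, Y_w, Z_w)\ \xrightarrow{\;R,\,T\;}\ (X_c, Y_c, Z_c)\ \xrightarrow{\;f\;}\ (x, y)\ \xrightarrow{\;dx,\,dy,\,(u_0, v_0)\;}\ (u, v)$$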
Preferably, the mapping algorithm from the sphere model panorama to the three-dimensional space includes, but is not limited to, the following:
(1) Mapping algorithm from image coordinates to camera coordinates
As shown in Fig. 7, the physical plane coordinate system of the image plane is exactly the image coordinate system, and the projection center together with i, j, k constitutes the camera coordinate system.
A. In practice the principal point is not necessarily at the center of the imager (the image plane); in order to model a possible offset of the optical axis, two new parameters are introduced: cx and cy.
B. In practice, because a single pixel on a low-cost imager is rectangular rather than square, two different focal length parameters are introduced: fx and fy (the focal lengths here are in units of pixels). Then, assuming a point Q in the camera coordinate system of the object with coordinates (X, Y, Z), it is projected with these offsets to a point q(x_screen, y_screen), where "screen" is a subscript, and the coordinate relation is as follows.
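The relation referred to here is, under the standard pinhole model with the parameters just introduced (a reconstruction, since the original formula image is not part of this text):

$$x_{screen} = f_x \frac{X}{Z} + c_x, \qquad y_{screen} = f_y \frac{Y}{Z} + c_y$$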
(2) Mapping algorithm from camera coordinates to world coordinates
A. In general, a rotation of any dimension can be expressed as the product of a coordinate vector and a square matrix of the appropriate dimension. Ultimately, a rotation is equivalent to re-expressing the position of a point in another, different coordinate system. Rotating the coordinate system by an angle θ is equivalent to rotating the target point around the coordinate origin by the same angle θ in the opposite direction. The formula below shows how matrix multiplication describes a two-dimensional rotation. In three-dimensional space, a rotation can be decomposed into two-dimensional rotations around the respective coordinate axes (Fig. 8, for example, shows a rotation around the z axis), in which the measure along the rotation axis is preserved (this is why a rotation matrix is an orthogonal matrix). If rotations by angles ψ, φ and θ are performed successively around the x, y and z axes, the total rotation matrix R is the product of the three matrices Rx(ψ), Ry(φ) and Rz(θ), where:
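The standard forms these formulas correspond to (a reconstruction from the surrounding text) are the two-dimensional rotation

$$\begin{pmatrix} x' \\ y' \end{pmatrix} = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix}$$

and the per-axis rotation matrices

$$R_x(\psi) = \begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos\psi & -\sin\psi \\ 0 & \sin\psi & \cos\psi \end{pmatrix},\quad R_y(\varphi) = \begin{pmatrix} \cos\varphi & 0 & \sin\varphi \\ 0 & 1 & 0 \\ -\sin\varphi & 0 & \cos\varphi \end{pmatrix},\quad R_z(\theta) = \begin{pmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{pmatrix}$$

with the total rotation $R = R_z(\theta)\,R_y(\varphi)\,R_x(\psi)$.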
B. The translation vector T is used to represent how the origin of one coordinate system is moved to the origin of another coordinate system; in other words, the translation vector is the offset between the first coordinate origin and the second coordinate origin. Therefore, to move from the coordinate system centered on the camera to another coordinate system whose origin is the target center, the corresponding translation vector is T = target point - camera origin. A point with coordinates Po in the world coordinate system thus has coordinates Pc in the camera coordinate system given by Pc = R(Po - T).
The mapping from image coordinates to world coordinates is described by the following relations:
A. The relation between the pixel coordinate system and the physical image coordinate system (see Fig. 5). dx, dy, u0 and v0 are parameters: dx and dy represent the physical size of a pixel on the imaging chip and connect the pixel coordinate system with the physical-size coordinate system, and (u0, v0) is the center of the image plane; together with the focal length these are the intrinsic and extrinsic parameters that we ultimately want to obtain. Using basic linear algebra, the relation can be written as a scalar equation, in matrix form, and in an equivalent inverse matrix form.
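The corresponding standard relations (a reconstruction with the parameters defined above) are

$$u = \frac{x}{dx} + u_0, \qquad v = \frac{y}{dy} + v_0,$$

in homogeneous matrix form

$$\begin{pmatrix} u \\ v \\ 1 \end{pmatrix} = \begin{pmatrix} 1/dx & 0 & u_0 \\ 0 & 1/dy & v_0 \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} x \\ y \\ 1 \end{pmatrix},$$

and, equivalently, in the inverse matrix form

$$\begin{pmatrix} x \\ y \\ 1 \end{pmatrix} = \begin{pmatrix} dx & 0 & -u_0\,dx \\ 0 & dy & -v_0\,dy \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} u \\ v \\ 1 \end{pmatrix}.$$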
B. The relation between the camera coordinate system and the world coordinate system
The relation between these two coordinate systems is obtained with the rotation matrix R and the translation matrix T, as follows.
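In homogeneous form, and consistent with Pc = R(Po - T) above, this relation can be written as (a reconstruction of the standard formula):

$$\begin{pmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{pmatrix} = \begin{pmatrix} R & -RT \\ 0^{\mathsf T} & 1 \end{pmatrix} \begin{pmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{pmatrix}$$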
C. The relation between the camera coordinate system and the physical image coordinate system
From the camera model the perspective projection equation is obtained; expressed in matrix form and combined with the relations above, it yields the complete mapping from world coordinates to pixel coordinates.
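Under the pinhole model these are (a reconstruction of the standard formulas) the perspective projection

$$x = f\,\frac{X_c}{Z_c}, \qquad y = f\,\frac{Y_c}{Z_c}, \qquad\text{i.e.}\qquad Z_c \begin{pmatrix} x \\ y \\ 1 \end{pmatrix} = \begin{pmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{pmatrix} \begin{pmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{pmatrix},$$

and, combining the three relations above, the complete mapping from world coordinates to pixel coordinates

$$Z_c \begin{pmatrix} u \\ v \\ 1 \end{pmatrix} = \begin{pmatrix} f/dx & 0 & u_0 \\ 0 & f/dy & v_0 \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} R & -RT \end{pmatrix} \begin{pmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{pmatrix},$$

where f/dx = fx and f/dy = fy are the pixel-unit focal lengths introduced in (1). Inverting this chain is the "inverse calculation" by which the system recovers the world coordinates corresponding to a clicked (u, v) point.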
3. The THREE framework based on HTML5 has already encapsulated the complicated mathematics described above; in actual development it is only necessary to use the interfaces provided by the THREE framework.
A preferred embodiment: the sphere-model online seat selection is realized based on HTML5 and the JavaScript language, using the WebGL built-in library and the Three.js framework. The rendering of the panorama sphere is not elaborated in excessive detail; the core code focuses on the implementation process of the VR seat-selection part.
(1) Create the scene THREE.Scene, which initializes the world coordinate system of the scene;
(2) Add a perspective camera THREE.PerspectiveCamera to the scene, which initializes the camera coordinate system;
(3) Create the picture texture loader THREE.TextureLoader;
(4) Load the theatre panorama picture of the specified path with loader.load(imgURL); the picture texture imageTexture is returned after loading is complete;
(5) Create the sphere buffer THREE.SphereBufferGeometry;
(6) Create the material and attach the venue picture texture to it: THREE.MeshBasicMaterial({ map: imageTexture });
(7) Create the mesh and attach the material to the sphere buffer: THREE.Mesh(geometry, material);
(8) Create the renderer THREE.WebGLRenderer; the image rendered by the renderer is drawn onto the canvas;
(9) Create the gravity and touch control component THREE.VRControls(camera); the touch and gravity control component realizes dragging and, according to the gravity feedback data, changes the viewing angle of the scene camera;
(10) In the render process, update the data of the gravity and touch component, and at the same time update the scene and camera data;
(11) Loop the animation with requestAnimationFrame;
(12) Create the UV-expanded plane seat-selection overlay layer; the initialization of the plane coordinates has been completed before this, and the plane coordinates of each seat can be obtained from the UV overlay layer;
(13) Bind mouseup or touchend events to the overlay layer;
(14) When the event is triggered, calculate the world coordinates corresponding to the UV coordinates, expressed as a vector with THREE.Vector3(mouse.x, mouse.y, -1).unproject(camera); the resulting coordinate vector is exactly the world coordinates calculated from the plane coordinates and the camera coordinate parameters;
(15) Move the camera to the target position with camera.position = vector; the change of viewing angle naturally produced as the camera moves forms the new visual effect;
(16) At the new viewing position, pop up the business action link (purchase confirmation); the user clicks the link to jump to the corresponding page and complete the related business operation.
The parts of the above process that involve animation processing are all encapsulated as functions and executed inside render.
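Purely as an illustration of how steps (1) to (16) fit together (a sketch, not the patent's actual source code; imgURL and the seat-overlay element are placeholders, and THREE.VRControls is assumed to be available from the Three.js examples):

```javascript
// Sketch of steps (1)-(16).
const scene = new THREE.Scene();                                        // (1) world coordinate system
const camera = new THREE.PerspectiveCamera(75, innerWidth / innerHeight, 1, 1100); // (2) camera coordinate system
const loader = new THREE.TextureLoader();                               // (3)

loader.load(imgURL, (imageTexture) => {                                 // (4) theatre panorama
  const geometry = new THREE.SphereBufferGeometry(500, 60, 40);         // (5) sphere buffer
  geometry.scale(-1, 1, 1);
  const material = new THREE.MeshBasicMaterial({ map: imageTexture });  // (6) material with venue texture
  scene.add(new THREE.Mesh(geometry, material));                        // (7) mesh on the sphere buffer
});

const renderer = new THREE.WebGLRenderer();                             // (8) draws onto a canvas
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);

const controls = new THREE.VRControls(camera);                          // (9) touch / gravity control

function render() {                                                     // (10) + (11) animation loop
  controls.update();
  renderer.render(scene, camera);
  requestAnimationFrame(render);
}
render();

const overlay = document.getElementById('seat-overlay');                // (12) UV-expanded seat layer
overlay.addEventListener('mouseup', (event) => {                        // (13)
  const mouse = {
    x: (event.clientX / innerWidth) * 2 - 1,
    y: -(event.clientY / innerHeight) * 2 + 1,
  };
  const vector = new THREE.Vector3(mouse.x, mouse.y, -1).unproject(camera); // (14) UV -> world coordinates
  camera.position.copy(vector);                                         // (15) switch the viewpoint
  // (16) at the new viewpoint, show the purchase-confirmation link for the selected seat
});
```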
Embodiment one of the present invention provides a method for switching between a two-dimensional plane and a virtual reality scene. By creating a sphere model and setting up a vision camera inside the sphere model, attaching a sphere texture to the sphere model and attaching a UV expanded view to the sphere texture, establishing the mapping relation between the image coordinate system of the UV expanded view and the world coordinate system of the virtual reality three-dimensional space, and, based on the mapping relation, moving the vision camera from the UV expanded view coordinate system into the virtual reality three-dimensional space to display the picture on the sphere model, the technical solution solves the technical problem of mapping two-dimensional plane coordinates to vertex coordinates in three-dimensional space, and allows the user to be placed in the scene before operating the business system and to experience the visual effect of different positions.
Embodiment two
Referring to Fig. 9, Fig. 9 shows the structural block diagram of an embodiment of a system 200 for switching between a two-dimensional plane and a virtual reality scene provided by the present invention, including:
a first creation module 21, configured to create a sphere model and set up a vision camera inside the sphere model;
an attaching module 22, configured to attach a sphere texture to the sphere model and attach a UV expanded view to the sphere texture;
an establishing module 23, configured to establish the mapping relation between the image coordinate system of the UV expanded view and the world coordinate system of the virtual reality three-dimensional space, and, based on the mapping relation, move the vision camera from the UV expanded view coordinate system into the virtual reality three-dimensional space to display the picture on the sphere model.
Further, the system also includes:
a second creation module, configured to create and initialize the virtual reality three-dimensional space to obtain the world coordinate system of the initialized virtual reality three-dimensional space.
Further, the first creation module includes:
a first creating unit, configured to create the sphere model in server memory to obtain a camera coordinate system, and set a vision camera at the center of the sphere model, the vision camera being the first-person view of the virtual reality scene.
Further, the attaching module includes:
a stitching unit, configured to stitch the captured UV expanded views into a panorama texture;
a loading unit, configured to load the panorama texture onto the sphere texture, and take the vision camera set inside the sphere model as the vision origin.
Further, the establishing module includes:
an adding unit, configured to add an operational control behavior in the image coordinate system of the UV expanded view;
an interaction unit, configured to, after the control behavior request is received, interact with the control core of the sphere model, inversely calculate the world coordinates of the current position through the mapping relation, and change the view matrix value of the world coordinate system of the virtual reality three-dimensional space of the sphere model to change the viewing angle range, thereby adjusting the displayed range of the texture inside the sphere model;
a display unit, configured to move the vision camera from the UV expanded view coordinate system into the virtual reality three-dimensional space based on the mapping relation, so as to realize the switching of the viewing angle and display the picture on the sphere model.
For the functions realized and the specific implementation steps, reference may be made to the description of method embodiment one.
The processing and functions implemented by the system of this embodiment two essentially correspond to the method embodiments, principles and examples shown in Figs. 1 to 8 above; therefore, for parts of this embodiment not described in detail, reference may be made to the corresponding descriptions in the foregoing embodiments, which will not be repeated here.
Embodiment two of the present invention provides a system for switching between a two-dimensional plane and a virtual reality scene. By creating a sphere model and setting up a vision camera inside the sphere model, attaching a sphere texture to the sphere model and attaching a UV expanded view to the sphere texture, establishing the mapping relation between the image coordinate system of the UV expanded view and the world coordinate system of the virtual reality three-dimensional space, and, based on the mapping relation, moving the vision camera from the UV expanded view coordinate system into the virtual reality three-dimensional space to display the picture on the sphere model, the technical solution solves the technical problem of mapping two-dimensional plane coordinates to vertex coordinates in three-dimensional space, and allows the user to be placed in the scene before operating the business system and to experience the visual effect of different positions.
Embodiment three
Referring to Fig. 10, Fig. 10 shows the structural block diagram of an example of a device 300 for switching between a two-dimensional plane and a virtual reality scene provided by the present invention, which includes the system 200 described in any one of embodiment two.
Embodiment three of the present invention provides a device for switching between a two-dimensional plane and a virtual reality scene. By creating a sphere model and setting up a vision camera inside the sphere model, attaching a sphere texture to the sphere model and attaching a UV expanded view to the sphere texture, establishing the mapping relation between the image coordinate system of the UV expanded view and the world coordinate system of the virtual reality three-dimensional space, and, based on the mapping relation, moving the vision camera from the UV expanded view coordinate system into the virtual reality three-dimensional space to display the picture on the sphere model, the technical solution solves the technical problem of mapping two-dimensional plane coordinates to vertex coordinates in three-dimensional space, and allows the user to be placed in the scene before operating the business system and to experience the visual effect of different positions.
The above embodiments of the present invention are for illustration only and do not represent the superiority or inferiority of the embodiments.
It should be noted that, for the sake of brevity, the foregoing method embodiments are all described as a series of combinations of actions; however, those skilled in the art should know that the present invention is not limited by the described order of actions, because according to the present invention some steps may be performed in another order or simultaneously. Secondly, those skilled in the art should also know that the embodiments described in this specification are preferred embodiments, and the actions and modules involved are not necessarily required by the present invention.
In the above embodiments, the description of each embodiment has its own emphasis; for parts not described in detail in one embodiment, reference may be made to the related descriptions of the other embodiments.
In the several embodiments provided in this application, it should be understood that the disclosed device may be implemented in other ways. For example, the device embodiments described above are only schematic; the division of the units is only a division of logical functions, and there may be other divisions in actual implementation. For example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the mutual coupling, direct coupling or communication connection shown or discussed may be indirect coupling or communication connection through some interfaces, devices or units, and may be electrical or take other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present invention may be integrated into one processing unit, or each unit may exist physically on its own, or two or more units may be integrated into one unit. The integrated unit may be realized in the form of hardware or in the form of a software functional unit.
It may be noted that, according to the needs of implementation, each step/component described in this application may be split into more steps/components, or the partial operations of two or more steps/components may be combined into a new step/component, to achieve the purpose of the present invention.
The above method according to the invention may be realized in hardware or firmware, or implemented as software or computer code storable in a recording medium (such as a CD-ROM, RAM, floppy disk, hard disk or magneto-optical disk), or implemented as computer code originally stored in a remote recording medium or a non-volatile machine-readable medium, downloaded through a network and stored in a local recording medium, so that the method described here can be processed by such software stored on a recording medium using a general-purpose computer, a special-purpose processor, or programmable or dedicated hardware (such as an ASIC or FPGA). It can be understood that a computer, processor, microprocessor controller or programmable hardware includes a storage component (for example RAM, ROM, flash memory, etc.) that can store or receive software or computer code; when the software or computer code is accessed and executed by the computer, processor or hardware, the processing method described herein is realized. In addition, when a general-purpose computer accesses code for realizing the processing shown herein, the execution of the code converts the general-purpose computer into a special-purpose computer for executing the processing shown herein.
The foregoing are only specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any change or replacement that those familiar with the technical field can readily conceive within the technical scope disclosed by the present invention shall be covered within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be based on the protection scope of the claims.

Claims (11)

1. A method for switching between a two-dimensional plane and a virtual reality scene, characterized by comprising:
creating a sphere model, and setting up a vision camera inside the sphere model;
attaching a sphere texture to the sphere model, and attaching a UV expanded view to the sphere texture;
establishing a mapping relation between the image coordinate system of the UV expanded view and the world coordinate system of the virtual reality three-dimensional space, and, based on the mapping relation, moving the vision camera from the UV expanded view coordinate system into the virtual reality three-dimensional space to display the picture on the sphere model.
2. The method according to claim 1, characterized in that before creating the sphere model and setting up the vision camera inside the sphere model, the method further comprises:
creating and initializing the virtual reality three-dimensional space to obtain the world coordinate system of the initialized virtual reality three-dimensional space.
3. The method according to claim 1 or 2, characterized in that creating the sphere model and setting up the vision camera inside the sphere model comprises:
creating the sphere model in server memory to obtain a camera coordinate system, and setting a vision camera at the center of the sphere model, the vision camera being the first-person view of the virtual reality scene.
4. The method according to any one of claims 1-3, characterized in that attaching the UV expanded view to the sphere texture comprises:
stitching the captured UV expanded views into a panorama texture;
loading the panorama texture onto the sphere texture, and taking the vision camera set inside the sphere model as the vision origin.
5. The method according to claim 1, characterized in that establishing the mapping relation between the image coordinate system of the UV expanded view and the world coordinate system of the virtual reality three-dimensional space, and, based on the mapping relation, moving the vision camera from the UV expanded view coordinate system into the virtual reality three-dimensional space to display the picture on the sphere model, comprises:
adding an operational control behavior in the image coordinate system of the UV expanded view;
after receiving the control behavior request, interacting with the control core of the sphere model, inversely calculating the world coordinates of the current position through the mapping relation, and changing the view matrix value of the world coordinate system of the virtual reality three-dimensional space of the sphere model to change the viewing angle range, thereby adjusting the displayed range of the texture inside the sphere model;
based on the mapping relation, moving the vision camera from the UV expanded view coordinate system into the virtual reality three-dimensional space, so as to realize the switching of the viewing angle and display the picture on the sphere model.
6. A system for switching between a two-dimensional plane and a virtual reality scene, characterized by comprising:
a first creation module, configured to create a sphere model and set up a vision camera inside the sphere model;
an attaching module, configured to attach a sphere texture to the sphere model and attach a UV expanded view to the sphere texture;
an establishing module, configured to establish a mapping relation between the image coordinate system of the UV expanded view and the world coordinate system of the virtual reality three-dimensional space, and, based on the mapping relation, move the vision camera from the UV expanded view coordinate system into the virtual reality three-dimensional space to display the picture on the sphere model.
7. The system according to claim 6, characterized by further comprising:
a second creation module, configured to create and initialize the virtual reality three-dimensional space to obtain the world coordinate system of the initialized virtual reality three-dimensional space.
8. The system according to claim 6 or 7, characterized in that the first creation module comprises:
a first creating unit, configured to create the sphere model in server memory to obtain a camera coordinate system, and set a vision camera at the center of the sphere model, the vision camera being the first-person view of the virtual reality scene.
9. The system according to claim 6, characterized in that the attaching module comprises:
a stitching unit, configured to stitch the captured UV expanded views into a panorama texture;
a loading unit, configured to load the panorama texture onto the sphere texture, and take the vision camera set inside the sphere model as the vision origin.
10. The system according to claim 6, characterized in that the establishing module comprises:
an adding unit, configured to add an operational control behavior in the image coordinate system of the UV expanded view;
an interaction unit, configured to, after the control behavior request is received, interact with the control core of the sphere model, inversely calculate the world coordinates of the current position through the mapping relation, and change the view matrix value of the world coordinate system of the virtual reality three-dimensional space of the sphere model to change the viewing angle range, thereby adjusting the displayed range of the texture inside the sphere model;
a display unit, configured to move the vision camera from the UV expanded view coordinate system into the virtual reality three-dimensional space based on the mapping relation, so as to realize the switching of the viewing angle and display the picture on the sphere model.
11. A device for switching between a two-dimensional plane and a virtual reality scene, comprising the system according to any one of claims 6-10.
CN201710364281.2A 2017-05-22 2017-05-22 Method, system and device for switching between a two-dimensional plane and a virtual reality scene Pending CN107248193A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710364281.2A 2017-05-22 2017-05-22 Method, system and device for switching between a two-dimensional plane and a virtual reality scene

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710364281.2A 2017-05-22 2017-05-22 Method, system and device for switching between a two-dimensional plane and a virtual reality scene

Publications (1)

Publication Number Publication Date
CN107248193A true CN107248193A (en) 2017-10-13

Family

ID=60017393

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710364281.2A Method, system and device for switching between a two-dimensional plane and a virtual reality scene 2017-05-22 2017-05-22

Country Status (1)

Country Link
CN (1) CN107248193A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105635551A (en) * 2014-10-29 2016-06-01 浙江大华技术股份有限公司 Method of dome camera for generating panoramic image, and dome camera
CN105787920A (en) * 2014-12-26 2016-07-20 秦永进 Dome screen demarcating method, demarcating system and control device
CN105163158A (en) * 2015-08-05 2015-12-16 北京奇艺世纪科技有限公司 Image processing method and device
CN105913478A (en) * 2015-12-28 2016-08-31 乐视致新电子科技(天津)有限公司 360-degree panorama display method and display module, and mobile terminal
CN105957048A (en) * 2016-01-26 2016-09-21 优势拓展(北京)科技有限公司 3D panorama display method and system of shooting image through fish eye lens
CN106651808A (en) * 2016-12-29 2017-05-10 北京爱奇艺科技有限公司 Fisheye image conversion method and device
CN106658212A (en) * 2017-01-20 2017-05-10 北京红马传媒文化发展有限公司 VR online playing method, system and player based on HTML5

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
师康潇男: "Design and Implementation of 3D Panoramic View Software Applied to a Substation Management System", China Master's Theses Full-text Database, Engineering Science and Technology II *
裴玉: "Design of a Virtual Reality System for Daqing Wetland Tourism", China Master's Theses Full-text Database, Information Science and Technology *

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107833265A (en) * 2017-11-27 2018-03-23 歌尔科技有限公司 A kind of image switching methods of exhibiting and virtual reality device
CN107833265B (en) * 2017-11-27 2021-07-27 歌尔光学科技有限公司 Image switching display method and virtual reality equipment
CN108564660A (en) * 2017-12-28 2018-09-21 灵图互动(武汉)科技有限公司 The exchange method and system of two-dimensional element and three-dimensional element in reality environment
CN108629832A (en) * 2018-04-23 2018-10-09 广东奥园奥买家电子商务有限公司 A kind of 3D spherical coordinates are mapped to the implementation method of two dimensional surface
CN108681987A (en) * 2018-05-10 2018-10-19 广州腾讯科技有限公司 The method and apparatus for generating panorama slice map
CN108961395A (en) * 2018-07-03 2018-12-07 上海亦我信息技术有限公司 A method of three dimensional spatial scene is rebuild based on taking pictures
CN108961395B (en) * 2018-07-03 2019-07-30 上海亦我信息技术有限公司 A method of three dimensional spatial scene is rebuild based on taking pictures
US11200734B2 (en) 2018-07-03 2021-12-14 Shanghai Yiwo Information Technology Co., Ltd. Method for reconstructing three-dimensional space scene based on photographing
CN108921778B (en) * 2018-07-06 2022-12-30 成都品果科技有限公司 Method for generating star effect map
CN108921778A (en) * 2018-07-06 2018-11-30 成都品果科技有限公司 A kind of celestial body effect drawing generating method
CN109062487A (en) * 2018-08-21 2018-12-21 苏州蜗牛数字科技股份有限公司 A kind of method of model material UV copy
CN111192321B (en) * 2019-12-31 2023-09-22 武汉市城建工程有限公司 Target three-dimensional positioning method and device
CN111192321A (en) * 2019-12-31 2020-05-22 武汉市城建工程有限公司 Three-dimensional positioning method and device for target object
CN111246096A (en) * 2020-01-19 2020-06-05 广州启量信息科技有限公司 System and method for generating three-dimensional panoramic roaming model
CN111340960A (en) * 2020-02-21 2020-06-26 当家移动绿色互联网技术集团有限公司 Image modeling method and device, storage medium and electronic equipment
CN111340960B (en) * 2020-02-21 2021-06-04 北京五一视界数字孪生科技股份有限公司 Image modeling method and device, storage medium and electronic equipment
CN111352510A (en) * 2020-03-30 2020-06-30 歌尔股份有限公司 Virtual model creating method, system and device and head-mounted equipment
CN111524182B (en) * 2020-04-29 2023-11-10 杭州电子科技大学 Mathematical modeling method based on visual information analysis
CN111524182A (en) * 2020-04-29 2020-08-11 杭州电子科技大学 Mathematical modeling method based on visual information analysis
CN111738912B (en) * 2020-06-22 2024-03-19 戈兰林艺创科技(深圳)有限公司 Negative film, air mould surface layer and manufacturing method of air mould
CN111738912A (en) * 2020-06-22 2020-10-02 戈兰林艺创科技(深圳)有限公司 Negative plate, air mold surface layer and air mold manufacturing method
CN114138106A (en) * 2020-09-02 2022-03-04 欧特克公司 Transitioning between states in a mixed virtual reality desktop computing environment
CN112241201A (en) * 2020-09-09 2021-01-19 中国电子科技集团公司第三十八研究所 Remote labeling method and system for augmented/mixed reality
CN112241201B (en) * 2020-09-09 2022-10-25 中国电子科技集团公司第三十八研究所 Remote labeling method and system for augmented/mixed reality
CN112269618B (en) * 2020-11-12 2024-01-26 中煤航测遥感集团有限公司 Station two-dimensional scene switching method, device, equipment and storage medium
CN112269618A (en) * 2020-11-12 2021-01-26 中煤航测遥感集团有限公司 Method, device and equipment for switching two-dimensional scene and three-dimensional scene of station and storage medium
CN113611181A (en) * 2021-07-09 2021-11-05 中国舰船研究设计中心 Three-dimensional display method and device for virtual simulation scene
CN114445564B (en) * 2022-04-08 2022-06-17 腾讯科技(深圳)有限公司 Model expansion method, device, storage medium and computer program product
CN114445564A (en) * 2022-04-08 2022-05-06 腾讯科技(深圳)有限公司 Model expansion method, device, storage medium and computer program product
CN116363337A (en) * 2023-04-04 2023-06-30 如你所视(北京)科技有限公司 Model tour method and device and electronic equipment

Similar Documents

Publication Publication Date Title
CN107248193A (en) Method, system and device for switching between a two-dimensional plane and a virtual reality scene
CN107018336B (en) The method and apparatus of method and apparatus and the video processing of image procossing
EP1008112B1 (en) Techniques for creating and modifying 3d models and correlating such models with 2d pictures
CN106780709B (en) A kind of method and device of determining global illumination information
CN108648269A (en) The monomerization approach and system of three-dimensional building object model
US20170186219A1 (en) Method for 360-degree panoramic display, display module and mobile terminal
CN106326334A (en) Display method and device for electronic map and generation method and device for electronic map
CN109074677A (en) Method and apparatus for handling image
CN107563959A (en) Panoramagram generation method and device
CN115187729B (en) Three-dimensional model generation method, device, equipment and storage medium
WO2017113729A1 (en) 360-degree image loading method and loading module, and mobile terminal
CN111161398A (en) Image generation method, device, equipment and storage medium
CN101477702A (en) Built-in real tri-dimension driving method for computer display card
JP6852224B2 (en) Sphere light field rendering method in all viewing angles
CN101521828B (en) Implanted type true three-dimensional rendering method oriented to ESRI three-dimensional GIS module
CN116109803B (en) Information construction method, device, equipment and storage medium
CN101540056A (en) Implanted true-three-dimensional stereo rendering method facing to ERDAS Virtual GIS
CN115830202A (en) Three-dimensional model rendering method and device
CN114419226A (en) Panorama rendering method and device, computer equipment and storage medium
CN101488229A (en) PCI three-dimensional analysis module oriented implantation type ture three-dimensional stereo rendering method
WO2018151612A1 (en) Texture mapping system and method
CN114820980A (en) Three-dimensional reconstruction method and device, electronic equipment and readable storage medium
CN101482978B (en) ENVI/IDL oriented implantation type true three-dimensional stereo rendering method
CN110335335A (en) Uniform density cube for spherical projection renders
KR102056985B1 (en) Method and apparatus for virtual reality interaction

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (Application publication date: 20171013)