CN102054290B - Construction method of panoramic/realistic hybrid reality platform - Google Patents

Construction method of panoramic/realistic hybrid reality platform Download PDF

Info

Publication number
CN102054290B
CN102054290B · CN 200910219617 · CN200910219617A · CN102054290A
Authority
CN
China
Prior art keywords
panorama
data
roaming
user
scene image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN 200910219617
Other languages
Chinese (zh)
Other versions
CN102054290A (en)
Inventor
佟国峰
刘晓龙
邵振洲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
The European Silai (Beijing) Technology Co. Ltd.
Original Assignee
SHENYANG XUNJING TECHNOLOGY Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SHENYANG XUNJING TECHNOLOGY Co Ltd filed Critical SHENYANG XUNJING TECHNOLOGY Co Ltd
Priority to CN 200910219617 priority Critical patent/CN102054290B/en
Publication of CN102054290A publication Critical patent/CN102054290A/en
Application granted granted Critical
Publication of CN102054290B publication Critical patent/CN102054290B/en

Landscapes

  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a method for constructing a panoramic/real-scene hybrid reality platform, comprising the following steps: acquiring the panoramic image data, geographic information data, and other raw data needed to build the hybrid reality platform, and storing them in the data storage unit of a data management backend; then creating the panoramic outer display screen of the hybrid reality space, creating a virtual ground, and adding 3D entities to build a hybrid reality space based on panoramic images; and finally synchronizing the hybrid reality space with the geographic information of the real world. By organically combining real scenes with 3D entities in the virtual space, the invention gives users a strong sense of spatial realism and immersion while roaming the hybrid reality space.

Description

Construction method of panoramic/realistic hybrid reality platform
Technical field
The present invention relates to virtual reality technology, and specifically to a method for constructing a panoramic/real-scene hybrid reality (Hybrid Reality) platform.
Background art
In recent years, virtual reality has been a research hotspot in information technology both in China and abroad. Traditional virtual reality systems reproduce the three-dimensional scenes of the real world through 3D environmental modeling. Virtual Philadelphia, for example, models every building of the city in a virtual space so that users can roam its streets virtually. Virtual communities such as Second Life likewise build their spaces with 3D tools, the difference being that the users themselves can model the community environment. The advantage of building virtual environments with 3D modeling is a strong sense of spatial depth and layering. However, virtual environments built with three-dimensional scene modeling usually differ considerably from the real environment, their fidelity is low, and much of the modeling work must be done manually, which is inefficient. Image-based rendering, a virtual reality technique developed in recent years, reproduces environmental information completely and builds scenes quickly, and has gradually become a highly effective environmental modeling technique.
When constructing a virtual reality scene with image-based rendering, panoramic real-scene images of the actual scene are usually used to build the entire virtual space. Many famous institutions of higher learning, such as Harvard University and the University of Oxford, use this technique on their websites to present their campus environments to the world. Taking Harvard University as an example, panoramic real-scene images are captured at key scenic spots on campus, mapped into spherical volumes, and calibrated to positions on the campus map. When a user clicks one of these calibrated positions with the mouse, the campus scene is observed from a fixed viewpoint. The capture positions of these panoramic real-scene images are usually discrete.
Google adopted this image-based virtual reality technique for the Street View feature of Google Maps. A vehicle-mounted panoramic camera continuously photographs the scenery along the street, and a series of spherical panoramic spaces are built in three-dimensional space along the capture path of the two-dimensional map. The panoramic real-scene image of each geographic position is mapped into its spherical panoramic space; users roam the whole street view by jumping between different spherical panoramic spaces, and can also look around a full 360 degrees within each space.
The existing image-based virtual reality systems described above all build the panoramic space by texturing panoramic real-scene images inside a spherical volume (or another shape, such as a cube or cylinder). Although image-based rendering greatly improves the fidelity and immersion of the virtual scene, its sense of depth and layering is still inferior to that of virtual scenes built with 3D modeling. A virtual space built with image-based rendering alone therefore cannot give a reasonably complete expression of rich scene content.
To overcome the limitations of using either 3D modeling or image-based rendering alone when creating virtual reality spaces, methods for building virtual reality spaces by hybrid modeling have been proposed. Their main idea is to build a three-dimensional model from the captured real-scene images by three-dimensional reconstruction, and to map the image textures onto the reconstructed model. This hybrid modeling approach focuses on mixing at the technique level: image-based rendering is combined on top of 3D modeling. Google and Microsoft in the United States also use similar techniques, texturing reconstructed three-dimensional models for their respective city roaming websites. However, virtual environments built this way suffer from many problems, such as texture misalignment, so the constructed environments are not truly lifelike.
People therefore need a virtual reality platform with higher environmental fidelity, scenes with a stronger sense of depth and layering, and rich content. At present, no method for building a virtual reality platform with these characteristics has been reported.
Summary of the invention
To overcome the low scene fidelity caused by building virtual scenes with image-based rendering alone, and the poor depth, weak layering, and limited content caused by graphics rendering alone, the technical problem solved by the present invention is to provide a method for constructing a panoramic/real-scene hybrid reality platform with higher environmental fidelity, scenes with stronger depth and layering, and rich content.
To achieve these goals, the technical solution adopted by the present invention is as follows:
1) Create the panoramic outer display screen in 3D space
Use a 3D graphics programming interface to create the three-dimensional environment of the hybrid reality space, and draw the scene's outer display screen within it. The collected panoramic real-scene image data are sent from the data management backend to the 3D display platform of the application interface and textured onto the outer display screen, yielding the panoramic outer display screen.
2) Create the virtual ground and add 3D entities
In the hybrid reality space built on the panoramic outer display screen, establish a virtual ground for placing 3D entities; that is, the 3D entities to be shown in the current hybrid reality space are added onto the virtual ground, and the user roams on the virtual ground created in the hybrid reality space.
3) Synchronize the hybrid reality space with the geographic information data of the real world
The real-world geographic information data corresponding to the current frame shown on the panoramic outer display screen are passed in real time, through the program interface between the 3D display platform and the geographic information platform, to the prepared geographic information platform, which calibrates and updates the position of the currently roaming user.
The hybrid reality referred to here is a computer virtual scene fused at the level of scene content: panoramic real-scene images serve as the scene background, and 3D modeled entities are organically combined into the scene built from those images.
The preparation of the panoramic real-scene image data and geographic information data comprises:
Using the data acquisition platform to collect the raw data needed to build the hybrid reality platform; each frame of panoramic real-scene image data and the corresponding geographic information data of that frame are stored continuously in the data acquisition memory, and dumped into the data storage unit of the data management backend;
In the data management backend, building a relation table between the storage path of each image frame stored in the data storage unit and the geographic information data of that frame, and storing the relation table in the database of the data management backend.
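The relation table described above can be sketched in Python with SQLite. This is an illustrative sketch only: the patent does not specify a schema, so the table name `frame_geo`, the column names, and the longitude/latitude fields are assumptions.

```python
import sqlite3

def build_relation_table(conn, frames):
    """Store, for each captured frame, the image's save path together with
    the geographic data recorded at the same trigger instant.
    `frames` is an iterable of (path, lon, lat) tuples (illustrative names)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS frame_geo ("
        " frame_id INTEGER PRIMARY KEY,"
        " image_path TEXT NOT NULL,"
        " lon REAL NOT NULL,"
        " lat REAL NOT NULL)"
    )
    conn.executemany(
        "INSERT INTO frame_geo (image_path, lon, lat) VALUES (?, ?, ?)",
        frames,
    )
    conn.commit()

def lookup_frame(conn, frame_id):
    """Return (image_path, lon, lat) for one frame of the roam path."""
    return conn.execute(
        "SELECT image_path, lon, lat FROM frame_geo WHERE frame_id = ?",
        (frame_id,),
    ).fetchone()
```

A roaming client would walk `frame_id` in order for continuous roaming, loading each panoramic image from the returned path and pushing the paired coordinates to the map.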
Creating the virtual ground comprises the following steps:
In the hybrid reality space containing the constructed panoramic outer display screen, establish an invisible surface as the virtual ground;
Determine the position of the virtual ground in the hybrid reality space, so that the image shown on the panoramic outer display screen is consistent with the perspective relations of the hybrid reality space.
Adding a 3D entity may be done by importing into the hybrid reality space a 3D solid model created outside it: before the user roams on the virtual ground in the hybrid reality space, the 3D solid model file produced with a 3D modeling tool is stored as raw entity data, which the 3D display platform then reads in through the program interface.
Alternatively, the 3D entity may be added inside the hybrid reality space: the attribute information of the entity to be drawn in the hybrid reality space is first stored in the 3D entity management unit of the data management backend. When the user roams to the geographic position of that prepared entity, the backend transfers the entity's attribute information for the current geographic position, through the program interface, to the 3D display platform, which draws the entity using the 3D graphics programming interface.
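The position-triggered loading just described can be illustrated with a small proximity query. A hedged sketch: the patent stores longitude/latitude, while this toy uses a local planar frame in metres, and the function name, field names, and trigger radius are invented for illustration.

```python
import math

def entities_to_draw(user_pos, entity_table, radius_m):
    """Return the ids of entities whose stored position lies within
    radius_m metres of the user's current position, i.e. the ones the
    3D entity management unit would hand to the display platform.
    Positions are (x, y) in metres in a local planar frame."""
    visible = []
    for entity in entity_table:
        dx = entity["pos"][0] - user_pos[0]
        dy = entity["pos"][1] - user_pos[1]
        if math.hypot(dx, dy) <= radius_m:
            visible.append(entity["id"])
    return visible
```

As the viewpoint moves, the backend would re-run this query and stream only the newly visible entities' model data to the 3D display platform.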
Roaming on the virtual ground comprises the following steps:
Calibrate the relation between distances on the virtual ground and actual distances in the real world: measure the distance from the user's viewpoint to the virtual ground in the hybrid reality space and the actual distance between the capture camera's center and the ground; the ratio of these two distances serves as the conversion scale from hybrid-reality-space distance to actual distance;
The user may roam in first-person view, in which the displayed scene shows other users' 3D person entities but not the user's own; or in third-person view, in which the user can fully observe the entity representing himself in the hybrid reality space while also seeing other users' entities in the displayed scene.
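The distance calibration can be written down directly. A minimal sketch, under the assumption that the conversion scale is the virtual viewpoint-to-ground distance divided by the measured actual camera-to-ground distance, so that dividing an in-space distance by the scale yields metres; the function names are illustrative.

```python
def space_to_world_scale(viewpoint_to_ground_virtual, camera_to_ground_actual_m):
    """Conversion scale from hybrid-reality-space distance to actual
    distance: ratio of the virtual viewpoint-to-ground distance to the
    measured camera-centre-to-ground distance (metres)."""
    return viewpoint_to_ground_virtual / camera_to_ground_actual_m

def virtual_to_metres(virtual_dist, scale):
    """Convert a distance travelled on the virtual ground into metres."""
    return virtual_dist / scale
```

This is computed once during initialization; afterwards every position on the virtual ground maps to a real longitude/latitude through this single scale factor.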
The first-person roaming may be continuous roaming, specifically:
Start and initialize the first-person continuous roaming program;
Read from the data management backend the first frame of panoramic real-scene image data of the pre-selected roam path and its corresponding geographic information data;
Texture the panoramic image onto the outer display screen, send the GPS data through the program interface to the geographic information platform to update the geographic position, and reset the user's viewpoint position in the hybrid reality space;
Judge whether the panoramic image currently shown on the outer display screen has reached the end frame;
If it has, the first-person continuous roaming program finishes;
If not, the 3D display platform requests from the backend the next frame of panoramic image data and its corresponding geographic information data, textures the newly requested image onto the outer display screen, and computes the distance between the current frame's and previous frame's geographic information data as well as the rotation angle of the panoramic capture device;
Using the calibrated correspondence between actual distance and hybrid-reality-space distance, move the viewpoint together with the panoramic outer display screen to the space position corresponding to the current frame's geographic information data, while sending that data through the program interface to the geographic information platform to update the position;
Return to the step "Texture the panoramic image onto the outer display screen, send the geographic information data through the program interface to the geographic information platform to update the geographic position, and reset the user's viewpoint position in the hybrid reality space."
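The continuous-roaming steps above can be sketched as one loop. The three callbacks stand in for the 3D display platform's texturing call, the GIS-platform update, and the viewpoint/screen move; they are placeholders for illustration, not interfaces named in the patent.

```python
def continuous_roam(frames, texture_screen, update_gis, move_viewpoint):
    """First-person continuous roaming over a pre-selected path: texture
    each frame onto the outer display screen, push its geographic data
    to the GIS platform, and advance the viewpoint until the end frame.
    `frames` is an iterable of (image, geo) pairs in capture order."""
    prev_geo = None
    for image, geo in frames:
        texture_screen(image)              # map panorama onto the outer screen
        update_gis(geo)                    # update position on the map
        if prev_geo is not None:
            move_viewpoint(prev_geo, geo)  # shift viewpoint + screen forward
        prev_geo = geo
    return prev_geo                        # geographic data of the end frame
```

The end-frame test of the flow chart becomes the loop's exhaustion of `frames`; in the real system each iteration would also issue the backend request for the next frame.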
The first-person roaming may instead be jump roaming, specifically:
Start and initialize the first-person jump roaming program;
Select, by clicking, a point on the virtual ground or on a 3D building entity inside the panoramic outer display screen, obtaining the coordinates of the clicked position on the virtual ground;
Take the initial position of the viewpoint center on the virtual ground as the origin of the hybrid reality space; obtain the actual real-world distance between the clicked position and the viewpoint center's initial position on the virtual ground, together with the rotation angle of the panoramic capture device, and compute the actual longitude and latitude of the clicked position;
The user decides whether to jump to the selected position; if a jump is needed, continue to the next step;
Send the computed actual position back to the data management backend, look up in the database the storage path of the panoramic real-scene image closest to that position, extract the image data from the data storage unit, transmit them back to the 3D display platform, redraw the panoramic outer display screen at the new position, and update the user's current geographic position on the geographic information platform;
The user decides whether to quit; if so, the first-person jump roaming program ends;
If not quitting, return to the step that takes the viewpoint center's initial position on the virtual ground as the origin of the hybrid reality space, obtains the actual distance and the capture device's rotation angle, and computes the actual longitude and latitude;
If no jump to the selected position is needed, return to that same step.
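The core computation of jump roaming — turning a click on the virtual ground into an actual longitude/latitude and then finding the closest stored frame — can be sketched as follows. The planar small-area approximation (about 111,320 m per degree of latitude) and all names are illustrative assumptions; the patent states only that the actual position is computed from the calibrated scale and the capture device's rotation angle.

```python
import math

def click_to_latlon(click_xy, origin_latlon, scale, heading_deg):
    """Convert a click position on the virtual ground into an actual
    (lat, lon): scale the offset from the space origin into metres,
    rotate it by the capture device's heading, and add it to the
    origin's coordinates (small-area planar approximation)."""
    east_v, north_v = click_xy
    east_m, north_m = east_v / scale, north_v / scale
    h = math.radians(heading_deg)
    de = east_m * math.cos(h) + north_m * math.sin(h)   # world east offset
    dn = -east_m * math.sin(h) + north_m * math.cos(h)  # world north offset
    lat0, lon0 = origin_latlon
    dlat = dn / 111_320.0                                   # m per deg latitude
    dlon = de / (111_320.0 * math.cos(math.radians(lat0)))  # shrinks with latitude
    return (lat0 + dlat, lon0 + dlon)

def nearest_frame(latlon, frame_index):
    """Pick the stored frame whose recorded position is closest to the
    requested point; frame_index maps image path -> (lat, lon)."""
    lat, lon = latlon
    return min(
        frame_index,
        key=lambda p: (frame_index[p][0] - lat) ** 2 + (frame_index[p][1] - lon) ** 2,
    )
```

In the full system the backend would run `nearest_frame` against the relation table and stream back the matching panorama for redrawing the outer display screen.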
The third-person roaming comprises the following steps:
Start the third-person roaming program;
Initialize the program and limit the roaming range of the user entity to the virtual ground of the current frame's scene;
Manipulate the 3D person entity corresponding to the user and begin roaming on the current frame's virtual ground;
As the user steers the 3D person entity away from the initial position, compute the entity's current distance from the initial position in the hybrid reality space and its rotation angle relative to the panoramic capture device at the initial position;
The user decides whether the viewpoint should jump to the person entity's current position;
If the viewpoint needs to jump there, continue to the next step;
Send the computed actual position back to the data management backend, look up in the database the storage path of the panoramic real-scene image closest to that position, extract the image data from the data storage unit, transmit the result back to the 3D display platform, draw the spherical outer display screen at the new position, and update the user's current geographic position on the geographic information platform;
The user decides whether to quit; if so, the third-person roaming program ends;
If not quitting, return to the step that computes the person entity's distance from the initial position and its rotation angle relative to the capture device;
If the user decides the viewpoint does not need to jump to the person entity's position, return to that same step.
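The quantity tracked while the avatar moves — its distance and rotation angle relative to the start position — can be sketched as below, assuming the angle is measured clockwise from the capture device's initial forward axis and distances are converted with the calibrated scale; the names are illustrative.

```python
import math

def avatar_offset(avatar_pos, start_pos, scale):
    """Distance (metres) and bearing of the user's 3D person entity
    relative to the roaming start point, as used when the viewpoint
    jumps to the avatar. Positions are (x, y) on the virtual ground,
    with y the initial forward axis; `scale` is the space-to-world
    conversion scale from the calibration step."""
    dx = avatar_pos[0] - start_pos[0]
    dy = avatar_pos[1] - start_pos[1]
    dist_m = math.hypot(dx, dy) / scale            # convert to metres
    angle_deg = math.degrees(math.atan2(dx, dy)) % 360.0
    return dist_m, angle_deg
```

The resulting metre offset and angle are what the backend would turn into an actual longitude/latitude before the nearest-frame lookup.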
The panoramic outer display screen may be spherical, cubic, cylindrical, or another enclosed solid.
Compared with the prior art, the beneficial effects of the present invention are as follows:
The present invention organically combines image-based virtual reality with virtual reality based on traditional 3D modeling, inheriting both the high realism that panoramic real-scene images bring to virtual scene construction and the strong sense of depth of 3D modeling. It effectively solves the poor realism, weak depth, and limited content of existing virtual reality systems, shown in particular by:
1. High fidelity, true scene structure, and rich content. The invention not only maps panoramic real-scene images onto the outer display screen with image-based rendering, but also uses 3D modeling to add 3D entities inside the hybrid reality space, efficiently building a content-rich three-dimensional real-scene hybrid reality space. The constructed hybrid reality space has high realism together with strong layering, depth, and immersion, and the user can appear in the virtual space as an entity. This method of taking panoramic real-scene images of real scenes as background and embedding 3D entities in the hybrid reality space effectively strengthens the immersion, layering, depth, and fidelity of the space — a characteristic not available in any existing virtual reality system.
2. The invention links the panorama-based hybrid reality space with a two-dimensional map of the real world (the geographic information platform). While roaming the hybrid reality space whose background is the panoramic outer display screen, the user also roams on the real-world map; the user is thus immersed in a real-world environment while simultaneously knowing his current geographic position, further increasing the realism of roaming.
3. The invention maps panoramic real-scene images onto the outer display screen to create real-world scenes and adds 3D modeled entities inside the hybrid reality space. This method provides a technical strategy for computer simulation of real-world scenes, and the user interface of the invention is concise, friendly, easy to operate, and information-rich.
Brief description of the drawings
Fig. 1 is the system architecture diagram;
Figs. 2A and 2B are schematic diagrams (a) and (b) of the hybrid reality space based on the spherical panoramic outer display screen;
Fig. 3 is the flow chart of the first-person continuous roaming program;
Fig. 4 is the flow chart of the first-person jump roaming program;
Fig. 5 is the flow chart of the third-person roaming program.
Embodiment
Embodiment 1
In the method of the invention, panoramic real-scene images serve as the scene background and 3D modeled entities are organically combined into the scene built from those images; such a computer virtual scene, fused at the level of scene content, is called hybrid reality.
The hybrid reality platform construction method of the invention is built on the system architecture shown in Fig. 1, which comprises: a data management backend 11 composed of a database 12, a data storage unit 13, and a 3D entity management unit 14; and an application interface 15 composed of a 3D display platform 16 and a geographic information platform 17.
During roaming in the hybrid reality space, data (comprising panoramic real-scene image data and geographic information data) are requested from the data management backend 11 through the application interface 15.
The data storage unit 13 is disk space reserved in the data management backend 11 for storing the prepared panoramic real-scene image data. In this embodiment the panoramic image data are saved in JPEG format.
The database 12 may be implemented with SQL Server, Oracle, SyBase, or the like. It stores the storage path of each prepared frame of panoramic real-scene image data in the data management backend 11 together with the geographic information data corresponding to that frame. When the application interface 15, during roaming in the hybrid reality space, requests the next frame of panoramic image data and its geographic information from the backend 11, the database 12 finds the corresponding image data in the data storage unit 13 through the frame's storage path, and returns them — together with the geographic information data associated with that path — to the application interface 15. Geographic information data are divided into indoor and outdoor data: indoor data may be collected with devices such as laser sensors, photoelectric encoders, or gyroscopes, while outdoor data may be collected with devices such as GPS; both kinds may also be obtained uniformly by calibrating the camera's extrinsic parameters.
This embodiment uses only a GPS device to collect outdoor geographic information (GPS data).
The 3D entity management unit 14 stores data such as the longitude/latitude position of each user-added 3D entity, the 3D entity data, and the user's ID. The user ID is used for rights management: only the user who added a 3D entity has the right to delete it.
The 3D display platform 16 uses a 3D graphics programming interface such as OpenGL or Direct3D to display the hybrid reality space in the client area of an MFC single-document program interface. The hybrid reality space combining panoramic real-scene images with 3D entities is built in the 3D display platform 16.
The geographic information platform 17 embeds an electronic map into the application interface 15 as a web page. Through a data interface function between the web page program and the 3D display platform 16, the electronic map obtains the geographic information data of the currently roaming user's position.
This embodiment adopts an application interface 15 based on the MFC single-document interface, embedding in it both the 3D display platform 16 and the geographic information platform (an electronic map in this embodiment).
In the hybrid reality platform construction method of the invention, the raw data needed to build the platform are first collected with the data acquisition platform. The data acquisition platform consists of a panoramic real-scene image capture device, a geographic information acquisition device (this embodiment uses a GPS system to collect outdoor geographic information data), a data acquisition control program, and a data acquisition memory. The panoramic capture device (such as a panoramic camera) collects panoramic real-scene image data, and the geographic information acquisition device collects GPS data. The data acquisition control program uses two threads to control the two devices respectively, triggering them synchronously with the same clock frequency signal, so that each frame of panoramic image data and the corresponding GPS data of that frame are stored continuously in the data acquisition memory. Each frame of panoramic image data is saved separately (for example in JPEG or BMP format; this specific embodiment uses JPEG) and finally stored in the data storage unit 13 of the data management backend 11. In the backend 11, a relation table is built between the storage path of each image frame stored in the data storage unit 13 and the GPS data of that frame, and stored in the database 12. In this way the storage path of every frame of panoramic real-scene image corresponds one-to-one with its GPS data in the database 12, and both the image data and the corresponding GPS data can be obtained through the frame's storage path.
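The two-thread, common-trigger acquisition described above can be sketched with a barrier standing in for the shared clock signal. This is a simplification under stated assumptions: real device drivers would run continuously against a hardware trigger, and `grab_image`/`grab_gps` are invented stand-ins for the camera and GPS drivers.

```python
import threading

def acquire(n_frames, grab_image, grab_gps):
    """Drive the panoramic camera and the GPS receiver from one trigger
    so every stored image frame is paired with the fix taken at the
    same instant. Returns the list of (image, gps) pairs in order."""
    store = []
    for _ in range(n_frames):
        tick = threading.Barrier(2)      # shared trigger for both workers
        result = {}

        def image_worker():
            tick.wait()                  # both workers release together
            result["image"] = grab_image()

        def gps_worker():
            tick.wait()
            result["gps"] = grab_gps()

        t1 = threading.Thread(target=image_worker)
        t2 = threading.Thread(target=gps_worker)
        t1.start(); t2.start()
        t1.join(); t2.join()
        store.append((result["image"], result["gps"]))
    return store
```

Each stored pair corresponds to one row of the path/GPS relation table that the backend later builds.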
The construction method of the panoramic/real-scene hybrid reality platform of the invention comprises the following steps:
1. Create the spherical panoramic outer display screen in 3D space
Use a 3D graphics programming interface (such as OpenGL or Direct3D; this embodiment uses OpenGL) to create the three-dimensional environment of the hybrid reality space, and draw the scene's outer display screen within it (this embodiment uses a spherical outer display screen 22). The collected panoramic real-scene image data are sent from the data management backend 11 to the 3D display platform 16 and texture-mapped onto the spherical outer display screen 22 with the texture mapping method of the 3D graphics interface, yielding the spherical panoramic outer display screen.
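The texture mapping onto the spherical screen amounts to associating each (u, v) texel of the equirectangular panorama with a point on the sphere. A sketch of that mapping follows; the actual embodiment does this through OpenGL texture coordinates on a sphere mesh, and the parametrization below is one common convention, not quoted from the patent.

```python
import math

def sphere_vertex(u, v, radius=1.0):
    """Map a texture coordinate (u, v) in [0,1]^2 of an equirectangular
    panorama onto the spherical outer display screen: u wraps around
    the equator (longitude), v runs pole to pole (latitude). Returns
    the (x, y, z) position where that texel is drawn."""
    lon = 2.0 * math.pi * u        # 0 .. 2*pi around the sphere
    lat = math.pi * (v - 0.5)      # -pi/2 .. pi/2
    x = radius * math.cos(lat) * math.cos(lon)
    y = radius * math.sin(lat)
    z = radius * math.cos(lat) * math.sin(lon)
    return (x, y, z)
```

Tessellating a grid of (u, v) values with this function and feeding the positions plus texture coordinates to the graphics API reproduces the spherical panoramic screen.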
2. Create the virtual ground and add 3D entities
The virtual ground is established in this embodiment as follows:
Establish the virtual ground 21 inside the constructed spherical panoramic outer display screen: the virtual ground 21 is an invisible surface used throughout the hybrid reality space for placing 3D building entities 24, 3D person entities 25, and the like.
To ensure that, observed from the user's viewpoint, the 3D building entities 24 correspond to the image scene on the spherical panoramic outer display screen, the image shown on the screen must be consistent with the perspective relations of the hybrid reality space. This perspective relation is jointly determined by the user's viewpoint height, the position of the virtual ground, and the image perspective on the spherical panoramic outer display screen. In this embodiment, two parallel lines are drawn on the virtual ground, corresponding to the roadside lines in the image. The viewpoint height and the virtual ground are adjusted until the two parallel lines on the virtual ground coincide with the road lines; the perspective relation is then successfully calibrated, and the positions of the virtual ground and the user's viewpoint are determined.
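The parallel-line calibration can be checked numerically with a pinhole model: a ground point at lateral offset x and forward distance z, seen from a viewpoint a height above the ground, projects to image column f·x/z and row −f·h/z. When the projected ground lines land on the measured roadside lines, the perspective is consistent. The projection convention and all names below are assumptions for illustration, not the patent's procedure.

```python
def project_ground_point(point, eye_height, focal):
    """Pinhole projection of a point (x: lateral, z: forward distance)
    on the virtual ground into image coordinates, with the viewpoint
    eye_height above the ground."""
    x, z = point
    u = focal * x / z                # image column
    v = -focal * eye_height / z      # image row (below the horizon)
    return (u, v)

def lines_coincide(ground_line, image_line, eye_height, focal, tol=1e-6):
    """True when each sampled ground-line point projects onto the
    corresponding measured image point -- the calibration criterion."""
    for ground_pt, image_pt in zip(ground_line, image_line):
        pu, pv = project_ground_point(ground_pt, eye_height, focal)
        if abs(pu - image_pt[0]) > tol or abs(pv - image_pt[1]) > tol:
            return False
    return True
```

Sweeping `eye_height` (and the ground position) until `lines_coincide` holds mirrors the manual adjustment the embodiment describes.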
The roaming method on the virtual ground is as follows:
First calibrate the relation between distances on the virtual ground and actual distances in the real world: measure the distance from the user's viewpoint to the virtual ground in the hybrid reality space and the actual distance between the capture camera's center and the ground. The ratio of these two distances serves as the conversion scale from hybrid-reality-space distance to actual distance; this is completed during initialization. Each position on the virtual ground thus corresponds to an actual longitude/latitude in the real world. The user's roam track 23 is shown in Fig. 2A. The invention provides two roam modes in the hybrid reality space: first-person roaming and third-person roaming, defined as follows. In first-person view, the scene displayed on the 3D display platform is what the user observes with his own eyes; the user cannot see his own 3D person entity but can see other users' person entities. In third-person view, the user can fully see the entity representing himself in the space as well as other users' entities, and when the user entity moves, the viewpoint follows the manipulated entity to the target position.
A. First-person roaming
First-person roaming comprises two methods: continuous roaming and jump roaming.
a) Continuous roaming
Continuous roaming follows the image capture path. The initial position of the viewpoint center on the virtual ground is the origin of the hybrid reality space, and corresponds to the actual latitude and longitude of the current frame's panoramic real-scene image. The height of the viewpoint is the value fixed in the virtual-ground position-calibration step.
Its program flow is shown in Fig. 3; the concrete method is as follows:
(31, 32) Start and initialize the first-person continuous roaming program;
(33) Read from data management backend 11 the first frame of panoramic real-scene image data on the preselected roaming path, together with the GPS data corresponding to that frame;
(34) Texture-map the panoramic real-scene image onto the spherical panoramic display screen in the hybrid reality space, send the GPS data through the program interface to geographic information platform 17 to update the geographic position, and reset the user's viewpoint position;
(35) During roaming, judge whether the panoramic real-scene image currently shown on the spherical panoramic display screen has reached the end frame. If it has, the first-person continuous roaming program ends (37); otherwise go to step (36);
(36) 3D display platform 16 requests from data management backend 11 the next frame of panoramic real-scene image data and its corresponding GPS data. The newly requested image data is texture-mapped onto the spherical panoramic display screen, and the distance between the current frame's GPS fix and the previous frame's GPS fix, together with the camera rotation angle, are computed (the camera rotation angle can be obtained either from the GPS data or from the camera's extrinsic calibration parameters; this embodiment obtains it from the GPS data);
Using the calibrated correspondence between real-world distances and hybrid-reality-space distances, move the viewpoint together with the spherical panoramic display screen to the hybrid-reality-space position corresponding to the current frame's GPS fix, while the GPS data is sent through the program interface to the geographic information platform to update the geographic position; go to step (34).
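Step (36) needs the ground distance and the camera rotation angle (heading) between two consecutive GPS fixes. The patent does not say which formulas are used; a hedged sketch with the standard haversine distance and initial-bearing formulas on a spherical Earth:

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius, an assumption of this sketch

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes (degrees)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial bearing from fix 1 to fix 2, clockwise from north in degrees;
    this plays the role of the camera rotation angle derived from GPS data."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return math.degrees(math.atan2(y, x)) % 360.0
```

The distance is then divided by the calibrated scale to obtain the hybrid-space displacement of the viewpoint and screen between frames.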
b) Jump roaming
Jump roaming lets the user, within the hybrid-reality-space region covered by the current frame's spherical panoramic display screen, pick any position on the prescribed path as the next destination, and displays the corresponding panoramic real-scene image at that position. The concrete implementation of jump roaming in this embodiment is shown in Fig. 4.
(41, 42) Start and initialize the first-person jump roaming program;
(43) Click a point on the virtual ground, or on a 3D building entity inside the spherical panoramic display screen, to roam: the click yields the hybrid-reality-space coordinates of the desired destination, which are projected onto the virtual ground (no projection is needed if the click falls directly on the virtual ground), giving the clicked position's coordinates on the virtual ground;
(44) With the initial position of the viewpoint center on the virtual ground as the origin of the hybrid reality space, obtain the real-world distance and the camera rotation angle between the clicked position and the viewpoint center's initial position, and compute the actual latitude and longitude of the clicked position;
(45) The user decides whether to jump to the selected position; if a jump is needed, continue with step (46), otherwise go to step (44);
(46) Send the computed actual geographic position back to data management backend 11, look up in database 12 the storage path of the panoramic real-scene image closest to that position, extract the image data from data storage unit 13, transmit it back to 3D display platform 16, redraw the spherical display screen at the new position, and simultaneously update the user's current geographic position on geographic information platform 17;
(47) The user decides whether to quit; if so, the first-person jump roaming program ends (48); if not, go to step (44).
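Steps (44) and (46) turn a hybrid-space offset into an actual latitude and longitude: given the origin's known fix, the converted real-world distance, and the rotation angle (bearing), a destination fix can be computed. The patent does not name a formula; the standard direct great-circle computation on a spherical Earth is one way to sketch it:

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius, an assumption of this sketch

def destination_latlon(lat, lon, distance_m, bearing_deg):
    """Latitude/longitude reached by travelling distance_m along bearing_deg
    (clockwise from north) starting at (lat, lon), all in degrees."""
    d = distance_m / EARTH_RADIUS_M          # angular distance
    b = math.radians(bearing_deg)
    p1, l1 = math.radians(lat), math.radians(lon)
    p2 = math.asin(math.sin(p1) * math.cos(d) +
                   math.cos(p1) * math.sin(d) * math.cos(b))
    l2 = l1 + math.atan2(math.sin(b) * math.sin(d) * math.cos(p1),
                         math.cos(d) - math.sin(p1) * math.sin(p2))
    return math.degrees(p2), math.degrees(l2)
```

The resulting fix is what step (46) sends back to the data management backend to retrieve the nearest stored panoramic frame.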
B. Third-person roaming
The concrete implementation of third-person roaming in this embodiment is shown in Fig. 5.
(51) Start the third-person roaming program;
(52) Initialize the third-person roaming program and limit the user entity's roaming range to the virtual-ground region of the current frame's scene;
The invention does not restrict the shape of the roaming range: it may be a circle, a rectangle, or another shape. This embodiment uses a circular region whose radius is determined by the spherical display screen's radius at the current frame; here the roaming-region radius is about one half of the spherical panoramic display screen's radius.
(53) The user steers his or her 3D avatar 25 to roam on the virtual ground 21 of the current frame (the concrete control method is prior art and is embodied in the program). Because each point on virtual ground 21 represents an actual real-world latitude and longitude, the avatar's roaming distance on virtual ground 21 likewise represents the corresponding actual distance in the real world.
(54) As the user steers avatar 25 away from the initial position, compute the distance in the hybrid reality space between the avatar's current position and the initial position, together with the camera rotation angle relative to the initial position. Convert this hybrid-space distance to a real-world distance using the calibrated hybrid-space-to-real-world scale. Since the latitude and longitude of the initial position are known, the actual latitude and longitude of avatar 25's position can then be obtained from the camera rotation angle and the actual distance between the two positions;
(55) The user decides whether the viewpoint should jump to avatar 25's current position; if so, continue with step (56), otherwise go to step (54);
(56) When the viewpoint is to jump to avatar 25's current position, send the computed actual geographic position back to data management backend 11, look up in database 12 the storage path of the panoramic real-scene image closest to that position, extract the image data from data storage unit 13, transmit the result back to 3D display platform 16, draw the spherical display screen at the new position, and simultaneously update the user's current geographic position on geographic information platform 17;
(57) The user decides whether to quit; if so, the third-person roaming program ends (58); if not, go to step (54).
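The circular roaming range of step (52) can be enforced with a simple clamp on the avatar's ground position. A sketch, taking the radius as half the spherical screen's radius as in this embodiment (names are illustrative):

```python
import math

def clamp_to_roam_range(x, z, cx, cz, sphere_radius):
    """Confine the avatar's ground coordinates (x, z) to a circular roaming
    zone centred on (cx, cz), the current frame's viewpoint; the zone radius
    is half the spherical display screen's radius in this embodiment."""
    max_r = sphere_radius / 2.0
    dx, dz = x - cx, z - cz
    r = math.hypot(dx, dz)
    if r <= max_r:
        return x, z
    s = max_r / r                      # scale the offset back onto the circle
    return cx + dx * s, cz + dz * s
```

When the avatar would leave the zone, it is projected back onto the zone boundary, so the user can never steer beyond the region covered by the current panoramic frame.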
This embodiment imports 3D entities created outside 3D display platform 16 into the platform (here a 3D entity includes characters and all entities other than characters, such as signboards and buildings; likewise below). The concrete method is as follows: a 3D solid model file built with a 3D modeling tool (such as MAYA or 3DMax) is stored in a file format such as .3ds, .max or .ma, yielding the entity's raw data (the positions of the entity's points and lines in space, and its surface texture data). This raw data is then read into 3D display platform 16 through the program interface, completing the import of the 3D entity.
During roaming, the range within which 3D entities are displayed in the hybrid reality space is determined by the radius of the spherical panoramic display screen. Specifically, a 3D entity is visible in the current frame of the hybrid reality space only while the distance from its position on the virtual ground to the viewpoint center of the current frame's spherical panoramic display screen is less than the screen's radius.
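The visibility rule above reduces to a distance test against the screen radius; a minimal sketch (the patent gives no code, so names are illustrative):

```python
import math

def entity_visible(entity_pos, viewpoint_pos, screen_radius):
    """True while an imported 3D entity lies inside the spherical panoramic
    display screen of the current frame and should therefore be drawn.

    entity_pos, viewpoint_pos: (x, y, z) positions in hybrid-space units.
    screen_radius: radius of the spherical display screen, same units.
    """
    return math.dist(entity_pos, viewpoint_pos) < screen_radius
```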
3. Synchronizing the hybrid reality space with the geographic information data (electronic map data) of the real world
As the user roams on the virtual ground of the hybrid reality space, the geographic position changes along with the roaming scene. The real-world geographic information data corresponding to the current frame on the spherical panoramic display screen, i.e. the GPS data, is passed in real time, through the program interface between 3D display platform 16 and geographic information platform 17, to the prepared electronic map, which marks and updates the user's current roaming position on the map.
The detailed process of synchronizing the hybrid reality space with the electronic map data is as follows:
As shown in Fig. 2B, geographic information platform 17 is embedded in application program interface 15 in the form of a two-dimensional electronic map 26. Through the data-interface function between the web-based electronic map and 3D display platform 16, the 2D map obtains the GPS data of the user's current roaming position. Using the map's API functions, the obtained GPS fix is marked with an icon as the user's geographic position 27 on the map. In this process, the 2D electronic map calls the data-interface function of 3D display platform 16 once every 500 ms (a value that can be changed according to real-time requirements) to acquire data and update the user's geographic position 27 on electronic map 26.
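The periodic 500 ms synchronization can be sketched as a self-rescheduling timer. `get_gps` and `update_marker` below stand in for the 3D platform's data-interface function and the map's API call; both names, and the class itself, are assumptions of this sketch:

```python
import threading

class MapSync:
    """Push the roaming user's GPS fix to the 2D map every `interval`
    seconds (0.5 s in the embodiment; adjustable for real-time needs)."""

    def __init__(self, get_gps, update_marker, interval=0.5):
        self.get_gps = get_gps          # callable returning the current fix
        self.update_marker = update_marker  # callable drawing it on the map
        self.interval = interval
        self._timer = None

    def _tick(self):
        # Pull the fix, draw it, then reschedule ourselves.
        self.update_marker(self.get_gps())
        self._timer = threading.Timer(self.interval, self._tick)
        self._timer.daemon = True
        self._timer.start()

    def start(self):
        self._tick()

    def stop(self):
        if self._timer is not None:
            self._timer.cancel()
```

Polling on a timer matches the pull-style data-interface call the embodiment describes; a push-based design (the 3D platform notifying the map on every frame change) would be an equally valid alternative.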
Embodiment 2
The difference from Embodiment 1 is:
The scene's outer display screen is shaped as a cubic space (a regular hexahedron).
The method of adding 3D entities in the hybrid reality space is to create them inside the space using the 3D graphics program interface, specifically as follows:
While roaming, the user can observe buildings and places inside the cubic panoramic display screen, yet in the real scene many buildings and places carry no distinct identifying mark. This embodiment uses the 3D graphics program interface to add 3D entities inside the cubic panoramic display screen; taking a virtual signboard as the example, it solves the problem of identifying scenery while the user roams in the hybrid reality space. The concrete implementation is:
The attribute information of each 3D virtual signboard (such as its real-world latitude and longitude, facing angle, size, pattern, inscription text, and font) is stored in advance in 3D entity management unit 14 of data management backend 11. When the user roams to that geographic position in the hybrid reality space, data management backend 11 transfers the 3D entity information for the current geographic position (such as the 3D virtual signboard) from 3D entity management unit 14 to 3D display platform 16 through the program interface; the virtual signboard is then drawn in 3D display platform 16 using the 3D graphics program interface, indicating a specific building or place in the scene.
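The signboard attribute record and the backend's proximity lookup can be sketched as follows. All field and function names are illustrative, and the naive latitude/longitude box stands in for whatever spatial query the backend actually uses:

```python
from dataclasses import dataclass

@dataclass
class SignboardInfo:
    """Attribute record kept in the 3D entity management unit for one
    virtual signboard (fields mirror the attributes listed in the text)."""
    lat: float          # real-world latitude of the signboard
    lon: float          # real-world longitude
    heading_deg: float  # which way the board faces
    width_m: float
    height_m: float
    text: str           # inscription
    font: str

def signs_near(signs, lat, lon, radius_deg=0.001):
    """Return the signboards to hand to the 3D display platform when the
    user roams into their vicinity (simple lat/lon box for this sketch)."""
    return [s for s in signs
            if abs(s.lat - lat) <= radius_deg and abs(s.lon - lon) <= radius_deg]
```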
Embodiment 3
The difference from Embodiment 1 is:
The scene's outer display screen is shaped as a cylindrical space.
The method of adding 3D entities inside the hybrid reality space is to create them using the 3D graphics program interface, and also to import into the hybrid reality space 3D solid models created outside 3D display platform 16.
The outer display screen of the scene in the present invention can also be created as other enclosed solids.
The above embodiments merely illustrate, and do not limit, the present invention. Persons in the relevant technical field may make various transformations and variations without departing from the spirit and scope of the invention, so all equivalent technical schemes also fall within the protection scope of the claims of the present invention.

Claims (9)

1. A construction method of a panoramic/realistic hybrid reality platform, characterized in that it comprises the following steps:
1) creating a panoramic real-scene display screen in 3D space:
using a 3D graphics program interface to create the three-dimensional environment of a hybrid reality space, and drawing a scene display screen in that environment; sending the collected panoramic real-scene image data from a data management backend to a 3D display platform application program interface, and texture-mapping it onto the scene display screen, thereby obtaining the panoramic real-scene display screen in the hybrid reality space;
2) creating a virtual ground and adding 3D entities:
establishing, in the already built hybrid reality space based on the panoramic display screen, a virtual ground for placing 3D entities, namely adding onto the virtual ground the 3D entities to be displayed in the hybrid reality space of the currently shown panoramic display screen; the user roams on the virtual ground created in the hybrid reality space;
3) synchronizing the hybrid reality space with the geographic information data of the real world:
passing the real-world geographic information data corresponding to the current frame image of the hybrid reality space on the panoramic display screen, in real time through the program interface between the 3D display platform and a geographic information platform, to the prepared geographic information platform, marking and updating the currently roaming user's position on the geographic information platform;
wherein said roaming on the virtual ground comprises the following steps:
calibrating the relationship between distances on the virtual ground and actual distances in the real world, namely measuring the distance from the user's viewpoint to the virtual ground in the hybrid reality space and the actual distance from the principal optical axis of the panoramic real-scene image collecting device's camera lens to the ground, the ratio of these two distances serving as the scale for converting hybrid-reality-space distances to actual distances;
performing first-person roaming, in which the user cannot observe his or her own 3D avatar in the displayed scene; or performing third-person roaming, in which the user can fully observe his or her own 3D avatar in the displayed scene;
said first-person roaming being continuous roaming, specifically:
starting and initializing the first-person continuous roaming program;
reading from the data management backend the first frame of panoramic real-scene image data on the preselected roaming path and the geographic information data corresponding to that frame;
texture-mapping the panoramic real-scene image onto the scene display screen in the hybrid reality space, sending the GPS data through the program interface to the geographic information platform to update the geographic position, and resetting the user's viewpoint position;
judging whether the panoramic real-scene image currently shown on the panoramic display screen has reached the end frame;
if the end frame has been reached, the first-person continuous roaming program finishes;
or, if the end frame has not been reached, the 3D display platform requests from the data management backend the next frame of panoramic real-scene image data and the geographic information data corresponding to that frame, texture-maps the newly requested image data onto the scene display screen, and computes the distance between the current frame's geographic information data and the previous frame's geographic information data together with the rotation angle of the panoramic real-scene image collecting device;
using the already calibrated correspondence between actual distances and hybrid-reality-space distances, moving the viewpoint together with the panoramic display screen to the hybrid-reality-space position corresponding to the current frame's geographic information data, while sending the geographic information data through the program interface to the geographic information platform to update the geographic position;
going to the step of "texture-mapping the panoramic real-scene image onto the scene display screen in the hybrid reality space, sending the GPS data through the program interface to the geographic information platform to update the geographic position, and resetting the user's viewpoint position".
2. The construction method of a panoramic/realistic hybrid reality platform according to claim 1, characterized in that: said hybrid reality refers to a computer virtual scene in which the panoramic real-scene image serves as the scene background and 3D modeled entities are organically merged, in scene content, into the scene constructed from the panoramic real-scene image.
3. The construction method of a panoramic/realistic hybrid reality platform according to claim 1, characterized in that the preparation of said panoramic real-scene image data and geographic information data comprises:
using a data acquisition platform to collect the raw data needed to build the hybrid reality platform, storing each frame of panoramic real-scene image data together with that frame's corresponding geographic information data contiguously in the data acquisition memory, and dumping them into the data storage unit of the data management backend;
establishing, in the data management backend, a relation table linking the storage path of each frame of image data stored in the data storage unit with that frame's corresponding geographic information data, and storing the relation table in the database of the data management backend.
4. The construction method of a panoramic/realistic hybrid reality platform according to claim 1, characterized in that said creating a virtual ground comprises the following steps:
establishing an invisible plane as the virtual ground in the hybrid reality space where the built panoramic display screen is located;
determining the position of said virtual ground in the hybrid reality space so as to guarantee that the image shown on the panoramic display screen is consistent with the perspective relations of the hybrid reality space.
5. The construction method of a panoramic/realistic hybrid reality platform according to claim 1, characterized in that said adding 3D entities is:
importing into the hybrid reality space 3D solid models created outside the hybrid reality space, namely, before roaming on the virtual ground in the hybrid reality space, storing as data the 3D solid model file built with a 3D modeling tool, obtaining the raw data of the 3D entity, and then reading said raw data into the 3D display platform through the program interface.
6. The construction method of a panoramic/realistic hybrid reality platform according to claim 1, characterized in that said adding 3D entities is:
adding 3D entities inside the hybrid reality space, namely first storing in the 3D entity management unit of the data management backend the attribute information of each 3D entity to be drawn in the hybrid reality space; when the user roams in the hybrid reality space to the geographic position of a 3D entity prepared for drawing, the data management backend transfers the attribute information of the 3D entity at the current geographic position from the 3D entity management unit to the 3D display platform through the program interface, and the 3D entity is drawn in the 3D display platform using the 3D graphics program interface.
7. The construction method of a panoramic/realistic hybrid reality platform according to claim 1, characterized in that said first-person roaming is jump roaming, specifically:
starting and initializing the first-person jump roaming program;
clicking a point on the virtual ground, or on a 3D building entity inside the panoramic display screen, to roam, obtaining the clicked position's coordinates on the virtual ground;
with the initial position of the viewpoint center on the virtual ground as the origin of the hybrid reality space, obtaining the real-world distance and the rotation angle of the panoramic real-scene image collecting device between the clicked position and the viewpoint center's initial position, and computing the actual latitude and longitude of the clicked position;
the user judging whether to jump to the selected position; if a jump is needed, continuing with the next step: sending the computed actual position back to the data management backend, looking up in the database the storage path of the panoramic real-scene image closest to that position, extracting the image data from the data storage unit, transmitting it back to the 3D display platform, redrawing the panoramic display screen at the new position, and simultaneously updating the user's current geographic position on the geographic information platform;
the user judging whether to quit; if quitting, the first-person jump roaming program ends; or, if not quitting, going to the step of "with the initial position of the viewpoint center on the virtual ground as the origin of the hybrid reality space, obtaining the real-world distance and the rotation angle of the panoramic real-scene image collecting device between the clicked position and the viewpoint center's initial position, and computing the actual latitude and longitude of the clicked position";
or, if there is no need to jump to the selected position, going to the step of "with the initial position of the viewpoint center on the virtual ground as the origin of the hybrid reality space, obtaining the real-world distance and the rotation angle of the panoramic real-scene image collecting device between the clicked position and the viewpoint center's initial position, and computing the actual latitude and longitude of the clicked position".
8. The construction method of a panoramic/realistic hybrid reality platform according to claim 1, characterized in that said third-person roaming comprises the following steps:
starting the third-person roaming program;
initializing the third-person roaming program and limiting the user entity's roaming range to the virtual-ground region of the current frame's scene;
steering the 3D avatar corresponding to the user to roam on the virtual ground of the current frame;
when the user steers the 3D avatar away from the initial position, computing the distance in the hybrid reality space between the avatar's current position and the initial position, together with the rotation angle of the panoramic real-scene image collecting device relative to the initial position;
the user judging whether the viewpoint needs to jump to the avatar's current position;
when the viewpoint needs to jump to the avatar's current position, continuing with the next step:
sending the computed actual position back to the data management backend, looking up in the database the storage path of the panoramic real-scene image closest to that position, extracting the image data from the data storage unit, transmitting the result back to the 3D display platform, drawing the spherical display screen at the new position, and simultaneously updating the user's current geographic position on the geographic information platform;
the user judging whether to quit; if quitting, the third-person roaming program ends;
or, if not quitting, going to the step of "when the user steers the 3D avatar away from the initial position, computing the distance in the hybrid reality space between the avatar's current position and the initial position, together with the rotation angle of the panoramic real-scene image collecting device relative to the initial position";
or, if the user judges that the viewpoint does not need to jump to the avatar's current position, going to the step of "when the user steers the 3D avatar away from the initial position, computing the distance in the hybrid reality space between the avatar's current position and the initial position, together with the rotation angle of the panoramic real-scene image collecting device relative to the initial position".
9. The construction method of a panoramic/realistic hybrid reality platform according to claim 1, characterized in that: said panoramic display screen is spherical, cubic, or cylindrical.
CN 200910219617 2009-11-04 2009-11-04 Construction method of panoramic/realistic hybrid reality platform Expired - Fee Related CN102054290B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 200910219617 CN102054290B (en) 2009-11-04 2009-11-04 Construction method of panoramic/realistic hybrid reality platform

Publications (2)

Publication Number Publication Date
CN102054290A CN102054290A (en) 2011-05-11
CN102054290B true CN102054290B (en) 2013-11-06

Family

ID=43958574

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 200910219617 Expired - Fee Related CN102054290B (en) 2009-11-04 2009-11-04 Construction method of panoramic/realistic hybrid reality platform

Country Status (1)

Country Link
CN (1) CN102054290B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1838176A (en) * 2006-04-06 2006-09-27 胡小云 Method for making urban three-dimensional dynamic traveling network map
CN1932799A (en) * 2006-09-04 2007-03-21 罗中根 System and method for simulating real three-dimensional virtual network travel
CN1957378A (en) * 2004-07-13 2007-05-02 上条有 Image processing program, recording medium, and apparatus
CN1987857A (en) * 2006-12-18 2007-06-27 于慧 Method for realizing digital city system using virtual artificial comprehensive image and text information interaction

Also Published As

Publication number Publication date
CN102054290A (en) 2011-05-11

Similar Documents

Publication Publication Date Title
CN102054290B (en) Construction method of panoramic/realistic hybrid reality platform
CN102054289B (en) 3D virtual community construction method based on panoramic and real image and geographic information
CN102054121B (en) Method for building 3D (three-dimensional) panoramic live-action online game platform
CN110222137B (en) Intelligent campus system based on oblique photography and augmented reality technology
CN108446310B (en) Virtual street view map generation method and device and client device
CN104484327A (en) Project environment display method
CN104102678B (en) Implementation method and device for augmented reality
CN106296783B (en) A space representation method combining a global 3D spatial view and panoramic pictures
CN102052916A (en) Method for three-dimensional measurement of panoramic real scenes
CN110379010A (en) Three-dimensional geographic information method for visualizing and system based on video fusion
CN105719343A (en) Method for constructing virtual streetscape map
CN106780734A (en) An intelligent guide service system based on virtual panoramas
CN108337664B (en) Tourist attraction augmented reality interactive navigation system and method based on geographical position
CN101872243B (en) System and method for realizing 360-degree panoramic play following real space direction
Shahabi et al. GeoDec: A framework to effectively visualize and query geospatial data for decision-making
CN106162204A (en) Panoramic video generation and playing method, apparatus and system
JP2005250560A (en) Landscape display device
US20180322143A1 (en) Interactive Device With Three-Dimensional Display
CN109242966A (en) A 3D panorama modeling method based on laser point cloud data
CN105741340B (en) A power transmission line three-dimensional scene simulation method and system for web page display
CN110058696A (en) A virtual reality implementation method, its application method, and related technical devices
CN112446804A (en) Intelligent tourism system based on country culture and virtual reality
CN102930585A (en) Method for using Flash to achieve three-dimensional network map diorama
CN117351797B (en) Position real-time linkage system
CN111932446B (en) Method and device for constructing three-dimensional panoramic map

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C53 Correction of patent for invention or patent application
CB02 Change of applicant information

Address after: 110004, room 183, No. five, 906 South Road, Heping District, Liaoning, Shenyang

Applicant after: Shenyang Xunjing Technology Co., Ltd.

Address before: 110004, room 183, No. five, 906 South Road, Heping District, Liaoning, Shenyang

Applicant before: Shenyang Longhui Technology Co., Ltd.

COR Change of bibliographic data

Free format text: CORRECT: APPLICANT; FROM: SHENYANG LONGHUI TECHNOLOGY CO., LTD. TO: SHENYANG QUICKSCENE TECHNOLOGY CO., LTD.

C14 Grant of patent or utility model
GR01 Patent grant
DD01 Delivery of document by public notice

Addressee: The European Silai (Beijing) Technology Co. Ltd.

Document name: Notification that the Application is Deemed Not to Have Been Filed

TR01 Transfer of patent right

Effective date of registration: 20170516

Address after: 100097, room 4, building 98, 302 West Lake Road, Mentougou District, Beijing

Patentee after: The European Silai (Beijing) Technology Co. Ltd.

Address before: 110004, room 183, No. five, 906 South Road, Heping District, Liaoning, Shenyang

Patentee before: Shenyang Xunjing Technology Co., Ltd.

CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20131106

Termination date: 20191104
