CN102262705A - Virtual reality method of actual scene - Google Patents


Info

Publication number
CN102262705A
CN102262705A
Authority
CN
China
Prior art keywords
avatar
area
panorama
said step
virtual scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2010101879272A
Other languages
Chinese (zh)
Inventor
Meng Bing (孟兵)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN2010101879272A
Publication of CN102262705A
Legal status: Pending


Abstract

The invention discloses a method for converting a real scene into a virtual scene in a computer and reproducing it, in particular a method that lets users tour scenic spots through a multimedia system. The method comprises the following steps: taking multiple panoramic photographs in a real-world area, and packaging all of the resulting panoramas to obtain the virtual scene data of that area. When users access the virtual scene data through dedicated browsing software, they obtain a fully realistic feeling of being present at the scene.

Description

Real-scene virtual reality method
Technical field:
The present invention relates to a method for converting a real scene into a virtual scene in a computer and reproducing it, and in particular to a method that allows a user to tour scenic spots through a multimedia system.
Background technology:
Virtual reality uses computer simulation to produce a three-dimensional virtual world and provides the user with simulated visual, auditory, and other sensory input, so that the user feels present in the scene and can observe objects in the three-dimensional space freely and in real time. Current virtual reality technology mainly generates graphical scenes by real-time computation in order to simulate visual perception, and is widely applied in education, geography, urban planning, interior design, entertainment, and tourism. However, current technology has the following defects: 1. the degree of realism is low, so the user is not given a fully realistic visual impression; 2. the simulation is coarse, and handling fine detail is costly; 3. the time cost of constructing models of delicate objects or environments is high.
Summary of the invention:
The purpose of the present invention is to provide a real-scene virtual reality method. To overcome the above defects, the technical scheme of the present invention is as follows:
A real-scene virtual reality method comprises the following steps:
a. Take multiple panoramic photographs in a real-world area, comprising the following steps:
a1. Determine the area in which a visitor can move freely; this is also the area in which the avatar can move freely in the virtual scene, and is called the real area;
a2. Determine a number of observation points within this area; an observation point is a position where a visitor may stop, and likewise a position where the avatar may stop in the virtual scene; a visitor can pause at an observation point to observe the surroundings;
a3. At each observation point, use a photographic instrument to take a panoramic photograph of the surroundings.
b. Package all of the panoramic photographs obtained in step a to obtain the virtual scene data of the real area, comprising the following steps:
b1. Number all of the panoramic photographs obtained in step a according to the observation point at which the photographic instrument was located when each was taken;
b2. Place all of the numbered panoramic photographs under the same directory, or write them into a single data file; this directory or data file is called the virtual scene data of the real area.
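The packaging in steps b1 and b2 can be sketched as follows. The file-naming scheme and directory layout are illustrative assumptions; the patent only requires that each panorama be numbered by its observation point and gathered into one directory or data file:

```python
from pathlib import Path

def panorama_filename(x: int, y: int) -> str:
    """Name a panorama by the grid coordinates of the observation point
    where it was taken, e.g. (3, 7) -> 'pano_x003_y007.jpg'.
    The exact format is a hypothetical convention."""
    return f"pano_x{x:03d}_y{y:03d}.jpg"

def package_scene(photos: dict, out_dir: str) -> Path:
    """Place all numbered panoramas under one directory; that directory
    then constitutes the 'virtual scene data' of the real area.
    photos maps (x, y) observation-point coordinates to JPEG bytes."""
    scene = Path(out_dir)
    scene.mkdir(parents=True, exist_ok=True)
    for (x, y), data in photos.items():
        (scene / panorama_filename(x, y)).write_bytes(data)
    return scene
```

Writing all photos into a single archive file instead of a directory would satisfy the same step, since b2 allows either form.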
c. The user accesses the virtual scene data through dedicated browsing software, whose working steps comprise:
c1. Determine the avatar's current position and facing direction in the virtual scene;
c2. Select the panoramic photograph corresponding to the avatar's current position, i.e. the one whose photographic instrument was located, when it was taken, at the observation point in the real area corresponding to the avatar's current position;
c3. From the panoramic photograph obtained in c2, select the image region corresponding to the avatar's current facing direction; the content of this region is the scene the avatar currently sees;
c4. Play the image region obtained in c3 on the display;
c5. Monitor in real time the requests the user sends to the browser, to obtain the avatar's direction of motion and direction of rotation;
c6. From the direction of motion and direction of rotation obtained in c5, together with the avatar's current position and facing direction, determine the avatar's next position and facing direction;
c7. Use the next position and facing direction obtained in c6 to update the avatar's current position and facing direction;
c8. Repeat c2 to c7.
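The loop c1 to c8 can be sketched as a small state machine. The in-memory panorama store and the grid-step motion model below are hypothetical; real browsing software would decode the selected image region and display it in steps c3 and c4:

```python
class VirtualSceneBrowser:
    """Minimal sketch of the browsing loop c1-c8 (illustrative API)."""

    def __init__(self, panoramas):
        # panoramas: dict mapping (x, y) observation-point coordinates
        # to the panorama taken at that point.
        self.panoramas = panoramas
        self.position = (1, 1)   # c1: current avatar position
        self.heading = 0.0       # c1: current facing direction, degrees

    def frame(self, move=(0, 0), turn=0.0):
        """One pass of c2-c7. move is a grid step, turn a rotation."""
        # c2: select the panorama taken at the avatar's current position
        pano = self.panoramas[self.position]
        # c3/c4: the visible region is determined by the facing
        # direction; display(view) would be called here
        view = (pano, self.heading)
        # c5-c7: apply the user's requested motion and rotation,
        # refusing moves that leave the real area
        x, y = self.position
        nx, ny = x + move[0], y + move[1]
        if (nx, ny) in self.panoramas:
            self.position = (nx, ny)
        self.heading = (self.heading + turn) % 360.0
        return view
```

Rejecting a move whose target has no panorama is one simple way to keep the avatar inside the real area defined in step a1.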
The panoramic photographs in step a above are spherical or cylindrical panoramas.
In step c above, when 48 or more images are played per second, the user no longer perceives flicker when the image switches.
In step c above, when 16 or more images are played per second and their content changes continuously, the user perceives a continuously changing picture and has a very realistic sensation of walking through the actual environment.
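The two thresholds above imply per-frame time budgets that one pass of the browsing loop must stay within; the arithmetic is:

```python
def frame_budget_ms(fps: float) -> float:
    """Maximum time, in milliseconds, that one pass of the browsing
    loop (c2 to c7) may take if the given frame rate is sustained."""
    return 1000.0 / fps

# 16 fps (continuous-motion threshold): 62.5 ms per frame.
# 48 fps (flicker-free threshold): roughly 20.8 ms per frame.
```

This is consistent with the embodiment's requirement that one pass of the loop take no more than 1/16 second.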
The present invention has the following advantages:
1. The pictures are real-scene pictures, giving the user a fully realistic visual impression.
2. The user can move the avatar freely and smoothly within the virtual scene and can change its facing direction at will, thereby changing the viewing angle.
3. Response is real-time and the picture is smooth, with modest demands on the user's computer.
4. The cost of constructing the virtual environment model is independent of the complexity of the real environment itself; the more complex the real environment, the more economical this method is by comparison.
5. It is easy to combine with stereo sound, haptic, and similar systems to give the user a complete feeling of being present at the scene.
Description of drawings:
Fig. 1 is a schematic diagram of the panoramic photography process in an embodiment of the invention.
Fig. 2 is a schematic diagram of the working principle of the browsing software in an embodiment of the invention.
Fig. 3 is a schematic diagram of the user-side equipment in an embodiment of the invention.
Embodiment:
The present invention is described in further detail below with reference to the drawings and an embodiment.
As shown in Fig. 1, a schematic diagram of the panoramic photography process in this embodiment, the process comprises the following steps:
a. Determine the real area 1, the area in which a visitor can move freely and, likewise, in which the avatar can move freely in the virtual scene;
b. Divide the real area 1 into a grid of equal cells using mutually perpendicular lines x1, x2, ..., x100 and y1, y2, ..., y100; every intersection of these lines that lies within the real area 1 is an observation point;
c. At each observation point, use a photographic instrument to take a panoramic photograph of the surroundings; for every panoramic shot the camera's initial orientation is the same, and the panoramas are spherical or cylindrical.
After panoramic photography is finished at all observation points, number the resulting panoramas by the position of the photographic instrument at the time of shooting, the numbers being the corresponding observation point coordinates (x1, y1), ..., (x1, y100), (x2, y1), ..., (x2, y100), ..., (x100, y100).
Place all of the numbered panoramas under the same directory; this directory is called the virtual scene data.
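The numbering order of the embodiment can be reproduced as follows; the function is a sketch that enumerates observation-point coordinates in the sequence (x1, y1), ..., (x1, y100), (x2, y1), ..., (x100, y100) given above:

```python
import itertools

def observation_points(nx: int, ny: int):
    """Enumerate the observation points of an nx-by-ny grid in the
    numbering order used in the embodiment: all y values for x1,
    then all y values for x2, and so on."""
    return [(x, y) for x, y in itertools.product(range(1, nx + 1),
                                                 range(1, ny + 1))]
```

For the 100 by 100 grid of the embodiment this yields 10,000 observation points, one panorama per point.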
As shown in Fig. 2, a schematic diagram of the working principle of the browsing software in this embodiment, the working process comprises:
a. Determine the avatar's current position and facing direction in the virtual scene;
b. Select the panoramic photograph 2 whose number equals the coordinates of the avatar's current position, i.e. the one whose photographic instrument was located, when it was taken, at the observation point in the real area corresponding to the avatar's current position;
c. From the panoramic photograph obtained in b, select the image region corresponding to the avatar's current facing direction; the content of this region is the scene the avatar currently sees, i.e. the current screening area 3;
d. Play the current screening area 3 obtained in c on the display;
e. Monitor in real time the requests the user sends to the browser, to obtain the avatar's direction of motion and direction of rotation;
f. From the direction of motion and direction of rotation obtained in e, together with the avatar's current position and facing direction, determine the avatar's next position and facing direction;
g. Use the next position and facing direction obtained in f to update the avatar's current position and facing direction;
h. Repeat b to g; one pass through b to g takes no more than 1/16 second.
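For a cylindrical panorama, step c (selecting the image region for the current facing direction) reduces to choosing a range of pixel columns. The geometry below is an illustrative assumption, since the patent does not specify the cropping computation:

```python
def visible_columns(heading_deg: float, fov_deg: float, pano_width: int):
    """For a cylindrical panorama of pano_width pixel columns covering
    360 degrees, return the half-open column spans visible to a viewer
    facing heading_deg with a horizontal field of view of fov_deg.
    The view can wrap past the panorama's seam, so the result is a
    list of one or two (start, end) spans."""
    px_per_deg = pano_width / 360.0
    start = (heading_deg - fov_deg / 2.0) % 360.0 * px_per_deg
    width = fov_deg * px_per_deg
    s, e = int(start), int(start + width)
    if e <= pano_width:
        return [(s, e)]
    # wrap around the seam at column 0
    return [(s, pano_width), (0, e - pano_width)]
```

A spherical panorama would additionally need a vertical (pitch) crop, but the same idea applies.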
As shown in Fig. 3, a schematic diagram of the user-side equipment in this embodiment, the equipment comprises a display 5, a keyboard 4, a mouse 6, and a host computer 7. The user controls the avatar's walking and rotation with the keyboard 4 and mouse 6, and visually browses the scene through the display 5; the host computer 7 runs the browsing software.

Claims (10)

1. A real-scene virtual reality method, characterized in that it comprises at least the following steps:
a. taking multiple panoramic photographs in a real-world area;
b. packaging all of the panoramic photographs obtained in step a to obtain the virtual scene data of the real area;
c. the user accessing the virtual scene data through dedicated browsing software.
2. The method according to claim 1, characterized in that said step a comprises: a1. determining the area in which a visitor can move freely, this area also being the area in which the avatar can move freely in the virtual scene, called the real area; a2. determining a number of observation points within the real area, an observation point being a position where a visitor may stop and likewise a position where the avatar may stop in the virtual scene, a visitor being able to pause at an observation point to observe the surroundings; a3. at each observation point, using a photographic instrument to take a panoramic photograph of the surroundings.
3. The method according to claim 1, characterized in that said step b comprises: b1. numbering all of the panoramic photographs obtained in step a according to the observation point at which the photographic instrument was located when each was taken.
4. The method according to claim 3, characterized in that said step b further comprises: b2. placing all of the numbered panoramic photographs under the same directory or writing them into a single data file, this directory or data file being called the virtual scene data of the real area.
5. The method according to claim 1, characterized in that the working steps of the browsing software in said step c comprise: c1. determining the avatar's current position and facing direction in the virtual scene.
6. The method according to claim 1, characterized in that the working steps of the browsing software in said step c further comprise: c2. selecting the panoramic photograph corresponding to the avatar's current position, i.e. the one whose photographic instrument was located, when it was taken, at the observation point in the real area corresponding to the avatar's current position; c3. from the panoramic photograph obtained in c2, selecting the image region corresponding to the avatar's current facing direction, the content of this region being the scene the avatar currently sees.
7. The method according to claim 6, characterized in that the working steps of the browsing software in said step c further comprise: c4. playing the image region obtained in c3 on the display.
8. The method according to claim 7, characterized in that the working steps of the browsing software in said step c further comprise: c5. monitoring in real time the requests the user sends to the browser, to obtain the avatar's direction of motion and direction of rotation; c6. from the direction of motion and direction of rotation obtained in c5, together with the avatar's current position and facing direction, determining the avatar's next position and facing direction; c7. using the next position and facing direction obtained in c6 to update the avatar's current position and facing direction.
9. The method according to claim 1, characterized in that the panoramic photographs in said step a are spherical or cylindrical panoramas.
10. The method according to claim 8, characterized in that one pass through said steps c2 to c7 takes no more than 1/16 second.
CN2010101879272A 2010-05-31 2010-05-31 Virtual reality method of actual scene Pending CN102262705A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2010101879272A CN102262705A (en) 2010-05-31 2010-05-31 Virtual reality method of actual scene


Publications (1)

Publication Number Publication Date
CN102262705A true CN102262705A (en) 2011-11-30

Family

ID=45009329

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2010101879272A Pending CN102262705A (en) 2010-05-31 2010-05-31 Virtual reality method of actual scene

Country Status (1)

Country Link
CN (1) CN102262705A (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105844714A * 2016-04-12 2016-08-10 广州凡拓数字创意科技股份有限公司 Augmented reality based scenario display method and system
CN106066702A * 2016-08-03 2016-11-02 温州大学 Cultural space simulation method based on multimedia digitization technology
CN106157359A * 2015-04-23 2016-11-23 中国科学院宁波材料技术与工程研究所 Design method of a virtual scene experience system
CN106940897A * 2017-03-02 2017-07-11 苏州蜗牛数字科技股份有限公司 Method for introducing real shadows into an AR scene
CN107093204A * 2017-04-14 2017-08-25 苏州蜗牛数字科技股份有限公司 Method for influencing the shadow effect of virtual objects based on a panorama
CN107957772A * 2016-10-17 2018-04-24 阿里巴巴集团控股有限公司 Processing method for capturing VR images in a real scene and method for realizing a VR experience
CN109656441A * 2018-12-21 2019-04-19 广州励丰文化科技股份有限公司 Guide method and system based on virtual reality
CN113096453A * 2020-01-08 2021-07-09 沈阳农业大学 Shared panoramic teaching mode and teaching system


Similar Documents

Publication Publication Date Title
US10893219B2 (en) System and method for acquiring virtual and augmented reality scenes by a user
CN102262705A (en) Virtual reality method of actual scene
CN100534158C (en) Generating images combining real and virtual images
US8217995B2 (en) Providing a collaborative immersive environment using a spherical camera and motion capture
CN109584295A (en) The method, apparatus and system of automatic marking are carried out to target object in image
CN107168534B (en) Rendering optimization method and projection method based on CAVE system
CN110850977B (en) Stereoscopic image interaction method based on 6DOF head-mounted display
CN111880659A (en) Virtual character control method and device, equipment and computer readable storage medium
CN108377361B (en) Display control method and device for monitoring video
US10861249B2 (en) Methods and system for manipulating digital assets on a three-dimensional viewing platform
WO2020084951A1 (en) Image processing device and image processing method
CN106028115A (en) Video playing method and device
CN107957772B (en) Processing method for collecting VR image in real scene and method for realizing VR experience
US20200273243A1 (en) Remote monitoring and assistance techniques with volumetric three-dimensional imaging
JP2021034885A (en) Image generation device, image display device, and image processing method
CN113941138A (en) AR interaction control system, device and application
JP6709426B2 (en) Image display control device and program
CN113253843B (en) Indoor virtual roaming realization method and realization system based on panorama
CN115857163A (en) OsgEarth-based holographic intelligent sand table display method and device and medium
Chheang et al. Natural embedding of live actors and entities into 360 virtual reality scenes
JP7006912B2 (en) Image processing device, image display device and image processing program
CN112825215A (en) Nuclear power plant anti-anthropogenic training system and method based on virtual reality technology
Zhang et al. Virtual Museum Scene Design Based on VRAR Realistic Interaction under PMC Artificial Intelligence Model
US20230177759A1 (en) Systems and methods for facilitating scalable shared rendering
US20240078767A1 (en) Information processing apparatus and information processing method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
DD01 Delivery of document by public notice

Addressee: Meng Bing

Document name: Notification of Publication of the Application for Invention

DD01 Delivery of document by public notice

Addressee: Meng Bing

Document name: Notification that Application Deemed to be Withdrawn

DD01 Delivery of document by public notice

Addressee: Meng Bing

Document name: Notification before Expiration of the Request for Examination as to Substance

C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20111130