CN104021585A - Three-dimensional exhibition method based on real scene - Google Patents
Three-dimensional exhibition method based on real scene
- Publication number
- CN104021585A CN104021585A CN201410251245.1A CN201410251245A CN104021585A CN 104021585 A CN104021585 A CN 104021585A CN 201410251245 A CN201410251245 A CN 201410251245A CN 104021585 A CN104021585 A CN 104021585A
- Authority
- CN
- China
- Prior art keywords
- real scene
- client
- panorama photo
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Landscapes
- Studio Devices (AREA)
- Processing Or Creating Images (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
Abstract
The invention provides a three-dimensional exhibition method based on a real scene. The method involves a server and a plurality of clients connected to the server, where a client may be a wearable device or a mobile client. When the client is a wearable device, the real scene can be faithfully reproduced, enabling virtual touring and exhibition, in line with the direction in which the technology is advancing. When the client is a mobile client, the panorama can be shown as a webpage on desktop systems, mobile iOS, Android, and other platforms, and images can be browsed without installing any additional components.
Description
Technical field
The invention belongs to the field of virtual reality, and in particular to techniques that reproduce real scenes with panoramic digital imaging. It can be widely applied in fields such as tourism, real estate, education, and games.
Background technology
Displaying a lifelike real scene with 3D technology lets users tour a destination or view a property without leaving home, obtaining a near-real experience at little cost. As the functionality of the Internet, wearable devices, and mobile devices keeps improving, this technology will be applied ever more widely.
At present, the main technologies for realizing the above functions are the following:
1) Use three-dimensional modeling to produce a near-real three-dimensional scene that simulates the real world. Although this method is flexible, its production cost is high, the sense of realism is limited, the amount of data to process is large, and the demands on device performance are high.
2) Use digital image processing to produce a panoramic digital image, and synthesize in real time the image presented to the eyes as the viewing angle changes, so as to obtain the impression of a real scene. This method offers high precision, low demands on device performance, and a strong sense of realism. At present, image display is usually implemented with JAVA or FLASH. For example, patent 2009102425467 discloses a panorama-based virtual tour method implemented with FLASH. However, as technology progresses, and especially as the mobile Internet develops, users demand ever more of each technology's effect, speed, and ease of use. Representing a real scene with a panorama therefore requires the following characteristics: light weight, cross-platform operation, and clarity. JAVA leads to large products and slow speed; FLASH keeps the product small, but its performance requirements are high and it is not cross-platform.
Summary of the invention
In view of the above problems, the invention provides a lightweight, cross-platform three-dimensional display method based on a real scene with a good display effect.
The invention provides a three-dimensional display method based on a real scene. It involves a server and several clients connected to the server, each client comprising at least one display. The method comprises the following steps:
1) At each viewpoint, use a fisheye lens to shoot, in order, digital images covering all viewing angles centered on the camera; the digital images at least include the views to the front, rear, left, right, and directly above the camera.
2) Digital image synthesis: the server fuses all the digital images of a viewpoint by projecting them, in order, onto a sphere and performing feature matching and image fusion on the edges between the digital images, finally obtaining a spherical panorama photo with the viewpoint as the center of the sphere.
3) Output: deliver the panorama photo to the client, and show on the client's display the portion of the panorama photo at the default viewing angle.
4) Use a sensor to detect changes in the client's viewing angle, and show on the display the portion of the panorama photo corresponding to the client's current viewing angle.
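Steps 2) to 4) all revolve around one mapping: a viewing direction corresponds to a point on the spherical panorama. A minimal Python sketch of that mapping for an equirectangular panorama image follows; the axis convention and image layout are illustrative assumptions, not prescribed by the patent.

```python
import math

def direction_to_equirect(dx: float, dy: float, dz: float,
                          width: int, height: int):
    """Map a 3D viewing direction to (u, v) pixel coordinates on an
    equirectangular spherical panorama. Axis convention (an assumption
    for this sketch): +z forward, +x right, +y up."""
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    yaw = math.atan2(dx, dz)                  # -pi..pi, 0 = forward
    pitch = math.asin(dy / norm)              # -pi/2..pi/2, +pi/2 = up
    u = (yaw / (2 * math.pi) + 0.5) * width   # longitude -> column
    v = (0.5 - pitch / math.pi) * height      # latitude  -> row
    return u, v
```

Looking straight ahead lands in the middle of the image, and looking straight up lands on its top row, which is how step 4) turns a sensed viewing angle into the local portion of the panorama to display.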
Preferably, the client is a wearable display device. It comprises a shell; inside the shell are two displays, one for each of the wearer's eyes, and a gyroscope that monitors changes in the shell's angle. The server connects to the displays through a video cable to carry the video and image signals, and the gyroscope connects to the server through a USB cable to report the client's angle changes.
Preferably, in step 2), based on the characteristics of naked-eye 3D, a left-eye panorama and a right-eye panorama are synthesized separately, with the left-right eye separation set to each viewer's interpupillary distance; in step 3), 3D display software streams the left-eye panorama and the right-eye panorama to the left-eye display and the right-eye display, respectively.
Preferably, an adjusting knob is provided on the shell; the adjusting knob regulates the spacing and angle between the two displays.
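As an illustration of the left/right-eye separation in the preferred embodiment above, the following sketch computes the two virtual camera positions from a viewing yaw and an interpupillary distance. The 0.064 m default and the coordinate convention are assumptions for the example; the patent leaves the exact separation adjustable per viewer.

```python
import math

def eye_positions(yaw_deg: float, ipd_m: float = 0.064):
    """Horizontal (x, z) positions of the two virtual eye cameras for a
    viewing direction given by `yaw_deg` (measured from +z). Each eye
    sits half an interpupillary distance to one side, perpendicular to
    the view direction."""
    yaw = math.radians(yaw_deg)
    # unit vector pointing to the viewer's right for view dir (sin yaw, cos yaw)
    rx, rz = math.cos(yaw), -math.sin(yaw)
    half = ipd_m / 2.0
    left = (-rx * half, -rz * half)
    right = (rx * half, rz * half)
    return left, right
```

Rendering the panorama twice from these two offset positions is what produces the stereoscopic (naked-eye 3D style) left/right image pair.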
Preferably, step 2) further comprises the following steps:
A) Retouch the panorama photo with Photoshop (PS) to eliminate ghosting, burrs, and aliasing in the panorama photo;
B) Render the panorama photo with WebGL to improve the display quality of the panorama photo.
Preferably, the client is one of the following devices: a smartphone or a tablet computer, and the client connects to the server over the mobile Internet.
Preferably, the following steps are included between steps 2) and 3):
C) Apply low-loss compression to the panorama photo;
D) Build a webpage with HTML5 that shows the panorama photo at a specified viewing angle; in step 3), stream this webpage to the client.
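The "low-loss compression" of step C) is not specified further in the patent. One minimal way to illustrate the idea — a small, bounded quantization loss traded for a much smaller compressed stream, which the client-side rendering pass later smooths over — is sketched below on synthetic data.

```python
import random
import zlib

def quantize(pixels: bytes, levels: int = 32) -> bytes:
    """Requantize 8-bit samples to fewer intensity levels. The loss is
    bounded (here under 8 grey levels), and the simpler stream
    compresses far better under an entropy coder such as deflate."""
    step = 256 // levels
    return bytes((p // step) * step for p in pixels)

# A synthetic "panorama row"; a real pipeline would feed image data.
random.seed(0)
row = bytes(random.randrange(128, 176) for _ in range(4096))

raw_size = len(zlib.compress(row, 9))
low_loss_size = len(zlib.compress(quantize(row), 9))
max_err = max(a - b for a, b in zip(row, quantize(row)))
```

The quantized row compresses to a fraction of the original size while every sample stays within one quantization step of its true value.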
Preferably, step 3) further comprises the following step:
E) On the client, the webpage uses WebGL to render the panorama photo it displays, improving the display quality of the panorama photo.
Preferably, in step 4), the webpage converts changes in the client's viewing angle into the corresponding viewing-angle changes in spherical space, and presents the panorama at the corresponding viewing angle to the client, producing an immersive impression.
The three-dimensional display method based on a real scene of the present invention presents the virtual effect of a real scene in both a wearable-device mode and a browser mode. On a mobile terminal, the webpage automatically invokes WebGL and similar technologies to render the image, saving transmission volume while achieving a good display effect; it works across desktop systems, mobile iOS, Android, and other platforms, and images can be browsed without installing additional components. On a wearable device, transmitting the video data and the control signals separately improves the device's response speed and display fluency.
Accompanying drawing explanation
The present invention is described in further detail below with reference to the drawings and specific embodiments.
Fig. 1 is a schematic diagram of the processing flow of the first embodiment of the invention;
Fig. 2 is a schematic structural diagram of the second embodiment of the invention.
Embodiment
To help those skilled in the art better understand the solution of the present invention, and to make the above objects, features, and advantages of the invention clearer, the invention is described in further detail below with reference to the embodiments and their accompanying drawings.
The invention provides a three-dimensional display method based on a real scene, comprising a server and several clients connected to the server. Two embodiments are provided for implementing the method: a mobile terminal and a wearable device.
In the first embodiment, using a mobile terminal, the processing flow is as shown in Fig. 1:
1) Shoot digital images covering all viewing angles centered on the viewpoint.
The concrete steps are:
1.1) Select the scene and plan the roaming route;
1.2) Fix the tripod and level it;
1.3) Mount the pan-tilt head on the tripod and level it;
1.4) Install the camera. A fisheye camera is used: the fisheye lens widens the shooting angle, producing more overlapping regions for image synthesis and improving fusion quality;
1.5) Choose the first viewing angle and take the first photo; rotate clockwise, taking one photo every 90 degrees; finally tilt up 90 degrees and take the photo directly overhead;
1.6) Move along the roaming route and shoot the digital photos of the next scene, repeating the above actions;
1.7) Input the photos to the server, in scene order and shooting order, for the digital image synthesis operation. These steps are simple, easy to perform, and suited to ordinary users.
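The shooting sequence of steps 1.4) and 1.5) can be written out as data. In this sketch, the 180-degree fisheye field of view is an assumption for illustration; any lens wider than the 90-degree spacing leaves some overlap between neighbouring shots for the feature matching in step 2).

```python
def capture_plan(fisheye_fov_deg: float = 180.0):
    """The shooting sequence above as (yaw, pitch) pairs: four
    horizontal shots taken clockwise 90 degrees apart, then one shot
    straight up (step 1.5). Also returns the angular overlap between
    horizontally adjacent shots, which the stitcher needs."""
    shots = [(yaw, 0.0) for yaw in (0.0, 90.0, 180.0, 270.0)]
    shots.append((0.0, 90.0))                 # zenith shot
    overlap_deg = fisheye_fov_deg - 90.0      # between neighbouring yaws
    return shots, overlap_deg
```

With a 180-degree lens, each horizontal shot overlaps its neighbours by a full 90 degrees, which is why the patent notes that the fisheye lens "improves fusion quality".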
2) Digital image synthesis: the server fuses all the digital images of a viewpoint and builds a spherical panorama photo with the viewpoint as the center of the sphere. This technology is mature; see, for example, the method disclosed in patent 2009102425467. During this process the digital images may be compressed to some extent to reduce their size. The present invention projects all the digital images onto a sphere in order, performs feature matching and image fusion on the edges between the digital images, and finally obtains the spherical panorama photo with the viewpoint as the center of the sphere.
3) Compress the spherical panorama photo, and build a webpage with HTML5 that shows the panorama photo at a specified viewing angle;
4) Output the webpage to the client;
5) The client uses WebGL to render the panorama photo shown on the webpage, improving the display quality of the panorama photo and compensating for the image loss caused by compression;
6) Use a sensor to detect changes in the client's viewing angle and adjust the viewing angle of the displayed panorama photo, producing an immersive impression. The sensor may be the gyroscope widely used in mobile phones and tablet computers, or any other sensor capable of this function. An acceleration sensor or the like may further be added to obtain the direction and distance of the client's movement, so that the panorama photos of successive viewpoints along the roaming route are presented in turn, giving the impression of moving through the real scene.
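The viewing-angle bookkeeping in step 6) can be sketched as follows. Wrapping yaw and clamping pitch are the usual conventions for panorama viewers and are assumptions here, not requirements stated in the patent.

```python
def update_view(yaw_deg: float, pitch_deg: float,
                d_yaw: float, d_pitch: float):
    """Fold one gyroscope reading (delta angles) into the current
    viewing angles: yaw wraps around the full circle, pitch is clamped
    at the poles so the viewport cannot flip over the zenith or nadir."""
    yaw = (yaw_deg + d_yaw) % 360.0
    pitch = max(-90.0, min(90.0, pitch_deg + d_pitch))
    return yaw, pitch
```

The resulting (yaw, pitch) pair selects which local portion of the spherical panorama the webpage displays next.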
Through the above steps, cross-platform panorama display is achieved. Anyone can use a digital camera (especially an SLR camera) to shoot omnidirectional photos of any place as required and upload them to the server; the server automatically fuses the pictures, synthesizes a spherical panorama, and builds it into a webpage. Because the browser is standard equipment, the client can open this webpage and browse the resulting panorama without installing any other software or components. Before the panorama is transferred to the client, it is first compressed to reduce the amount of data transmitted; when displaying it, the client uses WebGL and similar technologies to render the picture, markedly improving its quality. The invention is well adapted to the mobile-Internet era and its demand for "sharing" anytime and anywhere, and has market prospects.
As shown in Fig. 2, the client of the second embodiment is a wearable display device. It comprises a shell 10; inside the shell 10 are two displays 12, one for each of the wearer's eyes, and a gyroscope (not shown) that monitors changes in the angle of the shell 10. A server (not shown) connects to the displays through a video cable 14 to carry high-definition video and image signals, and the gyroscope connects to the server through a USB cable 16 to report the angle changes of the shell 10.
Its processing steps differ slightly from the first embodiment: the panorama is rendered on the server. This simplifies the requirements on the wearable display device, so any wearable device with a display function can implement the method.
In addition, the server calculates the true viewing-angle deviation between the left and right eyes from the person's interpupillary distance, builds two webpages with the panorama photos at the left-eye and right-eye viewing angles respectively, and streams each webpage to one display. In this way, distortion caused by the differing viewing angles of the two eyes is avoided.
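The per-eye viewing-angle deviation mentioned above can be illustrated with the standard parallax relation. The formula is a textbook approximation supplied here for illustration; the patent does not state one explicitly.

```python
import math

def parallax_deg(ipd_m: float, distance_m: float) -> float:
    """Angular deviation between the left- and right-eye views of a
    point at `distance_m`, for eyes separated by `ipd_m`:
    2 * atan((ipd/2) / d). Nearby points show large deviation,
    distant points almost none."""
    return math.degrees(2.0 * math.atan((ipd_m / 2.0) / distance_m))
```

For a typical 0.064 m interpupillary distance, a point one meter away subtends roughly a 3.7-degree deviation between the eyes, which is the kind of offset the server bakes into the two per-eye panoramas.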
A knob 18 for adjusting the interpupillary distance is also provided on the shell 10; the knob 18 regulates the spacing and angle of the two displays. Thus different people can wear this wearable display device and all obtain a very high sense of realism.
The above is only a specific embodiment of the present invention. The scope of protection of the invention is not limited thereto; any variation or replacement that a person skilled in the art could readily conceive within the technical scope disclosed by the invention shall fall within the scope of protection of the invention. Therefore, the scope of protection of the invention shall be determined by the scope of protection defined by the claims.
Claims (9)
1. A three-dimensional display method based on a real scene, involving a server and several clients connected to the server, each client comprising at least one display, characterized in that the method comprises the following steps:
1) at each viewpoint, using a fisheye lens to shoot, in order, digital images covering all viewing angles centered on the camera, the digital images at least including the views to the front, rear, left, right, and directly above the camera;
2) digital image synthesis, in which the server fuses all the digital images of a viewpoint, projecting them in order onto a sphere and performing feature matching and image fusion on the edges between the digital images, finally obtaining a spherical panorama photo with the viewpoint as the center of the sphere;
3) output, in which the panorama photo is delivered to the client, and the portion of the panorama photo at the default viewing angle is shown on the client's display;
4) detecting changes in the client's viewing angle with a sensor, and showing on the display the portion of the panorama photo corresponding to the client's current viewing angle.
2. The three-dimensional display method based on a real scene according to claim 1, characterized in that the client is a wearable display device comprising a shell; two displays, one for each of the wearer's eyes, are arranged inside the shell, together with a gyroscope that monitors changes in the shell's angle; the server connects to the displays through a video cable to carry the video and image signals, and the gyroscope connects to the server through a USB cable to report the client's angle changes.
3. The three-dimensional display method based on a real scene according to claim 2, characterized in that in step 2), based on the characteristics of naked-eye 3D, a left-eye panorama and a right-eye panorama are synthesized separately, the left-right eye separation being set to each viewer's interpupillary distance; and in step 3), 3D display software streams the left-eye panorama and the right-eye panorama to the left-eye display and the right-eye display, respectively.
4. The three-dimensional display method based on a real scene according to claim 3, characterized in that an adjusting knob is further provided on the shell, the adjusting knob regulating the spacing and angle between the two displays.
5. The three-dimensional display method based on a real scene according to claim 4, characterized in that step 2) further comprises the following steps:
A) retouching the panorama photo with Photoshop (PS) to eliminate ghosting, burrs, and aliasing in the panorama photo;
B) rendering the panorama photo with WebGL to improve the display quality of the panorama photo.
6. The three-dimensional display method based on a real scene according to claim 1, characterized in that the client is one of the following devices: a smartphone or a tablet computer, the client connecting to the server over the mobile Internet.
7. The three-dimensional display method based on a real scene according to claim 1, characterized in that the following steps are included between steps 2) and 3):
C) applying low-loss compression to the panorama photo;
D) building a webpage with HTML5 that shows the panorama photo at a specified viewing angle; and, in step 3), streaming the webpage to the client.
8. The three-dimensional display method based on a real scene according to claim 7, characterized in that step 3) further comprises the following step:
E) on the client, the webpage using WebGL to render the panorama photo it displays, to improve the display quality of the panorama photo.
9. The three-dimensional display method based on a real scene according to claim 8, characterized in that in step 4), the webpage converts changes in the client's viewing angle into the corresponding viewing-angle changes in spherical space, and presents the panorama at the corresponding viewing angle to the client, producing an immersive impression.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410251245.1A CN104021585B (en) | 2014-06-09 | 2014-06-09 | Three-dimensional exhibition method based on real scene |
Publications (2)
Publication Number | Publication Date |
---|---|
CN104021585A true CN104021585A (en) | 2014-09-03 |
CN104021585B CN104021585B (en) | 2017-04-26 |
Family
ID=51438321
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410251245.1A Expired - Fee Related CN104021585B (en) | 2014-06-09 | 2014-06-09 | Three-dimensional exhibition method based on real scene |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN104021585B (en) |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101877140A (en) * | 2009-12-18 | 2010-11-03 | 北京邮电大学 | Panorama-based panoramic virtual tour method |
Non-Patent Citations (7)
Title |
---|
Liu Zijian et al. (eds.): "Computer-Aided Design: CAD Principles and Application Technology", 30 September 1997 |
Liu Haina: "Research on Panorama Roaming Technology Based on HTML5", China Master's Theses Full-text Database, Information Science and Technology (monthly) |
Zhang Zhe et al.: "Implementation Technology and Application of Three-dimensional Panoramic Renderings", Computer Systems & Applications |
Zhang Hui et al.: "Research and Implementation of Panoramic Image Generation Algorithms", Computer Engineering |
Yang Chaoran: "Research and Implementation of Fisheye Image Stitching Based on a Spherical Model", China Master's Theses Full-text Database, Information Science and Technology (monthly) |
Bao Yuhong: "Three-dimensional Panorama Display System Based on the Mobile Internet", Network and Information Engineering |
Bao Yuhong et al.: "Campus Roaming Platform Based on Three-dimensional Real Scenes", Computer Knowledge and Technology |
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017032336A1 (en) * | 2015-08-26 | 2017-03-02 | Holumino Limited | System and method for capturing and displaying images |
CN105597313B (en) * | 2015-09-16 | 2020-03-03 | 网易(杭州)网络有限公司 | Spherical scene display method and display device |
CN105597313A (en) * | 2015-09-16 | 2016-05-25 | 网易(杭州)网络有限公司 | Spherical scene presentation method and presentation device |
CN105306887A (en) * | 2015-09-21 | 2016-02-03 | 北京奇虎科技有限公司 | Method and device for sharing panoramic data |
CN105975172A (en) * | 2015-12-11 | 2016-09-28 | 乐视网信息技术(北京)股份有限公司 | Method and device for adjustment of panoramic video and mobile terminal |
CN107844190B (en) * | 2016-09-20 | 2020-11-06 | 腾讯科技(深圳)有限公司 | Image display method and device based on virtual reality VR equipment |
CN107844190A (en) * | 2016-09-20 | 2018-03-27 | 腾讯科技(深圳)有限公司 | Image presentation method and device based on Virtual Reality equipment |
WO2018054267A1 (en) * | 2016-09-20 | 2018-03-29 | 腾讯科技(深圳)有限公司 | Image display method and device utilized in virtual reality-based apparatus |
US10754420B2 (en) | 2016-09-20 | 2020-08-25 | Tencent Technology (Shenzhen) Company Limited | Method and device for displaying image based on virtual reality (VR) apparatus |
CN106887033A (en) * | 2017-01-20 | 2017-06-23 | 腾讯科技(深圳)有限公司 | The rendering intent and device of scene |
WO2018133757A1 (en) * | 2017-01-20 | 2018-07-26 | 腾讯科技(深圳)有限公司 | Method and device for rendering scene, and storage medium |
CN108960951A (en) * | 2017-05-23 | 2018-12-07 | 阿里巴巴集团控股有限公司 | A kind of method and apparatus of order processing |
CN107198876A (en) * | 2017-06-07 | 2017-09-26 | 北京小鸟看看科技有限公司 | The loading method and device of scene of game |
CN108848311A (en) * | 2018-07-25 | 2018-11-20 | 北京小米移动软件有限公司 | Distant view photograph display methods and device |
CN110083231A (en) * | 2019-03-12 | 2019-08-02 | 杭州电子科技大学 | A kind of WebGL panorama display methods shown towards Android VR integral type head |
CN110083231B (en) * | 2019-03-12 | 2022-04-08 | 杭州电子科技大学 | WebGL panoramic display method for android VR integrated head display |
CN114651221A (en) * | 2019-09-11 | 2022-06-21 | 萨万特系统公司 | Three-dimensional virtual room-based user interface for home automation system |
CN110968962A (en) * | 2019-12-19 | 2020-04-07 | 武汉英思工程科技股份有限公司 | Cloud rendering-based three-dimensional display method and system at mobile terminal or large screen |
CN110968962B (en) * | 2019-12-19 | 2023-05-12 | 武汉英思工程科技股份有限公司 | Three-dimensional display method and system based on cloud rendering at mobile terminal or large screen |
Also Published As
Publication number | Publication date |
---|---|
CN104021585B (en) | 2017-04-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104021585A (en) | Three-dimensional exhibition method based on real scene | |
US10217189B2 (en) | General spherical capture methods | |
US11024083B2 (en) | Server, user terminal device, and control method therefor | |
CN106101741B (en) | Method and system for watching panoramic video on network video live broadcast platform | |
EP3166079A1 (en) | Augmented reality method and system based on wearable device | |
EP3337158A1 (en) | Method and device for determining points of interest in an immersive content | |
US20150235408A1 (en) | Parallax Depth Rendering | |
WO2018000609A1 (en) | Method for sharing 3d image in virtual reality system, and electronic device | |
CN106296781B (en) | Special effect image generation method and electronic equipment | |
CN102075694A (en) | Stereoscopic editing for video production, post-production and display adaptation | |
WO2017128887A1 (en) | Method and system for corrected 3d display of panoramic image and device | |
JP7425196B2 (en) | hybrid streaming | |
WO2019076348A1 (en) | Virtual reality (vr) interface generation method and apparatus | |
TW201828258A (en) | Method and device for rendering scene | |
CN105812768A (en) | Method and system for playing 3D video in VR (Virtual Reality) device | |
TW201701051A (en) | Panoramic stereoscopic image synthesis method, apparatus and mobile terminal | |
JP7110378B2 (en) | METHOD AND PROGRAM FOR PROVIDING AUGMENTED REALITY IMAGE USING DEPTH DATA | |
US20210058611A1 (en) | Multiviewing virtual reality user interface | |
US11887249B2 (en) | Systems and methods for displaying stereoscopic rendered image data captured from multiple perspectives | |
CN106547557A (en) | A kind of multi-screen interactive exchange method based on virtual reality and bore hole 3D | |
TWI817335B (en) | Stereoscopic image playback apparatus and method of generating stereoscopic images thereof | |
TWM630947U (en) | Stereoscopic image playback apparatus | |
WO2024174050A1 (en) | Video communication method and device | |
US11688124B2 (en) | Methods and apparatus rendering images using point clouds representing one or more objects | |
US9875526B1 (en) | Display of three-dimensional images using a two-dimensional display |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | |
PB01 | Publication | |
C10 | Entry into substantive examination | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |
CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 2017-04-26; termination date: 2018-06-09 |