CN114758062A - Panoramic roaming scene construction method and device, computer equipment and storage medium - Google Patents

Panoramic roaming scene construction method and device, computer equipment and storage medium

Info

Publication number
CN114758062A
CN114758062A
Authority
CN
China
Prior art keywords
calibration
coordinate
roaming
panorama
panoramic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210269585.1A
Other languages
Chinese (zh)
Inventor
张莹
曾少铭
马亮亮
郑少贤
李泽明
王初凌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CCB Finetech Co Ltd
Original Assignee
CCB Finetech Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CCB Finetech Co Ltd
Priority to CN202210269585.1A
Publication of CN114758062A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 - Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 - 3D [Three Dimensional] image rendering
    • G06T15/005 - General purpose rendering architectures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 - Manipulating 3D models or images for computer graphics
    • G06T19/006 - Mixed reality

Abstract

The disclosure relates to a panoramic roaming scene construction method and device, computer equipment, and a storage medium, and relates to the technical field of artificial intelligence virtual reality. The method comprises the following steps: calculating a coordinate deviation rate between a roaming panorama and a three-dimensional panorama, wherein the roaming panorama is obtained by inputting the three-dimensional panorama into a panoramic roaming production platform; adjusting the roaming panorama according to the coordinate deviation rate to obtain a calibration panorama; constructing a project model according to the calibration panorama and the panoramic roaming production platform; adjusting point location space data of the project model according to the coordinate deviation rate to obtain a calibration model; and integrating and connecting the calibration panorama and the calibration model to obtain a panoramic roaming scene. With this method, the accuracy of spatial coordinates can be ensured during panoramic roaming and scene browsing can proceed smoothly.

Description

Panoramic roaming scene construction method and device, computer equipment and storage medium
Technical Field
The present disclosure relates to the field of artificial intelligence virtual reality technologies, and in particular, to a method and an apparatus for constructing a panoramic roaming scene, a computer device, and a storage medium.
Background
With the continuous improvement of virtual reality technology, a virtual three-dimensional space can be built with computer-aided software to simulate a real display environment, intuitively providing people with immersive browsing and a vivid, visual interactive experience. Building a virtual 3D scene from panoramas helps the computer save a large amount of running cost, lets scene content load more quickly, and allows the spatial effect to be displayed promptly.
At present, panoramas are mainly applied in panoramic roaming production tools: the panorama is imported into a panoramic software production platform and then displayed on the Web side. However, existing finished products have several visual problems that affect the experience. After a project model corresponding to a panorama is generated with a panoramic roaming platform, the point location coordinates deviate severely when panoramas are switched during roaming, the accuracy of spatial coordinates cannot be ensured, and obvious image dragging degrades the visual browsing effect.
Disclosure of Invention
In view of the above, it is necessary to provide a panoramic roaming scene construction method, apparatus, computer device, and storage medium that can ensure the accuracy of spatial coordinates during panoramic roaming and allow scene browsing to proceed smoothly.
In a first aspect, the present disclosure provides a method for constructing a panoramic roaming scene, where the method includes:
calculating the coordinate deviation rate between a roaming panorama and a three-dimensional panorama, wherein the roaming panorama is obtained by inputting the three-dimensional panorama into a panoramic roaming production platform;
adjusting the roaming panorama according to the coordinate deviation rate to obtain a calibration panorama;
constructing a project model according to the calibration panorama and the panorama roaming production platform;
adjusting point location space data of the project model according to the coordinate deviation ratio to obtain a calibration model;
and integrating and connecting the calibration panorama and the calibration model to obtain a panoramic roaming scene.
In one embodiment, the calculating a coordinate deviation ratio between the roaming panorama and the three-dimensional panorama comprises:
acquiring a first position difference value of a preset position in the roaming panorama and a second position difference value of the preset position in the three-dimensional panorama, wherein the number of the preset positions is at least two, and the vertical coordinate position of each preset position is the same;
calculating to obtain a coordinate difference ratio according to the first position difference and the second position difference, wherein the coordinate difference ratio comprises a first coordinate axis difference ratio and a second coordinate axis difference ratio;
And calculating to obtain a coordinate deviation ratio between the roaming panoramic image and the three-dimensional panoramic image according to the coordinate difference ratio, wherein the coordinate deviation ratio comprises a first deviation ratio of a first coordinate axis and a second deviation ratio of a second coordinate axis.
In one embodiment, the adjusting the roaming panorama according to the coordinate deviation ratio to obtain a calibrated panorama includes:
adjusting each coordinate point in the roaming panoramic image according to the first deviation rate and the second deviation rate to obtain a calibration coordinate value;
and determining a calibration panorama according to the calibration coordinate value.
In one embodiment, the adjusting point location space data of the item model according to the coordinate deviation ratio to obtain a calibration model includes:
performing simplification processing on the project model, wherein the simplification processing comprises: reducing the number of faces of the project model without affecting the project model structure.
In one embodiment, the adjusting point location space data of the item model according to the coordinate deviation ratio to obtain a calibration model includes:
adjusting point location space data in the project model according to the first deviation rate and the second deviation rate to obtain calibration space point location data;
And obtaining a calibration model according to the calibration space point location data.
In one embodiment, the integrating and connecting the calibration panorama and the calibration model to obtain a panoramic roaming scene includes:
integrating and connecting the calibration panoramic image and the calibration model through an image splicing technology to obtain a panoramic roaming scene, wherein the image splicing technology comprises the following steps: image registration and image fusion, wherein the image registration algorithm adopts a region-based method.
In one embodiment, the roaming panorama, the three-dimensional panorama and the calibration panorama are spherical panoramas.
In a second aspect, the present disclosure further provides a panoramic roaming scene constructing apparatus, where the apparatus includes:
the calculation module is used for calculating the coordinate deviation rate between the roaming panorama and the three-dimensional panorama, wherein the roaming panorama is obtained by inputting the three-dimensional panorama into a panoramic roaming production platform;
the panorama calibration module is used for adjusting the roaming panorama according to the coordinate deviation rate to obtain a calibrated panorama;
the model building module is used for building a project model according to the calibration panorama and the panoramic roaming production platform;
The model calibration module is used for adjusting point location space data of the project model according to the coordinate deviation ratio to obtain a calibration model;
and the integration module is used for integrating and connecting the calibration panoramic image and the calibration model to obtain a panoramic roaming scene.
In one embodiment of the apparatus, the calculation module comprises: a coordinate difference value acquisition module, a difference ratio calculation module and a deviation ratio calculation module;
the coordinate difference value acquisition module is used for acquiring a first position difference value of a preset position in the roaming panoramic image and a second position difference value of a preset position in the three-dimensional panoramic image, wherein the number of the preset positions is at least two, and the vertical coordinate positions of the preset positions are the same;
the difference ratio calculation module is used for calculating a coordinate difference ratio according to the first position difference and the second position difference, wherein the coordinate difference ratio comprises a first coordinate axis difference ratio and a second coordinate axis difference ratio;
and the deviation ratio calculation module is used for calculating a coordinate deviation ratio between the roaming panoramic image and the three-dimensional panoramic image according to the coordinate difference ratio, wherein the coordinate deviation ratio comprises a first deviation ratio of a first coordinate axis and a second deviation ratio of a second coordinate axis.
In one embodiment of the apparatus, the panorama calibration module comprises: a deviation rate adjusting module and a coordinate value determining module;
the deviation rate adjusting module is used for adjusting each coordinate point in the roaming panorama according to the first deviation rate and the second deviation rate to obtain a calibration coordinate value;
and the coordinate value determining module is used for determining a calibration panorama according to the calibration coordinate value.
In one embodiment of the apparatus, the apparatus further comprises: a simplification processing module, configured to perform simplification processing on the project model, where the simplification processing includes: reducing the number of faces of the project model without affecting the project model structure.
In one embodiment of the apparatus, the model calibration module comprises: a data adjusting module and a model determining module;
the data adjusting module is used for adjusting the point location space data in the project model according to the first deviation rate and the second deviation rate to obtain calibration space point location data;
and the model determining module is used for obtaining a calibration model according to the calibration space point location data.
In one embodiment of the apparatus, the integrating module is further configured to integrate and connect the calibration panorama and the calibration model through an image stitching technique to obtain a panoramic roaming scene, where the image stitching technique includes: image registration and image fusion, wherein the image registration algorithm adopts a region-based method.
In a third aspect, the present disclosure also provides a computer device. The computer device comprises a memory storing a computer program and a processor implementing the steps of the above method when executing the computer program.
In a fourth aspect, the present disclosure also provides a computer-readable storage medium. The computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the above-mentioned method.
In a fifth aspect, the present disclosure also provides a computer program product. The computer program product comprises a computer program which, when being executed by a processor, carries out the steps of the above-mentioned method.
In the above embodiments, the coordinate deviation rate between the roaming panorama and the three-dimensional panorama is calculated, and the roaming panorama is adjusted according to the coordinate deviation rate to obtain a calibration panorama, which ensures that the calibration panorama is closer to the actual scene and that the coordinates at the 2D level do not deviate. The point location space data of the project model is likewise adjusted by the coordinate deviation rate, so the project model can be integrated at the 3D level with accurate point location space data and no coordinate deviation at that level. The calibration panorama at the 2D level and the calibration model at the 3D level are then integrated, so the point locations of the resulting panoramic roaming scene do not deviate. The final panoramic roaming scene can be browsed smoothly, without obstruction or offset, achieving the experience of walking freely in the virtual space.
Drawings
In order to more clearly illustrate the embodiments of the present disclosure or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present disclosure, and other drawings can be obtained by those skilled in the art without creative efforts.
Fig. 1 is a schematic diagram of an application environment of a panoramic roaming scene construction method in an embodiment;
FIG. 2 is a flowchart illustrating a method for constructing a panoramic roaming scene according to an embodiment;
FIG. 3 is a flowchart illustrating the step S20 according to an embodiment;
FIG. 4 is a flowchart illustrating a process before the step S50 and the step S50 in one embodiment;
FIG. 5 is a flowchart illustrating a panoramic roaming scene construction method according to another embodiment;
FIG. 6 is a block diagram illustrating an exemplary panoramic roaming scene constructing apparatus;
FIG. 7 is a diagram showing an internal configuration of a computer device according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present disclosure more clearly understood, the present disclosure is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the disclosure and are not intended to limit the disclosure.
It should be noted that the terms "first," "second," and the like in the description and claims herein and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments herein described are capable of operation in sequences other than those illustrated or described herein. Moreover, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, apparatus, article, or device that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or device.
In this document, the term "and/or" is only one kind of association relation describing the associated object, and means that there may be three kinds of relations. For example, a and/or B, may represent: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the character "/" herein generally indicates that the former and latter associated objects are in an "or" relationship.
In the present disclosure, a panorama is typically a wide-angle image and may exist in the form of a painting, a photograph, a video, or a three-dimensional model. Modern panoramas represent as much of the surrounding environment as possible through wide-angle representation in such forms. A 360-degree panorama is produced by capturing image information of the whole scene with a professional camera or using a picture rendered by modeling software, splitting the picture with software, and playing it with a dedicated player; that is, a flat photograph or computer-modeled picture is turned into a 360-degree full view for virtual reality browsing, so that a two-dimensional plane picture simulates a real three-dimensional space presented to the observer.
As described in the background, with the development of virtual reality technology, VR breaks the two-dimensional boundary of time and space, which is itself an innovative application of the technology. Meanwhile, techniques for generating panoramas are becoming more diverse and widespread. The forms of presenting a virtual space through the web or HTML5 can currently be divided roughly into two types: a spatial presentation of a real scene, and a display of an imagined scene space. Both are presented in the same form, but the producers of the content differ. For an imagined virtual space, the corresponding panorama is more complicated to produce and takes longer, but it facilitates later scene modification.
Typically, the desired panorama can be generated with computer graphics software such as three-dimensional modeling software: a three-dimensional space model is built in the modeling software, the rendering and parameter requirements of the panorama are set, and the panorama of the corresponding space is obtained by rendering. A panorama corresponding to a real scene, by contrast, is shot omnidirectionally from multiple angles with a camera and finally displayed on the terminal device after data acquisition, transmission, and processing by the panoramic shooting technique.
A virtual display space based on a panorama is formed by combining a two-dimensional panorama rendered from images (IBR, Image-Based Rendering) with a three-dimensional model built on the basis of the panorama, and integrating and connecting the mutually associated images with an image stitching technique so that the scene can be browsed with a 360-degree look-around. Since the viewing angle of normal human eyes usually spans about 90° horizontally and 70° vertically, a coverage of 360° horizontally and 180° vertically around the viewpoint is required to view all scene information omnidirectionally, so that the person appears to be inside the virtual display scene.
Panoramas are mainly used in panoramic roaming production tools; current tools include Pano2VR, Krpano, Everpano, and the like. The Krpano code is open in nature and provides developers with an easily extensible interface, and much interface-based panoramic service software is now built on the Krpano core, which makes the interface operation visual and lowers the cost of learning and production, but also brings certain limitations. Production tools with a visual interface do not require secondary modeling of the panorama and can directly publish the product link. Their disadvantages are that the jump transition from one panorama to another is abrupt, the immersive experience is poor, and browsing transitions are not smooth. Such a panoramic effect essentially still belongs to a two-dimensional plane, contains no three-dimensional space information, and cannot be regarded as a real three-dimensional scene, let alone allow free walking and arbitrary forward movement in the scene. Moreover, such easy-to-operate visual panoramic roaming systems can only switch panoramic roaming scenes through touch, a mouse, or a keyboard, and the scene switching differs noticeably from an actual free-walking effect.
Therefore, to solve the above problems, the embodiments of the present disclosure provide a panoramic roaming scene construction method, which can be applied to the application environment shown in fig. 1, where the terminal 102 communicates with the server 104 via a network. The data storage system may store data that the server 104 needs to process; it may be integrated on the server 104, or located on the cloud or another network server. Three-dimensional modeling software may be loaded in the server 104 or the terminal 102, and a three-dimensional panorama is output through the three-dimensional modeling software. The three-dimensional panorama is input into a panoramic roaming production platform in the terminal 102 or the server 104, processed by the platform, and a roaming panorama is exported. The terminal 102 may calculate the coordinate deviation rate between the roaming panorama and the three-dimensional panorama, and adjust the roaming panorama according to the coordinate deviation rate to obtain a calibration panorama. The terminal 102 adjusts the point location space data of the project model according to the coordinate deviation rate to obtain a calibration model, and then integrates and connects the calibration model and the calibration panorama to obtain a panoramic roaming scene. The terminal 102 may be, but is not limited to, various personal computers, notebook computers, smart phones, tablet computers, and the like; portable wearable devices may be smart watches, smart bracelets, head-mounted devices, and so on. The server 104 may be implemented as a stand-alone server or as a server cluster composed of multiple servers. It should be noted that the solution may also be implemented on the terminal 102 or the server 104 alone.
In an embodiment, as shown in fig. 2, a panoramic roaming scene constructing method is provided, which is described by taking the method as an example applied to the terminal 102 in fig. 1, and includes the following steps:
and S20, calculating the coordinate deviation ratio between the roaming panoramic image and the three-dimensional panoramic image, wherein the roaming panoramic image is obtained by inputting the three-dimensional panoramic image into a panoramic roaming production platform.
The three-dimensional panorama may generally be a panorama generated by three-dimensional modeling software; such a panorama is usually easier to modify than one taken with a panoramic camera. The roaming panorama is obtained by processing the three-dimensional panorama with a panoramic roaming production platform. The panoramic roaming production platform may typically be the Everpano 3D platform. Everpano 3D was developed for building three-dimensional models for Krpano roaming scenes; the roaming function after the model is output is highly integrated with the Depthmap depth-mapping technique of Krpano 1.20, reverse modeling and mapping of a Krpano panoramic scene space are achieved without the support of stereo scanner data, a Krpano panoramic Web project can be modeled three-dimensionally, and a high-precision, smooth forward-roaming effect is achieved. The implementation principle of Everpano is to manually draw the vertical surface contours of the indoor space and objects by adding lines and geometric bodies, thereby constructing a virtual space model, and to judge direction and realize roaming from information such as the size and relative position of the contours. The coordinate deviation rate is usually the deviation between the three-dimensional modeling software and the panoramic roaming production platform, and can generally be regarded as the difference proportion between the coordinates output by the two. In this embodiment, the Everpano 3D platform and the Everpano platform are the same platform.
Specifically, since the three-dimensional modeling software and the panoramic roaming production platform are two different software platforms, data exported from the two usually deviates, so the deviation between them needs to be determined. To this end, the three-dimensional panorama produced from the actual scene with the three-dimensional modeling software is input into the panoramic roaming production platform, and the roaming panorama output by the platform is then obtained. The coordinate deviation rate between the roaming panorama and the three-dimensional panorama is then calculated.
And S30, adjusting the roaming panoramic view according to the coordinate deviation rate to obtain a calibration panoramic view.
Specifically, the coordinate values of the roaming panoramic image in the panoramic roaming production platform are redefined according to the coordinate deviation rate, and the calibration panoramic image is obtained. The calibration panorama will typically be closer to the actual scene.
And S40, constructing a project model according to the calibration panorama and the panorama roaming production platform.
The project model may typically be a model of the display space, usually a three-dimensional scene.
Specifically, coordinates can be acquired from the obtained calibration panorama: specific coordinate points of the calibration panorama in space are obtained, a multipoint path is drawn continuously, and a rough three-dimensional model is then sketched manually in the panoramic roaming production platform from this information, so that three-dimensional space modeling is built in reverse on the basis of the two-dimensional plane (the calibration panorama) and the project model is produced.
It should be noted that the above only describes producing the model by manually drawing it; a person skilled in the art may choose another way to produce the project model from the calibration panorama and the panoramic roaming production platform according to the actual situation, and the specific modeling method is not limited in this embodiment.
And S50, adjusting the point location space data of the project model according to the coordinate deviation ratio to obtain a calibration model.
Specifically, the project model constructed from the calibration panorama may not match the actual model, and the output model data may be inaccurate or imprecise. The project model therefore needs to be adjusted by the coordinate deviation rate so that it becomes more accurate; the adjusted project model may be the calibration model.
And S60, integrating and connecting the calibration panorama and the calibration model to obtain a panoramic roaming scene.
Specifically, the calibration panorama and the calibration model may be integrated and connected so that their combination yields the panoramic roaming scene, through which the effect of immersive roaming can be maximized.
According to the above panoramic roaming scene construction method, the coordinate deviation rate between the roaming panorama and the three-dimensional panorama is calculated and the roaming panorama is re-optimized at the 2D level: the roaming panorama is adjusted according to the coordinate deviation rate to obtain the calibration panorama, which ensures that the calibration panorama is closer to the actual scene and that the 2D-level coordinates do not deviate. The point location space data of the project model is adjusted by the coordinate deviation rate, so the project model can be integrated at the 3D level with accurate point location space data and no 3D-level coordinate deviation. The 2D-level calibration panorama and the 3D-level calibration model are then integrated, and the point locations of the resulting panoramic roaming scene do not deviate. The user can browse smoothly without obstruction or offset, achieving the experience of walking freely in the virtual space.
In one embodiment, as shown in fig. 3, the calculating a coordinate deviation ratio between the roaming panorama and the three-dimensional panorama S20 includes:
and S22, acquiring a first position difference value of a preset position in the roaming panorama and a second position difference value of a preset position in the three-dimensional panorama, wherein the number of the preset positions is at least two, and the vertical coordinate position of each preset position is the same.
The preset position may be a coordinate position determined by a person skilled in the art according to actual conditions in the roaming panorama and the three-dimensional panorama, and the number of the preset positions should be at least two, so that a position difference value can be calculated conveniently.
Specifically, a first position difference between at least two preset positions in the roaming panorama and a second position difference between at least two preset positions in the three-dimensional panorama are obtained. It should be noted that the preset position in the roaming panorama corresponds to the preset position in the three-dimensional panorama, so that the deviation ratio can be calculated finally. The panorama is typically a horizontal 360-degree panorama, so in this embodiment the preset position vertical coordinate positions may be set to be the same, and the position difference between the horizontal positions may be calculated.
Typically, a preset position may be represented by three-dimensional coordinates, which can generally be expressed with a horizontal axis x, a longitudinal axis y, and a vertical axis z. The coordinate corresponding to the z axis can generally be regarded as the vertical coordinate position. The first position difference values and the second position difference values may generally include the differences between positions along the horizontal x axis and the longitudinal y axis.
In some exemplary embodiments, if there are three preset positions, their corresponding coordinates in the three-dimensional panorama are P1(x1, y1, z1), P2(x2, y2, z2), and P3(x3, y3, z3), where z1 = z2 = z3. The differences between the x-axis and y-axis values of P1, P2, and P3 are calculated: x1-x2, x1-x3, x2-x3, y1-y2, y1-y3, and y2-y3; these differences in the three-dimensional panorama can collectively be called the second position difference values. The corresponding coordinates of the three preset positions in the roaming panorama are P1'(x1', y1', z1'), P2'(x2', y2', z2'), and P3'(x3', y3', z3'). The differences between the x-axis and y-axis values of P1', P2', and P3' are calculated: x1'-x2', x1'-x3', x2'-x3', y1'-y2', y1'-y3', and y2'-y3'. These differences in the roaming panorama can collectively be referred to as the first position difference values.
It should be noted that, only three preset positions are exemplified here, and in an actual process, a person skilled in the art may select two or more preset positions for calculation as long as a position difference value of a corresponding panorama can be calculated.
S24, calculating coordinate difference ratios according to the first position difference values and the second position difference values, wherein the coordinate difference ratios include first coordinate axis difference ratios and second coordinate axis difference ratios.
The coordinate difference ratios may generally be the ratios between the differences of the three-dimensional panorama and the roaming panorama along the coordinate axes.
Specifically, the coordinate difference ratios may be obtained by comparing the first position difference values with the second position difference values. Because the first and second position difference values generally include differences along the horizontal x axis and the longitudinal y axis, the differences along the x axis in the first and second position difference values can be compared to obtain the first coordinate axis difference ratios, and the differences along the y axis can be compared to obtain the second coordinate axis difference ratios.
In some exemplary embodiments, as described above, the second position difference values are x1-x2, x1-x3, x2-x3, y1-y2, y1-y3, and y2-y3, and the first position difference values are x1'-x2', x1'-x3', x2'-x3', y1'-y2', y1'-y3', and y2'-y3'. The corresponding first coordinate axis difference ratios may include: (x1'-x2')/(x1-x2) = a0, (x1'-x3')/(x1-x3) = a1, (x2'-x3')/(x2-x3) = a2; or (x1-x2)/(x1'-x2') = a0, (x1-x3)/(x1'-x3') = a1, (x2-x3)/(x2'-x3') = a2. The second coordinate axis difference ratios may include: (y1'-y2')/(y1-y2) = b0, (y1'-y3')/(y1-y3) = b1, (y2'-y3')/(y2-y3) = b2; or (y1-y2)/(y1'-y2') = b0, (y1-y3)/(y1'-y3') = b1, (y2-y3)/(y2'-y3') = b2.
And S26, calculating to obtain a coordinate deviation ratio between the roaming panoramic image and the three-dimensional panoramic image according to the coordinate difference ratio, wherein the coordinate deviation ratio comprises a first deviation ratio of a first coordinate axis and a second deviation ratio of a second coordinate axis.
Specifically, if there is only one first coordinate axis difference ratio, the first deviation ratio of the first coordinate axis between the roaming panorama and the three-dimensional panorama can be obtained directly from that difference ratio. If there are multiple first coordinate axis difference ratios, the first deviation ratio can be calculated from the individual difference ratios and their number.
Likewise, if there is only one second coordinate axis difference ratio, the second deviation ratio of the second coordinate axis between the roaming panorama and the three-dimensional panorama can be obtained directly from that difference ratio. If there are multiple second coordinate axis difference ratios, the second deviation ratio can be calculated from the individual difference ratios and their number.
In some exemplary embodiments, if the first coordinate axis difference ratios are a0, a1, and a2, the first deviation ratio of the first coordinate axis may be (a0 + a1 + a2)/3, and the second deviation ratio of the second coordinate axis may be (b0 + b1 + b2)/3.
In this embodiment, if only a single preset position were used, the position differences between the roaming panorama and the three-dimensional panorama along the horizontal x axis and the longitudinal y axis could not be calculated, and thus the deviation ratio could not be calculated. Using the position difference values of at least two preset positions allows the deviation ratio between the roaming panorama and the three-dimensional panorama to be determined more accurately.
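The calculation above can be illustrated with a short Python sketch (not part of the patent); the function name, the averaging of the pairwise ratios, and the sample coordinates are illustrative assumptions:

from itertools import combinations

def deviation_ratios(preset_3d, preset_roam):
    """Estimate the first (x-axis) and second (y-axis) deviation ratios between
    a three-dimensional panorama and the roaming panorama exported by the
    panoramic roaming production platform.

    preset_3d, preset_roam: lists of (x, y, z) tuples for the same preset
    positions in each panorama; all z values are assumed to be equal.
    """
    a_ratios, b_ratios = [], []
    for i, j in combinations(range(len(preset_3d)), 2):
        x1, y1, _ = preset_3d[i]
        x2, y2, _ = preset_3d[j]
        xr1, yr1, _ = preset_roam[i]
        xr2, yr2, _ = preset_roam[j]
        # pairwise coordinate-axis difference ratios, e.g. (x1' - x2') / (x1 - x2)
        a_ratios.append((xr1 - xr2) / (x1 - x2))
        b_ratios.append((yr1 - yr2) / (y1 - y2))
    # deviation ratios taken as the average of the pairwise difference ratios
    a = sum(a_ratios) / len(a_ratios)  # first deviation ratio (x axis)
    b = sum(b_ratios) / len(b_ratios)  # second deviation ratio (y axis)
    return a, b

# usage: three preset positions with identical z, as in the example above
a, b = deviation_ratios(
    preset_3d=[(10.0, 4.0, 1.5), (22.0, 9.0, 1.5), (35.0, 16.0, 1.5)],
    preset_roam=[(9.0, 3.6, 1.5), (19.8, 8.1, 1.5), (31.5, 14.4, 1.5)],
)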
In one embodiment, the adjusting the roaming panorama according to the coordinate deviation ratio to obtain a calibrated panorama includes:
adjusting each coordinate point in the roaming panoramic image according to the first deviation rate and the second deviation rate to obtain a calibration coordinate value;
and determining a calibration panorama according to the calibration coordinate value.
Specifically, after the first deviation rate and the second deviation rate are calculated, each coordinate point in the roaming panorama is recalculated and adjusted using them; each adjusted coordinate point may be a calibration coordinate value. All the adjusted calibration coordinate values are then combined to obtain the calibration panorama.
In this embodiment, each coordinate point is adjusted through the first deviation ratio and the second deviation ratio, so that the adjusted coordinate is closer to the coordinates of a camera point location for producing a panorama in the original scene, and the connection between the point locations is more accurate.
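As a rough illustration only (not the patent's own code), the adjustment of each coordinate point by the two deviation ratios can be sketched in Python as follows; the correction direction (dividing the roaming coordinates by the ratios) is an assumption consistent with how the ratios are defined above:

def calibrate_panorama_points(points, a, b):
    """Rescale roaming-panorama coordinate points by the first and second
    deviation ratios to obtain calibration coordinate values.

    points: iterable of (x, y, z) coordinate points of the roaming panorama;
    a, b: first/second deviation ratios defined as roaming/3D difference ratios,
    so dividing maps the roaming coordinates back toward the 3D panorama.
    """
    return [(x / a, y / b, z) for (x, y, z) in points]

# e.g. calibrated = calibrate_panorama_points(roaming_points, a, b)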
In one embodiment, as shown in fig. 4, before S50 (adjusting the point location space data of the project model according to the coordinate deviation ratio to obtain the calibration model), the method further includes:
S52, embedding the code of the project model into the webpage.
Specifically, the code of the project model is embedded into a webpage for packaged development.
In some exemplary embodiments, a project model may be imported into a panoramic roaming production platform such as the Everpano platform, the corresponding options may be checked in Render Settings (for example, the depth, convert2cube, and krpano project options), and the code of the project model is embedded into a web page for packaged development and external deployment.
S54, simplifying the project model, wherein the simplification processing comprises: reducing the number of faces of the project model without affecting its structure.
Specifically, because the project model usually has a large number of faces on the 3ds Max (3dmax) software platform, it needs to be simplified without affecting the overall structure.
In this embodiment, the simplified processing of the project model can obtain a lighter project model file, and in the subsequent processing, if the project model after the simplified processing is used, the subsequent data loading is lighter and more friendly, and the panoramic roaming production platform can run faster.
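The patent does not name a particular face-reduction algorithm; the sketch below uses quadric decimation from the Open3D library as one plausible way to simplify the model, and the library choice, file paths, and keep_ratio parameter are assumptions for illustration:

import open3d as o3d  # assumption: any mesh library with decimation would work

def simplify_project_model(src_path, dst_path, keep_ratio=0.3):
    """Reduce the number of faces of the exported project model while keeping
    its overall structure, then save the lighter model file."""
    mesh = o3d.io.read_triangle_mesh(src_path)
    target_faces = max(1, int(len(mesh.triangles) * keep_ratio))
    simplified = mesh.simplify_quadric_decimation(target_number_of_triangles=target_faces)
    o3d.io.write_triangle_mesh(dst_path, simplified)

# e.g. simplify_project_model("project_model.obj", "project_model_light.obj")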
S50, adjusting the point location space data of the project model according to the coordinate deviation ratio to obtain a calibration model, including:
and S56, adjusting the point location space data in the project model according to the first deviation ratio and the second deviation ratio to obtain calibration space point location data.
Here, the point location space data may be the spatial coordinates in the project model.
Specifically, the project model is usually configured and output in the panoramic roaming production platform, so that the point location space data in the project model can be enlarged or reduced according to the first deviation ratio and the second deviation ratio, and converted into accurate point location space data.
In some exemplary embodiments, the corresponding options in the panoramic roaming production platform, such as the stl, convert2cube, and krpano project options in Render Settings, are selected to generate files for the new panoramic roaming production platform project; the spatial point location data in the project model may then be processed one by one according to the first deviation ratio and the second deviation ratio, being enlarged or reduced so as to be converted into accurate point location space data. The project model mentioned in this embodiment may be a project model that has undergone the simplification processing described above, or one that has not.
And S58, obtaining a calibration model according to the calibration space point location data.
Specifically, the new calibration spatial point location data is replaced with the original spatial point location data, and the obtained new model may be a calibration model.
In an exemplary embodiment, the replacement is performed as shown in Table 1.
Table 1. Replacement data table
Panos file | Replacement stl file | xml code file
0.tiles | 0.stl | "VR001" field1="panos\0.tiles\xx.jpg">
1.tiles | 1.stl | "VR002" field1="panos\1.tiles\xx.jpg">
2.tiles | 2.stl | "VR003" field1="panos\2.tiles\xx.jpg">
3.tiles | 3.stl | "VR004" field1="panos\3.tiles\xx.jpg">
4.tiles | 4.stl | "VR005" field1="panos\4.tiles\xx.jpg">
N.tiles | N.stl | "VR00(N+1)" field1="panos\N.tiles\xx.jpg">
Here, a Panos file can generally be understood as the file corresponding to the original spatial point location data. The stl file can generally be understood as the file corresponding to the calibration space point location data. The xml code file is the code file in which the new calibration space point location data replaces the original spatial point location data. Field 1 is the field in which the code to be replaced is located. xx.jpg can generally be understood as the corresponding picture in the calibration model or project model.
In this embodiment, the project model is adjusted through the coordinate deviation rate so that its spatial point location data is replaced, making the processed calibration model more accurate. This helps the entire virtually generated space maintain accurate point location spatial coordinate values, so that the later virtual space experience has good spatial fluency, an accurate sense of spatial transition, and no offset of the point location spatial coordinate positions.
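A hedged sketch of this replacement step, assuming the point location space data is stored as .stl files (as in Table 1) and the numpy-stl package is available; the scaling direction and file names are illustrative assumptions rather than the patent's exact procedure:

from stl import mesh  # numpy-stl

def calibrate_stl(src_path, dst_path, a, b):
    """Enlarge or reduce the x/y point location space data of a project-model
    .stl tile by the deviation ratios, producing the replacement stl file."""
    m = mesh.Mesh.from_file(src_path)
    m.vectors[:, :, 0] /= a  # x axis: undo the first deviation ratio (assumed direction)
    m.vectors[:, :, 1] /= b  # y axis: undo the second deviation ratio
    m.save(dst_path)

# e.g. calibrate_stl("0.stl", "0_calibrated.stl", a, b) for each tile in Table 1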
In an embodiment, the integrating and connecting the calibration panorama and the calibration model to obtain a panoramic roaming scene includes:
integrating and connecting the calibration panorama and the calibration model through an image splicing technology to obtain a panoramic roaming scene, wherein the image splicing technology comprises the following steps: image registration and image fusion, wherein the image registration algorithm adopts a region-based method.
The integration connection may be a method of combining the 2D panorama and the 3D calibration model. Image stitching is a technique for stitching several images with overlapping parts (possibly obtained at different times, from different viewing angles, or by different sensors) into a seamless panoramic image or a high-resolution image. Image registration generally means using a matching strategy to find the position in the reference image that corresponds to a template or feature points in the images to be stitched, thereby determining the transformation relationship between the two images. Image fusion refers to processing image data about the same target collected through multiple source channels with image processing and computer technology, extracting the beneficial information of each channel to the maximum extent, and finally synthesizing a high-quality image to improve the utilization rate of the image information.
Specifically, the calibration panorama and the calibration model may be integrated and connected by an image stitching technique, which can include image registration and image fusion. The image registration step here adopts a region-based method: a block from the overlapping region of one image is taken as a template, the matching block most similar to the template is searched for in the other image, and the panoramic roaming scene obtained with this algorithm has higher precision. It should be noted that image registration may also adopt a phase correlation method or a feature-based method.
In this embodiment, the panoramic roaming scene obtained through image stitching is associated at both the 2D level and the 3D level. Using the region-based method during image registration yields a panoramic roaming scene with higher precision, which largely guarantees the accuracy and smoothness of the later roaming experience.
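A minimal OpenCV sketch of the region-based registration and weighted fusion described above (illustrative only; the template location, matching score, and blend weight are assumptions rather than values from the patent):

import cv2

def region_based_register(ref_img, other_img, template_box):
    """Region-based registration: take a block from the overlap region of the
    reference image as the template, search for the most similar block in the
    other image, and return the translation that aligns the two images."""
    x, y, w, h = template_box
    template = ref_img[y:y + h, x:x + w]
    scores = cv2.matchTemplate(other_img, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, (best_x, best_y) = cv2.minMaxLoc(scores)
    return x - best_x, y - best_y  # (dx, dy) shift of other_img relative to ref_img

def fuse_overlap(region_a, region_b, alpha=0.5):
    """Simple weighted image fusion of two already-registered overlap regions."""
    return cv2.addWeighted(region_a, alpha, region_b, 1 - alpha, 0)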
In one embodiment, the roaming panorama, the three-dimensional panorama and the calibration panorama are spherical panoramas.
In this embodiment, the panorama may generally take several forms; the panorama used in this application is a spherical panorama, which best matches the viewing habits of the human eye and also gives the best immersive experience when roaming a virtual display space.
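A spherical panorama is commonly stored as an equirectangular image covering 360° horizontally and 180° vertically; the small sketch below (an illustration, not from the patent) shows how a viewing direction maps to pixel coordinates in such an image:

import math

def direction_to_equirect(dx, dy, dz, width, height):
    """Map a 3D viewing direction to pixel coordinates (u, v) in an
    equirectangular (spherical) panorama image."""
    yaw = math.atan2(dx, dz)  # -pi .. pi, rotation around the vertical axis
    pitch = math.asin(dy / math.sqrt(dx * dx + dy * dy + dz * dz))  # -pi/2 .. pi/2
    u = (yaw / (2 * math.pi) + 0.5) * width
    v = (0.5 - pitch / math.pi) * height
    return u, v

# e.g. direction_to_equirect(1.0, 0.0, 1.0, 4096, 2048) -> a point on the horizon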
In another embodiment, as shown in fig. 5, the present disclosure further provides a panoramic roaming scene construction method, including:
and S502, creating a three-dimensional panorama, and inputting the three-dimensional panorama into a panorama roaming making scene to obtain a roaming panorama.
S504, a first position difference value of a preset position in the roaming panoramic image and a second position difference value of the preset position in the three-dimensional panoramic image are obtained, the number of the preset positions is at least two, and the vertical coordinate position of each preset position is the same.
S506, calculating coordinate difference ratios according to the first position difference values and the second position difference values, wherein the coordinate difference ratios include first coordinate axis difference ratios and second coordinate axis difference ratios.
And S508, calculating to obtain a coordinate deviation ratio between the roaming panoramic image and the three-dimensional panoramic image according to the coordinate difference ratio, wherein the coordinate deviation ratio comprises a first deviation ratio of a first coordinate axis and a second deviation ratio of a second coordinate axis.
And S510, adjusting each coordinate point in the roaming panorama according to the first deviation rate and the second deviation rate to obtain a calibration coordinate value.
And S512, determining a calibration panorama according to the calibration coordinate value.
And S514, constructing a project model according to the calibration panorama and the panoramic roaming production platform.
S516, simplifying the project model, wherein the simplification processing comprises: reducing the number of faces of the project model without affecting the project model structure.
And S518, adjusting the point location space data in the project model according to the first deviation rate and the second deviation rate to obtain calibration space point location data.
And S520, obtaining a calibration model according to the calibration space point location data.
S522, integrating and connecting the calibration panoramic image and the calibration model through an image splicing technology to obtain a panoramic roaming scene, wherein the image splicing technology comprises the following steps: image registration and image fusion, wherein the image registration algorithm adopts a region-based method.
It should be noted that, for specific implementation and limitation in this embodiment, reference may be made to the above-mentioned embodiment, and repeated descriptions are not repeated in this embodiment.
It should be understood that, although the steps in the flowcharts related to the embodiments as described above are sequentially displayed as indicated by arrows, the steps are not necessarily performed sequentially as indicated by the arrows. The steps are not performed in the exact order shown and described, and may be performed in other orders, unless explicitly stated otherwise. Moreover, at least a part of the steps in the flowcharts related to the embodiments described above may include multiple steps or multiple stages, which are not necessarily performed at the same time, but may be performed at different times, and the execution order of the steps or stages is not necessarily sequential, but may be rotated or alternated with other steps or at least a part of the steps or stages in other steps.
Based on the same inventive concept, the embodiment of the disclosure further provides a panoramic roaming scene construction device for implementing the panoramic roaming scene construction method. The implementation scheme for solving the problem provided by the apparatus is similar to the implementation scheme described in the method, so the specific limitations in one or more embodiments of the panoramic roaming scene construction apparatus provided below may refer to the limitations on the panoramic roaming scene construction method in the foregoing, and details are not described here again.
In one embodiment, as shown in fig. 6, there is provided a panoramic roaming scene constructing apparatus 600, including: a calculation module 602, a panorama calibration module 604, a model construction module 606, a model calibration module 608, and an integration module 610, wherein:
a calculating module 602, configured to calculate the coordinate deviation rate between a roaming panorama and a three-dimensional panorama, where the roaming panorama is obtained by inputting the three-dimensional panorama into a panoramic roaming production platform;
a panorama calibration module 604, configured to adjust the roaming panorama according to the coordinate deviation ratio to obtain a calibrated panorama;
a model building module 606, configured to build a project model according to the calibration panorama and the panorama roaming production platform;
The model calibration module 608 is configured to adjust point location space data of the item model according to the coordinate deviation ratio, so as to obtain a calibration model;
and an integration module 610, configured to integrate and connect the calibration panorama and the calibration model to obtain a panoramic roaming scene.
In one embodiment of the apparatus, the calculation module 602 comprises: the device comprises a coordinate difference value acquisition module, a difference value ratio calculation module and a deviation ratio calculation module;
the coordinate difference value acquisition module is used for acquiring a first position difference value of a preset position in the roaming panorama and a second position difference value of a preset position in the three-dimensional panorama, wherein the number of the preset positions is at least two, and the vertical coordinate position of each preset position is the same;
the difference ratio calculation module is used for calculating a coordinate difference ratio according to the first position difference and the second position difference, and the coordinate difference ratio comprises a first coordinate axis difference ratio and a second coordinate axis difference ratio;
and the deviation ratio calculation module is used for calculating a coordinate deviation ratio between the roaming panoramic image and the three-dimensional panoramic image according to the coordinate difference ratio, wherein the coordinate deviation ratio comprises a first deviation ratio of a first coordinate axis and a second deviation ratio of a second coordinate axis.
In one embodiment of the apparatus, the panorama calibration module 604 comprises: the device comprises a deviation rate adjusting module and a coordinate value determining module;
the deviation rate adjusting module is used for adjusting each coordinate point in the roaming panorama according to the first deviation rate and the second deviation rate to obtain a calibration coordinate value;
and the coordinate value determining module is used for determining a calibration panorama according to the calibration coordinate value.
In one embodiment of the apparatus, the apparatus further comprises: a simplification processing module, configured to perform simplification processing on the project model, where the simplification processing includes: reducing the number of faces of the project model without affecting the project model structure.
In one embodiment of the apparatus, the model calibration module 608 includes: the system comprises a data adjusting module and a model determining module;
the data adjusting module is used for adjusting the point location space data in the project model according to the first deviation rate and the second deviation rate to obtain calibration space point location data;
and the model determining module is used for obtaining a calibration model according to the calibration space point location data.
In an embodiment of the apparatus, the integrating module 610 is further configured to integrate and connect the calibration panorama and the calibration model through an image stitching technique to obtain a panoramic roaming scene, where the image stitching technique includes: image registration and image fusion, wherein the image registration algorithm adopts a region-based method.
In one embodiment of the apparatus, the rover panorama, the three-dimensional panorama, and the calibration panorama are spherical panoramas.
All or part of the modules in the panoramic roaming scene construction device can be realized by software, hardware and a combination thereof. The modules can be embedded in a hardware form or independent of a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
In one embodiment, a computer device is provided, which may be a terminal, and its internal structure diagram may be as shown in fig. 7. The computer device comprises a processor, a memory, a communication interface, a display screen and an input device which are connected through a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operating system and the computer program to run on the non-volatile storage medium. The communication interface of the computer device is used for carrying out wired or wireless communication with an external terminal, and the wireless communication can be realized through WIFI, a mobile cellular network, NFC (near field communication) or other technologies. The computer program is executed by a processor to implement a panoramic roaming scene construction method. The display screen of the computer equipment can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer equipment can be a touch layer covered on the display screen, a key, a track ball or a touch pad arranged on the shell of the computer equipment, an external keyboard, a touch pad or a mouse and the like.
Those skilled in the art will appreciate that the architecture shown in fig. 7 is merely a block diagram of part of the structure related to the disclosed solution and does not limit the computer device to which the disclosed solution is applied; a particular computer device may include more or fewer components than those shown, combine certain components, or have a different arrangement of components.
In an embodiment, a computer device is provided, comprising a memory and a processor, the memory having stored therein a computer program, the processor implementing the steps of the above method embodiments when executing the computer program.
In an embodiment, a computer-readable storage medium is provided, on which a computer program is stored, which computer program, when being executed by a processor, carries out the steps of the above-mentioned method embodiments.
In an embodiment, a computer program product is provided, comprising a computer program which, when being executed by a processor, carries out the steps of the above-mentioned method embodiments.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments can be implemented by a computer program instructing related hardware; the computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the above method embodiments. Any reference to memory, databases or other media used in the embodiments provided by the present disclosure may include at least one of non-volatile and volatile memory. The non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, Resistive Random Access Memory (ReRAM), Magnetoresistive Random Access Memory (MRAM), Ferroelectric Random Access Memory (FRAM), Phase Change Memory (PCM), graphene memory, and the like. Volatile memory may include Random Access Memory (RAM), external cache memory, and the like. By way of illustration and not limitation, RAM can take many forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM). The databases involved in the embodiments provided by the present disclosure may include at least one of relational and non-relational databases; the non-relational databases may include, but are not limited to, blockchain-based distributed databases and the like. The processors referred to in the embodiments provided by the present disclosure may be, without limitation, general-purpose processors, central processing units, graphics processing units, digital signal processors, programmable logic devices, data processing logic based on quantum computing, and the like.
For the sake of brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as there is no contradiction between the combinations of these technical features, they should be considered within the scope of the present disclosure.
The above-mentioned embodiments only express several implementation modes of the present disclosure, and although their description is specific and detailed, they should not be understood as limiting the scope of the present disclosure. It should be noted that various changes and modifications can be made by those skilled in the art without departing from the spirit of the disclosure, and such changes and modifications all fall within the scope of the disclosure. Therefore, the protection scope of the present disclosure should be subject to the appended claims.

Claims (16)

1. A panoramic roaming scene construction method is characterized by comprising the following steps:
calculating the coordinate deviation rate between a roaming panoramic image and a three-dimensional panoramic image, wherein the roaming panoramic image is obtained by inputting the three-dimensional panoramic image into a panoramic roaming production platform;
adjusting the roaming panorama according to the coordinate deviation rate to obtain a calibration panorama;
constructing a project model according to the calibration panorama and the panorama roaming production platform;
adjusting point location space data of the project model according to the coordinate deviation ratio to obtain a calibration model;
and integrating and connecting the calibration panorama and the calibration model to obtain a panoramic roaming scene.
2. The method of claim 1, wherein calculating a coordinate deviation ratio between the roaming panorama and the three-dimensional panorama comprises:
acquiring a first position difference value of a preset position in the roaming panorama and a second position difference value of the preset position in the three-dimensional panorama, wherein the number of the preset positions is at least two, and the vertical coordinate position of each preset position is the same;
calculating to obtain a coordinate difference ratio according to the first position difference and the second position difference, wherein the coordinate difference ratio comprises a first coordinate axis difference ratio and a second coordinate axis difference ratio;
and calculating to obtain a coordinate deviation ratio between the roaming panoramic image and the three-dimensional panoramic image according to the coordinate difference ratio, wherein the coordinate deviation ratio comprises a first deviation ratio of a first coordinate axis and a second deviation ratio of a second coordinate axis.
3. The method of claim 2, wherein the adjusting the roaming panorama according to the coordinate deviation ratio to obtain a calibrated panorama comprises:
adjusting each coordinate point in the roaming panoramic image according to the first deviation rate and the second deviation rate to obtain a calibration coordinate value;
and determining a calibration panorama according to the calibration coordinate value.
4. The method according to any one of claims 1 to 3, wherein before the adjusting point location space data of the project model according to the coordinate deviation ratio to obtain a calibration model, the method further comprises:
performing simplification processing on the project model, wherein the simplification processing comprises: reducing the number of faces of the project model without affecting the project model structure.
5. The method of claim 2, wherein the adjusting point location space data of the project model according to the coordinate deviation ratio to obtain a calibration model comprises:
adjusting point location space data in the project model according to the first deviation rate and the second deviation rate to obtain calibration space point location data;
and obtaining a calibration model according to the calibration space point location data.
6. The method of claim 1, wherein the integrating and connecting the calibration panorama and the calibration model to obtain a panoramic roaming scene comprises:
integrating and connecting the calibration panorama and the calibration model through an image stitching technique to obtain a panoramic roaming scene, wherein the image stitching technique comprises: image registration and image fusion, and the algorithm of the image registration adopts a region-based method.
7. The method of claim 1, wherein the roaming panorama, the three-dimensional panorama, and the calibration panorama are spherical panoramas.
8. A panoramic roaming scene construction apparatus, characterized in that the apparatus comprises:
the calculation module is used for calculating the coordinate deviation rate between a roaming panoramic image and a three-dimensional panoramic image, wherein the roaming panoramic image is obtained by inputting the three-dimensional panoramic image into a panoramic roaming production platform;
the panorama calibration module is used for adjusting the roaming panorama according to the coordinate deviation rate to obtain a calibrated panorama;
the model building module is used for building a project model according to the calibration panorama and the panorama roaming production platform;
the model calibration module is used for adjusting point location space data of the project model according to the coordinate deviation ratio to obtain a calibration model;
and the integration module is used for integrating and connecting the calibration panorama and the calibration model to obtain a panoramic roaming scene.
9. The apparatus of claim 8, wherein the calculation module comprises: a coordinate difference value acquisition module, a difference value ratio calculation module and a deviation ratio calculation module;
the coordinate difference value acquisition module is used for acquiring a first position difference value of a preset position in the roaming panoramic image and a second position difference value of the preset position in the three-dimensional panoramic image, wherein the number of the preset positions is at least two, and the vertical coordinate positions of the preset positions are the same;
the difference ratio calculation module is used for calculating a coordinate difference ratio according to the first position difference and the second position difference, wherein the coordinate difference ratio comprises a first coordinate axis difference ratio and a second coordinate axis difference ratio;
and the deviation ratio calculation module is used for calculating a coordinate deviation ratio between the roaming panoramic image and the three-dimensional panoramic image according to the coordinate difference ratio, wherein the coordinate deviation ratio comprises a first deviation ratio of a first coordinate axis and a second deviation ratio of a second coordinate axis.
10. The apparatus of claim 9, wherein the panorama calibration module comprises: a deviation rate adjusting module and a coordinate value determining module;
the deviation rate adjusting module is used for adjusting each coordinate point in the roaming panoramic image according to the first deviation rate and the second deviation rate to obtain a calibration coordinate value;
and the coordinate value determining module is used for determining a calibration panoramic image according to the calibration coordinate value.
11. The apparatus according to any one of claims 8 to 10, further comprising: a simplification processing module, configured to perform simplification processing on the project model, wherein the simplification processing comprises: reducing the number of faces of the project model without affecting the project model structure.
12. The apparatus of claim 9, wherein the model calibration module comprises: a data adjusting module and a model determining module;
the data adjusting module is used for adjusting the point location space data in the project model according to the first deviation rate and the second deviation rate to obtain calibration space point location data;
and the model determining module is used for obtaining a calibration model according to the calibration space point location data.
13. The apparatus of claim 8, wherein the integrating module is further configured to integrate and connect the calibration panorama and the calibration model through an image stitching technique to obtain a panoramic roaming scene, and the image stitching technique includes: image registration and image fusion, wherein the image registration algorithm adopts a region-based method.
14. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method of any of claims 1 to 7.
15. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 7.
16. A computer program product comprising a computer program, characterized in that the computer program realizes the steps of the method of any one of claims 1 to 7 when executed by a processor.
CN202210269585.1A 2022-03-18 2022-03-18 Panoramic roaming scene construction method and device, computer equipment and storage medium Pending CN114758062A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210269585.1A CN114758062A (en) 2022-03-18 2022-03-18 Panoramic roaming scene construction method and device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210269585.1A CN114758062A (en) 2022-03-18 2022-03-18 Panoramic roaming scene construction method and device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114758062A (en) 2022-07-15

Family

ID=82327868

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210269585.1A Pending CN114758062A (en) 2022-03-18 2022-03-18 Panoramic roaming scene construction method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114758062A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination