CN113593052B - Scene orientation determining method and marking method - Google Patents

Scene orientation determining method and marking method

Info

Publication number
CN113593052B
CN113593052B (application number CN202110902964.5A)
Authority
CN
China
Prior art keywords
preset
determining
point
vector
scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110902964.5A
Other languages
Chinese (zh)
Other versions
CN113593052A (en)
Inventor
郝稼力
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Seashell Housing Beijing Technology Co Ltd
Original Assignee
Seashell Housing Beijing Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Seashell Housing Beijing Technology Co Ltd filed Critical Seashell Housing Beijing Technology Co Ltd
Priority to CN202110902964.5A priority Critical patent/CN113593052B/en
Publication of CN113593052A publication Critical patent/CN113593052A/en
Application granted granted Critical
Publication of CN113593052B publication Critical patent/CN113593052B/en
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics

Abstract

The invention relates to the technical field of computers, and discloses a scene orientation determining method and a marking method. The determination method comprises the following steps: determining three-dimensional coordinates of a starting point and an end point of a preset vector on a preset sky box for pasting the panorama according to two-dimensional coordinates of the starting point and the end point on the panorama, wherein the direction of the preset vector is a specific direction; determining a first projection vector of the preset vector on a horizontal section where the center of the preset sky box is located according to the three-dimensional coordinates of the starting point and the end point on the preset sky box; determining a second projection vector of a position vector of a camera in a scene where the preset sky box is located on the horizontal section; and determining the orientation of the scene where the preset sky box is located according to the first projection vector and the second projection vector. The invention can accurately determine the orientation of the three-dimensional scene in real time regardless of the viewing angle to which the camera is switched.

Description

Scene orientation determining method and marking method
Technical Field
The invention relates to the technical field of computers, in particular to a scene orientation determining method and a marking method.
Background
At present, two-dimensional vector maps are widely used, and marking the east, west, south and north directions on them is easy. For example, in the Baidu map shown in FIG. 1, a compass icon can be seen in the lower right-hand corner. If the map is rotated, the compass rotates with it.
Rendering and displaying a real-scene model of a large outdoor scene places very high demands on hardware, and normal loading and high-quality rendering of the model are difficult to achieve on common mobile devices. Therefore, to display a large outdoor scene, a scene containing a large amount of model data is generally rendered into a panorama; the panorama is used as the sky box map of a three-dimensional scene, a camera is placed inside the sky box, and the scene is rendered and displayed from the camera's viewing angle. If the orientation were marked with a compass as in the two-dimensional vector map, the marked compass orientation would not change as the camera rotates in the three-dimensional scene (that is, as the viewing angle switches), even though the orientation of the scene actually changes significantly. The conventional method of marking orientation in a two-dimensional scene is therefore not suitable for a three-dimensional scene.
Disclosure of Invention
It is an object of the present invention to provide a scene orientation determination method and a marking method which can determine the orientation of a three-dimensional scene accurately in real time regardless of the perspective to which the camera is switched.
In order to achieve the above object, a first aspect of the present invention provides a method for determining a scene orientation, the method comprising: determining three-dimensional coordinates of a starting point and an end point of a preset vector on a preset sky box for pasting the panorama according to two-dimensional coordinates of the starting point and the end point on the panorama, wherein the direction of the preset vector is a specific direction; determining a first projection vector of the preset vector on a horizontal section where the center of the preset sky box is located according to the three-dimensional coordinates of the starting point and the end point on the preset sky box; determining a second projection vector of a position vector of a camera in a scene where the preset sky box is located on the horizontal section; and determining the orientation of the scene where the preset sky box is located according to the first projection vector and the second projection vector.
Preferably, in a case that the starting point and the ending point are both in the upper half of the preset sky box, the determining a first projection vector of the preset vector on a horizontal cross section where the center of the preset sky box is located includes: determining projection coordinates of the starting point and the end point on the horizontal section according to three-dimensional coordinates of the starting point and the end point on the preset sky box; and determining a first projection vector of the preset vector on the horizontal section according to the projection coordinates of the starting point and the end point on the horizontal section.
Preferably, the preset sky box is a spherical sky box.
Preferably, in a case where the starting point and the ending point are both in the lower half of the preset sky box, the determining a first projection vector of the preset vector on a horizontal cross section where the center of the preset sky box is located includes: determining specific projection coordinates of the starting point and the ending point on the horizontal section through a specific mapping relation according to three-dimensional coordinates of the starting point and the ending point on the preset sky box; and determining a first projection vector of the preset vector on the horizontal section according to the specific projection coordinates of the starting point and the end point on the horizontal section.
Preferably, determining the specific projection coordinate of the start point or the end point on the horizontal cross section through the specific mapping relationship includes: determining the specific projection coordinate of the start point or the end point on the horizontal section by the following mapping relationship: sᵢ = -(1 - θᵢ/π) × cos(αᵢ) × rᵢ and tᵢ = (1 - θᵢ/π) × sin(αᵢ) × rᵢ, wherein i is 1 or 2; (r₁, θ₁, α₁) and (r₂, θ₂, α₂) are the three-dimensional coordinates of the starting point and the ending point on the preset sky box, respectively; and (s₁, t₁) and (s₂, t₂) are the specific projection coordinates of the starting point and the ending point on the horizontal section, respectively.
Preferably, the determining the orientation of the scene in which the preset sky box is located includes: calculating an angle from the second projection vector to the first projection vector; and using the obtained angle as the deviation angle of the orientation of the scene.
Preferably, the starting point and the ending point of the preset vector correspond to a first preset point and a second preset point in the real space, respectively, and accordingly, the determining method further includes: and determining two-dimensional coordinates of the starting point and the ending point on the panoramic image according to the three-dimensional coordinates of the first preset point and the second preset point in the actual space and the position information of the camera.
Preferably, the position information of the camera includes displacement information, rotation information and scaling information, and accordingly, the determining the two-dimensional coordinates of the start point and the end point on the panorama includes: determining a world space transformation matrix of the camera according to the displacement information, the rotation information and the zooming information of the camera; determining a viewport matrix of the camera according to the world space transformation matrix of the camera; and determining two-dimensional coordinates of the starting point and the ending point on the panoramic image according to the three-dimensional coordinates of the first preset point and the second preset point in the actual space and a viewport matrix of the camera.
According to the technical scheme, the three-dimensional coordinates, on the preset sky box, of the starting point and the ending point of the preset vector on the panorama are creatively determined; then a first projection vector of the preset vector on the horizontal section where the center of the preset sky box is located is determined according to the three-dimensional coordinates of the starting point and the end point on the preset sky box; then a second projection vector of the position vector of the camera in the scene on the horizontal section is determined; finally, the orientation of the scene is determined according to the two determined projection vectors. Therefore, the orientation of the three-dimensional scene can be accurately determined in real time regardless of the viewing angle to which the camera is switched.
A second aspect of the present invention provides a method for marking an orientation of a scene, the method comprising: determining the orientation of the scene where the preset sky box is located according to the scene orientation determining method; and marking the determined orientation at a preset position on the panorama so that the marked orientation is always within the range of viewing angles of the cameras in the scene.
Preferably, the panorama includes a live-action area and a sky area, and accordingly, the preset position is any position within a preset area that is centered on the center of the panorama and lies within the live-action area of the panorama.
Through the technical scheme, the orientation of the scene where the preset sky box is located is creatively determined through the above method for determining the orientation of the scene; the determined orientation is then marked at the preset position on the panorama, so that the marked orientation accurately characterizes the orientation of the three-dimensional scene in real time, regardless of the viewing angle to which the camera is switched.
A third aspect of the present invention provides a system for determining an orientation of a scene, the system comprising: the first coordinate determination device is used for determining three-dimensional coordinates of a starting point and an end point of a preset vector on a preset sky box for pasting the panorama according to two-dimensional coordinates of the starting point and the end point on the panorama, wherein the direction of the preset vector is a specific direction; the first vector determining device is used for determining a first projection vector of the preset vector on a horizontal section where the center of the preset sky box is located according to the three-dimensional coordinates of the starting point and the end point on the preset sky box; the second vector determining device is used for determining a second projection vector of the position vector of the camera in the scene where the preset sky box is located on the horizontal section; and the orientation determining device is used for determining the orientation of the scene where the preset sky box is located according to the first projection vector and the second projection vector.
Preferably, in a case where the starting point and the ending point are both in the upper half of the preset sky box, the first vector determination device includes: the first coordinate determination module is used for determining projection coordinates of the starting point and the end point on the horizontal section according to three-dimensional coordinates of the starting point and the end point on the preset sky box; and the first vector determining module is used for determining a first projection vector of the preset vector on the horizontal cross section according to the projection coordinates of the starting point and the end point on the horizontal cross section.
Preferably, the preset sky box is a spherical sky box.
Preferably, in a case where both the start point and the end point are in a lower half of the preset sky box, the first vector determination device includes: a second coordinate determination module, configured to determine, according to three-dimensional coordinates of the starting point and the ending point on the preset sky box, specific projection coordinates of the starting point and the ending point on the horizontal cross section through a specific mapping relationship; and the second vector determining module is used for determining a first projection vector of the preset vector on the horizontal section according to the specific projection coordinates of the starting point and the end point on the horizontal section.
Preferably, the orientation determining means comprises: the angle calculation module is used for calculating the angle from the second projection vector to the first projection vector; and an orientation determination module for taking the calculated angle as a deviation angle of the orientation of the scene.
Preferably, the starting point and the ending point of the preset vector correspond to a first preset point and a second preset point in the real space, respectively, and accordingly, the determining system further includes: and the second coordinate determination device is used for determining the two-dimensional coordinates of the starting point and the end point on the panoramic image according to the three-dimensional coordinates of the first preset point and the second preset point in the actual space and the position information of the camera.
Preferably, the position information of the camera includes displacement information, rotation information, and zoom information, and accordingly, the second coordinate determination device includes: the first matrix determining module is used for determining a world space transformation matrix of the camera according to the displacement information, the rotation information and the zooming information of the camera; a second matrix determination module for determining a viewport matrix of the camera according to the world space transformation matrix of the camera; and a third coordinate determination module, configured to determine two-dimensional coordinates of the start point and the end point on the panorama according to three-dimensional coordinates of the first preset point and the second preset point in the actual space and a viewport matrix of the camera.
For details and benefits of the system for determining a scene orientation provided by the present invention, reference may be made to the above description of the method for determining a scene orientation, which is not described herein again.
A fourth aspect of the present invention provides a system for marking an orientation of a scene, the marking system comprising: the above scene orientation determining system, configured to determine the orientation of the scene where the preset sky box for pasting the panorama is located; and marking the determined orientation at a preset position on the panorama so that the marked orientation is always within the viewing angle range of the camera in the scene.
Preferably, the panorama includes a live-action area and a sky area, and accordingly, the preset position is any position within a preset area that is centered on the center of the panorama and lies within the live-action area of the panorama.
For specific details and benefits of the scene orientation marking system provided by the present invention, reference may be made to the above description of the scene orientation marking method, which is not described herein again.
The fifth aspect of the present invention further provides a computer readable storage medium having stored thereon a computer program which, when being executed by a processor, implements the steps of the method for determining an orientation of a scene and/or the steps of the method for marking an orientation of a scene.
The sixth aspect of the present invention also provides an electronic apparatus, including: at least one processor; a memory coupled to the at least one processor; wherein the memory stores a computer program executable by the at least one processor, and the at least one processor implements the steps of the scene orientation determination method and/or the steps of the scene orientation marking method by executing the computer program stored in the memory.
The seventh aspect of the present invention also provides a computer program product comprising a computer program which, when being executed by a processor, carries out the steps of the method for determining an orientation of a scene and/or the steps of the method for marking an orientation of a scene.
Additional features and advantages of the invention will be set forth in the detailed description which follows.
Drawings
The following detailed description of embodiments of the invention refers to the accompanying drawings. It should be understood that the detailed description and specific examples, while indicating the present invention, are given by way of illustration and explanation only, not limitation.
FIG. 1 is a screen shot of a prior art two-dimensional vector map;
fig. 2 is a flowchart of a method for determining a scene orientation according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a coordinate system on a panorama provided by an embodiment of the present invention;
FIG. 4 is a schematic diagram of a coordinate system of a celestial sphere provided by one embodiment of the present invention;
FIG. 5 is a flow chart of a method for marking scene orientations according to an embodiment of the present invention; and
fig. 6 is a block diagram of a scene orientation determination system according to an embodiment of the present invention.
Detailed Description
The following detailed description of embodiments of the invention refers to the accompanying drawings. It should be understood that the detailed description and specific examples, while indicating the present invention, are given by way of illustration and explanation only, not limitation.
A panorama (also referred to as a 720° panorama) is a picture taken by a camera with a spherical 360° field of view (FOV) or rendered by three-dimensional design software. It can be wrapped onto a spherical object through coordinate transformation, or wrapped onto a cube through pixel mapping to form a CubeMap, so as to form a space covering 360° up, down, left and right.
Fig. 2 is a flowchart of a method for determining a scene orientation according to an embodiment of the present invention. As shown in fig. 2, the determination method may include the following steps S201 to S204.
Step S201, according to two-dimensional coordinates of a starting point and an end point of a preset vector on a panoramic image, determining three-dimensional coordinates of the starting point and the end point on a preset sky box for pasting the panoramic image.
The direction of the preset vector is a specific direction; for example, it may be due south or due north. That is, a starting point and an ending point along a specific direction (e.g., due north or due south) can be selected according to actual requirements.
In existing products that use panoramas for 360° display, some points of interest (also referred to as POI points) are usually marked on the map. At these points, a tag is generally added to the user interaction interface for displaying the information of the POI point and supporting user interaction, for example, a blue label at the center of a figure surrounded by multiple landmark labels. Generally, these tags are produced by rendering a panorama with three-dimensional design software or capturing one with a panoramic camera, and then manually labeling and entering the information of the POI points on the panorama.
When manually labeling a panorama, the labeler typically records the two-dimensional coordinates of a POI point on the panorama. A panorama is generally a rectangular picture with a width-to-height ratio of 2:1. In computer graphics, for the coordinate system of a texture picture, the upper left corner of the rectangle is usually taken as the origin (0,0) and the lower right corner as the (1,1) point; the horizontal direction is the U axis, increasing to the right, and the vertical direction is the V axis, increasing downward, as shown in FIG. 3. The components of the UV coordinates lie within the [0,1] interval. After the UV coordinates of the POI points on the panorama are obtained, they are converted into the three-dimensional world coordinates of the panorama wrapped on a three-dimensional sphere.
POI marking of a traditional panorama requires manually annotating the corresponding coordinates on the panorama after rendering or shooting is finished, which is inefficient and prone to coordinate deviation. In addition, when the camera position of the panorama changes, the coordinates of the POI points need to be marked again; in a more extreme example, for a panoramic video the coordinates of the POI points in every frame would need to be marked manually, which is a huge amount of work.
In one embodiment, the known three-dimensional coordinates of a preset point (i.e., a POI point, such as a certain location A in Beijing) in real space and the position information of the camera (e.g., displacement, rotation and scaling information) may be used to pre-calculate the corresponding two-dimensional coordinates of the preset point on the panorama. Since the labels of the POI points can be calculated automatically by a computer program, no manual labeling is needed, which saves a large amount of time and greatly improves efficiency.
Specifically, the starting point and the ending point of the preset vector may correspond to a first preset point and a second preset point in the actual space, respectively. Accordingly, before performing step S201, the determining method may further include: and determining two-dimensional coordinates of the starting point and the ending point on the panoramic image according to the three-dimensional coordinates of the first preset point and the second preset point in the actual space and the position information of the camera.
The position information of the camera may include displacement information, rotation information, and zoom information. Accordingly, the determining the two-dimensional coordinates of the start point and the end point on the panorama can include: determining a world space transformation matrix of the camera according to the displacement information, the rotation information and the zooming information of the camera; determining a viewport matrix of the camera according to the world space transformation matrix of the camera; and determining two-dimensional coordinates of the starting point and the ending point on the panoramic image according to the three-dimensional coordinates of the first preset point and the second preset point in the actual space and a viewport matrix of the camera.
In particular, the camera may be a spherical camera. Accordingly, the panorama is the viewport image when the camera FOV is 360°. First, the world space transformation matrix of the camera is a 4 × 4 matrix composed of the displacement information, rotation information and scaling information of the camera, wherein the displacement information and the scaling information are each represented by a three-dimensional vector and the rotation information is represented by a quaternion. Given the displacement P(x, y, z), rotation Q(x, y, z, w) and scaling S(x, y, z) of the camera, the world space transformation matrix T of the camera can be calculated as follows:
[Equation image: the 4 × 4 world-space transformation matrix T composed from the displacement P, the rotation quaternion Q and the scaling S]
Then, the inverse of the matrix T is calculated to obtain the viewport matrix M of the camera (where M may be a column-major 4 × 4 matrix).
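As a concrete illustration of these two steps, the sketch below composes a column-major 4 × 4 TRS matrix from the camera's displacement, quaternion rotation and scaling in the usual translate · rotate · scale order; since the original equation image for T is not reproduced above, the composition order, the column-major layout and the function names are assumptions made for illustration only.

```typescript
// Sketch: compose a column-major 4x4 world matrix T from the camera's
// displacement P, quaternion rotation Q and scale S
// (assumed order: T = translate * rotate * scale).
type Vec3 = { x: number; y: number; z: number };
type Quat = { x: number; y: number; z: number; w: number };

function composeTRS(P: Vec3, Q: Quat, S: Vec3): number[] {
  const { x, y, z, w } = Q;
  // Standard quaternion-to-rotation-matrix terms.
  const x2 = x + x, y2 = y + y, z2 = z + z;
  const xx = x * x2, xy = x * y2, xz = x * z2;
  const yy = y * y2, yz = y * z2, zz = z * z2;
  const wx = w * x2, wy = w * y2, wz = w * z2;
  // Column-major: indices 0..3 form the first column, 4..7 the second, etc.
  return [
    (1 - (yy + zz)) * S.x, (xy + wz) * S.x, (xz - wy) * S.x, 0,
    (xy - wz) * S.y, (1 - (xx + zz)) * S.y, (yz + wx) * S.y, 0,
    (xz + wy) * S.z, (yz - wx) * S.z, (1 - (xx + yy)) * S.z, 0,
    P.x, P.y, P.z, 1,
  ];
}
// The viewport matrix M is then the inverse of T, obtainable with any general
// 4x4 inversion routine (for example, mat4.invert from the gl-matrix library).
```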
Finally, according to the viewport matrix M of the camera and the three-dimensional coordinates POI(x₁, y₁, z₁) of the first preset point, the transformed coordinates POI_Transformed(x₁, y₁, z₁) of the first preset point are determined according to the following relational expressions:
POI_Transformed(x₁) = (M(0)·POI(x₁) + M(4)·POI(y₁) + M(8)·POI(z₁) + M(12)) · w;
POI_Transformed(y₁) = (M(1)·POI(x₁) + M(5)·POI(y₁) + M(9)·POI(z₁) + M(13)) · w;
POI_Transformed(z₁) = (M(2)·POI(x₁) + M(6)·POI(y₁) + M(10)·POI(z₁) + M(14)) · w;
Where M (i) is the ith element of the viewport matrix M (starting counting 0 according to the first column of the first row, incrementing 1 downward row by row, after the lowermost element of the first column, turning to the second column to start counting, and similarly stopping counting until the last element of the last column of the last row, e.g., M (0) is the element of the first column of the first row, M (3) is the element of the fourth row of the first column),
[Equation image: definition of the factor w used in the above expressions]
The vector length L₁ of the transformed first preset point is then calculated:
L₁ = √(POI_Transformed(x₁)² + POI_Transformed(y₁)² + POI_Transformed(z₁)²)
Then, the UV coordinates (u₁, v₁) of the first preset point in the panorama can be calculated according to the following relations:
[Equation image: expression for u₁ computed from the transformed coordinates of the first preset point]
and if u₁ < 0, then u₁ = u₁ + 1;
[Equation image: expression for v₁ computed from the transformed coordinates and the vector length L₁]
Similarly, the UV coordinates (u₂, v₂) of the second preset point POI(x₂, y₂, z₂) in the panorama can be calculated according to the above process, and will not be described again here.
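To make the whole projection step concrete, the sketch below (continuing the column-major indexing M(i) described above) computes the panorama UV coordinates of a POI. Because the equation images defining w, L₁, u₁ and v₁ are not reproduced in the text, the homogeneous-divide factor and the equirectangular mapping used here are assumptions chosen to be consistent with the later relations θ = v × π and α = u × π × 2; the helper poiToUV is illustrative, not the patented formula.

```typescript
// Sketch: project a POI's world-space coordinates onto panorama UV coordinates,
// given the camera viewport matrix M as a column-major array of 16 numbers.
// ASSUMED: w is the usual homogeneous divide taken from the 4th row of M, and
// (u, v) follow an equirectangular mapping consistent with theta = v * PI and
// alpha = u * 2 * PI used later in the text.
function poiToUV(
  M: number[],
  poi: { x: number; y: number; z: number },
): { u: number; v: number } {
  const w = 1 / (M[3] * poi.x + M[7] * poi.y + M[11] * poi.z + M[15]);
  const tx = (M[0] * poi.x + M[4] * poi.y + M[8] * poi.z + M[12]) * w;
  const ty = (M[1] * poi.x + M[5] * poi.y + M[9] * poi.z + M[13]) * w;
  const tz = (M[2] * poi.x + M[6] * poi.y + M[10] * poi.z + M[14]) * w;
  const len = Math.sqrt(tx * tx + ty * ty + tz * tz); // vector length L1
  let u = Math.atan2(ty, tx) / (2 * Math.PI);          // azimuth mapped to U
  if (u < 0) u += 1;                                   // keep U within [0, 1]
  const v = Math.acos(tz / len) / Math.PI;             // polar angle mapped to V
  return { u, v };
}
```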
The preset sky box may be a spherical sky box. When the preset sky box is a sphere, the three-dimensional coordinates of the starting point and the ending point on the sky box are polar (spherical) coordinates.
In a case where two-dimensional coordinates of a start point and an end point of a preset vector on a panorama are obtained, three-dimensional coordinates of the start point and the end point on the preset sky box may be determined according to the two-dimensional coordinates of the two points.
The preset sky box is a sphere, which may be referred to as a sky sphere.
The center point of the celestial sphere is the (0, 0, 0) point of the PanoScene coordinate system. The coordinates (u, v) of any point S (e.g., the first preset point or the second preset point) on the panorama are taken as the UV coordinates of the sky-sphere mesh, and the position (r, θ, α) of the point S in the PanoScene three-dimensional scene when the panorama is pasted onto the sky sphere is then calculated from these UV coordinates. Here r is the radius of the sphere, θ is the included angle in the vertical direction, and α is the angle (i.e., the azimuth angle) between the projection of the vector r on the horizontal section where the center point of the celestial sphere is located and the positive direction of the x-axis.
Specifically, the polar coordinates of the S point on the sky sphere are calculated by the coordinates (u, v) of the S point and the following formula:
[Equation image: polar coordinates (r, θ, α) of the point S on the sky sphere computed from its UV coordinates (u, v), where r is the sphere radius, θ = v × π and α = u × π × 2]
thus, the polar coordinates of the start point (corresponding to the first preset point) and the end point (corresponding to the second preset point) on the celestial sphere can be calculated according to the above formula, respectively.
If the coordinate system is represented by a rectangular coordinate system in a three-dimensional space, the three-dimensional position coordinate of the point S in the scene can be calculated according to the following formula,
[Equation image: rectangular coordinates (x, y, z) of the point S obtained from (r, θ, α) by the spherical-to-Cartesian conversion]
where θ = v × π and α = u × π × 2. Thereby, the three-dimensional position coordinates of the start point (corresponding to the first preset point) and the end point (corresponding to the second preset point) in the sky box can be calculated respectively according to the above formula.
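The conversion just described can be sketched as follows for a sky sphere of a given radius; since the equation images are not reproduced above, the spherical-to-Cartesian convention below (with z taken as the vertical axis) is an assumption chosen to be consistent with θ = v × π and α = u × π × 2.

```typescript
// Sketch: map a panorama UV coordinate onto a point of the sky sphere.
// Assumed convention: theta is the angle from the vertical (z) axis,
// alpha the azimuth measured from the positive x-axis on the horizontal section.
function uvToSpherePoint(u: number, v: number, radius: number) {
  const theta = v * Math.PI;     // theta = v * PI
  const alpha = u * Math.PI * 2; // alpha = u * PI * 2
  return {
    polar: { r: radius, theta, alpha },
    cartesian: {
      x: radius * Math.sin(theta) * Math.cos(alpha),
      y: radius * Math.sin(theta) * Math.sin(alpha),
      z: radius * Math.cos(theta), // vertical component (assumed axis choice)
    },
  };
}
```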
Step S202, determining a first projection vector of the preset vector on a horizontal section where the center of the preset sky box is located according to the three-dimensional coordinates of the starting point and the end point on the preset sky box.
The first projection vector represents the initial orientation (i.e., the direction of the initial vector) of the scene where the preset sky box is located.
In a case that the starting point and the ending point are both at the upper half of the preset sky box, the determining a first projection vector of the preset vector on a horizontal cross section where the center of the preset sky box is located (i.e., the step S202) may include: determining the projection coordinate of the starting point on the horizontal section according to the three-dimensional coordinate of the starting point on the preset sky box; determining the projection coordinate of the end point on the horizontal section according to the three-dimensional coordinate of the end point on the preset sky box; and determining a first projection vector of the preset vector on the horizontal section according to the projection coordinates of the starting point and the end point on the horizontal section.
Specifically, a scene where the sky sphere is located is taken as an example. According to the polar coordinates (r₁, θ₁, α₁) of the starting point on the sky sphere and the first two equations of equation set (2), the projection coordinate of the starting point on the horizontal section is determined as (x₁, y₁); similarly, according to the polar coordinates (r₂, θ₂, α₂) of the end point on the sky sphere and the first two equations of equation set (2), the projection coordinate of the end point on the horizontal section is determined as (x₂, y₂); then, according to the projection coordinates of the starting point and the end point, the first projection vector of the preset vector on the horizontal section is determined as (x₂ - x₁, y₂ - y₁).
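A minimal sketch of this upper-half case, reusing the assumed spherical-to-Cartesian convention above (the horizontal section keeps the x and y components of each point):

```typescript
// Sketch: first projection vector when both points lie in the upper half of the
// sky sphere: keep only the horizontal (x, y) components of each point.
type Polar = { r: number; theta: number; alpha: number };

function upperHalfProjectionVector(start: Polar, end: Polar): { x: number; y: number } {
  const x1 = start.r * Math.sin(start.theta) * Math.cos(start.alpha);
  const y1 = start.r * Math.sin(start.theta) * Math.sin(start.alpha);
  const x2 = end.r * Math.sin(end.theta) * Math.cos(end.alpha);
  const y2 = end.r * Math.sin(end.theta) * Math.sin(end.alpha);
  return { x: x2 - x1, y: y2 - y1 }; // (x2 - x1, y2 - y1)
}
```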
In a case where the starting point and the ending point are both in the lower half of the preset sky box, the determining a first projection vector of the preset vector on a horizontal cross section where the center of the preset sky box is located (i.e., the step S202) includes: determining a specific projection coordinate of the starting point on the horizontal section through a specific mapping relation according to the three-dimensional coordinate of the starting point on the preset sky box; according to the three-dimensional coordinate of the end point on the preset sky box, determining a specific projection coordinate of the end point on the horizontal section through the specific mapping relation; and determining a first projection vector of the preset vector on the horizontal section according to the specific projection coordinates of the starting point and the end point on the horizontal section.
In a case where the preset sky box is the sphere sky box, determining the specific projection coordinate of the start point or the end point on the horizontal section through the specific mapping relationship may include: determining a specific projection coordinate of the start point or the end point on the horizontal section by the following mapping relationship:
sᵢ = -(1 - θᵢ/π) × cos(αᵢ) × rᵢ; and
tᵢ = (1 - θᵢ/π) × sin(αᵢ) × rᵢ,
where i is 1 or 2; (r₁, θ₁, α₁) and (r₂, θ₂, α₂) are the three-dimensional coordinates of the starting point and the ending point on the preset sky box, respectively; and (s₁, t₁) and (s₂, t₂) are the specific projection coordinates of the starting point and the ending point on the horizontal section, respectively.
To accurately identify the orientation in the horizontal section, consider the mapping of the lower hemisphere of the celestial sphere onto the horizontal ground. Thus, the sky box is a half sphere above and a round ground plane below.
Specifically, a scene where the sky sphere is located is taken as an example. Firstly, according to the polar coordinates (r₁, θ₁, α₁) of the starting point on the preset sky box and the above mapping relation, the projection coordinate (s₁, t₁) of the starting point on the horizontal section can be calculated; then, according to the polar coordinates (r₂, θ₂, α₂) of the end point on the preset sky box and the mapping relation, the projection coordinate (s₂, t₂) of the end point on the horizontal section can be calculated; finally, according to the projection coordinates of the starting point and the end point, the first projection vector of the preset vector on the horizontal section is calculated as (s₂ - s₁, t₂ - t₁).
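The specific mapping translates directly into code; the sketch below applies it to the start point and the end point and forms the first projection vector (s₂ - s₁, t₂ - t₁).

```typescript
// Sketch: first projection vector when both points lie in the lower half of the
// sky sphere, using the specific mapping
//   s = -(1 - theta/PI) * cos(alpha) * r,   t = (1 - theta/PI) * sin(alpha) * r.
function lowerHalfMap(p: { r: number; theta: number; alpha: number }): { s: number; t: number } {
  const k = 1 - p.theta / Math.PI;
  return { s: -k * Math.cos(p.alpha) * p.r, t: k * Math.sin(p.alpha) * p.r };
}

function lowerHalfProjectionVector(
  start: { r: number; theta: number; alpha: number },
  end: { r: number; theta: number; alpha: number },
): { s: number; t: number } {
  const a = lowerHalfMap(start);
  const b = lowerHalfMap(end);
  return { s: b.s - a.s, t: b.t - a.t }; // (s2 - s1, t2 - t1)
}
```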
Step S203, determining a second projection vector of the position vector of the camera in the scene where the preset sky box is located on the horizontal cross section.
Firstly, the projection coordinates, on the horizontal section, of the start point and the end point of the position vector of the camera (the camera used for shooting and rendering the panorama) can be determined respectively (for the specific process, refer to the calculation described above for the start point and end point of the preset vector); then, the second projection vector of the position vector on the horizontal section is determined according to these projection coordinates.
And step S204, determining the orientation of the scene where the preset sky box is located according to the first projection vector and the second projection vector.
For step S204, the determining the orientation of the scene in which the preset sky box is located may include: calculating an angle from the second projection vector to the first projection vector; and using the obtained angle as the deviation angle of the orientation of the scene.
When the included angle between the first projection vector and a certain fixed direction on the horizontal section is angle, and the included angle between the second projection vector and the same fixed direction is angleCamera, the angle angleCompass from the second projection vector to the first projection vector satisfies angleCompass = angle - angleCamera.
That is, two points are predefined on the panorama whose connecting line in real space points due south (or due north); the vector between these two points in the three-dimensional scene of the sky box gives the initial orientation of the three-dimensional scene. The vector description of this connecting line is then updated as the viewing angle of the three-dimensional scene changes, so that the offset of the compass (i.e., the included angle between the real-time orientation and the initial orientation of the scene where the preset sky box is located) can be calculated.
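The signed angle from the second projection vector (the camera) to the first projection vector (the predefined due-south or due-north line) can be computed as in the sketch below; returning the result in degrees and normalising it to [-180, 180) are presentation choices made here for illustration.

```typescript
// Sketch: signed angle (in degrees) from the second projection vector (camera)
// to the first projection vector (the preset south/north line); this is the
// compass offset of the scene orientation.
function compassOffsetDegrees(
  first: { x: number; y: number },
  second: { x: number; y: number },
): number {
  const angleFirst = Math.atan2(first.y, first.x);
  const angleSecond = Math.atan2(second.y, second.x);
  let delta = angleFirst - angleSecond; // angle from the second vector to the first
  // Normalise to [-PI, PI) so the offset stays within [-180, 180) degrees.
  delta = ((delta + Math.PI) % (2 * Math.PI) + 2 * Math.PI) % (2 * Math.PI) - Math.PI;
  return (delta * 180) / Math.PI;
}
```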
In summary, the present invention creatively determines the three-dimensional coordinates, on the preset sky box, of the start point and the end point of a preset vector on the panorama; then determines a first projection vector of the preset vector on the horizontal section where the center of the preset sky box is located according to the three-dimensional coordinates of the starting point and the end point on the preset sky box; then determines a second projection vector of the position vector of the camera in the scene on the horizontal section; and finally determines the orientation of the scene according to the two determined projection vectors. Therefore, the orientation of the three-dimensional scene can be accurately determined in real time regardless of the viewing angle to which the camera is switched.
Fig. 5 is a flowchart of a method for marking a scene orientation according to an embodiment of the present invention. As shown in fig. 5, the marking method may include: step S501, determining the orientation of the scene where the preset sky box is located according to the method for determining the orientation of the scene; and step S502, marking the determined orientation at a preset position on the panoramic image so as to enable the marked orientation to be always within the view angle range of the camera in the scene.
The panorama may comprise a live-action area and a sky area. Correspondingly, the preset position is any position within a preset area that is centered on the center of the panorama and lies within the live-action area of the panorama.
In the present embodiment, in view of the deformation of the panorama, the orientation mark is placed in the central region of the panorama, within the live-action area (i.e., the non-sky area) below the center. In this way, the orientation of the map can be represented accurately in real time through the marked orientation.
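As a small illustration of this placement, the marker's UV position might be chosen on the panorama's central column, slightly below the horizon so that it falls in the live-action area; the concrete offset value in the sketch is an arbitrary example, not a value taken from the patent.

```typescript
// Sketch: choose a preset UV position for the orientation marker on the
// panorama's central column, just below the horizon (v = 0.5) so that it falls
// inside the live-action area. The 0.15 offset is an illustrative value only.
function markerUV(verticalOffset: number = 0.15): { u: number; v: number } {
  return { u: 0.5, v: Math.min(0.5 + verticalOffset, 1) };
}
```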
In summary, the present invention creatively determines the orientation of the scene where the preset sky box is located through the above method for determining the orientation of the scene; the determined orientation is then marked at the preset position on the panorama, so that the marked orientation accurately characterizes the orientation of the three-dimensional scene in real time, regardless of the viewing angle to which the camera is switched.
Fig. 6 is a block diagram of a scene orientation determination system according to an embodiment of the present invention. As shown in fig. 6, the determination system may include: the first coordinate determination device 10 is configured to determine, according to two-dimensional coordinates of a start point and an end point of a preset vector on a panorama, three-dimensional coordinates of the start point and the end point on a preset sky box for pasting the panorama, where a direction of the preset vector is a specific direction; a first vector determining device 20, configured to determine, according to the three-dimensional coordinates of the starting point and the ending point on the preset sky box, a first projection vector of the preset vector on a horizontal cross section where the center of the preset sky box is located; a second vector determining device 30, configured to determine a second projection vector of a position vector of a camera in a scene where the preset sky box is located on the horizontal cross section; and an orientation determining device 40, configured to determine, according to the first projection vector and the second projection vector, an orientation of a scene where the preset sky box is located.
Preferably, in a case where the starting point and the ending point are both in the upper half of the preset sky box, the first vector determination device includes: the first coordinate determination module is used for determining projection coordinates of the starting point and the end point on the horizontal section according to three-dimensional coordinates of the starting point and the end point on the preset sky box; and the first vector determining module is used for determining a first projection vector of the preset vector on the horizontal cross section according to the projection coordinates of the starting point and the end point on the horizontal cross section.
Preferably, the preset sky box is a spherical sky box.
Preferably, in a case where both the start point and the end point are in a lower half of the preset sky box, the first vector determination device includes: a second coordinate determination module, configured to determine, according to three-dimensional coordinates of the starting point and the ending point on the preset sky box, specific projection coordinates of the starting point and the ending point on the horizontal cross section through a specific mapping relationship; and the second vector determining module is used for determining a first projection vector of the preset vector on the horizontal section according to the specific projection coordinates of the starting point and the end point on the horizontal section.
Preferably, the orientation determining means comprises: the angle calculation module is used for calculating the angle from the second projection vector to the first projection vector; and an orientation determination module for taking the calculated angle as a deviation angle of the orientation of the scene.
Preferably, the starting point and the ending point of the preset vector correspond to a first preset point and a second preset point in the real space, respectively, and accordingly, the determining system further includes: and the second coordinate determination device is used for determining the two-dimensional coordinates of the starting point and the end point on the panoramic image according to the three-dimensional coordinates of the first preset point and the second preset point in the actual space and the position information of the camera.
Preferably, the position information of the camera includes displacement information, rotation information, and zoom information, and accordingly, the second coordinate determination device includes: the first matrix determining module is used for determining a world space transformation matrix of the camera according to the displacement information, the rotation information and the zooming information of the camera; a second matrix determination module for determining a viewport matrix of the camera according to the world space transformation matrix of the camera; and a third coordinate determination module, configured to determine two-dimensional coordinates of the start point and the end point on the panorama according to three-dimensional coordinates of the first preset point and the second preset point in the actual space and a viewport matrix of the camera.
For details and benefits of the system for determining a scene orientation provided by the present invention, reference may be made to the above description of the method for determining a scene orientation, which is not described herein again.
An embodiment of the present invention further provides a system for marking a scene orientation. The marking system may include: the above scene orientation determining system, configured to determine the orientation of the scene where the preset sky box for pasting the panorama is located; and marking the determined orientation at a preset position on the panorama so that the marked orientation is always within the viewing angle range of the camera in the scene.
Preferably, the panorama includes a live-action area and a sky area, and accordingly, the preset position is any position within a preset area that is centered on the center of the panorama and lies within the live-action area of the panorama.
For specific details and benefits of the scene orientation marking system provided by the present invention, reference may be made to the above description of the scene orientation marking method, which is not described herein again.
An embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, implements the steps of the method for determining a scene orientation and/or the steps of the method for marking a scene orientation.
An embodiment of the present invention further provides an electronic device, including: at least one processor; a memory coupled to the at least one processor; wherein the memory stores a computer program executable by the at least one processor, and the at least one processor implements the steps of the scene orientation determination method and/or the steps of the scene orientation marking method by executing the computer program stored in the memory.
An embodiment of the present invention further provides a computer program product, which includes a computer program, and which, when being executed by a processor, implements the steps of the method for determining a scene orientation and/or the steps of the method for marking a scene orientation.
Although the embodiments of the present invention have been described in detail with reference to the accompanying drawings, the embodiments of the present invention are not limited to the details of the above embodiments, and various simple modifications can be made to the technical solutions of the embodiments of the present invention within the technical idea of the embodiments of the present invention, and the simple modifications all belong to the protection scope of the embodiments of the present invention.
It should be noted that the various features described in the above embodiments may be combined in any suitable manner without departing from the scope of the invention. In order to avoid unnecessary repetition, the embodiments of the present invention do not describe every possible combination.
Those skilled in the art will understand that all or part of the steps in the methods of the above embodiments may be implemented by a program, which is stored in a storage medium and includes several instructions that enable a single-chip microcomputer, a chip or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
In addition, any combination of various different implementation manners of the embodiments of the present invention is also possible, and the embodiments of the present invention should be considered as disclosed in the embodiments of the present invention as long as the combination does not depart from the spirit of the embodiments of the present invention.

Claims (10)

1. A method for determining orientation of a scene, the method comprising:
determining three-dimensional coordinates of a starting point and an end point of a preset vector on a preset sky box for pasting the panorama according to two-dimensional coordinates of the starting point and the end point on the panorama, wherein the direction of the preset vector is a specific direction, the starting point is a first preset point in an actual space, and the end point is a second preset point in the actual space;
determining a first projection vector of the preset vector on a horizontal section where the center of the preset sky box is located according to the three-dimensional coordinates of the starting point and the end point on the preset sky box;
determining a second projection vector of a position vector of a camera in a scene where the preset sky box is located on the horizontal section; and
and determining the orientation of the scene where the preset sky box is located according to the first projection vector and the second projection vector.
2. The method according to claim 1, wherein, in a case where the starting point and the ending point are both at the upper half of the preset sky box, the determining a first projection vector of the preset vector on a horizontal section where a center of the preset sky box is located includes:
determining projection coordinates of the starting point and the end point on the horizontal section according to three-dimensional coordinates of the starting point and the end point on the preset sky box; and
and determining a first projection vector of the preset vector on the horizontal section according to the projection coordinates of the starting point and the end point on the horizontal section.
3. The method of determining scene orientation of claim 1, wherein the preset sky box is a sphere sky box.
4. The method for determining the orientation of the scene according to claim 3, wherein in a case where the starting point and the ending point are both in a lower half of the preset sky box, the determining a first projection vector of the preset vector on a horizontal section where a center of the preset sky box is located includes:
determining specific projection coordinates of the starting point and the ending point on the horizontal section through a specific mapping relation according to three-dimensional coordinates of the starting point and the ending point on the preset sky box; and
and determining a first projection vector of the preset vector on the horizontal section according to the specific projection coordinates of the starting point and the end point on the horizontal section.
5. The method of determining an orientation of a scene as claimed in claim 4, wherein determining the specific projection coordinates of the start point or the end point on the horizontal section plane by the specific mapping relationship comprises:
determining a specific projection coordinate of the start point or the end point on the horizontal section by the following mapping relationship:
sᵢ = -(1 - θᵢ/π) × cos(αᵢ) × rᵢ; and
tᵢ = (1 - θᵢ/π) × sin(αᵢ) × rᵢ,
wherein i is 1 or 2; (r₁, θ₁, α₁) and (r₂, θ₂, α₂) are the three-dimensional coordinates of the starting point and the ending point on the preset sky box, respectively; and (s₁, t₁) and (s₂, t₂) are the specific projection coordinates of the starting point and the ending point on the horizontal section, respectively.
6. The method for determining the orientation of the scene according to claim 1, wherein the determining the orientation of the scene in which the preset sky box is located includes:
calculating an angle from the second projection vector to the first projection vector; and
taking the obtained angle as a deviation angle of the orientation of the scene; and/or
The starting point and the ending point of the preset vector respectively correspond to a first preset point and a second preset point in an actual space, and correspondingly, the determining method further comprises:
and determining two-dimensional coordinates of the starting point and the ending point on the panoramic image according to the three-dimensional coordinates of the first preset point and the second preset point in the actual space and the position information of the camera.
7. The method of determining scene orientation according to claim 6, wherein the position information of the camera includes displacement information, rotation information, and scaling information, and accordingly, the determining the two-dimensional coordinates of the start point and the end point on the panorama image includes:
determining a world space transformation matrix of the camera according to the displacement information, the rotation information and the zooming information of the camera;
determining a viewport matrix of the camera according to the world space transformation matrix of the camera; and
and determining two-dimensional coordinates of the starting point and the end point on the panoramic image according to the three-dimensional coordinates of the first preset point and the second preset point in the actual space and a viewport matrix of the camera.
8. A method of marking an orientation of a scene, the method comprising:
the method of determining orientation of a scene according to any one of claims 1-7, determining orientation of a scene in which a predetermined sky box is located; and
marking the determined orientation at a preset position on the panorama such that the marked orientation is always within a range of viewing angles of a camera in the scene.
9. The method of claim 8, wherein the panorama comprises a live-action area and a sky area, and the preset position is any one of positions that are centered on the center of the panorama and distributed within the live-action area of the panorama.
10. A computer-readable storage medium, characterized in that a computer program is stored thereon, which, when being executed by a processor, carries out the steps of the method for determining a scene orientation according to any one of the preceding claims 1 to 7 and/or the steps of the method for marking a scene orientation according to claim 8 or 9.
CN202110902964.5A 2021-08-06 2021-08-06 Scene orientation determining method and marking method Active CN113593052B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110902964.5A CN113593052B (en) 2021-08-06 2021-08-06 Scene orientation determining method and marking method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110902964.5A CN113593052B (en) 2021-08-06 2021-08-06 Scene orientation determining method and marking method

Publications (2)

Publication Number Publication Date
CN113593052A CN113593052A (en) 2021-11-02
CN113593052B true CN113593052B (en) 2022-04-29

Family

ID=78256275

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110902964.5A Active CN113593052B (en) 2021-08-06 2021-08-06 Scene orientation determining method and marking method

Country Status (1)

Country Link
CN (1) CN113593052B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114299809B (en) * 2021-12-30 2024-03-22 北京有竹居网络技术有限公司 Direction information display method, display device, electronic apparatus, and readable storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107545040A (en) * 2017-08-04 2018-01-05 深圳航天智慧城市系统技术研究院有限公司 A kind of method and system of the label direction in Computerized three-dimensional geographic information scene
CN111028336A (en) * 2019-11-30 2020-04-17 北京城市网邻信息技术有限公司 Scene switching method and device and storage medium
CN111247561A (en) * 2018-07-03 2020-06-05 上海亦我信息技术有限公司 Method for reconstructing three-dimensional space scene based on photographing

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100110069A1 (en) * 2008-10-31 2010-05-06 Sharp Laboratories Of America, Inc. System for rendering virtual see-through scenes
US9041743B2 (en) * 2010-11-24 2015-05-26 Aria Glassworks, Inc. System and method for presenting virtual and augmented reality scenes to a user
US20160063671A1 (en) * 2012-08-30 2016-03-03 Nokia Corporation A method and apparatus for updating a field of view in a user interface
WO2014043814A1 (en) * 2012-09-21 2014-03-27 Tamaggo Inc. Methods and apparatus for displaying and manipulating a panoramic image by tiles
US9558559B2 (en) * 2013-04-05 2017-01-31 Nokia Technologies Oy Method and apparatus for determining camera location information and/or camera pose information according to a global coordinate system
US10352717B2 (en) * 2017-02-28 2019-07-16 Google Llc Navigation application programming interface
CN108564654B (en) * 2018-04-03 2020-07-31 中德(珠海)人工智能研究院有限公司 Picture entering mode of three-dimensional large scene
CN111161350B (en) * 2019-12-18 2020-12-04 北京城市网邻信息技术有限公司 Position information and position relation determining method, position information acquiring device
CN112233214B (en) * 2020-10-15 2023-11-28 洛阳众智软件科技股份有限公司 Snow scene rendering method, device and equipment for large scene and storage medium
CN112396684A (en) * 2020-11-13 2021-02-23 贝壳技术有限公司 Ray tracing method, ray tracing device and machine-readable storage medium
CN112950759B (en) * 2021-01-28 2022-12-06 贝壳找房(北京)科技有限公司 Three-dimensional house model construction method and device based on house panoramic image

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107545040A (en) * 2017-08-04 2018-01-05 深圳航天智慧城市系统技术研究院有限公司 A kind of method and system of the label direction in Computerized three-dimensional geographic information scene
CN111247561A (en) * 2018-07-03 2020-06-05 上海亦我信息技术有限公司 Method for reconstructing three-dimensional space scene based on photographing
CN111028336A (en) * 2019-11-30 2020-04-17 北京城市网邻信息技术有限公司 Scene switching method and device and storage medium

Also Published As

Publication number Publication date
CN113593052A (en) 2021-11-02

Similar Documents

Publication Publication Date Title
CN109523471B (en) Method, system and device for converting ground coordinates and wide-angle camera picture coordinates
WO2020062434A1 (en) Static calibration method for external parameters of camera
CN108447097A (en) Depth camera scaling method, device, electronic equipment and storage medium
CN108154558B (en) Augmented reality method, device and system
CN108344401B (en) Positioning method, positioning device and computer readable storage medium
CN110176032A (en) A kind of three-dimensional rebuilding method and device
CN107843251A (en) The position and orientation estimation method of mobile robot
US8675013B1 (en) Rendering spherical space primitives in a cartesian coordinate system
CN110298924A (en) For showing the coordinate transformation method of detection information in a kind of AR system
CN111476876B (en) Three-dimensional image rendering method, device, equipment and readable storage medium
CN109120901B (en) Method for switching pictures among cameras
CN105095314A (en) Point of interest (POI) marking method, terminal, navigation server and navigation system
CN113593052B (en) Scene orientation determining method and marking method
CN115439528B (en) Method and equipment for acquiring image position information of target object
CN107679015B (en) Three-dimensional map-based real-time monitoring range simulation method for pan-tilt camera
CN110807413B (en) Target display method and related device
CN115830135A (en) Image processing method and device and electronic equipment
CN114004890A (en) Attitude determination method and apparatus, electronic device, and storage medium
CN116740716A (en) Video labeling method, video labeling device, electronic equipment and medium
CN116862997A (en) Method, device, equipment and storage medium for calculating and verifying camera calibration
CN114549766B (en) Real-time AR visualization method, device, equipment and storage medium
JP3660108B2 (en) Image storage method and machine-readable medium
CN113012032B (en) Aerial panoramic image display method capable of automatically labeling place names
CN114693820A (en) Object extraction method and device, electronic equipment and storage medium
CN113008135B (en) Method, apparatus, electronic device and medium for determining a position of a target point in space

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20220113

Address after: 100085 Floor 101 102-1, No. 35 Building, No. 2 Hospital, Xierqi West Road, Haidian District, Beijing

Applicant after: Seashell Housing (Beijing) Technology Co.,Ltd.

Address before: 101309 room 24, 62 Farm Road, Erjie village, Yangzhen, Shunyi District, Beijing

Applicant before: Beijing fangjianghu Technology Co.,Ltd.

GR01 Patent grant
GR01 Patent grant