CN113157835B - Image processing method, device and platform based on GIS platform and storage medium - Google Patents


Info

Publication number
CN113157835B
CN113157835B (application CN202110238719.9A)
Authority
CN
China
Prior art keywords
image
coordinates
texture
polygon
original
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110238719.9A
Other languages
Chinese (zh)
Other versions
CN113157835A (en)
Inventor
崔子豪
蔡少仲
胡金晖
魏俊博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Smart City Research Institute Of China Electronics Technology Group Corp
Original Assignee
Smart City Research Institute Of China Electronics Technology Group Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Smart City Research Institute Of China Electronics Technology Group Corp filed Critical Smart City Research Institute Of China Electronics Technology Group Corp
Priority to CN202110238719.9A
Publication of CN113157835A
Application granted
Publication of CN113157835B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29Geographical information databases

Landscapes

  • Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Remote Sensing (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Image Generation (AREA)

Abstract

This application relates to the technical field of image processing and provides a GIS-platform-based image processing method, device, platform, and storage medium. The image processing method comprises the following steps: acquiring an original texture image and a target texture image, where the shape into which the target texture image is to be fitted at its target position on the map is a rectangle; determining the position coordinates of the circumscribed rectangle of an original polygon from the original texture image, where the original polygon is the projected figure of the original texture image; determining the image coordinates of a texture polygon from the position coordinates of the circumscribed rectangle, where the texture polygon is the figure in the target texture image that corresponds to the original polygon; determining a projective transformation matrix from the target texture image to the original texture image from the image coordinates of the texture polygon; and updating the target texture image according to the projective transformation matrix. With this technical scheme, a texture image that fits the map more closely can be generated.

Description

Image processing method, device and platform based on GIS platform and storage medium
Technical Field
The present application belongs to the field of image processing technologies, and in particular, to an image processing method, an image processing device, an image processing platform, and a storage medium based on a GIS platform.
Background
A Geographic Information System (GIS) is an important spatial information system: a technical system that, supported by computer hardware and software, collects, stores, manages, computes, analyzes, displays, and describes geographically distributed data over all or part of the Earth's surface layer (including the atmosphere). A GIS platform can display captured terrain video or imagery on a map by texture fitting, and each GIS platform has its own internal texture-fitting mechanism. For example, when the Cesium platform fits a texture, it automatically clips the texture image; if the captured terrain video or imagery is not a vertically projected image, the texture-fitted image will be clipped and will not accurately fit the actual terrain on the map. How to generate a texture image that fits the map more closely is therefore an urgent problem to be solved.
Disclosure of Invention
The embodiments of the present application provide an image processing method, device, platform, and storage medium based on a GIS platform, capable of generating a texture image that fits the map more closely.
A first aspect of an embodiment of the present application provides an image processing method based on a GIS platform, where the image processing method includes:
acquiring an original texture image and a target texture image, where the shape into which the target texture image is to be fitted at its target position on the map is a rectangle;
determining the position coordinates of a circumscribed rectangle of an original polygon according to the original texture image, wherein the original polygon is a projected graph of the original texture image, and the position coordinates of the circumscribed rectangle refer to the coordinates of the circumscribed rectangle in a geographic coordinate system;
determining the image coordinates of a texture polygon according to the position coordinates of the circumscribed rectangle, wherein the texture polygon is a graph corresponding to an original polygon in the target texture image, and the image coordinates of the texture polygon refer to the coordinates of the texture polygon in a target texture image coordinate system;
determining a projective transformation matrix from the target texture image to the original texture image according to the image coordinates of the texture polygon;
and updating the target texture image according to the projective transformation matrix.
A second aspect of embodiments of the present application provides an image processing apparatus based on a GIS platform, the image processing apparatus including:
an image acquisition module, configured to acquire an original texture image and a target texture image, where the shape into which the target texture image is to be fitted at its target position on the map is a rectangle;
the first coordinate determination module is used for determining the position coordinates of a circumscribed rectangle of an original polygon according to the original texture image, wherein the original polygon is a projected graph of the original texture image, and the position coordinates of the circumscribed rectangle refer to the coordinates of the circumscribed rectangle in a geographic coordinate system;
a second coordinate determination module, configured to determine the image coordinates of a texture polygon according to the position coordinates of the circumscribed rectangle, where the texture polygon is the figure in the target texture image that corresponds to the original polygon, and the image coordinates of the texture polygon are its coordinates in the target texture image coordinate system;
the matrix determining module is used for determining a projection transformation matrix from the target texture image to the original texture image according to the image coordinates of the texture polygon;
and the image updating module is used for updating the target texture image according to the projective transformation matrix.
A third aspect of an embodiment of the present application provides a GIS platform, including: a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the image processing method based on the GIS platform according to the first aspect when executing the computer program.
A fourth aspect of the embodiments of the present application provides a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the GIS platform-based image processing method according to the first aspect.
A fifth aspect of the embodiments of the present application provides a computer program product, which, when running on a GIS platform, enables the GIS platform to execute the image processing method based on the GIS platform according to the first aspect.
Compared with the prior art, the embodiments of the present application have the following advantages. An original texture image and a target texture image are first acquired, and the original texture image is projected onto the map surface to obtain the original polygon; the position coordinates of the circumscribed rectangle of the original polygon, i.e., its coordinates in the geographic coordinate system, are then calculated from the coordinates of the original polygon. Because the texture polygon is the figure in the target texture image corresponding to the original polygon, the normalized coordinates of the texture polygon relative to the target texture image are numerically equal to the normalized coordinates of the original polygon relative to its circumscribed rectangle, so the image coordinates of the texture polygon can be determined from the position coordinates of the circumscribed rectangle. Since the four corner points of the texture polygon correspond to the four corner points of the original texture image, the projective transformation matrix from the target texture image to the original texture image can be determined from the image coordinates of the texture polygon. Finally, the target texture image is updated according to the projective transformation matrix; the updated target texture image can be fitted to the circumscribed rectangle of the original polygon without clipping, generating a texture image that fits the map more closely.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments or of the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a schematic flowchart of an image processing method based on a GIS platform according to an embodiment of the present application;
fig. 2 is a schematic flowchart of an image processing method based on a GIS platform according to a second embodiment of the present application;
FIG. 3 is a diagram showing a positional relationship between an original polygon and a circumscribed rectangle;
FIG. 4 is a diagram of the positional relationship of a texture polygon and a target texture image;
FIG. 5 is a diagram of the correspondence between pixels in the original texture image and the updated target texture image;
FIG. 6 is a comparison graph of the original Cesium platform image processing effect and the image processing effect of the present application on the Cesium platform;
FIG. 7 is a comparison graph of the image processing effect of the original Mapbox platform and the image processing effect of the present application on the Mapbox platform;
fig. 8 is a schematic structural diagram of an image processing apparatus based on a GIS platform according to a third embodiment of the present application;
fig. 9 is a schematic structural diagram of a GIS platform according to a fourth embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items and includes such combinations.
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather mean "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless otherwise specifically stated.
It should be understood that, the sequence numbers of the steps in this embodiment do not mean the execution sequence, and the execution sequence of each process should be determined by the function and the inherent logic of the process, and should not constitute any limitation to the implementation process of the embodiment of the present application.
In order to explain the technical solution of the present application, the following description is given by way of specific examples.
Referring to fig. 1, which is a schematic flowchart of an image processing method based on a GIS platform provided in an embodiment of the present application. The image processing method is applied to a GIS platform, and specifically may be applied to a Cesium platform or a Mapbox platform within the GIS platform. As shown in the figure, the image processing method may include the following steps:
step 101, an original texture image and a target texture image are obtained.
In the embodiment of the present application, images such as street views and driving road conditions displayed on the map are captured by a camera and then transmitted to the GIS platform, yielding the original texture image. The target texture image is the texture image to be fitted on the GIS platform; the shape into which it is to be fitted at its target position on the map is a rectangle, so the target texture image needs no clipping when fitted to that position on the map.
Specifically, the target texture image is a rectangular image created by the developer. Its size may be the same as that of the original texture image, or it may be scaled according to actual conditions. The initial pixel value of each pixel in the target texture image may be any value, which is not limited in the present application.
It should be noted that the scaling ratio in the present application is not limited.
And 102, determining the position coordinates of the circumscribed rectangle of the original polygon according to the original texture image.
In the embodiment of the present application, the original polygon is the projected figure of the original texture image. Only under vertical projection does the original texture image keep its shape after projection; in every other case, the projected image is deformed. Therefore, if during texture fitting the shape at the target position on the map is set to a rectangle (i.e., the shape of the original texture image), the target texture image will not be clipped when fitted, and no texture will be lost. By setting the shape at the target position on the map to a rectangle, the circumscribed rectangle of the projected original polygon can be obtained, and this circumscribed rectangle corresponds to the shape at the target position on the map.
Specifically, when the original texture image is projected onto the map, the position coordinates of the projected original polygon can be obtained directly. Because the original texture image is captured by a camera, it has no actual coordinates of its own; but once it is projected onto the map and fitted to it, the resulting original polygon has position coordinates, namely its coordinates in the geographic coordinate system. The position coordinates of the circumscribed rectangle of the original polygon are then calculated from the position coordinates of the original polygon. The two pairs of opposite sides of the circumscribed rectangle are parallel to the meridians and the parallels, respectively.
Illustratively, each vertex coordinate of the original polygon is first acquired, and the maximum and minimum initial values of longitude and latitude, in radians, are set as follows:
maxLng = -PI
minLng = PI
maxLat = -PI/2
minLat = PI/2
where maxLng is the maximum longitude, minLng is the minimum longitude, maxLat is the maximum latitude, minLat is the minimum latitude, and PI is the circle constant π.
Next, the final maximum and minimum longitude and latitude coordinates of the original polygon are obtained by comparing each vertex coordinate of the original polygon against the current maximum and minimum values of longitude and latitude, as follows:
Obtain the vertex coordinates (lng_i, lat_i) of the original polygon, in radians, where i = 1, 2, …, n indexes the vertices. For each vertex: if lng_i > maxLng, set maxLng = lng_i; if lng_i < minLng, set minLng = lng_i; if lat_i > maxLat, set maxLat = lat_i; if lat_i < minLat, set minLat = lat_i. After all vertex coordinates of the original polygon have been compared, the final maximum and minimum longitude and latitude coordinates are obtained.
Finally, from the computed maximum and minimum longitude and latitude coordinates, the position coordinates of the circumscribed rectangle of the original polygon are determined to be, in order, (minLng, minLat), (maxLng, minLat), (maxLng, maxLat), (minLng, maxLat).
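The min/max comparison of step 102 can be sketched as follows (a minimal sketch; the function name and the list-of-tuples representation of the polygon are assumptions for illustration):

```python
import math


def bounding_rectangle(vertices):
    """Compute the circumscribed (axis-aligned) rectangle of a polygon.

    `vertices` is a list of (lng, lat) tuples in radians. Returns the four
    corners in the order used in the text:
    (minLng, minLat), (maxLng, minLat), (maxLng, maxLat), (minLng, maxLat).
    """
    # Initialise extrema as in the text: longitudes span [-PI, PI],
    # latitudes span [-PI/2, PI/2].
    max_lng, min_lng = -math.pi, math.pi
    max_lat, min_lat = -math.pi / 2, math.pi / 2
    for lng, lat in vertices:
        if lng > max_lng:
            max_lng = lng
        if lng < min_lng:
            min_lng = lng
        if lat > max_lat:
            max_lat = lat
        if lat < min_lat:
            min_lat = lat
    return [(min_lng, min_lat), (max_lng, min_lat),
            (max_lng, max_lat), (min_lng, max_lat)]
```

The two pairs of opposite sides of the returned rectangle are parallel to the meridians and the parallels, as required by step 102.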
And 103, determining the image coordinates of the texture polygons according to the position coordinates of the circumscribed rectangle.
In the embodiment of the present application, fig. 3 shows the positional relationship between the original polygon and its circumscribed rectangle, and fig. 4 shows the positional relationship between the texture polygon and the target texture image. The texture polygon is the figure in the target texture image that corresponds to the original polygon. Because the size of the target texture image and the size of the circumscribed rectangle of the original polygon may differ, the position coordinates of the original polygon relative to the circumscribed rectangle are not equal to the coordinates of the texture polygon relative to the target texture image. However, the coordinate correspondence between the original polygon and the circumscribed rectangle is the same as that between the texture polygon and the target texture image, so the normalized coordinates of the original polygon relative to the circumscribed rectangle can be obtained, and the image coordinates of the texture polygon determined from them.
Optionally, determining the image coordinates of the texture polygon according to the position coordinates of the circumscribed rectangle includes:
acquiring a normalized coordinate of a circumscribed rectangle;
and determining the image coordinates of the texture polygon according to the normalized coordinates of the circumscribed rectangle and the position coordinates of the circumscribed rectangle.
In this embodiment of the present application, the normalized coordinates of the circumscribed rectangle may be obtained by normalizing its current coordinates; for example, the normalized coordinates of the vertices of the circumscribed rectangle may be taken as (0, 0), (1, 0), (1, 1), (0, 1). From the normalized coordinates and the position coordinates of the circumscribed rectangle, the normalized coordinates of each vertex of the original polygon can be calculated. Since the texture polygon is the image in the target texture image corresponding to the original polygon, the normalized coordinates of the texture polygon can be determined from the normalized coordinates of the vertices of the original polygon, and the image coordinates of the texture polygon then follow from the normalized coordinates of the texture polygon and the size of the target texture image.
Optionally, determining the image coordinates of the texture polygon according to the normalized coordinates of the circumscribed rectangle and the position coordinates of the circumscribed rectangle includes:
according to the normalized coordinates of the circumscribed rectangle and the position coordinates of the circumscribed rectangle, performing normalization processing on the position coordinates of the original polygon to obtain normalized coordinates of the original polygon;
determining the normalized coordinates of the original polygon as the normalized coordinates of the texture polygon;
and calculating the image coordinates of the texture polygon according to the normalized coordinates of the texture polygon.
Exemplarily, assume the normalized coordinates of the vertices of the circumscribed rectangle are (0, 0), (1, 0), (1, 1), (0, 1), and the vertex coordinates of the original polygon are (lng_i, lat_i), where i = 1, 2, …, n indexes the vertices. The normalized coordinates (x_i, y_i) of the vertices of the original polygon are calculated as follows:
x_i = (lng_i - minLng) / (maxLng - minLng)
y_i = (lat_i - minLat) / (maxLat - minLat)
where i = 1, 2, …, n indexes the vertices.
In the reference frame of the circumscribed rectangle of the original polygon, the normalized coordinates of the vertices of the original polygon obtained above are numerically equal to the normalized coordinates of the vertices of the texture polygon on the target texture image; they are therefore taken as the normalized coordinates of the vertices of the texture polygon. The polygon formed by these vertices on the target texture image is the texture polygon, as shown in fig. 4. Once the normalized coordinates of the texture polygon are obtained, they are multiplied by the width and height of the target texture image to obtain the image coordinates of the texture polygon, i.e., its coordinates in the target texture image coordinate system.
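The normalize-then-scale computation of step 103 can be sketched as follows (the function name and argument layout are illustrative assumptions):

```python
def texture_polygon_coords(vertices, min_lng, max_lng, min_lat, max_lat,
                           width, height):
    """Map original-polygon vertices to texture-polygon image coordinates.

    Each vertex (lng, lat), in radians, is normalized against the
    circumscribed rectangle [min_lng, max_lng] x [min_lat, max_lat]
    (the two formulas above), then scaled by the target texture image
    size (width, height) to give image coordinates.
    """
    coords = []
    for lng, lat in vertices:
        x = (lng - min_lng) / (max_lng - min_lng)  # normalized to [0, 1]
        y = (lat - min_lat) / (max_lat - min_lat)
        coords.append((x * width, y * height))     # scale to image size
    return coords
```

The same routine applies to non-vertex points of the original polygon, since the normalization is pointwise.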
It should be noted that, from the normalized coordinates of the vertices of the circumscribed rectangle obtained in step 102, the normalized coordinates of each vertex of the original polygon can be calculated, and thus the normalized coordinates of each vertex of the texture polygon. If the position coordinates of points of the texture polygon other than its vertices are needed, the normalized coordinates of the corresponding non-vertex points of the original polygon can be obtained first, yielding the normalized coordinates of those other points of the texture polygon.
It should further be noted that the image coordinates of the texture polygon can be obtained by the normalization above because the coordinate correspondence between the original polygon and its circumscribed rectangle is the same as that between the texture polygon and the target texture image. However, this does not yet give a fixed mapping between the original texture image and the target texture image; that is, the image coordinates of the corresponding pixel in the original texture image cannot be obtained from the image coordinates of a pixel in the target texture image. The image coordinates of corresponding points between the target texture image and the original texture image are therefore obtained by computing the fixed mapping between the two images.
And 104, determining a projection transformation matrix from the target texture image to the original texture image according to the image coordinates of the texture polygon.
In the embodiment of the present application, the fixed mapping relationship between the original texture image and the target texture image may be represented by a projective transformation matrix from the target texture image to the original texture image. Wherein, the projective transformation matrix can be calculated by solving the homography matrix.
Illustratively, taking the solution of the homography matrix as an example: the homography matrix is a 3 × 3 matrix, and the original texture image is obtained by multiplying the target texture image by the homography matrix. The homography matrix can be represented as follows:
        | h11  h12  h13 |
H_ab =  | h21  h22  h23 |
        | h31  h32  h33 |
the transformation relationship between the target texture image and the original texture image may be specifically expressed as:
p_b = H_ab · p_a

where p_a is the position coordinate of a pixel in the target texture image and p_b is the position coordinate of the corresponding pixel in the original texture image.
It should be noted that the projective transformation matrix may be calculated by solving a homography matrix, but is not limited to this; other methods of solving a projective transformation matrix may also be used.
Optionally, determining a projective transformation matrix from the target texture image to the original texture image according to the image coordinates of the texture polygon comprises:
acquiring image coordinates of M first pixel points in a texture polygon, wherein M is an integer larger than 3;
acquiring image coordinates of M second pixel points in the original texture image, wherein the second pixel points are pixel points corresponding to the first pixel points in the original texture image, and the image coordinates of the second pixel points refer to the coordinates of the second pixel points in an original texture image coordinate system;
and determining a projection transformation matrix from the target texture image to the original texture image according to the image coordinates of the M first pixel points and the image coordinates of the M second pixel points.
In the embodiment of the present application, the homography matrix H_ab has 9 unknown parameters in total; but because a homography matrix is defined only up to scale, h33 is usually set to 1, leaving 8 unknowns. At least 4 pairs of corresponding points between the target texture image and the original texture image are therefore needed to solve the homography matrix, i.e., the projective transformation matrix.
Specifically, the image coordinates of the M first pixels in the texture polygon may be obtained by first acquiring the image coordinates of the four vertices of the texture polygon and the image coordinates of the corresponding points in the original texture image, and then computing the projective transformation matrix from the target texture image to the original texture image from these four pairs of points using the calculation method above.
It should be noted that the coordinate point pairs used to calculate the projective transformation matrix are not limited to the four vertices of the original texture image; the image coordinates of distinct object points in the original texture image, together with the image coordinates of the corresponding points in the texture polygon, may also be used. The homography matrix H_ab may be solved directly from the corresponding coordinates of four or more pairs of points; the number of point pairs and the image coordinates of the corresponding points used to solve it are not limited in the present application.
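The solution of the homography from four point pairs described in step 104 can be sketched as follows (a pure-Python sketch of the standard direct linear transform with h33 fixed to 1; a production implementation would typically call a library routine such as OpenCV's getPerspectiveTransform instead):

```python
def solve_homography(src_pts, dst_pts):
    """Solve H (with h33 = 1) mapping src_pts to dst_pts from 4 point pairs.

    For each correspondence (x, y) -> (u, v), the projective relation
    u = (h11*x + h12*y + h13) / (h31*x + h32*y + 1) (and likewise for v)
    contributes two rows to an 8x8 linear system in the 8 unknowns, which
    is solved here by Gaussian elimination with partial pivoting.
    Returns H as a 3x3 nested list.
    """
    A, b = [], []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    n = 8
    # Forward elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back substitution.
    h = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(A[r][c] * h[c] for c in range(r + 1, n))
        h[r] = (b[r] - s) / A[r][r]
    return [h[0:3], h[3:6], h[6:8] + [1.0]]
```

With the four vertices of the texture polygon as src_pts and the four corners of the original texture image as dst_pts, the returned matrix is the projective transformation from the target texture image to the original texture image.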
And 105, updating the target texture image according to the projective transformation matrix.
In this embodiment of the present application, updating the target texture image may be implemented by updating the pixel values of the pixel points in the target texture image. First, the image coordinates of each pixel point in the target texture image are obtained, and the image coordinates of the corresponding pixel point in the original texture image are determined according to the projective transformation matrix obtained in step 104. If the corresponding pixel point lies inside the original texture image, its pixel value is read directly from the original texture image and assigned to the pixel point in the target texture image, thereby updating that pixel point. If the corresponding pixel point does not lie inside the original texture image, i.e. it falls outside the range of the original texture image, the pixel point in the target texture image is set to a transparent pixel point. The pixel values of all pixel points in the target texture image are updated in this way; when all of them have been updated, the update of the target texture image is complete and a new target texture image is obtained, as shown in fig. 5.
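The per-pixel update described above can be sketched as follows. This is a minimal plain-Python illustration: the image representation (2D lists of RGBA tuples), the nearest-pixel lookup, and the function name are assumptions for demonstration, not the platform's actual API:

```python
def update_target(target, source, H):
    """Fill each target pixel from its homography-mapped source pixel.
    target/source: 2D lists of RGBA tuples; H: 3x3 projective matrix mapping
    normalized target coordinates to normalized source coordinates."""
    th, tw = len(target), len(target[0])
    sh, sw = len(source), len(source[0])
    TRANSPARENT = (0, 0, 0, 0)
    for row in range(th):
        for col in range(tw):
            x, y = col / tw, row / th                       # normalize target coords
            d = H[2][0] * x + H[2][1] * y + H[2][2]         # homogeneous divisor
            x0 = (H[0][0] * x + H[0][1] * y + H[0][2]) / d  # normalized source coords
            y0 = (H[1][0] * x + H[1][1] * y + H[1][2]) / d
            if 0.0 <= x0 <= 1.0 and 0.0 <= y0 <= 1.0:       # inside original image
                sc = min(int(x0 * sw), sw - 1)
                sr = min(int(y0 * sh), sh - 1)
                target[row][col] = source[sr][sc]           # copy pixel value
            else:
                target[row][col] = TRANSPARENT              # out of range: transparent
    return target
```

With the identity matrix as H, every target pixel is filled from the same relative position of the source image; the second embodiment replaces the nearest-pixel lookup with bilinear interpolation.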
As can be seen from the above, in the embodiment of the present application, the original texture image and the target texture image are obtained, the original texture image is projected onto the map surface to obtain the original polygon, and the position coordinates of the circumscribed rectangle of the original polygon are calculated from the coordinates of the original polygon, where the position coordinates are the coordinates of the circumscribed rectangle in the geographic coordinate system. Because the texture polygon is the graph corresponding to the original polygon in the target texture image, the normalized coordinates of the texture polygon relative to the target texture image are numerically equal to the normalized coordinates of the original polygon relative to its circumscribed rectangle; the position coordinates of the texture polygon can therefore be determined from the position coordinates of the circumscribed rectangle. Since the four corner coordinates of the texture polygon correspond to the four corner coordinates of the original texture image, the projective transformation matrix from the target texture image to the original texture image can be determined from the position coordinates of the texture polygon. Finally, the target texture image is updated according to the projective transformation matrix, and the updated target texture image can be attached to the circumscribed rectangle of the original polygon without cutting, so as to generate a texture image that fits the map more closely.
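The normalized-coordinate equality exploited above can be sketched in plain Python. The function name, the (lon, lat) tuple layout, and the assumption that image rows grow downward from the top edge of the circumscribed rectangle are illustrative choices, not taken from the patent:

```python
def texture_polygon_image_coords(polygon, bbox, tgt_size):
    """polygon: list of (lon, lat) geographic coordinates of the original polygon;
    bbox: (min_lon, min_lat, max_lon, max_lat) of its circumscribed rectangle;
    tgt_size: (width, height) of the target texture image.
    Returns the image coordinates of the texture polygon."""
    min_lon, min_lat, max_lon, max_lat = bbox
    w, h = tgt_size
    coords = []
    for lon, lat in polygon:
        x = (lon - min_lon) / (max_lon - min_lon)  # normalized against the rectangle
        y = (max_lat - lat) / (max_lat - min_lat)  # assumed: image row grows downward
        coords.append((x * w, y * h))              # scale to target image size
    return coords
```

Because the normalization is dimensionless, the same (x, y) values locate the texture polygon in the target texture image regardless of the rectangle's geographic extent.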
Referring to fig. 2, a schematic flow chart of an image processing method based on a GIS platform provided in the second embodiment of the present application is shown, where the image processing method is applied to the GIS platform, and as shown in the figure, the image processing method may include the following steps:
step 201, obtaining an original texture image and a target texture image.
Step 201 of this embodiment is similar to step 101 of the previous embodiment, and can refer to each other, which is not described herein again.
Step 202, according to the original texture image, determining the position coordinates of the circumscribed rectangle of the original polygon.
Step 202 of this embodiment is similar to step 102 of the previous embodiment, and can be referred to each other, which is not described herein again.
And step 203, determining the image coordinates of the texture polygon according to the position coordinates of the circumscribed rectangle.
Step 203 of this embodiment is similar to step 103 of the previous embodiment, and can refer to each other, which is not described herein again.
And step 204, determining a projective transformation matrix from the target texture image to the original texture image according to the image coordinates of the texture polygon.
Step 204 of this embodiment is similar to step 104 of the previous embodiment, and can refer to each other, which is not described herein again.
And step 205, determining all second pixel points corresponding to all pixel points in the target texture image according to the projection transformation matrix.
In this embodiment, all the second pixel points refer to the pixel points corresponding to all the pixel points in the target texture image; they include pixel points inside the original texture image and pixel points outside the original texture image. Determining all second pixel points corresponding to all pixel points in the target texture image means determining the image coordinates of all second pixel points. Since the size of the target texture image may be the same as or different from that of the original texture image, the normalized coordinates of all pixel points in the target texture image are first obtained, the normalized coordinates of the second pixel points are then obtained from the projective transformation matrix, and whether the pixel point corresponding to each pixel point in the target texture image lies within the range of the original texture image can be determined from the normalized coordinates of the second pixel points.
And step 206, acquiring image coordinates of all second pixel points.
In the embodiment of the present application, obtaining the image coordinates of all the second pixel points includes obtaining the image coordinates of the second pixel points in the original texture image and the image coordinates of the second pixel points outside the original texture image.
Illustratively, the image coordinates of a pixel point in the target texture image are obtained as (col_i, row_i). First, the image coordinates of the pixel point in the target texture image are normalized, which may be expressed as:

x_i = col_i / width

y_i = row_i / height

where x_i is the normalized abscissa of the pixel point in the target texture image, y_i is the normalized ordinate of the pixel point in the target texture image, width is the width of the target texture image, and height is the height of the target texture image; here the size of the target texture image is taken to be the same as that of the original texture image. In this application, a certain proportional relationship may also exist between the size of the target texture image and the size of the original texture image, and this proportional relationship is not limited in the present application.
Next, according to the projective transformation matrix H_ab obtained in step 204, the normalized coordinates of the second pixel point corresponding to the pixel point (col_i, row_i) in the target texture image are calculated, which may be expressed as:

(x_0i, y_0i, z_0i)^T = H_ab · (x_i, y_i, 1)^T

where the result is divided by its third component, so that (x_0i, y_0i) are the normalized coordinates of the second pixel point and z_0i is 1.
After the normalized coordinates of the second pixel point are obtained, the image coordinates of the second pixel point on the original texture image are calculated according to the size of the original texture image as follows:

col_0i = x_0i × width

row_0i = y_0i × height

where (col_0i, row_0i) are the image coordinates of the second pixel point on the original texture image, and width and height here are the width and height of the original texture image. The second pixel point can be located according to these image coordinates, so that the image coordinates of the second pixel points corresponding to all the pixel points in the target texture image can be obtained.
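Steps 205 and 206 together form a normalize, transform, denormalize pipeline, which can be sketched as follows. The function name and the (width, height) tuple convention are illustrative assumptions:

```python
def target_to_source(col, row, H, tgt_size, src_size):
    """Map a target-image pixel (col, row) to original-image coordinates via H.
    tgt_size/src_size: (width, height) tuples. Returns float (col_0, row_0)."""
    tw, th = tgt_size
    sw, sh = src_size
    x, y = col / tw, row / th                       # step 1: normalize target coords
    d = H[2][0] * x + H[2][1] * y + H[2][2]         # homogeneous divisor, forces z = 1
    x0 = (H[0][0] * x + H[0][1] * y + H[0][2]) / d  # step 2: apply H_ab
    y0 = (H[1][0] * x + H[1][1] * y + H[1][2]) / d
    return x0 * sw, y0 * sh                         # step 3: back to source pixel coords
```

Note that the returned coordinates are generally non-integer, which is why the pixel value is later obtained by interpolation rather than a direct lookup.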
And step 207, updating the pixel value of each pixel point in the target texture image according to the image coordinates of all the second pixel points.
In the embodiment of the present application, the image coordinates of the second pixel points refer to the coordinates of the second pixel points in the original texture image. Since, among the pixel points of the target texture image, only the second pixel points corresponding to the first pixel points inside the texture polygon lie within the original texture image, it is necessary to determine whether each pixel point in the target texture image belongs to the texture polygon. Specifically, the image coordinates of the pixel point corresponding to each pixel point in the target texture image may be calculated, and whether each pixel point in the target texture image belongs to the texture polygon is determined accordingly.
Optionally, updating the pixel value of each pixel point in the target texture image according to the image coordinates of all the second pixel points, including:
if the image coordinates of the second pixel points are not in the original texture image, updating the pixel points corresponding to the second pixel points in the target texture image into transparent pixel points;
and if the image coordinates of the second pixel points are in the original texture image, updating the pixel values of the pixel points corresponding to the second pixel points in the target texture image into the pixel values of the second pixel points.
In the embodiment of the application, to judge whether the image coordinates of a second pixel point lie within the original image, the normalized coordinates of the image coordinates of the second pixel point may first be calculated. If the normalized coordinates lie between (0, 0) and (1, 1), the image coordinates of the second pixel point are within the original texture image; if the normalized coordinates lie outside the range (0, 0) to (1, 1), for example if they are (2, 3), the image coordinates of the second pixel point are outside the original texture image.
Specifically, if the image coordinate of the second pixel point is not in the original texture image, it indicates that the first pixel point in the target texture image corresponding to the second pixel point is not in the texture polygon, and at this time, the pixel point in the target texture image corresponding to the second pixel point may be updated to be a transparent pixel point.
If the image coordinates of the second pixel point are in the original texture image, this indicates that the first pixel point in the target texture image corresponding to the second pixel point is in the texture polygon; at this time, the pixel value of the pixel point corresponding to the second pixel point in the target texture image can be updated to the pixel value of the second pixel point.
Illustratively, the pixel value of the second pixel point may be obtained by an interpolation algorithm. Since the abscissa and ordinate of the obtained image coordinates of the second pixel point may not be integers (for example, the obtained coordinates may be (0.5, 0.7)), the pixel value cannot be read directly from the original texture image and needs to be calculated by interpolation. First, the image coordinates of the second pixel point on the original image are obtained, together with the four pixel points surrounding it; the pixel values of these four pixel points are read directly, and the pixel value of the second pixel point is calculated by interpolating among them.
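The four-neighbor interpolation described above corresponds to standard bilinear interpolation, sketched below for a single-channel image. The 2D-list representation and function name are assumptions for illustration:

```python
def bilinear(img, col, row):
    """Bilinear interpolation of a grayscale image (2D list) at float (col, row)."""
    h, w = len(img), len(img[0])
    c0, r0 = int(col), int(row)                  # top-left of the 4-pixel neighborhood
    c1, r1 = min(c0 + 1, w - 1), min(r0 + 1, h - 1)  # clamp at the image border
    fc, fr = col - c0, row - r0                  # fractional offsets in [0, 1)
    top = img[r0][c0] * (1 - fc) + img[r0][c1] * fc  # blend along the top row
    bot = img[r1][c0] * (1 - fc) + img[r1][c1] * fc  # blend along the bottom row
    return top * (1 - fr) + bot * fr             # blend the two rows vertically
```

For a multi-channel image the same weights would be applied to each channel independently.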
And updating the pixel value of each pixel point in the target texture image according to the method until all the pixel points are updated, and acquiring the updated target texture image as shown in fig. 5.
In the embodiment of the present application, fig. 6 is a comparison of the original Cesium platform image processing effect and the image processing effect of the present application on the Cesium platform. Applying the updated target texture image to the Cesium platform for texture attachment yields the image processing effect of the present application on the Cesium platform shown in fig. 6. As can be seen from the ground-object attachment positions and road connections in the original Cesium platform result, the texture attachment mechanism of the original Cesium platform cannot control the attachment position of the texture and cannot achieve the goal of virtual-real fusion.
In the embodiment of the present application, fig. 7 is a comparison of the original Mapbox platform image processing effect and the image processing effect of the present application on the Mapbox platform. Applying the updated target texture image to the Mapbox platform for texture attachment yields the image processing effect of the present application on the Mapbox platform shown in fig. 7. Compared with the original Mapbox platform image processing method (that is, attaching the texture directly using the original texture image), the image processing method of the present application eliminates the deformation that occurs near the diagonal in the original Mapbox platform image processing.
It should be noted that the pixel value of the second pixel point in the original texture image may be obtained through any other interpolation algorithm or an existing algorithm for obtaining the pixel value of the pixel point, which is not limited in this application.
Compared with the first embodiment, in which the pixel value of a corresponding pixel point is obtained directly from the original texture image, this embodiment also considers the case where the horizontal and vertical coordinates of a pixel point are not integers, in which case the pixel value cannot be obtained directly; a bilinear interpolation algorithm is therefore used to obtain the pixel value of the pixel point. The pixel value obtained by bilinear interpolation is more accurate and more favorable for updating the target texture image, so that the updated target texture image fits the map better. In addition, the image processing method eliminates the differences caused by the different projection mechanisms of different platforms without modifying the platforms' underlying code, achieving the goal of controlling the attachment position of the texture image.
Referring to fig. 8, a schematic structural diagram of an image processing device based on a GIS platform according to a third embodiment of the present application is shown, and for convenience of description, only the parts related to the third embodiment of the present application are shown, where the image processing device based on a GIS platform may specifically include the following modules:
an image obtaining module 801, configured to obtain an original texture image and a target texture image, where a shape to be attached of a position to be attached of the target texture image on a map is a rectangle;
a first coordinate determining module 802, configured to determine, according to the original texture image, position coordinates of a circumscribed rectangle of an original polygon, where the original polygon is a projected graph of the original texture image, and the position coordinates of the circumscribed rectangle refer to coordinates of the circumscribed rectangle in a geographic coordinate system;
a second coordinate determining module 803, configured to determine a position coordinate of a texture polygon according to a position coordinate of a circumscribed rectangle, where the texture polygon is a graph corresponding to an original polygon in a target texture image;
a matrix determining module 804, configured to determine a projective transformation matrix from the target texture image to the original texture image according to the position coordinates of the texture polygon;
and an image updating module 805, configured to update the target texture image according to the projective transformation matrix.
In this embodiment of the application, the second coordinate determination module 803 may specifically include the following sub-modules:
the rectangle normalization submodule is used for acquiring the normalization coordinate of the circumscribed rectangle;
and the coordinate determination submodule is used for determining the image coordinates of the texture polygon according to the normalized coordinates of the circumscribed rectangle and the position coordinates of the circumscribed rectangle.
Alternatively, the coordinate determination submodule may include the following units:
the polygon normalization unit is used for normalizing the position coordinates of the original polygon according to the normalization coordinates of the circumscribed rectangle and the position coordinates of the circumscribed rectangle to obtain the normalization coordinates of the original polygon;
the texture determining unit is used for determining the normalized coordinates of the original polygon as the normalized coordinates of the texture polygon;
and the coordinate calculation unit is used for calculating the image coordinates of the texture polygons according to the normalized coordinates of the texture polygons.
In this embodiment, the matrix determining module 804 may specifically include the following sub-modules:
the first coordinate obtaining submodule is used for obtaining image coordinates of M first pixel points in the texture polygon, and M is an integer larger than 3;
the second coordinate acquisition submodule is used for acquiring image coordinates of M second pixel points in the original texture image, wherein the second pixel points are pixel points corresponding to the first pixel points in the original texture image, and the image coordinates of the second pixel points refer to coordinates of the second pixel points in an original texture image coordinate system;
and the matrix determining submodule is used for determining a projection transformation matrix from the target texture image to the original texture image according to the image coordinates of the M first pixel points and the image coordinates of the M second pixel points.
In this embodiment, the image update module 805 may specifically include the following sub-modules:
the pixel point determining submodule is used for determining all second pixel points corresponding to all pixel points in the target texture image according to the projection transformation matrix;
the third coordinate acquisition submodule is used for acquiring the image coordinates of all the second pixel points;
and the pixel value updating submodule is used for updating the pixel value of each pixel point in the target texture image according to the image coordinates of all the second pixel points.
Optionally, the pixel value update sub-module may specifically include the following units:
the transparent updating unit is used for updating the pixel point corresponding to the second pixel point in the target texture image into a transparent pixel point if the image coordinates of the second pixel point are not in the original texture image;
and the pixel updating unit is used for updating the pixel value of the pixel point corresponding to the second pixel point in the target texture image into the pixel value of the second pixel point if the image coordinates of the second pixel point are in the original texture image.
The image processing device based on the GIS platform provided in the embodiment of the present application can be applied to the foregoing method embodiment, and for details, reference is made to the description of the foregoing method embodiment, which is not described herein again.
Fig. 9 is a schematic structural diagram of a GIS platform according to a fourth embodiment of the present application. As shown in fig. 9, the GIS platform 900 of this embodiment includes: at least one processor 910 (only one shown in fig. 9), a memory 920, and a computer program 921 stored in the memory 920 and executable on the at least one processor 910, wherein the processor 910 implements the steps in any of the various image processing method embodiments described above when executing the computer program 921.
The GIS platform 900 may be a platform that integrates functions such as map editing, querying, positioning, zooming in, zooming out, network analysis, path analysis, and equivalence analysis. The GIS platform may include, but is not limited to, a processor 910 and a memory 920. Those skilled in the art will appreciate that fig. 9 is merely an example of the GIS platform 900 and does not constitute a limitation of the GIS platform 900, which may include more or fewer components than those shown, may combine some components, or may use different components; for example, the GIS platform may further include input and output devices, network access devices, etc.
The processor 910 may be a Central Processing Unit (CPU); the processor 910 may also be another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 920 may be an internal storage unit of the GIS platform 900 in some embodiments, such as a hard disk or a memory of the GIS platform 900. The memory 920 may also be an external storage device of the GIS platform 900 in other embodiments, for example, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), or the like provided on the GIS platform 900. Further, the memory 920 may also include both an internal storage unit and an external storage device of the GIS platform 900. The memory 920 is used for storing an operating system, an application program, a boot loader (BootLoader), data, and other programs, such as the program code of the computer programs. The memory 920 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed device/GIS platform and method may be implemented in other ways. For example, the above-described device/GIS platform embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and other divisions may be implemented in practice, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated module/unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the flow in the methods of the embodiments described above can be realized by a computer program, which can be stored in a computer-readable storage medium and which, when executed by a processor, realizes the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, Read-Only Memory (ROM), Random Access Memory (RAM), electrical carrier signals, telecommunications signals, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be subject to appropriate increase or decrease as required by legislation and patent practice in jurisdictions; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunications signals, as required by legislation and patent practice.
An embodiment of the present application further provides a computer program product; when the computer program product runs on a GIS platform, the GIS platform, when executing it, implements the steps in the method embodiments described above.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same. Although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the embodiments of the present application, and they should be construed as being included in the present application.

Claims (8)

1. An image processing method based on a GIS platform is characterized by comprising the following steps:
acquiring an original texture image and a target texture image, wherein the target texture image is rectangular, its shape being the same as the shape of the position to be attached on a map, and the initial pixel value of each pixel point in the target texture image is an arbitrary value;
determining the position coordinates of a circumscribed rectangle of an original polygon according to the original texture image, wherein the original polygon is a projected graph of the original texture image, and the position coordinates of the circumscribed rectangle refer to the coordinates of the circumscribed rectangle in a geographic coordinate system;
determining the image coordinates of a texture polygon according to the position coordinates of the circumscribed rectangle, wherein the texture polygon is a graph corresponding to an original polygon in the target texture image, and the image coordinates of the texture polygon refer to the coordinates of the texture polygon in a target texture image coordinate system;
determining a projective transformation matrix from the target texture image to the original texture image according to the image coordinates of the texture polygon;
updating the target texture image according to the projective transformation matrix;
determining the image coordinates of the texture polygon according to the position coordinates of the circumscribed rectangle, comprising:
determining the coordinate corresponding relation between the original polygon and the circumscribed rectangle according to the position coordinates of the circumscribed rectangle and the position coordinates of the original polygon;
determining the coordinate corresponding relation between the texture polygon and the target texture image, based on the coordinate corresponding relation between the original polygon and the circumscribed rectangle being the same as that between the texture polygon and the target texture image;
determining the image coordinates of the texture polygon in the target texture image according to the coordinate corresponding relation between the texture polygon and the target texture image;
the determining a projective transformation matrix from the target texture image to the original texture image according to the image coordinates of the texture polygon includes:
acquiring image coordinates of M first pixel points in the texture polygon, wherein M is an integer larger than 3;
acquiring image coordinates of M second pixel points in the original texture image, wherein the second pixel points are pixel points corresponding to the first pixel points in the original texture image, and the image coordinates of the second pixel points refer to coordinates of the second pixel points in an original texture image coordinate system;
and determining a projection transformation matrix from the target texture image to the original texture image according to the image coordinates of the M first pixel points and the image coordinates of the M second pixel points.
2. The image processing method according to claim 1, wherein said determining image coordinates of texture polygons from position coordinates of the circumscribed rectangle comprises:
acquiring a normalized coordinate of a circumscribed rectangle;
and determining the image coordinates of the texture polygon according to the normalized coordinates of the circumscribed rectangle and the position coordinates of the circumscribed rectangle.
3. The image processing method of claim 2, wherein said determining the image coordinates of the texture polygon according to the normalized coordinates of the circumscribed rectangle and the position coordinates of the circumscribed rectangle comprises:
according to the normalized coordinates of the circumscribed rectangle and the position coordinates of the circumscribed rectangle, performing normalization processing on the position coordinates of the original polygon to obtain normalized coordinates of the original polygon;
determining the normalized coordinates of the original polygon as the normalized coordinates of the texture polygon;
and calculating the image coordinates of the texture polygon according to the normalized coordinates of the texture polygon.
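Claims 2 and 3 describe mapping geographic position coordinates into target-image coordinates by normalizing against the circumscribed rectangle. A minimal sketch of that normalization — function and parameter names are illustrative, and it assumes the circumscribed rectangle spans the whole target image, with the image y axis flipped relative to latitude:

```python
def polygon_image_coords(polygon_geo, rect_min, rect_max, img_w, img_h):
    """Map geographic polygon vertices to pixel coordinates in the target
    texture image.

    polygon_geo: list of (lon, lat) vertices of the original polygon.
    rect_min, rect_max: (lon, lat) of the circumscribed rectangle's
    lower-left and upper-right corners (its position coordinates).
    img_w, img_h: target texture image size in pixels.

    Normalizing against the rectangle maps it onto the unit square;
    because the rectangle corresponds to the whole target image, the
    same normalized coordinates locate the texture polygon in the image.
    """
    (x0, y0), (x1, y1) = rect_min, rect_max
    coords = []
    for lon, lat in polygon_geo:
        nx = (lon - x0) / (x1 - x0)  # normalized x in [0, 1]
        ny = (lat - y0) / (y1 - y0)  # normalized y in [0, 1]
        # image rows grow downward, so flip the y axis
        coords.append((nx * (img_w - 1), (1.0 - ny) * (img_h - 1)))
    return coords
```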
4. The image processing method of any of claims 1 to 3, wherein said updating the target texture image according to the projective transformation matrix comprises:
determining all second pixel points corresponding to all pixel points in the target texture image according to the projective transformation matrix;
acquiring image coordinates of all second pixel points;
and updating the pixel value of each pixel point in the target texture image according to the image coordinates of all the second pixel points.
5. The image processing method according to claim 4, wherein the updating the pixel value of each pixel point in the target texture image according to the image coordinates of all the second pixel points comprises:
if the image coordinate of the second pixel point is not in the original texture image, updating the pixel point corresponding to the second pixel point in the target texture image into a transparent pixel point;
and if the image coordinate of the second pixel point is in the original texture image, updating the pixel value of the pixel point corresponding to the second pixel point in the target texture image into the pixel value of the second pixel point.
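Claims 4 and 5 describe a backward warp: each pixel of the target texture image is mapped through the projective transformation matrix into the original texture image, and pixels whose mapped coordinates fall outside the original image become transparent. A minimal sketch under those assumptions — nearest-neighbour sampling and RGBA output are illustrative choices, not specified by the patent:

```python
import numpy as np

def update_target_texture(original, H, out_h, out_w):
    """Backward-map each target pixel through H into the original image.

    original: (h, w, 3) uint8 array (the original texture image).
    H: 3x3 projective matrix from target-image to original-image coords.
    Returns an (out_h, out_w, 4) RGBA array: pixels whose second pixel
    point lies outside the original image are made fully transparent.
    """
    h, w = original.shape[:2]
    out = np.zeros((out_h, out_w, 4), dtype=np.uint8)  # alpha starts at 0
    for y in range(out_h):
        for x in range(out_w):
            u, v, s = H @ np.array([x, y, 1.0])
            u, v = u / s, v / s                       # dehomogenize
            ui, vi = int(round(u)), int(round(v))     # nearest neighbour
            if 0 <= ui < w and 0 <= vi < h:
                out[y, x, :3] = original[vi, ui]
                out[y, x, 3] = 255                    # opaque
            # else: alpha stays 0 -> transparent pixel point
    return out
```

A production implementation would vectorize this loop or delegate to a GPU texture sampler; the per-pixel loop is kept here only to mirror the claim steps.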
6. An image processing apparatus based on a GIS platform, the image processing apparatus comprising:
the image obtaining module is used for obtaining an original texture image and a target texture image, wherein the target texture image is rectangular, its shape is the same as the shape of the position on the map to which it is to be attached, and the initial pixel value of each pixel point in the target texture image is an arbitrary value;
the first coordinate determination module is used for determining the position coordinates of a circumscribed rectangle of an original polygon according to the original texture image, wherein the original polygon is a projected graph of the original texture image, and the position coordinates of the circumscribed rectangle refer to the coordinates of the circumscribed rectangle in a geographic coordinate system;
the second coordinate determination module is used for determining the image coordinates of a texture polygon according to the position coordinates of the circumscribed rectangle, wherein the texture polygon is a graph corresponding to the original polygon in the target texture image, and the image coordinates of the texture polygon refer to the coordinates of the texture polygon in a target texture image coordinate system;
the matrix determining module is used for determining a projective transformation matrix from the target texture image to the original texture image according to the image coordinates of the texture polygon;
the image updating module is used for updating the target texture image according to the projective transformation matrix;
the second coordinate determination module is specifically further configured to:
determining the coordinate corresponding relation between the original polygon and the circumscribed rectangle according to the position coordinates of the circumscribed rectangle and the position coordinates of the original polygon;
determining the coordinate corresponding relation between the texture polygon and the target texture image, wherein the coordinate corresponding relation between the texture polygon and the target texture image is the same as the coordinate corresponding relation between the original polygon and the circumscribed rectangle;
determining the image coordinates of the texture polygon in the target texture image according to the coordinate corresponding relation between the texture polygon and the target texture image;
the matrix determination module includes:
the first coordinate obtaining submodule is used for obtaining image coordinates of M first pixel points in the texture polygon, and M is an integer larger than 3;
a second coordinate obtaining submodule for obtaining image coordinates of M second pixel points in the original texture image, where the second pixel points are pixel points in the original texture image corresponding to the first pixel points, and the image coordinates of the second pixel points refer to coordinates of the second pixel points in an original texture image coordinate system;
and the matrix determining submodule is used for determining a projection transformation matrix from the target texture image to the original texture image according to the image coordinates of the M first pixel points and the image coordinates of the M second pixel points.
7. A GIS platform comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the method of any of claims 1 to 5 when executing the computer program.
8. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 5.
CN202110238719.9A 2021-03-04 2021-03-04 Image processing method, device and platform based on GIS platform and storage medium Active CN113157835B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110238719.9A CN113157835B (en) 2021-03-04 2021-03-04 Image processing method, device and platform based on GIS platform and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110238719.9A CN113157835B (en) 2021-03-04 2021-03-04 Image processing method, device and platform based on GIS platform and storage medium

Publications (2)

Publication Number Publication Date
CN113157835A CN113157835A (en) 2021-07-23
CN113157835B true CN113157835B (en) 2022-10-18

Family

ID=76884143

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110238719.9A Active CN113157835B (en) 2021-03-04 2021-03-04 Image processing method, device and platform based on GIS platform and storage medium

Country Status (1)

Country Link
CN (1) CN113157835B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108062784A (en) * 2018-02-05 2018-05-22 深圳市易尚展示股份有限公司 Threedimensional model texture mapping conversion method and device

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108062784A (en) * 2018-02-05 2018-05-22 深圳市易尚展示股份有限公司 Threedimensional model texture mapping conversion method and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
A rapid updating method for disaster-scene imagery in a three-dimensional geographic environment; Cao Wei et al.; Bulletin of Surveying and Mapping (《测绘通报》); 2012-04-30 (No. 4); 79-82, 98 *

Also Published As

Publication number Publication date
CN113157835A (en) 2021-07-23

Similar Documents

Publication Publication Date Title
CN112581629B (en) Augmented reality display method, device, electronic equipment and storage medium
US10726580B2 (en) Method and device for calibration
CN110163831B (en) Method and device for dynamically displaying object of three-dimensional virtual sand table and terminal equipment
CN111640180B (en) Three-dimensional reconstruction method and device and terminal equipment
CN112927306B (en) Calibration method and device of shooting device and terminal equipment
CN111583381A (en) Rendering method and device of game resource map and electronic equipment
CN111015654A (en) Visual positioning method and device for robot, terminal equipment and storage medium
CN113744256A (en) Depth map hole filling method and device, server and readable storage medium
CN111383254A (en) Depth information acquisition method and system and terminal equipment
CN115830135A (en) Image processing method and device and electronic equipment
CN110070581B (en) Double-view positioning method, device and system
CN112597983B (en) Method for identifying target object in remote sensing image and storage medium and system thereof
CN112419460B (en) Method, apparatus, computer device and storage medium for baking model map
CN117522963A (en) Corner positioning method and device of checkerboard, storage medium and electronic equipment
CN113157835B (en) Image processing method, device and platform based on GIS platform and storage medium
CN114913105A (en) Laser point cloud fusion method and device, server and computer readable storage medium
CN112270693B (en) Method and device for detecting motion artifact of time-of-flight depth camera
CN112419459B (en) Method, apparatus, computer device and storage medium for baking model AO mapping
CN114168695A (en) Target position determining method, device, terminal and storage medium
CN112150621B Bird's eye view image generation method, system and storage medium based on orthographic projection
CN115409962A (en) Method for constructing coordinate system in illusion engine, electronic equipment and storage medium
CN110335205B (en) Landform smoothing method and device, computer equipment and storage medium
CN114549650A (en) Camera calibration method and device, electronic equipment and readable storage medium
CN116109758B (en) Method and device for positioning projection position of light source and rendering scene
CN110136053B (en) WebGIS-based grid picture registration method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant