CN111105488B - Imaging simulation method, imaging simulation device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN111105488B
CN111105488B
Authority
CN
China
Prior art keywords: sensor, parameters, image, determining, preset
Prior art date
Legal status: Active
Application number
CN201911325493.5A
Other languages
Chinese (zh)
Other versions
CN111105488A (en
Inventor
刘夯
饶丹
曹治锦
王陈
Current Assignee
Chengdu Jouav Automation Technology Co ltd
Original Assignee
Chengdu Jouav Automation Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Chengdu Jouav Automation Technology Co ltd filed Critical Chengdu Jouav Automation Technology Co ltd
Priority to CN201911325493.5A priority Critical patent/CN111105488B/en
Publication of CN111105488A publication Critical patent/CN111105488A/en
Application granted granted Critical
Publication of CN111105488B publication Critical patent/CN111105488B/en


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 - 3D [Three Dimensional] image rendering
    • G06T15/005 - General purpose rendering architectures
    • G06T17/00 - Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05 - Geographic models

Abstract

The invention relates to the technical field of computer imaging simulation, and provides an imaging simulation method, an imaging simulation device, electronic equipment and a storage medium. The method comprises the following steps: acquiring external parameters, geometric parameters and internal parameters of a sensor; determining a simulated photo to be imaged according to the internal parameters and geometric parameters of the sensor; and determining the pixel value of each image point in the simulated photo to be imaged according to the internal parameters, geometric parameters and external parameters of the sensor, a preset digital surface model and a preset map, so as to obtain a simulated photo containing the pixel values of all the image points. Compared with the prior art, the invention can obtain the simulated image without development based on a three-dimensional engine, thereby greatly reducing the computational load of imaging simulation, lowering the requirement on the graphics rendering capability of the computer, and making the method easier to implement.

Description

Imaging simulation method, imaging simulation device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of computer imaging simulation technology, and in particular, to an imaging simulation method, an apparatus, an electronic device, and a storage medium.
Background
At present, there is much research on imaging simulation technology, which falls into two main categories: (1) physical simulation based on a hardware system, in which experimental equipment is arranged to build an imaging environment that simulates the imaging process; (2) computer simulation based on a software system, in which the imaging process is simulated by creating a digitized virtual imaging model.
Computer simulation can be further divided into two modes: (1) deduction of imaging results based on a physical target and imaging theory, mainly applied to imaging quality evaluation, image defogging and denoising, optical sensor design optimization and similar scenarios; (2) scene generation based on three-dimensional visualization of a virtual model, mainly applied to game engines, flight simulation, digital sand table deduction, digital cities (or the digital earth) and similar scenarios. Scene generation based on three-dimensional visualization of a virtual model must be developed on top of a three-dimensional engine to generate images of a virtual model that differs considerably from the physical object; it involves a large amount of computation and places high demands on computer graphics rendering capability.
Disclosure of Invention
The invention provides an imaging simulation method, an imaging simulation device, electronic equipment and a storage medium, which can determine the pixel value of each pixel in a simulation photo to be imaged through the external parameter, the geometric parameter and the internal parameter of a sensor to finally obtain a simulation image.
In order to achieve the above object, the technical scheme adopted by the embodiment of the invention is as follows:
in a first aspect, an embodiment of the present invention provides an imaging simulation method, including: acquiring external parameters of a sensor, geometric parameters of the sensor and internal parameters of the sensor; determining a simulated photo to be imaged according to the internal parameters of the sensor and the geometric parameters of the sensor; and determining the pixel value of each image point in the simulated photo to be imaged according to the internal parameters of the sensor, the geometric parameters of the sensor, the external parameters of the sensor, the preset digital surface model and the preset map, and obtaining the simulated photo containing the pixel values of all the image points.
In a second aspect, an embodiment of the present invention provides an imaging simulation apparatus, including: the acquisition module is used for acquiring the external parameters of the sensor, the geometric parameters of the sensor and the internal parameters of the sensor; the determining module is used for determining a simulated photo to be imaged according to the internal parameters of the sensor and the geometric parameters of the sensor; the generation module is used for determining the pixel value of each image point in the simulated photo to be imaged according to the internal parameters of the sensor, the geometric parameters of the sensor, the external parameters of the sensor, the preset digital surface model and the preset map, and obtaining the simulated photo containing the pixel values of all the image points.
In a third aspect, an embodiment of the present invention provides an electronic device, including: one or more processors; a memory for storing one or more programs that, when executed by the one or more processors, cause the one or more processors to implement the imaging simulation method of any of the preceding embodiments.
In a fourth aspect, embodiments of the present invention provide a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements an imaging simulation method as in any of the previous embodiments.
Compared with the prior art, the invention provides an imaging simulation method, an imaging simulation device, electronic equipment and a storage medium, which can determine the pixel value of each image point in a simulation photo to be imaged through the external parameter, the geometric parameter and the internal parameter of a sensor to finally obtain a simulation image.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present invention and should therefore not be regarded as limiting its scope; a person skilled in the art may derive other related drawings from them without inventive effort.
Fig. 1 shows a block schematic diagram of an electronic device according to an embodiment of the present invention.
Fig. 2 shows a flowchart of an imaging simulation method according to an embodiment of the present invention.
Fig. 3 shows a flowchart of another imaging simulation method provided by an embodiment of the present invention.
Fig. 4 is a schematic diagram showing a relationship between a pixel coordinate system and an image coordinate system according to an embodiment of the present invention.
FIG. 5 shows an exemplary diagram of a process for imaging simulation provided by an embodiment of the present invention.
Fig. 6 shows a flowchart of an imaging simulation apparatus according to an embodiment of the present invention.
Icon: 10-an electronic device; 11-memory; 12-a communication interface; 13-a processor; 14-buses; 100-imaging simulation device; 110-an acquisition module; 120-determining a module; 130-a generation module.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. The components of the embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the invention, as presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures.
In the description of the present invention, it should be noted that if terms such as "upper", "lower", "inner" and "outer" indicate an orientation or positional relationship, they are based on the orientation or positional relationship shown in the drawings, or on the orientation or positional relationship in which the inventive product is conventionally placed in use. They are used merely for convenience of describing the present invention and simplifying the description, and do not indicate or imply that the apparatus or element referred to must have a specific orientation or be constructed and operated in a specific orientation; they should therefore not be construed as limiting the present invention.
Furthermore, the terms "first," "second," and the like, if any, are used merely for distinguishing between descriptions and not for indicating or implying a relative importance.
It should be noted that the features of the embodiments of the present invention may be combined with each other without conflict.
Referring to fig. 1, fig. 1 is a block schematic diagram of an electronic device according to an embodiment of the invention. The electronic device 10 comprises a memory 11, a communication interface 12, a processor 13 and a bus 14. The memory 11, the communication interface 12 and the processor 13 are connected via the bus 14.
The memory 11 is used for storing a program, such as the imaging simulation apparatus described above; the apparatus includes at least one software functional module that may be stored in the memory 11 in the form of software or firmware. After receiving an execution instruction, the processor 13 executes the program to implement the imaging simulation method disclosed in the above embodiments.
The memory 11 may include high-speed Random Access Memory (RAM) and may also include non-volatile memory, such as at least one disk storage. Alternatively, the memory 11 may be a storage device built into the processor 13, or a storage device independent of the processor 13.
Communication connections with other external devices are made through at least one communication interface 12 (which may be wired or wireless).
The bus 14 may be an ISA bus, a PCI bus, an EISA bus, or the like. The bus is represented in Fig. 1 by only one double-headed arrow, but this does not mean that there is only one bus or only one type of bus.
The processor 13 may be an integrated circuit chip with signal processing capability. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in the processor 13 or by instructions in the form of software. The processor 13 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
Referring to fig. 2, fig. 2 shows a flowchart of an imaging simulation method according to an embodiment of the invention, and the method includes the following steps:
step S101, acquiring an external parameter of the sensor, a geometric parameter of the sensor, and an internal parameter of the sensor.
In this embodiment, the carrier of the optical area array sensor (including but not limited to an aircraft, an engineering vehicle, a wearable device, and the like) is a virtual model defined by parameters such as geographic position, and can be expressed as a point element or a volume element in a Geographic Information System (GIS). A GIS is a technical system that, with the support of computer hardware and software, collects, stores, manages, computes, analyzes, displays and describes geographically distributed data over all or part of the earth's surface (including the atmosphere).
In this embodiment, the sensor refers to a virtual model composed of the geometric parameters, internal parameters and other properties of the optical area array sensor; it can be expressed as a point element or a volume element in the geographic information system and can be entered into a database of the geographic information system.
In this embodiment, the external parameters of the sensor are also called the pose of the sensor. They may be obtained by combining the pose of the carrier with the pose of the sensor relative to the carrier, where both poses are acquired either in advance or in real time.
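As an illustrative sketch of this combination (not taken from the patent; the matrix convention and all names are assumptions), the carrier pose and the sensor-relative pose can be composed as follows:

```python
# Sketch: composing the sensor's external parameters (its pose in the
# world frame) from the carrier pose and the sensor pose relative to
# the carrier. Conventions and names are illustrative assumptions.
import numpy as np

def compose_pose(R_wc, t_wc, R_cs, t_cs):
    """Combine the carrier pose (R_wc, t_wc, carrier-to-world) with the
    sensor pose relative to the carrier (R_cs, t_cs, sensor-to-carrier)
    to obtain the sensor pose in the world frame."""
    R_ws = R_wc @ R_cs            # rotations compose by multiplication
    t_ws = t_wc + R_wc @ t_cs     # sensor offset rotated into world frame
    return R_ws, t_ws
```

In practice the carrier pose would come from the GIS or telemetry stream and the relative pose from the gimbal, as in the application examples later in the description.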
Step S102, determining a simulated photo to be imaged according to the internal parameters of the sensor and the geometric parameters of the sensor.
In this embodiment, the simulated photo to be imaged is initialized as a No-Data area; that is, the pixel value of each image point in the area is initialized to a preset value. For example, if the preset value is (0, 0, 0), the initial pixel value of each image point in the simulated photo to be imaged is (0, 0, 0), so the simulated photo to be imaged is a pure black image. Of course, the preset value may be any other specific value; the embodiment of the present invention does not limit this.
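A minimal sketch of this initialization step, deriving the pixel grid from the sensor's geometric parameters (parameter names, the pixel-pitch model and the RGB layout are assumptions, not from the patent):

```python
# Sketch: initialize the simulated photo to be imaged as a No-Data area,
# i.e. every image point receives the preset value (here pure black).
import numpy as np

def init_simulated_photo(sensor_width_mm, sensor_height_mm,
                         pixel_pitch_mm, no_data=(0, 0, 0)):
    """Derive the pixel grid from the sensor geometry and fill it with
    the preset No-Data value."""
    cols = int(round(sensor_width_mm / pixel_pitch_mm))
    rows = int(round(sensor_height_mm / pixel_pitch_mm))
    photo = np.empty((rows, cols, 3), dtype=np.uint8)
    photo[:] = no_data        # preset value for every image point
    return photo
```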
Step S103, determining pixel values of each image point in the simulated photo to be imaged according to the internal parameters of the sensor, the geometric parameters of the sensor, the external parameters of the sensor, the preset digital surface model and the preset map, and obtaining the simulated photo containing the pixel values of all the image points.
In this embodiment, the preset map is an entity in the geographic information system for visualizing basic geospatial data, including but not limited to 2D and 2.5D maps such as Google Maps, Baidu Maps and Amap (Gaode Maps), which can be accessed online.
In this embodiment, a Digital Surface Model (DSM) is a collection of object surface morphologies expressed numerically; it can express terrain and landforms, and represents the three-dimensional landscape of a city well. The preset digital surface model is the digital surface model related to the target sampling data, i.e. the digital surface model relevant to the simulated imaging.
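The later inverse-projection step needs to query the DSM for an elevation at arbitrary geospatial coordinates. A hedged sketch of such a query on a regular, north-up elevation grid (the grid layout, parameter names, and the restriction to interior points are assumptions):

```python
# Sketch: bilinear elevation lookup on a regular DSM grid.
# dsm: 2-D elevation array with row 0 at the northern edge.
# (x0, y0): geospatial coordinates of the grid's top-left corner.
import numpy as np

def dsm_elevation(dsm, x0, y0, res, X, Y):
    """Bilinearly interpolate the surface elevation at geospatial (X, Y),
    assumed to lie strictly inside the grid."""
    c = (X - x0) / res            # fractional column
    r = (y0 - Y) / res            # fractional row (Y decreases with row)
    c0, r0 = int(np.floor(c)), int(np.floor(r))
    dc, dr = c - c0, r - r0
    return (dsm[r0, c0] * (1 - dr) * (1 - dc)
            + dsm[r0, c0 + 1] * (1 - dr) * dc
            + dsm[r0 + 1, c0] * dr * (1 - dc)
            + dsm[r0 + 1, c0 + 1] * dr * dc)
```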
According to the method provided by the embodiment of the invention, the pixel value of each image point in the simulated photo to be imaged can be determined from the external parameters, geometric parameters and internal parameters of the sensor, and the simulated image is finally obtained. The simulated image can be obtained without development based on a three-dimensional engine, so the computational load of imaging simulation is greatly reduced, the requirement on computer graphics rendering capability is lowered, and the method is easier to implement.
Referring to fig. 3, fig. 3 shows a flowchart of another imaging simulation method according to an embodiment of the present invention, and step S103 includes the following sub-steps:
in the substep S1031, the geospatial coordinates of the object point corresponding to each image point in the simulated photo to be imaged on the preset digital surface model are determined based on the external parameters of the sensor, the geometric parameters of the sensor and the internal parameters of the sensor.
In this embodiment, taking a camera as the sensor as an example, the external parameters of the camera refer to the pose of the camera in the world coordinate system, which determines the relative pose relationship between the camera and the world coordinate system. The internal parameters of the camera are determined by the camera itself and are related only to the camera. The geometric parameters of the camera may be the sizes of the optical elements in the camera.
As a specific embodiment, the method for determining the geospatial coordinates of the object point corresponding to each image point in the simulated photo to be imaged on the preset digital surface model may be:
first, the plane coordinates of each image point in the simulated photo to be imaged are determined based on the geometric parameters of the sensor and the internal parameters of the sensor.
In this embodiment, the pixel coordinates of the image point in the simulated photo to be imaged are obtained; the pixel coordinates are the coordinates of the image point in the pixel coordinate system. The plane coordinates of the image point are then determined according to the geometric parameters and internal parameters of the sensor; the plane coordinates are the coordinates of the image point in the image coordinate system. The relationship between the pixel coordinate system and the image coordinate system is shown in Fig. 4. In Fig. 4, the origin of the pixel coordinate system u-v is O0, the abscissa u and the ordinate v correspond to the columns and rows of the image respectively, and the origin of the image coordinate system x-y is O1. In addition, the relationship between the pixel coordinate system and the image coordinate system may also be expressed in other image plane coordinate systems, for example the image plane coordinate system BLUH defined by the University of Hannover, Germany, the image plane coordinate system PATB defined by the University of Stuttgart, or the image plane coordinate system CCHZ defined by the Chinese low-altitude digital photogrammetry specification (CH/Z 3005-2010); the embodiment of the present invention does not limit the specific image coordinate system used.
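The pixel-to-image-plane conversion of Fig. 4 can be sketched as follows. The principal-point offset and axis directions are assumptions about one common convention; as noted above, the patent leaves the specific image coordinate system open:

```python
# Sketch: convert pixel coordinates (u right, v down, origin O0 at the
# top-left corner) to image-plane coordinates (origin O1 at the
# principal point, y axis pointing up). Convention is an assumption.
def pixel_to_image(u, v, u0, v0, pixel_size):
    """(u0, v0): principal point in pixels; pixel_size: physical size
    of one pixel, so the result is in the same physical unit."""
    x = (u - u0) * pixel_size
    y = -(v - v0) * pixel_size    # v grows downward, y grows upward
    return x, y
```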
And secondly, determining the geospatial coordinates of the object points corresponding to the plane coordinates in the preset digital surface model through inverse center projection transformation based on the external parameters of the sensor.
In this embodiment, a central projection is a projection in which all projection lines intersect at one point; that is, given a point light source and an object, the projection is formed by rays emanating from that single point through the object. The central projection transformation is a transformation from 3 dimensions to 2 dimensions; its inverse, combined with the preset digital surface model, is a transformation from 2 dimensions back to 3 dimensions.
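A sketch of this inverse transformation: starting from an initial height, the ray through the image point is repeatedly intersected with the surface until the height converges. The collinearity-equation convention used here (direction vector [x, y, -f] rotated into the world frame) and all names are assumptions, not the patent's exact formulation:

```python
# Sketch: inverse central projection of an image point onto the surface.
import numpy as np

def inverse_central_projection(x, y, f, R, S, dsm_height,
                               z_init=0.0, n_iter=20, tol=1e-3):
    """x, y: image-plane coordinates; f: focal length; R: rotation
    matrix (camera to world); S: camera centre in geospatial coords;
    dsm_height: callable (X, Y) -> surface elevation Z."""
    d = R @ np.array([x, y, -f])          # ray direction in world frame
    Z = z_init
    for _ in range(n_iter):
        lam = (Z - S[2]) / d[2]           # scale along the ray to height Z
        X, Y = S[0] + lam * d[0], S[1] + lam * d[1]
        Z_new = dsm_height(X, Y)          # surface height under (X, Y)
        if abs(Z_new - Z) < tol:          # converged onto the DSM
            break
        Z = Z_new
    return np.array([X, Y, Z_new])
```

On a flat surface the loop converges in one step; on rugged terrain the iteration walks the ray-surface intersection toward a fixed point.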
In step S1032, the pixel value of the image point corresponding to each object point is determined from the preset map according to the geospatial coordinates of each object point.
In this embodiment, the preset map is the map related to the simulated imaging. The geospatial coordinates of each object point correspond to a coordinate point in the preset map. A target map area can therefore be determined from the preset map according to the geospatial coordinates of all object points corresponding to all image points in the simulated photo to be imaged, and the final pixel value of the image point corresponding to each object point is then determined from the target map area. As an embodiment, the final pixel value of the image point corresponding to each object point may be determined as follows:
firstly, determining a target map area from a preset map according to the geospatial coordinates of all object points, and acquiring a view port image of the target map area.
In this embodiment, the target map area is a part of a preset map, and the target map area includes coordinate points corresponding to geospatial coordinates of all object points in the preset map.
In this embodiment, the view port image of the target map area may be acquired through an interface provided by the electronic map of the GIS.
And secondly, carrying out affine transformation on the geospatial coordinates of each object point to obtain the corresponding map viewport coordinates of each object point in the viewport image.
In this embodiment, the electronic map system of the GIS provides an affine transformation interface from the geospatial coordinates to the map viewport coordinates, through which the map viewport coordinates corresponding to each object point in the viewport image can be obtained.
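Where the GIS affine-transformation interface is unavailable, the mapping from geospatial coordinates to map viewport coordinates can be sketched for a north-up viewport as follows (the corner coordinates and resolution are assumed parameters; a real electronic map may apply rotation and scaling as well):

```python
# Sketch: affine map from geospatial coordinates to viewport pixel
# coordinates for an axis-aligned, north-up viewport whose top-left
# corner is at (x_min, y_max). Names are illustrative assumptions.
def geo_to_viewport(X, Y, x_min, y_max, meters_per_pixel):
    col = (X - x_min) / meters_per_pixel
    row = (y_max - Y) / meters_per_pixel   # Y decreases down the viewport
    return col, row
```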
And finally, pixel values are sampled according to each map viewport coordinate, and the obtained pixel values are taken as the pixel values of the image points corresponding to each object point.
In this embodiment, the pixel value sampling may be performed by, but is not limited to, nearest-neighbour sampling, bilinear interpolation, cubic convolution interpolation, and the like; the acquired pixel value is used as the final pixel value of the image point corresponding to each object point.
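A sketch of the bilinear variant of this sampling step (nearest-neighbour and cubic convolution would be analogous; the array layout and the restriction to interior coordinates are assumptions):

```python
# Sketch: bilinear sampling of an H x W x C viewport image at a
# fractional (col, row) viewport coordinate strictly inside the image.
import numpy as np

def sample_bilinear(img, col, row):
    c0, r0 = int(np.floor(col)), int(np.floor(row))
    dc, dr = col - c0, row - r0
    return (img[r0, c0] * (1 - dr) * (1 - dc)
            + img[r0, c0 + 1] * (1 - dr) * dc
            + img[r0 + 1, c0] * dr * (1 - dc)
            + img[r0 + 1, c0 + 1] * dr * dc)
```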
In order to show the imaging simulation process more intuitively, the embodiment of the invention also provides an example; please refer to Fig. 5, which shows an example diagram of the imaging simulation process provided by an embodiment of the present invention. In Fig. 5, the origin of the camera coordinate system u-v-w is S, the origin of the image coordinate system x-y is O1, and the origin of the geospatial coordinate system X-Y-Z is O2. First, the initial pixel value of each image point in the simulated image is initialized to the preset value. Second, the plane coordinates corresponding to each image point in the image coordinate system (i.e., the plane coordinates of a in Fig. 5) are determined from the pixel coordinates of the image point in the pixel coordinate system, the geometric parameters of the camera, and the internal parameters. Third, the coefficients of the collinearity equations of the central projection are calculated from the plane coordinates and the external parameters of the camera, and the inverse central projection transformation is performed according to these coefficients to determine the geospatial coordinates of the object point on the digital surface model corresponding to the plane coordinates (i.e., the geospatial coordinates of A in Fig. 5). Fourth, the map viewport coordinates of the object point in the viewport map (i.e., the map viewport coordinates of A' in Fig. 5) are determined from the geospatial coordinates of the object point. Finally, pixel values are sampled according to the map viewport coordinates, the obtained pixel values are taken as the final pixel values of the image points, and the simulated image is obtained after the final pixel values of all image points replace the corresponding initial pixel values.
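Putting the steps of Fig. 5 together, a deliberately simplified, self-contained end-to-end sketch on a flat surface (Z = 0) with a synthetic map array; every name, convention and shortcut here (nearest-neighbour sampling, wrap-around map indexing, principal point at the image centre) is illustrative, not from the patent:

```python
# Simplified end-to-end imaging-simulation loop over all image points.
import numpy as np

def simulate_photo(rows, cols, pixel_size, f, R, S, map_img, map_res):
    photo = np.zeros((rows, cols, 3), dtype=np.uint8)  # step 1: No-Data init
    for v in range(rows):
        for u in range(cols):
            # step 2: pixel -> image-plane coords (principal point at centre)
            x = (u - cols / 2) * pixel_size
            y = -(v - rows / 2) * pixel_size
            # step 3: inverse central projection onto the flat surface Z = 0
            d = R @ np.array([x, y, -f])
            lam = -S[2] / d[2]
            X, Y = S[0] + lam * d[0], S[1] + lam * d[1]
            # step 4: geospatial -> map viewport pixel (origin (0, 0),
            # north-up); wrap-around indexing keeps the sketch in-bounds
            c = int(np.rint(X / map_res)) % map_img.shape[1]
            r = int(np.rint(-Y / map_res)) % map_img.shape[0]
            # step 5: nearest-neighbour sampling replaces the initial value
            photo[v, u] = map_img[r, c]
    return photo
```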
It should be noted that the above embodiment only describes the imaging simulation process for a single simulated image; the imaging simulation process for a corresponding simulated video can be obtained on the basis of the imaging simulation process for a single image.
For example, take a certain unmanned aerial vehicle as the carrier and a certain digital camera as the sensor, with the digital camera connected to the unmanned aerial vehicle by a gimbal so that their relative pose can be measured. Imaging simulation is performed on a Google satellite imagery map using the Web Mercator projection as the coordinate system, and the unmanned aerial vehicle is set to send a signal at a fixed interval along its course to trigger the digital camera to take pictures. The specific steps are as follows: (1) obtain the geometric parameters and internal parameters of the digital camera (for a selected model, the corresponding parameters are obtained from a preset database; for a model not in the preset database, the model and corresponding parameters are entered by the user); (2) the geographic information system acquires the pose of the unmanned aerial vehicle and the pose of the digital camera relative to the unmanned aerial vehicle; (3) imaging simulation is performed based on the geographic information system according to the simulation imaging method disclosed in the above embodiments; (4) a series of simulated images "shot" at a fixed interval are output.
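For reference, the Web Mercator projection named in this example maps WGS84 longitude and latitude to planar metres with the standard EPSG:3857 formulas (these formulas are general knowledge, not part of the patent text):

```python
# Sketch: WGS84 lon/lat (degrees) -> Web Mercator metres (EPSG:3857).
import math

R_EARTH = 6378137.0  # WGS84 semi-major axis used by Web Mercator

def lonlat_to_webmercator(lon_deg, lat_deg):
    x = math.radians(lon_deg) * R_EARTH
    y = math.log(math.tan(math.pi / 4 + math.radians(lat_deg) / 2)) * R_EARTH
    return x, y
```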
For another example, take a certain unmanned aerial vehicle as the carrier and a certain digital camera mounted on an optoelectronic pod as the sensor. The axis control system of the optoelectronic pod can measure the pose of the digital camera relative to the unmanned aerial vehicle, and imaging simulation is performed at a frame rate of 25 fps on a map consisting of the Google street map layer overlaid on the imagery layer, using the Web Mercator projection as the coordinate system. The specific steps are as follows: (1) obtain the geometric parameters and internal parameters of the digital camera (for a selected model, the corresponding parameters are obtained from a preset database; for a model not in the preset database, the model and corresponding parameters are entered by the user); (2) the geographic information system collects the pose of the unmanned aerial vehicle and the pose of the digital camera relative to the unmanned aerial vehicle at a frequency of 25 Hz; (3) imaging simulation is performed based on the geographic information system according to the simulation imaging method disclosed in the above embodiments; (4) a simulated video "shot" at a frame rate of 25 fps is output.
The method provided by the embodiment of the invention does not require technicians to be highly proficient in the electromagnetic radiation theory of physical targets or the imaging theory of sensors, so it is easier to implement, popularize and apply. Second, the method does not depend on a three-dimensional engine and is suitable for any geographic information system that provides a 2D or 2.5D map rendering mechanism. Third, the method is applicable not only to local geospatial data but also to external geospatial data, including but not limited to Google Maps, Baidu Maps and Amap (Gaode Maps) accessed online, on which imaging simulation can likewise be performed. Fourth, the simulated images or videos generated by implementing the invention can sample a variety of abstract geospatial data to meet the needs of different scenarios: for example, sampling street and administrative division maps reflects information such as artificial structures and human activities; sampling real geospatial data, such as a satellite imagery map, reflects the real geographic environment; and the two types of geospatial data can be mixed and overlaid for sampling, so that augmented-reality geographic information can be reflected in the image or video.
In order to perform the steps of the above examples and of the various possible embodiments, an imaging simulation apparatus is given below. Referring to fig. 6, fig. 6 is a functional block diagram of an imaging simulation apparatus 100 according to an embodiment of the invention. It should be noted that the basic principle and technical effects of the imaging simulation apparatus 100 provided in this embodiment are the same as those of the above embodiments; for brevity, matters not mentioned in this embodiment may be found in the corresponding content of the above embodiments. The imaging simulation apparatus 100 includes an acquisition module 110, a determination module 120, and a generation module 130.
The acquisition module 110 is configured to acquire an external parameter of the sensor, a geometric parameter of the sensor, and an internal parameter of the sensor.
A determining module 120, configured to determine a simulated photo to be imaged according to the internal parameters of the sensor and the geometric parameters of the sensor.
The generating module 130 is configured to determine a pixel value of each image point in the simulated photo to be imaged according to the internal parameter of the sensor, the geometric parameter of the sensor, the external parameter of the sensor, the preset digital surface model and the preset map, so as to obtain the simulated photo including the pixel values of all the image points.
In an alternative embodiment, the generating module 130 is specifically configured to: determining the geospatial coordinates of corresponding object points of each image point in the simulated photo to be imaged on a preset digital surface model based on the external parameters of the sensor, the geometric parameters of the sensor and the internal parameters of the sensor; and determining pixel values of image points corresponding to each object point from a preset map according to the geospatial coordinates of each object point.
In an alternative embodiment, when determining the geospatial coordinates of the object point corresponding to each image point in the simulated photo to be imaged on the preset digital surface model based on the external parameters, geometric parameters and internal parameters of the sensor, the generating module 130 is configured to: determine the plane coordinates of each image point in the simulated photo to be imaged based on the geometric parameters and internal parameters of the sensor; and determine, through the inverse central projection transformation based on the external parameters of the sensor, the geospatial coordinates of the object point corresponding to the plane coordinates in the preset digital surface model.
In an alternative embodiment, when determining the pixel value of the image point corresponding to each object point from the preset map according to the geospatial coordinates of each object point, the generating module 130 is configured to: determine a target map area from the preset map according to the geospatial coordinates of all the object points, and acquire a viewport image of the target map area; perform an affine transformation on the geospatial coordinates of each object point to obtain the map viewport coordinates corresponding to each object point in the viewport image; and sample a pixel value at each map viewport coordinate, taking the sampled pixel value as the pixel value of the image point corresponding to each object point.
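The viewport lookup in this step amounts to an affine transform followed by a pixel fetch. A minimal sketch, assuming a north-up viewport with a uniform ground sampling distance (`gsd`) and nearest-neighbour sampling — choices the patent text does not fix:

```python
import numpy as np

def geo_to_viewport(x_geo, y_geo, origin_x, origin_y, gsd):
    # Affine transform from geospatial coordinates to viewport pixel
    # coordinates: translate by the viewport's upper-left origin and
    # scale by the ground sampling distance (north-up, no rotation).
    col = (x_geo - origin_x) / gsd
    row = (origin_y - y_geo) / gsd
    return row, col

def sample_pixel(viewport, row, col, no_data=0):
    # Nearest-neighbour sampling; object points that fall outside the
    # viewport image keep the preset no-data value.
    r, c = int(round(row)), int(round(col))
    h, w = viewport.shape[:2]
    if 0 <= r < h and 0 <= c < w:
        return viewport[r, c]
    return no_data
```

A rotated or reprojected viewport would replace the two scale/translate lines with a full six-parameter affine transform, but the structure of the lookup stays the same.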
An embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements an imaging simulation method as in any of the previous embodiments.
In summary, the embodiments of the invention provide an imaging simulation method, an imaging simulation device, an electronic device, and a storage medium. The method includes: acquiring the external parameters of a sensor, the geometric parameters of the sensor, and the internal parameters of the sensor; determining a simulated photo to be imaged according to the internal parameters of the sensor and the geometric parameters of the sensor; and determining the pixel value of each image point in the simulated photo to be imaged according to the internal parameters of the sensor, the geometric parameters of the sensor, the external parameters of the sensor, a preset digital surface model, and a preset map, to obtain a simulated photo containing the pixel values of all the image points. Compared with the prior art, the method can determine the pixel value of each image point in the simulated photo to be imaged from the external parameters, geometric parameters, and internal parameters of the sensor, and thereby obtain the simulated image.
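Putting the summarised steps together, an end-to-end toy run can be written in a few lines. Everything here is chosen only for illustration — a nadir-looking camera, a flat surface model at a fixed height, a synthetic map raster, and an analytic ray/plane intersection in place of a general DSM intersection:

```python
import numpy as np

def simulate_photo(width, height, pixel_size, f, cam_pos, R,
                   dsm_height, map_img, origin, gsd, no_data=0):
    # Initialise the simulated photo as a no-data area, then fill each
    # image point by projecting it onto the (flat) surface model and
    # sampling the map raster.
    photo = np.full((height, width), no_data, dtype=map_img.dtype)
    for row in range(height):
        for col in range(width):
            # image point -> object point (inverse central projection);
            # the camera is assumed to look downward (d[2] < 0)
            v = np.array([(col - (width - 1) / 2.0) * pixel_size,
                          ((height - 1) / 2.0 - row) * pixel_size, -f])
            d = R @ v
            d = d / np.linalg.norm(d)
            t = (cam_pos[2] - dsm_height) / -d[2]
            p = cam_pos + t * d
            # object point -> map viewport coordinate (affine transform)
            r = int(round((origin[1] - p[1]) / gsd))
            c = int(round((p[0] - origin[0]) / gsd))
            if 0 <= r < map_img.shape[0] and 0 <= c < map_img.shape[1]:
                photo[row, col] = map_img[r, c]
    return photo
```

With a camera 1000 m above a flat surface, a 10 µm pixel and a 50 mm focal length, each photo pixel covers 0.2 m on the ground, so an 11 x 11 photo samples a roughly 2 m x 2 m patch of the map centred under the camera.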
The foregoing is merely illustrative of the present invention and does not limit it; any changes or substitutions readily conceived by those skilled in the art within the scope of the present invention shall be included in the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (8)

1. An imaging simulation method, the method comprising:
acquiring external parameters of a sensor, geometric parameters of the sensor and internal parameters of the sensor;
determining a simulated photo to be imaged according to the internal parameters of the sensor and the geometric parameters of the sensor, wherein the simulated photo to be imaged is initialized as a no-data area, and the pixel value of each image point in the no-data area is initialized to a preset value;
determining pixel values of each image point in the simulated photo to be imaged according to the internal parameters of the sensor, the geometric parameters of the sensor, the external parameters of the sensor, a preset digital surface model and a preset map to obtain the simulated photo containing the pixel values of all the image points;
the step of determining the pixel value of each image point in the simulated photo to be imaged according to the internal parameter of the sensor, the geometric parameter of the sensor, the external parameter of the sensor, the preset digital surface model and the preset map to obtain the simulated photo containing the pixel values of all the image points comprises the following steps:
determining geospatial coordinates of object points corresponding to each image point in the simulated photo to be imaged on the preset digital surface model based on the external parameters of the sensor, the geometric parameters of the sensor and the internal parameters of the sensor;
and determining pixel values of image points corresponding to each object point from the preset map according to the geospatial coordinates of each object point.
2. The imaging simulation method of claim 1, wherein the step of determining the geospatial coordinates of the object point corresponding to each image point in the simulated photo to be imaged on the preset digital surface model based on the external parameters of the sensor, the geometric parameters of the sensor and the internal parameters of the sensor comprises:
determining plane coordinates of each image point in the simulated photo to be imaged based on the geometric parameters of the sensor and the internal parameters of the sensor;
and determining the geospatial coordinates of the object points corresponding to the plane coordinates in the preset digital surface model through inverse center projection transformation based on the external parameters of the sensor.
3. The imaging simulation method of claim 1, wherein the step of determining the pixel value of the image point corresponding to each object point from the preset map according to the geospatial coordinates of each object point comprises:
determining a target map area from the preset map according to the geospatial coordinates of all the object points, and acquiring a view port image of the target map area;
performing affine transformation on the geospatial coordinates of each object point to obtain the map viewport coordinates corresponding to each object point in the viewport image;
and sampling a pixel value at each map viewport coordinate, and taking the sampled pixel value as the pixel value of the image point corresponding to each object point.
4. An imaging simulation apparatus, the apparatus comprising:
the acquisition module is used for acquiring the external parameters of the sensor, the geometric parameters of the sensor and the internal parameters of the sensor;
the determining module is used for determining a simulated photo to be imaged according to the internal parameters of the sensor and the geometric parameters of the sensor, wherein the simulated photo to be imaged is initialized as a no-data area, and the pixel value of each image point in the no-data area is initialized to a preset value;
the generation module is used for determining the pixel value of each image point in the simulated photo to be imaged according to the internal parameters of the sensor, the geometric parameters of the sensor, the external parameters of the sensor, a preset digital surface model and a preset map to obtain the simulated photo containing the pixel values of all the image points;
the generating module is specifically configured to:
determining geospatial coordinates of object points corresponding to each image point in the simulated photo to be imaged on the preset digital surface model based on the external parameters of the sensor, the geometric parameters of the sensor and the internal parameters of the sensor;
and determining pixel values of image points corresponding to each object point from the preset map according to the geospatial coordinates of each object point.
5. The imaging simulation apparatus of claim 4, wherein the determining, by the generating module, of the geospatial coordinates of the object point corresponding to each image point in the simulated photo to be imaged on the preset digital surface model based on the external parameters of the sensor, the geometric parameters of the sensor and the internal parameters of the sensor comprises:
determining plane coordinates of each image point in the simulated photo to be imaged based on the geometric parameters of the sensor and the internal parameters of the sensor;
and determining the geospatial coordinates of the object points corresponding to the plane coordinates in the preset digital surface model through inverse center projection transformation based on the external parameters of the sensor.
6. The imaging simulation apparatus of claim 4, wherein the determining, by the generating module, of the pixel value of the image point corresponding to each object point from the preset map according to the geospatial coordinates of each object point comprises:
determining a target map area from the preset map according to the geospatial coordinates of all the object points, and acquiring a view port image of the target map area;
performing affine transformation on the geospatial coordinates of each object point to obtain the map viewport coordinates corresponding to each object point in the viewport image;
and sampling a pixel value at each map viewport coordinate, and taking the sampled pixel value as the pixel value of the image point corresponding to each object point.
7. An electronic device, the electronic device comprising:
one or more processors;
a memory for storing one or more programs that, when executed by the one or more processors, cause the one or more processors to implement the imaging simulation method of any of claims 1-3.
8. A computer-readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the imaging simulation method as claimed in any one of claims 1 to 3.
CN201911325493.5A 2019-12-20 2019-12-20 Imaging simulation method, imaging simulation device, electronic equipment and storage medium Active CN111105488B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911325493.5A CN111105488B (en) 2019-12-20 2019-12-20 Imaging simulation method, imaging simulation device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111105488A CN111105488A (en) 2020-05-05
CN111105488B true CN111105488B (en) 2023-09-08

Family

ID=70422738

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911325493.5A Active CN111105488B (en) 2019-12-20 2019-12-20 Imaging simulation method, imaging simulation device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111105488B (en)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004092826A1 (en) * 2003-04-18 2004-10-28 Appro Technology Inc. Method and system for obtaining optical parameters of camera
FR2978276A1 (en) * 2011-07-22 2013-01-25 Thales Sa Method for modeling building represented in geographically-referenced image of terrestrial surface for e.g. teledetection, involves determining parameters of model, for which adequacy is best, from optimal parameters for modeling object
CN103226838A (en) * 2013-04-10 2013-07-31 福州林景行信息技术有限公司 Real-time spatial positioning method for mobile monitoring target in geographical scene
CN104123710A (en) * 2013-04-25 2014-10-29 南京理工大学常熟研究院有限公司 Implement method of three-dimensional video camera system
CN107451957A (en) * 2017-07-26 2017-12-08 国家测绘地理信息局卫星测绘应用中心 A kind of spaceborne TDI CMOS camera imagings emulation mode and equipment
CN107492069A (en) * 2017-07-01 2017-12-19 国网浙江省电力公司宁波供电公司 Image interfusion method based on more lens sensors
CN109035320A (en) * 2018-08-12 2018-12-18 浙江农林大学 Depth extraction method based on monocular vision
CN109191415A (en) * 2018-08-22 2019-01-11 成都纵横自动化技术股份有限公司 Image interfusion method, device and electronic equipment
CN109300120A (en) * 2018-09-12 2019-02-01 首都师范大学 Remotely sensed image emulation mode and device
CN109708662A (en) * 2018-12-05 2019-05-03 北京空间机电研究所 A kind of pouring-in star chart simulation test platform of high frame frequency high-precision based on target identification
CN109712249A (en) * 2018-12-31 2019-05-03 成都纵横大鹏无人机科技有限公司 Geographic element augmented reality method and device
CN110211214A (en) * 2019-05-07 2019-09-06 高新兴科技集团股份有限公司 Texture stacking method, device and the storage medium of three-dimensional map

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150206337A1 (en) * 2014-01-17 2015-07-23 Nokia Corporation Method and apparatus for visualization of geo-located media contents in 3d rendering applications
JP6551184B2 (en) * 2015-11-18 2019-07-31 オムロン株式会社 Simulation apparatus, simulation method, and simulation program

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Image generation technology in hardware-in-the-loop simulation of visible-light imaging guidance; Yu Hong; Lei Jie; Modern Defense Technology; Vol. 34, No. 06; pp. 112-115, 119 *

Similar Documents

Publication Publication Date Title
CN107564089B (en) Three-dimensional image processing method, device, storage medium and computer equipment
CN107223269B (en) Three-dimensional scene positioning method and device
KR101504383B1 (en) Method and apparatus of taking aerial surveys
US9641755B2 (en) Reimaging based on depthmap information
AU2011312140C1 (en) Rapid 3D modeling
EP3170151B1 (en) Blending between street view and earth view
TWI494898B (en) Extracting and mapping three dimensional features from geo-referenced images
CA2751025A1 (en) Fusion of a 2d electro-optical image and 3d point cloud data for scene interpretation and registration performance assessment
Gomez-Jauregui et al. Quantitative evaluation of overlaying discrepancies in mobile augmented reality applications for AEC/FM
CN111161398B (en) Image generation method, device, equipment and storage medium
KR20180017108A (en) Display of objects based on multiple models
CN108010122B (en) Method and system for reconstructing and measuring three-dimensional model of human body
Koeva 3D modelling and interactive web-based visualization of cultural heritage objects
US8884950B1 (en) Pose data via user interaction
CN116168143A (en) Multi-view three-dimensional reconstruction method
CN116858215B (en) AR navigation map generation method and device
EP2225730A2 (en) Transition method between two three-dimensional geo-referenced maps
CN111105488B (en) Imaging simulation method, imaging simulation device, electronic equipment and storage medium
CN117611781B (en) Flattening method and device for live-action three-dimensional model
US11776148B1 (en) Multi-view height estimation from satellite images
CN113449027A (en) Three-dimensional visual display method and device for dynamic information of urban intersection
CN117876234A (en) Image synthesis method, device, equipment and storage medium
CN117237500A (en) Three-dimensional model display visual angle adjusting method and device, electronic equipment and storage medium
CN117611781A (en) Flattening method and device for live-action three-dimensional model
CN116863093A (en) Terrain modeling method, apparatus, computer device and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: 7 / F, area a, building 6, No. 200, Tianfu 5th Street, high tech Zone, Chengdu, Sichuan 610000

Patentee after: CHENGDU JOUAV AUTOMATION TECHNOLOGY Co.,Ltd.

Country or region after: China

Address before: Room 801-805, 8th floor, Building A, No. 200, Tianfu Wujie, Chengdu High-tech Zone, Sichuan Province, 610000

Patentee before: CHENGDU JOUAV AUTOMATION TECHNOLOGY Co.,Ltd.

Country or region before: China