CN110136237B - Image processing method, device, storage medium and electronic equipment - Google Patents

Image processing method, device, storage medium and electronic equipment

Info

Publication number
CN110136237B
CN110136237B CN201910428565.2A
Authority
CN
China
Prior art keywords
depth
image
determining
value
map
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910428565.2A
Other languages
Chinese (zh)
Other versions
CN110136237A (en)
Inventor
姚立
陈正魁
刘守军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan Lotu Digital Technology Co ltd
Original Assignee
Wuhan Lotu Digital Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan Lotu Digital Technology Co ltd filed Critical Wuhan Lotu Digital Technology Co ltd
Priority to CN201910428565.2A priority Critical patent/CN110136237B/en
Publication of CN110136237A publication Critical patent/CN110136237A/en
Application granted granted Critical
Publication of CN110136237B publication Critical patent/CN110136237B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00: 3D [Three Dimensional] image rendering
    • G06T15/04: Texture mapping
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T19/20: Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Architecture (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Studio Devices (AREA)

Abstract

The embodiment of the application provides an image processing method, an image processing device, a storage medium and an electronic device, wherein the method comprises the following steps: obtaining a map image of a model, wherein the map image comprises a plurality of pixel points with different depth values; determining a depth-of-field range of the map image, wherein the depth-of-field range is the range formed by the depth values of all pixel points in the clearly imaged area of the map image, and the number of such pixel points is at least two; and determining a clear map from the map image, wherein the depth values of all pixel points in the clear map are within the depth-of-field range. The clear map determined in this way ensures that each of its pixel points lies within the depth-of-field range, thereby guaranteeing the sharpness of the clear map and ensuring that the obtained clear map meets the requirements of the model map. A clear map meeting the modeling requirements can therefore be obtained conveniently, and waste of manpower and material resources caused by unqualified map quality can be avoided.

Description

Image processing method, device, storage medium and electronic equipment
Technical Field
The present invention relates to the field of image processing, and in particular, to an image processing method, an image processing device, a storage medium, and an electronic apparatus.
Background
At present, when texture mapping a three-dimensional model, high-definition texture maps of small objects are needed. To obtain such high-definition texture maps, telephoto and macro lenses are often used for shooting, so the depth of field is very small, and targets and content outside the depth-of-field range are imaged blurrily. If an image containing blurred areas is used, part of the texture of the generated three-dimensional model will be blurred, making the texture quality of the three-dimensional model unqualified. A great deal of manpower is then often consumed selecting textures and, after the blurred areas are removed, performing texture mapping again, which is time-consuming and labor-intensive.
Disclosure of Invention
In view of the foregoing, an object of the present application is to provide an image processing method, apparatus, storage medium, and electronic device for obtaining a clear map without blurred regions as much as possible.
In order to achieve the above object, embodiments of the present application are realized by:
in a first aspect, an embodiment of the present application provides an image processing method, including:
obtaining a map image of a model, wherein the map image comprises a plurality of pixel points with different depth values; determining a depth of field range of the map image, wherein the depth of field range is a range formed by depth values of all pixel points in a clearly imaged area in the map image, and the number of all pixel points is at least two; and determining a clear map from the map image, wherein the depth values of all pixel points in the clear map are in the depth range.
By determining the depth-of-field range of the map image and the depth value of each pixel point in the map image, the pixel points whose depth values are within the depth-of-field range can be determined, and thus the clear map in the map image can be determined. The method removes the pixel points whose depth values are not within the depth-of-field range, ensuring that every pixel point in the clear map is within the depth-of-field range, thereby guaranteeing the sharpness of the clear map and ensuring that the obtained clear map meets the requirements of the model map. A clear map meeting the mapping requirements of the model can therefore be obtained conveniently, and waste of manpower and material resources caused by unqualified map quality can be avoided.
With reference to the first aspect, in a first possible implementation manner of the first aspect, the map image is obtained by performing depth processing on a captured source image, and determining a depth range of the map image includes:
determining a focusing area, a focal length value and an aperture value which are utilized by a camera to shoot the source image; and determining the depth of field range of the map image according to the focusing area, the focal length value and the aperture value.
The depth of field range of the map image is determined by utilizing the focusing area, the focal length value and the aperture value which are utilized for shooting the source image corresponding to the map image, so that the determined depth of field range is very accurate, and the definition of the clear map determined by taking the depth of field range as a screening condition is ensured.
With reference to the first possible implementation manner of the first aspect, in a second possible implementation manner of the first aspect, determining a depth of field range of the map image according to the focusing area, the focal length value and the aperture value includes:
determining an average depth value of a pixel point positioned in the focusing area in the map image; determining a front depth of field and a rear depth of field of the map image according to the average depth value, the focal length value and the aperture value; and determining the depth of field range of the map image according to the front depth of field and the rear depth of field.
And determining a focusing area, calculating an average depth value of a pixel point in the focusing area in the map image, and calculating a front depth of field and a rear depth of field by using a depth of field calculation formula according to the average depth value, the focal length value and the aperture value, so as to determine a depth of field range. The depth of field range determined by the average depth value in the focusing area is very accurate, and the reliability of the depth of field range can be ensured, so that the definition of a clear map is ensured.
With reference to the second possible implementation manner of the first aspect, in a third possible implementation manner of the first aspect, determining a depth of field range of the map image according to the front depth of field and the rear depth of field includes:
determining a first adjustment value matched with the focal length value and the aperture value, wherein the first adjustment value is used for adjusting the size of the depth of field; and determining a first depth of field value of the front depth of field after being adjusted by the first adjustment value, determining a second depth of field value of the rear depth of field after being adjusted by the first adjustment value, and determining the depth of field range between the first depth of field value and the second depth of field value.
The calculated front depth of field and the calculated rear depth of field are further adjusted after the front depth of field and the calculated rear depth of field are determined, so that the determined depth of field range is more accurate.
With reference to the first aspect, in a fourth possible implementation manner of the first aspect, the map image is obtained by performing a depth processing on a captured source image, and determining a clear map from the map image includes:
determining a focusing area, a focal length value and an aperture value which are utilized by a camera to shoot the source image; determining partial images corresponding to the focusing areas in the map images according to the focal length values and the aperture values; and determining the clear map formed by the pixel points with depth values within the depth range from the partial image.
The method comprises the steps of determining a partial image corresponding to a focusing area from a map image, and determining a clear map formed by target pixel points with depth values in a depth range from the partial image, so that the number of pixel points needing to calculate the depth values can be reduced, and the speed and efficiency of determining the clear map can be improved.
With reference to the fourth possible implementation manner of the first aspect, in a fifth possible implementation manner of the first aspect, determining, according to the focal length value and the aperture value, a partial image corresponding to the focusing area in the map image includes:
determining a second adjustment value matched with the focal length value and the aperture value, wherein the second adjustment value is used for adjusting the size of the focusing area in the map image; and determining the partial image in the map image according to the second adjustment value and the focusing area.
The size of the focusing area in the map image can be adjusted by determining the second adjusting value matched with the focal length value and the aperture value, so that partial images corresponding to the focusing area in the map image can be corrected according to different focal length values and aperture values, the rationality and the whole definition of the partial images are ensured, the efficiency of determining the clear map is ensured, and the quality of the clear map is ensured.
In a second aspect, an embodiment of the present application provides an image processing apparatus, including:
the apparatus comprises an acquisition module and a processing module. The acquisition module is used for acquiring a map image of a model, the map image comprising a plurality of pixel points with different depth values. The processing module is used for determining a depth-of-field range of the map image, wherein the depth-of-field range is the range formed by the depth values of all pixel points in the clearly imaged area of the map image, and the number of such pixel points is at least two. The processing module is further used for determining a clear map from the map image, wherein the depth values of all pixel points in the clear map are within the depth-of-field range.
With reference to the second aspect, in a first possible implementation manner of the second aspect, the processing module is further configured to:
determining a focusing area, a focal length value and an aperture value which are utilized by a camera to shoot a source image corresponding to the map image, wherein the map image is obtained by carrying out depth processing on the source image; and determining the depth of field range of the map image according to the focusing area, the focal length value and the aperture value.
With reference to the first possible implementation manner of the second aspect, in a second possible implementation manner of the second aspect, the processing module is further configured to:
determining an average depth value of a pixel point positioned in the focusing area in the map image; determining a front depth of field and a rear depth of field of the map image according to the average depth value, the focal length value and the aperture value; and determining the depth of field range of the map image according to the front depth of field and the rear depth of field.
With reference to the second possible implementation manner of the second aspect, in a third possible implementation manner of the second aspect, the processing module is further configured to:
determining a first adjustment value matched with the focal length value and the aperture value, wherein the first adjustment value is used for adjusting the size of the depth of field; and determining a first depth of field value of the front depth of field after being adjusted by the first adjustment value, determining a second depth of field value of the rear depth of field after being adjusted by the first adjustment value, and determining the depth of field range between the first depth of field value and the second depth of field value.
With reference to the second aspect, in a fourth possible implementation manner of the second aspect, the processing module is further configured to:
determining a focusing area, a focal length value and an aperture value which are utilized by a camera to shoot a source image corresponding to the map image, wherein the map image is obtained by carrying out depth processing on the shot source image; determining partial images corresponding to the focusing areas in the map images according to the focal length values and the aperture values; and determining the clear map formed by the pixel points with depth values within the depth range from the partial image.
With reference to the fourth possible implementation manner of the second aspect, in a fifth possible implementation manner of the second aspect, the processing module is further configured to:
determining a second adjustment value matched with the focal length value and the aperture value, wherein the second adjustment value is used for adjusting the size of the focusing area in the map image; and determining the partial image in the map image according to the second adjustment value and the focusing area.
In a third aspect, embodiments of the present application provide a computer-readable storage medium storing computer-executable program code which, when read and run by a computer, performs the image processing method of the first aspect or any possible implementation manner of the first aspect.
In a fourth aspect, an embodiment of the present application provides an electronic device, including: the device comprises a communication interface, a bus, a processor and a memory, wherein the processor, the memory and the communication interface are connected through the bus; the memory is configured to store computer readable instructions, and the processor is configured to execute the image processing method according to the first aspect or any possible implementation manner of the first aspect by calling and executing the computer readable instructions.
In order to make the above objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments will be briefly described below, it being understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered limiting in scope, and that other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 shows a flowchart of a method of image processing according to an embodiment of the present application;
FIG. 2 illustrates a map image of a scene provided in an embodiment of the present application;
fig. 3 is a schematic diagram illustrating a focusing area in a source image corresponding to a map image according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a portion of a first map image provided in an embodiment of the present application;
FIG. 5 illustrates a schematic diagram of a partial image of a second type of map image provided in an embodiment of the present application;
FIG. 6 shows a schematic diagram of a clear map in a map image provided by an embodiment of the present application;
fig. 7 shows a block diagram of a first electronic device according to an embodiment of the present application;
fig. 8 shows a block diagram of a second electronic device according to an embodiment of the present application;
fig. 9 shows a block diagram of an image processing apparatus according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only to distinguish the description, and are not to be construed as indicating or implying relative importance.
The embodiment of the application provides an image processing method which is used for removing a blurred part of an image and reserving a clear part of the image so as to obtain a clear map in the image. The method can be applied to electronic equipment and executed by the electronic equipment, and the flow of the method will be described in detail from the perspective of the electronic equipment.
Referring to fig. 1, fig. 1 provides a flowchart of an image processing method. In this embodiment, the image processing method includes: step S11, step S12, and step S13.
Step S11: a map image of a model is acquired, the map image including a plurality of pixels of different depth values.
Step S12: and determining the depth of field range of the map image, wherein the depth of field range is a range formed by depth values of all pixel points in a clearly imaged area in the map image, and the number of all pixel points is at least two.
Step S13: and determining a clear map from the map image, wherein the depth values of all pixel points in the clear map are in the depth range.
In this embodiment, before step S11 is performed, if the electronic device is a terminal with an image capturing module, the electronic device may capture the scene from multiple angles through its own image capturing module to acquire source images. A source image captured by the electronic device includes Exif (Exchangeable Image File Format) information, which contains information such as the focusing area, focal length value, and aperture value used when the camera captured the source image.
If the electronic device is a server, a terminal not having an image capturing module, or a terminal not capturing an image using an image capturing module, the electronic device may receive a source image including Exif information transmitted from an external device. The source image transmitted by the external device may be a source image photographed by a camera from multiple perspectives of the same scene.
After obtaining the source image, the electronic device may perform step S11.
Step S11: a map image of a model is acquired, the map image including a plurality of pixels of different depth values.
In this embodiment, in order to obtain a map image corresponding to a source image, the electronic device may construct a three-dimensional model corresponding to the source image, so as to obtain the map image corresponding to the source image.
For example, the electronic device may match and align the acquired source images captured from multiple perspectives to recover the interior orientation elements and exterior orientation elements of the source images. After these orientation elements are recovered, a three-dimensional model corresponding to the source images can be constructed by generating a dense point cloud. After the three-dimensional model is built, the depth value of each pixel point of a source image is calculated using the interior orientation elements, the exterior orientation elements and the three-dimensional model, so as to obtain the map image corresponding to the source image.
Of course, the above method is only one of several ways to acquire the map image corresponding to the source image. In this embodiment, the electronic device may also receive source images of a scene captured by an external device, and receive a three-dimensional model created by scanning the scene with an external scanner; calculate the interior and exterior orientation elements of the source images through homonymous point pairs between the source images and the three-dimensional model; and calculate the depth value of each pixel point in a source image according to the calculated orientation elements, thereby obtaining the map image corresponding to the source image. This should therefore not be considered as limiting the present application.
By constructing the three-dimensional model corresponding to the source images, calculating their interior and exterior orientation elements, and then calculating the depth value of each pixel point, the accuracy of the resulting map image can be ensured.
It should be noted that, according to the imaging principle when an image is captured, the map image corresponding to a source image that focuses on and captures a scene necessarily includes at least two pixel points having different depth values.
Referring to fig. 2, assume that: the electronic device obtains a map image of the shot scenery by processing the source image of the scenery.
After obtaining a map image of the model, the map image containing pixels of a plurality of different depth values, the electronic device may perform step S12.
Step S12: and determining the depth of field range of the map image, wherein the depth of field range is a range formed by depth values of all pixel points in a clearly imaged area in the map image, and the number of all pixel points is at least two.
In this embodiment, in order to determine the depth of field range of the map image, the electronic device may take the following way.
In one possible implementation manner, the electronic device may determine information such as a focusing area, a focal length value, and an aperture value utilized by the camera to capture the source image by reading Exif information of the source image. The electronic device may determine, according to the focusing area, all pixel points located in the focusing area from the map image, where all pixel points include at least two pixel points with different depth values, and calculate an average depth value of the pixel points.
As shown in fig. 3, continuing the foregoing assumption: after the electronic device reads the Exif information of the source image, a corresponding area A is determined in the map image from the focusing area, contained in the Exif information, that was used to capture the source image, and the average of the depth values of all pixel points located in area A in the map image is calculated; the calculated average depth value is L.
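As a minimal illustration of this averaging step (the NumPy depth-map representation and the `focus_rect` coordinate convention are assumptions for the sketch, not taken from the patent):

```python
import numpy as np

def average_focus_depth(depth_map, focus_rect):
    """Mean depth value of the pixels inside the focusing area.

    depth_map:  2-D array of per-pixel depth values of the map image.
    focus_rect: (x, y, w, h) of the focusing area read from the Exif
                information, mapped into map-image coordinates.
    """
    x, y, w, h = focus_rect
    return float(depth_map[y:y + h, x:x + w].mean())

# Toy 4x4 depth map with a 2x2 focusing area in the middle
depth = np.array([[1.0, 1.0, 2.0, 2.0],
                  [1.0, 1.5, 2.5, 2.0],
                  [1.0, 1.5, 2.5, 2.0],
                  [1.0, 1.0, 2.0, 2.0]])
L = average_focus_depth(depth, (1, 1, 2, 2))  # -> 2.0
```

In practice the Exif SubjectArea coordinates would first have to be mapped from source-image to map-image pixels; that mapping is omitted here.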
After determining the average depth value, the electronic device may calculate the front depth of field and the rear depth of field of the map image according to the focal length value and the aperture value contained in the Exif information, respectively, by using a calculation formula of the depth of field.
The front depth-of-field value and the rear depth-of-field value of the map image can be calculated by equation (1):

L_1 = L - \frac{F \delta L^2}{f^2 + F \delta L}, \qquad L_2 = L + \frac{F \delta L^2}{f^2 - F \delta L} \qquad (1)

where L_1 represents the front depth-of-field value, L_2 represents the rear depth-of-field value, L represents the average depth value, F represents the aperture value, f represents the focal length value, and \delta represents the allowable circle-of-confusion diameter. \delta can be preset as follows:

Camera frame                        24mm x 36mm    6cm x 9cm    4" x 5"
Circle-of-confusion diameter δ      0.035mm        0.0817mm     0.146mm

The camera frame may be determined by the camera model contained in the Exif information, whereby the front and rear depth-of-field values of the map image may be calculated.
After the front depth-of-field value and the rear depth-of-field value of the map image are determined, the electronic device can use the value range determined by the front depth-of-field value and the rear depth-of-field value as the depth-of-field range of the map image. The depth of field range determined in this way is accurate and reliable.
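The calculation of the two limits can be sketched in Python, assuming the standard thin-lens depth-of-field formula for equation (1) (the patent reproduces the formula only as a figure, so this reconstruction and the sample parameter values are assumptions):

```python
def depth_of_field_limits(L, f, F, delta):
    """Near and far limits of the depth of field.

    L:     average depth of the focusing area (focus distance),
    f:     focal length, F: aperture value (f-number),
    delta: allowable circle-of-confusion diameter.
    All lengths must share one unit, e.g. millimetres.
    Note: when f**2 <= F*delta*L the far limit is at infinity
    (hyperfocal case), which this sketch does not handle.
    """
    front = F * delta * L ** 2 / (f ** 2 + F * delta * L)  # front field depth
    rear = F * delta * L ** 2 / (f ** 2 - F * delta * L)   # rear field depth
    return L - front, L + rear

# 50 mm lens at f/8 on a 24mm x 36mm frame (delta = 0.035 mm), focused at 2 m
near, far = depth_of_field_limits(L=2000.0, f=50.0, F=8.0, delta=0.035)
```

The returned pair (near, far) corresponds to (L_1, L_2), the value range used as the depth-of-field range.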
In another optional implementation manner, in order to obtain a higher-quality clear map, the electronic device may further determine a first adjustment value for adjusting the depth range, which is matched with the focal length value and the aperture value, by using the focal length value and the aperture value included in the Exif information of the source image, subtract the average depth value from the front depth value, multiply the subtracted result with the first adjustment value, and then add the average depth value to obtain the first depth value; and subtracting the average depth value from the rear depth value, multiplying the rear depth value by the first adjustment value, and adding the average depth value to obtain a second depth value. The value range formed between the first depth-of-field value and the second depth-of-field value is used as the depth-of-field range of the map image.
For example, after calculating the front depth-of-field value L_1 and the rear depth-of-field value L_2, the aperture value F and the focal length value f are matched against the preset adjustment values for adjusting the depth-of-field range, so that a first adjustment value a is determined, and the depth-of-field range becomes: [a(L_1 - L) + L, a(L_2 - L) + L]. The preset adjustment value for adjusting the depth of field is set empirically and is generally smaller than 1, so that the image within the depth-of-field range determined in this way is clearer.
In this way, the depth of field range can be made narrower than the original depth of field range, and therefore, the image formed by the pixels in the depth of field range is clearer.
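The narrowing step described above can be sketched as follows; the first adjustment value `a` and the sample numbers are purely illustrative:

```python
def adjusted_depth_range(L, near, far, a):
    """Shrink the depth-of-field range [near, far] toward the focus
    depth L by the first adjustment value a (empirically set, typically < 1).
    With a = 1 the original range is returned unchanged."""
    return a * (near - L) + L, a * (far - L) + L

# With L = 2000, limits (1600, 2600) and a = 0.8 the range tightens
lo, hi = adjusted_depth_range(2000.0, 1600.0, 2600.0, 0.8)  # -> (1680.0, 2480.0)
```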
After determining the depth of field range of the map image, the electronic device may execute step S13.
Step S13: and determining a clear map from the map image, wherein the depth values of all pixel points in the clear map are in the depth range.
In this embodiment, the electronic device may determine a clear map formed by pixel points whose depth values are within the depth range by traversing the depth values of all the pixel points in the map image.
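When the depth values are held in an array, the traversal reduces to a vectorised comparison; a sketch (the NumPy representation is an assumption, and zeroing is used here as a stand-in for removing out-of-range pixels):

```python
import numpy as np

def extract_clear_map(image, depth_map, near, far):
    """Keep only the pixels whose depth value lies within the
    depth-of-field range [near, far]; all other pixels are zeroed,
    i.e. removed from the clear map."""
    mask = (depth_map >= near) & (depth_map <= far)
    out = np.zeros_like(image)
    out[mask] = image[mask]
    return out

img = np.array([[10, 20], [30, 40]])
dep = np.array([[1.0, 2.0], [3.0, 4.0]])
clear = extract_clear_map(img, dep, near=2.0, far=3.0)  # keeps the 20 and 30
```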
In this embodiment, in order to improve the execution efficiency of the image processing method, the electronic device may further determine a partial image corresponding to the focusing area from the map image according to the focal length value and the aperture value included in the Exif information.
Specifically, referring to fig. 3, the electronic device may determine, according to the focusing area, a corresponding focusing area a from the map image. And the electronic equipment can match the focal length value and the aperture value with a preset adjusting value for adjusting the size of the focusing area in the map image, and determines a matched second adjusting value.
After determining the second adjustment value, the electronic device may scale the size of the focusing area according to the proportion of the second adjustment value on the basis of the focusing area a, so as to determine a part of the image corresponding to the focusing area from the map image.
Of course, the electronic device may also determine the partial image by determining the center of the focusing area and delimiting, in the map image, a partial image of a preset shape such as a rectangle or an ellipse by scaling the size of the focusing area. The size of the partial image determined in this way is usually slightly larger than the focusing area, for example 1.2 times, 1.5 times, or 2 times its size; this is not limited here, as long as it is practical.
By determining a partial image in such a way, the position of the clear map can be approximately determined without traversing the depth value of each pixel point in the map image, and only the pixel point with the depth value within the depth range is determined from the partial image, so that the time for determining the clear map from the map image can be greatly saved, and the execution efficiency of the image processing method is improved.
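One way to realise the scaled partial image is to enlarge the focusing rectangle about its centre by the second adjustment value and clamp it to the image bounds; the rectangle-based convention is an assumption for this sketch:

```python
def scaled_focus_rect(focus_rect, scale, img_w, img_h):
    """Enlarge the focusing area (x, y, w, h) about its centre by `scale`
    (the second adjustment value, e.g. 1.2, 1.5 or 2), clamped to the
    image bounds; the result delimits the partial image to search."""
    x, y, w, h = focus_rect
    cx, cy = x + w / 2.0, y + h / 2.0
    nw, nh = w * scale, h * scale
    nx = max(0.0, cx - nw / 2.0)
    ny = max(0.0, cy - nh / 2.0)
    nw = min(nw, img_w - nx)   # clamp to the right/bottom edges
    nh = min(nh, img_h - ny)
    return int(nx), int(ny), int(round(nw)), int(round(nh))

# Doubling a 20x20 focusing area centred at (50, 50) in a 100x100 image
rect = scaled_focus_rect((40, 40, 20, 20), 2.0, 100, 100)  # -> (30, 30, 40, 40)
```

Only the pixels inside the returned rectangle then need their depth values compared against the depth-of-field range.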
Referring to fig. 4-6, the above assumptions are continued. As shown in fig. 4, the electronic device determines, from the map image according to the focal length value and the aperture value, a partial image B corresponding to the focusing area; the area of the partial image B is 3.5 times the area of the focusing area, and it is elliptical. The electronic device can then determine, from the partial image B, the pixel points whose depth values lie within the adjusted depth-of-field range [a(L_1 - L) + L, a(L_2 - L) + L], so as to determine the clear map D shown in fig. 6. Alternatively, as shown in fig. 5, the electronic device determines, from the map image according to the focal length value and the aperture value, a rectangular partial image C corresponding to the focusing area, and determines from it the pixel points whose depth values lie within the adjusted depth-of-field range, so as to determine the clear map D shown in fig. 6.
After determining the clear map D, it may be used for model mapping. Performing model mapping in this way ensures the quality of the map and avoids, as much as possible, manual removal and re-selection of maps due to unclear mapping, thereby improving the convenience and efficiency of model mapping and saving labor cost.
An exemplary structure of an electronic device to which the image processing method provided in the embodiments of the present application is applied may be as shown in fig. 7 and fig. 8.
Referring to fig. 7, fig. 7 shows a block diagram of an electronic device 10. The electronic device 10 may be a server or a terminal. When the electronic device 10 is a server, the server may be a web server, a database server, a cloud server, a server cluster made up of a plurality of sub-servers, or the like. When the electronic device 10 is a terminal, the terminal may be a smart phone, a tablet computer, a personal digital assistant, etc. Of course, the devices listed above are intended to facilitate understanding of the present embodiment and should not be taken as limiting it.
When the electronic device 10 is a terminal that acquires images using a camera module, the electronic device 10 may include a communication interface 12 to which the camera module 15 is connected via a network, one or more processors 14 for executing program instructions, a bus 13, and memory 11 in different forms, such as a disk, ROM, or RAM, or any combination thereof.
Illustratively, programs are stored in the memory 11. The processor 14 can call these programs from the memory 11 and execute them, so that the image processing method is performed by running the programs. In this way, the processor 14 may process the image captured by the camera module 15, determine the depth-of-field range and the clear map, and thereby complete execution of the image processing method.
Referring to fig. 8, when the electronic device 10 is a server, a terminal without a camera module, or a terminal that does not use its camera module to acquire images, the electronic device 10 may omit the camera module 15 shown in fig. 7 and instead receive, through the communication interface 12, an image transmitted from an external device, so as to perform the image processing method.
Referring to fig. 9, an embodiment of the present application further provides an image processing apparatus 20, including:
an obtaining module 21, configured to obtain a map image of a model, where the map image includes a plurality of pixels with different depth values; a processing module 22, configured to determine a depth-of-field range of the map image, where the depth-of-field range is the range formed by the depth values of all pixels in the clearly imaged area of the map image, the number of such pixels being at least two; the processing module 22 is further configured to determine a clear map from the map image, where the depth values of all pixels in the clear map are within the depth-of-field range.
In this embodiment, the processing module 22 is further configured to:
determining the focusing area, the focal length value and the aperture value used by the camera to shoot the source image corresponding to the map image, wherein the map image is obtained by performing depth processing on the source image; and determining the depth-of-field range of the map image according to the focusing area, the focal length value and the aperture value.
In this embodiment, the processing module 22 is further configured to:
determining the average depth value of the pixels located in the focusing area of the map image; determining the front depth of field and the rear depth of field of the map image according to the average depth value, the focal length value and the aperture value; and determining the depth-of-field range of the map image according to the front depth of field and the rear depth of field.
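The patent does not give a formula for the front and rear depth of field; the standard thin-lens depth-of-field expressions are one plausible realisation, taking the average depth value L as the focus distance, f as the focal length, N as the aperture (f-number), and c as an assumed circle of confusion (all names here are illustrative, not the patent's):

```python
# Assumed sketch using the standard depth-of-field formulas:
#   front = N*c*L^2 / (f^2 + N*c*L)
#   rear  = N*c*L^2 / (f^2 - N*c*L)
# The depth-of-field range is then (L - front, L + rear).
# All quantities must share one length unit (millimetres here).

def depth_of_field_range(L, f, N, c=0.03):
    front = N * c * L**2 / (f**2 + N * c * L)   # front depth of field
    rear = N * c * L**2 / (f**2 - N * c * L)    # rear depth of field
    return L - front, L + rear                  # (near limit, far limit)

# Example: focus at 3 m with a 50 mm lens at f/2.8.
near, far = depth_of_field_range(L=3000, f=50, N=2.8, c=0.03)
print(round(near), round(far))
```

Pixels whose depth values lie between the two limits would then be treated as clearly imaged.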
In this embodiment, the processing module 22 is further configured to:
determining a first adjustment value matched with the focal length value and the aperture value, wherein the first adjustment value is used for adjusting the size of the depth of field; determining a first depth-of-field value obtained by adjusting the front depth of field with the first adjustment value and a second depth-of-field value obtained by adjusting the rear depth of field with the first adjustment value; and determining the depth-of-field range as lying between the first depth-of-field value and the second depth-of-field value.
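How the adjustment value is "matched with" the focal length and aperture is not specified in this passage; a lookup table keyed on (focal length, f-number), with the factor applied symmetrically about the midpoint of the range, is one hypothetical reading. The table contents and function names below are assumptions for illustration.

```python
# Hypothetical mapping from (focal length, f-number) to an adjustment factor;
# the actual matching rule is not disclosed in the source text.
ADJUSTMENT_TABLE = {
    (50, 2.8): 0.9,   # shrink the range slightly for this lens/aperture
    (85, 1.8): 0.8,
}

def adjusted_range(near, far, f, N):
    k = ADJUSTMENT_TABLE.get((f, N), 1.0)  # unknown combinations: no change
    mid = (near + far) / 2
    half = (far - near) / 2 * k            # scale the half-width about the midpoint
    return mid - half, mid + half

print(adjusted_range(2725.0, 3336.0, 50, 2.8))
```

A factor below 1 narrows the accepted depth range (stricter sharpness), while a factor above 1 widens it.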
In this embodiment, the processing module 22 is further configured to:
determining the focusing area, the focal length value and the aperture value used by the camera to shoot the source image corresponding to the map image, wherein the map image is obtained by performing depth processing on the shot source image; determining a partial image corresponding to the focusing area in the map image according to the focal length value and the aperture value; and determining, from the partial image, the clear map formed by the pixels whose depth values fall within the depth-of-field range.
In this embodiment, the processing module 22 is further configured to:
determining a second adjustment value matched with the focal length value and the aperture value, wherein the second adjustment value is used for adjusting the size of the focusing area in the map image; and determining the partial image in the map image according to the second adjustment value and the focusing area.
In summary, the embodiments of the present application provide an image processing method, apparatus, storage medium and electronic device that determine the depth-of-field range of a map image and the depth value of each pixel in the map image, select the pixels whose depth values fall within the depth-of-field range, and thereby determine the clear map within the map image. The method removes the pixels whose depth values are not within the depth-of-field range and ensures that every pixel in the clear map lies within it, thereby guaranteeing the definition of the clear map and ensuring that the obtained clear map meets the requirements of model mapping. A clear map meeting the model's mapping requirements can thus be obtained conveniently, avoiding the waste of manpower and material resources caused by unqualified map quality.
The foregoing is merely specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily think about changes or substitutions within the technical scope of the present application, and the changes or substitutions are intended to be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (7)

1. An image processing method, the method comprising:
obtaining a map image of a model, wherein the map image comprises a plurality of pixel points with different depth values;
determining a depth of field range of the map image, wherein the depth of field range is a range formed by depth values of all pixel points in a clearly imaged area in the map image, and the number of all pixel points is at least two;
determining a clear map from the map image, wherein depth values of all pixel points in the clear map are in the depth range;
wherein the map image is obtained by performing depth processing on a shot source image, and determining the clear map from the map image comprises:
determining a focusing area, a focal length value and an aperture value which are utilized by a camera to shoot the source image;
determining a partial image corresponding to the focusing area in the map image according to the focal length value and the aperture value;
determining the clear map formed by the pixel points with depth values within the depth range from the partial image;
and wherein determining the partial image corresponding to the focusing area in the map image according to the focal length value and the aperture value comprises:
determining a second adjustment value matched with the focal length value and the aperture value, wherein the second adjustment value is used for adjusting the size of the focusing area in the map image;
and determining the partial image in the map image according to the second adjustment value and the focusing area.
2. The image processing method according to claim 1, wherein the map image is obtained by subjecting a photographed source image to a depth processing, and determining a depth range of the map image includes:
determining a focusing area, a focal length value and an aperture value which are utilized by a camera to shoot the source image;
and determining the depth of field range of the map image according to the focusing area, the focal length value and the aperture value.
3. The image processing method according to claim 2, wherein determining the depth of field range of the map image based on the focus area, the focal length value, and the aperture value includes:
determining an average depth value of a pixel point positioned in the focusing area in the map image;
determining a front depth of field and a rear depth of field of the map image according to the average depth value, the focal length value and the aperture value;
and determining the depth of field range of the map image according to the front depth of field and the rear depth of field.
4. The image processing method according to claim 3, wherein determining a depth of field range of the map image from the front depth of field and the rear depth of field comprises:
determining a first adjustment value matched with the focal length value and the aperture value, wherein the first adjustment value is used for adjusting the size of the depth of field;
and determining a first depth of field value of the front depth of field after being adjusted by the first adjustment value, determining a second depth of field value of the rear depth of field after being adjusted by the first adjustment value, and determining the depth of field range between the first depth of field value and the second depth of field value.
5. An image processing apparatus, characterized in that the apparatus comprises:
the system comprises an acquisition module, a display module and a display module, wherein the acquisition module is used for acquiring a map image of a model, and the map image comprises a plurality of pixel points with different depth values;
the processing module is used for determining a depth-of-field range of the map image, wherein the depth-of-field range is a range formed by depth values of all pixel points in a clearly imaged area in the map image, and the number of the all pixel points is at least two;
the processing module is further configured to determine a clear map from the map image, where depth values of all pixels in the clear map are within the depth range;
the processing module is further used for determining a focusing area, a focal length value and an aperture value which are utilized by the camera to shoot a source image corresponding to the map image, wherein the map image is obtained by carrying out depth processing on the source image; determining a depth of field range of the map image according to the focusing area, the focal length value and the aperture value;
wherein the map image is obtained by performing depth processing on a shot source image, and determining the clear map from the map image comprises:
determining a focusing area, a focal length value and an aperture value which are utilized by a camera to shoot the source image;
determining a partial image corresponding to the focusing area in the map image according to the focal length value and the aperture value;
determining the clear map formed by the pixel points with depth values within the depth range from the partial image;
and wherein determining the partial image corresponding to the focusing area in the map image according to the focal length value and the aperture value comprises:
determining a second adjustment value matched with the focal length value and the aperture value, wherein the second adjustment value is used for adjusting the size of the focusing area in the map image;
and determining the partial image in the map image according to the second adjustment value and the focusing area.
6. A computer readable storage medium for storing computer-executable non-volatile program code, characterized in that, when the program code is read and run by a computer, the image processing method of any one of claims 1-4 is performed.
7. An electronic device, comprising: the device comprises a communication interface, a bus, a processor and a memory, wherein the processor, the memory and the communication interface are connected through the bus; the memory for storing computer readable instructions and the processor for executing the image processing method of any of claims 1-4 by invoking and executing the computer readable instructions.
CN201910428565.2A 2019-05-21 2019-05-21 Image processing method, device, storage medium and electronic equipment Active CN110136237B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910428565.2A CN110136237B (en) 2019-05-21 2019-05-21 Image processing method, device, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910428565.2A CN110136237B (en) 2019-05-21 2019-05-21 Image processing method, device, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN110136237A CN110136237A (en) 2019-08-16
CN110136237B true CN110136237B (en) 2023-12-26

Family

ID=67572375

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910428565.2A Active CN110136237B (en) 2019-05-21 2019-05-21 Image processing method, device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN110136237B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117372324A (en) * 2022-07-06 2024-01-09 深圳青澜生物技术有限公司 Microneedle patch detection method and device, computer equipment and storage medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105282443B (en) * 2015-10-13 2019-06-14 哈尔滨工程大学 A kind of panorama depth panoramic picture imaging method
CN106060423B (en) * 2016-06-02 2017-10-20 广东欧珀移动通信有限公司 Blur photograph generation method, device and mobile terminal
CN106651870B (en) * 2016-11-17 2020-03-24 山东大学 Segmentation method of image out-of-focus fuzzy region in multi-view three-dimensional reconstruction

Also Published As

Publication number Publication date
CN110136237A (en) 2019-08-16

Similar Documents

Publication Publication Date Title
CN108898567B (en) Image noise reduction method, device and system
CN109474780B (en) Method and device for image processing
EP3496383A1 (en) Image processing method, apparatus and device
US9898856B2 (en) Systems and methods for depth-assisted perspective distortion correction
WO2017016050A1 (en) Image preview method, apparatus and terminal
CN107749944A (en) A kind of image pickup method and device
CN110493527B (en) Body focusing method and device, electronic equipment and storage medium
CN110324532B (en) Image blurring method and device, storage medium and electronic equipment
EP3598385B1 (en) Face deblurring method and device
EP3798975B1 (en) Method and apparatus for detecting subject, electronic device, and computer readable storage medium
CN109064504B (en) Image processing method, apparatus and computer storage medium
CN111415310B (en) Image processing method and device and storage medium
CN111385461B (en) Panoramic shooting method and device, camera and mobile terminal
CN110505398B (en) Image processing method and device, electronic equipment and storage medium
CN109068060B (en) Image processing method and device, terminal device and computer readable storage medium
CN108776800B (en) Image processing method, mobile terminal and computer readable storage medium
CN112470192A (en) Dual-camera calibration method, electronic device and computer-readable storage medium
CN110490196A (en) Subject detection method and apparatus, electronic equipment, computer readable storage medium
CN111311481A (en) Background blurring method and device, terminal equipment and storage medium
CN110136237B (en) Image processing method, device, storage medium and electronic equipment
CN110392211A (en) Image processing method and device, electronic equipment, computer readable storage medium
CN111932462B (en) Training method and device for image degradation model, electronic equipment and storage medium
KR101598399B1 (en) System for combining images using coordinate information of roadview image
WO2014165159A1 (en) System and method for blind image deconvolution
CN113225484B (en) Method and device for rapidly acquiring high-definition picture shielding non-target foreground

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant