CN113034382A - Brightness uniformity adjusting method and device, computer equipment and readable storage medium - Google Patents

Brightness uniformity adjusting method and device, computer equipment and readable storage medium

Info

Publication number
CN113034382A
Authority
CN
China
Prior art keywords
gray
image
target
original
uniformity
Prior art date
Legal status
Granted
Application number
CN202110199525.2A
Other languages
Chinese (zh)
Other versions
CN113034382B (en)
Inventor
刘辉林
唐京科
陈春
敖丹军
李嘉怡
Current Assignee
Shenzhen Chuangxiang 3D Technology Co Ltd
Original Assignee
Shenzhen Chuangxiang 3D Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Chuangxiang 3D Technology Co Ltd
Priority to CN202110199525.2A
Publication of CN113034382A
Application granted
Publication of CN113034382B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 - Image enhancement or restoration
    • G06T 5/90 - Dynamic range modification of images or parts thereof
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10028 - Range image; Depth image; 3D point clouds

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The application relates to a brightness uniformity adjusting method and device, computer equipment and a readable storage medium. The method obtains an original grayscale map by acquiring a brightness distribution map of a light source to be detected and graying it; obtains the gray value interval in which the gray values of the original grayscale map lie, and calculates the gray uniformity of the original grayscale map based on that interval; if the gray uniformity is inconsistent with a preset target uniformity, adjusts the gray values of the original grayscale map based on the target uniformity to obtain a target grayscale map; and generates a target mask image from the original grayscale map and the target grayscale map. Because the method is not affected by characteristics such as the inverted-cone shape of the light emitted by the light source, it achieves a good adjustment effect.

Description

Brightness uniformity adjusting method and device, computer equipment and readable storage medium
Technical Field
The present application relates to the field of 3D printing technologies, and in particular, to a brightness uniformity adjusting method and apparatus, a computer device, and a readable storage medium.
Background
Photo-curing printing technology is commonly employed in 3D printers. A photocuring 3D printer works by using a light source to harden liquid resin in a container and thereby produce the desired 3D shape. The light source of a photocuring 3D printer is generally a single-lamp-bead light source or a parallel light source with multiple lamp beads. Because the light emitted by such sources is uneven, the accuracy of the printed model can be affected.
In the conventional technology, the distribution of light source intensity is usually identified from the exposure pattern of the light source, and the uniformity of the light source is improved by adjusting its brightness. However, because the light emitted from the light source spreads in an inverted-cone shape, the intensity is higher closer to the light source and falls off toward the boundary of the illuminated area, which degrades the effect of improving light source uniformity by adjusting brightness alone.
Disclosure of Invention
In view of the foregoing, it is desirable to provide a brightness uniformity adjusting method, apparatus, computer device and readable storage medium.
In a first aspect, an embodiment of the present application provides a brightness uniformity adjusting method, including:
acquiring a brightness distribution diagram of a light source to be detected, and carrying out graying processing on the brightness distribution diagram to obtain an original gray level diagram;
acquiring a gray value interval where the gray value of the original gray image is located, and calculating the gray uniformity of the original gray image based on the gray value interval;
if the gray level uniformity is not consistent with the preset target uniformity, adjusting the gray level value of the original gray level image based on the target uniformity to obtain a target gray level image;
and generating a target mask image according to the original gray-scale image and the target gray-scale image, wherein the target mask image is used for being arranged on one side of the light emitting surface of the light source to be detected in the 3D printing process.
In one embodiment, calculating the gray uniformity of the original gray map based on the gray value interval comprises:
acquiring the maximum gray value and the minimum gray value of the original gray map according to the gray value interval;
and determining the gray uniformity according to the ratio of the minimum gray value to the maximum gray value.
In one embodiment, adjusting the gray-level values of the original gray-level map based on the target uniformity to obtain the target gray-level map includes:
obtaining a target gray value interval according to the ratio between the maximum gray value and the target uniformity and the ratio between the minimum gray value and the target uniformity;
and adjusting the gray value of the original gray image to be within the target gray value interval to obtain the target gray image.
In one embodiment, acquiring a gray value interval in which the gray value in the original gray map is located includes:
carrying out partition processing on the original gray-scale image to obtain an original partition gray-scale image;
and acquiring the mean value of the gray value of each area in the original partition gray image, and determining a gray value interval based on the mean value of the gray value of each area.
In one embodiment, generating the target mask image from the original gray scale map and the target gray scale map comprises:
and calculating the difference between the original gray-scale image and the target gray-scale image to obtain a target mask image.
In one embodiment, if the light source to be measured is a non-single light source, graying the brightness distribution map to obtain an original grayscale map, including:
carrying out graying processing on the brightness distribution diagram to obtain an initial gray level diagram;
carrying out perspective transformation on the initial gray level image to obtain a modified gray level image, wherein the modified gray level image comprises a brightness stripe boundary;
and carrying out interpolation processing on the brightness fringe boundary in the corrected gray-scale image to obtain an original gray-scale image.
In one embodiment, the interpolating the luminance stripe boundary in the modified gray-scale image to obtain the original gray-scale image includes:
determining the gray value distribution of the row coordinate and the gray value distribution of the column coordinate of the corrected gray map according to the corrected gray map;
determining a mean value curve of the brightness fringe boundary according to the gray value distribution of the row coordinate and the gray value distribution of the column coordinate;
determining a gradient change curve of the mean curve based on the mean curve;
and carrying out interpolation processing on the brightness stripe boundary according to the gradient change curve to obtain a target gray level image.
In one embodiment, if the light source to be measured is a non-single light source, generating the target mask image according to the original gray-scale image and the target gray-scale image includes:
determining a compensation value of the initial mask image and the brightness fringe boundary according to the target gray-scale image and the original gray-scale image;
and determining the target mask image according to the initial mask image and the compensation value.
In a second aspect, an embodiment of the present application provides a luminance uniformity adjusting apparatus, including:
the acquisition module is used for acquiring a brightness distribution map of the light source to be detected and carrying out graying processing on the brightness distribution map to obtain an original grayscale map;
the calculation module is used for acquiring a gray value interval where each gray value in the original gray image is located and calculating the gray uniformity of the original gray image based on the gray value interval;
the determining module is used for adjusting the gray value of the original gray image based on the target uniformity degree to obtain a target gray image if the gray uniformity degree is inconsistent with the preset target uniformity degree;
and the generating module is further used for generating a target mask image according to the original gray-scale image and the target gray-scale image, and the target mask image is used for being arranged on one side of the light emitting surface of the light source to be detected in the 3D printing process.
In a third aspect, an embodiment of the present application provides a computer device, which includes a memory and a processor, where the memory stores a computer program, and the processor implements the steps of the brightness uniformity adjusting method provided in the above embodiment when executing the computer program.
In a fourth aspect, an embodiment of the present application further provides a computer-readable storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the steps of the brightness uniformity adjusting method provided in the foregoing embodiment.
The embodiment of the application provides a brightness uniformity adjusting method, a brightness uniformity adjusting device, computer equipment and a readable storage medium. The method comprises the steps of carrying out graying processing on an acquired brightness distribution diagram of a light source to be detected to obtain an original grayscale diagram; calculating the gray level uniformity of the original gray level image based on the gray level value interval where the gray level value of the obtained original gray level image is located; if the gray level uniformity is not consistent with the target uniformity, adjusting the gray level value of the original gray level image based on the target uniformity to obtain a target gray level image; and generating a target mask image according to the original gray-scale image and the target gray-scale image. The brightness uniformity adjusting method provided by the embodiment of the application is not affected by the characteristics that the brightness of the light beam emitted by the light source to be detected is in an inverted cone shape and the like, and the gray value of the original gray level image is adjusted based on the target uniformity, so that the uniformity of the obtained target gray level image is the target uniformity. Therefore, the effect of adjusting the brightness uniformity of the light source to be detected by using the target mask image generated according to the target gray-scale image and the original gray-scale image is better, the uniformity of the brightness distribution diagram of the adjusted light source to be detected can be improved, and the precision of the printing model can be improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments or the conventional technologies of the present application, the drawings used in the description of the embodiments or the conventional technologies will be briefly introduced below, it is obvious that the drawings in the following description are only some embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is a flowchart illustrating steps of a brightness uniformity adjusting method according to an embodiment of the present application;
fig. 2 is a flowchart illustrating steps of a brightness uniformity adjusting method according to an embodiment of the present application;
fig. 3 is a flowchart illustrating steps of a brightness uniformity adjusting method according to an embodiment of the present application;
fig. 4 is a flowchart illustrating steps of a brightness uniformity adjusting method according to an embodiment of the present application;
fig. 5 is a flowchart illustrating steps of a brightness uniformity adjusting method according to an embodiment of the present application;
FIG. 6 is an initial gray scale map provided by an embodiment of the present application;
FIG. 7 is an original gray scale map provided by an embodiment of the present application;
fig. 8 is a flowchart illustrating steps of a brightness uniformity adjusting method according to an embodiment of the present application;
fig. 9 is a schematic diagram of a row coordinate gray scale value distribution of a modified gray scale map according to an embodiment of the present application;
FIG. 10 is a schematic diagram of a gray scale distribution of column coordinates of a modified gray scale provided by an embodiment of the present application;
FIG. 11 is a schematic illustration of a slope change curve provided in accordance with an embodiment of the present application;
FIG. 12 is a schematic illustration of a slope change curve provided in accordance with an embodiment of the present application;
fig. 13 is a flowchart illustrating steps of a brightness uniformity adjusting method according to an embodiment of the present application;
fig. 14 is a schematic structural diagram of a luminance uniformity adjusting device according to an embodiment of the present application;
fig. 15 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, embodiments of the present application are described in detail below with reference to the accompanying drawings. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. However, the present application can be embodied in many forms other than those described herein, and those skilled in the art can make similar modifications without departing from the spirit and scope of the application; it is therefore not intended to be limited to the specific embodiments disclosed below.
The following describes the technical solutions of the present application and how to solve the technical problems with the technical solutions of the present application in detail with specific embodiments. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments. Embodiments of the present application will be described below with reference to the accompanying drawings.
The numbering of the components as such, e.g., "first", "second", etc., is used herein only to distinguish the objects as described, and does not have any sequential or technical meaning.
The brightness uniformity adjusting method can be used for a 3D printer. The working principle of a 3D printer is to harden the liquid resin in the container using a light source to produce the desired 3D model. The brightness uniformity adjusting method provided by the application can adjust the brightness uniformity of the light beam emitted by the light source in the 3D printer, so that the precision of the 3D model generated by the 3D printer is higher.
The brightness uniformity adjusting method can be realized through computer equipment. Computer devices include, but are not limited to, control chips, personal computers, laptops, smartphones, tablets, and portable wearable devices. The method provided by the application can be implemented in Java software, and can also be implemented with other software.
Referring to fig. 1, an embodiment of the present application provides a brightness uniformity adjusting method, which includes the following steps:
step 101, obtaining a brightness distribution diagram of a light source to be detected, and performing graying processing on the brightness distribution diagram to obtain an original grayscale image.
The computer device acquires a brightness distribution map of a light source to be detected, where the light source to be detected is the light source used by the 3D printer; it may be a single-lamp-bead light source, a parallel light source, or the like. The brightness distribution map may be captured by a camera and transmitted to the computer device. This embodiment does not limit how the brightness distribution map of the light source to be detected is obtained, nor the type and structure of the light source to be detected, as long as their functions can be realized. After acquiring the brightness distribution map, the computer device performs graying processing on it, that is, converts the brightness distribution map into an original grayscale map. Like the brightness distribution map, the original grayscale map still reflects the overall and local distribution and characteristics of chromaticity and brightness levels. The method of graying the brightness distribution map is not limited in this embodiment, as long as its function can be realized.
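As a concrete illustration of the graying step, below is a minimal sketch that converts a camera-captured brightness distribution map into a grayscale map using the standard BT.601 luminance weights. The use of OpenCV/NumPy and the file name are assumptions made purely for illustration; as stated above, this embodiment does not limit the graying method.

```python
import cv2
import numpy as np

def to_grayscale(brightness_map_path: str) -> np.ndarray:
    """Convert a captured brightness distribution map into an original grayscale map."""
    bgr = cv2.imread(brightness_map_path)  # OpenCV loads color images in BGR channel order
    if bgr is None:
        raise FileNotFoundError(brightness_map_path)
    # Weighted-sum graying with the ITU-R BT.601 luminance weights,
    # essentially what cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY) does
    gray = 0.114 * bgr[:, :, 0] + 0.587 * bgr[:, :, 1] + 0.299 * bgr[:, :, 2]
    return gray.astype(np.uint8)

# Hypothetical usage: original_gray = to_grayscale("brightness_map.png")
```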
Step 102, acquiring a gray value interval where the gray value of the original gray image is located, and calculating the gray uniformity of the original gray image based on the gray value interval.
The computer equipment can obtain the gray value of the original gray map according to the obtained original gray map, and can obtain the gray value interval where the gray value is located according to the obtained gray value. The gray value interval may refer to an interval formed by a minimum gray value and a maximum gray value of gray values of the original gray map. And the computer equipment can calculate the gray uniformity of the original gray map according to the obtained gray value interval. The gray uniformity is used to represent the uniformity of the gray distribution of the original gray map. The method for acquiring the gray value interval where the gray value of the original gray map is located is not limited in this embodiment, as long as the gray value interval can be obtained.
Step 103, if the gray level uniformity is not consistent with a preset target uniformity, adjusting the gray level value of the original gray level image based on the target uniformity to obtain a target gray level image.
and comparing the obtained gray level uniformity with a preset target uniformity by the computer equipment, and if the gray level uniformity is consistent with the target uniformity, indicating that the gray level distribution uniformity of the original gray level map reaches the gray level distribution uniformity of the gray level map required by a user, so that the brightness distribution uniformity of the light source to be detected can be represented. If the gray uniformity is not consistent with the target uniformity, it means that the uniformity of the gray distribution of the original gray map does not reach the uniformity of the gray distribution of the gray map desired by the user. The preset target uniformity may be set by the user according to actual needs and stored in the memory of the computer device. And after judging that the gray degree uniformity is inconsistent with the target uniformity, the computer equipment adjusts the gray value in the original gray map based on the preset target uniformity, so that the uniformity of the gray distribution of the adjusted original gray map reaches the gray distribution uniformity of the gray map of a user. That is, the uniformity of the gray distribution of the target gray map can be a preset target uniformity. The present embodiment does not set any limitation to a specific method for adjusting the gray-scale values of the original gray-scale image based on the target uniformity, as long as the function thereof can be achieved.
Step 104, generating a target mask image according to the original gray-scale image and the target gray-scale image, wherein the target mask image is arranged on one side of the light emitting surface of the light source to be detected in the 3D printing process.
The computer device can generate a target mask image according to the obtained original gray-scale image and the target gray-scale image. The target mask image may represent a region where the original grayscale map and the target grayscale map are different. When printing the model using a 3D printer, a target mask image is set on the side of the light emitting surface of the light source to be measured. The target mask image can be used for shielding part of light beams emitted by the light source to be detected, so that the brightness distribution of the light beams passing through the target mask image is more uniform.
The brightness uniformity adjusting method provided by the embodiment of the application obtains an original gray-scale image by performing graying processing on the acquired brightness distribution diagram of the light source to be detected; calculating the gray level uniformity of the original gray level image based on the gray level value interval where the gray level value of the obtained original gray level image is located; if the gray level uniformity is not consistent with the target uniformity, adjusting the gray level value of the original gray level image based on the target uniformity to obtain a target gray level image; and generating a target mask image according to the original gray-scale image and the target gray-scale image. The brightness uniformity adjusting method provided by the embodiment of the application is not affected by the characteristics that the brightness of the light beam emitted by the light source to be detected is in an inverted cone shape and the like, and the gray value of the original gray level image is adjusted based on the target uniformity, so that the uniformity of the obtained target gray level image is the target uniformity. Therefore, the effect of adjusting the brightness uniformity of the light source to be detected by using the target mask image generated according to the target gray-scale image and the original gray-scale image is better, the uniformity of the brightness distribution diagram of the adjusted light source to be detected can be improved, and the precision of the printing model can be improved.
Referring to fig. 2, in one embodiment, calculating the gray uniformity of the original gray map based on the gray value interval includes:
step 201, acquiring a maximum gray value and a minimum gray value of an original gray map according to a gray value interval;
the gray value interval is a gray value interval formed by the minimum gray value and the minimum gray value of the gray values in the original gray map, and the computer device can acquire the maximum gray value and the minimum gray value of the original gray map according to the obtained gray value interval.
Step 202, determining the gray uniformity according to the ratio of the minimum gray value to the maximum gray value.
The computer device can obtain the gray uniformity of the original gray map by calculating the ratio of the minimum gray value to the maximum gray value of the original gray map. In a specific embodiment, assuming that the gray value interval of the original gray map is [minData, maxData] and the gray uniformity is A, the gray uniformity can be expressed as A = minData / maxData.
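A minimal sketch of this ratio, assuming the gray values of the original grayscale map are held in a NumPy array (the partition-mean refinement described later with reference to fig. 4 is omitted here):

```python
import numpy as np

def gray_uniformity(gray: np.ndarray) -> float:
    """Gray uniformity A = minData / maxData of the original grayscale map."""
    min_data, max_data = float(gray.min()), float(gray.max())
    if max_data == 0:
        raise ValueError("grayscale map is entirely black")
    return min_data / max_data
```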
In this embodiment, the method for calculating the gray uniformity is simple and easy to understand, so that the efficiency of calculating the target mask image can be improved, and the practicability of the brightness uniformity adjusting method can be improved.
Referring to fig. 3, in an embodiment, adjusting the gray-level values of the original gray-level map based on the target uniformity to obtain the target gray-level map includes:
301, obtaining a target gray value interval according to a ratio between the maximum gray value and the target uniformity and a ratio between the minimum gray value and the target uniformity;
the computer equipment can obtain the maximum gray value and the minimum gray value of the original gray map according to the gray value interval, and can obtain the target minimum gray value in the target gray value interval by calculating the ratio of the minimum gray value to the target uniformity; the target maximum gray value in the target gray value interval can be obtained by calculating the ratio of the maximum gray value to the target uniformity; and the interval formed by the target minimum gray value and the target maximum gray value is the target gray value interval. In a specific embodiment, assuming that the target uniformity is B and the target gray-value interval is [ minDataB, maxDataB ], the calculation method of the target gray-value interval can be expressed as: minDataB is minData/B, and minData is minData/B.
Step 302, adjusting the gray value of the original gray map to the target gray value interval to obtain the target gray map.
And the computer equipment adjusts the gray values in the original gray image to the target gray value interval according to the target gray value interval obtained by calculation to form the required target gray image. Specifically, when the gray value in the original gray image is smaller than the target minimum gray value in the target gray value interval, the gray value is set as the target minimum gray value; and when the gray value in the original gray image is larger than the target maximum gray value in the target gray value interval, setting the gray value as the target maximum gray value.
In this embodiment, the gray values of the original gray map are adjusted into the target gray value interval, so that the gray values of the obtained target gray map are all distributed within the target gray value interval and the uniformity of the target gray map can reach the target uniformity preset by the user. In addition, the method for calculating the target gray value interval and the target gray map is simple, which can improve the efficiency of obtaining the target gray map.
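A sketch of this adjustment, following the embodiment's formulas (minDataB = minData / B, maxDataB = maxData / B) and the clipping rule of step 302; B is the user-preset target uniformity, and the NumPy representation is an assumption for illustration:

```python
import numpy as np

def adjust_to_target(original_gray: np.ndarray, target_uniformity: float) -> np.ndarray:
    """Clamp the original gray values into the target gray value interval [minDataB, maxDataB]."""
    min_data, max_data = float(original_gray.min()), float(original_gray.max())
    min_data_b = min_data / target_uniformity  # target minimum gray value
    max_data_b = max_data / target_uniformity  # target maximum gray value
    # Step 302: values below the target minimum are raised to it,
    # values above the target maximum are lowered to it
    return np.clip(original_gray.astype(np.float64), min_data_b, max_data_b)
```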
Referring to fig. 4, in an embodiment, the obtaining of the gray value interval where the gray value in the original gray map is located includes:
step 401, performing partition processing on the original gray-scale image to obtain an original partition gray-scale image.
The computer device may divide the original grayscale map into a plurality of regions according to a preset partition rule, and record the partitioned original grayscale map as the original partition grayscale map. In a specific embodiment, the preset partition rule may be a preset graph; that is, the computer device divides the original grayscale map into a plurality of regions according to the preset graph to obtain the original partition grayscale map. The preset graph may be a rectangle with a preset length and width, which the user can set according to the size of the acquired brightness distribution map. When the computer device partitions the original grayscale map according to the preset rectangle, if the pixels of the region in the last row or last column are not enough to form the preset rectangle, those pixels are filled in from the adjacent regions of the last row or column so that they can form the preset rectangle.
Step 402, obtaining the mean value of the gray values of each region in the original partition gray image, and determining a gray value interval based on the mean value of the gray values of each region.
After obtaining the original partition grayscale map, the computer device acquires the gray values of each region of the original partition grayscale map, calculates the mean of the gray values of each region, and uses the calculated mean as the gray value of that region, so that the gray value of each region of the original partition grayscale map is obtained. The computer device then determines the gray value interval from the gray values of the regions; the interval formed by their minimum gray value and maximum gray value is the gray value interval.
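A sketch of the partition-and-average step; the 32 x 32 block size is an arbitrary stand-in for the preset rectangle, and edge replication stands in for the rule of filling the last row or column from the adjacent regions:

```python
import numpy as np

def partition_means(gray: np.ndarray, block_h: int = 32, block_w: int = 32) -> np.ndarray:
    """Split the original grayscale map into block_h x block_w regions and return each region's mean gray value."""
    h, w = gray.shape
    # Pad the last row/column of regions by replicating edge pixels so every region is a full rectangle
    padded = np.pad(gray, ((0, (-h) % block_h), (0, (-w) % block_w)), mode="edge").astype(np.float64)
    H, W = padded.shape
    blocks = padded.reshape(H // block_h, block_h, W // block_w, block_w)
    return blocks.mean(axis=(1, 3))

# Gray value interval from the region means, as in step 402:
# means = partition_means(original_gray); interval = (means.min(), means.max())
```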
In this embodiment, after the computer device performs partition processing on the original gray-scale image according to a preset partition rule, the gray-scale value interval is determined by calculating the mean value of the gray-scale values of each region in the partitioned original partition gray-scale image, so that the calculation efficiency for determining the gray-scale value interval can be improved.
In one embodiment, generating the target mask image from the original gray scale map and the target gray scale map comprises:
and calculating the difference between the original gray-scale image and the target gray-scale image to obtain a target mask image.
The computer device can obtain different areas between the original gray-scale image and the target gray-scale image by calculating the difference between the original gray-scale image and the obtained target gray-scale image, thereby obtaining the target mask image. The present embodiment does not set any limitation to the specific method for calculating the difference between the original gray-scale map and the target gray-scale map, as long as the function thereof can be achieved. In the embodiment, the calculation method for obtaining the target mask image is simple, and the calculation efficiency can be improved.
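A sketch of the differencing step; the sign convention (target minus original) and the clamping back to the 0-255 range are assumptions, since the embodiment does not fix a particular differencing method:

```python
import numpy as np

def build_target_mask(original_gray: np.ndarray, target_gray: np.ndarray) -> np.ndarray:
    """Target mask image as the per-pixel difference between the target and original grayscale maps."""
    diff = target_gray.astype(np.float64) - original_gray.astype(np.float64)
    # Re-quantise to the 0-255 gray range so the mask can be rendered as an 8-bit image
    return np.clip(diff, 0, 255).astype(np.uint8)
```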
Referring to fig. 5, in an embodiment, if the light source to be measured is a non-single light source, the graying processing is performed on the brightness distribution map to obtain an original grayscale map, which includes:
step 501, graying the brightness distribution map to obtain an initial grayscale map.
When the light source to be measured is a non-single light source, and specifically a parallel light source, the computer device obtains the original grayscale map by first graying the acquired brightness distribution map to obtain an initial grayscale map, as shown in fig. 6. For the method of obtaining the brightness distribution map and the method of graying it, reference may be made to the description in the above embodiments, which is not repeated here.
Step 502, performing perspective transformation on the initial gray level image to obtain a modified gray level image, wherein the modified gray level image includes a brightness stripe boundary.
When the light source to be detected is a non-single light source, after acquiring the initial grayscale map the computer device performs perspective transformation on it to obtain a corrected grayscale map. Perspective transformation uses the condition that the perspective center, the image point and the target point are collinear to rotate the image plane around the perspective axis by a certain angle according to the law of perspective rotation, changing the original projecting beam while keeping the projected figure of the initial grayscale map unchanged. When the light source to be detected is a non-single light source and a camera is used to capture the brightness distribution map, the camera cannot be made perfectly parallel to all of the individual light sources, so the captured brightness distribution map may contain a tilted image, and the initial grayscale map will then also contain a tilted image. In this embodiment, performing perspective transformation on the initial grayscale map corrects the tilted image, and the corrected initial grayscale map, i.e., the corrected (modified) grayscale map, is obtained. Because the light source to be detected is a non-single light source, the beam emitted by each individual source is affected by the beams emitted by its neighbours, so the resulting brightness distribution map contains beams superimposed between the sources, and the corrected grayscale map therefore contains brightness stripe boundaries. The brightness stripe boundaries are formed by the beams superimposed between the light sources.
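A sketch of the perspective correction using OpenCV's perspective-transform functions; the four source corner points would in practice come from locating the outline of the light-emitting area in the initial grayscale map, which is assumed to have been done already:

```python
import cv2
import numpy as np

def correct_perspective(initial_gray: np.ndarray, src_corners: np.ndarray,
                        out_width: int, out_height: int) -> np.ndarray:
    """Warp the tilted initial grayscale map onto a fronto-parallel plane (the corrected grayscale map)."""
    # src_corners: four corners of the light-emitting area, ordered TL, TR, BR, BL
    dst_corners = np.float32([[0, 0], [out_width - 1, 0],
                              [out_width - 1, out_height - 1], [0, out_height - 1]])
    M = cv2.getPerspectiveTransform(np.float32(src_corners), dst_corners)  # 3x3 homography
    return cv2.warpPerspective(initial_gray, M, (out_width, out_height))
```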
And 503, performing interpolation processing on the brightness fringe boundary in the corrected gray-scale image to obtain an original gray-scale image.
After obtaining the modified gray scale image, the computer device performs interpolation processing on the brightness stripe boundary existing in the modified gray scale image, so that the gray value at the brightness stripe boundary is the same as the gray value of the adjacent area, thereby eliminating the brightness stripe boundary in the modified gray scale image and obtaining the original gray scale image, as shown in fig. 7.
In this embodiment, when the light source to be measured may be a non-single light source, the condition that an oblique image exists in the initial gray-scale image and a brightness fringe boundary exists is considered, and perspective transformation and interpolation processing are performed on the initial gray-scale image, so that the obtained original gray-scale image is more accurate, the target mask image obtained after the original gray-scale image is subsequently processed is more accurate, and the brightness uniformity adjustment effect can be further improved.
Referring to fig. 8, in an embodiment, the interpolating the luminance stripe boundary in the modified gray-scale image to obtain the original gray-scale image includes:
step 801, determining the gray value distribution of the row coordinate and the gray value distribution of the column coordinate of the corrected gray map according to the corrected gray map.
After obtaining the corrected grayscale map, the computer device obtains the distribution of gray values along the row-coordinate direction of the corrected grayscale map, yielding the gray value distribution of the row coordinates, as shown in fig. 9; it likewise obtains the distribution of gray values along the column-coordinate direction, yielding the gray value distribution of the column coordinates, as shown in fig. 10.
Step 802, determining a mean curve of the brightness fringe boundary according to the gray value distribution of the row coordinate and the gray value distribution of the column coordinate.
The computer device can determine the brightness stripe boundaries of the corrected grayscale map in the row-coordinate direction from the obtained gray value distribution of the row coordinates, and can determine the distribution of gray values at those stripe boundaries, thereby determining the mean value curve of the brightness stripe boundaries in the row-coordinate direction. Similarly, the computer device can determine the brightness stripe boundaries of the corrected grayscale map in the column-coordinate direction from the obtained gray value distribution of the column coordinates, and can determine the distribution of gray values at those stripe boundaries, so that the mean value curve of the brightness stripe boundaries in the column-coordinate direction can be determined.
And 803, determining a gradient change curve of the mean curve based on the mean curve.
The computer device may obtain the slope change curve corresponding to the mean value curve by calculating the slope of the mean value curve of the brightness stripe boundaries in the row-coordinate direction, as shown in fig. 11. The black square points in fig. 11 indicate the slope of the mean value curve in the row-coordinate direction, and the X and Y values marked in fig. 11 are the coordinates of the corresponding black square points. Similarly, the computer device may calculate the slope of the mean value curve of the brightness stripe boundaries in the column-coordinate direction and obtain the corresponding slope change curve, as shown in fig. 12. The black square points in fig. 12 indicate the slope of the mean value curve in the column-coordinate direction, and the X and Y values marked in fig. 12 are the coordinates of the corresponding black square points.
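A sketch of how the mean value curves and their slope-change curves could be computed with NumPy; taking the per-column and per-row means of the whole corrected grayscale map is an assumption, as the embodiment only requires the mean curves around the stripe boundaries:

```python
import numpy as np

def mean_and_slope_curves(corrected_gray: np.ndarray):
    """Mean gray value curves along the row/column coordinates and their slope-change curves."""
    col_mean = corrected_gray.mean(axis=0)  # gray value distribution along the row (x) coordinate
    row_mean = corrected_gray.mean(axis=1)  # gray value distribution along the column (y) coordinate
    col_slope = np.gradient(col_mean)       # slope-change curve in the row-coordinate direction (cf. fig. 11)
    row_slope = np.gradient(row_mean)       # slope-change curve in the column-coordinate direction (cf. fig. 12)
    return col_mean, row_mean, col_slope, row_slope
```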
And 804, carrying out interpolation processing on the brightness stripe boundary according to the gradient change curve to obtain a target gray level image.
According to the slope change curve corresponding to the mean value curve of the brightness stripe boundaries in the row-coordinate direction of the corrected grayscale map, the computer device performs interpolation on the brightness stripe boundaries in the row-coordinate direction, so that those boundaries can be eliminated. Similarly, according to the slope change curve corresponding to the mean value curve of the brightness stripe boundaries in the column-coordinate direction of the corrected grayscale map, the computer device performs interpolation on the brightness stripe boundaries in the column-coordinate direction, so that the brightness stripe boundaries in the column-coordinate direction can be eliminated. By eliminating the brightness stripe boundaries in both the row-coordinate and column-coordinate directions of the corrected grayscale map, the computer device obtains the target grayscale map.
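One plausible realisation of the interpolation step, assuming the stripe-boundary columns have already been located (for example, as positions where the slope-change curve exceeds a threshold); the column-wise linear interpolation shown here is an assumption rather than the patent's exact procedure, and rows are handled analogously:

```python
import numpy as np

def remove_column_stripes(gray: np.ndarray, stripe_cols: np.ndarray) -> np.ndarray:
    """Linearly interpolate across the columns marked as brightness stripe boundaries."""
    out = gray.astype(np.float64).copy()
    all_cols = np.arange(gray.shape[1])
    keep = np.setdiff1d(all_cols, stripe_cols)  # columns outside the stripe boundaries
    for r in range(gray.shape[0]):
        # Fill each stripe column of this row from the surrounding non-stripe columns
        out[r, stripe_cols] = np.interp(stripe_cols, keep, out[r, keep])
    return out
```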
The method provided by this embodiment for eliminating the brightness stripe boundaries in the corrected grayscale map is simple and easy to understand, and can improve the efficiency of determining the target grayscale map.
Referring to fig. 13, in an embodiment, if the light source to be measured is a non-single light source, generating a target mask image according to the original gray-scale image and the target gray-scale image includes:
and step 131, determining a compensation value of the initial mask image and the brightness fringe boundary according to the target gray-scale image and the original gray-scale image.
Step 132, determining the target mask image according to the initial mask image and the compensation value.
According to the original grayscale map and the target grayscale map obtained when the light source to be measured is a non-single light source, the computer device calculates the difference between the original grayscale map and the target grayscale map to obtain an initial mask image and the compensation values of the brightness stripe boundaries. The initial mask image and the compensation values are then added to obtain the target mask image for the case where the light source to be measured is a non-single light source. In this embodiment, the case where the light source to be measured is a non-single light source is taken into account when determining the target mask image, which improves the accuracy of the obtained target mask image and therefore the effect of adjusting the brightness distribution uniformity of the light source to be measured with the target mask image, so the brightness uniformity adjusting method provided by this embodiment has strong practicability.
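A minimal sketch of the combination step for the non-single (for example, parallel) light source case; clamping back to the 0-255 range is an assumption:

```python
import numpy as np

def non_single_source_mask(initial_mask: np.ndarray, compensation: np.ndarray) -> np.ndarray:
    """Target mask for a non-single light source: initial mask plus the stripe-boundary compensation values."""
    combined = initial_mask.astype(np.float64) + compensation.astype(np.float64)
    return np.clip(combined, 0, 255).astype(np.uint8)
```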
It should be understood that, although the steps in the flowcharts in the figures are shown in order as indicated by the arrows, the steps are not necessarily performed in order as indicated by the arrows. The steps are not performed in the exact order shown and described, and may be performed in other orders, unless explicitly stated otherwise. Moreover, at least some of the steps in the figures may include multiple sub-steps or multiple stages that are not necessarily performed at the same time, but may be performed at different times, and the order of performance of the sub-steps or stages is not necessarily sequential, but may be performed in turn or alternately with other steps or at least some of the sub-steps or stages of other steps.
Referring to fig. 14, an embodiment of the present application provides a luminance uniformity adjusting apparatus 10 including an obtaining module 11, a calculating module 12, a determining module 13, and a generating module 14. Wherein:
the obtaining module 11 is configured to obtain a brightness distribution map of the light source to be detected, and perform graying processing on the brightness distribution map to obtain an original grayscale map;
the calculation module 12 is configured to obtain a gray value interval where a gray value in the original gray image is located, and calculate a gray uniformity of the original gray image based on the gray value interval;
the determining module 13 is configured to, if the gray scale uniformity is inconsistent with a preset target uniformity, adjust the gray scale value of the original gray scale map based on the target uniformity to obtain a target gray scale map;
the generating module 14 is configured to generate a target mask image according to the original grayscale map and the target grayscale map, where the target mask image is configured to be disposed on one side of the light emitting surface of the light source to be measured during the 3D printing process.
In one embodiment, the calculation module 12 comprises an acquisition unit 121 and a determination unit 122, wherein
The obtaining unit 121 is configured to obtain a maximum gray value and a minimum gray value of the original gray map according to the gray value interval;
the determining unit 122 is configured to determine the gray uniformity according to a ratio of the minimum gray value to the maximum gray value.
In one embodiment, the determining module 13 is further configured to obtain a target gray value interval according to a ratio between the maximum gray value and the target uniformity, and a ratio between the minimum gray value and the target uniformity; and adjusting the gray value of the original gray image to be within the target gray value interval to obtain the target gray image.
In one embodiment, the calculation module 12 is further specifically configured to perform partition processing on the original grayscale map to obtain an original partition grayscale map, acquire the mean value of the gray values of each region in the original partition grayscale map, and determine a gray value interval based on the mean values of the gray values of the regions.
In an embodiment, the generating module 14 is specifically configured to calculate a difference between the original grayscale map and the target grayscale map to obtain the target mask image.
In one embodiment, the acquisition module 11 comprises a first processing unit, a transformation unit and a second processing unit. Wherein:
the first processing unit is used for carrying out graying processing on the brightness distribution diagram to obtain an initial gray level diagram;
the transformation unit is used for carrying out perspective transformation on the initial gray level image to obtain a modified gray level image, wherein the modified gray level image comprises a brightness stripe boundary;
and the second processing unit is used for carrying out interpolation processing on the brightness fringe boundary in the corrected gray-scale image to obtain an original gray-scale image.
In one embodiment, the second processing unit is specifically configured to determine, according to the modified gray scale map, a gray scale value distribution of row coordinates and a gray scale value distribution of column coordinates of the modified gray scale map; determining a mean value curve of the brightness fringe boundary according to the gray value distribution of the row coordinate and the gray value distribution of the column coordinate; determining a gradient change curve of the mean curve based on the mean curve; and carrying out interpolation processing on the brightness stripe boundary according to the gradient change curve to obtain a target gray level image.
In one embodiment, the generating module 14 is further configured to determine an initial mask image and the compensation values of the brightness stripe boundaries according to the target grayscale map and the original grayscale map, and to determine the target mask image according to the initial mask image and the compensation values.
For the specific limitations of the brightness uniformity adjusting device 10, reference may be made to the above limitations of the brightness uniformity adjusting method, which are not described herein again. The various modules in the brightness uniformity adjusting apparatus 10 may be implemented in whole or in part by software, hardware, and combinations thereof. The above devices, modules or units may be embedded in hardware or independent from a processor in a computer device, or may be stored in a memory in the computer device in software, so that the processor can call and execute operations corresponding to the above devices or modules.
Referring to fig. 15, in one embodiment, a computer device is provided; the computer device may be a server, and its internal structure may be as shown in fig. 15. The computer device includes a processor, a memory, a network interface, and a database connected through a system bus. The processor of the computer device provides computing and control capabilities. The memory of the computer device includes a nonvolatile storage medium and an internal memory. The nonvolatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of the operating system and the computer program in the nonvolatile storage medium. The database of the computer device is used for storing the brightness distribution map of the light source to be measured, the preset target uniformity, and the like. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by the processor to implement a brightness uniformity adjusting method.
Those skilled in the art will appreciate that the architecture shown in fig. 15 is merely a block diagram of some of the structures associated with the disclosed aspects and is not intended to limit the computing devices to which the disclosed aspects apply, as particular computing devices may include more or less components than those shown, or may combine certain components, or have a different arrangement of components.
In one embodiment, there is provided a computer device comprising a memory and a processor, the memory having stored therein a computer program, the processor implementing the following steps when executing the computer program:
acquiring a brightness distribution diagram of a light source to be detected, and carrying out graying processing on the brightness distribution diagram to obtain an original gray level diagram;
acquiring a gray value interval where the gray value of the original gray image is located, and calculating the gray uniformity of the original gray image based on the gray value interval;
if the gray level uniformity is not consistent with the preset target uniformity, adjusting the gray level value of the original gray level image based on the target uniformity to obtain a target gray level image;
and generating a target mask image according to the original gray-scale image and the target gray-scale image, wherein the target mask image is used for being arranged on one side of the light emitting surface of the light source to be detected in the 3D printing process.
In one embodiment, the processor, when executing the computer program, further performs the steps of: acquiring the maximum gray value and the minimum gray value of the original gray map according to the gray value interval; and determining the gray uniformity according to the ratio of the minimum gray value to the maximum gray value.
In one embodiment, the processor, when executing the computer program, further performs the steps of: obtaining a target gray value interval according to the ratio between the maximum gray value and the target uniformity and the ratio between the minimum gray value and the target uniformity; and adjusting the gray value of the original gray image to be within the target gray value interval to obtain the target gray image.
In one embodiment, the processor, when executing the computer program, further performs the steps of: carrying out partition processing on the original gray-scale image to obtain an original partition gray-scale image; and acquiring the mean value of the gray value of each area in the original partition gray image, and determining a gray value interval based on the mean value of the gray value of each area.
In one embodiment, the processor, when executing the computer program, further performs the steps of: and calculating the difference between the original gray-scale image and the target gray-scale image to obtain the target mask image.
In one embodiment, the processor, when executing the computer program, further performs the steps of: carrying out graying processing on the brightness distribution diagram to obtain an initial gray level diagram; carrying out perspective transformation on the initial gray level image to obtain a modified gray level image, wherein the modified gray level image comprises a brightness stripe boundary; and carrying out interpolation processing on the brightness fringe boundary in the corrected gray-scale image to obtain an original gray-scale image.
In one embodiment, the processor, when executing the computer program, further performs the steps of: determining the gray value distribution of the row coordinate and the gray value distribution of the column coordinate of the corrected gray map according to the corrected gray map; determining a mean value curve of the brightness fringe boundary according to the gray value distribution of the row coordinate and the gray value distribution of the column coordinate; determining a gradient change curve of the mean curve based on the mean curve; and carrying out interpolation processing on the brightness stripe boundary according to the gradient change curve to obtain a target gray level image.
In one embodiment, the processor, when executing the computer program, further performs the steps of: determining a compensation value of the initial mask image and the brightness fringe boundary according to the target gray-scale image and the original gray-scale image; and determining the target mask image according to the initial mask image and the compensation value.
In one embodiment, a computer-readable storage medium is provided, having a computer program stored thereon, which when executed by a processor, performs the steps of:
acquiring a brightness distribution diagram of a light source to be detected, and carrying out graying processing on the brightness distribution diagram to obtain an original gray level diagram;
acquiring a gray value interval where the gray value of the original gray image is located, and calculating the gray uniformity of the original gray image based on the gray value interval;
if the gray level uniformity is not consistent with the preset target uniformity, adjusting the gray level value of the original gray level image based on the target uniformity to obtain a target gray level image;
and generating a target mask image according to the original gray-scale image and the target gray-scale image, wherein the target mask image is used for being arranged on one side of the light emitting surface of the light source to be detected in the 3D printing process.
In one embodiment, the computer program when executed by the processor further performs the steps of: acquiring the maximum gray value and the minimum gray value of the original gray map according to the gray value interval; and determining the gray uniformity according to the ratio of the minimum gray value to the maximum gray value.
In one embodiment, the computer program when executed by the processor further performs the steps of: obtaining a target gray value interval according to the ratio between the maximum gray value and the target uniformity and the ratio between the minimum gray value and the target uniformity; and adjusting the gray value of the original gray image to be within the target gray value interval to obtain the target gray image.
In one embodiment, the computer program when executed by the processor further performs the steps of: carrying out partition processing on the original gray-scale image to obtain an original partition gray-scale image; and acquiring the mean value of the gray value of each area in the original partition gray image, and determining a gray value interval based on the mean value of the gray value of each area.
In one embodiment, the computer program when executed by the processor further performs the steps of: and calculating the difference between the original gray-scale image and the target gray-scale image to obtain the target mask image.
In one embodiment, the computer program when executed by the processor further performs the steps of: carrying out graying processing on the brightness distribution diagram to obtain an initial gray level diagram; carrying out perspective transformation on the initial gray level image to obtain a modified gray level image, wherein the modified gray level image comprises a brightness stripe boundary; and carrying out interpolation processing on the brightness fringe boundary in the corrected gray-scale image to obtain an original gray-scale image.
In one embodiment, the computer program when executed by the processor further performs the steps of: determining, from the corrected grayscale map, the gray-value distribution along the row coordinate and along the column coordinate; determining a mean-value curve at the brightness stripe boundary from these two distributions; determining a gradient-change curve of the mean-value curve; and interpolating across the brightness stripe boundary according to the gradient-change curve to obtain the original grayscale map.
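A rough sketch of these sub-steps for the column direction (the row direction would be handled analogously); the gradient threshold used to mark boundary columns is an assumption:

    import numpy as np

    def smooth_stripe_boundaries(corrected_gray: np.ndarray, grad_thresh: float = 8.0) -> np.ndarray:
        out = corrected_gray.astype(np.float32).copy()
        col_mean = out.mean(axis=0)                # gray-value distribution along the column coordinate
        gradient = np.abs(np.gradient(col_mean))   # gradient-change curve of the mean-value curve
        boundary = gradient > grad_thresh          # columns crossing a brightness stripe boundary
        good = ~boundary
        cols = np.arange(out.shape[1])
        for r in range(out.shape[0]):              # re-fill boundary columns by linear interpolation
            out[r, boundary] = np.interp(cols[boundary], cols[good], out[r, good])
        return out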
In one embodiment, the computer program when executed by the processor further performs the steps of: determining an initial mask image and a compensation value for the brightness stripe boundary according to the target grayscale map and the original grayscale map; and determining the target mask image according to the initial mask image and the compensation value.
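A sketch of how the boundary compensation might be combined with an initial mask; the boolean boundary map and the additive combination are assumptions:

    import numpy as np

    def compose_target_mask(initial_mask, original_gray, target_gray, boundary):
        # boundary: boolean map of the brightness-stripe-boundary region (assumed known).
        compensation = original_gray.astype(np.float32) - target_gray.astype(np.float32)
        out = initial_mask.astype(np.float32).copy()
        out[boundary] += compensation[boundary]     # apply the compensation only at the boundary
        return np.clip(out, 0, 255).astype(np.uint8)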
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing the relevant hardware; the program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, a database, or another medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; nevertheless, any combination of them that involves no contradiction should be considered within the scope of this specification.
The above embodiments express only several implementations of the present application, and their description is specific and detailed, but they should not therefore be construed as limiting the scope of the application. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, and these fall within its scope of protection. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (11)

1. A brightness uniformity adjusting method, comprising:
acquiring a brightness distribution map of a light source to be detected, and converting the brightness distribution map to grayscale to obtain an original grayscale map;
acquiring the gray-value interval in which the gray values of the original grayscale map lie, and calculating the gray uniformity of the original grayscale map based on that interval;
if the gray uniformity does not match a preset target uniformity, adjusting the gray values of the original grayscale map based on the target uniformity to obtain a target grayscale map;
and generating a target mask image from the original grayscale map and the target grayscale map, wherein the target mask image is arranged on the light-emitting side of the light source to be detected during 3D printing.
2. The brightness uniformity adjusting method according to claim 1, wherein said calculating the gray uniformity of the original gray map based on the gray value interval comprises:
acquiring the maximum gray value and the minimum gray value of the original gray map according to the gray value interval;
and determining the gray uniformity according to the ratio of the minimum gray value to the maximum gray value.
3. The brightness uniformity adjusting method according to claim 2, wherein the adjusting the gray-level values of the original gray-level map based on the target uniformity to obtain a target gray-level map comprises:
obtaining a target gray value interval according to the ratio between the maximum gray value and the target uniformity and the ratio between the minimum gray value and the target uniformity;
and adjusting the gray value of the original gray image to the target gray value interval to obtain the target gray image.
4. The brightness uniformity adjusting method according to claim 1, wherein acquiring the gray-value interval in which the gray values of the original grayscale map lie comprises:
partitioning the original grayscale map into regions to obtain an original partitioned grayscale map;
and obtaining the mean gray value of each region in the original partitioned grayscale map, and determining the gray-value interval based on these region means.
5. The brightness uniformity adjusting method according to claim 1, wherein generating the target mask image from the original grayscale map and the target grayscale map comprises:
and calculating the difference between the original gray-scale image and the target gray-scale image to obtain the target mask image.
6. The brightness uniformity adjusting method according to claim 1, wherein, if the light source to be detected is not a single light source, converting the brightness distribution map to grayscale to obtain the original grayscale map comprises:
converting the brightness distribution map to grayscale to obtain an initial grayscale map;
performing a perspective transformation on the initial grayscale map to obtain a corrected grayscale map, wherein the corrected grayscale map contains a brightness stripe boundary;
and interpolating across the brightness stripe boundary in the corrected grayscale map to obtain the original grayscale map.
7. The brightness uniformity adjusting method according to claim 6, wherein interpolating across the brightness stripe boundary in the corrected grayscale map to obtain the original grayscale map comprises:
determining, from the corrected grayscale map, the gray-value distribution along the row coordinate and along the column coordinate;
determining a mean-value curve at the brightness stripe boundary from these two distributions;
determining a gradient-change curve of the mean-value curve;
and interpolating across the brightness stripe boundary according to the gradient-change curve to obtain the original grayscale map.
8. The brightness uniformity adjusting method according to claim 6, wherein, if the light source to be detected is not a single light source, generating the target mask image from the original grayscale map and the target grayscale map comprises:
determining an initial mask image and a compensation value for the brightness stripe boundary according to the target grayscale map and the original grayscale map;
and determining the target mask image according to the initial mask image and the compensation value.
9. A luminance uniformity adjusting apparatus, comprising:
the acquisition module is used for acquiring a brightness distribution map of a light source to be detected and carrying out graying processing on the brightness distribution map to obtain an original gray scale map;
the calculation module is used for acquiring a gray value interval where each gray value in the original gray image is located and calculating the gray uniformity of the original gray image based on the gray value interval;
the determining module is used for adjusting the gray values of the original grayscale map based on the target uniformity to obtain a target grayscale map if the gray uniformity does not match a preset target uniformity;
and the generating module is used for generating a target mask image from the original grayscale map and the target grayscale map, the target mask image being arranged on the light-emitting side of the light source to be detected during 3D printing.
10. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method of any of claims 1 to 8.
11. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 8.
CN202110199525.2A 2021-02-23 2021-02-23 Brightness uniformity adjustment method, brightness uniformity adjustment device, computer device, and readable storage medium Active CN113034382B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110199525.2A CN113034382B (en) 2021-02-23 2021-02-23 Brightness uniformity adjustment method, brightness uniformity adjustment device, computer device, and readable storage medium

Publications (2)

Publication Number Publication Date
CN113034382A true CN113034382A (en) 2021-06-25
CN113034382B CN113034382B (en) 2024-04-30

Family

ID=76461014

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110199525.2A Active CN113034382B (en) 2021-02-23 2021-02-23 Brightness uniformity adjustment method, brightness uniformity adjustment device, computer device, and readable storage medium

Country Status (1)

Country Link
CN (1) CN113034382B (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020121616A1 (en) * 2000-07-07 2002-09-05 Sony Corporation Quantity-of-light unevenness inspection apparatus, and quantity-of-light unevenness inspection method
US20110007132A1 (en) * 2009-07-07 2011-01-13 William Gibbens Redmann Method and system for brightness correction for three-dimensional (3D) projection
KR20190011419A (en) * 2017-07-25 2019-02-07 주식회사 레이 Calibration Method of 3D Printer
CN107941808A (en) * 2017-11-10 2018-04-20 中国计量大学 3D printing Forming Quality detecting system and method based on machine vision
CN108564633A (en) * 2018-01-05 2018-09-21 珠海市杰理科技股份有限公司 Gray-scale Image Compression method, apparatus and computer equipment
CN108961175A (en) * 2018-06-06 2018-12-07 平安科技(深圳)有限公司 Face luminance regulating method, device, computer equipment and storage medium
CN109801240A (en) * 2019-01-15 2019-05-24 武汉鸿瑞达信息技术有限公司 A kind of image enchancing method and image intensifier device
CN109856164A (en) * 2019-02-02 2019-06-07 上海福赛特机器人有限公司 A kind of machine vision acquires the optimization device and its detection method of a wide range of image
CN112172149A (en) * 2020-09-30 2021-01-05 深圳市创想三维科技有限公司 Method, device and equipment for automatically improving printing effect and storage medium

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113469918A (en) * 2021-07-22 2021-10-01 广州黑格智造信息科技有限公司 Method and device for calibrating exposure surface of optical system, computer equipment and storage medium
WO2023001306A1 (en) * 2021-07-22 2023-01-26 广州黑格智造信息科技有限公司 Exposure surface calibration method and apparatus for optical system, calibration measurement method and apparatus, computer device, and storage medium
CN113469918B (en) * 2021-07-22 2024-02-02 广州黑格智造信息科技有限公司 Method and device for calibrating exposure surface of optical system, computer equipment and storage medium
CN114281274A (en) * 2021-11-30 2022-04-05 深圳市纵维立方科技有限公司 Method for adjusting brightness uniformity, printing method, printing system and equipment
CN114559653A (en) * 2022-01-07 2022-05-31 宁波智造数字科技有限公司 Photocuring 3D printing uniformity adjusting process method utilizing cube matrix
CN114559653B (en) * 2022-01-07 2024-01-19 宁波智造数字科技有限公司 Photo-curing 3D printing uniformity adjustment method using cube matrix

Also Published As

Publication number Publication date
CN113034382B (en) 2024-04-30

Similar Documents

Publication Publication Date Title
CN113034382B (en) Brightness uniformity adjustment method, brightness uniformity adjustment device, computer device, and readable storage medium
US9007457B2 (en) Acquisition of 3D topographic images of tool marks using non-linear photometric stereo method
JP4917351B2 (en) Calibration method in three-dimensional shape measuring apparatus
CN108195316B (en) Three-dimensional measurement method and device based on self-adaptive phase error correction
JP6161276B2 (en) Measuring apparatus, measuring method, and program
EP3394830B1 (en) Data processing apparatus and method of controlling same
JP5971050B2 (en) Shape measuring apparatus and shape measuring method
CN109781030B (en) Phase correction method and device based on point spread function estimation
WO2020151153A1 (en) Image processing method and apparatus, and computer device and storage medium
US11055855B2 (en) Method, apparatus, device, and storage medium for calculating motion amplitude of object in medical scanning
CN113352618A (en) Gray level setting method and device of 3D printer and 3D printer
JP2016180708A (en) Distance measurement device, distance measurement method and program
CN110738603A (en) image gray scale processing method, device, computer equipment and storage medium
US11043009B2 (en) Method and device for calibrating depth of 3D camera, and computer device
US8588507B2 (en) Computing device and method for analyzing profile tolerances of products
CN110046573B (en) Face image recognition method and device, computer equipment and storage medium
CN114777687B (en) Regional phase error compensation method and device based on probability distribution function
CN110910436B (en) Distance measuring method, device, equipment and medium based on image information enhancement technology
CN113793402B (en) Image rendering method and device, electronic equipment and storage medium
CN109144436B (en) Liquid level control method and device, computer equipment and storage medium
JP6590489B2 (en) Information processing apparatus and method
Liu et al. A novel 3D scanning technique for reflective metal surface based on HDR-like image from pseudo exposure image fusion method
CN112229349A (en) Method and device for determining working area of agricultural machine and agricultural machine
US11138702B2 (en) Information processing apparatus, information processing method and non-transitory computer readable storage medium
CN110595353A (en) Calibration positioning method and device based on calibration plate and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information
Country or region after: China
Address after: 518100 1808, Jinxiu Hongdu building, Meilong Avenue, Xinniu community, Minzhi street, Longhua District, Shenzhen City, Guangdong Province (office address)
Applicant after: Shenzhen chuangxiang 3D Technology Co.,Ltd.
Address before: 518100 12th floor, building 3, Jincheng Industrial Park, 19 Huafan Road, Dalang street, Longhua District, Shenzhen City, Guangdong Province
Applicant before: Shenzhen Chuangxiang 3D Technology Co.,Ltd.
Country or region before: China
GR01 Patent grant