CN111105365A - Color correction method, medium, terminal and device for texture image

Color correction method, medium, terminal and device for texture image

Info

Publication number
CN111105365A
Authority
CN
China
Prior art keywords
brightness
target
light source
texture image
pixel point
Prior art date
Legal status
Granted
Application number
CN201911232222.5A
Other languages
Chinese (zh)
Other versions
CN111105365B (en)
Inventor
李云强
陈颖
武云钢
Current Assignee
Shenzhen Jimu Yida Science And Technology Co ltd
Original Assignee
Shenzhen Jimu Yida Science And Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Jimu Yida Science And Technology Co ltd filed Critical Shenzhen Jimu Yida Science And Technology Co ltd
Priority to CN201911232222.5A priority Critical patent/CN111105365B/en
Publication of CN111105365A publication Critical patent/CN111105365A/en
Application granted granted Critical
Publication of CN111105365B publication Critical patent/CN111105365B/en
Status: Active



Classifications

    • G06T 5/00: Image enhancement or restoration
    • G06T 7/90: Image analysis; determination of colour characteristics
    • G06T 2207/10004: Image acquisition modality; still image; photographic image
    • G06T 2207/10012: Image acquisition modality; stereo images
    • G06T 2207/10024: Image acquisition modality; color image


Abstract

The invention discloses a color correction method, medium, terminal and device for texture images. The method comprises the following steps: irradiating a reference plane with a target light source, collecting a test texture image of the reference plane, and calibrating the brightness of the target light source; irradiating an object to be measured with the brightness-calibrated target light source, collecting a target texture image of the object, and generating the reflection coefficient of the object; and correcting the brightness of the object in the target texture image according to its reflection coefficient. With this method, the light intensity distribution of the light source can be calibrated without complex equipment, and the collected texture images are then color-corrected using the object's depth, normal and other information. This keeps the brightness of same-material areas on the object consistent across texture images taken at different collection distances and angles, so the overall brightness of the texture map is uniform, colors are not distorted, and the final texture mapping effect is more natural and vivid.

Description

Color correction method, medium, terminal and device for texture image
[ technical field ]
The invention relates to the field of image acquisition, in particular to a color correction method, a medium, a terminal and a device for texture images.
[ background of the invention ]
In recent years, demand for three-dimensional visualization has grown steadily, and virtual scene technology has developed rapidly. Within three-dimensional visualization, texture mapping that combines texture images with three-dimensional models is very common. In general, to obtain more realistic texture information, a scanner scans a model repeatedly during three-dimensional scanning measurement, generating texture images from multiple different viewpoints. Shots taken from different angles exhibit inconsistent and discontinuous illumination and color effects. At the same time, because of the shape of the object itself and external environmental factors, the different images obtained with the camera are not under the same illumination model; yet constructing a uniform illumination environment is difficult, and even a uniform illumination environment cannot resolve the brightness differences caused by the object's normal directions. The result is discontinuous texture mapping with unnatural brightness transitions, and such fragmented texture surfaces lack realism. To overcome these problems, texture images from different viewpoints are often corrected before texture mapping to ensure that their overall brightness distribution is uniform, usually with a Wallis or MASK dodging algorithm. However, existing dodging algorithms have the following problems: (1) images of poor shooting quality (too bright or too dark) are corrected poorly; (2) for multiple images of the same object shot from multiple viewing angles, the corrected images cannot be guaranteed to have consistent brightness at the same position.
[ summary of the invention ]
The invention provides a color correction method, medium, terminal and device for texture images, which solve the above technical problems.
The technical scheme for solving the technical problems is as follows: a color correction method for texture images comprises the following steps:
step 1, irradiating a reference plane by using a target light source, collecting a test texture image of the reference plane, and calibrating the brightness of the target light source according to the test texture image;
step 2, irradiating an object to be detected by adopting a target light source with calibrated brightness, collecting a target texture image of the object to be detected, and generating a reflection coefficient of the object to be detected;
and step 3, correcting the brightness of the object to be detected in the target texture image according to the reflection coefficient of the object to be detected, so that the brightness of the same material area of the object to be detected is kept consistent in all the target texture images, as sketched below.
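Purely as an orientation aid (not part of the patent disclosure), the three steps compose as follows. Every function name is a hypothetical placeholder referring to the illustrative sketches given after the corresponding step below, not an API defined by the patent.

```python
def color_correction_pipeline(test_data, target_data, l_constant=200):
    """Hypothetical composition of steps 1-3; see the per-step sketches below
    for light_source_brightness, smooth_light_source, reflection_coefficient
    and correct_brightness."""
    # Step 1: calibrate the per-pixel light source brightness Io on a reference plane
    maps = [light_source_brightness(refl, pts, nrm) for refl, pts, nrm in test_data]
    i_o = smooth_light_source(maps)
    # Steps 2-3: recover per-pixel reflection coefficients and correct each image
    corrected = []
    for refl, pts, nrm in target_data:
        lam = reflection_coefficient(refl, pts, nrm, i_o)
        corrected.append(correct_brightness(lam, l_constant))
    return corrected
```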
In a preferred embodiment, a target light source is adopted to irradiate a reference plane and a test texture image of the reference plane is acquired, and the brightness of the target light source is calibrated according to the test texture image, which specifically includes the following steps:
s101, irradiating a reference plane by using a target light source, and acquiring a plurality of test texture images of the reference plane at different shooting distances and the same shooting angle and/or at the same shooting distance and different shooting angles and three-dimensional point cloud information corresponding to each test texture image by using three-dimensional scanning equipment;
s102, establishing a three-dimensional irradiation model for each pixel point of a reference plane, marking the reflection brightness, three-dimensional point cloud information and normal line information of each pixel point in the three-dimensional irradiation model, and calculating the distance from the three-dimensional point corresponding to each pixel point to the optical center of a target light source and the included angle between the light direction corresponding to each pixel point and the normal line direction according to a marking result;
s103, calculating the light source brightness corresponding to each pixel point in each test texture image by adopting a first preset formula, wherein the first preset formula is as follows:
Iij = I's / (λ' * F(θ', d0²)),
wherein Iij represents the light source brightness corresponding to pixel point (i, j) in the test texture image, I's represents the reflection brightness of pixel point (i, j) in the test texture image, λ' is the preset reflection coefficient corresponding to the reference plane, d0 is the distance from the three-dimensional point corresponding to the pixel point in the test texture image to the optical center of the target light source, and θ' is the included angle between the light direction corresponding to the pixel point in the test texture image and the normal direction;
s104, testing a plurality of sheetsFiltering or convex optimization processing is carried out on the light source brightness corresponding to the same pixel point in the texture image to generate target light source brightness I in smooth distributiono
In a preferred embodiment, the method for generating the reflection coefficient of the object to be measured by using the target light source with calibrated brightness to irradiate the object to be measured and collecting the target texture image of the object to be measured specifically comprises the following steps:
s201, irradiating an object to be detected by adopting a target light source with calibrated brightness, and acquiring a three-dimensional model of the object to be detected, target texture images of the object to be detected at different shooting distances and target texture images at the same shooting distance and different shooting angles by utilizing the three-dimensional scanning equipment;
s202, back projecting the three-dimensional model of the object to be detected into each target texture image by combining the parameters of a camera in the three-dimensional scanning equipment to obtain the reflection brightness information I corresponding to each pixel point in each target texture imagesThree-dimensional point cloud information (x, y, z) and normal line information (nx, ny, nz);
s203, establishing a three-dimensional irradiation model for each pixel point in the target texture image, and acquiring the reflection brightness information IsMarking the three-dimensional point cloud information (x, y, z), the normal information (nx, ny, nz) and the target light source in the three-dimensional irradiation model, and generating the distance d from the three-dimensional point (x, y, z) corresponding to the pixel point to the optical center of the target light source and the direction of the light ray corresponding to the pixel point according to the marking result
Figure BDA0002303868080000032
And normal direction
Figure BDA0002303868080000033
The cosine of angle theta, wherein
Figure BDA0002303868080000034
S204, calculating the reflection coefficient λ corresponding to each pixel point by using a second preset formula combined with the calibrated target light source brightness, wherein the second preset formula is
Io = Is / (λ * F(θ, d²)),
where F(θ, d²) is the preset illumination attenuation function built from cos θ and d², with k1, k2 and k3 as its preset coefficients,
and where Io represents the target light source brightness, Is represents the reflection brightness of the pixel point in the target texture image, d is the distance from the three-dimensional point corresponding to the pixel point in the target texture image to the optical center of the target light source, and θ is the included angle between the light direction and the normal direction corresponding to the pixel point in the target texture image, as sketched below.
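Rearranging the second preset formula gives λ = Is / (Io · F(θ, d²)); a sketch under the same assumptions as the step-1 sketch (camera-frame points, hypothetical attenuation function):

```python
import numpy as np

def reflection_coefficient(refl, points, normals, i_o):
    """Second preset formula solved for lambda: lambda = Is / (Io * F(theta, d^2)).

    refl: (H, W) reflection brightness Is of one target texture image
    i_o:  (H, W) calibrated target light source brightness
    attenuation() is the assumed F from the step-1 sketch above.
    """
    d = np.linalg.norm(points, axis=-1)
    rays = points / d[..., None]
    cos_theta = np.abs(np.sum(rays * normals, axis=-1))
    return refl / (i_o * attenuation(cos_theta, d))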
In a preferred embodiment, the correcting the brightness of the object in the target texture image according to the reflection coefficient of the object specifically includes:
s301, acquiring L corresponding to the target texture imageconstantValue as L per pixelconstantA value;
s302, calculating a corrected brightness corresponding to the target texture image by using a third preset formula, and generating a target corrected image corresponding to the target texture image, where the third preset formula is:
Ic=λ*Lconstant,
wherein Ic represents the corrected brightness of each pixel point in the target texture image, and λ represents the reflection coefficient corresponding to each pixel point in the target texture image.
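The correction itself is a per-pixel scaling; a minimal sketch, where clipping to the 8-bit range is an added assumption consistent with Lconstant ∈ [0, 255] in the detailed description:

```python
import numpy as np

def correct_brightness(lam, l_constant=200):
    """Third preset formula: Ic = lambda * Lconstant (clipped to 8-bit range)."""
    return np.clip(lam * float(l_constant), 0, 255).astype(np.uint8)
```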
A second aspect of the embodiments of the present invention provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a processor, the computer program implements the color correction method for texture images described above.
A third aspect of the embodiments of the present invention provides a color correction terminal for a texture image, including the computer-readable storage medium and a processor, where the processor implements the steps of the color correction method for the texture image when executing a computer program on the computer-readable storage medium.
A fourth aspect of the present invention provides a color correction device for texture images, comprising a calibration module, a calculation module and a correction module,
the calibration module is used for adopting a target light source to irradiate a reference plane, acquiring a test texture image of the reference plane, and calibrating the brightness of the target light source according to the test texture image;
the calculation module is used for irradiating an object to be detected by adopting a target light source with calibrated brightness, acquiring a target texture image of the object to be detected and generating a reflection coefficient of the object to be detected;
the correction module is used for correcting the brightness of the object to be detected in the target texture images according to the reflection coefficient of the object to be detected so as to keep the brightness of the same material area of the object to be detected in all the target texture images consistent.
In a preferred embodiment, the calibration module specifically includes:
the first acquisition unit is used for irradiating a reference plane by adopting a target light source and acquiring a plurality of test texture images of the reference plane at different shooting distances and the same shooting angle and/or at the same shooting distance and different shooting angles and three-dimensional point cloud information corresponding to each test texture image by utilizing three-dimensional scanning equipment;
the first model establishing unit is used for establishing a three-dimensional illumination model aiming at each pixel point of a reference plane, marking the reflection brightness, three-dimensional point cloud information and normal line information of each pixel point in the three-dimensional illumination model, and calculating the distance from the three-dimensional point corresponding to each pixel point to the optical center of a target light source and the included angle between the light direction corresponding to each pixel point and the normal line direction according to the marking result;
the first calculating unit is used for calculating the light source brightness corresponding to each pixel point in each test texture image by adopting a first preset formula, wherein the first preset formula is as follows:
Iij = I's / (λ' * F(θ', d0²)),
wherein Iij represents the light source brightness corresponding to pixel point (i, j) in the test texture image, I's represents the reflection brightness of pixel point (i, j) in the test texture image, λ' is the preset reflection coefficient corresponding to the reference plane, d0 is the distance from the three-dimensional point corresponding to the pixel point in the test texture image to the optical center of the target light source, and θ' is the included angle between the light direction corresponding to the pixel point in the test texture image and the normal direction;
an optimization unit for performing filtering or convex optimization processing on the light source brightness corresponding to the same pixel point in the multiple test texture images to generate a smoothly distributed target light source brightness Io
In a preferred embodiment, the calculation module specifically includes:
the second acquisition unit is used for irradiating the object to be detected by adopting the target light source with the calibrated brightness, and acquiring a three-dimensional model of the object to be detected, target texture images of the object to be detected at different shooting distances and target texture images at the same shooting distance and different shooting angles by utilizing the three-dimensional scanning equipment;
an information acquisition unit, configured to back-project the three-dimensional model of the object to be measured to each target texture image by combining the parameters of the camera in the three-dimensional scanning equipment, to obtain the reflection brightness information Is, the three-dimensional point cloud information (x, y, z) and the normal information (nx, ny, nz) corresponding to each pixel point in each target texture image;
a second model establishing unit, configured to establish a three-dimensional irradiation model for each pixel point in the target texture image, mark the acquired reflection brightness information Is, the three-dimensional point cloud information (x, y, z), the normal information (nx, ny, nz) and the target light source in the three-dimensional irradiation model, and generate, according to the marking result, the distance d from the three-dimensional point (x, y, z) corresponding to the pixel point to the optical center of the target light source and the cosine cos θ of the included angle θ between the light direction r = (x, y, z) corresponding to the pixel point and the normal direction n = (nx, ny, nz), where cos θ = (x·nx + y·ny + z·nz) / (√(x² + y² + z²) · √(nx² + ny² + nz²));
a second calculating unit, configured to calculate the reflection coefficient λ corresponding to each pixel point by using a second preset formula combined with the calibrated target light source brightness, where the second preset formula is
Io = Is / (λ * F(θ, d²)),
where F(θ, d²) is the preset illumination attenuation function built from cos θ and d², with k1, k2 and k3 as its preset coefficients,
and where Io represents the target light source brightness, Is represents the reflection brightness of the pixel point in the target texture image, d is the distance from the three-dimensional point corresponding to the pixel point in the target texture image to the optical center of the target light source, and θ is the included angle between the light direction and the normal direction corresponding to the pixel point in the target texture image.
In a preferred embodiment, the correction module specifically includes:
a query unit, configured to obtain the Lconstant value corresponding to the target texture image as the Lconstant value of each pixel point;
the correction unit is configured to calculate a correction brightness corresponding to the target texture image by using a third preset formula, and generate a target correction image corresponding to the target texture image, where the third preset formula is:
Ic=λ*Lconstant,
wherein Ic represents the corrected brightness of each pixel point in the target texture image, and λ represents the reflection coefficient corresponding to each pixel point in the target texture image.
The invention provides a color correction method for texture images that calibrates the light intensity distribution of the light source without complex equipment and then color-corrects the collected texture images using the object's depth, normal and other information. This keeps the brightness of same-material areas on the object consistent across texture images taken at different collection distances and angles, ensuring that the overall brightness of the texture map is uniform, colors are not distorted, and the final texture mapping effect is more natural and vivid.
In order to make the aforementioned and other objects, features and advantages of the present invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
[ description of the drawings ]
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present invention and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained according to the drawings without inventive efforts.
FIG. 1 is a flowchart illustrating a color correction method according to an embodiment of the present invention;
FIG. 2 is a diagram illustrating an illumination model according to another embodiment of the present invention;
FIG. 3 is a block diagram of a white wall image according to another embodiment of the present invention;
FIG. 4 is a diagram illustrating an illumination calibration image according to another embodiment of the present invention;
FIG. 5 is a block diagram illustrating a white wall corrected image according to another embodiment of the present invention;
FIG. 6 is a detailed comparison diagram before and after image correction according to another embodiment of the present invention;
FIG. 7 is a bedside texture image according to another embodiment of the present invention;
FIG. 8 is a close-up view of a bedside texture image in accordance with another embodiment of the present invention;
FIG. 9 is a three-dimensional model of a head of a bed as described in another embodiment of the invention;
FIG. 10 is a block diagram of a corrected bedside texture image according to another embodiment of the present invention;
FIG. 11 is a close-up image of a corrected bed side texture according to another embodiment of the present invention;
FIG. 12 is a block diagram of a bedside original texture image after being mapped according to another embodiment of the present invention;
FIG. 13 is a block diagram illustrating the bedside corrected texture image mapping according to another embodiment of the present invention;
FIG. 14 is a schematic structural diagram of a color correction apparatus according to another embodiment of the present invention;
fig. 15 is a schematic structural diagram of a color correction terminal according to another embodiment of the present invention.
[ detailed description ]
In order to make the objects, technical solutions and advantageous effects of the present invention more clearly apparent, the present invention is further described in detail below with reference to the accompanying drawings and the detailed description. It should be understood that the detailed description and specific examples, while indicating the preferred embodiment of the invention, are intended for purposes of illustration only and are not intended to limit the scope of the invention.
Fig. 1 is a flowchart illustrating a color correction method for a texture image according to an embodiment 1 of the present invention, as shown in fig. 1, including the following steps:
step 1, a target light source is adopted to irradiate a reference plane, a test texture image of the reference plane is collected, and the brightness of the target light source is calibrated according to the test texture image. Specifically, the target light source when shooting the object is shown as a circle in fig. 2, and the optical center of the camera and the center of the target light source are approximately coincident, where the brightness of any point in space can be expressed as:
If ≈ Io * F(θ, d²)    (1)
Is = λ * If    (2)
By combining the above formulas, we can obtain:
Is = λ * Io * F(θ, d²)    (3)
wherein Io is the target light source brightness; after modulation by distance and angle, that is, the mapping of the function F, the brightness If is obtained at the object surface; after reflection at the object surface, the reflection brightness Is of the point is finally obtained, i.e. the brightness of the point as collected by the camera. Here λ is the reflection coefficient, which is related to the material of the object itself. According to the Lambertian assumption, the diffuse reflected light intensity (If) is inversely proportional to the square of the distance (d) and proportional to the cosine of the object angle, cos(θ). Therefore, in this embodiment, F(θ, d²) is taken to be proportional to cos θ and inversely proportional to d², with k1, k2 and k3 as preset coefficients.
In this embodiment, the illumination calibration method calibrates the target light source intensity Io, and specifically comprises the following steps:
s101, irradiating a reference plane by using a target light source, and acquiring a plurality of test texture images of the reference plane at different shooting distances and the same shooting angle and/or at the same shooting distance and different shooting angles and three-dimensional point cloud information corresponding to each test texture image by using a three-dimensional scanning device. Specifically, firstly, proper scanning equipment parameters are adjusted to ensure that texture images within a range from d1 to d2 away from a reference plane are not too bright or too dark, then three-dimensional scanning hardware equipment is aligned to the reference plane, within a range from d1 to d2, multiple texture images are shot by near-far continuous moving equipment and corresponding three-dimensional point cloud information is obtained, then, within a range from d1 to d2, the angle of continuous rotating equipment is theta1~θ2And shooting a plurality of texture images and obtaining corresponding three-dimensional point cloud information.
And then S102 is executed, a three-dimensional irradiation model is established for each pixel point of the reference plane, the reflection brightness, the three-dimensional point cloud information and the normal line information of each pixel point are marked in the three-dimensional irradiation model, and the distance from the three-dimensional point corresponding to each pixel point to the optical center of the target light source and the included angle between the light direction corresponding to each pixel point and the normal line direction are calculated according to the marking result. Then, calculating the light source brightness corresponding to each pixel point in each test texture image by adopting a first preset formula, wherein the first preset formula is as follows:
Iij = I's / (λ' * F(θ', d0²)),
wherein Iij represents the light source brightness corresponding to pixel point (i, j) in the test texture image, I's represents the reflection brightness of pixel point (i, j) in the test texture image, λ' is the preset reflection coefficient corresponding to the reference plane, d0 is the distance from the three-dimensional point corresponding to the pixel point in the test texture image to the optical center of the target light source, and θ' is the included angle between the light direction corresponding to the pixel point in the test texture image and the normal direction.
Then, S104 is executed: the light source brightness values corresponding to the same pixel point in the multiple test texture images are filtered or convexly optimized, for example by weighted averaging, to generate a smoothly distributed target light source brightness Io, as sketched below.
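The patent leaves the smoothing method open ("filtering or convex optimization, for example weighted averaging"). One plausible reading, equal-weight averaging per pixel followed by a Gaussian filter, could look like this:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def smooth_light_source(brightness_maps, sigma=5.0):
    """Fuse per-image Iij estimates into one smoothly distributed Io map.

    brightness_maps: list of (H, W) arrays from the first preset formula.
    Equal-weight averaging plus Gaussian smoothing stands in for the
    'filtering or convex optimization' step; the patent does not fix it.
    """
    mean_map = np.mean(np.stack(brightness_maps), axis=0)  # weighted average, equal weights
    return gaussian_filter(mean_map, sigma=sigma)          # smooth Io distribution
```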
In the above preferred embodiment, the brightness of the target light source is calibrated from the three-dimensional point cloud information, the normal information and the reflection brightness of each pixel point in the test texture image; in other embodiments, the brightness of the target light source may also be calibrated from the RGB values of the test texture image, which is not described in detail herein but still falls within the protection scope of the present invention.
Then step 2 is executed, the target light source with the calibrated brightness is adopted to irradiate the object to be measured, the target texture image of the object to be measured is collected, and the reflection coefficient of the object to be measured is generated according to the three-dimensional point cloud information, the normal line information and the reflection brightness of each pixel point in the target texture image, and the method specifically comprises the following steps:
s201, irradiating an object to be detected by adopting a target light source with calibrated brightness, and acquiring a three-dimensional model of the object to be detected, target texture images of the object to be detected at different shooting distances and target texture images at the same shooting distance and different shooting angles by utilizing the three-dimensional scanning equipment;
s202, back projecting the three-dimensional model of the object to be detected into each target texture image by combining the parameters of a camera in the three-dimensional scanning equipment to obtain the reflection brightness information I corresponding to each pixel point in each target texture images(i.e. brightness I acquired by the camera)s) And calculating the reflection coefficient corresponding to each pixel point by combining the calibrated target light source brightness. Specifically, a three-dimensional irradiation model is established for each pixel point in a target texture image, and the reflection brightness information I is obtainedsMarking the three-dimensional point cloud information (x, y, z), the normal information (nx, ny, nz) and the target light source in the three-dimensional irradiation model, and calculating the distance d from the three-dimensional point (x, y, z) corresponding to each pixel point in each image to the optical center, wherein the three-dimensional point cloud information (x, y, z), the normal information (nx, ny, nz) and the target light source are marked in the three-dimensional irradiation model, and the
Figure BDA0002303868080000111
Then, according to the normal information of each image, the direction of the light corresponding to each pixel point of each image can be obtained
Figure BDA0002303868080000112
With corresponding three-dimensional point normal
Figure BDA0002303868080000113
Cosine of angle of direction
Figure BDA0002303868080000114
Then, according to the calibrated target light source brightness Io and formula (1), the brightness If at the surface of each pixel point can be calculated; since the reflection brightness Is of each pixel point is known, the reflection coefficient of each pixel point can then be calculated according to formula (2). The reflection coefficient λ of a pixel point is related to the material of the object, and λ values of the same material are close to each other.
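Back-projection (S202) pairs each 3-D model point with the pixel that observed it. A pinhole-camera sketch, assuming the points are already in the camera frame and K is the 3x3 intrinsic matrix; both are assumptions, since the patent does not spell out the camera model:

```python
import numpy as np

def backproject(points, normals, image, K):
    """Project model points into one texture image to read off Is per point.

    points:  (N, 3) model points in the camera frame (assumed)
    normals: (N, 3) normals of those points
    image:   (H, W) V-channel texture image
    K:       (3, 3) camera intrinsic matrix (assumed pinhole model)
    Returns (Is, points, normals) restricted to points that land in view.
    """
    uvw = points @ K.T                                   # pinhole projection
    u = np.round(uvw[:, 0] / uvw[:, 2]).astype(int)
    v = np.round(uvw[:, 1] / uvw[:, 2]).astype(int)
    h, w = image.shape[:2]
    ok = (points[:, 2] > 0) & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    return image[v[ok], u[ok]], points[ok], normals[ok]
```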
And then step 3 is executed, the brightness of the object to be detected in the target texture image is corrected according to the reflection coefficient of the object to be detected, so that the brightness of the same material area of the object to be detected in all the target texture images is kept consistent, and the method specifically comprises the following steps:
s301, acquiring L corresponding to the target texture imageconstantValue as L per pixelconstantValue, where the user can select L based on the brightness of the shot at that timeconstantValue, Lconstant∈[0,255]And subjecting it toL as per pixel point input into softwareconstantThe value is obtained.
S302, calculating a corrected brightness corresponding to the target texture image by using a third preset formula, and generating a target corrected image corresponding to the target texture image, where the third preset formula is:
Ic=λ*Lconstant,
wherein Ic represents the corrected brightness of each pixel point in the target texture image, and λ represents the reflection coefficient corresponding to each pixel point in the target texture image.
The purpose of the correction is to make the brightness of a same-material area unaffected by distance, angle and light source, and related only to the reflection coefficient λ, so that the brightness of same-material areas can be guaranteed to be consistent both within one texture image and across different texture images.
The color correction method of the preferred embodiment further comprises a first image mode conversion step and a second image mode conversion step,
the first image mode conversion step specifically comprises: converting the test texture image into the HSV color space, acquiring the reflection brightness of the V channel of each pixel point in the converted HSV color space, and generating the target light source brightness;
the second image mode conversion step specifically comprises: converting the target texture image into the HSV color space, calculating the corrected brightness and the target corrected image for the V channel in the HSV color space, and converting the target corrected image back into the RGB space, which further ensures that the corrected colors are not distorted. A sketch of both steps follows.
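Both conversion steps map directly onto standard color-space conversions. A sketch using OpenCV, assuming 8-bit BGR input and the per-pixel λ map from the earlier sketches:

```python
import cv2
import numpy as np

def correct_texture_image(bgr, lam, l_constant=200):
    """Correct only the V (brightness) channel in HSV, then convert back.

    bgr: (H, W, 3) uint8 target texture image
    lam: (H, W) per-pixel reflection coefficient map
    """
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    hsv[..., 2] = np.clip(lam * float(l_constant), 0, 255).astype(np.uint8)  # Ic
    return cv2.cvtColor(hsv, cv2.COLOR_HSV2BGR)
```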
The color correction method of another preferred embodiment further includes a verification step, specifically: after the brightness of the target light source is calibrated using the reference plane, the reference plane is irradiated with the target light source again, the brightness of the reference plane in the resulting texture image is corrected according to the color correction method, and whether the brightness of the target light source needs to be calibrated again is judged from the correction result, further ensuring the color correction effect for texture images.
The above process is explained below by a specific embodiment.
In the experiments, a self-developed handheld three-dimensional scanner is used to collect data. The scanner uses ring light, so the collected original texture images are dark at the periphery and bright in the middle. In this embodiment, to simplify the influence of λ during calibration and to judge the light source calibration effect, a whiteboard or white wall is used as the reference plane, whose reflection coefficient λ may be assumed to be 1.
Firstly, the hardware device is aimed at a white wall or whiteboard to collect multiple groups of images at different distances and different viewing angles: appropriate camera parameters are adjusted so that images collected 400 mm to 1000 mm from the white wall or whiteboard are neither too dark nor too bright; the handheld scanner is pointed straight at the white wall or whiteboard, and about 50 images are collected while the device is moved continuously from near to far (distance 400 mm to 1000 mm); then, keeping the scanner at a distance between 400 mm and 1000 mm, the device is rotated continuously through angles of 0 to 70 degrees while about 250 images are taken. As shown in fig. 3, images collected at close range are brighter overall than those collected farther away, images collected at different angles show a certain brightness offset, and, because of the ring light, the images are dark at the periphery and bright in the middle.
The handheld three-dimensional scanner can simultaneously output the corresponding three-dimensional point cloud and normal information. Each image is converted into the HSV color space, and the brightness values of the V channel are calibrated by the method of step 1 to obtain the target light source brightness Io, as shown in fig. 4. To verify the accuracy of the target light source brightness Io, the images are corrected according to the color correction method of steps 2 and 3 with the brightness correction reference value Lconstant = 200; as shown in figs. 5 and 6, the corrected image is very uniform.
Then the target light source with calibrated brightness irradiates the object to be measured, in this embodiment a bed head. In actual operation, an operator collects texture images and three-dimensional data of the bed head from different viewing angles. As shown in figs. 7 and 8, the characteristics of the ring light cause the images to be dark at the periphery and bright in the middle; in the close-up of fig. 8 in particular, the brightness is visibly very uneven.
Then, using the three-dimensional bed head model shown in fig. 9 together with the camera parameters, the model is back-projected onto each texture image to obtain the three-dimensional point cloud and normal information corresponding to each image, and further the distance from the three-dimensional point corresponding to each pixel of each image to the optical center and the included angle between the corresponding light ray and the normal. Combined with the calibrated light source brightness Io, color correction is performed on the V-channel image to obtain Ic, where Lconstant = 125 is obtained by table lookup. Figs. 10 and 11 show the corrected texture images; compared with fig. 8, the corrected image resolves the bright-middle, dark-periphery problem, and the brightness of the whole image is more uniform and natural. Compared with fig. 7, images from different angles show no brightness deviation, and the brightness of same-material areas is more consistent. Compared with the model directly mapped with the original images in fig. 12, texture mapping with the corrected texture images obtained by the proposed color correction method gives much more natural transitions, as shown in fig. 13.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
An embodiment of the present invention further provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a processor, the method for color correction of a texture image is implemented.
Fig. 14 is a schematic structural diagram of a color correction device for texture images according to embodiment 2 of the present invention, as shown in fig. 14, including a calibration module 100, a calculation module 200 and a correction module 300,
the calibration module 100 is configured to irradiate a reference plane with a target light source, collect a test texture image of the reference plane, and calibrate the brightness of the target light source according to the test texture image;
the calculation module 200 is configured to irradiate an object to be measured with a target light source with a calibrated brightness, collect a target texture image of the object to be measured, and generate a reflection coefficient of the object to be measured;
the correction module 300 is configured to correct the brightness of the object to be measured in the target texture image according to the reflection coefficient of the object to be measured, so that the brightness of the same material area of the object to be measured in all the target texture images is kept consistent.
In a preferred embodiment, the calibration module 100 specifically includes:
the first acquisition unit 101 is used for irradiating a reference plane by using a target light source, and acquiring a plurality of test texture images of the reference plane at different shooting distances and at the same shooting angle and/or at the same shooting distance and at different shooting angles and three-dimensional point cloud information corresponding to each test texture image by using a three-dimensional scanning device;
the first model establishing unit 102 is configured to establish a three-dimensional illumination model for each pixel point of the reference plane, label the reflection brightness, the three-dimensional point cloud information, and the normal information of each pixel point in the three-dimensional illumination model, and calculate, according to a labeling result, a distance from the three-dimensional point corresponding to each pixel point to the optical center of the target light source and an included angle between the light direction corresponding to each pixel point and the normal direction;
the first calculating unit 103 is configured to calculate light source brightness corresponding to each pixel point in each test texture image by using a first preset formula, where the first preset formula is:
Iij = I's / (λ' * F(θ', d0²)),
wherein Iij represents the light source brightness corresponding to pixel point (i, j) in the test texture image, I's represents the reflection brightness of pixel point (i, j) in the test texture image, λ' is the preset reflection coefficient corresponding to the reference plane, d0 is the distance from the three-dimensional point corresponding to the pixel point in the test texture image to the optical center of the target light source, and θ' is the included angle between the light direction corresponding to the pixel point in the test texture image and the normal direction;
an optimization unit 104, configured to perform filtering or convex optimization processing on the light source brightness corresponding to the same pixel point in the multiple test texture images to generate a smoothly distributed target light source brightness Io.
In a preferred embodiment, the computing module 200 specifically includes:
the second acquisition unit 201 is configured to irradiate the object to be detected with the target light source with the calibrated brightness, and acquire a three-dimensional model of the object to be detected, target texture images of the object to be detected at different shooting distances, and target texture images of the object to be detected at the same shooting distance and different shooting angles by using the three-dimensional scanning device;
an information obtaining unit 202, configured to back-project the three-dimensional model of the object to be detected to each target texture image in combination with the parameters of the camera in the three-dimensional scanning device, so as to obtain the reflection brightness information Is, the three-dimensional point cloud information (x, y, z) and the normal information (nx, ny, nz) corresponding to each pixel point in each target texture image;
a second model establishing unit 203, configured to establish a three-dimensional irradiation model for each pixel point in the target texture image, mark the acquired reflection brightness information Is, the three-dimensional point cloud information (x, y, z), the normal information (nx, ny, nz) and the target light source in the three-dimensional irradiation model, and generate, according to the marking result, the distance d from the three-dimensional point (x, y, z) corresponding to the pixel point to the optical center of the target light source and the cosine cos θ of the included angle θ between the light direction r = (x, y, z) corresponding to the pixel point and the normal direction n = (nx, ny, nz), where cos θ = (x·nx + y·ny + z·nz) / (√(x² + y² + z²) · √(nx² + ny² + nz²));
a second calculating unit 204, configured to calculate the reflection coefficient λ corresponding to each pixel point by using a second preset formula combined with the calibrated target light source brightness, where the second preset formula is
Io = Is / (λ * F(θ, d²)),
where F(θ, d²) is the preset illumination attenuation function built from cos θ and d², with k1, k2 and k3 as its preset coefficients,
and where Io represents the target light source brightness, Is represents the reflection brightness of the pixel point in the target texture image, d is the distance from the three-dimensional point corresponding to the pixel point in the target texture image to the optical center of the target light source, and θ is the included angle between the light direction and the normal direction corresponding to the pixel point in the target texture image.
In a preferred embodiment, the correction module 300 specifically includes:
a query unit 301, configured to obtain the Lconstant value corresponding to the target texture image as the Lconstant value of each pixel point;
the correcting unit 302 is configured to calculate a corrected brightness corresponding to the target texture image by using a third preset formula, and generate a target corrected image corresponding to the target texture image, where the third preset formula is:
Ic=λ*Lconstant,
wherein Ic represents the corrected brightness of each pixel point in the target texture image, and λ represents the reflection coefficient corresponding to each pixel point in the target texture image.
In a preferred embodiment, the color correction device for texture images further comprises a first image mode conversion module 400 and a second image mode conversion module 500,
the first image mode conversion module 400 is configured to convert the test texture image into an HSV color space, obtain a reflection brightness of a V channel of each pixel point in the converted HSV color space, and generate a target light source brightness;
the second image mode conversion module 500 is configured to convert the target texture image into an HSV color space, calculate a corrected luminance and a target corrected image corresponding to a V channel in the HSV color space, and convert the target corrected image into an RGB space.
In a preferred embodiment, the color correction apparatus for texture images further includes a verification module 600, and the verification module 600 is configured to calibrate the brightness of the target light source by using the reference plane, irradiate the reference plane by using the target light source again, correct the brightness of the reference plane in the target texture image according to the color correction method, and determine whether to calibrate the brightness of the target light source again according to the correction result.
The embodiment of the invention also provides a color correction terminal of the texture image, which comprises the computer readable storage medium and a processor, wherein the processor realizes the steps of the color correction method of the texture image when executing the computer program on the computer readable storage medium. Fig. 15 is a schematic structural diagram of a color correction terminal for a texture image according to embodiment 3 of the present invention, and as shown in fig. 15, the color correction terminal 8 for a texture image according to this embodiment includes: a processor 80, a readable storage medium 81 and a computer program 82 stored in said readable storage medium 81 and executable on said processor 80. The processor 80, when executing the computer program 82, implements the steps in the various method embodiments described above, such as steps 1 through 3 shown in fig. 1. Alternatively, the processor 80, when executing the computer program 82, implements the functions of the modules in the above-described device embodiments, such as the functions of the modules 100 to 300 shown in fig. 14.
Illustratively, the computer program 82 may be partitioned into one or more modules that are stored in the readable storage medium 81 and executed by the processor 80 to implement the present invention. The one or more modules may be a series of computer program instruction segments capable of performing specific functions for describing the execution of the computer program 82 in the color correction terminal 8 for texture images.
The color correction terminal 8 for the texture image may include, but is not limited to, a processor 80 and a readable storage medium 81. Those skilled in the art will appreciate that fig. 15 is only an example of the color correction terminal 8 of the texture image, and does not constitute a limitation to the color correction terminal 8 of the texture image, and may include more or less components than those shown, or combine some components, or different components, for example, the color correction terminal of the texture image may further include a power management module, an arithmetic processing module, an input-output device, a network access device, a bus, and the like.
The Processor 80 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware component, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The readable storage medium 81 may be an internal storage unit of the color correction terminal 8 for the texture image, such as a hard disk or a memory of the color correction terminal 8 for the texture image. The readable storage medium 81 may also be an external storage device of the color correction terminal 8 for the texture image, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are equipped on the color correction terminal 8 for the texture image. Further, the readable storage medium 81 may also include both an internal storage unit of the color correction terminal 8 of the texture image and an external storage device. The readable storage medium 81 is used to store the computer program and other programs and data required for the color correction terminal of the texture image. The readable storage medium 81 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and method steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.

Claims (10)

1. A color correction method for texture images, comprising the steps of:
step 1, irradiating a reference plane by using a target light source, collecting a test texture image of the reference plane, and calibrating the brightness of the target light source according to the test texture image;
step 2, irradiating an object to be detected by adopting a target light source with calibrated brightness, collecting a target texture image of the object to be detected, and generating a reflection coefficient of the object to be detected;
and step 3, correcting the brightness of the object to be detected in the target texture image according to the reflection coefficient of the object to be detected, so that the brightness of the same material area of the object to be detected is kept consistent in all the target texture images.
2. The color correction method of the texture image according to claim 1, wherein a target light source is adopted to illuminate a reference plane and a test texture image of the reference plane is acquired, and the brightness of the target light source is calibrated according to the test texture image, specifically comprising the following steps:
s101, irradiating a reference plane by using a target light source, and acquiring a plurality of test texture images of the reference plane at different shooting distances and the same shooting angle and/or at the same shooting distance and different shooting angles and three-dimensional point cloud information corresponding to each test texture image by using three-dimensional scanning equipment;
s102, establishing a three-dimensional irradiation model for each pixel point of a reference plane, marking the reflection brightness, three-dimensional point cloud information and normal line information of each pixel point in the three-dimensional irradiation model, and calculating the distance from the three-dimensional point corresponding to each pixel point to the optical center of a target light source and the included angle between the light direction corresponding to each pixel point and the normal line direction according to a marking result;
s103, calculating the light source brightness corresponding to each pixel point in each test texture image by adopting a first preset formula, wherein the first preset formula is as follows:
Iij = I's / (λ' * F(θ', d0²)),
wherein Iij represents the light source brightness corresponding to pixel point (i, j) in the test texture image, I's represents the reflection brightness of pixel point (i, j) in the test texture image, λ' is the preset reflection coefficient corresponding to the reference plane, d0 is the distance from the three-dimensional point corresponding to the pixel point in the test texture image to the optical center of the target light source, and θ' is the included angle between the light direction corresponding to the pixel point in the test texture image and the normal direction;
S104, performing filtering or convex optimization processing on the light source brightness corresponding to the same pixel point in the multiple test texture images to generate a smoothly distributed target light source brightness Io.
3. The color correction method of the texture image according to claim 1 or 2, wherein a target light source with a calibrated brightness is adopted to irradiate an object to be measured, and a target texture image of the object to be measured is collected to generate a reflection coefficient of the object to be measured, and the method specifically comprises the following steps:
s201, irradiating an object to be detected by adopting a target light source with calibrated brightness, and acquiring a three-dimensional model of the object to be detected, target texture images of the object to be detected at different shooting distances and target texture images at the same shooting distance and different shooting angles by utilizing the three-dimensional scanning equipment;
s202, back projecting the three-dimensional model of the object to be detected into each target texture image by combining the parameters of a camera in the three-dimensional scanning equipment to obtain the reflection brightness information I corresponding to each pixel point in each target texture imagesThree-dimensional point cloud information (x, y, z) and normal line information (nx, ny, nz);
s203, establishing a three-dimensional irradiation model for each pixel point in the target texture image, and acquiring the reflection brightness information IsMarking the three-dimensional point cloud information (x, y, z), the normal information (nx, ny, nz) and the target light source in the three-dimensional irradiation model, and generating the distance d from the three-dimensional point (x, y, z) corresponding to the pixel point to the optical center of the target light source and the direction of the light ray corresponding to the pixel point according to the marking result
Figure FDA0002303868070000021
And normal direction
Figure FDA0002303868070000022
The cosine of angle theta, wherein
Figure FDA0002303868070000023
S204, calculating the reflection coefficient lambda corresponding to each pixel point by utilizing a second preset formula and combining the calibrated target light source brightness, wherein the second preset formula is
I_o = I_s / (λ·F(θ, d²)),

F(θ, d²) = cos θ / (k_1 + k_2·d + k_3·d²),

where k_1, k_2 and k_3 are preset coefficients, I_o represents the calibrated target light source brightness, I_s represents the reflection brightness of the pixel point in the target texture image, d is the distance from the three-dimensional point corresponding to the pixel point in the target texture image to the optical center of the target light source, and θ is the included angle between the light direction and the normal direction corresponding to the pixel point in the target texture image.
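Steps S203 and S204 amount to rearranging the second preset formula into λ = I_s / (I_o·F(θ, d²)). A minimal sketch under stated assumptions (hypothetical names; the 3D points are expressed in a frame whose origin is the optical center of the target light source; the defaults k_1 = k_2 = 0, k_3 = 1 reduce F to a plain inverse-square cosine fall-off):

import numpy as np

def attenuation(cos_theta, d, k1=0.0, k2=0.0, k3=1.0):
    # assumed fall-off: F(θ, d²) = cos θ / (k1 + k2·d + k3·d²)
    return cos_theta / (k1 + k2 * d + k3 * d**2)

def estimate_reflectance(I_s, points, normals, I_o):
    # I_s:     (H, W) reflection brightness from the back-projection of S202
    # points:  (H, W, 3) 3D points (x, y, z), light-source optical center at origin
    # normals: (H, W, 3) normals (nx, ny, nz)
    # I_o:     calibrated target light source brightness (scalar or (H, W))
    d = np.linalg.norm(points, axis=-1)              # distance to the optical center
    v = points / np.maximum(d[..., None], 1e-9)      # unit light direction
    n = normals / np.maximum(np.linalg.norm(normals, axis=-1, keepdims=True), 1e-9)
    cos_theta = np.abs(np.sum(v * n, axis=-1))       # cos θ between light and normal
    # rearranged second preset formula: λ = I_s / (I_o · F(θ, d²))
    return I_s / np.maximum(I_o * attenuation(cos_theta, d), 1e-9)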
4. The color correction method of the texture image according to claim 3, wherein the brightness of the object to be detected in the target texture image is corrected according to the reflection coefficient of the object to be detected, specifically:
s301, acquiring L corresponding to the target texture imageconstantValue as L per pixelconstantA value;
s302, calculating a corrected brightness corresponding to the target texture image by using a third preset formula, and generating a target corrected image corresponding to the target texture image, where the third preset formula is:
I_c = λ·L_constant,
where I_c represents the corrected brightness of each pixel point in the target texture image, and λ represents the reflection coefficient corresponding to each pixel point in the target texture image.
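The correction itself is then a per-pixel multiply. A minimal sketch, assuming an 8-bit texture image and an arbitrarily chosen L_constant:

import numpy as np

def correct_brightness(lam, L_constant=180.0):
    # third preset formula: I_c = λ · L_constant, applied per pixel
    return np.clip(lam * L_constant, 0, 255).astype(np.uint8)

Since pixels of the same material ideally share the same λ, scaling by one L_constant gives that material the same corrected brightness in every target texture image, whatever the shooting distance or angle of each shot.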
5. The color correction method for texture images according to any one of claims 1 to 4, further comprising a verification step, specifically: after the brightness of the target light source has been calibrated using the reference plane, irradiating the reference plane with the target light source again, correcting the brightness of the reference plane in the target texture image according to the color correction method, and judging, from the correction result, whether the brightness of the target light source needs to be calibrated again.
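The verification step reads as a closed-loop check: re-shoot the reference plane, correct it with the method above, and test whether it really comes out uniform. A minimal sketch with a hypothetical tolerance:

import numpy as np

def calibration_ok(corrected_plane, tol=0.02):
    # after correction the reference plane should be uniformly bright; a large
    # relative spread suggests the light-source brightness needs recalibrating
    spread = np.std(corrected_plane) / np.mean(corrected_plane)
    return spread <= tol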
6. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the color correction method for texture images according to any one of claims 1 to 5.
7. A color correction terminal for a texture image, comprising the computer-readable storage medium of claim 6 and a processor, wherein the processor implements the steps of the color correction method for the texture image according to any one of claims 1 to 5 when executing the computer program on the computer-readable storage medium.
8. A color correction device for texture images is characterized by comprising a calibration module, a calculation module and a correction module,
the calibration module is used for adopting a target light source to irradiate a reference plane, acquiring a test texture image of the reference plane, and calibrating the brightness of the target light source according to the test texture image;
the calculation module is used for irradiating an object to be detected by adopting a target light source with calibrated brightness, acquiring a target texture image of the object to be detected and generating a reflection coefficient of the object to be detected;
the correction module is used for correcting the brightness of the object to be detected in the target texture images according to the reflection coefficient of the object to be detected so as to keep the brightness of the same material area of the object to be detected in all the target texture images consistent.
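The three-module split of claim 8 maps naturally onto three injected callables. The wiring below is a hypothetical sketch (the sketches after claims 2 to 4 would fit the three slots), not an implementation prescribed by the patent:

class TextureColorCorrector:
    # hypothetical wiring of the calibration, calculation and correction modules;
    # each constructor argument is a callable implementing one module
    def __init__(self, calibrate, compute_reflectance, correct):
        self.calibrate = calibrate
        self.compute_reflectance = compute_reflectance
        self.correct = correct

    def run(self, reference_shots, target_shot):
        I_o = self.calibrate(*reference_shots)             # calibration module
        lam = self.compute_reflectance(*target_shot, I_o)  # calculation module
        return self.correct(lam)                           # correction module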
9. The color correction device for texture images according to claim 8, wherein the calibration module comprises:
the first acquisition unit is used for irradiating a reference plane by adopting a target light source and acquiring a plurality of test texture images of the reference plane at different shooting distances and the same shooting angle and/or at the same shooting distance and different shooting angles and three-dimensional point cloud information corresponding to each test texture image by utilizing three-dimensional scanning equipment;
the first model establishing unit is used for establishing a three-dimensional irradiation model for each pixel point of the reference plane, marking the reflection brightness, three-dimensional point cloud information and normal information of each pixel point in the three-dimensional irradiation model, and calculating, according to the marking result, the distance from the three-dimensional point corresponding to each pixel point to the optical center of the target light source and the included angle between the light direction corresponding to each pixel point and the normal direction;
the first calculating unit is used for calculating the light source brightness corresponding to each pixel point in each test texture image by adopting a first preset formula, wherein the first preset formula is as follows:
I_ij = I'_s·d_0² / (λ'·cos θ'),
where I_ij represents the light source brightness corresponding to the pixel point (i, j) in the test texture image, I'_s represents the reflection brightness of the pixel point (i, j) in the test texture image, λ' is the preset reflection coefficient of the reference plane, d_0 is the distance from the three-dimensional point corresponding to the pixel point in the test texture image to the optical center of the target light source, and θ' is the included angle between the light direction corresponding to the pixel point in the test texture image and the normal direction;
an optimization unit for performing filtering or convex optimization on the light source brightness values corresponding to the same pixel point in the multiple test texture images to generate a smoothly distributed target light source brightness I_o.
10. The color correction device for texture images according to claim 8 or 9, wherein the calculation module comprises:
the second acquisition unit is used for irradiating the object to be detected by adopting the target light source with the calibrated brightness, and acquiring a three-dimensional model of the object to be detected, target texture images of the object to be detected at different shooting distances and target texture images at the same shooting distance and different shooting angles by utilizing the three-dimensional scanning equipment;
an information acquisition unit for back-projecting the three-dimensional model of the object to be detected into each target texture image in combination with the parameters of the camera in the three-dimensional scanning equipment, to obtain the reflection brightness information I_s, the three-dimensional point cloud information (x, y, z) and the normal information (nx, ny, nz) corresponding to each pixel point in each target texture image;
a second model establishing unit for establishing a three-dimensional irradiation model for each pixel point in the target texture image, marking the acquired reflection brightness information I_s, the three-dimensional point cloud information (x, y, z), the normal information (nx, ny, nz) and the target light source in the three-dimensional irradiation model, and generating, according to the marking result, the distance d from the three-dimensional point (x, y, z) corresponding to the pixel point to the optical center of the target light source, as well as the cosine of the included angle θ between the light direction v = (x, y, z) corresponding to the pixel point and the normal direction n = (nx, ny, nz), the optical center of the target light source being taken as the coordinate origin, so that

d = √(x² + y² + z²),

cos θ = (x·nx + y·ny + z·nz) / (√(x² + y² + z²)·√(nx² + ny² + nz²));
A second calculating unit, configured to calculate, by using a second preset formula and combining the calibrated target light source brightness, a reflection coefficient λ corresponding to each pixel point, where the second preset formula is
I_o = I_s / (λ·F(θ, d²)),

F(θ, d²) = cos θ / (k_1 + k_2·d + k_3·d²),

where k_1, k_2 and k_3 are preset coefficients, I_o represents the calibrated target light source brightness, I_s represents the reflection brightness of the pixel point in the target texture image, d is the distance from the three-dimensional point corresponding to the pixel point in the target texture image to the optical center of the target light source, and θ is the included angle between the light direction and the normal direction corresponding to the pixel point in the target texture image.
CN201911232222.5A 2019-12-05 2019-12-05 Color correction method, medium, terminal and device for texture image Active CN111105365B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911232222.5A CN111105365B (en) 2019-12-05 2019-12-05 Color correction method, medium, terminal and device for texture image

Publications (2)

Publication Number Publication Date
CN111105365A (en)
CN111105365B (en) 2023-10-24

Family

ID=70421578

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911232222.5A Active CN111105365B (en) 2019-12-05 2019-12-05 Color correction method, medium, terminal and device for texture image

Country Status (1)

Country Link
CN (1) CN111105365B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102867295A (en) * 2012-08-06 2013-01-09 电子科技大学 Color correction method for color image
CN107228625A (en) * 2017-06-01 2017-10-03 深度创新科技(深圳)有限公司 Three-dimensional rebuilding method, device and equipment
CN109643444A (en) * 2017-06-26 2019-04-16 深圳配天智能技术研究院有限公司 Polishing bearing calibration and device
CN109300091A (en) * 2018-09-12 2019-02-01 首都师范大学 Radiance bearing calibration and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
上海科学技术文献出版社 (Shanghai Scientific and Technological Literature Press) *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112097679A (en) * 2020-09-10 2020-12-18 厦门海铂特生物科技有限公司 Three-dimensional space measuring method based on optical information
CN112097679B (en) * 2020-09-10 2022-04-19 厦门海铂特生物科技有限公司 Three-dimensional space measuring method based on optical information
WO2023108992A1 (en) * 2021-12-13 2023-06-22 小米科技(武汉)有限公司 Image processing method and apparatus, storage medium, electronic device, and program product

Also Published As

Publication number Publication date
CN111105365B (en) 2023-10-24

Similar Documents

Publication Publication Date Title
US10924729B2 (en) Method and device for calibration
CN109754426B (en) Method, system and device for verifying camera calibration parameters
US10726580B2 (en) Method and device for calibration
WO2022100242A1 (en) Image processing method and apparatus, electronic device, and computer-readable storage medium
CN110458932B (en) Image processing method, device, system, storage medium and image scanning apparatus
US10447999B2 (en) Alignment of images of a three-dimensional object
US8310499B2 (en) Balancing luminance disparity in a display by multiple projectors
CN110033510B (en) Method and device for establishing color mapping relation for correcting rendered image color
JP5633058B1 (en) 3D measuring apparatus and 3D measuring method
JP2013127774A (en) Image processing device, image processing method, and program
Menk et al. Visualisation techniques for using spatial augmented reality in the design process of a car
CN113409379B (en) Method, device and equipment for determining spectral reflectivity
CN111105365A (en) Color correction method, medium, terminal and device for texture image
CN113533256A (en) Method, device and equipment for determining spectral reflectivity
CN112734824A (en) Three-dimensional reconstruction method based on generalized luminosity stereo model
CN115187612A (en) Plane area measuring method, device and system based on machine vision
CN108010071B (en) System and method for measuring brightness distribution by using 3D depth measurement
WO2005109312A2 (en) Color characterization using color value clipping
US20140184851A1 (en) Automatic image combining apparatus
CN113808246B (en) Method and device for generating map, computer equipment and computer readable storage medium
CN114581506A (en) Parallel algorithm for accurately calculating height by combining two-dimensional information and three-dimensional information
CN112308933A (en) Method and device for calibrating camera internal reference and computer storage medium
CN109754365B (en) Image processing method and device
US9389122B1 (en) Determining color at an object point from multple images providing conflicting color information
CN115816833B (en) Method and device for determining image correction data, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant