CN109510948B - Exposure adjusting method, exposure adjusting device, computer equipment and storage medium


Info

Publication number: CN109510948B
Application number: CN201811160167.9A
Other versions: CN109510948A (Chinese)
Authority: CN (China)
Inventor: 黄磊杰
Current and original assignee: Shining 3D Technology Co Ltd
Legal status: Active (granted)
Events: application filed by Shining 3D Technology Co Ltd; publication of CN109510948A; application granted; publication of CN109510948B

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/71 Circuitry for evaluating the brightness variation
    • H04N23/74 Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2518 Projection by scanning of the object


Abstract

The application relates to an exposure adjustment method, an exposure adjustment device, a computer device and a storage medium. The method comprises the following steps: acquiring a planar scanning image; calculating three-dimensional information of each pixel point according to the planar scanning image; obtaining exposure compensation parameters according to the planar scanning image, the three-dimensional information of each pixel point, the transformation relation between a camera coordinate system and a projector coordinate system, and a standard light field; and carrying out exposure adjustment according to the exposure compensation parameters. The exposure parameters can be adjusted in real time according to changes in the surface color and material of the object currently being scanned, which improves the signal-to-noise ratio of the acquired pictures and, in turn, the quality of the reconstructed three-dimensional image.

Description

Exposure adjusting method, exposure adjusting device, computer equipment and storage medium
Technical Field
The present application relates to the field of three-dimensional scanning technologies, and in particular, to an exposure adjustment method and apparatus, a computer device, and a storage medium.
Background
Three-dimensional scanning is a technology that integrates optical, mechanical, electronic and computer techniques. It is mainly used to capture and analyse the geometry and appearance of an object or an environment, and to reconstruct the captured data into a three-dimensional digital model of the scanned object.
Three-dimensional scanning schemes based on active vision generally project coding patterns onto the surface of an object, and a camera collects the coding patterns modulated by the object surface from one or more viewing angles. Corresponding points are then found using the coded information in the patterns, from which the three-dimensional data of the object surface is resolved. During this process the color and material of the object surface change continuously; if the exposure parameters of the camera or the projector are set unreasonably, the signal-to-noise ratio of the picture acquired by the camera drops, the coding information cannot be decoded correctly, and the reconstruction fails.
Disclosure of Invention
In view of the above, it is necessary to provide an exposure adjustment method, an apparatus, a computer device and a storage medium capable of accurately calculating the exposure parameters of a projector.
An exposure adjustment method, the method comprising: acquiring a planar scanning image; calculating three-dimensional information of each pixel point according to the plane scanning image; obtaining exposure compensation parameters according to the plane scanning image, the three-dimensional information of each pixel point, the transformation relation between a camera coordinate system and a projector coordinate system and a standard light field; and carrying out exposure adjustment according to the exposure compensation parameters.
In one embodiment, before acquiring the planar scan image, the method further includes: obtaining a calibration image of the projection coding light modulated by the calibration piece; acquiring three-dimensional information of each pixel point of the modulated calibration image and a gray value of each pixel point of the modulated calibration image; and obtaining a standard light field according to the three-dimensional information of each pixel point of the modulated calibration image, the gray value of each pixel point of the modulated calibration image and the transformation relation between the camera coordinate system and the projector coordinate system.
In one embodiment, the calculating three-dimensional information of each pixel point according to the plane scanning image includes: extracting a region-of-interest image in the planar scanning image; and calculating the three-dimensional information of each pixel point of the interested area image according to the interested area image.
In one embodiment, the extracting the region-of-interest image in the planar scanning image includes: taking each pixel in the planar scanning image, together with a preset number of pixels adjacent to it, as the judgment area of the current pixel; calculating the Shannon entropy of each judgment area; comparing the Shannon entropy of each judgment area with a preset Shannon entropy; and if the Shannon entropy of a judgment area is greater than the preset Shannon entropy, taking the pixel corresponding to that judgment area as a pixel of the region-of-interest image.
In one embodiment, the calculating three-dimensional information of each pixel point of the region-of-interest image according to the region-of-interest image includes: calculating pixel point three-dimensional information obtained by measurement according to the interested region image; and obtaining the three-dimensional information of each pixel point of the interested region image by interpolation of the measured three-dimensional information of the pixel points.
In one embodiment, the obtaining of the exposure compensation parameter according to the planar scanning image, the three-dimensional information of each pixel point, the transformation relationship between the camera coordinate system and the projector coordinate system, and the standard light field includes: acquiring the gray value of each pixel point of the image of the region of interest; calculating the corresponding relation between the gray value of each pixel point of the image of the region of interest and each pixel point of the projector according to the gray value of each pixel point of the image of the region of interest, the three-dimensional information of each pixel point of the image of the region of interest and the transformation relation between the camera coordinate system and the projector coordinate system; and comparing the gray value of each pixel point of the image of the region of interest with the gray value of the corresponding pixel point of the standard light field according to the gray value of each pixel point of the image of the region of interest and the corresponding relation between the gray value of each pixel point of the image of the region of interest and each pixel point of the projector to obtain an exposure compensation parameter.
In one embodiment, the comparing the gray value of each pixel point of the region-of-interest image with the gray value of the corresponding pixel point of the standard light field to obtain the exposure compensation parameter includes: if the gray value of a pixel point of the region-of-interest image is smaller than the gray value of the corresponding pixel point of the standard light field, the exposure compensation parameter is to increase the projection brightness of the corresponding pixel point of the projector; and if the gray value of a pixel point of the region-of-interest image is greater than the gray value of the corresponding pixel point of the standard light field, the exposure compensation parameter is to reduce the projection brightness of the corresponding pixel point of the projector.
An exposure adjustment apparatus, the apparatus comprising: the acquisition module is used for acquiring a plane scanning image; the three-dimensional information calculation module is used for calculating the three-dimensional information of each pixel point according to the plane scanning image; the exposure compensation parameter calculation module is used for obtaining exposure compensation parameters according to the plane scanning image, the three-dimensional information of each pixel point, the transformation relation between a camera coordinate system and a projector coordinate system and a standard light field; and the adjusting module is used for carrying out exposure adjustment according to the exposure compensation parameters.
A computer device comprising a memory storing a computer program and a processor which, when executing the computer program, implements the steps of any of the methods described above.
A computer-readable storage medium on which a computer program is stored which, when executed by a processor, carries out the steps of any of the methods described above.
According to the exposure adjustment method, the exposure adjustment device, the computer device and the storage medium, a planar scanning image is first acquired and the three-dimensional information of each pixel point in it is calculated; exposure compensation parameters are then obtained according to the planar scanning image, the three-dimensional information of each pixel point, the transformation relation between the camera coordinate system and the projector coordinate system, and a standard light field; finally, exposure adjustment is carried out according to the calculated exposure compensation parameters. The exposure compensation parameters are obtained by comparing the current scanned image with the light field of the current environment, and the exposure conditions are adjusted accordingly. The exposure parameters can thus be adjusted in real time as the surface color and material of the scanned object change, which improves the signal-to-noise ratio of the acquired pictures and, in turn, the quality of the reconstructed three-dimensional image.
Drawings
FIG. 1 is a flow chart illustrating an exposure adjustment method according to an embodiment;
FIG. 2 is a schematic flow chart of a method for obtaining a standard light field in one embodiment;
FIG. 3 is a flow diagram illustrating a method for extracting an image of a region of interest in one embodiment;
FIG. 4 is a schematic flow chart diagram illustrating a method for obtaining exposure parameters in one embodiment;
FIG. 5 is a block diagram showing the structure of an exposure adjustment apparatus according to an embodiment;
FIG. 6 is a diagram illustrating an internal structure of a computer device according to an embodiment.
Reference numerals: the system comprises an acquisition module 100, a three-dimensional information calculation module 200, an exposure compensation parameter calculation module 300 and an adjustment module 400.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
The three-dimensional scanning technology is an automated high-precision stereo scanning technology, also called real-scene replication: the three-dimensional coordinates of points on an object's surface are measured to obtain a point cloud of the surface, which is then converted into a three-dimensional model that a computer can process directly. When the image of the object to be scanned contains regions whose colors or material reflectivities differ too strongly, a single set of exposure parameters cannot satisfy all scanning areas at the same time, so only part of the acquired image can be reconstructed while the remaining areas cannot. For example, if white areas dominate the image, the system tends to judge the image as overexposed and turns down the camera and projection exposure, so the smaller black areas can never be reconstructed. The prior art usually adjusts the camera exposure; however, a handheld three-dimensional scanner moves relative to the object while scanning, and if the camera exposure time is too long the captured image is smeared by motion blur, distorting the geometry of the scanned object. The embodiments of the invention therefore adjust the exposure of each pixel point by adjusting the projection brightness of the corresponding projector pixel point, and the exposure parameters are adjusted in real time according to the image of the object to be scanned, which further improves the quality of the reconstructed three-dimensional image.
In one embodiment, as shown in fig. 1, there is provided an exposure adjustment method including the steps of:
step S102, acquiring a planar scanning image.
Specifically, before the handheld three-dimensional scanner scans the object to be scanned to obtain a planar scanning image, the coordinate transformation relation between the camera coordinate system and the projector coordinate system and a standard light field are obtained. Here the light field refers to the amount of light passing through each point in each direction, and the amount of light refers to the sum of the light emitted by the light source over a certain period of time. After the coordinate transformation relation and the standard light field have been acquired, the user holds the three-dimensional scanner and acquires planar images of the object to be scanned from different angles. The handheld three-dimensional scanner has a binocular measurement system with two cameras. The binocular measurement system calculates the depth information of the object to be scanned using the binocular vision principle: two identical cameras image the object to be scanned from different positions, the pixel points in the two images are matched, and finally the depth from the object to be scanned to the handheld scanner is calculated by triangulation.
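As a concrete illustration of the triangulation mentioned above, the following is a minimal sketch in Python, assuming a rectified stereo pair with a common focal length (in pixels) and a known baseline; the function name and the numerical values are illustrative assumptions, not values taken from the patent.

    import numpy as np

    def depth_from_disparity(x_left, x_right, focal_px, baseline_mm):
        """Triangulate depth for matched pixel columns of a rectified stereo pair.

        x_left, x_right : matched pixel x-coordinates from the left/right images.
        focal_px        : focal length in pixels (assumed equal for both cameras).
        baseline_mm     : distance between the two camera centres in millimetres.
        """
        disparity = np.asarray(x_left, dtype=float) - np.asarray(x_right, dtype=float)
        disparity[disparity <= 0] = np.nan         # invalid or unmatched points
        return focal_px * baseline_mm / disparity  # depth (Z) in millimetres

    # With f = 1400 px and a 60 mm baseline, a disparity of 200 px gives
    # 1400 * 60 / 200 = 420 mm, i.e. the point lies 420 mm from the scanner.
    print(depth_from_disparity([850.0], [650.0], 1400.0, 60.0))  # [420.]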
And step S104, calculating the three-dimensional information of each pixel point according to the plane scanning image.
Specifically, the three-dimensional information of each pixel point is calculated from the planar scanning image acquired by the handheld three-dimensional scanner. The three-dimensional information is the three-dimensional coordinate of each point, comprising an X-axis coordinate, a Y-axis coordinate and a Z-axis coordinate, where the Z-axis coordinate is the distance from the corresponding pixel point to the handheld scanner. More specifically, in order to reduce the amount of data and increase the operation speed, the region-of-interest image in the planar scanning image is extracted first: the region of interest is the part of the image showing the object to be scanned, and the rest is background, so the background in the planar scanning image is eliminated first. The three-dimensional information of each pixel point of the region-of-interest image is then calculated from the region-of-interest image; ignoring the three-dimensional information of the background area greatly reduces the amount of data. More specifically, to calculate the three-dimensional information of each pixel point of the region-of-interest image, the three-dimensional information of the measured pixel points is first computed from the region-of-interest image, and the three-dimensional information of the remaining pixel points is obtained by interpolating the measured values. Because some data is lost under improper exposure, the measured three-dimensional information does not cover every pixel of the region-of-interest image, so the missing three-dimensional information is interpolated from the existing data. Preferably, the three-dimensional information of each pixel point of the region-of-interest image can be obtained by fitting; common fitting schemes include cubic surface fitting and gridFit interpolation.
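The interpolation of missing three-dimensional information inside the region of interest could, for example, look like the following sketch, which uses SciPy's griddata with cubic interpolation as a stand-in for the cubic-surface fitting or gridFit interpolation mentioned above; the function name and the choice of library are assumptions for illustration.

    import numpy as np
    from scipy.interpolate import griddata

    def fill_roi_depth(depth, roi_mask):
        """Fill missing depth values inside the region of interest by interpolation.

        depth    : HxW float array with np.nan where the measurement failed.
        roi_mask : HxW boolean array, True inside the region of interest.
        """
        measured = roi_mask & ~np.isnan(depth)
        missing = roi_mask & np.isnan(depth)
        filled = depth.copy()
        if measured.any() and missing.any():
            ys, xs = np.nonzero(measured)
            qys, qxs = np.nonzero(missing)
            filled[qys, qxs] = griddata(
                points=np.column_stack([xs, ys]),   # known pixel positions
                values=depth[ys, xs],               # their measured depths
                xi=np.column_stack([qxs, qys]),     # pixels that need a value
                method="cubic",                     # stands in for cubic-surface fitting
            )
        return filled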
And step S106, obtaining exposure compensation parameters according to the plane scanning image, the three-dimensional information of each pixel point, the transformation relation between the camera coordinate system and the projector coordinate system and the standard light field.
Specifically, the gray value of each pixel point of the planar scanning image is obtained; the correspondence between the gray value of each pixel point of the planar scanning image and each pixel point of the projector is calculated from the obtained gray values, the three-dimensional information of each pixel point of the planar scanning image, and the transformation relation between the camera coordinate system and the projector coordinate system; then, using this correspondence, the gray value of each pixel point of the planar scanning image is compared with the gray value of the corresponding pixel point of the standard light field to obtain the exposure compensation parameter of each pixel point of the planar scanning image. Finally, exposure adjustment is carried out according to these exposure compensation parameters. To reduce the amount of data, the background in the planar scanning image is eliminated in the steps above and the region-of-interest image is obtained. Acquiring the exposure compensation parameters of each pixel point of the region-of-interest image from that image specifically includes: acquiring the gray value of each pixel point of the region-of-interest image; calculating the correspondence between the gray value of each pixel point of the region-of-interest image and each pixel point of the projector according to those gray values, the three-dimensional information of each pixel point of the region-of-interest image, and the transformation relation between the camera coordinate system and the projector coordinate system; and comparing the gray value of each pixel point of the region-of-interest image with the gray value of the corresponding pixel point of the standard light field, using this correspondence, to obtain the exposure compensation parameter of each pixel point of the region-of-interest image.
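A minimal sketch of the camera-to-projector correspondence used above is given below; it assumes pinhole models for both devices, a known rigid transform (R, t) from the camera coordinate system to the projector coordinate system, and a projector intrinsic matrix K_proj, all of which are illustrative names rather than quantities defined in the patent.

    import numpy as np

    def camera_pixel_to_projector_pixel(point_cam, R, t, K_proj):
        """Project a 3D point (camera coordinates) into the projector image plane.

        point_cam : (3,) 3D point reconstructed for a camera pixel in step S104.
        R, t      : 3x3 rotation and (3,) translation from the camera coordinate
                    system to the projector coordinate system.
        K_proj    : 3x3 projector intrinsic matrix (the projector is treated as
                    an inverse camera).
        Returns the projector pixel (u, v) whose projected gray value corresponds
        to the camera pixel that observed point_cam.
        """
        point_proj = R @ np.asarray(point_cam) + np.asarray(t)  # change of frame
        u, v, w = K_proj @ point_proj                            # pinhole projection
        return u / w, v / w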
And step S108, carrying out exposure adjustment according to the exposure compensation parameters.
Specifically, the projection brightness of the corresponding projector pixel point is adjusted according to the exposure compensation parameter of each pixel point of the planar scanning image, thereby accurately adjusting the projection exposure. In the steps above, to reduce the amount of data, the background in the planar scanning image is eliminated and the exposure compensation parameters of each pixel point of the region-of-interest image are obtained; the projection brightness of the corresponding projector pixel points is then adjusted according to these parameters, so as to accurately adjust the projection exposure.
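The per-pixel adjustment could be organised as in the following sketch, which compares the measured gray values with the standard light field and nudges the corresponding projector pixels up or down; the fixed step size and the dictionary-based data structures are assumptions, since the patent only specifies the direction of the brightness change.

    def apply_exposure_compensation(pattern, roi_gray, std_gray, cam_to_proj, step=0.1):
        """Raise or lower projector pixel brightness toward the standard light field.

        pattern     : current projector image as a float array in [0, 1].
        roi_gray    : {camera_pixel: gray value measured in the region-of-interest image}.
        std_gray    : {projector_pixel (u, v): gray value of the standard light field}.
        cam_to_proj : {camera_pixel: projector_pixel} correspondence from step S106.
        step        : fixed brightness increment (illustrative magnitude).
        """
        adjusted = pattern.copy()
        for cam_px, gray in roi_gray.items():
            proj_px = cam_to_proj.get(cam_px)
            if proj_px is None or proj_px not in std_gray:
                continue
            u, v = proj_px
            if gray < std_gray[proj_px]:      # too dark: brighten this projector pixel
                adjusted[v, u] = min(1.0, adjusted[v, u] + step)
            elif gray > std_gray[proj_px]:    # too bright: dim this projector pixel
                adjusted[v, u] = max(0.0, adjusted[v, u] - step)
        return adjusted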
In the exposure adjustment method, a planar scanning image is first acquired and the three-dimensional information of each pixel point in it is calculated; exposure compensation parameters are obtained according to the planar scanning image, the three-dimensional information of each pixel point, the transformation relation between the camera coordinate system and the projector coordinate system, and the standard light field; finally, exposure adjustment is carried out according to the calculated exposure compensation parameters. The exposure compensation parameters are obtained by comparing the current scanned image with the light field of the current environment, and the exposure conditions are adjusted accordingly. The exposure of each pixel of each projected frame can thus be adjusted accurately, keeping the gray level of every area of the picture acquired by the camera within a reasonable range, ensuring that single-frame data of adequate quality can be reconstructed in complex environments, and allowing the scan to proceed completely and smoothly.
In one embodiment, as shown in fig. 2, there is provided a method of acquiring a standard light field, comprising the steps of:
step S202, a calibration image of the projection coded light modulated by the calibration piece is obtained.
Specifically, the calibration piece is a flat plate made of a reference material and having a reference color. The reference material may be any material, as long as it can be made into a flat plate, and the reference color may likewise be any color. Coded light is projected onto the calibration piece by the projector, and the image formed by the coded light is modulated by the flat plate of reference material and reference color, giving the modulated calibration image.
Step S204, three-dimensional information of each pixel point of the modulated calibration image and the gray value of each pixel point of the modulated calibration image are obtained.
Specifically, the handheld three-dimensional scanner has a binocular measurement system with two cameras. The binocular measurement system calculates the three-dimensional information of the object to be scanned using the binocular vision principle; the three-dimensional information is obtained from the image data acquired by the left camera and the binocular system. The gray value of each pixel point is then obtained from the modulated calibration image.
And step S206, obtaining a standard light field according to the three-dimensional information of each pixel point of the modulated calibration image, the gray value of each pixel point of the modulated calibration image and the transformation relation between the camera coordinate system and the projector coordinate system.
Specifically, the calibration piece is placed at several depth planes of the handheld three-dimensional scanner in turn, the corresponding modulated calibration images are acquired at the different depths, and the three-dimensional information and the gray value of each pixel point at each depth are calculated. From the three-dimensional information and gray value of each pixel point at each depth, the direction of each pixel ray in space and the law governing how its energy changes in space under the current environment are obtained. Finally, the standard light field is obtained from the three-dimensional information of each pixel point at each depth, the gray value of each pixel point at each depth, the direction of each pixel ray in space, the law of energy change in space, and the transformation relation between the camera coordinate system and the projector coordinate system.
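One way to organise the standard light field as a per-projector-pixel lookup over depth is sketched below; the class name, the linear interpolation over depth, and the data layout are assumptions made for illustration and are not prescribed by the patent.

    import numpy as np

    class StandardLightField:
        """Expected gray value of each projector pixel as a function of depth,
        built from calibration images of the reference plate at several depths."""

        def __init__(self):
            self._samples = {}   # projector pixel (u, v) -> list of (depth, gray)

        def add_calibration_plane(self, proj_pixels, depths, grays):
            """Record one calibration plane: matching projector pixels, depths
            (Z coordinates) and measured gray values."""
            for px, z, g in zip(proj_pixels, depths, grays):
                self._samples.setdefault(px, []).append((float(z), float(g)))

        def expected_gray(self, proj_pixel, depth):
            """Interpolate the expected gray value of a projector pixel at a depth."""
            pts = sorted(self._samples.get(proj_pixel, []))
            if not pts:
                return None
            zs, gs = zip(*pts)
            return float(np.interp(depth, zs, gs))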
In this method for acquiring the standard light field, calibration images modulated by the calibration piece are acquired at different depths, the three-dimensional information and the gray value of each pixel point of the modulated calibration images are obtained, and the standard light field is finally derived from them. With this standard light field, the exposure compensation parameter of each pixel point can be determined accurately, achieving a better compensation effect and further improving the quality of the reconstructed three-dimensional image.
In one embodiment, as shown in fig. 3, there is provided a method of extracting an image of a region of interest, comprising the steps of:
step S302, using a preset number of pixels adjacent to each pixel in the planar scanning image as a judgment area of the current pixel.
Specifically, all pixel points in the planar scanning image are selected, and each pixel point and a pixel in the neighborhood of the peripheral preset range of the pixel point are used as a judgment area. Wherein the preset range is modified according to the actual situation. More specifically, each pixel point of the planar scanning image is adjacent to a preset number of pixels around the planar scanning image as an interested area judgment area of the current pixel.
And step S304, calculating the fragrance entropy of all the judgment areas.
In particular, the aroma entropy, which may also be referred to as information entropy, is the amount by which information is quantified. And calculating the fragrance concentration entropy of the judgment area corresponding to each pixel point in the planar scanning image.
Step S306, comparing the fragrance concentration entropies of all the judgment areas with preset fragrance concentration entropies.
Specifically, according to the calculated fragrance concentration entropy of the judgment region corresponding to each pixel point, the calculated fragrance concentration entropy is compared with the preset fragrance concentration entropy. Therefore, whether each pixel point in the planar scanning image belongs to the region of interest is judged.
Step S308, if the aroma entropy of the judgment area is larger than the preset aroma entropy, the pixel corresponding to the corresponding judgment area is the image pixel of the region of interest.
Specifically, if the aroma entropy of the judgment region is less than or equal to the preset aroma entropy, the pixel corresponding to the corresponding judgment region is the pixel of the background region. The fragrance concentration entropy of the judgment area corresponding to each pixel point in the planar scanning image is compared with the preset fragrance concentration entropy, so that the background area is removed, and the image of the region of interest is obtained.
When scanning with a handheld three-dimensional scanner, background areas may appear during the scanning process. The background therefore needs to be identified during scanning so that it does not take part in the calculation. In this method for extracting the region-of-interest image, a judgment area is set for each pixel point, its Shannon entropy is calculated and compared with the preset Shannon entropy, and the background is thereby removed. Shannon entropy measures the information content of a region; because of the coding, an area onto which coded light is projected has a higher Shannon entropy in the image, so a pixel whose judgment area has a Shannon entropy greater than the preset value is taken as a region-of-interest pixel. For example, with pixel point No. 5 as the center, pixel points No. 1, 2, 3, 4, 6, 7, 8 and 9 are selected together with it as the judgment area of pixel point No. 5; the Shannon entropy of this judgment area is calculated and compared with the preset Shannon entropy. When the Shannon entropy of the judgment area is greater than the preset Shannon entropy, pixel No. 5 is taken as a region-of-interest image pixel; when it is less than or equal to the preset Shannon entropy, pixel No. 5 is taken as a background image pixel. By calculating the Shannon entropy, the region of interest can be determined more accurately and the efficiency of the exposure adjustment is improved.
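A minimal sketch of this neighborhood-entropy test is given below, assuming an 8-bit grayscale image and a 3×3 judgment area as in the example of pixel points No. 1-9; the entropy threshold is illustrative, since the patent does not specify the preset Shannon entropy.

    import numpy as np

    def roi_mask_by_entropy(gray, window=3, threshold=2.0):
        """Mark pixels whose judgment-area Shannon entropy exceeds a preset value.

        gray      : HxW uint8 grayscale image.
        window    : side length of the square judgment area (3 gives the 3x3
                    neighborhood of pixels No. 1-9 in the example above).
        threshold : preset Shannon entropy in bits (illustrative; a 3x3 patch
                    has at most log2(9), about 3.17 bits).
        """
        h, w = gray.shape
        r = window // 2
        mask = np.zeros((h, w), dtype=bool)
        for y in range(r, h - r):
            for x in range(r, w - r):
                patch = gray[y - r:y + r + 1, x - r:x + r + 1]
                counts = np.bincount(patch.ravel(), minlength=256)
                p = counts[counts > 0] / patch.size
                entropy = -np.sum(p * np.log2(p))   # Shannon entropy of the patch
                mask[y, x] = entropy > threshold
        return mask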
In one embodiment, as shown in fig. 4, there is provided a method of acquiring exposure parameters, comprising the steps of:
step S402, obtaining the gray value of each pixel point of the image of the region of interest.
Specifically, the region-of-interest image is mapped to the camera plane and the projector plane respectively, and the gray values of the pixel points of the region-of-interest image collected by the camera are obtained.
Step S404, calculating the corresponding relation between the gray value of each pixel point of the interested area image and each pixel point of the projector according to the gray value of each pixel point of the interested area image, the three-dimensional information of each pixel point of the interested area image and the transformation relation between the camera coordinate system and the projector coordinate system.
Specifically, each pixel point of the region-of-interest image is transformed between the camera coordinate system and the projector coordinate system to obtain the correspondence between the gray value of each pixel point of the region-of-interest image and each pixel point of the projector.
Step S406, comparing the gray value of each pixel point of the image in the region of interest with the gray value of the corresponding pixel point of the standard light field according to the gray value of each pixel point of the image in the region of interest and the corresponding relationship between the gray value of each pixel point of the image in the region of interest and each pixel point of the projector to obtain an exposure compensation parameter.
Specifically, if the gray value of a pixel point of the region-of-interest image is smaller than the gray value of the corresponding pixel point of the standard light field, the exposure compensation parameter is to increase the projection brightness of the corresponding projector pixel point; if the gray value of a pixel point of the region-of-interest image is greater than the gray value of the corresponding pixel point of the standard light field, the exposure compensation parameter is to reduce the projection brightness of the corresponding projector pixel point.
According to the method for acquiring the exposure parameters, the gray values of all the pixel points of the image in the region of interest are compared with the gray values of the corresponding pixel points of the standard light field, the projection brightness of the corresponding pixel points of the projector is adjusted, the exposure conditions of all the pixel points are further adjusted, and the quality of the reconstructed three-dimensional image is improved.
It should be understood that although the various steps in the flow charts of fig. 1-4 are shown in sequence, as indicated by the arrows, they are not necessarily performed in that sequence. Unless explicitly stated otherwise herein, the order of these steps is not strictly limited, and they may be performed in other orders. Moreover, at least some of the steps in fig. 1-4 may include multiple sub-steps or stages that are not necessarily performed at the same time and may be performed at different times; the order of performing these sub-steps or stages is not necessarily sequential, and they may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
In one embodiment, as shown in fig. 5, there is provided an exposure adjustment apparatus including: the system comprises an acquisition module 100, a three-dimensional information calculation module 200, an exposure compensation parameter calculation module 300 and an adjustment module 400, wherein:
the acquiring module 100 is used for acquiring a planar scanning image.
And a three-dimensional information calculation module 200, configured to calculate three-dimensional information of each pixel point according to the plane scan image.
And an exposure compensation parameter calculation module 300, configured to obtain an exposure compensation parameter according to the planar scanning image, the three-dimensional information of each pixel, the transformation relationship between the camera coordinate system and the projector coordinate system, and the standard light field.
And an adjusting module 400, configured to perform exposure adjustment according to the exposure compensation parameter.
An exposure adjustment apparatus further includes: and the standard light field calculation module is used for acquiring a standard light field.
The standard light field calculation module comprises: the device comprises a calibration image acquisition unit, a first gray value calculation unit and a standard light field calculation unit.
And the calibration image acquisition unit is used for acquiring a calibration image of the projection coding light modulated by the calibration piece.
And the first gray value calculating unit is used for acquiring the three-dimensional information of each pixel point of the modulated calibration image and the gray value of each pixel point of the modulated calibration image.
And the standard light field calculation unit is used for obtaining a standard light field according to the three-dimensional information of each pixel point of the modulated calibration image, the gray value of each pixel point of the modulated calibration image and the transformation relation between the camera coordinate system and the projector coordinate system.
The three-dimensional information calculation module further includes: a region-of-interest image extracting unit and a three-dimensional information calculating unit.
And the interested region image extraction unit is used for extracting the interested region image in the planar scanning image.
And the three-dimensional information calculation unit is used for calculating the three-dimensional information of each pixel point of the interested area image according to the interested area image.
The region-of-interest image extraction unit further includes: a judgment area acquisition subunit, a Shannon entropy calculation subunit, a Shannon entropy comparison subunit and a region-of-interest image pixel extraction subunit.
The judgment area acquisition subunit is used for taking each pixel in the planar scanning image, together with a preset number of adjacent pixels, as the judgment area of the current pixel.
The Shannon entropy calculation subunit is used for calculating the Shannon entropy of all judgment areas.
The Shannon entropy comparison subunit is used for comparing the Shannon entropy of all judgment areas with the preset Shannon entropy.
The region-of-interest image pixel extraction subunit is used for judging whether the Shannon entropy of a judgment area is greater than the preset Shannon entropy, and if so, determining that the pixel corresponding to that judgment area is a region-of-interest image pixel.
The three-dimensional information calculation unit further includes: a three-dimensional information calculation subunit and an interpolation subunit.
And the three-dimensional information calculating subunit is used for calculating the measured pixel point three-dimensional information according to the interested region image.
And the interpolation subunit is used for obtaining the three-dimensional information of each pixel point of the image of the region of interest by performing interpolation on the measured three-dimensional information of the pixel points.
The exposure compensation parameter calculation module further includes: a second gray value calculating unit, a corresponding relation calculating unit and an exposure compensation parameter calculating unit.
And the second gray value calculating unit is used for acquiring the gray value of each pixel point of the image of the region of interest.
And the corresponding relation calculating unit is used for calculating the corresponding relation between the gray value of each pixel point of the interested area image and each pixel point of the projector according to the gray value of each pixel point of the interested area image, the three-dimensional information of each pixel point of the interested area image and the transformation relation between the camera coordinate system and the projector coordinate system.
And the exposure compensation parameter calculation unit is used for comparing the gray value of each pixel point of the image of the region of interest with the gray value of the corresponding pixel point of the standard light field according to the gray value of each pixel point of the image of the region of interest and the corresponding relation between the gray value of each pixel point of the image of the region of interest and each pixel point of the projector to obtain the exposure compensation parameter.
The exposure compensation parameter calculation unit is further configured to set the exposure compensation parameter to increase the projection brightness of the corresponding projector pixel point if the gray value of a pixel point of the region-of-interest image is smaller than the gray value of the corresponding pixel point of the standard light field, and to set the exposure compensation parameter to reduce the projection brightness of the corresponding projector pixel point if the gray value of a pixel point of the region-of-interest image is greater than the gray value of the corresponding pixel point of the standard light field.
For the specific definition of the exposure adjustment device, reference may be made to the above definition of the exposure adjustment method, which is not described herein again. The respective modules in the exposure adjusting apparatus described above may be implemented in whole or in part by software, hardware, and a combination thereof. The modules can be embedded in a hardware form or independent from a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
In one embodiment, a computer device is provided, which may be a terminal, and its internal structure diagram may be as shown in fig. 6. The computer device includes a processor, a memory, a network interface, a display screen, and an input device connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement an exposure adjustment method. The display screen of the computer equipment can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer equipment can be a touch layer covered on the display screen, a key, a track ball or a touch pad arranged on the shell of the computer equipment, an external keyboard, a touch pad or a mouse and the like.
Those skilled in the art will appreciate that the architecture shown in fig. 6 is merely a block diagram of part of the structure related to the solution of the present application and does not limit the computer devices to which the solution applies; a particular computer device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided, comprising a memory and a processor, the memory having a computer program stored therein, the processor implementing the following steps when executing the computer program:
planar scan images are acquired. And calculating the three-dimensional information of each pixel point according to the plane scanning image. And obtaining exposure compensation parameters according to the planar scanning image, the three-dimensional information of each pixel point, the transformation relation between the camera coordinate system and the projector coordinate system and the standard light field. And carrying out exposure adjustment according to the exposure compensation parameters.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
and acquiring a calibration image of the projection coded light modulated by the calibration piece. And acquiring the three-dimensional information of each pixel point of the modulated calibration image and the gray value of each pixel point of the modulated calibration image. And obtaining a standard light field according to the three-dimensional information of each pixel point of the modulated calibration image, the gray value of each pixel point of the modulated calibration image and the transformation relation between the camera coordinate system and the projector coordinate system.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
and taking each pixel in the planar scanning image and a preset number of pixels adjacent to the pixel in the periphery of the planar scanning image as a judgment area of the current pixel. And (4) calculating the fragrance concentration entropies of all the judgment areas. And comparing the fragrance concentration entropies of all the judgment areas with the preset fragrance concentration entropies. And if the aroma entropy of the judgment area is greater than the preset aroma entropy, the pixel corresponding to the corresponding judgment area is the image pixel of the region of interest.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
and acquiring the gray value of each pixel point of the image of the region of interest. And calculating the corresponding relation between the gray value of each pixel point of the image of the region of interest and each pixel point of the projector according to the gray value of each pixel point of the image of the region of interest, the three-dimensional information of each pixel point of the image of the region of interest and the transformation relation between the camera coordinate system and the projector coordinate system. And comparing the gray value of each pixel point of the image of the region of interest with the gray value of the corresponding pixel point of the standard light field according to the gray value of each pixel point of the image of the region of interest and the corresponding relation between the gray value of each pixel point of the image of the region of interest and each pixel point of the projector to obtain an exposure compensation parameter.
In one embodiment, a computer-readable storage medium is provided, having a computer program stored thereon, which when executed by a processor, performs the steps of:
planar scan images are acquired. And calculating the three-dimensional information of each pixel point according to the plane scanning image. And obtaining exposure compensation parameters according to the planar scanning image, the three-dimensional information of each pixel point, the transformation relation between the camera coordinate system and the projector coordinate system and the standard light field. And carrying out exposure adjustment according to the exposure compensation parameters.
In one embodiment, the computer program when executed by the processor further performs the steps of:
and acquiring a calibration image of the projection coded light modulated by the calibration piece. And acquiring the three-dimensional information of each pixel point of the modulated calibration image and the gray value of each pixel point of the modulated calibration image. And obtaining a standard light field according to the three-dimensional information of each pixel point of the modulated calibration image, the gray value of each pixel point of the modulated calibration image and the transformation relation between the camera coordinate system and the projector coordinate system.
In one embodiment, the computer program when executed by the processor further performs the steps of:
and taking each pixel in the planar scanning image and a preset number of pixels adjacent to the pixel in the periphery of the planar scanning image as a judgment area of the current pixel. And (4) calculating the fragrance concentration entropies of all the judgment areas. And comparing the fragrance concentration entropies of all the judgment areas with the preset fragrance concentration entropies. And if the aroma entropy of the judgment area is greater than the preset aroma entropy, the pixel corresponding to the corresponding judgment area is the image pixel of the region of interest.
In one embodiment, the computer program when executed by the processor further performs the steps of:
and acquiring the gray value of each pixel point of the image of the region of interest. And calculating the corresponding relation between the gray value of each pixel point of the image of the region of interest and each pixel point of the projector according to the gray value of each pixel point of the image of the region of interest, the three-dimensional information of each pixel point of the image of the region of interest and the transformation relation between the camera coordinate system and the projector coordinate system. And comparing the gray value of each pixel point of the image of the region of interest with the gray value of the corresponding pixel point of the standard light field according to the gray value of each pixel point of the image of the region of interest and the corresponding relation between the gray value of each pixel point of the image of the region of interest and each pixel point of the projector to obtain an exposure compensation parameter.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing the relevant hardware; the computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, database or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM) or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM) and Rambus dynamic RAM (RDRAM).
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of these technical features are described, but any combination that contains no contradiction should be considered to be within the scope of this specification.
The above embodiments express only several implementations of the present application and are described in relative detail, but they should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. An exposure adjustment method, characterized in that the method comprises:
acquiring a planar scanning image;
calculating three-dimensional information of each pixel point according to the plane scanning image;
acquiring the gray value of each pixel point in the plane scanning image; calculating the corresponding relation between the gray value of each pixel point of the plane scanning image and each pixel point of the projector according to the transformation relation between a camera coordinate system and a projector coordinate system, the gray value of each pixel point of the plane scanning image and the three-dimensional information, and obtaining an exposure compensation parameter according to the gray value of each pixel point of the plane scanning image, the corresponding relation and a standard light field;
and carrying out exposure adjustment according to the exposure compensation parameters.
2. The method of claim 1, wherein prior to acquiring the plan scan image, further comprising:
obtaining a modulated calibration image; the modulated calibration image is obtained by modulating the projection coded light by the calibration piece;
acquiring three-dimensional information of each pixel point of the modulated calibration image and a gray value of each pixel point of the modulated calibration image;
and obtaining a standard light field according to the three-dimensional information of each pixel point of the modulated calibration image, the gray value of each pixel point of the modulated calibration image and the transformation relation between the camera coordinate system and the projector coordinate system.
3. The method of claim 2, wherein the calculating the three-dimensional information of each pixel point according to the plane scanning image comprises:
extracting a region-of-interest image in the planar scanning image;
and calculating the three-dimensional information of each pixel point of the interested area image according to the interested area image.
4. The method of claim 3, wherein the extracting the region of interest image in the plan scan image comprises:
taking each pixel in the planar scanning image and a preset number of pixels adjacent to the pixel in the periphery of the planar scanning image as a judgment area of the current pixel;
calculating the Shannon entropy of all the judgment areas;
comparing the Shannon entropy of all the judgment areas with a preset Shannon entropy;
and if the Shannon entropy of a judgment area is greater than the preset Shannon entropy, the pixel corresponding to the corresponding judgment area is a pixel of the region-of-interest image.
5. The method according to claim 3, wherein the calculating three-dimensional information of each pixel point of the region-of-interest image according to the region-of-interest image comprises:
calculating pixel point three-dimensional information obtained by measurement according to the interested region image;
and obtaining the three-dimensional information of each pixel point of the interested region image by interpolation of the measured three-dimensional information of the pixel points.
6. The method according to claim 3, wherein the gray value of each pixel point in the plane scanning image is obtained; calculating the corresponding relation between the gray value of each pixel point of the plane scanning image and each pixel point of the projector according to the transformation relation between the camera coordinate system and the projector coordinate system, the gray value of each pixel point of the plane scanning image and the three-dimensional information, and obtaining the exposure compensation parameter according to the gray value of each pixel point of the plane scanning image, the corresponding relation and the standard light field, wherein the exposure compensation parameter comprises the following steps:
acquiring the gray value of each pixel point of the image of the region of interest;
calculating the corresponding relation between the gray value of each pixel point of the image of the region of interest and each pixel point of the projector according to the gray value of each pixel point of the image of the region of interest, the three-dimensional information of each pixel point of the image of the region of interest and the transformation relation between the camera coordinate system and the projector coordinate system;
and comparing the gray value of each pixel point of the image of the region of interest with the gray value of the corresponding pixel point of the standard light field according to the gray value of each pixel point of the image of the region of interest and the corresponding relation between the gray value of each pixel point of the image of the region of interest and each pixel point of the projector to obtain an exposure compensation parameter.
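One way to realise the camera-to-projector correspondence of claim 6 is an undistorted pinhole projection; the sketch below assumes the calibrated transformation is available as a rotation `R`, a translation `t`, and a projector intrinsic matrix `K_proj`, which is an assumption rather than the claimed method itself:

```python
import numpy as np

def camera_to_projector_pixels(points_cam, K_proj, R, t):
    """Transform each ROI pixel's 3-D point from the camera coordinate
    system into the projector coordinate system and project it with a
    pinhole model, yielding the corresponding projector pixel. K_proj (3x3),
    R (3x3) and t (3,) are assumed to come from calibration; lens distortion
    is ignored for brevity."""
    pts = np.asarray(points_cam, dtype=np.float64).reshape(-1, 3)
    pts_proj = pts @ R.T + t                  # camera frame -> projector frame
    uvw = pts_proj @ K_proj.T                 # pinhole projection
    uv = uvw[:, :2] / uvw[:, 2:3]             # divide by depth
    return np.rint(uv).astype(int)            # integer projector pixel (u, v)
```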
7. The method of claim 6, wherein the comparing the gray value of each pixel point of the image of the region of interest with the gray value of the corresponding pixel point of the standard light field to obtain the exposure compensation parameter comprises:
if the gray value of a pixel point of the image of the region of interest is smaller than the gray value of the corresponding pixel point of the standard light field, the exposure compensation parameter is to increase the projection brightness of the corresponding pixel point of the projector;
and if the gray value of a pixel point of the image of the region of interest is greater than the gray value of the corresponding pixel point of the standard light field, the exposure compensation parameter is to reduce the projection brightness of the corresponding pixel point of the projector.
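Combining claims 6 and 7, a sketch of deriving a per-projector-pixel exposure compensation parameter; the fixed `step` size (and returning a signed offset at all) is an assumption made for illustration:

```python
import numpy as np

def exposure_compensation(gray_roi, proj_uv, standard_field, step=1.0):
    """Compare each ROI pixel's gray value with the standard light field at
    its corresponding projector pixel and return a per-projector-pixel
    brightness adjustment: +step where the image is too dark, -step where it
    is too bright. The fixed step size is an assumption for illustration."""
    gray_roi = np.asarray(gray_roi, dtype=np.float64)
    comp = np.zeros_like(standard_field, dtype=np.float64)
    u, v = proj_uv[:, 0], proj_uv[:, 1]
    ref = standard_field[v, u]
    comp[v, u] = np.where(gray_roi < ref, step,
                          np.where(gray_roi > ref, -step, 0.0))
    return comp
```

In practice the adjustment could equally be made proportional to the gray-value difference rather than a fixed step.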
8. An exposure adjustment apparatus, characterized in that the apparatus comprises:
the acquisition module is used for acquiring a plane scanning image;
the three-dimensional information calculation module is used for calculating the three-dimensional information of each pixel point according to the plane scanning image;
the exposure compensation parameter calculation module is used for acquiring the gray value of each pixel point in the plane scanning image; calculating the corresponding relation between the gray value of each pixel point of the plane scanning image and each pixel point of the projector according to the transformation relation between a camera coordinate system and a projector coordinate system, the gray value of each pixel point of the plane scanning image and the three-dimensional information, and obtaining an exposure compensation parameter according to the gray value of each pixel point of the plane scanning image, the corresponding relation and a standard light field;
and the adjusting module is used for carrying out exposure adjustment according to the exposure compensation parameters.
9. A computer device comprising a memory and a processor, the memory storing a computer program, wherein the processor implements the steps of the method of any one of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 7.
CN201811160167.9A 2018-09-30 2018-09-30 Exposure adjusting method, exposure adjusting device, computer equipment and storage medium Active CN109510948B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811160167.9A CN109510948B (en) 2018-09-30 2018-09-30 Exposure adjusting method, exposure adjusting device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811160167.9A CN109510948B (en) 2018-09-30 2018-09-30 Exposure adjusting method, exposure adjusting device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN109510948A CN109510948A (en) 2019-03-22
CN109510948B true CN109510948B (en) 2020-11-17

Family

ID=65746376

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811160167.9A Active CN109510948B (en) 2018-09-30 2018-09-30 Exposure adjusting method, exposure adjusting device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN109510948B (en)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109764827B (en) * 2019-02-13 2021-06-29 盎锐(上海)信息科技有限公司 Synchronization method and device for projection grating modeling
WO2020237492A1 (en) * 2019-05-28 2020-12-03 深圳市汇顶科技股份有限公司 Three-dimensional reconstruction method, device, apparatus, and storage medium
TWI735953B (en) * 2019-09-18 2021-08-11 財團法人工業技術研究院 Three-dimension measurement device and operation method thereof
CN112584055A (en) * 2019-09-29 2021-03-30 深圳市光鉴科技有限公司 Brightness self-adaptive adjusting method, system, equipment and medium based on image coding
CN112584054A (en) * 2019-09-29 2021-03-30 深圳市光鉴科技有限公司 Brightness self-adaptive adjusting method, system, equipment and medium based on image coding
CN110913150B (en) * 2019-11-18 2021-04-02 西北工业大学 Self-adaptive exposure imaging method based on space platform
CN111540042B (en) * 2020-04-28 2023-08-11 上海盛晃光学技术有限公司 Method, device and related equipment for three-dimensional reconstruction
CN112036201B (en) * 2020-08-06 2022-06-17 浙江大华技术股份有限公司 Image processing method, device, equipment and medium
CN114253079B (en) * 2020-09-21 2024-04-09 浙江水晶光电科技股份有限公司 Gray scale photoetching light intensity correction method, device, equipment and storage medium
CN113340235B (en) * 2021-04-27 2022-08-12 成都飞机工业(集团)有限责任公司 Projection system based on dynamic projection and phase shift pattern generation method
CN113473034B (en) * 2021-07-02 2023-05-05 杭州思锐迪科技有限公司 Hole site light supplementing method, hole site light supplementing device, hole site scanning method and system
CN113870233B (en) * 2021-09-30 2022-07-15 常州市宏发纵横新材料科技股份有限公司 Binding yarn detection method, computer equipment and storage medium
CN116878402A (en) * 2023-07-11 2023-10-13 北京博科测试系统股份有限公司 Non-contact wheel arch measuring sensor and method

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103634544A (en) * 2012-08-20 2014-03-12 联想(北京)有限公司 A projection method and an electronic device
CN104601899A (en) * 2013-10-30 2015-05-06 佳能株式会社 Image processing apparatus and image processing method
CN105049664A (en) * 2015-08-12 2015-11-11 杭州思看科技有限公司 Method for light filling control of handheld three-dimensional laser scanner
CN105068384A (en) * 2015-08-12 2015-11-18 杭州思看科技有限公司 Method for controlling exposure time of laser projectors of handheld three-dimensional laser scanner
CN105222725A (en) * 2015-09-24 2016-01-06 大连理工大学 A kind of high-definition image dynamic collecting method based on spectral analysis
CN106303206A (en) * 2015-06-12 2017-01-04 西安蒜泥电子科技有限责任公司 The camera system localization method of a kind of body-scanner and device
CN107071277A (en) * 2017-03-31 2017-08-18 努比亚技术有限公司 A kind of light paints filming apparatus, method and mobile terminal
CN107134005A (en) * 2017-05-04 2017-09-05 网易(杭州)网络有限公司 Illumination adaptation method, device, storage medium, processor and terminal
CN107784672A (en) * 2016-08-26 2018-03-09 百度在线网络技术(北京)有限公司 For the method and apparatus for the external parameter for obtaining in-vehicle camera
CN107869968A (en) * 2017-12-01 2018-04-03 杭州测度科技有限公司 A kind of quick three-dimensional scan method and system suitable for complex object surface

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130051516A1 (en) * 2011-08-31 2013-02-28 Carestream Health, Inc. Noise suppression for low x-ray dose cone-beam image reconstruction
US10101154B2 (en) * 2015-12-21 2018-10-16 Intel Corporation System and method for enhanced signal to noise ratio performance of a depth camera system

Also Published As

Publication number Publication date
CN109510948A (en) 2019-03-22

Similar Documents

Publication Publication Date Title
CN109510948B (en) Exposure adjusting method, exposure adjusting device, computer equipment and storage medium
CN109829930B (en) Face image processing method and device, computer equipment and readable storage medium
CN109118569B (en) Rendering method and device based on three-dimensional model
US10997696B2 (en) Image processing method, apparatus and device
CN112258579B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
US9886759B2 (en) Method and system for three-dimensional data acquisition
KR20200044093A (en) Image processing methods and devices, electronic devices and computer-readable storage media
US20170103510A1 (en) Three-dimensional object model tagging
CN107592449B (en) Three-dimensional model establishing method and device and mobile terminal
CN107517346B (en) Photographing method and device based on structured light and mobile device
CN107610171B (en) Image processing method and device
CN108682050B (en) Three-dimensional model-based beautifying method and device
TW201525415A (en) Method and system for calibrating laser measuring apparatus
WO2018040480A1 (en) Method and device for adjusting scanning state
CN107392874B (en) Beauty treatment method and device and mobile equipment
CN111256628A (en) Wall surface flatness detection method and device, computer equipment and storage medium
CN107493452B (en) Video picture processing method and device and terminal
US11523056B2 (en) Panoramic photographing method and device, camera and mobile terminal
US20220392027A1 (en) Method for calibrating image distortion, apparatus, electronic device and storage medium
CN113496542A (en) Multi-exposure image modeling method and device, computer equipment and storage medium
CN107370952B (en) Image shooting method and device
US11748908B1 (en) Systems and methods for generating point-accurate three-dimensional models with point-accurate color information from a non-cosited capture
KR102195762B1 (en) Acquisition method for high quality 3-dimension spatial information using photogrammetry
CN114359401A (en) Calibration method, system and equipment
CN107515844B (en) Font setting method and device and mobile device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant