CN115484408A - Snow surface reflection coefficient generation method and device based on high-precision camera shooting
Snow surface reflection coefficient generation method and device based on high-precision camera shooting
- Publication number
- CN115484408A (application CN202211087163.9A)
- Authority
- CN
- China
- Prior art keywords
- snow surface
- image data
- snow
- surface image
- gray
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/14—Picture signal circuitry for video frequency region
- H04N5/20—Circuitry for controlling amplitude response
- H04N5/202—Gamma control
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/64—Circuits for processing colour signals
- H04N9/68—Circuits for processing colour signals for controlling the amplitude of colour signals, e.g. automatic chroma control circuits
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Image Processing (AREA)
Abstract
The invention discloses a snow surface reflection coefficient generation method and device based on high-precision camera shooting. The method comprises the following steps: acquiring first snow surface image data; performing binarization grayscale processing on the first snow surface image data to obtain second snow surface image data; extracting the image units in the second snow surface image data that exceed a preset gray threshold, and combining a Hoss factor with those image units to obtain a strong light parameter and a weak light parameter; and generating a snow surface reflection coefficient and a high-precision snow surface image from the strong light parameter and the weak light parameter. The invention addresses the technical problem that, in existing snow-field image optimization, the original image data are collected only by fusing the environmental factors of the snow field and adjusting the operating parameters of the device, so that a reflection coefficient cannot be calculated from the actual reflection of the snow surface and the image cannot be further optimized with parameters suited to that reflection.
Description
Technical Field
The invention relates to the field of image parameter processing, and in particular to a snow surface reflection coefficient generation method and device based on high-precision camera shooting.
Background
With the continuous development of intelligent technology, people increasingly use intelligent devices in daily life, work, and study; these intelligent means have improved quality of life and raised the efficiency of study and work.
At present, for high-precision monitoring equipment in a snow field, the aperture and luminosity processing strategy of the high-precision camera equipment are adjusted only according to the environmental factors of the snow field, in order to overcome problems such as unclear images and singular image points caused by snow surface reflection. However, in the existing snow-field image optimization process, the original image data are collected only by fusing the environmental factors of the snow field and adjusting the operating parameters of the device, so that a reflection coefficient cannot be calculated from the actual reflection of the snow surface and the image cannot be further optimized with parameters suited to that reflection.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
The embodiments of the invention provide a snow surface reflection coefficient generation method and device based on high-precision camera shooting, which at least solve the technical problem that, in the prior art, the original image data used for snow-field image optimization are collected only by fusing the environmental factors of the snow field and adjusting the operating parameters of the device, so that a reflection coefficient cannot be calculated from the actual reflection of the snow surface and the image cannot be further optimized with parameters suited to that reflection.
According to one aspect of the embodiment of the invention, a snow surface reflection coefficient generation method based on high-precision camera shooting is provided, and comprises the following steps: acquiring first snow surface image data; carrying out binarization gray level processing on the first snow surface image data to obtain second snow surface image data; extracting image units larger than a preset gray threshold value in the second snow surface image data, and combining a Hoss factor with the image units larger than the preset gray threshold value to obtain a strong light parameter and a weak light parameter; and generating a snow surface reflection coefficient and a high-precision snow surface image according to the strong light parameters and the weak light parameters.
Optionally, performing binarization grayscale processing on the first snow surface image data to obtain the second snow surface image data includes: extracting a target gray factor from the binary gray factor matrix according to the precision requirement of the snow field image; and generating a grayscale processing result from the target gray factor and the first snow surface image data, where the grayscale processing result includes the second snow surface image data and the gray processing parameter.
Optionally, extracting the image units larger than the preset gray threshold from the second snow surface image data, and combining the Hoss factor with those image units to obtain the strong light parameter and the weak light parameter, includes: generating the preset gray threshold according to the gray processing parameter; comparing the gray parameter of the second snow surface image data with the preset gray threshold to obtain the image units larger than the preset gray threshold; and calculating the strong light parameter and the weak light parameter from the image units larger than the preset gray threshold and the Hoss factor, where the calculation formula is:
[L1, L2] = H⁻¹ · P_w
where L1 and L2 are the strong light parameter and the weak light parameter respectively, H is the Hoss factor used to calculate the reflection coefficient, and P_w is the set of image-unit data greater than the preset gray threshold.
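As a purely illustrative worked instance (the dimensions of H and the contents of P_w are not fixed here, so the numbers are assumptions): taking H = [[2, 0], [0, 4]] and P_w = [180, 40]ᵀ gives [L1, L2] = H⁻¹ · P_w = [90, 10], i.e. a strong light parameter of 90 and a weak light parameter of 10 on the gray-level scale.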
Optionally, the generating a snow surface reflection coefficient and a high-precision snow surface image according to the strong light parameter and the weak light parameter includes: inputting the strong light parameters and the weak light parameters into a reflection coefficient model to obtain the snow surface reflection coefficient; and optimizing the first snow surface image data by using the snow surface reflection coefficient to obtain the high-precision snow surface image.
According to another aspect of the embodiments of the present invention, there is also provided a snow surface reflection coefficient generation apparatus based on high-precision image pickup, including: the acquisition module is used for acquiring first snow surface image data; the processing module is used for carrying out binarization gray level processing on the first snow surface image data to obtain second snow surface image data; the extraction module is used for extracting image units which are larger than a preset gray threshold value in the second snow surface image data, and combining the Hoss factor with the image units which are larger than the preset gray threshold value to obtain a strong light parameter and a weak light parameter; and the generating module is used for generating a snow surface reflection coefficient and a high-precision snow surface image according to the strong light parameter and the weak light parameter.
Optionally, the processing module includes: the extraction unit is used for extracting a target gray factor from the binary gray factor matrix according to the precision requirement of the snow field image; a processing unit configured to generate a grayscale processing result according to the target grayscale factor and the first snow surface image data, wherein the grayscale processing result includes: the second snow surface image data and the gray processing parameter.
Optionally, the extracting module includes: a generating unit, configured to generate the preset gray threshold according to the gray processing parameter; a comparison unit, configured to compare the gray parameter of the second snow surface image data with the preset gray threshold to obtain the image units larger than the preset gray threshold; and a calculation unit, configured to calculate the strong light parameter and the weak light parameter from the image units larger than the preset gray threshold and the Hoss factor, where the calculation formula is:
[L1, L2] = H⁻¹ · P_w
where L1 and L2 are the strong light parameter and the weak light parameter respectively, H is the Hoss factor used to calculate the reflection coefficient, and P_w is the set of image-unit data greater than the preset gray threshold.
Optionally, the generating module includes: the input unit is used for inputting the strong light parameters and the weak light parameters into a reflection coefficient model to obtain the snow surface reflection coefficient; and the optimization unit is used for optimizing the first snow surface image data by using the snow surface reflection coefficient to obtain the high-precision snow surface image.
According to another aspect of the embodiments of the present invention, there is also provided a non-volatile storage medium including a stored program, wherein the program controls, when running, an apparatus in which the non-volatile storage medium is located to execute a snow surface reflectance generation method based on high-precision image capturing.
According to another aspect of the embodiments of the present invention, there is also provided an electronic device, including a processor and a memory; the memory stores computer-readable instructions, and the processor is configured to execute the computer-readable instructions, wherein the computer-readable instructions, when run, execute a snow surface reflection coefficient generation method based on high-precision camera shooting.
In the embodiments of the invention, first snow surface image data are acquired; binarization grayscale processing is performed on the first snow surface image data to obtain second snow surface image data; the image units in the second snow surface image data that exceed a preset gray threshold are extracted, and a Hoss factor is combined with those image units to obtain a strong light parameter and a weak light parameter; and a snow surface reflection coefficient and a high-precision snow surface image are generated from the strong light parameter and the weak light parameter. This solves the technical problem that, in the prior art, the original image data used for snow-field image optimization are collected only by fusing the environmental factors of the snow field and adjusting the operating parameters of the device, so that a reflection coefficient cannot be calculated from the actual reflection of the snow surface and the image cannot be further optimized with parameters suited to that reflection.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention and do not constitute a limitation of the invention. In the drawings:
fig. 1 is a flowchart of a snow surface reflection coefficient generation method based on high-precision image pickup according to an embodiment of the present invention;
fig. 2 is a block diagram showing the structure of a snow surface reflection coefficient generating apparatus based on high-precision image pickup according to an embodiment of the present invention;
fig. 3 is a block diagram of a terminal device for performing a method according to the present invention, according to an embodiment of the present invention;
fig. 4 is a memory unit for holding or carrying program code implementing a method according to the invention, according to an embodiment of the invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
According to an embodiment of the present invention, there is provided a method embodiment of a snow surface reflectance generation method based on high-precision image capturing, it is noted that the steps illustrated in the flowchart of the drawings may be executed in a computer system such as a set of computer-executable instructions, and that while a logical order is illustrated in the flowchart, in some cases, the steps illustrated or described may be executed in an order different from that herein.
Example one
Fig. 1 is a flowchart of a snow surface reflection coefficient generation method based on high-precision image pickup according to an embodiment of the present invention, and as shown in fig. 1, the method includes the following steps:
step S102, first snow surface image data is acquired.
Specifically, in order to solve the technical problem that, in the prior art, the original image data used for snow-field image optimization are collected only by fusing the environmental factors of the snow field and adjusting the operating parameters of the device, so that a reflection coefficient cannot be calculated from the actual reflection of the snow surface and the image cannot be further optimized with parameters suited to that reflection, first snow surface image data must first be acquired by the high-precision multi-lens camera equipment arranged on the snow surface of the snow field. The first snow surface image data may be an original snow-surface image, whose pixels and imaging mode use the most direct imaging effect, so that the reflection coefficient can later be calculated from the acquired image data.
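For illustration only, the acquisition step could be sketched in Python as below; the embodiment does not prescribe an acquisition API, so OpenCV, the camera index, and the error handling are assumptions.

```python
import cv2  # OpenCV; assumed to be available on the capture host

def acquire_first_snow_image(camera_index: int = 0):
    """Grab one raw frame as 'first snow surface image data' (illustrative only)."""
    cap = cv2.VideoCapture(camera_index)
    if not cap.isOpened():
        raise RuntimeError("snow-field camera not available")
    ok, frame = cap.read()  # raw BGR frame, no in-camera optimization applied
    cap.release()
    if not ok:
        raise RuntimeError("failed to read a frame from the camera")
    return frame
```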
And step S104, performing binarization grayscale processing on the first snow surface image data to obtain second snow surface image data.
Optionally, performing binarization grayscale processing on the first snow surface image data to obtain the second snow surface image data includes: extracting a target gray factor from the binary gray factor matrix according to the precision requirement of the snow field image; and generating a grayscale processing result from the target gray factor and the first snow surface image data, where the grayscale processing result includes the second snow surface image data and the gray processing parameter.
Specifically, in order to process the first snow surface image into second snow surface image data carrying a grayscale feature, on which the subsequent reflection coefficient calculation is based, binarization grayscale processing must be performed on the first snow surface image. Going one step beyond plain black-and-white binarization, this processing determines a linkage coefficient between black and white from a gray factor, so that the gray parameter can be set, the grayscale processing can be carried out to the required degree, and the prerequisite quantities for the reflection coefficient calculation can be found. For example, performing binarization grayscale processing on the first snow surface image data to obtain the second snow surface image data includes: extracting a target gray factor from the binary gray factor matrix according to the precision requirement of the snow field image; and generating a grayscale processing result from the target gray factor and the first snow surface image data, where the grayscale processing result includes the second snow surface image data and the gray processing parameter. The binary gray factor matrix pairs entries such as [H, W], where H is a binary gray factor constant and W is the corresponding precision requirement; that is, different precision requirements and reflection coefficient requirements lead to different gray factors being extracted in the binarization process.
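A minimal Python sketch of this step is given below; the contents of the gray-factor matrix, the nearest-match selection rule, and the cutoff-style thresholding are assumptions for illustration, since the embodiment does not fix them.

```python
import numpy as np
import cv2

# Hypothetical binary gray-factor matrix: each row pairs a gray factor constant H
# with the precision requirement W it serves, in the spirit of the [H, W] pairing above.
GRAY_FACTOR_MATRIX = np.array([
    [0.30, 1.0],   # low precision requirement
    [0.50, 2.0],   # medium precision requirement
    [0.75, 4.0],   # high precision requirement
])

def binarization_grayscale(first_image: np.ndarray, precision_requirement: float):
    """Return (second_image, gray_processing_parameter); a sketch of step S104."""
    # pick the target gray factor whose precision requirement is closest to the request
    row = int(np.argmin(np.abs(GRAY_FACTOR_MATRIX[:, 1] - precision_requirement)))
    gray_factor = float(GRAY_FACTOR_MATRIX[row, 0])

    gray = cv2.cvtColor(first_image, cv2.COLOR_BGR2GRAY).astype(np.float32)
    cutoff = gray_factor * 255.0                         # gray factor scales the cutoff
    second_image = np.where(gray >= cutoff, gray, 0.0)   # keep gray levels above the cutoff
    gray_processing_parameter = {"gray_factor": gray_factor, "cutoff": cutoff}
    return second_image, gray_processing_parameter
```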
In the field of computing, a grayscale digital image is an image in which each pixel carries only one sampled value. Such images are typically displayed in shades of gray ranging from the darkest black to the brightest white, although in theory the samples could be different shades of any color, or even different colors at different brightnesses. A grayscale image differs from a black-and-white image: in computer imaging, a black-and-white image has only the two colors black and white, whereas a grayscale image has many levels of depth between black and white. Outside the field of digital images, however, "black-and-white image" often also means "grayscale image"; a grayscale photograph, for example, is usually called a "black-and-white photograph". In some articles on digital images, monochrome images are equated with grayscale images, and in others with black-and-white images. Grayscale images are often obtained by measuring the brightness of each pixel within a single band of the electromagnetic spectrum, such as visible light. Grayscale images for display are typically stored with a non-linear scale of 8 bits per sampled pixel, giving 256 gray levels (2^8 = 256). This precision avoids visible banding distortion and is very easy to program. In medical and remote-sensing applications more levels are often used, to take full advantage of sensors with 10 or 12 bits per sample and to avoid approximation errors in calculations; 16 bits, i.e. 65536 levels (or 65536 colors), are popular in such applications.
And step S106, extracting image units which are larger than a preset gray threshold value in the second snow surface image data, and combining the Hoss factor with the image units which are larger than the preset gray threshold value to obtain a strong light parameter and a weak light parameter.
Optionally, extracting the image units larger than the preset gray threshold from the second snow surface image data, and combining the Hoss factor with those image units to obtain the strong light parameter and the weak light parameter, includes: generating the preset gray threshold according to the gray processing parameter; comparing the gray parameter of the second snow surface image data with the preset gray threshold to obtain the image units larger than the preset gray threshold; and calculating the strong light parameter and the weak light parameter from the image units larger than the preset gray threshold and the Hoss factor, where the calculation formula is:
[L1, L2] = H⁻¹ · P_w
where L1 and L2 are the strong light parameter and the weak light parameter respectively, H is the Hoss factor used to calculate the reflection coefficient, and P_w is the set of image-unit data greater than the preset gray threshold.
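A numerical sketch of this formula follows; because the embodiment does not fix the dimensions of the Hoss factor, the sketch assumes H has two columns and uses a pseudo-inverse in place of H⁻¹ when H is not square.

```python
import numpy as np

def light_parameters(bright_units: np.ndarray, hoss_factor: np.ndarray):
    """
    Numerical sketch of [L1, L2] = H^(-1) · P_w.

    bright_units : P_w, gray values of the image units above the preset threshold,
                   flattened here into a column vector.
    hoss_factor  : H, assumed to have two columns so that its (pseudo-)inverse maps
                   P_w onto exactly two light parameters.
    """
    p_w = np.asarray(bright_units, dtype=np.float64).reshape(-1, 1)
    # np.linalg.pinv plays the role of H^(-1) and also covers the non-square case
    L = (np.linalg.pinv(np.asarray(hoss_factor, dtype=np.float64)) @ p_w).ravel()
    return float(L[0]), float(L[1])   # strong light parameter, weak light parameter

# Tiny worked call (the values are invented):
# light_parameters(np.array([180.0, 40.0]), np.array([[2.0, 0.0], [0.0, 4.0]]))
# -> (90.0, 10.0)
```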
And S108, generating a snow surface reflection coefficient and a high-precision snow surface image according to the strong light parameters and the weak light parameters.
Specifically, after the strong light parameter and the weak light parameter are obtained, the snow surface reflection coefficient must be calculated from these two variable parameters of the snow-field image data, so that the high-precision snow surface image can be optimized and the risk of an unclear or flawed snow surface image caused by reflection eliminated.
Optionally, the generating a snow surface reflection coefficient and a high-precision snow surface image according to the strong light parameter and the weak light parameter includes: inputting the strong light parameters and the weak light parameters into a reflection coefficient model to obtain the snow surface reflection coefficient; and optimizing the first snow surface image data by using the snow surface reflection coefficient to obtain the high-precision snow surface image.
Specifically, to speed up the calculation based on the strong light parameter and the weak light parameter, a reflection coefficient calculation model may be trained in advance on a body of historical data; the model may train its adversarial hidden layers with a GAN neural network, so as to accelerate and refine control of the input vector and guarantee the accuracy of the output vector. For example, generating the snow surface reflection coefficient and the high-precision snow surface image from the strong light parameter and the weak light parameter includes: inputting the strong light parameter and the weak light parameter into the reflection coefficient model to obtain the snow surface reflection coefficient; and optimizing the first snow surface image data with the snow surface reflection coefficient to obtain the high-precision snow surface image.
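A minimal Python sketch of this step is given below; the linear stand-in for the trained reflection coefficient model and the glare-attenuation rule are assumptions for illustration, not the embodiment's concrete model.

```python
import numpy as np

def reflection_coefficient(L1: float, L2: float) -> float:
    """Stand-in for the trained reflection-coefficient model.
    A real implementation would load a model trained on historical (L1, L2) pairs
    (the embodiment mentions a GAN); the normalization and weights here are assumptions."""
    w1, w2 = 0.6, 0.4                     # hypothetical learned weights
    return (w1 * L1 + w2 * L2) / 255.0    # map the gray-level scale roughly into [0, 1]

def optimize_snow_image(first_image: np.ndarray, coefficient: float) -> np.ndarray:
    """Attenuate reflective glare using the snow surface reflection coefficient.
    Scaling pixel values by 1 / (1 + coefficient) is an assumption for illustration,
    not the embodiment's exact optimization rule."""
    img = first_image.astype(np.float32)
    optimized = np.clip(img / (1.0 + coefficient), 0.0, 255.0)
    return optimized.astype(np.uint8)
```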
This embodiment thus solves the technical problem that, in the prior art, the original image data used for snow-field image optimization are collected only by fusing the environmental factors of the snow field and adjusting the operating parameters of the device, so that a reflection coefficient cannot be calculated from the actual reflection of the snow surface and the image cannot be further optimized with parameters suited to that reflection.
Example two
Fig. 2 is a block diagram of a snow surface reflection coefficient generation apparatus based on high-precision camera shooting according to an embodiment of the present invention. As shown in fig. 2, the apparatus includes:
an obtaining module 20 is configured to obtain first snow surface image data.
Specifically, in order to solve the technical problem that, in the prior art, the original image data used for snow-field image optimization are collected only by fusing the environmental factors of the snow field and adjusting the operating parameters of the device, so that a reflection coefficient cannot be calculated from the actual reflection of the snow surface and the image cannot be further optimized with parameters suited to that reflection, first snow surface image data must first be acquired by the high-precision multi-lens camera equipment arranged on the snow surface of the snow field. The first snow surface image data may be an acquired original snow-surface image, whose pixels and imaging mode use the most direct imaging effect, so that the reflection coefficient can subsequently be calculated from the acquired image data.
And the processing module 22 is configured to perform binarization grayscale processing on the first snow surface image data to obtain second snow surface image data.
Optionally, the processing module includes: the extraction unit is used for extracting a target gray-scale factor from the binary gray-scale factor matrix according to the precision requirement of the snow field image; a processing unit configured to generate a grayscale processing result according to the target grayscale factor and the first snow surface image data, wherein the grayscale processing result includes: the second snow surface image data and the gray processing parameter.
Specifically, in order to process the first snow surface image into second snow surface image data carrying a grayscale feature, on which the subsequent reflection coefficient calculation is based, binarization grayscale processing must be performed on the first snow surface image. Going one step beyond plain black-and-white binarization, this processing determines a linkage coefficient between black and white from a gray factor, so that the gray parameter can be set, the grayscale processing can be carried out to the required degree, and the prerequisite quantities for the reflection coefficient calculation can be found. For example, performing binarization grayscale processing on the first snow surface image data to obtain the second snow surface image data includes: extracting a target gray factor from the binary gray factor matrix according to the precision requirement of the snow field image; and generating a grayscale processing result from the target gray factor and the first snow surface image data, where the grayscale processing result includes the second snow surface image data and the gray processing parameter. The binary gray factor matrix pairs entries such as [H, W], where H is a binary gray factor constant and W is the corresponding precision requirement; that is, different precision requirements and reflection coefficient requirements lead to different gray factors being extracted in the binarization process.
It should be noted that, in the field of computing, a grayscale digital image is an image in which each pixel carries only one sampled value. Such images are typically displayed in shades of gray ranging from the darkest black to the brightest white, although in theory the samples could be different shades of any color, or even different colors at different brightnesses. A grayscale image differs from a black-and-white image: in computer imaging, a black-and-white image has only the two colors black and white, whereas a grayscale image has many levels of depth between black and white. Outside the field of digital images, however, "black-and-white image" often also means "grayscale image"; a grayscale photograph, for example, is usually called a "black-and-white photograph". In some articles on digital images, monochrome images are equated with grayscale images, and in others with black-and-white images. Grayscale images are often obtained by measuring the brightness of each pixel within a single band of the electromagnetic spectrum, such as visible light. Grayscale images for display are typically stored with a non-linear scale of 8 bits per sampled pixel, giving 256 gray levels (2^8 = 256). This precision avoids visible banding distortion and is very easy to program. In medical and remote-sensing applications more levels are often used, to take full advantage of sensors with 10 or 12 bits per sample and to avoid approximation errors in calculations; 16 bits, i.e. 65536 levels (or 65536 colors), are popular in such applications.
And the extracting module 24 is configured to extract the image units in the second snow surface image data that exceed a preset gray threshold, and to combine the Hoss factor with those image units to obtain a strong light parameter and a weak light parameter.
Optionally, the extracting module includes: a generating unit, configured to generate the preset gray threshold according to the gray processing parameter; a comparison unit, configured to compare the gray parameter of the second snow surface image data with the preset gray threshold to obtain the image units larger than the preset gray threshold; and a calculation unit, configured to calculate the strong light parameter and the weak light parameter from the image units larger than the preset gray threshold and the Hoss factor, where the calculation formula is:
[L1, L2] = H⁻¹ · P_w
where L1 and L2 are the strong light parameter and the weak light parameter respectively, H is the Hoss factor used to calculate the reflection coefficient, and P_w is the set of image-unit data greater than the preset gray threshold.
And the generating module 26 is used for generating a snow surface reflection coefficient and a high-precision snow surface image according to the strong light parameter and the weak light parameter.
Specifically, after the strong light parameter and the weak light parameter are obtained, the snow surface reflection coefficient must be calculated from these two variable parameters of the snow-field image data, so that the high-precision snow surface image can be optimized and the risk of an unclear or flawed snow surface image caused by reflection eliminated.
Optionally, the generating module includes: the input unit is used for inputting the strong light parameters and the weak light parameters into a reflection coefficient model to obtain the snow surface reflection coefficient; and the optimization unit is used for optimizing the first snow surface image data by utilizing the snow surface reflection coefficient to obtain the high-precision snow surface image.
Specifically, to speed up the calculation based on the strong light parameter and the weak light parameter, a reflection coefficient calculation model may be trained in advance on a body of historical data; the model may train its adversarial hidden layers with a GAN neural network, so as to accelerate and refine control of the input vector and guarantee the accuracy of the output vector. For example, generating the snow surface reflection coefficient and the high-precision snow surface image from the strong light parameter and the weak light parameter includes: inputting the strong light parameter and the weak light parameter into the reflection coefficient model to obtain the snow surface reflection coefficient; and optimizing the first snow surface image data with the snow surface reflection coefficient to obtain the high-precision snow surface image.
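A minimal Python sketch of how the four modules might be composed is shown below; the wiring and the callables it delegates to are assumptions for exposition, not the embodiment's concrete implementation.

```python
class SnowReflectanceDevice:
    """Illustrative composition of the four modules described above."""

    def __init__(self, acquire, process, extract, generate):
        self.acquire = acquire     # acquisition module: returns first snow surface image data
        self.process = process     # processing module: binarization grayscale processing
        self.extract = extract     # extraction module: threshold + Hoss factor -> (L1, L2)
        self.generate = generate   # generation module: reflection coefficient + optimized image

    def run(self, precision_requirement: float):
        first = self.acquire()
        second, gray_param = self.process(first, precision_requirement)
        L1, L2 = self.extract(second, gray_param)
        return self.generate(first, L1, L2)
```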
This embodiment thus solves the technical problem that, in the prior art, the original image data used for snow-field image optimization are collected only by fusing the environmental factors of the snow field and adjusting the operating parameters of the device, so that a reflection coefficient cannot be calculated from the actual reflection of the snow surface and the image cannot be further optimized with parameters suited to that reflection.
According to another aspect of the embodiments of the present invention, there is also provided a non-volatile storage medium including a stored program, wherein the program controls, when running, an apparatus in which the non-volatile storage medium is located to execute a snow surface reflectance generation method based on high-precision image capturing.
Specifically, the method comprises the following steps: acquiring first snow surface image data; performing binarization grayscale processing on the first snow surface image data to obtain second snow surface image data; extracting the image units in the second snow surface image data that exceed a preset gray threshold, and combining a Hoss factor with those image units to obtain a strong light parameter and a weak light parameter; and generating a snow surface reflection coefficient and a high-precision snow surface image from the strong light parameter and the weak light parameter. Optionally, performing binarization grayscale processing on the first snow surface image data to obtain the second snow surface image data includes: extracting a target gray factor from the binary gray factor matrix according to the precision requirement of the snow field image; and generating a grayscale processing result from the target gray factor and the first snow surface image data, where the grayscale processing result includes the second snow surface image data and the gray processing parameter. Optionally, extracting the image units larger than the preset gray threshold from the second snow surface image data, and combining the Hoss factor with those image units to obtain the strong light parameter and the weak light parameter, includes: generating the preset gray threshold according to the gray processing parameter; comparing the gray parameter of the second snow surface image data with the preset gray threshold to obtain the image units larger than the preset gray threshold; and calculating the strong light parameter and the weak light parameter from the image units larger than the preset gray threshold and the Hoss factor, where the calculation formula is:
[L1, L2] = H⁻¹ · P_w
where L1 and L2 are the strong light parameter and the weak light parameter respectively, H is the Hoss factor used to calculate the reflection coefficient, and P_w is the set of image-unit data greater than the preset gray threshold. Optionally, generating a snow surface reflection coefficient and a high-precision snow surface image from the strong light parameter and the weak light parameter includes: inputting the strong light parameter and the weak light parameter into a reflection coefficient model to obtain the snow surface reflection coefficient; and optimizing the first snow surface image data with the snow surface reflection coefficient to obtain the high-precision snow surface image.
According to another aspect of the embodiments of the present invention, there is also provided an electronic device, including a processor and a memory; the memory stores computer-readable instructions, and the processor is configured to execute the computer-readable instructions, wherein the computer-readable instructions, when run, execute a snow surface reflection coefficient generation method based on high-precision camera shooting.
Specifically, the method comprises the following steps: acquiring first snow surface image data; performing binarization grayscale processing on the first snow surface image data to obtain second snow surface image data; extracting the image units in the second snow surface image data that exceed a preset gray threshold, and combining a Hoss factor with those image units to obtain a strong light parameter and a weak light parameter; and generating a snow surface reflection coefficient and a high-precision snow surface image from the strong light parameter and the weak light parameter. Optionally, performing binarization grayscale processing on the first snow surface image data to obtain the second snow surface image data includes: extracting a target gray factor from the binary gray factor matrix according to the precision requirement of the snow field image; and generating a grayscale processing result from the target gray factor and the first snow surface image data, where the grayscale processing result includes the second snow surface image data and the gray processing parameter. Optionally, extracting the image units larger than the preset gray threshold from the second snow surface image data, and combining the Hoss factor with those image units to obtain the strong light parameter and the weak light parameter, includes: generating the preset gray threshold according to the gray processing parameter; comparing the gray parameter of the second snow surface image data with the preset gray threshold to obtain the image units larger than the preset gray threshold; and calculating the strong light parameter and the weak light parameter from the image units larger than the preset gray threshold and the Hoss factor, where the calculation formula is:
[L1, L2] = H⁻¹ · P_w
where L1 and L2 are the strong light parameter and the weak light parameter respectively, H is the Hoss factor used to calculate the reflection coefficient, and P_w is the set of image-unit data greater than the preset gray threshold. Optionally, generating a snow surface reflection coefficient and a high-precision snow surface image from the strong light parameter and the weak light parameter includes: inputting the strong light parameter and the weak light parameter into a reflection coefficient model to obtain the snow surface reflection coefficient; and optimizing the first snow surface image data with the snow surface reflection coefficient to obtain the high-precision snow surface image.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed coupling or direct coupling or communication connection between each other may be an indirect coupling or communication connection through some interfaces, units or modules, and may be electrical or in other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, fig. 3 is a schematic diagram of a hardware structure of a terminal device according to an embodiment of the present application. As shown in fig. 3, the terminal device may include an input device 30, a processor 31, an output device 32, a memory 33, and at least one communication bus 34. The communication bus 34 is used to implement communication connections between the elements. The memory 33 may comprise a high speed RAM memory, and may also include a non-volatile memory NVM, such as at least one disk memory, in which various programs may be stored for performing various processing functions and implementing the method steps of the present embodiment.
Alternatively, the processor 31 may be implemented by, for example, a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a controller, a microcontroller, a microprocessor, or other electronic components, and the processor 31 is coupled to the input device 30 and the output device 32 through a wired or wireless connection.
Optionally, the input device 30 may include a variety of input devices, for example, at least one of a user-oriented user interface, a device-oriented device interface, a software programmable interface, a camera, and a sensor. Optionally, the device interface facing the device may be a wired interface for data transmission between devices, or may be a hardware plug-in interface (e.g., a USB interface, a serial port, etc.) for data transmission between devices; optionally, the user-facing user interface may be, for example, a user-facing control key, a voice input device for receiving voice input, and a touch sensing device (e.g., a touch screen with a touch sensing function, a touch pad, etc.) for receiving user touch input; optionally, the programmable interface of the software may be, for example, an entry for a user to edit or modify a program, such as an input pin interface or an input interface of a chip; optionally, the transceiver may be a radio frequency transceiver chip with a communication function, a baseband processing chip, a transceiver antenna, and the like. An audio input device such as a microphone may receive voice data. The output device 32 may include a display, a sound, or other output device.
In this embodiment, the processor of the terminal device includes a module for executing the functions of the modules of the data processing apparatus in each device, and specific functions and technical effects may refer to the foregoing embodiments, which are not described herein again.
Fig. 4 is a schematic diagram of a hardware structure of a terminal device according to another embodiment of the present application. Fig. 4 is a specific embodiment of fig. 3 in an implementation process. As shown in fig. 4, the terminal device of the present embodiment includes a processor 41 and a memory 42.
The processor 41 executes the computer program code stored in the memory 42 to implement the method in the above-described embodiment.
The memory 42 is configured to store various types of data to support operations at the terminal device. Examples of such data include instructions for any application or method operating on the terminal device, such as messages, pictures, videos, and so forth. The memory 42 may include a Random Access Memory (RAM) and may also include a non-volatile memory (non-volatile memory), such as at least one disk memory.
Optionally, the processor 41 is provided in the processing assembly 40. The terminal device may further include: a communication component 43, a power component 44, a multimedia component 45, an audio component 46, an input/output interface 47 and/or a sensor component 48. The specific components included in the terminal device are set according to actual requirements, which is not limited in this embodiment.
The processing component 40 generally controls the overall operation of the terminal device. Processing components 40 may include one or more processors 41 to execute instructions to perform all or a portion of the steps of the above-described method. Further, processing component 40 may include one or more modules that facilitate interaction between processing component 40 and other components. For example, the processing component 40 may include a multimedia module to facilitate interaction between the multimedia component 45 and the processing component 40.
The power supply component 44 provides power to the various components of the terminal device. The power components 44 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the terminal device.
The multimedia component 45 includes a display screen that provides an output interface between the terminal device and the user. In some embodiments, the display screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the display screen includes a touch panel, the display screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
The audio component 46 is configured to output and/or input audio signals. For example, the audio component 46 includes a Microphone (MIC) configured to receive external audio signals when the terminal device is in an operational mode, such as a voice recognition mode. The received audio signal may further be stored in the memory 42 or transmitted via the communication component 43. In some embodiments, audio assembly 46 also includes a speaker for outputting audio signals.
The input/output interface 47 provides an interface between the processing component 40 and peripheral interface modules, which may be click wheels, buttons, etc. These buttons may include, but are not limited to: a volume button, a start button, and a lock button.
The sensor assembly 48 includes one or more sensors for providing various aspects of status assessment for the terminal device. For example, the sensor assembly 48 may detect the open/closed status of the terminal device, the relative positioning of the components, the presence or absence of user contact with the terminal device. The sensor assembly 48 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact, including detecting the distance between the user and the terminal device. In some embodiments, the sensor assembly 48 may also include a camera or the like.
The communication component 43 is configured to facilitate communication between the terminal device and other devices in a wired or wireless manner. The terminal device may access a wireless network based on a communication standard, such as WiFi,2G or 3G, or a combination thereof. In one embodiment, the terminal device may include a SIM card slot for inserting a SIM card therein, so that the terminal device can log on to a GPRS network and establish communication with the server via the internet.
From the above, the communication component 43, the audio component 46, the input/output interface 47 and the sensor component 48 referred to in the embodiment of fig. 4 can be implemented as the input device in the embodiment of fig. 3.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described apparatus embodiments are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or may be integrated into another system, or some features may be omitted, or may not be executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention, which is substantially or partly contributed by the prior art, or all or part of the technical solution may be embodied in a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to perform all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
The foregoing is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make various modifications and improvements without departing from the principle of the present invention, and such modifications and improvements shall also fall within the protection scope of the present invention.
Claims (10)
1. A snow surface reflection coefficient generation method based on high-precision camera shooting is characterized by comprising the following steps:
acquiring first snow surface image data;
carrying out binarization gray level processing on the first snow surface image data to obtain second snow surface image data;
extracting image units which are larger than a preset gray threshold value in the second snow surface image data, and combining the Hoss factor with the image units which are larger than the preset gray threshold value to obtain a strong light parameter and a weak light parameter;
and generating a snow surface reflection coefficient and a high-precision snow surface image according to the strong light parameters and the weak light parameters.
2. The method according to claim 1, wherein the subjecting the first snow surface image data to binarization grayscale processing to obtain second snow surface image data comprises:
extracting a target gray factor from the binary gray factor matrix according to the precision requirement of the snow field image;
generating a gray processing result according to the target gray factor and the first snow surface image data, wherein the gray processing result comprises: the second snow surface image data and the gray processing parameter.
3. The method of claim 2, wherein the extracting image units in the second snow surface image data that are larger than a preset gray threshold, and combining a Hoss factor with the image units that are larger than the preset gray threshold to obtain a strong light parameter and a weak light parameter, comprises:
generating the preset gray threshold according to the gray processing parameters;
comparing the gray parameter of the second snow surface image data with the preset gray threshold value to obtain the image unit which is larger than the preset gray threshold value;
calculating the image unit larger than the preset gray threshold value and the Hoss factor to obtain the strong light parameter and the weak light parameter, wherein the calculation formula is as follows:
[L1, L2] = H⁻¹ · P_w
where L1 and L2 are the strong light parameter and the weak light parameter respectively, H is the Hoss factor used to calculate the reflection coefficient, and P_w is the set of image-unit data greater than the preset gray threshold.
4. The method of claim 1, wherein generating a snow surface reflection coefficient and a high-precision snow surface image from the strong light parameter and the weak light parameter comprises:
inputting the strong light parameters and the weak light parameters into a reflection coefficient model to obtain the snow surface reflection coefficient;
and optimizing the first snow surface image data by using the snow surface reflection coefficient to obtain the high-precision snow surface image.
5. A snow surface reflection coefficient generation apparatus based on high-precision camera shooting, characterized by comprising:
the acquisition module is used for acquiring first snow surface image data;
the processing module is used for carrying out binarization gray level processing on the first snow surface image data to obtain second snow surface image data;
the extraction module is used for extracting image units in the second snow surface image data which are larger than a preset gray threshold, and combining a Hoss factor with the image units larger than the preset gray threshold to obtain a strong light parameter and a weak light parameter;
and the generating module is used for generating a snow surface reflection coefficient and a high-precision snow surface image according to the strong light parameter and the weak light parameter.
6. The apparatus of claim 5, wherein the processing module comprises:
the extraction unit is used for extracting a target gray factor from a binary gray factor matrix according to the precision requirement of the snow surface image;
the processing unit is used for generating a gray processing result according to the target gray factor and the first snow surface image data, wherein the gray processing result comprises the second snow surface image data and a gray processing parameter.
7. The apparatus of claim 6, wherein the extraction module comprises:
the generating unit is used for generating the preset gray threshold according to the gray processing parameter;
the comparison unit is used for comparing the gray parameter of the second snow surface image data with the preset gray threshold to obtain the image units larger than the preset gray threshold;
the calculation unit is used for calculating the strong light parameter and the weak light parameter from the image units larger than the preset gray threshold and the Hoss factor, wherein the calculation formula is:
[L1, L2] = H⁻¹P_w
wherein L1 and L2 are the strong light parameter and the weak light parameter, respectively, H is the Hoss factor for calculating the reflection coefficient, and P_w is the data set of image units larger than the preset gray threshold.
8. The apparatus of claim 5, wherein the generating module comprises:
the input unit is used for inputting the strong light parameter and the weak light parameter into a reflection coefficient model to obtain the snow surface reflection coefficient;
and the optimization unit is used for optimizing the first snow surface image data by using the snow surface reflection coefficient to obtain the high-precision snow surface image.
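The apparatus claims mirror the method steps as modules. A compact sketch of that module layout follows; the class name and each module body are simplified stand-ins (the same placeholders used in the method sketches above), not the claimed calculations.

```python
import cv2
import numpy as np

class SnowReflectanceApparatus:
    """Acquisition, processing, extraction and generation modules of claims 5-8."""

    def __init__(self, gray_threshold: int = 200):
        self.gray_threshold = gray_threshold

    def acquisition_module(self, path: str) -> np.ndarray:
        return cv2.imread(path)                        # first snow surface image data

    def processing_module(self, first: np.ndarray) -> np.ndarray:
        gray = cv2.cvtColor(first, cv2.COLOR_BGR2GRAY)
        _, second = cv2.threshold(gray, self.gray_threshold, 255, cv2.THRESH_BINARY)
        return second                                  # second snow surface image data

    def extraction_module(self, first: np.ndarray, second: np.ndarray):
        gray = cv2.cvtColor(first, cv2.COLOR_BGR2GRAY)
        l1 = float(gray[second > 0].mean())            # strong light parameter
        l2 = float(gray[second == 0].mean())           # weak light parameter
        return l1, l2

    def generation_module(self, first: np.ndarray, l1: float, l2: float):
        coefficient = l1 / (l1 + l2 + 1e-6)            # snow surface reflection coefficient
        optimized = cv2.convertScaleAbs(first, alpha=1.0 - 0.5 * coefficient)
        return coefficient, optimized                  # high-precision snow surface image
```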
9. A non-volatile storage medium, comprising a stored program, wherein the program, when executed, controls an apparatus in which the non-volatile storage medium is located to perform the method of any one of claims 1 to 4.
10. An electronic device comprising a processor and a memory; the memory has stored therein computer readable instructions for execution by the processor, wherein the computer readable instructions when executed perform the method of any one of claims 1 to 4.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211087163.9A CN115484408A (en) | 2022-09-07 | 2022-09-07 | Snow surface reflection coefficient generation method and device based on high-precision camera shooting |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115484408A (en) | 2022-12-16 |
Family
ID=84423652
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211087163.9A Pending CN115484408A (en) | 2022-09-07 | 2022-09-07 | Snow surface reflection coefficient generation method and device based on high-precision camera shooting |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115484408A (en) |
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101719989A (en) * | 2009-11-30 | 2010-06-02 | 北京中星微电子有限公司 | Method and system for backlight compensation |
CN105678240A (en) * | 2015-12-30 | 2016-06-15 | 哈尔滨工业大学 | Image processing method for removing the reflect light of roads |
CN106650743A (en) * | 2016-09-12 | 2017-05-10 | 北京旷视科技有限公司 | Strong light reflection detection method and device of image |
CN107194881A (en) * | 2017-03-23 | 2017-09-22 | 南京汇川图像视觉技术有限公司 | A kind of removal image reflex reflector and method based on photometric stereo |
WO2021143281A1 (en) * | 2020-01-13 | 2021-07-22 | 华为技术有限公司 | Color shading correction method, terminal device, and computer-readable storage medium |
CN114399441A (en) * | 2022-01-13 | 2022-04-26 | 成都希格玛光电科技有限公司 | Image strong reflection inhibition method and system |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116402935A (en) * | 2023-03-28 | 2023-07-07 | 北京拙河科技有限公司 | Image synthesis method and device based on ray tracing algorithm |
CN116402935B (en) * | 2023-03-28 | 2024-01-19 | 北京拙河科技有限公司 | Image synthesis method and device based on ray tracing algorithm |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3614333B1 (en) | Image processing method, storage medium, and electronic apparatus | |
CN108701439B (en) | Image display optimization method and device | |
CN105306788B (en) | A kind of noise-reduction method and device of image of taking pictures | |
CN109086742A (en) | scene recognition method, scene recognition device and mobile terminal | |
CN115631122A (en) | Image optimization method and device for edge image algorithm | |
CN115330626A (en) | Picture transformation method and device based on mesh grid network decomposition | |
WO2023005818A1 (en) | Noise image generation method and apparatus, electronic device, and storage medium | |
CN115484408A (en) | Snow surface reflection coefficient generation method and device based on high-precision camera shooting | |
CN115115526A (en) | Image processing method and apparatus, storage medium, and graphic calculation processor | |
CN115984126A (en) | Optical image correction method and device based on input instruction | |
CN104954627B (en) | A kind of information processing method and electronic equipment | |
CN116614453B (en) | Image transmission bandwidth selection method and device based on cloud interconnection | |
CN115474091A (en) | Motion capture method and device based on decomposition metagraph | |
CN210605753U (en) | System for recognizing cigarette brand display condition of retail merchant | |
CN104504653A (en) | Image enhancing method and device | |
CN116363006A (en) | Image calibration method and device based on normal algorithm | |
CN113706438A (en) | Image processing method, related device, equipment, system and storage medium | |
CN116468883B (en) | High-precision image data volume fog recognition method and device | |
CN115511735B (en) | Snow field gray scale picture optimization method and device | |
CN116664413B (en) | Image volume fog eliminating method and device based on Abbe convergence operator | |
CN115205313B (en) | Picture optimization method and device based on least square algorithm | |
CN116758165B (en) | Image calibration method and device based on array camera | |
CN116402935B (en) | Image synthesis method and device based on ray tracing algorithm | |
CN116723419B (en) | Acquisition speed optimization method and device for billion-level high-precision camera | |
CN116723298B (en) | Method and device for improving transmission efficiency of camera end |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20221216 |