CN111161198A - Control method and device of imaging equipment, storage medium and processor - Google Patents
- Publication number
- CN111161198A CN111161198A CN201911268903.7A CN201911268903A CN111161198A CN 111161198 A CN111161198 A CN 111161198A CN 201911268903 A CN201911268903 A CN 201911268903A CN 111161198 A CN111161198 A CN 111161198A
- Authority
- CN
- China
- Prior art keywords
- image
- light
- low
- fusion
- level
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10048—Infrared image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30168—Image quality inspection
Abstract
The invention discloses a control method and device for an imaging device, a storage medium, and a processor. The method comprises the following steps: collecting a low-light-level image and an infrared image of a target object at the same viewing angle using the imaging device; fusing the low-light-level image and the infrared image based on predetermined pixel points to obtain a fused image; obtaining evaluation results corresponding to the low-light-level image, the infrared image, and the fused image from an image quality evaluation model; and determining an operating mode of the imaging device according to the evaluation results, wherein the operating mode comprises at least one of: low-light-level imaging, thermal imaging, black-and-white fusion, and color fusion. The invention solves the technical problem in the related art that a single imaging mode yields a poor imaging effect.
Description
Technical Field
The invention relates to the technical field of imaging device control, and in particular to a control method and device for an imaging device, a storage medium, and a processor.
Background
With urban construction and development, placing power transmission and distribution lines underground has become important for improving the urban environment and strengthening urban disaster prevention. At present, power lines are mostly laid underground using methods such as trenching, pipe jacking, shallow-buried excavation, and shield tunneling. Electric power tunnel engineering at the present stage is carried out mainly in underground areas, and heavy construction work makes night work indispensable. Night work differs markedly from ordinary work: the working environment is dark, and the demand for light far exceeds that of ordinary work.
For night work, the prior art mostly uses low-light night-vision equipment or infrared detection equipment. The imaging mode is therefore single and easily affected by environmental factors, the imaging effect is poor, and high-quality imaging of the target object cannot be achieved.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
The embodiment of the invention provides a control method and device of imaging equipment, a storage medium and a processor, which are used for at least solving the technical problem of poor imaging effect caused by a single imaging mode in the related art.
According to one aspect of an embodiment of the present invention, there is provided a control method of an imaging device, comprising: collecting a low-light-level image and an infrared image of a target object at the same viewing angle using the imaging device; fusing the low-light-level image and the infrared image based on predetermined pixel points to obtain a fused image; obtaining evaluation results corresponding to the low-light-level image, the infrared image, and the fused image from an image quality evaluation model; and determining an operating mode of the imaging device according to the evaluation results, wherein the operating mode comprises at least one of: low-light-level imaging, thermal imaging, black-and-white fusion, and color fusion.
Optionally, the acquiring, by the imaging device, the low-light-level image and the infrared image of the target object at the same viewing angle includes: acquiring environmental parameters of the target object, wherein the environmental parameters at least comprise one of the following parameters: temperature, humidity, brightness; and obtaining the low-light-level image and the infrared image based on a shooting mode corresponding to the environmental parameter, wherein the shooting mode is obtained by matching the environmental parameter with a preset environmental parameter in a preset shooting template.
Optionally, after obtaining the low-light-level image and the infrared image based on the shooting mode corresponding to the environmental parameter, acquiring the low-light-level image and the infrared image of the target object at the same viewing angle by using an imaging device further includes: pre-processing the low-light image and the infrared image, wherein the pre-processing comprises at least one of: image transformation, image enhancement and restoration, image segmentation and image denoising.
Optionally, the predetermined pixel point is a feature point of the target object in the image, and the obtaining of the fused image by fusing the low-light-level image and the infrared image based on the predetermined pixel point includes: associating the preset pixel points corresponding to the low-light-level images and the infrared images by utilizing a normalized correlation algorithm to obtain an association result; and determining the fused image according to the correlation result.
Optionally, before identifying, according to an image quality evaluation model, an evaluation result corresponding to the low-light-level image, the infrared image, and the fused image, the method further includes: constructing the image quality evaluation model, wherein the image quality evaluation model is obtained by using multiple groups of data through machine learning training, and each group of data in the multiple groups of data comprises: different types of the low-light images, the infrared images, the fusion images and corresponding evaluation results thereof.
Optionally, determining the operating mode of the imaging device according to the evaluation result comprises: sequencing the evaluation results according to a preset rule to obtain a sequencing result; and determining the working mode according to the sequencing result.
Optionally, the ranking the evaluation results according to a predetermined rule, and obtaining the ranking results includes: acquiring weights corresponding to the low-light-level image, the infrared image and the fusion image; and determining the sorting result according to the evaluation result and the weight.
According to another aspect of the embodiments of the present invention, there is also provided a control device for an imaging device, comprising: an acquisition module, configured to collect a low-light-level image and an infrared image of a target object at the same viewing angle using the imaging device; a fusion module, configured to fuse the low-light-level image and the infrared image based on predetermined pixel points to obtain a fused image; an identification module, configured to obtain evaluation results corresponding to the low-light-level image, the infrared image, and the fused image from an image quality evaluation model; and a determining module, configured to determine an operating mode of the imaging device according to the evaluation results, wherein the operating mode comprises at least one of: low-light-level imaging, thermal imaging, black-and-white fusion, and color fusion.
According to another aspect of the embodiments of the present invention, there is also provided a storage medium including a stored program, wherein when the program runs, an apparatus in which the storage medium is located is controlled to execute the control method of the imaging apparatus according to any one of the above.
According to another aspect of the embodiments of the present invention, there is also provided a processor for executing a program, wherein the program executes the control method of the imaging apparatus described in any one of the above.
In the embodiment of the invention, an imaging device is used to collect a low-light-level image and an infrared image of a target object at the same viewing angle; the low-light-level image and the infrared image are fused based on predetermined pixel points to obtain a fused image; evaluation results corresponding to the low-light-level image, the infrared image, and the fused image are obtained from an image quality evaluation model; and an operating mode of the imaging device is determined according to the evaluation results, wherein the operating mode comprises at least one of: low-light-level imaging, thermal imaging, black-and-white fusion, and color fusion. By determining the operating mode from the evaluation results that the image quality evaluation model gives for the low-light-level image, the infrared image, and the fused image, the purpose of imaging in an operating mode selected according to image quality is achieved, the technical effect of high-quality imaging is obtained, and the technical problem of poor imaging effect caused by a single imaging mode in the related art is solved.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
fig. 1 is a flowchart of a control method of an image forming apparatus according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a control device of an image forming apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
According to an embodiment of the present invention, there is provided an embodiment of a control method of an image forming apparatus, it is noted that the steps illustrated in the flowchart of the drawings may be executed in a computer system such as a set of computer executable instructions, and that while a logical order is illustrated in the flowchart, in some cases, the steps illustrated or described may be executed in an order different from that herein.
Fig. 1 is a flowchart of a control method of an image forming apparatus according to an embodiment of the present invention, as shown in fig. 1, the control method of the image forming apparatus including the steps of:
step S102, acquiring a low-light-level image and an infrared image of a target object under the same visual angle by adopting imaging equipment;
the imaging equipment is a fusion telescope, low-light night vision equipment with a thermal infrared imager and the like; the target object can be a work space with dark light, such as an electric power tunnel, a mine and other night work occasions; the visual angle is an included angle formed by a central point of the lens and two ends of a diagonal line of the imaging plane, wherein the low-light-level image and the infrared image of the collected target object have strong relevance under the same visual angle, and different imaging effects of the target object can be obtained.
Step S104, fusing the low-light-level image and the infrared image based on a preset pixel point to obtain a fused image;
because the low-light-level image and the infrared image are images at the same visual angle, the pixel point of the low-light-level image and the pixel point of the infrared image are easier to fuse, the preset pixel point is the characteristic point of the target object in the image, and then the characteristic point of the target object in the image is fused, so that a high-efficiency and high-quality fused image can be obtained, and the defect of high complexity of the existing image fusion is avoided.
Step S106, identifying and obtaining evaluation results corresponding to the low-light-level image, the infrared image and the fusion image according to the image quality evaluation model;
the image quality evaluation model is obtained by training based on machine learning, wherein a large amount of image information in a scene and corresponding evaluation results are adopted for training. In the specific implementation process, any one of the low-light-level image, the infrared image and the fusion image is input into the image quality evaluation model, and the corresponding evaluation result can be output.
The evaluation result feeds back an image quality index, where the index is a score covering image resolution, color depth, and image distortion.
The fused image includes at least a black-and-white fused image and a color fused image.
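As a concrete illustration of how such an index might be computed, the sketch below combines per-metric scores for resolution, color depth, and distortion into a single weighted quality index. The metric names, score scale, and weights are assumptions for illustration, not taken from the patent.

```python
# Illustrative sketch (not the patent's actual formula): combine per-metric
# scores for resolution, color depth, and distortion into one quality index.

def quality_index(scores, weights=None):
    """Weighted average of per-metric scores, each assumed to lie in [0, 100]."""
    metrics = ("resolution", "color_depth", "distortion")
    if weights is None:
        weights = {m: 1.0 for m in metrics}  # equal weighting by default
    total_w = sum(weights[m] for m in metrics)
    return sum(scores[m] * weights[m] for m in metrics) / total_w

# Example: a fused image scoring well on resolution but poorly on distortion.
fused_scores = {"resolution": 90.0, "color_depth": 80.0, "distortion": 60.0}
print(quality_index(fused_scores))  # equal weights -> (90 + 80 + 60) / 3
```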
Step S108, determining the operating mode of the imaging device according to the evaluation result, wherein the operating mode comprises at least one of the following: low-light-level imaging, thermal imaging, black-and-white fusion, and color fusion.
In implementation, the operation mode of the imaging device may be selected according to the evaluation result, and specifically, may be any one of low light level, thermal imaging, black-and-white fusion, and color fusion. Therefore, by the method, the high-quality imaging working mode can be selected according to the image quality, so that the imaging equipment performs imaging in the working mode, and the problem of poor imaging effect caused by single imaging mode is avoided.
Through the above steps, the imaging device collects a low-light-level image and an infrared image of the target object at the same viewing angle; the two images are fused based on predetermined pixel points to obtain a fused image; evaluation results for the low-light-level image, the infrared image, and the fused image are obtained from the image quality evaluation model; and the operating mode of the imaging device is determined according to the evaluation results, the operating mode comprising at least one of low-light-level imaging, thermal imaging, black-and-white fusion, and color fusion. Because the operating mode is selected according to image quality, the technical effect of high-quality imaging is achieved, and the technical problem of poor imaging effect caused by a single imaging mode in the related art is solved.
Optionally, the acquiring, by the imaging device, the low-light-level image and the infrared image of the target object at the same viewing angle includes: acquiring an environmental parameter of a target object, wherein the environmental parameter at least comprises one of the following parameters: temperature, humidity, brightness; and obtaining the low-light-level image and the infrared image based on a shooting mode corresponding to the environmental parameters, wherein the shooting mode is obtained by matching the environmental parameters with preset environmental parameters in a preset shooting template.
The environment parameters have influence on the imaging effect of the target object, and the optimal shooting mode can be automatically selected according to different environment parameters, so that the adverse influence of the environment parameters on imaging is reduced, and the imaging effect is improved.
The shooting modes can be of various types, and different shooting modes can be obtained according to matching of the environmental parameters and the preset environmental parameters in the preset shooting template.
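A minimal sketch of such template matching follows; the preset templates, parameter values, and mode names are invented for illustration, and a real device would use its own calibrated presets.

```python
# Hypothetical sketch: match measured environmental parameters (temperature,
# humidity, brightness) against preset templates to pick a shooting mode.
# All template values and mode names here are illustrative assumptions.

TEMPLATES = [
    {"mode": "night_low_light", "temperature": 10.0, "humidity": 80.0, "brightness": 5.0},
    {"mode": "tunnel_infrared", "temperature": 20.0, "humidity": 95.0, "brightness": 1.0},
    {"mode": "daylight",        "temperature": 25.0, "humidity": 50.0, "brightness": 90.0},
]

def match_shooting_mode(env):
    """Return the mode of the preset template closest to the measured parameters."""
    def distance(tpl):
        return sum((env[k] - tpl[k]) ** 2 for k in ("temperature", "humidity", "brightness"))
    return min(TEMPLATES, key=distance)["mode"]

# A dark, humid tunnel environment matches the infrared preset.
print(match_shooting_mode({"temperature": 18.0, "humidity": 90.0, "brightness": 2.0}))
```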
Optionally, after obtaining the low-light-level image and the infrared image based on the shooting mode corresponding to the environmental parameter, acquiring the low-light-level image and the infrared image of the target object at the same viewing angle by using the imaging device further includes: preprocessing the low-light image and the infrared image, wherein the preprocessing at least comprises one of the following steps: image transformation, image enhancement and restoration, image segmentation and image denoising.
In order to further improve the image quality, the acquired low-light-level image and the infrared image can be preprocessed, and in the implementation process, the preprocessing comprises but is not limited to image transformation, image enhancement and restoration, image segmentation and image denoising.
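As one hedged example of a preprocessing step, the sketch below applies a 3x3 median filter for image denoising to a grayscale image stored as a list of rows. A production pipeline would use an image-processing library; this only shows the idea.

```python
# Minimal illustrative denoising: a 3x3 median filter over a grayscale image
# represented as a list of rows. Border pixels are left unchanged.
import statistics

def median_denoise(img):
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = [img[y + dy][x + dx] for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            out[y][x] = statistics.median(window)
    return out

noisy = [
    [10, 10, 10, 10],
    [10, 200, 10, 10],  # 200 is an isolated noise spike
    [10, 10, 10, 10],
    [10, 10, 10, 10],
]
print(median_denoise(noisy)[1][1])  # spike replaced by the neighbourhood median, 10
```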
Optionally, the predetermined pixel point is a feature point of the target object in the image, and the fusion processing is performed on the low-light-level image and the infrared image based on the predetermined pixel point to obtain a fused image includes: associating preset pixel points corresponding to the low-light-level image and the infrared image by utilizing a normalized correlation algorithm to obtain an association result; and determining a fused image according to the correlation result.
It should be noted that the predetermined pixel point may be a feature point of the target object in the image, or a feature point selected according to needs, and is not limited in the implementation process.
In the process of fusing the images, the predetermined pixel points corresponding to the low-light images and the predetermined pixel points corresponding to the infrared images can be associated by utilizing a normalized correlation algorithm, so that the fused images can be quickly and accurately obtained.
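The association step can be sketched with a generic zero-mean normalized cross-correlation over two patches. The patch data below are invented for illustration, and the patent does not specify the algorithm at this level of detail; real image patches would be 2-D.

```python
# Hedged sketch of normalized correlation for associating a feature patch from
# the low-light image with a candidate patch in the infrared image.
import math

def ncc(a, b):
    """Zero-mean normalized cross-correlation of two equal-length patches."""
    ma = sum(a) / len(a)
    mb = sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

low_light_patch = [10, 20, 30, 40]
infrared_patch  = [12, 22, 32, 42]   # same structure, shifted intensity
unrelated_patch = [40, 10, 35, 15]

print(ncc(low_light_patch, infrared_patch))   # close to 1.0: patches correspond
print(ncc(low_light_patch, unrelated_patch))  # much lower: patches differ
```

Because the correlation is zero-mean and normalized, a constant intensity offset between the low-light and infrared modalities does not affect the match score.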
Optionally, before identifying, according to the image quality evaluation model, an evaluation result corresponding to the low-light-level image, the infrared image, and the fused image, the method further includes: constructing an image quality evaluation model, wherein the image quality evaluation model is obtained by using multiple groups of data through machine learning training, and each group of data in the multiple groups of data comprises: different types of low-light images, infrared images and fused images and corresponding evaluation results thereof.
In order to feed back the image quality more efficiently, a large amount of data is trained in a machine learning mode to obtain an image quality evaluation model. The image quality evaluation model obtained in this way can more accurately obtain an evaluation result corresponding to the input image.
It should be noted that each of the data sets includes: the low-light-level images, the infrared images and the fusion images of different types and corresponding evaluation results are obtained from different application scenes, so that the image quality evaluation model can be applied to various scenes, and the identification range is enlarged.
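The patent does not specify the learning algorithm, so the sketch below uses a 1-nearest-neighbour regressor over invented feature vectors purely as a stand-in for the trained image quality evaluation model; the features, training samples, and scores are all assumptions.

```python
# Stand-in for the image quality evaluation model: 1-nearest-neighbour
# regression over hand-made (feature vector, evaluation score) samples.
# Feature vector: [sharpness, contrast], both assumed normalized to [0, 1].

TRAINING_DATA = [
    ([0.9, 0.8], 95),  # sharp, high-contrast image -> high score
    ([0.5, 0.5], 60),
    ([0.1, 0.2], 20),  # blurry, flat image -> low score
]

def evaluate(features):
    """Return the score of the nearest training sample (1-NN regression)."""
    def dist(sample):
        vec, _ = sample
        return sum((f - v) ** 2 for f, v in zip(features, vec))
    return min(TRAINING_DATA, key=dist)[1]

print(evaluate([0.85, 0.75]))  # nearest to the first sample -> 95
```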
Optionally, determining the operating mode of the imaging device according to the evaluation result comprises: sequencing the evaluation results according to a preset rule to obtain a sequencing result; and determining the working mode according to the sequencing result.
The predetermined rule is a priority rule for determining the operating mode. Ranking the evaluation results according to the predetermined rule yields a ranking, from which the corresponding operating mode is determined; different predetermined rules produce different rankings. In a specific implementation, the optimal operating mode can be selected according to the ranking result, and the imaging device obtains a high-quality imaging effect in that mode.
Optionally, the ranking the evaluation results according to a predetermined rule, and obtaining the ranking results includes: acquiring weights corresponding to the low-light-level image, the infrared image and the fusion image; and determining a sequencing result according to the evaluation result and the weight.
In the implementation process, the sorting can be performed according to the weight and the evaluation result corresponding to the low-light-level image, the infrared image and the fusion image. For example, the evaluation result may reflect image qualities corresponding to the low-light-level image, the infrared image, and the fusion image, but may also determine weights corresponding to the low-light-level image, the infrared image, and the fusion image in a specific application scene according to requirements of different application scenes, so as to obtain a ranking result satisfying the requirements.
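A minimal sketch of this weighted ranking follows; the scores, weights, and the mapping from image type to operating mode are assumptions for the example.

```python
# Illustrative sketch: rank evaluation results with per-image weights and
# choose the operating mode with the highest weighted score.

def pick_mode(evaluations, weights):
    """evaluations and weights are dicts keyed by image type; returns the best mode."""
    mode_of = {
        "low_light": "low-light-level imaging",
        "infrared": "thermal imaging",
        "fused_bw": "black-and-white fusion",
        "fused_color": "color fusion",
    }
    ranked = sorted(evaluations,
                    key=lambda k: evaluations[k] * weights.get(k, 1.0),
                    reverse=True)
    return mode_of[ranked[0]]

evals   = {"low_light": 70, "infrared": 80, "fused_bw": 75, "fused_color": 85}
weights = {"low_light": 1.0, "infrared": 0.8, "fused_bw": 1.0, "fused_color": 1.0}
print(pick_mode(evals, weights))  # color fusion wins on weighted score
```

Note how the weight on the infrared score (0.8, an invented value standing in for a scene-specific preference) demotes an otherwise strong raw score.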
Optionally, determining the operating mode of the imaging device according to the evaluation result further includes: receiving an input signal by using an input device, wherein the input signal is used for triggering a corresponding evaluation result; the operating mode of the imaging device is selected in accordance with the input signal. The input device may be a device included in the imaging apparatus itself, or may be a device externally connected to the imaging apparatus. By the mode, the working mode of the imaging equipment can be flexibly selected.
Optionally, after determining the operating mode of the imaging device according to the evaluation result, the method further includes: imaging is performed with the display device in the above-described operating mode. The imaging device can perform imaging such as low-light-level imaging, thermal imaging, black-white fusion and color fusion according to different working modes.
Example 2
According to another aspect of an embodiment of the present invention, there is also provided an apparatus embodiment for executing the control method of the image forming apparatus in embodiment 1 described above, and fig. 2 is a schematic diagram of a control apparatus of the image forming apparatus according to an embodiment of the present invention, as shown in fig. 2, the control apparatus of the image forming apparatus including: an acquisition module 22, a fusion module 24, an identification module 26, and a determination module 28. The control device of the image forming apparatus is explained in detail below.
The acquisition module 22 is used for acquiring a low-light-level image and an infrared image of the target object under the same visual angle by adopting the imaging equipment;
a fusion module 24, connected to the acquisition module 22, for performing fusion processing on the low-light-level image and the infrared image based on a predetermined pixel point to obtain a fused image;
the recognition module 26 is connected to the fusion module 24 and is used for recognizing and obtaining evaluation results corresponding to the low-light-level image, the infrared image and the fusion image according to the image quality evaluation model;
a determining module 28, connected to the identifying module 26, for determining an operating mode of the imaging device according to the evaluation result, the operating mode comprising at least one of: low-light-level imaging, thermal imaging, black-and-white fusion, and color fusion.
It should be noted here that the above-mentioned acquisition module 22, fusion module 24, recognition module 26 and determination module 28 correspond to steps S102 to S108 in embodiment 1, and the above-mentioned modules are the same as the examples and application scenarios implemented by the corresponding steps, but are not limited to the disclosure of embodiment 1. It should be noted that the modules described above as part of an apparatus may be implemented in a computer system such as a set of computer-executable instructions.
Optionally, the acquisition module includes: an obtaining unit, configured to obtain an environmental parameter of a target object, where the environmental parameter at least includes one of: temperature, humidity, brightness; and the shooting unit is used for obtaining the low-light-level image and the infrared image based on a shooting mode corresponding to the environmental parameters, wherein the shooting mode is obtained by matching the environmental parameters with preset environmental parameters in a preset shooting template.
Optionally, after obtaining the low-light-level image and the infrared image based on the shooting mode corresponding to the environmental parameter, the acquisition module further includes: the processing unit is used for preprocessing the low-light-level image and the infrared image, wherein the preprocessing at least comprises one of the following steps: image transformation, image enhancement and restoration, image segmentation and image denoising.
Optionally, the predetermined pixel point is a feature point of the target object in the image, and the fusion module includes: the correlation unit is used for correlating preset pixel points corresponding to the low-light-level images and the infrared images by utilizing a normalized correlation algorithm to obtain a correlation result; and the first determining unit is used for determining the fused image according to the correlation result.
Optionally, before identifying, according to the image quality evaluation model, an evaluation result corresponding to the low-light-level image, the infrared image, and the fusion image, the apparatus further includes: the image quality evaluation module is used for constructing an image quality evaluation model, wherein the image quality evaluation model is obtained by using multiple groups of data through machine learning training, and each group of data in the multiple groups of data comprises: different types of low-light images, infrared images and fused images and corresponding evaluation results thereof.
Optionally, the determining module includes: a ranking unit, configured to rank the evaluation results according to a predetermined rule to obtain a ranking result; and a second determining unit, configured to determine the operating mode according to the ranking result.
Optionally, the ranking unit includes: an obtaining subunit, configured to obtain weights corresponding to the low-light-level image, the infrared image, and the fused image; and a determining subunit, configured to determine the ranking result according to the evaluation results and the weights.
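The weighted ranking performed by the obtaining and determining subunits might look like the following. The weighting rule (evaluation score times weight) and the mapping from image types to operating modes are assumptions:

```python
# Hypothetical mapping from the best-scoring image type to the operating
# mode the imaging device should switch into.
MODE_FOR_IMAGE = {
    "low_light": "low_light_mode",
    "infrared": "thermal_imaging_mode",
    "fused": "color_fusion_mode",
}

def select_operating_mode(evaluations, weights):
    """evaluations / weights: dicts keyed by image type. Rank the candidates
    by weighted score (the predetermined rule assumed here) and return the
    operating mode of the top-ranked image type."""
    ranked = sorted(
        evaluations,
        key=lambda k: evaluations[k] * weights.get(k, 1.0),
        reverse=True,
    )
    return MODE_FOR_IMAGE[ranked[0]]
```

For example, a high-quality fused image outranking a slightly better but heavily down-weighted infrared image would switch the device into a fusion mode.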
Example 3
According to another aspect of the embodiments of the present invention, there is also provided a storage medium including a stored program, wherein, when the program runs, a device on which the storage medium resides is controlled to execute the control method of the imaging device described in any one of the above.
Example 4
According to another aspect of the embodiments of the present invention, there is also provided a processor configured to run a program, wherein, when the program runs, the control method of the imaging device described in any one of the above is executed.
The serial numbers of the above embodiments of the present invention are for description only and do not imply any ranking of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
The foregoing is only a preferred embodiment of the present invention, and it should be noted that those skilled in the art can make various modifications and refinements without departing from the principle of the present invention, and these modifications and refinements should also be regarded as falling within the protection scope of the present invention.
Claims (10)
1. A control method of an imaging device, characterized by comprising:
acquiring, with an imaging device, a low-light-level image and an infrared image of a target object at the same viewing angle;
fusing the low-light-level image and the infrared image based on predetermined pixel points to obtain a fused image;
identifying, according to an image quality evaluation model, evaluation results corresponding to the low-light-level image, the infrared image, and the fused image;
and determining an operating mode of the imaging device according to the evaluation results, wherein the operating mode includes at least one of: low-light-level, thermal imaging, black-and-white fusion, and color fusion.
2. The method of claim 1, wherein acquiring, with the imaging device, the low-light-level image and the infrared image of the target object at the same viewing angle comprises:
acquiring environmental parameters of the target object, wherein the environmental parameters include at least one of: temperature, humidity, and brightness;
and obtaining the low-light-level image and the infrared image in a shooting mode corresponding to the environmental parameters, wherein the shooting mode is obtained by matching the environmental parameters against preset environmental parameters in a preset shooting template.
3. The method of claim 2, wherein, after the low-light-level image and the infrared image are obtained in the shooting mode corresponding to the environmental parameters, acquiring the low-light-level image and the infrared image of the target object at the same viewing angle with the imaging device further comprises:
preprocessing the low-light-level image and the infrared image, wherein the preprocessing includes at least one of: image transformation, image enhancement and restoration, image segmentation, and image denoising.
4. The method of claim 1, wherein the predetermined pixel points are feature points of the target object in the images, and fusing the low-light-level image and the infrared image based on the predetermined pixel points to obtain the fused image comprises:
correlating the predetermined pixel points of the low-light-level image and the infrared image using a normalized correlation algorithm to obtain a correlation result;
and determining the fused image according to the correlation result.
5. The method of claim 1, wherein, before the evaluation results corresponding to the low-light-level image, the infrared image, and the fused image are identified according to the image quality evaluation model, the method further comprises:
constructing the image quality evaluation model, wherein the image quality evaluation model is obtained by machine-learning training on multiple groups of data, and each group of data includes: different types of low-light-level images, infrared images, and fused images, together with their corresponding evaluation results.
6. The method of claim 1, wherein determining the operating mode of the imaging device according to the evaluation results comprises:
ranking the evaluation results according to a predetermined rule to obtain a ranking result;
and determining the operating mode according to the ranking result.
7. The method of claim 6, wherein ranking the evaluation results according to the predetermined rule to obtain the ranking result comprises:
obtaining weights corresponding to the low-light-level image, the infrared image, and the fused image;
and determining the ranking result according to the evaluation results and the weights.
8. A control apparatus of an imaging device, characterized by comprising:
an acquisition module, configured to acquire, with an imaging device, a low-light-level image and an infrared image of a target object at the same viewing angle;
a fusion module, configured to fuse the low-light-level image and the infrared image based on predetermined pixel points to obtain a fused image;
an identification module, configured to identify, according to an image quality evaluation model, evaluation results corresponding to the low-light-level image, the infrared image, and the fused image;
and a determining module, configured to determine an operating mode of the imaging device according to the evaluation results, wherein the operating mode includes at least one of: low-light-level, thermal imaging, black-and-white fusion, and color fusion.
9. A storage medium, characterized by comprising a stored program, wherein, when the program runs, a device on which the storage medium resides is controlled to execute the control method of the imaging device according to any one of claims 1 to 7.
10. A processor, characterized in that the processor is configured to run a program, wherein, when the program runs, the control method of the imaging device according to any one of claims 1 to 7 is executed.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911268903.7A CN111161198A (en) | 2019-12-11 | 2019-12-11 | Control method and device of imaging equipment, storage medium and processor |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111161198A (en) | 2020-05-15 |
Family
ID=70557033
Family Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911268903.7A (Pending) | 2019-12-11 | 2019-12-11 | Control method and device of imaging equipment, storage medium and processor |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111161198A (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101349591A (en) * | 2008-08-29 | 2009-01-21 | Beijing Institute of Technology | Reconfigurable distributed multi-optical spectrum imaging system
CN104601953A (en) * | 2015-01-08 | 2015-05-06 | China National Aeronautical Radio Electronics Research Institute | Video image fusion-processing system
CN105447838A (en) * | 2014-08-27 | 2016-03-30 | Beijing Institute of Computer Technology and Application | Method and system for infrared and low-level-light/visible-light fusion imaging
CN105654470A (en) * | 2015-12-24 | 2016-06-08 | Xiaomi Inc. | Image selection method, device and system
US20170176139A1 (en) * | 2015-12-22 | 2017-06-22 | Huntercraft Limited | Infrared-light and low-light two-phase fusion night-vision sighting device
CN109829417A (en) * | 2019-01-28 | 2019-05-31 | Shanghai Research Institute of Building Sciences | Optimization method and monitoring device based on infrared thermal imaging technique monitoring computer room hot spot
CN110428412A (en) * | 2019-07-31 | 2019-11-08 | Beijing QIYI Century Science & Technology Co., Ltd. | The evaluation of picture quality and model generating method, device, equipment and storage medium
Similar Documents
Publication | Title |
---|---|
CN110705405B (en) | Target labeling method and device | |
EP3496383A1 (en) | Image processing method, apparatus and device | |
CN108389224B (en) | Image processing method and device, electronic equipment and storage medium | |
CN102538980A (en) | Thermal image device and shooting method for thermal image | |
CN113052066B (en) | Multi-mode fusion method based on multi-view and image segmentation in three-dimensional target detection | |
CN109116129B (en) | Terminal detection method, detection device, system and storage medium | |
CN103063314A (en) | Thermal imaging device and thermal imaging shooting method | |
CN101394573A (en) | Panoramagram generation method and system based on characteristic matching | |
CN107705254B (en) | City environment assessment method based on street view | |
CN111383204A (en) | Video image fusion method, fusion device, panoramic monitoring system and storage medium | |
CN113395440A (en) | Image processing method and electronic equipment | |
CN110866889A (en) | Multi-camera data fusion method in monitoring system | |
CN113052754B (en) | Method and device for blurring picture background | |
CN110188640B (en) | Face recognition method, face recognition device, server and computer readable medium | |
CN105791793A (en) | Image processing method and electronic device | |
CN102088539A (en) | Method and system for evaluating pre-shot picture quality | |
CN110334652B (en) | Image processing method, electronic device, and storage medium | |
CN115035147A (en) | Matting method, device and system based on virtual shooting and image fusion method | |
CN113298177B (en) | Night image coloring method, device, medium and equipment | |
CN111898463B (en) | Smoke and fire detection and identification method and device, storage medium and electronic device | |
CN112329649A (en) | Urban vegetation type identification method, system, equipment and medium | |
CN109919164B (en) | User interface object identification method and device | |
CN111161198A (en) | Control method and device of imaging equipment, storage medium and processor | |
CN110751668A (en) | Image processing method, device, terminal, electronic equipment and readable storage medium | |
CN108062403B (en) | Old scene detection method and terminal |
Legal Events
Code | Title | Description
---|---|---
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
RJ01 | Rejection of invention patent application after publication | Application publication date: 20200515 |