EP2831812A1 - Facilitating analysis and interpretation of associated visible light and infrared (IR) image information - Google Patents

Facilitating analysis and interpretation of associated visible light and infrared (IR) image information

Info

Publication number
EP2831812A1
EP2831812A1
Authority
EP
European Patent Office
Prior art keywords
image
images
captured
thermography arrangement
imaging system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP13715175.9A
Other languages
German (de)
English (en)
Inventor
Katrin Strandemar
Björn Roth
Mats AHLSTRÖM
Current Assignee
Flir Systems AB
Original Assignee
Flir Systems AB
Priority date
Filing date
Publication date
Application filed by Flir Systems AB
Publication of EP2831812A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00: Image enhancement or restoration
    • G06T5/50: Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10: Cameras or camera modules for generating image signals from different wavelengths
    • H04N23/11: Cameras or camera modules for generating image signals from visible and infrared light wavelengths
    • H04N23/95: Computational photography systems, e.g. light-field imaging systems
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10048: Infrared image
    • G06T2207/20: Special algorithmic details
    • G06T2207/20212: Image combination
    • G06T2207/20221: Image fusion; Image merging

Definitions

  • thermographic imaging. More specifically, different embodiments of the application relate to facilitating or enabling improved analysis and interpretation of associated visible light or visual light (VL) image and infrared (IR) image information.
  • VL visible light or visual light
  • IR infrared
  • thermography arrangements today comprise a combination of a visible light imaging system and an infrared (IR) imaging system. Since it is often hard to interpret an IR image and relate it to what is shown in a corresponding visible light (VL) image, there also exist suggestions on how analysis and interpretation of captured visible light and IR image data may be enabled.
  • Some prior art references disclose blending or fusion of IR image data and visible light image data with the purpose of enabling easier analysis and interpretation of a depicted scene.
  • a visible light image may be scaled to match an IR image, and further aligned with the IR image, with the purpose of fusing the images into a combined image, comprising both visible light image data and IR image data.
  • a combined image may be stored along with the captured IR and visible light images, the separate images being stored in their original captured format, thereby enabling later retrieval of the original images comprising all captured data.
  • users of thermography systems often do not use all advanced software features available in the thermography system, such as for example the possibility to combine VL image and IR image data to fuse visible light and IR image data. Instead, a user may simply use the thermography arrangement to capture image data during an extended period of time, for example an hour or a day, perhaps store the captured images and thereafter, at site or possibly at a later time, display or retrieve the images from memory for analysis. If the user sees something of interest in an IR image, for example indicating a temperature anomaly or an interesting pattern, the user may turn to a corresponding visible light image for more information on what part of the scene the interesting IR feature relates to.
  • an IR image for example indicating a temperature anomaly or an interesting pattern
  • when the VL image might depict or represent a different captured view of the real world scene than that of the IR image, it is difficult for the user to relate the visible light image information to the IR image information.
  • the interpretation of a combined IR/VL image is difficult when the VL image depicts or represents a different captured view of the real world scene than that of the IR image.
  • Systems and methods are disclosed, in accordance with one or more embodiments, which are directed to providing improved analysis and interpretation of associated visible light and IR image information when the visible light image information and the IR image information are not combined into one image.
  • One or more embodiments may facilitate or enable improved analysis and interpretation of associated visible light and IR image information when the visible light image information and the IR image information are or are not combined into one image, for example using picture in picture, blending or fusing methods. Furthermore, one or more embodiments may facilitate or enable improved analysis and interpretation of an image pair comprising an infrared (IR) image and a visible light image depicting a real world scene, said images being captured using a thermography arrangement comprising an IR imaging system and a visible light imaging system.
  • IR infrared
  • the field of view of the visible light imaging system of the thermography arrangement is typically substantially larger than that of the IR imaging system, so the same coordinates in the visible light image and the IR image will not represent the same part of the scene, which makes it difficult for a user to relate the visible light image information to the IR image information.
  • techniques disclosed herein for one or more embodiments may ensure that images or image sequences captured using an infrared (IR) imaging system and a visible light (VL) imaging system, respectively, are displayed and/or stored in a format wherein the images have the same field of view (FOV). As discussed further herein, this is also referred to as FOV follow functionality, or FOV follow mode.
  • a comparison between images is facilitated. For instance, if an image pair comprising an IR image and a VL image depicting the same scene further depict the scene according to the same FOV, it will be easier for a user to compare what is seen in one of the images to what is seen in the other and thereby come to conclusions in an analysis of the observed scene.
  • a user friendly way of presenting IR and VL image data is provided, thereby rendering an improved usability and facilitating or enabling improved analysis and interpretation of what is represented in the captured images.
  • a method for facilitating or enabling improved analysis and interpretation of associated infrared (IR) and visible light (VL) image data in an IR image and a VL image depicting a real world scene, said images being captured using a thermography arrangement comprising an IR imaging system and a visible light imaging system, the method comprising: associating an IR image and a VL image depicting the real world scene; processing at least one of the VL image and the IR image such that the field of view represented in the VL image substantially corresponds to the field of view represented in the IR image, thereby generating a resulting IR image and a resulting VL image with corresponding fields of view; and enabling a user to access the associated images for display of a representation of said associated images.
  • IR infrared
  • VL visible light
  • thermography arrangement for facilitating or enabling improved analysis and interpretation of associated infrared (IR) and visible light (VL) image data in an IR image and a VL image depicting a real world scene
  • said arrangement comprising: an IR imaging system configured to capture an IR image of the real world scene according to a first field of view; a visible light imaging system configured to capture a visible light image according to a second field of view; a processor arranged to process at least one of the visible light image and the IR image such that the field of view represented in the visible light image substantially corresponds to the field of view represented in the IR image and associate the resulting IR and visible light images; and a memory configured to store the associated images.
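The arrangement described above can be sketched in miniature. The snippet below is an illustrative sketch only (the function name, the dict-based association and the crop region are assumptions, not taken from the patent): the VL image is cropped to a calibration-derived pixel region corresponding to the IR FOV, and the resulting pair is associated, e.g. for storage in memory.

```python
import numpy as np

def process_pair(ir_img, vl_img, vl_crop):
    # Crop the VL image to the (hypothetical, calibration-derived) pixel
    # region vl_crop = (left, top, right, bottom) that corresponds to the
    # IR imaging system's FOV, then associate the resulting images.
    left, top, right, bottom = vl_crop
    vl_matched = vl_img[top:bottom, left:right]
    return {"ir": ir_img, "vl": vl_matched}
```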
  • Fig. 1 shows a schematic view of a thermography arrangement according to an embodiment.
  • Fig. 2 is a block diagram of a method according to one or more embodiments.
  • Fig. 3a shows an example of an image pair according to an embodiment, without use of the inventive method.
  • Fig. 3b shows an example of an image pair according to an embodiment.
  • Fig. 4 shows a method for obtaining a combined image, comprising the steps of aligning, determining that the VL image resolution value and the IR image resolution value are substantially the same, and combining the IR image and the VL image.
  • Fig. 5 shows an exemplary embodiment of an input device 4.
  • the input device comprises an interactive display 570, such as a touch screen, an image display section and controls 510-550 enabling the user to enter input
  • Fig. 6a shows examples of the captured view without FOV follow functionality for the VL imaging system 620 and for the IR imaging system 630.
  • Fig. 6b shows an example of how, when FOV follow functionality is activated, the processed visible light image and the processed IR image depict or represent substantially the same subset of the captured view.
  • Fig. 7 shows the displaying of an image, such as the associated IR image, the associated VL image or a combined image based on the associated IR image and the associated VL image.
  • Figs. 8a-8c show exemplary ways of determining a subset of the captured view of the real world scene by determining an indicative location in a captured VL image or IR image.
  • methods described herein may ensure that images or image sequences captured using an infrared (IR) imaging system and a visible light (VL) imaging system, respectively, are displayed and/or stored in a format wherein the images have the same field of view (FOV).
  • FOV field of view
  • the IR imaging system and the VL imaging system are both comprised in a thermography arrangement, as further described below.
  • ensuring that the FOVs are the same comprises zooming and/or shifting the images captured by the imaging system having the wider FOV, in such a way that they match the images captured using the imaging system having the narrower FOV.
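The zooming/shifting described above amounts to cropping the wider-FOV image. The sketch below is a hypothetical illustration under a pinhole-camera assumption: `scale` is the fraction of the wide FOV covered by the narrow FOV, and `(dx, dy)` is a pixel offset standing in for a known pointing error; none of these names come from the patent.

```python
import numpy as np

def crop_to_narrow_fov(wide_img, scale, dx=0, dy=0):
    # Central crop covering `scale` (0-1] of each dimension, shifted by
    # (dx, dy) pixels to compensate for a known pointing offset, and
    # clamped so the crop stays inside the image.
    h, w = wide_img.shape[:2]
    ch, cw = max(1, round(h * scale)), max(1, round(w * scale))
    top = min(max((h - ch) // 2 + dy, 0), h - ch)
    left = min(max((w - cw) // 2 + dx, 0), w - cw)
    return wide_img[top:top + ch, left:left + cw]
```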
  • the FOVs of images or image frame sequences represent the same FOV, or in other words depict the same part of the observed scene
  • a comparison between images is facilitated. For instance, if an image pair comprising an IR image and a VL image depicting the same scene further depict the scene according to the same FOV, it will be easier for a user to compare what is seen in one of the images to what is seen in the other and thereby come to conclusions in an analysis of the observed scene.
  • the VL imaging system has a wider FOV than the IR imaging system. Therefore, in many cases it will be the VL FOV that needs to be adapted, e.g. zoomed and/or shifted, to match the FOV of the IR imaging system.
  • VL imaging system may have, for example, a FOV of 50-60 degrees, while an IR imaging system may have, for example, a FOV of 25 degrees.
  • the thermography arrangement has additional IR optics providing a FOV of e.g., 90 degrees for the IR imaging system.
  • the FOV of the IR image is adapted, e.g., zoomed and/or shifted, to match the FOV of the VL image according to any of the embodiments described below.
  • it is possible, for example, to turn the FOV follow functionality of the method embodiments presented herein on and off.
  • the user wants to view or store a VL image with a wider FOV than the FOV of the IR imaging system (if this is the narrower FOV of the two). If the FOV follow functionality is turned off, the IR FOV and the VL FOV are per default set to their respective maximum value.
  • a VL imaging system commonly has a visible (or visual) field of view (FOV) of approximately 40-70°
  • an IR imaging system typically has a narrower visible (or visual) FOV, e.g., of approximately 20-30°
  • FOV visible field of view
  • replaceable optical elements or lenses including optical elements may for instance render a FOV of 15-45°, or even viewing angles of up to approximately 90°.
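For a pinhole model with coincident optical axes (an assumption made for illustration; real arrangements add parallax and pointing error), the fraction of the wide image spanned by the narrower FOV follows from the tangents of the half-angles:

```python
import math

def fov_fraction(narrow_fov_deg, wide_fov_deg):
    # Fraction of the wide image's width (or height) that the narrower
    # FOV spans, under a shared pinhole model with coincident axes.
    return (math.tan(math.radians(narrow_fov_deg) / 2)
            / math.tan(math.radians(wide_fov_deg) / 2))
```

With a 25° IR FOV inside a 55° VL FOV this gives roughly 0.43, i.e. the IR view covers a bit less than half of the VL image's width.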
  • thermography arrangement 1 for facilitating or enabling improved analysis and interpretation of associated infrared (IR) and visible light (VL) image data in an IR image and a VL image depicting a real world scene.
  • the thermography arrangement 1 comprises an IR imaging system 12 having an IR sensor 20, the IR imaging system 12 being configured to capture an IR image of the real world scene according to a first field of view.
  • thermography arrangement 1 further comprises a VL imaging system 11 having a visible (or visual) sensor 16, the VL imaging system 11 being configured to capture a visible light image according to a second field of view.
  • thermography arrangement 1 further comprises an optional sensor device aimed at the observed real world scene and communicatively coupled to the thermography arrangement 1.
  • the sensor device is one of a laser projector configured to project a laser dot onto the observed real world scene, a visual light source configured to project a laser dot onto the observed real world scene, a rangefinder configured to project a laser dot onto the observed real world scene, an ambient temperature sensor or a humidity sensor.
  • the IR imaging system 12 comprised in the thermography arrangement 1 is configured to capture IR images and the visible light (VL) imaging system 11 is configured to capture visible light (VL) images, in manners known per se (e.g., in one or more conventional ways as would be understood by one skilled in the art).
  • the IR image and the visible light image are captured simultaneously.
  • the IR image and the visible light image are captured in close succession.
  • the IR image and the visible light image are captured at time instances further apart.
  • the captured one or more images are transmitted to a processor 2 configured to perform image processing operations.
  • the processor 2 is according to embodiments integrated in the thermography arrangement 1, coupled to the thermography arrangement 1 or configured to receive data transferred from the thermography arrangement 1.
  • the captured images may also be transmitted with possible intermediate storing to a processing unit separate or external from the imaging device.
  • the processing unit in the imaging device or the separate processing unit is provided with specifically designed programming or program code portions adapted to control the processing unit or processor to perform the steps and functions of embodiments of the inventive method, as further described herein.
  • the processor 2 is arranged to process at least one of the visible light image and the IR image such that the field of view represented in the visible light image substantially corresponds to the field of view represented in the IR image.
  • the processing by which the processor 2 is arranged to make the field of view represented in the visible light image substantially correspond to the field of view represented in the IR image comprises a selection of the following operations: cropping; windowing; zooming; shifting; and rotation of at least one of the images or parts of at least one of the images.
  • the processor 2 is arranged to process at least one of the captured visible light image and the captured IR image such that a processed visible light image and a processed IR image depicts or represents substantially the same subset of the captured view of the real world scene, wherein the subset of the captured view of the real world scene is entirely enclosed by the IR imaging system FOV and the VL imaging system FOV.
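The subset "entirely enclosed by the IR imaging system FOV and the VL imaging system FOV" is simply the intersection of the two captured views. A minimal sketch in shared scene coordinates follows; the (left, top, right, bottom) rectangle convention is an assumption made for illustration:

```python
def fov_intersection(rect_a, rect_b):
    # Rectangles are (left, top, right, bottom) in shared scene
    # coordinates; returns the region enclosed by both views, or
    # None when the views do not overlap.
    left = max(rect_a[0], rect_b[0])
    top = max(rect_a[1], rect_b[1])
    right = min(rect_a[2], rect_b[2])
    bottom = min(rect_a[3], rect_b[3])
    if left >= right or top >= bottom:
        return None
    return (left, top, right, bottom)
```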
  • Fig. 6a shows examples of the captured view without FOV follow functionality for the VL imaging system 620 and for the IR imaging system 630.
  • Fig. 6a also shows an exemplary subset 640 of the captured view of the real world scene entirely enclosed by the IR imaging system FOV and the VL imaging system FOV.
  • Fig. 6a shows an observed real world scene 610.
  • Fig. 6b shows an example of how, when FOV follow functionality is activated, the processed visible light image and the processed IR image depict or represent substantially the same subset of the captured view.
  • the subset of the captured view of the real world scene is determined by: determining the latest image as the one of the captured visible light image and the captured IR image that was most recently captured
  • the subset of the captured view of the real world scene is determined by:
  • Figs. 8a-8c show exemplary ways of determining a subset of the captured view of the real world scene by determining an indicative location in a captured VL image or IR image.
  • Fig. 8a shows an example where the real world scene 810, the captured view of the real world scene by the VL imaging system 820, the captured view of the real world scene by the IR imaging system 830 and the subset of the captured view of the real world scene 840 are illustrated.
  • in this operational mode of the FOV follow functionality, the captured view of the real world scene by the IR imaging system 830 is entirely enclosed in the captured view of the real world scene by the VL imaging system 820 and is selected as the subset of the captured view of the real world scene 840.
  • the subset of the captured view of the real world scene is determined by:
  • determining the indicative location in a captured VL image or IR image is performed by:
  • projecting a laser dot onto the real world scene by a laser projector attached to or integrated in the thermography arrangement.
  • Fig. 8b shows an example where the real world scene 810, the captured view of the real world scene by the VL imaging system 820, the captured view of the real world scene by the IR imaging system 830 and the subset of the captured view of the real world scene 840 are illustrated.
  • an indicative location is indicated by the user by aiming a laser dot 850 from a projector 17 onto the real world scene 810; the location of the laser dot in a captured VL image or IR image is determined as an indicative location; a predetermined feature, such as a rectangle, is associated with, e.g. centered on, the indicative location; and the subset of the captured view of the real world scene is determined as the outline of the predetermined feature.
  • thermography arrangement is projecting a laser dot onto the real world scene, e.g. onto a first object.
  • the location of the laser dot in a captured VL image or IR image representing or depicting the real world scene is determined, e.g. by detecting the laser dot in the VL image or by predetermined relations between the IR imaging system, the VL imaging system and the laser projector, e.g. FOV.
  • a predetermined feature is associated with the determined laser dot location in the VL image and the IR image, e.g. a rectangle centered on the determined location.
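The step of centering a predetermined rectangle on an indicative location can be sketched as follows; the clamping keeps the feature inside the captured view. The function name and the (width, height) conventions are illustrative assumptions, not terms from the patent:

```python
def subset_around(location, size, bounds):
    # Axis-aligned rectangle of `size` (w, h) centred on `location`
    # (x, y), clamped so it stays inside `bounds` (width, height).
    # Returns (left, top, right, bottom).
    x, y = location
    w, h = size
    bw, bh = bounds
    left = min(max(x - w // 2, 0), bw - w)
    top = min(max(y - h // 2, 0), bh - h)
    return (left, top, left + w, top + h)
```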
  • determining the indicative location in a captured VL image or IR image is performed by:
  • Fig. 8c shows an example where the real world scene 810, the captured view of the real world scene by the VL imaging system 820, the captured view of the real world scene by the IR imaging system 830 and the subset of the captured view of the real world scene 840 are illustrated.
  • an indicative location is indicated by the user by centering the view of the real world scene 810 on an area of interest; the center of the captured IR image or the center of the captured VL image is determined as an indicative location; a predetermined feature, such as a rectangle, is associated with, e.g. centered on, the indicative location; and the subset of the captured view of the real world scene is determined as the outline of the predetermined feature.
  • the center of a captured VL image or the center of a captured IR image representing or depicting the real world scene is determined as the indicative location.
  • a predetermined feature is associated with the indicative location in the VL image and the IR image, e.g. a rectangle centered on the determined location. The subset of the captured view of the real world scene is determined as the outline of the predetermined feature, and a processed visible light image and a processed IR image depicting or representing the subset are obtained by cropping away the parts of the VL image and the IR image outside the outline. According to embodiments, determining the indicative location in a captured VL image or IR image is performed by:
  • the center of an object is indicated by a user through a user interface. According to embodiments, the center of an object is indicated by detecting an object in the captured VL image or captured IR image and determining the center of the object.
  • an object is detected in a captured VL image or in a captured IR image, e.g. by a user indicating the object or through prior art object detection methods.
  • the center of an object in a captured VL image or the center of an object in a captured IR image representing or depicting the real world scene is determined as the indicative location.
  • a predetermined feature is associated with the indicative location in the VL image and the IR image, e.g. a rectangle centered on the determined location.
  • the processor 2 is arranged to perform scaling of the at least one processed image such that the size of the images becomes the same. According to an embodiment, the processor 2 is arranged to perform resampling of the at least one processed image such that the resolution of the images becomes the same.
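The resampling step can be sketched with nearest-neighbour indexing, assuming 2-D numpy arrays; real arrangements may well use better interpolation, and nothing here comes from the patent beyond the idea of bringing both images to the same resolution:

```python
import numpy as np

def resample_nearest(img, out_shape):
    # Nearest-neighbour resampling of a 2-D image to `out_shape`
    # (rows, cols), so an IR/VL pair ends up at the same resolution.
    h, w = img.shape[:2]
    oh, ow = out_shape
    rows = np.arange(oh) * h // oh
    cols = np.arange(ow) * w // ow
    return img[np.ix_(rows, cols)]
```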
  • the processor 2 is further configured to associate the resulting IR and visible light images. According to an embodiment, the processor 2 is configured to generate a data structure comprising the associated images, for example in the form of an image pair.
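One possible shape for such a data structure, sketched with a Python dataclass; the field names and the timestamp are assumptions, since the patent only requires that the resulting images be associated, for example in the form of an image pair:

```python
import time
from dataclasses import dataclass, field

@dataclass
class ImagePair:
    # Associated IR and VL images with matched fields of view.
    ir: object                     # resulting IR image, e.g. a numpy array
    vl: object                     # resulting VL image with corresponding FOV
    captured_at: float = field(default_factory=time.time)
```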
  • the processor 2 may be a processor such as a general or special purpose processing engine for example a microprocessor, microcontroller or other control logic that comprises sections of code or code portions, stored on a computer readable storage medium, that are fixed to perform certain tasks but also other sections of code, stored on a computer readable storage medium, that can be altered during use.
  • Such alterable sections can comprise parameters that are to be used as input for the various tasks, such as the calibration of the thermography arrangement 1, the sample rate, or the filter for the spatial filtering of the images, among others.
  • the processor 2 is configurable using a hardware description language (HDL).
  • HDL hardware description language
  • the processor 2 is a field-programmable gate array (FPGA), i.e., an integrated circuit designed to be configured by the customer or designer after manufacturing
  • HDL hardware description language
  • the thermography arrangement 1 further comprises a selection of one or more input devices 4 for inputting commands and/or control signals, e.g., an interactive display, joystick and/or record/push-buttons.
  • the processor 2 controls functions of the different parts of the thermography arrangement 1.
  • in Fig. 5 an exemplary embodiment of an input device is shown.
  • the input device comprises an interactive display 570, such as a touch screen, an image display section and controls 510-550 enabling the user to enter input.
  • the input device 4 comprises controls enabling the user to perform the functions of:
  • Activate 510 or deactivate the "FOV follow" functionality, i.e. matching of the FOV represented by the associated IR image with the FOV of the associated VL image.
  • Select 520 to access or display an image, such as the associated IR image, the associated VL image or a combined image based on the associated IR image and the associated VL image. This is further detailed in Fig. 7.
  • thermography arrangement 1 further comprises at least one memory 15 for storing the data registered or processed by the thermography arrangement 1.
  • the memory 15 is configured to store the image data, for example the associated images obtained from the processing described above.
  • the memory 15 may be integrated into the thermography arrangement 1 or coupled to the thermography arrangement 1 via wired or wireless communication.
  • the memory is an external memory 8 integrated in an external unit 10, the memory 8 being configured to receive and store data from the thermography arrangement 1.
  • the memory may be a memory 15 that is integrated in or coupled to the thermography arrangement and/or a memory 8 that is integrated in an external unit configured to receive data from, and/or transfer data to, the thermography arrangement 1.
  • each memory 15, 8 may either be a volatile memory or a nonvolatile memory.
  • the thermography arrangement 1 comprises or is coupled to a data communication interface 5 configured to communicate data to an external unit 10 and thereby enable a user to access and/or display the associated images using an external unit 10.
  • the external unit 10 comprises a processing unit 9 configured to perform any, all or a selection of the method steps of functions described herein.
  • data is transferred or communicated between the thermography arrangement 1 and the external unit 10 via a data communication interface 5 of the thermography arrangement 1 and a corresponding data communication interface 6 of the external unit, the interfaces e.g., comprising wired or wireless connections such as IRDA.
  • the one-way or two-way communication enabled by said interfaces 5, 6 is illustrated by a dashed arrow in Fig. 1.
  • thermography arrangement 1 may comprise a display configured to display at least one of the associated IR and VL images or a combined image based on the associated IR and VL images.
  • the display is a display 3 integrated in or couplable to the thermography arrangement 1.
  • the display is an external display 7 integrated in or coupled to an external unit 10 and configured to receive data transferred from the thermography arrangement via the interfaces 5, 6 described above.
  • the display 3, 7 is configured to display the associated images to the user for further analysis and interpretation.
  • the VL image and the IR image are representations of the same real world scene according to the same field of view, meaning that a user viewing both images, or switching between the images, can easily and quickly relate the information presented in one image to the information presented in the other.
  • "computer program product" and "computer-readable storage medium" may be used generally to refer to media such as a memory 15 or the storage medium of processor 2 or an external storage medium. These and other forms of computer-readable storage media may be used to provide instructions to processor 2 for execution.
  • thermography arrangement e.g., IR camera
  • logic may include hardware, software, firmware, or a combination thereof.
  • the processor 2 communicates with a memory 15 where parameters are kept ready for use by the processor 2, and where the images being processed by the processor 2 can be stored if the user desires.
  • the one or more memories 15 may comprise a selection of a hard disk drive, a floppy disk drive, a magnetic tape drive, an optical disk drive, a CD or DVD drive (R or RW), or other removable or fixed media drive.
  • Fig. 2 shows a block diagram of a method according to one or more embodiments.
  • a method for facilitating or enabling improved analysis and interpretation of associated infrared (IR) and visible light (VL) image data in an IR image and a VL image depicting a real world scene is provided.
  • the associated VL and IR image information are represented as an image pair comprising an infrared (IR) image and a visible light (VL) image depicting the same real world scene.
  • said images are captured using a thermography arrangement comprising an IR imaging system and a VL imaging system.
  • the method comprises:
  • Step S202 Capturing an IR image depicting the real world scene using the IR imaging system, having a first field of view.
  • Step S204 Capturing a visible light image depicting the real world scene using the visible light imaging system, having a second field of view.
  • steps S202 and S204 may be performed
  • thermography arrangement e.g. fixedly mounted or placed on a stand for monitoring of a fairly static scene over a longer period of time
  • images captured at time instances further apart may comprise
  • Step S206 Processing at least one of the VL image and the IR image such that the field of view represented in the visible light image substantially corresponds to the field of view represented in the IR image, thereby generating a resulting IR image and a resulting VL image with corresponding fields of view.
  • the IR image and the VL image are herein referred to as resulting images after step S206 has been performed, even for the embodiments where one of the IR image and VL image has not been processed in step S206.
  • the processing may comprise a selection of cropping; windowing; zooming; shifting; and rotation, according to methods known in the art. For instance, a portion of the VL image may be shifted and/or cropped or windowed to match the IR FOV.
  • the VL image and/or IR image may further be scaled to match the resolution of the display.
  • the captured IR image represents a wider FOV than that of the VL image, i.e. the IR imaging system FOV entirely comprising the VL imaging system FOV.
  • the VL image is processed such that the FOV of the VL image matches the FOV of the IR image.
  • the captured VL image represents a wider FOV than that of the IR image, i.e. the VL imaging system FOV entirely comprising the IR imaging system FOV.
  • the IR image is processed such that the FOV of the IR image matches the FOV of the VL image.
  • the FOV of the VL imaging system and the FOV of the IR imaging system, and the images representing the respective FOVs, may have any internal relationship; in this case both images are processed such that the represented FOVs of the images match each other and a third, narrower, FOV.
  • This may for example be relevant if the FOVs of the IR and VL imaging systems do not completely overlap, meaning that parts of images captured by both imaging systems must be removed in order for the FOVs represented in the images to match.
  • the processing of step S206 comprises identifying an area or a region in the captured image, representing the wider FOV, that corresponds to the narrower FOV represented by the other captured image.
  • the processing comprises identifying the area or region in the image representing the wider FOV that depicts the same part of the observed scene as the image representing the narrower FOV.
  • the area or region may according to an embodiment be identified using a known relationship between the imaging systems 11, 12, such as known parallax, pointing error and/or the relationship between the FOV of the IR imaging system 12 and the VL imaging system 11. Said relationships may have been determined during design, production or calibration of the thermography arrangement, in manners known per se (e.g., conventional methods as would be understood by one skilled in the art).
  • the area or region may be identified for each image pair using any known identification method.
  • step S206 may further comprise alignment and/or stabilization of the captured images, in manners known per se, in order to provide a more accurate identification of the area or region as described above.
  • after step S206, a user can easily identify points-of-interest in the IR image with areas or objects in the VL image simply by noting where in the IR image the feature of interest is located and looking at the same coordinates in the VL image.
  • the method may further comprise, according to embodiments, resampling the processed visible light image such that the resolution of the processed visible light image matches the resolution of the IR image and/or scaling the processed visible light image such that the size of the processed visible light image matches the size of the IR image. Resampling and/or scaling may be performed if the images after the processing in step S206 do not already have the same resolution or size.
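The resampling mentioned above can, purely as an illustration, be performed with nearest-neighbour interpolation; the function name and the nested-list image representation are assumptions of this sketch:

```python
def resample_nearest(image, new_h, new_w):
    """Nearest-neighbour resampling of a nested-list image to
    new_h x new_w pixels; each output pixel copies the closest
    source pixel (integer index mapping)."""
    old_h, old_w = len(image), len(image[0])
    return [[image[r * old_h // new_h][c * old_w // new_w]
             for c in range(new_w)]
            for r in range(new_h)]

# Hypothetical 4x4 VL image resampled down to a 2x2 "IR resolution".
vl = [[10 * r + c for c in range(4)] for r in range(4)]
vl_small = resample_nearest(vl, 2, 2)
```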
  • Step S208 After the processing, associating the resulting IR and visible light images.
  • associating the images comprises creating a relation or connection between the images, thereby creating an image pair.
  • the associated images may then be handled or processed simultaneously. For instance, both images may be displayed next to each other, or a user viewing the associated images may be enabled to switch between display modes of the images.
  • associating the images comprises creating an instance of a data structure comprising the two images.
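A data structure instance holding the two associated images could, for example, be sketched as follows; the field names and the capture timestamp are illustrative assumptions, not taken from the embodiments:

```python
from dataclasses import dataclass, field
import time

@dataclass
class ImagePair:
    """One instance associates a resulting IR image and a resulting VL
    image with matched fields of view. Names are hypothetical."""
    ir_image: list    # resulting IR image (nested list of pixel rows)
    vl_image: list    # resulting VL image with matched FOV
    captured_at: float = field(default_factory=time.time)

pair = ImagePair(ir_image=[[30.5]], vl_image=[[128]])
```

The pair can then be handled as one unit, e.g. stored to memory or passed to display logic together.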
  • the associated images may further be stored, either stored temporarily on a transitory memory, e.g. for live or real-time viewing of the captured images at site, or stored more permanently on a non-transitory memory for later viewing and analysis, using a display 3 of the thermography arrangement 1 or a display 7 of an external unit 10.
  • the associated images are stored on a volatile memory that is either integrated in the thermography arrangement or coupled to the thermography arrangement.
  • the volatile memory may e.g. be a RAM or a cache storage.
  • the associated images are stored on a non-volatile memory that is integrated in the thermography arrangement; coupled to the thermography arrangement; or integrated in an external unit configured to receive data from, and/or transfer data to, the thermography arrangement.
  • enabling a user to access the associated images for display comprises communicating data comprising the associated images to an external unit 10 via a data communication interface 5, 6.
  • the external unit 10 may comprise a display 7 configured to display the received image data.
  • enabling the user to access the associated images for display comprises displaying the associated images on a display integrated in or coupled to the thermography arrangement.
  • the associated images are displayed in real-time or near real-time, in connection to being captured.
  • enabling the user to access the associated images for display further comprises enabling switching between displaying the VL image and the IR image.
  • enabling the user to access the associated images for display further comprises enabling display of a combined image dependent on the associated images.
  • the combined image is a contrast enhanced version of the IR image with addition of VL image data.
  • a method for obtaining a combined image comprises the steps of aligning, determining that the VL image resolution value and the IR image resolution value are substantially the same and combining the IR image and the VL image.
  • a schematic view of the method is shown in Fig 4.
  • the optical axes between the imaging systems may be at a distance from each other and an optical phenomenon known as parallax distance error will arise.
  • the optical axes between the imaging systems may be oriented at an angle in relation to each other and an optical phenomenon known as parallax pointing error will arise.
  • the rotation of the imaging systems around their corresponding optical axes may differ, and an optical phenomenon known as parallax rotation error will arise.
  • since the capturing of the infrared (IR) image and the capturing of the visual light (VL) image are generally performed by different imaging systems of the imaging device, with different optical systems having different properties, such as magnification, the captured view of the real world scene, called field of view (FOV), might differ between the IR imaging system and the VL imaging system.
  • the IR image and the VL image might be obtained with different optical systems with different optical properties, such as magnification, resulting in different sizes of the FOV captured by the IR sensor and the VL sensor.
  • in order to combine the captured IR image and the captured VL image, the images must be adapted so that an adapted IR image and an adapted VL image representing the same part of the observed real world scene are obtained, in other words compensating for the different parallax errors and FOV sizes.
  • This processing step is referred to as registration of or alignment of the IR image and the VL image.
  • Registration or alignment can be performed according to any method known to a skilled person in the art.
  • Determining that the VL image resolution value and the IR image resolution value are substantially the same
  • the IR image and the VL image might be obtained with different resolutions, i.e. different numbers of sensor elements of the imaging systems.
  • Re-sampling can be performed according to any method known to a skilled person in the art.
  • the IR image is resampled to a first resolution and the VL image is resampled to a second resolution, wherein the first resolution is a multiple of 2 times the second resolution or the second resolution is a multiple of 2 times the first resolution, thereby enabling instant resampling by considering every second pixel of the IR image or the VL image.
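The power-of-two relationship described above enables the following trivial sketch, where every second pixel of every second row is kept (nested-list image representation assumed):

```python
def downsample_by_2(image):
    """Instant 2x down-sampling by keeping every second pixel in every
    second row, possible when one resolution is a power-of-two multiple
    of the other. Illustrative sketch only."""
    return [row[::2] for row in image[::2]]

# Hypothetical 4x4 image reduced to 2x2.
highres = [[10 * r + c for c in range(4)] for r in range(4)]
lowres = downsample_by_2(highres)
```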
  • an IR image and a VL image are combined by combining an aligned IR image with high spatial frequency content of an aligned VL image to yield a contrast enhanced combined image.
  • the combination is performed through superimposition of the high spatial frequency content of the VL image and the IR image, or alternatively superimposing the IR image on the high spatial frequency content of the VL image.
  • a method for obtaining a contrast enhanced combined image comprises the following steps:
  • Step 1010 capturing a VL image.
  • capturing a VL image comprises capturing a VL image depicting the observed real world scene using the VL imaging system with an optical system and sensor elements, wherein the captured VL image comprises VL pixels of a visual representation of captured visual light image. Capturing a VL image can be performed according to any method known to a skilled person in the art.
  • Step 1020 capturing an IR image
  • capturing an IR image comprises capturing an IR image depicting an observed real world scene using the IR imaging system with an optical system and sensor elements, wherein the captured IR image comprises captured infrared data values of IR radiation emitted from the observed real world scene and associated IR pixels of a visual representation representing temperature values of the captured infrared data values.
  • Capturing an IR image can be performed according to any method known to a skilled person in the art.
  • steps 1010 and 1020 are performed simultaneously or one after the other.
  • the images may be captured at the same time or with as little time difference as possible, since this will decrease the risk for alignment differences due to movements of an imaging device unit capturing the visual and IR images.
  • images captured at time instances further apart may also be used.
  • the sensor elements of the IR imaging system and the sensor elements of the VL image system are substantially the same, e.g. have substantially the same resolution.
  • the IR image may be captured with a very low resolution IR imaging device, the resolution for instance being as low as 64x64 or 32x32 pixels, but many other resolutions are equally applicable, as is readily understood by a person skilled in the art.
  • since edge and contour (high spatial frequency) information is added to the combined image from the VL image, the use of a very low resolution IR image will still render a combined image where the user can clearly distinguish the depicted objects and the temperature or other IR information related to them.
  • Step 1030 Aligning the IR image and the VL image.
  • parallax error comprises: the parallax distance error between the optical axes, which generally arises due to differences in placement of the sensors of the imaging systems capturing said IR image and said VL image; the parallax pointing error, i.e. the angle created between these axes due to mechanical tolerances that generally prevent them from being mounted exactly parallel; and the parallax rotation error, due to mechanical tolerances that generally prevent the imaging systems from being mounted with exactly the same rotation around the optical axes of the IR and VL imaging systems.
  • since the capturing of the infrared (IR) image and the capturing of the visual light (VL) image are performed by different imaging systems of the imaging device, with different optical systems having different properties, such as magnification, the extent of the captured view of the real world scene, called the size of the field of view (FOV), might differ.
  • Aligning the IR image by compensating for parallax error and size of FOV to obtain an aligned IR image and an aligned VL image with substantially the same FOV can be performed according to any method known to a skilled person in the art.
  • Step 1090 determining a resolution value of the IR imaging system and a resolution value of the VL imaging system, wherein the resolution value of the IR imaging system corresponds to the resolution of the captured IR image and the resolution value of the VL imaging system corresponds to the resolution of the captured VL image.
  • the resolution value represents the number of pixels in a row and the number of pixels in a column of a captured image. In one exemplary embodiment, the resolutions of the imaging systems are predetermined.
  • determining the resolution values of the IR imaging system and the VL imaging system can be performed according to any method known to a skilled person in the art.
  • Step 1100 determining that the VL image resolution value and the IR image resolution value are substantially the same. If in step 1100 it is determined that the VL image resolution value and the IR image resolution value are not substantially the same, the method may further involve the optional step 1040 of re-sampling at least one of the received images so that the resulting VL image resolution value and the resulting IR image resolution value, obtained after re-sampling, are substantially the same.
  • re-sampling comprises up-sampling of the resolution of the IR image to the resolution of the VL image, determined in step 1090.
  • re-sampling comprises up-sampling of the resolution of the VL image to the resolution of the IR image, determined in step 1090.
  • re-sampling comprises down-sampling of the resolution of the IR image to the resolution of the VL image, determined in step 1090.
  • re-sampling comprises down-sampling of the resolution of the VL image to the resolution of the IR image, determined in step 1090.
  • re-sampling comprises re-sampling of the resolution of the IR image and the resolution of the VL image to an intermediate resolution different from the captured IR image resolution and the captured VL image resolution determined in step 1090.
  • the intermediate resolution is determined based on the resolution of a display unit of the thermography arrangement or imaging device.
  • the method steps are performed for a portion of the IR image and a corresponding portion of the VL image.
  • the corresponding portion of the VL image is the portion that depicts the same part of the observed real world scene as the portion of the IR image.
  • high spatial frequency content is extracted from the portion of the VL image, and the portion of the IR image is combined with the extracted high spatial frequency content of the portion of the VL image, to generate a combined image, wherein the contrast and/or resolution in the portion of the IR image is increased compared to the contrast of the originally captured IR image.
  • said portion of the IR image may be the entire IR image or a sub portion of the entire IR image and said corresponding portion of the VL image may be the entire VL image or a sub portion of the entire VL image.
  • the portions are the entire IR image and a corresponding portion of the VL image that may be the entire VL image or a subpart of the VL image if the respective IR and visual imaging systems have different fields of view.
  • determining that the VL image resolution value and the IR image resolution value are substantially the same can be performed according to any method known to a skilled person in the art.
  • Step 1050 process the VL image by extracting the high spatial frequency content of the VL image
  • extracting the high spatial frequency content of the VL image is performed by high pass filtering the VL image using a spatial filter.
  • extracting the high spatial frequency content of the VL image is performed by extracting the difference (commonly referred to as a difference image) between two images depicting the same scene, where a first image is captured at one time instance and a second image is captured at a second time instance, preferably close in time to the first time instance.
  • the two images may typically be two consecutive image frames in an image frame sequence.
  • High spatial frequency content, representing edges and contours of the objects in the scene, will appear in the difference image unless the imaged scene is perfectly unchanged from the first time instance to the second, and the imaging sensor has been kept perfectly still.
  • the scene may for example have changed from one frame to the next due to changes in light in the imaged scene or movements of depicted objects. Also, in almost every case the imaging device or thermography system will not have been kept perfectly still.
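The difference-image variant described above can be sketched as follows; the nested-list frame representation is an assumption of this illustration, not the claimed method:

```python
def difference_image(frame_a, frame_b):
    """Pixelwise difference between two frames captured close in time.
    Edges and contours (high spatial frequency content) appear where
    the scene or the camera moved slightly between the instances."""
    return [[b - a for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(frame_a, frame_b)]

# A vertical edge that shifts one pixel between two consecutive frames:
frame_1 = [[0, 0, 10, 10]]
frame_2 = [[0, 10, 10, 10]]
edges = difference_image(frame_1, frame_2)
```

Static regions difference to zero; only the moved edge survives, which is exactly the high spatial frequency content sought.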
  • a high pass filtering is performed for the purpose of extracting high spatial frequency content in the image, in other words locating contrast areas, i.e. areas where values of adjacent pixels display large differences, such as sharp edges.
  • a resulting high pass filtered image can be achieved by subtracting a low pass filtered image from the original image, calculated pixel by pixel.
  • Processing the VL image by extracting the high spatial frequency content of the VL image can be performed according to any method known to a skilled person in the art.
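The subtraction-based high pass filtering described above (original minus low pass, calculated pixel by pixel) can be sketched with a simple 3x3 box filter as the spatial low pass; both the kernel choice and the border handling are illustrative assumptions:

```python
def box_blur_3x3(img):
    """Spatial low pass filter: replace each pixel by the mean of its
    3x3 neighbourhood, clamping the window at the image borders."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for r in range(h):
        for c in range(w):
            vals = [img[rr][cc]
                    for rr in range(max(0, r - 1), min(h, r + 2))
                    for cc in range(max(0, c - 1), min(w, c + 2))]
            out[r][c] = sum(vals) / len(vals)
    return out

def high_pass(img):
    """High spatial frequency content = original minus low-pass image."""
    low = box_blur_3x3(img)
    return [[img[r][c] - low[r][c] for c in range(len(img[0]))]
            for r in range(len(img))]

flat = [[5.0] * 4 for _ in range(4)]              # no edges -> zero response
step = [[0.0, 0.0, 10.0, 10.0] for _ in range(4)]  # vertical edge
```

A flat image yields an all-zero high pass result, while the sharp edge in `step` produces non-zero values, i.e. the contrast areas described above.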
  • Step 1060 (optional) process the IR image to reduce noise in and/or blur the IR image
  • processing the IR image to reduce noise in and/or blur the IR image is performed through the use of a spatial low pass filter.
  • Low pass filtering may be performed by placing a spatial core over each pixel of the image and calculating a new value for said pixel by using values in adjacent pixels and coefficients of said spatial core.
  • the purpose of the low pass filtering performed at optional step 1060 is to smooth out unevenness in the IR image from noise present in the original IR image captured at step 1020. Since sharp edges and noise visible in the original IR image are removed or at least diminished in the filtering process, the visibility in the resulting image is further improved through the filtering of the IR image and the risk of double edges showing up in a combined image where the IR image and the VL image are not aligned is reduced.
  • Processing the IR image to reduce noise in and/or blur the IR image can be performed according to any method known to a skilled person in the art.
  • Step 1070 combining the extracted high spatial frequency content of the captured VL image and the optionally processed IR image to a combined image
  • combining the extracted high spatial frequency content of the captured VL image and the optionally processed IR image to a combined image comprises using only the luminance component Y from the processed VL image.
  • combining the extracted high spatial frequency content of the captured VL image and the optionally processed IR image to a combined image comprises combining the luminance component of the extracted high spatial frequency content of the captured VL image with the luminance component of the optionally processed IR image.
  • the colors or greyscale of the IR image are not altered and the properties of the original IR palette maintained, while at the same time adding the desired contrasts to the combined image.
  • To maintain the IR palette through all stages of processing and display is beneficial, since the radiometry or other relevant IR information may be kept throughout the process and the interpretation of the combined image may thereby be facilitated for the user.
  • combining the extracted high spatial frequency content of the captured VL image and the optionally processed IR image to a combined image comprises combining the luminance component of the VL image with the luminance component of the IR image using a factor alpha to determine the balance between the luminance components of the two images when adding the luminance components.
  • This factor alpha can be determined by the imaging device or imaging system itself, using suitable parameters for determining the level of contour needed from the VL image to create a satisfactory image, but can also be decided by a user by giving an input to the imaging device or imaging system. The factor can also be altered at a later stage, such as when images are stored in the system or in a PC or the like and can be adjusted to suit any demands from the user.
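One plausible reading of the alpha-weighted combination of luminance components is sketched below for illustration; the exact blending formula of the embodiments is not specified here, so the formula in this sketch is an assumption:

```python
def blend_luminance(y_ir, y_vl_hp, alpha):
    """Combined luminance as the IR luminance plus an alpha-weighted
    share of the VL high spatial frequency luminance. The additive
    form is an assumption of this sketch, not the patented formula."""
    return [[ir + alpha * hp for ir, hp in zip(row_ir, row_hp)]
            for row_ir, row_hp in zip(y_ir, y_vl_hp)]

y_ir = [[100.0, 100.0]]   # hypothetical IR luminance row
y_hp = [[0.0, 20.0]]      # hypothetical VL high-pass luminance row
```

With `alpha = 0` the IR image is unchanged; larger alpha adds more VL contour detail, matching the user-adjustable balance described above.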
  • combining the extracted high spatial frequency content of the captured VL image and the optionally processed IR image to a combined image comprises using a palette to map colors to the temperature values of the IR image, for instance according to the YCbCr family of color spaces, where the Y component (i.e. the palette luminance component) may be chosen as a constant over the entire palette.
  • the Y component may be selected to be 0.5 times the maximum luminance of the combined image, the VL image or the IR image.
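A palette with a constant luminance component over its whole range can be sketched as below; the linear blue-to-red chrominance ramp is an illustrative choice, not the patented palette:

```python
def palette_ycbcr(temp, t_min, t_max, y_const=127.5):
    """Map a temperature to a YCbCr colour whose luminance Y is constant
    over the entire palette, so chrominance (Cb, Cr) carries the IR
    information and Y remains free for the VL edge content."""
    t = max(0.0, min(1.0, (temp - t_min) / (t_max - t_min)))
    cb = 255.0 * (1.0 - t)  # colder -> stronger blue chrominance
    cr = 255.0 * t          # hotter -> stronger red chrominance
    return (y_const, cb, cr)

cold = palette_ycbcr(0.0, 0.0, 100.0)
hot = palette_ycbcr(100.0, 0.0, 100.0)
```

Because Y is identical for every temperature, superimposing VL luminance cannot shift the perceived palette colours, which is the benefit described above.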
  • High resolution noise may be added to the combined image in order to render the resulting image more clearly to the viewer and to decrease the impression of smudges or the like that may be present due to noise in the original IR image that has been preserved during the optional low pass filtering of said IR image.
  • according to an embodiment, the high resolution noise is high resolution temporal noise.
  • the processor 2 is arranged to perform the method steps 1010-1080.
  • a user interface enabling the user to interact with the displayed data, e.g. on one of the displays 3, 7.
  • Such a user interface may comprise selectable options or input possibilities allowing a user to switch between different views, zoom in on areas of interest etc.
  • the user may provide input using one or more of the input devices 4.
  • a user may interact with the thermography arrangement 1 to perform zooming or scaling of one of the images, in manners known in the art, before storing or display of the images. If a user performs a zooming or scaling action on either the IR or the VL image, the FOV of the associated image will be adjusted according to any of the method embodiments described herein (e.g., for step S208 and S210). Thereby, the FOV of the associated images will always be matched, either in real-time or near real-time to a user viewing the images on site, or in image data stored for later retrieval.
  • Fig. 3a shows an example of an image pair according to an embodiment, without use of the inventive method.
  • the image pair shown in Fig. 3a comprises a VL image 300, according to a first FOV, and an IR image 310, according to a second FOV.
  • the FOV of the VL image 300 corresponds to the FOV of the VL imaging system 11
  • the FOV of the IR image 310 corresponds to the FOV of the IR imaging system 12.
  • the "FOV follow" functionality or in other words matching of the FOV of IR images with the FOV of corresponding VL images is a mode that may be turned on and off. Turning the FOV follow mode on and off may be performed automatically, based on settings of the thermography arrangement, or manually by a user interacting with one or more of the input devices 4.
  • a user may be operating a thermography arrangement to investigate a scene looking for abnormalities, for example in the form of moisture, poor insulation or overheated electronics at a construction site.
  • the user may view the captured IR and VL image frame sequences in real-time, or near real-time, on a display of the thermography arrangement.
  • the IR image frame sequence is displayed to the user on the display.
  • the user may switch from the IR image view to the VL image view, using an input device provided on the camera.
  • the user will now be presented with a VL image according to the same FOV as the IR image that was presented a moment ago. Because the two images are matched, the user can easily identify points-of-interest in the IR image with areas or objects in the VL image simply by noting where in the IR image the feature of interest is located and looking at the same coordinates in the VL image. In other words, the relevant part of the scene will be right where the user a moment ago noticed the area of interest in the IR image.
  • instead of a pure IR image, the IR image displayed may be a fused, blended or contrast enhanced version of the IR image, with addition of VL image data, generated in manners known in the art.
  • an IR image is often hard to interpret in itself and also in blended or fused images it may be hard to relate the image content to the observed scene. Therefore, the possibility to switch to a VL representation wherein the FOV of the VL image coincides with the FOV of the previously displayed IR or combined image is very advantageous for interpretation.
  • the user is using the thermography arrangement to observe a scene distant from the point where the user is standing. Therefore, the user chooses to zoom in, while being shown an IR image on the display of the thermography arrangement.
  • the user switches to VL in order to relate the IR related data to the scene. If the FOV follow mode is turned on, the user will now see a VL image representing the zoomed in FOV of the IR image that was shown a moment ago.
  • the IR imaging system and the VL imaging system work independently, during capturing of a live image sequence or still pictures, and the user may for example zoom in in the images captured by one of the IR or VL imaging system without affecting the image or image sequence captured by the other imaging system.
  • the user will be presented with either a zoomed-in version of the VL image according to the VL imaging system's FOV, which is zoomed in but not matched with the IR image, or with the original zoomed-out VL image according to the entire FOV of the VL imaging system.
  • the user has the FOV follow mode on, switches to VL to see in what part of the scene the interesting feature is found in the zoomed- in and matched VL image, and then turns the FOV follow mode off whereby a VL image according to the full FOV of the VL imaging system is displayed, thereby allowing the user to put the interesting part of the scene into its context.
  • the user may use an input device 4 to capture and/or store an associated IR image and VL image pair. In the eyes of the user, a single input action may thereby be performed. In response to the single user action, method steps according to any of the embodiments described herein are performed, leading to capturing, association, storing and/or displaying of images dependent on the preset or user selected settings of the thermography arrangement currently in operation.
  • the user may want to zoom in and take a picture or capture an image sequence.
  • the user may have detected the object or detail of interest in the IR image, but finds that it is easier to zoom in in the VL view. Therefore the user switches to VL before zooming.
  • the FOV of the VL image corresponds to the FOV of the IR image, making it easy for the user to relate the detected temperature pattern or abnormality of interest seen in the IR image to the correct part of the VL scene and zoom in on it.
  • the user may want to zoom out in order to put an observed detail or object into its context. If the user switches between IR and VL in connection with the zooming actions, the FOVs will always be matched. In case the user manually selects a FOV for VL that is wider than what is obtainable for the IR imaging system, the FOV of the IR imaging system will be set to its maximum.
  • what is shown on the display is what is saved when the user saves the image.
  • an image pair comprising an IR and a VL image is stored, wherein the images have a matched FOV.
  • the user may at a later time retrieve the image pair and view them on a display of the thermography arrangement or an external unit. According to the settings of the display and the user interface used, the user may view the image pair either simultaneously, the images being presented next to each other, or one by one, where the user is enabled to switch between the images.
  • a user has captured images and chooses to generate a report, using report generating logic integrated in the thermography arrangement or in an external unit to which the captured image data has been transferred.
  • a user often wishes to include an image pair comprising an IR image and a VL image depicting the same scene.
  • the user would either have to include an IR image and a VL image representing different FOVs, or perform an additional step of e.g. manually cropping the image having a wider FOV to match the FOV of the other image, in order to facilitate interpretation for a person reading the report.
  • this is not necessary, as the associated images will always represent the same FOV, thereby enabling improved analysis and interpretation of associated visible light and IR image information.
  • the captured and associated images may be further processed according to methods known in the art, e.g. to obtain images that are fused, blended or presented as picture in picture.
  • the processor is configurable using a hardware description language (HDL).
  • the processor 2 is a Field-programmable gate array (FPGA) or other type of logic device.
  • the processor 2 is further adapted to perform any of the steps or functions of the method embodiments described herein.
  • a computer system having a processor being adapted to perform any of the steps or functions of the method embodiments described herein.
  • a computer-readable medium on which is stored non-transitory information adapted to control a processor to perform any of the steps or functions of the method embodiments described herein.
  • a computer program product comprising code portions adapted to control a processor to perform any of the steps or functions of the method embodiments described herein.
  • a computer program product comprising configuration data adapted to configure a Field-programmable gate array (FPGA) to perform any of the steps or functions of the method embodiments described herein.
  • By activating the FOV follow functionality, the user is enabled to instantly switch between image information, such as the VL image, the IR image or a combined image, thereby instantly accessing the most relevant information for the current scenario.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Radiation Pyrometers (AREA)
  • Investigating Or Analyzing Materials Using Thermal Means (AREA)
  • Studio Devices (AREA)

Abstract

In one embodiment, a method is provided for enabling improved analysis and interpretation of associated image data in an infrared (IR) image and a visible light (VL) image depicting a real world scene, captured using a thermography arrangement comprising an IR imaging system and a VL imaging system, the method comprising: capturing an IR image depicting the scene, having a first field of view (FOV); capturing a VL image depicting the scene, having a second field of view; processing at least one of the VL image and the IR image such that the FOV represented in the VL image substantially corresponds to the FOV represented in the IR image; associating the resulting VL and IR images to provide associated images; and enabling a user to access the associated images for display, wherein analysis and interpretation of the associated images is made intuitive, as the resulting VL image and the resulting IR image obtained after association represent the same field of view.
EP13715175.9A 2012-03-30 2013-03-28 Facilitating analysis and interpretation of associated visible light and infrared (IR) image information Withdrawn EP2831812A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201261618001P 2012-03-30 2012-03-30
PCT/EP2013/056708 WO2013144298A1 (fr) 2012-03-30 2013-03-28 Analyse et interprétation de la facilitation de la lumière visible associée et des informations d'image infrarouge (ir)

Publications (1)

Publication Number Publication Date
EP2831812A1 true EP2831812A1 (fr) 2015-02-04

Family

ID=48083135

Family Applications (1)

Application Number Title Priority Date Filing Date
EP13715175.9A Withdrawn EP2831812A1 (fr) 2012-03-30 2013-03-28 Analyse et interprétation de la facilitation de la lumière visible associée et des informations d'image infrarouge (ir)

Country Status (3)

Country Link
EP (1) EP2831812A1 (fr)
CN (1) CN104364800A (fr)
WO (1) WO2013144298A1 (fr)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015091821A1 (fr) * 2013-12-18 2015-06-25 Flir Systems Ab Traitement d'images infrarouges sur la base de gestes de glissement
CN105069768B (zh) * 2015-08-05 2017-12-29 武汉高德红外股份有限公司 一种可见光图像与红外图像融合处理系统及融合方法
CN105338262B (zh) 2015-10-09 2018-09-21 浙江大华技术股份有限公司 一种热成像图像处理方法及装置
CN105676884B (zh) * 2016-01-27 2018-09-25 武汉天木林科技有限公司 一种红外热成像搜索跟踪瞄准的装置及方法
WO2019084791A1 (fr) * 2017-10-31 2019-05-09 深圳市大疆创新科技有限公司 Procédé d'affichage d'image, procédé de commande et dispositif associé
CN110361092B (zh) * 2018-04-11 2020-12-18 杭州海康微影传感科技有限公司 一种图像配准方法、装置及热成像摄像机
CN110460783B (zh) * 2018-05-08 2021-01-26 宁波舜宇光电信息有限公司 阵列摄像模组及其图像处理系统、图像处理方法和电子设备
CN110677597B (zh) * 2018-07-03 2021-10-29 杭州海康威视数字技术股份有限公司 图像处理方法及装置
CN111345026B (zh) * 2018-08-27 2021-05-14 深圳市大疆创新科技有限公司 图像呈现方法、图像获取设备及终端装置
CN109151402B (zh) * 2018-10-26 2022-10-11 深圳市道通智能航空技术股份有限公司 航拍相机的图像处理方法、图像处理系统及无人机
CN114584702A (zh) * 2019-09-20 2022-06-03 广州速率信息技术有限公司 一种拍摄可见光和热成像重叠图的方法及系统
CN112945388A (zh) * 2020-01-28 2021-06-11 杭州美盛红外光电技术有限公司 热像和可见光匹配装置和显示匹配方法
CN111220277A (zh) * 2020-02-27 2020-06-02 北京遥感设备研究所 一种智能红外体温筛查系统
CN111307300B (zh) * 2020-04-17 2021-08-17 镇江明润信息科技有限公司 一种红外测温传感器的温度校准装置及方法
CN111522260A (zh) * 2020-04-24 2020-08-11 徐建红 基于信息抓取识别及大数据的智能辅助服务方法及其系统
CN113409218B (zh) * 2021-06-25 2023-02-28 杭州海康消防科技有限公司 可穿戴护具以及用于可穿戴护具的场景呈现方法

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110262053A1 (en) * 2010-04-23 2011-10-27 Flir Systems Ab Infrared resolution and contrast enhancement with fusion

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020015536A1 (en) * 2000-04-24 2002-02-07 Warren Penny G. Apparatus and method for color image fusion
US8531562B2 (en) * 2004-12-03 2013-09-10 Fluke Corporation Visible light and IR combined image camera with a laser pointer
CN101111748B (zh) * 2004-12-03 2014-12-17 弗卢克公司 具有激光指示器的可见光和ir组合的图像照相机
US7613360B2 (en) * 2006-02-01 2009-11-03 Honeywell International Inc Multi-spectral fusion for video surveillance
US8749635B2 (en) * 2009-06-03 2014-06-10 Flir Systems, Inc. Infrared camera systems and methods for dual sensor applications
CN102334141B (zh) * 2010-04-23 2015-05-20 前视红外系统股份公司 利用融合的红外线分辨率与对比度增强
US9723229B2 (en) * 2010-08-27 2017-08-01 Milwaukee Electric Tool Corporation Thermal detection systems, methods, and devices

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110262053A1 (en) * 2010-04-23 2011-10-27 Flir Systems Ab Infrared resolution and contrast enhancement with fusion

Also Published As

Publication number Publication date
CN104364800A (zh) 2015-02-18
WO2013144298A1 (fr) 2013-10-03

Similar Documents

Publication Publication Date Title
US10044946B2 (en) Facilitating analysis and interpretation of associated visible light and infrared (IR) image information
WO2013144298A1 (fr) Analyse et interprétation de la facilitation de la lumière visible associée et des informations d'image infrarouge (ir)
US20230162340A1 (en) Infrared resolution and contrast enhancement with fusion
US20130300875A1 (en) Correction of image distortion in ir imaging
US9891817B2 (en) Processing an infrared (IR) image based on swipe gestures
US10360711B2 (en) Image enhancement with fusion
EP2570989B1 (fr) Amélioration de résolution et de contraste avec fusion dans des images IR à faible résolution
US10033945B2 (en) Orientation-adapted image remote inspection systems and methods
US10148895B2 (en) Generating a combined infrared/visible light image having an enhanced transition between different types of image information
CA2797054C (fr) Amelioration de la resolution et du contraste infrarouge par fusion
JP5281972B2 (ja) 撮像装置
JP4770197B2 (ja) プレゼンテーション制御装置およびプログラム
CN113132640B (zh) 图像呈现方法、图像获取设备及终端装置
WO2014012946A1 (fr) Correction de distorsion d'image en imagerie ir
WO2014044221A1 (fr) Dispositif de diagnostic d'image thermique et procédé de diagnostic d'image thermique
JP5966584B2 (ja) 表示制御装置、表示制御方法およびプログラム
US20210190594A1 (en) Personal electronic device with built-in visible camera and thermal sensor with thermal information available in the visible camera user interface and display
JP5152317B2 (ja) プレゼンテーション制御装置及びプログラム
JP5887297B2 (ja) 画像処理装置および画像処理プログラム

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20141030

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20180417

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN

18W Application withdrawn

Effective date: 20200522