WO2018073900A1 - Computer system, object diagnosis method, and program - Google Patents

Computer system, object diagnosis method, and program

Info

Publication number
WO2018073900A1
Authority
WO
WIPO (PCT)
Prior art keywords
visible light
light image
region
temperature
infrared image
Prior art date
Application number
PCT/JP2016/080872
Other languages
French (fr)
Japanese (ja)
Inventor
Shunji Sugaya (菅谷 俊二)
Original Assignee
OPTiM Corporation (株式会社オプティム)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by OPTiM Corporation (株式会社オプティム)
Priority to PCT/JP2016/080872
Publication of WO2018073900A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01J: MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00: Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J5/48: Thermography; Techniques using wholly visual means
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N25/00: Investigating or analyzing materials by the use of thermal means
    • G01N25/72: Investigating presence of flaws

Definitions

  • the present invention relates to a computer system that diagnoses an object by imaging it, an object diagnosis method, and a program.
  • a configuration for diagnosing an object by measuring the temperature of the object using a spectrometer or infrared thermography is disclosed (see Non-Patent Document 1).
  • however, there is a temperature difference for each part of the object, and in order to accurately grasp the state of the object, the temperature of each part must be measured accurately.
  • although it is possible to identify the outline and approximate parts of an object from a thermographic image, it is difficult to specify the exact position of each part, depending on the distance from the camera to the subject and other factors. As a result, there was a limit to improving the accuracy of the temperature measured for each part.
  • the present invention provides the following solutions.
  • the present invention provides a computer system comprising: first acquisition means for acquiring a visible light image and an infrared image captured by a camera; first image processing means for specifying, in the visible light image, a region corresponding to a predetermined part of an object imaged by the camera; second image processing means for identifying a region in the infrared image corresponding to the identified region in the visible light image; and diagnostic means for diagnosing the object based on the temperature of the identified region in the infrared image.
  • according to the present invention, the computer system acquires a visible light image and an infrared image captured by a camera, specifies in the visible light image an area corresponding to a predetermined part of the object captured by the camera, identifies the region in the infrared image corresponding to the identified region in the visible light image, and diagnoses the object based on the temperature of the identified region in the infrared image.
  • the present invention is described in the category of a computer system, but the same actions and effects are also exhibited in other categories such as an object diagnosis method and a program.
  • FIG. 1 is a diagram showing an outline of the object diagnostic system 1.
  • FIG. 2 is an overall configuration diagram of the object diagnostic system 1.
  • FIG. 3 is a functional block diagram of the computer 10.
  • FIG. 4 is a flowchart showing object diagnosis processing executed by the computer 10.
  • FIG. 5 is a flowchart illustrating object diagnosis processing executed by the computer 10.
  • FIG. 6 is a diagram schematically showing an example of visible light image data acquired by the computer 10.
  • FIG. 7 is a diagram schematically showing an example of infrared image data acquired by the computer 10.
  • FIG. 8 is a diagram illustrating an example schematically showing a state in which the computer 10 specifies a predetermined part in a visible light image.
  • FIG. 9 is a diagram schematically illustrating an example in which the computer 10 specifies a region in an infrared image.
  • FIG. 10 is a diagram illustrating an example of a reference temperature database stored in the computer 10.
  • FIG. 1 is a diagram for explaining an outline of an object diagnosis system 1 which is a preferred embodiment of the present invention.
  • the object diagnosis system 1 includes a computer 10, acquires an image obtained by imaging an object such as an IoT device (a computer, a terminal, a sensor, or a robot), factory equipment such as factory pipes and piping, a moving object such as a car, an airplane, or a bus, or a building such as a house or a factory itself, and diagnoses this object. In the following description, the object is described as being a computer.
  • the computer 10 is a computing device that is communicably connected to a visible light camera, an infrared camera, a sensor, an environment adjustment device, etc. (not shown).
  • the object diagnosis system 1 obtains a visible light image from the visible light camera, obtains an infrared image from the infrared camera, acquires environment information about the object's installation location, such as illuminance, wind direction, wind speed, temperature, humidity, and atmospheric pressure, from the sensor, and transmits instructions to the environment adjustment device.
  • the computer 10 acquires a visible light image and an infrared image captured by a camera (not shown) (step S01).
  • the computer 10 acquires a visible light image such as a moving image or a still image of an object captured by the visible light camera.
  • the computer 10 acquires an infrared image such as a moving image or a still image of an object captured by the infrared camera.
  • the visible light camera and the infrared camera are installed side by side or in the vicinity, and the visible light camera and the infrared camera image the same object. That is, the visible light camera and the infrared camera image the same object from substantially the same imaging point.
  • the computer 10 specifies a visible light image region that is a region corresponding to a predetermined part of the object imaged by the camera in the visible light image (step S02).
  • the computer 10 specifies, for example, a part of an object such as a housing, a display, and a keyboard, a preset part, and the like as the predetermined part of the object.
  • the computer 10 specifies an area corresponding to the predetermined part in the visible light image by image analysis.
  • the computer 10 extracts a feature amount existing in the visible light image, and specifies a predetermined part based on the feature amount.
  • the computer 10 extracts the color of the visible light image and identifies a predetermined part based on this color.
  • the computer 10 specifies an infrared image region, that is, a region in the infrared image corresponding to the above-described region in the visible light image (step S03).
  • the computer 10 compares the visible light image and the infrared image, and identifies the region that matches the visible light image region in the infrared image as the infrared image region.
  • the computer 10 identifies an infrared image area at the same position as the visible light image area as the infrared image area.
  • the computer 10 diagnoses an object based on the temperature of the infrared image area (step S04).
  • the computer 10 diagnoses whether or not a failure has occurred in the object, for example, by comparing the temperature of the infrared image region with the temperature at the time of the failure of the object.
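The S01-S04 flow above can be sketched as follows. This is a minimal illustration with toy data, not the patented implementation: the grid images, the part labels, and the 45.0°C failure temperature are all hypothetical, and it relies on the assumption stated above that both cameras image the object from substantially the same imaging point.

```python
# Sketch of the S01-S04 flow (hypothetical names and toy data).
# Assumes the visible light camera and infrared camera share the same
# viewpoint, so a pixel position in one image corresponds to the same
# position in the other.

# S01: acquired images -- tiny 3x3 grids standing in for real frames.
visible_image = [
    ["bg",       "housing",  "housing"],
    ["bg",       "display",  "display"],
    ["keyboard", "keyboard", "bg"],
]
infrared_image = [  # per-pixel temperatures in degrees C
    [22.0, 41.0, 40.5],
    [21.5, 30.2, 29.8],
    [20.1, 19.9, 22.3],
]

def identify_region(visible, part):
    """S02: identify the region (pixel coordinates) of a predetermined part."""
    return [(y, x) for y, row in enumerate(visible)
            for x, label in enumerate(row) if label == part]

def region_temperatures(infrared, region):
    """S03: read out the corresponding region of the infrared image."""
    return [infrared[y][x] for y, x in region]

def diagnose(temps, failure_temp=45.0):
    """S04: flag a failure when the region reaches a known failure temperature."""
    return max(temps) >= failure_temp

region = identify_region(visible_image, "housing")
temps = region_temperatures(infrared_image, region)
print(diagnose(temps))  # the housing peaks at 41.0 C, below the failure temperature
```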
  • FIG. 2 is a diagram showing a system configuration of the object diagnostic system 1 which is a preferred embodiment of the present invention.
  • the object diagnosis system 1 includes a computer 10 and a public network 5 (the Internet, a third-generation or fourth-generation communication network, etc.), acquires an image obtained by imaging an object, and diagnoses the object.
  • the object diagnosis system 1 includes a camera, such as a visible light camera that captures a visible light image of the object and an infrared camera that captures an infrared image of the object; a sensor that detects the installation environment of the object, such as illuminance, wind direction, wind speed, temperature, humidity, and atmospheric pressure; and an environment adjustment device that adjusts the installation environment of the object, such as an illumination device (various lights), an air conditioner (a blower or an air conditioning unit), or a watering device.
  • the computer 10 obtains various information from these and transmits various instructions.
  • the computer 10 is the above-described computing device having the functions described later.
  • FIG. 3 is a functional block diagram of the computer 10.
  • the computer 10 includes, as the control unit 11, a CPU (Central Processing Unit), a RAM (Random Access Memory), a ROM (Read Only Memory), and the like, and includes, as the communication unit 12, a device for communicating with other devices (cameras, various sensors, environment adjustment devices, etc.), for example, a WiFi (Wireless Fidelity) compatible device compliant with IEEE 802.11.
  • the computer 10 also includes a data storage unit such as a hard disk, a semiconductor memory, a recording medium, or a memory card as the storage unit 13. Further, the computer 10 includes, as the processing unit 14, a device for executing various processes such as image processing and failure diagnosis.
  • in the computer 10, when the control unit 11 reads a predetermined program, the image data acquisition module 20, the environment information acquisition module 21, and the adjustment instruction transmission module 22 are realized in cooperation with the communication unit 12. Further, in the computer 10, the control unit 11 reads a predetermined program, thereby realizing the storage module 30 in cooperation with the storage unit 13. Further, in the computer 10, the control unit 11 reads a predetermined program, thereby realizing the visible light image analysis module 40, the infrared image analysis module 41, the temperature analysis module 42, the diagnosis module 43, and the environment adjustment module 44 in cooperation with the processing unit 14.
  • FIG. 4 and FIG. 5 are flowcharts showing the object diagnosis processing executed by the computer 10. The processing executed by each module described above will be described together with this processing.
  • the image data acquisition module 20 acquires image data of a visible light image and an infrared image of an object (step S10).
  • the image data acquisition module 20 acquires visible light image data that is a visible light image captured by the visible light camera and infrared image data that is an infrared image captured by the infrared camera.
  • the image data acquisition module 20 acquires visible light image data and infrared image data at a predetermined time interval or at a plurality of time points such as a preset time.
  • the visible light image data and the infrared image data acquired by the image data acquisition module 20 are captured from the same imaging viewpoint and depict the same object.
  • the computer 10 executes diagnosis of an object based on visible light image data and infrared image data at a predetermined time.
  • FIG. 6 is a diagram schematically illustrating an example of visible light image data acquired by the image data acquisition module 20.
  • the image data acquisition module 20 acquires the visible light image 100 indicated by the visible light image data.
  • in the visible light image 100, an object 110 is shown.
  • the visible light image 100 may include scenery other than the object 110, natural objects, artificial objects, and the like, which are omitted for the sake of simplicity.
  • the visible light image 100 may include a plurality of objects 110 and objects different from the objects 110.
  • FIG. 7 is a diagram schematically showing an example of infrared image data acquired by the image data acquisition module 20.
  • the image data acquisition module 20 acquires an infrared image 200 indicated by the infrared image data.
  • An object 210 is shown in the infrared image 200.
  • the infrared image 200 may include scenery, natural objects, artifacts, and the like other than the object 210, but these are omitted for the sake of simplicity.
  • each temperature is indicated by hatching for convenience.
  • the infrared image 200 may include a plurality of objects 210 or an object different from the object 210.
  • the environment information acquisition module 21 acquires environment information indicating the installation environment of the object (step S11).
  • the environment information acquisition module 21 acquires illuminance, wind direction, wind speed, temperature, humidity, atmospheric pressure, and the like as environment information.
  • the environment information acquisition module 21 acquires environment information from various sensors (not shown) such as an illuminance sensor, a wind direction / wind speed sensor, a temperature sensor, a humidity sensor, and a pressure sensor.
  • the environment information acquisition module 21 acquires the environment information at the same timing as the acquisition of the visible light image data and the infrared image data. These various sensors are installed in the vicinity of the object or in the vicinity of the place where the object is installed.
  • the various sensors may be sensors that detect environmental information other than the examples described above.
  • the installation position of various sensors is not limited to the above-described example, and can be appropriately changed to a position where the installation environment of the object can be detected.
  • the process of step S11 may be omitted. In this case, the process of step S12 described later may simply be performed after the process of step S10 described above.
  • the visible light image analysis module 40 performs image analysis on the acquired visible light image data (step S12).
  • in step S12, the visible light image analysis module 40 extracts the feature amount and color of the visible light image data and identifies an object present in the visible light image data.
  • step S12 for example, the visible light image analysis module 40 compares the feature amount extracted from the visible light image data with the feature amount of the object stored in advance in the storage module 30, and extracts an object having a matching feature amount. The extracted object is identified as existing in the visible light image data. Further, for example, the visible light image analysis module 40 compares the RGB value extracted from the visible light image data with the RGB value of the object stored in advance in the storage module 30, and extracts an object having a matching or similar RGB value. The extracted object is identified as existing in the visible light image data.
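One way the RGB comparison of step S12 could look, as a hedged sketch: the stored RGB values and the per-channel tolerance below are illustrative assumptions, not values from the embodiment.

```python
# Illustrative sketch of the RGB comparison in step S12. The stored values
# and the similarity tolerance are assumptions, not taken from the embodiment.

# RGB values of known objects, as they might be stored in the storage module 30.
STORED_RGB = {
    "computer": (190, 190, 195),   # hypothetical grey casing
    "server":   (30, 30, 35),      # hypothetical black chassis
}

def match_object(extracted_rgb, tolerance=20):
    """Return the stored object whose RGB value matches or is similar
    (within `tolerance` per channel) to the value extracted from the image."""
    for name, stored in STORED_RGB.items():
        if all(abs(a - b) <= tolerance for a, b in zip(extracted_rgb, stored)):
            return name
    return None

print(match_object((185, 192, 200)))  # close to the stored "computer" value
```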
  • the visible light image analysis module 40 determines whether there are a plurality of individuals in the visible light image data as a result of the image analysis (step S13). In step S13, the visible light image analysis module 40 determines whether there are a plurality of individuals by determining whether there are a plurality of objects in the visible light image data. The visible light image analysis module 40 determines whether there are a plurality of individuals of one object, whether there are individuals of a plurality of types of objects, and the like.
  • in step S13, when the visible light image analysis module 40 determines that there are not a plurality of individuals (step S13: NO), that is, when it determines that only one individual is present in the visible light image data, the visible light image analysis module 40 identifies areas corresponding to a plurality of predetermined parts of this one individual (step S14).
  • the predetermined part is, for example, a part of an object such as a housing, a display, or a keyboard, a part that is set in advance, or the like.
  • the visible light image analysis module 40 specifies areas corresponding to, for example, a housing, a display, and a keyboard.
  • the visible light image analysis module 40 extracts the casing, display, and keyboard present in the visible light image data from the feature amount, and specifies the extracted location as an area corresponding to a predetermined part. At this time, the visible light image analysis module 40 specifies a plurality of regions corresponding to each of the plurality of predetermined parts.
  • the visible light image analysis module 40 extracts a housing, a display, and a keyboard that are present in the visible light image data from the RGB values, and specifies the extracted location as an area corresponding to a predetermined part.
  • FIG. 8 is a diagram illustrating an example schematically showing a state in which the visible light image analysis module 40 specifies a predetermined part.
  • the visible light image analysis module 40 specifies a region in the visible light image 100 where a predetermined part such as a housing, a display, or a keyboard is located based on the feature amount and the color.
  • the visible light image analysis module 40 identifies the region of the object 110 corresponding to the parts of the casings 300 to 302, the displays 310 to 312 and the keyboards 320 to 322.
  • the specified area is indicated by hatching for convenience. This area indicates a portion of each part, but may indicate the entire corresponding part. Note that the number, type, and position of the parts to be specified can be changed as appropriate.
  • in step S13, when the visible light image analysis module 40 determines that there are a plurality of individuals (step S13: YES), that is, that a plurality of individuals such as a first individual, a second individual, and a third individual are present, the visible light image analysis module 40 identifies each of the plurality of individuals (step S15). In the following description, it is assumed that the first individual and the second individual exist in the visible light image data.
  • the visible light image analysis module 40 identifies the positional relationship of each of the plurality of individuals (step S16).
  • in step S16, the visible light image analysis module 40 identifies the positional relationship between the first individual and the second individual based on their positions in the visible light image.
  • the visible light image analysis module 40 specifies a relative position or an absolute position between the first individual and the second individual.
  • the positional relationship is, for example, which is closer to the shooting point, coordinates in the visible light image, or the like. Note that the processing in step S16 is not limited to the positional relationship between the first individual and the second individual, but may be the positional relationship with other individuals.
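A minimal sketch of steps S15-S16. Using the apparent (bounding-box) size of each identified individual as a proxy for which is closer to the shooting point is an assumption made for illustration; the bounding boxes themselves are hypothetical.

```python
# Sketch of steps S15-S16: identifying each individual and their positional
# relationship. Treating the larger apparent (bounding-box) area as "closer
# to the shooting point" is an illustrative assumption.

# Bounding boxes (x, y, width, height) of individuals found in the visible image.
individuals = {
    "first_individual":  {"bbox": (10, 40, 200, 150)},
    "second_individual": {"bbox": (300, 60, 120, 90)},
}

def closer_to_camera(a, b):
    """Return whichever individual has the larger apparent area."""
    area = lambda ind: ind["bbox"][2] * ind["bbox"][3]
    return a if area(individuals[a]) >= area(individuals[b]) else b

print(closer_to_camera("first_individual", "second_individual"))
```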
  • the visible light image analysis module 40 specifies a region corresponding to a predetermined part for each of the plurality of individuals (step S17).
  • in the process of step S17, the process of step S14 described above is performed for each object present in the visible light image data.
  • the infrared image analysis module 41 identifies a region in the infrared image corresponding to the region in the identified visible light image (step S18).
  • in step S18, the infrared image analysis module 41 compares the visible light image data with the infrared image data to identify the region of the infrared image data corresponding to the identified region of the object.
  • the infrared image analysis module 41 acquires the position of the region in the visible light image as a coordinate, and specifies the position that matches the acquired coordinate as the region in the infrared image corresponding to the region in the identified visible light image.
  • FIG. 9 is a diagram illustrating an example schematically showing a state in which the infrared image analysis module 41 specifies a region in the infrared image.
  • regions in the infrared image 200 corresponding to the parts of the casings 300 to 302 are identified. This is specified by comparing the position in the visible light image 100 and the position in the infrared image 200.
  • the infrared image analysis module 41 acquires the position coordinates of each part in the visible light image 100, and specifies the position in the infrared image corresponding to the acquired position coordinates as the region in the infrared image corresponding to the region in the visible light image.
  • the infrared image analysis module 41 specifies the parts of the casings 400 to 402, the displays 410 to 412, and the keyboards 420 to 422 of the object 210.
  • the specified area is indicated by hatching for convenience. This area refers to a part or the whole part depending on the part specified in the above-described visible light image. Note that the number of positions to be specified and their positions can be changed as appropriate in accordance with the visible light image.
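The coordinate matching of step S18 can be sketched as follows. The embodiment matches positions directly between the two images; the resolution scaling below is an added assumption covering the case where the visible light image and the infrared image differ in size.

```python
# Sketch of the coordinate matching in step S18. The embodiment matches
# positions directly; the resolution scaling below is an added assumption
# for the case where the visible light and infrared images differ in size.

def to_infrared_coords(region, visible_size, infrared_size):
    """Map pixel coordinates of a region in the visible light image to the
    corresponding coordinates in the infrared image."""
    vw, vh = visible_size
    iw, ih = infrared_size
    return [(int(x * iw / vw), int(y * ih / vh)) for x, y in region]

# A region identified at 1920x1080 mapped into a 640x480 infrared frame.
region = [(960, 540), (970, 540)]
print(to_infrared_coords(region, (1920, 1080), (640, 480)))
```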
  • the temperature analysis module 42 analyzes the temperature of the area in the specified infrared image data (step S19). In step S19, the temperature analysis module 42 acquires the temperature of each region based on the infrared image data.
  • the temperature analysis module 42 acquires a plurality of reference temperatures corresponding to each of a plurality of parts of the object from the reference temperature database stored in the storage module 30 (step S20).
  • the storage module 30 stores a plurality of reference temperatures corresponding to the respective parts in advance, and the temperature analysis module 42 acquires the stored reference temperatures.
  • specifically, the reference temperature of the part corresponding to each region identified in the infrared image data is acquired.
  • FIG. 10 is a diagram illustrating an example of a reference temperature database stored in the storage module 30.
  • the storage module 30 stores the name of the part and the reference temperature of the part in association with each other. That is, the storage module 30 stores “housing” and “40” in association with each other, “display” and “30” in association with each other, and “keyboard” and “20” in association with each other.
  • the storage module 30 stores this reference temperature database for each type of object.
  • the storage module 30 may store a reference temperature database for each object, not for each object type. In this case, the storage module 30 may acquire a reference temperature for each part of each individual in advance and store the part and the reference temperature in association with each other.
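The reference temperature database of FIG. 10 can be represented as a simple mapping. The part names and values are taken from the figure; that the values are in °C is an assumption based on the temperature ranges used elsewhere in the description.

```python
# The reference temperature database of FIG. 10 as a simple mapping.
# Part names and values are from the figure; degrees C is assumed.
REFERENCE_TEMPERATURES = {
    "housing": 40,
    "display": 30,
    "keyboard": 20,
}

def reference_temperature(part):
    """Step S20: acquire the reference temperature stored for a part."""
    return REFERENCE_TEMPERATURES[part]

print(reference_temperature("display"))  # 30
```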
  • the diagnosis module 43 diagnoses an object based on the temperature of the region in the specified infrared image (step S21).
  • the diagnosis module 43 executes the object diagnosis using any one of, or a combination of, the acquired temperature, the reference temperature, the temperature of another individual different from the one individual, the positional relationship between the first individual and the second individual, and the environment information.
  • the diagnosis module 43 executes object diagnosis based on the acquired temperature.
  • the diagnosis module 43 determines whether or not the temperature of the region in the identified infrared image is an abnormal value, and determines that no failure has occurred if the temperature is not an abnormal value. On the other hand, if the diagnosis module 43 determines that the value is an abnormal value, it determines that a failure has occurred.
  • the diagnosis module 43 executes object diagnosis based on the reference temperature.
  • the diagnosis module 43 compares the temperature of the region in the identified infrared image with the reference temperature acquired from the storage module 30, and calculates the temperature difference between the temperature of the region and the reference temperature.
  • the diagnosis module 43 determines whether or not the calculated temperature difference is within a predetermined range (for example, within 0.5°C, within 1°C, or within 2°C); if it is within the range, the diagnosis module 43 determines that no failure has occurred. On the other hand, the diagnosis module 43 determines that a failure has occurred when the calculated temperature difference is not within the predetermined range.
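A minimal sketch of this reference-temperature diagnosis, using 2°C (one of the example ranges mentioned above) as the predetermined range:

```python
# Sketch of the reference-temperature diagnosis: a failure is reported when
# the temperature difference falls outside a predetermined range (2.0 C
# here, one of the example ranges given in the description).

def diagnose_against_reference(measured, reference, allowed_difference=2.0):
    """Return True if a failure is judged to have occurred."""
    return abs(measured - reference) > allowed_difference

print(diagnose_against_reference(41.0, 40.0))  # within 2 C: no failure (False)
print(diagnose_against_reference(44.5, 40.0))  # outside 2 C: failure (True)
```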
  • the diagnosis module 43 executes object diagnosis based on one individual and the temperature of another individual different from the one individual.
  • the diagnosis module 43 compares the temperature of the region in the infrared image of the acquired one individual with the temperature of the corresponding region of the same part in the infrared image of another individual different from the one individual, and calculates the temperature difference between them.
  • the diagnosis module 43 determines whether or not the calculated temperature difference is within a predetermined range (for example, within 0.5°C, within 1°C, or within 2°C); if it is within the range, the diagnosis module 43 determines that no failure has occurred. On the other hand, the diagnosis module 43 determines that a failure has occurred when the calculated temperature difference is not within the predetermined range.
  • the diagnosis module 43 may also compare the reference temperature described above with the temperature of the one individual or the other individual, and calculate the temperature difference between the one individual and the other individual, thereby determining whether or not a failure has occurred. In other words, the diagnosis module 43 may determine whether a failure has occurred in one or both of the one individual and the other individual based on the reference temperature and the temperature difference.
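The individual-to-individual comparison could be sketched as follows; the part temperatures and the 2°C range are illustrative values, not taken from the embodiment.

```python
# Sketch of diagnosing one individual against another of the same type: the
# same part is compared between the two and a failure is flagged when the
# temperature difference is not within a predetermined range (2.0 C here,
# one of the example values in the description).

def compare_individuals(temps_a, temps_b, allowed_difference=2.0):
    """Compare the temperature of each part between two individuals and
    return the parts whose difference exceeds the allowed range."""
    return [part for part in temps_a
            if part in temps_b
            and abs(temps_a[part] - temps_b[part]) > allowed_difference]

first_individual  = {"housing": 40.1, "display": 30.3, "keyboard": 20.2}
second_individual = {"housing": 45.0, "display": 30.0, "keyboard": 20.0}
print(compare_individuals(first_individual, second_individual))  # ['housing']
```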
  • a case where the diagnosis module 43 executes the diagnosis of an object based on the positional relationship between the first individual and the second individual will be described.
  • the diagnosis module 43 compares the positions of the first individual and the second individual, which is different from the first individual, among the plurality of acquired individuals, and determines which of them is more affected by an air conditioner, illumination, or the like. This specifies whether the temperature of the first individual or the second individual is lower or higher due to the influence of the air conditioner, illumination, or the like.
  • the diagnosis module 43 corrects the temperatures of the first individual and the second individual for the influence of the air conditioner and lighting by acquiring environment information such as air temperature and illuminance.
  • the diagnosis module 43 compares the corrected temperatures of the first individual and the second individual with the reference temperature described above, and calculates the temperature difference between them.
  • the diagnosis module 43 determines whether or not the temperature difference is within a predetermined range (for example, within 0.5°C, within 1°C, or within 2°C); if it is within the range, the diagnosis module 43 determines that no failure has occurred. On the other hand, if the diagnosis module 43 determines that the calculated temperature difference is not within the predetermined range, it determines that a failure has occurred. The diagnosis module 43 may also determine whether or not a failure has occurred without using the reference temperature. For example, it may determine whether or not a failure has occurred based on whether or not the corrected temperatures of the first individual and the second individual reach a predetermined temperature.
  • the diagnosis module 43 executes diagnosis of an object based on environmental information acquired from a sensor.
  • the diagnosis module 43 corrects the acquired individual temperature based on the environmental information. For example, the diagnosis module 43 acquires humidity, air temperature, atmospheric pressure, and the like as environment information, and corrects the acquired individual temperature based on the acquired environment information.
  • the diagnosis module 43 diagnoses an object based on the corrected individual temperature.
  • the diagnosis module 43 may determine whether or not a failure has occurred in the object based on the corrected individual temperature and the reference temperature.
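A sketch of the environment-based correction. The linear correction model, its coefficient, and the reference air temperature are assumptions made for illustration only; the embodiment does not specify a correction formula.

```python
# Sketch of the environment-based correction: the measured temperature is
# adjusted using sensor readings before diagnosis. The linear model and its
# coefficient are assumptions for illustration only.

REFERENCE_AIR_TEMP = 25.0  # hypothetical air temperature the references assume

def correct_temperature(measured, air_temp, coefficient=0.5):
    """Remove the estimated contribution of ambient air temperature."""
    return measured - coefficient * (air_temp - REFERENCE_AIR_TEMP)

# A housing measured at 43.0 C while the room is at 31.0 C: part of the
# excess is attributed to the environment.
print(correct_temperature(43.0, 31.0))  # 40.0
```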
  • the diagnosis module 43 outputs a diagnosis result (step S22).
  • the diagnosis module 43 outputs the content of the failure (for example, the name of the failure, a countermeasure method, etc.) as the diagnosis result.
  • the diagnosis module 43 outputs the diagnosis result of the one individual.
  • the diagnosis module 43 outputs a diagnosis result for each individual.
  • the diagnosis module 43 outputs a diagnosis result together with various information capable of uniquely specifying the individual such as the name, identifier, and position information of each individual.
  • in the above description, the diagnosis module 43 performs the diagnosis of an object using one visible light image data and one infrared image data, but the diagnosis of the object may also be executed based on a plurality of visible light image data and infrared image data acquired within a predetermined period.
  • for example, the diagnosis of the object may be executed based on the change amount, change width, or the change itself of the individual's temperature acquired from each infrared image data.
  • the diagnosis module 43 may also perform the diagnosis of this object based on the average value of the individual's temperature acquired from the plurality of infrared image data. For example, the diagnosis module 43 compares the average value of the individual's temperature with the reference temperature to calculate the temperature difference between them, and may execute the diagnosis of the object based on whether or not this temperature difference is within a predetermined range.
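The averaging variant might look like this; the sample temperatures and the 2°C range are illustrative values.

```python
# Sketch of the multi-image variant: the individual's temperature is averaged
# over several infrared images acquired within a predetermined period and the
# average is compared with the reference temperature (2.0 C range assumed).

def diagnose_average(temps, reference, allowed_difference=2.0):
    """Average the temperatures from a plurality of infrared image data and
    judge a failure from the difference to the reference temperature."""
    average = sum(temps) / len(temps)
    return abs(average - reference) > allowed_difference

housing_temps = [40.2, 40.8, 45.9, 40.3]  # one transient spike
print(diagnose_average(housing_temps, reference=40.0))
```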
  • the diagnosis module 43 determines whether a failure has occurred in the individual based on the output diagnosis result (step S23).
  • in step S23, when the diagnosis module 43 determines that no failure has occurred in the individual (step S23: NO), the process ends. At this time, the diagnosis module 43 may transmit a notification to the effect that no failure has occurred in the individual to an external terminal device (not shown).
  • in step S23, when the diagnosis module 43 determines that a failure has occurred in the individual (step S23: YES), the environment adjustment module 44 creates an adjustment instruction for adjusting the installation environment based on information representing the result of the diagnosis (step S24).
  • step S ⁇ b> 24 the environment adjustment module 44 acquires necessary processing based on the diagnosed failure content based on an adjustment database that stores the failure content and the processing in association with each other.
  • the environment adjustment module 44 creates an adjustment instruction for executing the acquired processing.
  • This adjustment instruction includes necessary processing and information that can uniquely identify the environmental adjustment device such as an identifier or device ID of the environmental adjustment device that executes this processing.
The adjustment instruction transmission module 22 transmits the adjustment instruction created by the environment adjustment module 44 in step S24 to the environment adjustment device (step S25). The adjustment instruction transmission module 22 transmits the adjustment instruction to the target environment adjustment device based on the information included in the instruction that uniquely identifies that device. The environment adjustment device receives the adjustment instruction and executes the processing it contains; for example, it turns lighting on or off, controls humidity and temperature, waters, or sprays chemicals.
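The adjustment-instruction flow of steps S24 and S25 can be sketched as follows; the database contents, failure names, and device IDs are hypothetical examples, not values from the disclosure:

```python
# Hypothetical adjustment database associating failure content with the
# required processing and the device that executes it (step S24's lookup).
ADJUSTMENT_DB = {
    "overheating": {"processing": "raise cooling output", "device_id": "aircon-01"},
    "low humidity": {"processing": "start watering", "device_id": "sprinkler-02"},
}

def create_adjustment_instruction(failure_content):
    """Step S24: look up the processing needed for the diagnosed failure and
    package it with information uniquely identifying the target device."""
    entry = ADJUSTMENT_DB[failure_content]
    return {"device_id": entry["device_id"], "processing": entry["processing"]}

def transmit_adjustment_instruction(instruction, send):
    """Step S25: route the instruction to the device named in it. `send` is a
    stand-in for the actual communication with the environment adjustment
    device, which the disclosure leaves unspecified."""
    send(instruction["device_id"], instruction["processing"])
```

Keeping the lookup table outside the code mirrors the disclosure's adjustment database, so new failure contents can be handled without changing the flow.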
The above is the object diagnosis process.

In the embodiment described above, the object is a computer, but the object is not limited to this example and may be another article. For example, the object may be an electronic product such as a netbook terminal, a slate terminal, an electronic book terminal, or a portable music player; a wearable terminal such as smart glasses or a head mounted display; an IoT (Internet of Things) device such as any of various sensors or a robot; or factory equipment such as factory pipes, piping equipment, drainage equipment, power receiving/transforming equipment, power transmission equipment, pump equipment, fire fighting equipment, boiler equipment, high-pressure gas equipment, or high-pressure air equipment. It may also be a moving body such as a vehicle, an airplane, a ship, or a bus, or a building such as a house, a hospital, a clinic, a station, an airport, an office building, a government office, a police station, a fire station, a police box, a stadium, a hotel, a warehouse, a school, a public toilet, a store, or a restaurant.
The means and functions described above are realized by a computer (including a CPU, an information processing apparatus, and various terminals) reading and executing a predetermined program. The program is provided, for example, in a form in which it is supplied from a computer via a network (SaaS: Software as a Service), or in a form recorded on a computer-readable recording medium such as a flexible disk, a CD (CD-ROM or the like), or a DVD (DVD-ROM, DVD-RAM, or the like). In the latter case, the computer reads the program from the recording medium, transfers it to an internal or external storage device, stores it, and executes it. The program may also be recorded in advance in a storage device (recording medium) such as a magnetic disk, an optical disk, or a magneto-optical disk, and provided from the storage device to the computer via a communication line.

Abstract

[Problem] To provide: a computer system which improves the measurement accuracy of the temperature of each region of an object; an object diagnosis method; and a program. [Solution] This computer system acquires a visible light image and an infrared image which are captured by a camera, identifies an area, in the visible light image, which corresponds to a prescribed region of an object imaged by the camera, identifies an area, in the infrared image, which corresponds to the identified area in the visible light image, and diagnoses the object on the basis of the temperature of the identified area in the infrared image.

Description

Computer system, object diagnosis method, and program

The present invention relates to a computer system that diagnoses an object by imaging it, an object diagnosis method, and a program.
In recent years, computer systems have been proposed that are capable of diagnosing faults in an object by capturing an image of the object.
In such computer systems, a configuration has been disclosed that diagnoses an object by measuring its temperature using a spectrometer or infrared thermography (see Non-Patent Document 1).
However, in the configuration of Non-Patent Document 1, a temperature difference exists for each part of the object, so accurately grasping the state of the object requires accurately measuring the temperature of each part to be measured. Although the outline and approximate parts of an object can be identified from a thermographic image, it is difficult to specify the exact position of each part because of the distance from the camera to the subject and other factors. As a result, there was a limit to improving the accuracy of the temperature measured for each part.
It is an object of the present invention to provide a computer system, an object diagnosis method, and a program that improve the accuracy of measuring the temperature of each part of a target.
The present invention provides the following solutions.
The present invention provides a computer system comprising:
first acquisition means for acquiring a visible light image and an infrared image captured by a camera;
first image processing means for specifying, in the visible light image, a region corresponding to a predetermined part of an object imaged by the camera;
second image processing means for specifying a region in the infrared image corresponding to the specified region in the visible light image; and
diagnosis means for diagnosing the object based on the temperature of the specified region in the infrared image.
According to the present invention, the computer system acquires a visible light image and an infrared image captured by a camera, specifies, in the visible light image, a region corresponding to a predetermined part of an object imaged by the camera, specifies the region in the infrared image corresponding to the specified region in the visible light image, and diagnoses the object based on the temperature of the specified region in the infrared image.
The present invention belongs to the category of computer systems, but in other categories, such as object diagnosis methods and programs, it exhibits the same actions and effects corresponding to each category.
According to the present invention, it is possible to provide a computer system, an object diagnosis method, and a program that improve the accuracy of measuring the temperature of each part of a target.
FIG. 1 is a diagram showing an outline of the object diagnosis system 1.
FIG. 2 is an overall configuration diagram of the object diagnosis system 1.
FIG. 3 is a functional block diagram of the computer 10.
FIG. 4 is a flowchart showing the object diagnosis process executed by the computer 10.
FIG. 5 is a flowchart showing the object diagnosis process executed by the computer 10.
FIG. 6 is a diagram schematically showing an example of visible light image data acquired by the computer 10.
FIG. 7 is a diagram schematically showing an example of infrared image data acquired by the computer 10.
FIG. 8 is a diagram schematically showing an example of a state in which the computer 10 has specified a predetermined part in a visible light image.
FIG. 9 is a diagram schematically showing an example of a state in which the computer 10 has specified a region in an infrared image.
FIG. 10 is a diagram showing an example of a reference temperature database stored in the computer 10.
Hereinafter, the best mode for carrying out the present invention will be described with reference to the drawings. Note that this is merely an example, and the technical scope of the present invention is not limited to it.
[Outline of Object Diagnosis System 1]

The outline of the present invention will be described with reference to FIG. 1. FIG. 1 is a diagram for explaining the outline of an object diagnosis system 1 that is a preferred embodiment of the present invention. The object diagnosis system 1 is composed of a computer 10; it acquires an image capturing an object such as an IoT device (a computer, a terminal, a sensor, a robot, or the like), factory equipment (factory pipes, piping, or the like), a moving body (a car, an airplane, a bus, or the like), or a building (an office building, a house, a factory itself, or the like), and diagnoses the object. In the following description, the object is assumed to be a computer.
The computer 10 is a computing device communicably connected to a visible light camera, an infrared camera, sensors, environment adjustment devices, and the like (none shown). The object diagnosis system 1 acquires a visible light image from the visible light camera and an infrared image from the infrared camera, acquires from the sensors environment information about the installation location of the object, such as illuminance, wind direction, wind speed, temperature, air temperature, humidity, and atmospheric pressure, and transmits instructions for adjusting this environment information to environment adjustment devices such as lighting devices (various lights and the like), air conditioning devices (blowers and the like), and watering devices.
First, the computer 10 acquires a visible light image and an infrared image captured by cameras (not shown) (step S01). The computer 10 acquires a visible light image, such as a moving image or still image of the object, captured by the visible light camera, and likewise acquires an infrared image, such as a moving image or still image of the object, captured by the infrared camera. The visible light camera and the infrared camera are installed side by side or close to each other, and both image the same object. That is, the visible light camera and the infrared camera image the same object from substantially the same imaging point.
The computer 10 specifies, in the visible light image, a visible light image region, which is a region corresponding to a predetermined part of the object imaged by the camera (step S02). As the predetermined part of the object, the computer 10 specifies, for example, a part of the object such as a housing, a display, or a keyboard, or a part set in advance. For example, the computer 10 specifies the region corresponding to the predetermined part in the visible light image by image analysis: it extracts feature amounts present in the visible light image and specifies the predetermined part based on those feature amounts, or it extracts the colors of the visible light image and specifies the predetermined part based on those colors.
The computer 10 specifies, in the infrared image, an infrared image region, which is the region in the infrared image corresponding to the visible light image region described above (step S03). By comparing the visible light image with the infrared image, the computer 10 specifies the region of the infrared image that matches the visible light image region as the infrared image region. That is, the computer 10 specifies the region of the infrared image located at the same position as the visible light image region.
The computer 10 diagnoses the object based on the temperature of the infrared image region (step S04). For example, the computer 10 diagnoses whether a failure has occurred in the object by comparing the temperature of the infrared image region with the temperature of the object at the time a failure occurs.
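Assuming the visible light and infrared images are aligned pixel for pixel (an idealization of the "substantially the same imaging point" condition above), steps S02 to S04 can be sketched as follows; the region format, margin, and function name are illustrative assumptions:

```python
def diagnose_part(visible_region, infrared_image, failure_temp, margin=2.0):
    """Given a region already located in the visible light image (step S02),
    read the same coordinates out of the infrared image (step S03) and compare
    the mean temperature there with the failure temperature (step S04).

    visible_region: (x, y, width, height) in image coordinates.
    infrared_image: 2D list of per-pixel temperatures, indexed [row][col].
    """
    x, y, w, h = visible_region
    temps = [infrared_image[row][col]
             for row in range(y, y + h)
             for col in range(x, x + w)]
    mean_temp = sum(temps) / len(temps)
    # The margin is an assumed tolerance; the disclosure only says the
    # measured temperature is compared with the failure-time temperature.
    return mean_temp >= failure_temp - margin
```

The point of locating the region in the visible light image first is visible here: the temperature average is taken only over pixels known to belong to the part, not over the whole thermal silhouette.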
The above is the outline of the object diagnosis system 1.
[System Configuration of Object Diagnosis System 1]

The system configuration of the object diagnosis system 1 will be described with reference to FIG. 2. FIG. 2 is a diagram showing the system configuration of the object diagnosis system 1, a preferred embodiment of the present invention. The object diagnosis system 1 is composed of the computer 10 and a public network 5 (the Internet, a third- or fourth-generation communication network, or the like); it acquires an image capturing an object and diagnoses the object.
The object diagnosis system 1 is connected, so as to be capable of data communication, to cameras such as a visible light camera that captures a visible light image of the object and an infrared camera that captures an infrared image of the object; to various sensors that detect environment information indicating the installation environment of the object, such as illuminance, wind direction, wind speed, temperature, air temperature, humidity, and atmospheric pressure; and to environment adjustment devices that adjust the installation environment of the object, such as lighting devices (various lights and the like), air conditioning devices (blowers, heaters and coolers, and the like), and watering devices. The computer 10 acquires various information from these devices and transmits various instructions to them.
The computer 10 is the computing device described above, provided with the functions described below.
[Description of Each Function]

The functions of the object diagnosis system 1 will be described with reference to FIG. 3. FIG. 3 is a functional block diagram of the computer 10.
The computer 10 includes, as a control unit 11, a CPU (Central Processing Unit), a RAM (Random Access Memory), a ROM (Read Only Memory), and the like; as a communication unit 12, a device for enabling communication with other equipment (the cameras, various sensors, environment adjustment devices, and the like), for example a WiFi (Wireless Fidelity) device compliant with IEEE 802.11; as a storage unit 13, a data storage section implemented by a hard disk, semiconductor memory, recording medium, memory card, or the like; and as a processing unit 14, devices for executing various kinds of processing such as image processing and failure diagnosis.
In the computer 10, the control unit 11 reads a predetermined program and, in cooperation with the communication unit 12, realizes an image data acquisition module 20, an environment information acquisition module 21, and an adjustment instruction transmission module 22. Similarly, in cooperation with the storage unit 13, the control unit 11 realizes a storage module 30, and in cooperation with the processing unit 14 it realizes a visible light image analysis module 40, an infrared image analysis module 41, a temperature analysis module 42, a diagnosis module 43, and an environment adjustment module 44.
[Object Diagnosis Process]

The object diagnosis process executed by the object diagnosis system 1 will be described with reference to FIGS. 4 and 5. FIGS. 4 and 5 are flowcharts of the object diagnosis process executed by the computer 10. The processing executed by each of the modules described above is explained together with this process.
First, the image data acquisition module 20 acquires image data of a visible light image and an infrared image of the object (step S10). In step S10, the image data acquisition module 20 acquires visible light image data, which is a visible light image captured by the visible light camera, and infrared image data, which is an infrared image captured by the infrared camera. The image data acquisition module 20 acquires visible light image data and infrared image data at a plurality of points in time, for example at predetermined time intervals or at preset times. The visible light image data and infrared image data acquired by the image data acquisition module 20 are captured from the same imaging viewpoint and show the same object. In the following description, the computer 10 is assumed to diagnose the object based on the visible light image data and infrared image data at one given point in time.
The visible light image data of the object acquired by the image data acquisition module 20 will be described with reference to FIG. 6. FIG. 6 is a diagram schematically showing an example of the visible light image data acquired by the image data acquisition module 20. The image data acquisition module 20 acquires a visible light image 100 indicated by the visible light image data. An object 110 appears in the visible light image 100. The visible light image 100 may also contain scenery, natural objects, artificial objects, and the like other than the object 110, but these are omitted for simplicity of explanation. The visible light image 100 may also contain a plurality of objects 110 or objects different from the object 110.
The infrared image data of the object acquired by the image data acquisition module 20 will be described with reference to FIG. 7. FIG. 7 is a diagram schematically showing an example of the infrared image data acquired by the image data acquisition module 20. The image data acquisition module 20 acquires an infrared image 200 indicated by the infrared image data. An object 210 appears in the infrared image 200. The infrared image 200 may also contain scenery, natural objects, artificial objects, and the like other than the object 210, but these are omitted for simplicity of explanation. In the infrared image 200, the temperatures are indicated by hatching for convenience. The infrared image 200 may also contain a plurality of objects 210 or objects different from the object 210.
The environment information acquisition module 21 acquires environment information indicating the installation environment of the object (step S11). In step S11, the environment information acquisition module 21 acquires, as environment information, illuminance, wind direction, wind speed, temperature, air temperature, humidity, atmospheric pressure, and the like from various sensors (not shown), such as an illuminance sensor, a wind direction and speed sensor, a temperature sensor, a humidity sensor, and a pressure sensor. The environment information acquisition module 21 acquires the environment information at the same timing as the visible light image data and infrared image data are acquired. These sensors are installed near the object or near the place where the object is installed.
The various sensors may be sensors that detect environment information other than the examples described above. Their installation positions are also not limited to the example above and may be changed as appropriate to any position where the installation environment of the object can be detected. The processing of step S11 may also be omitted; in that case, the processing of step S12 described below is executed after the processing of step S10 described above.
The visible light image analysis module 40 analyzes the acquired visible light image data (step S12). In step S12, the visible light image analysis module 40 extracts feature amounts and colors from the visible light image data and identifies objects present in it. For example, the visible light image analysis module 40 compares the feature amounts extracted from the visible light image data with the feature amounts of objects stored in advance in the storage module 30, extracts any object having matching feature amounts, and identifies the extracted object as being present in the visible light image data. Alternatively, for example, the visible light image analysis module 40 compares the RGB values extracted from the visible light image data with the RGB values of objects stored in advance in the storage module 30, extracts any object having matching or similar RGB values, and identifies the extracted object as being present in the visible light image data.
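A minimal sketch of the RGB-value comparison in step S12; the stored reference colors, tolerance, and names are assumptions for illustration, and a real implementation would use richer feature amounts than a single color:

```python
# Hypothetical per-object reference colors, standing in for the RGB values
# the storage module 30 stores in advance.
STORED_RGB = {
    "computer": (60, 60, 65),
    "robot": (200, 30, 30),
}

def identify_by_rgb(extracted_rgb, tolerance=20):
    """Return the names of stored objects whose RGB values match, or are
    similar to within the tolerance, the value extracted from the visible
    light image data."""
    matches = []
    er, eg, eb = extracted_rgb
    for name, (r, g, b) in STORED_RGB.items():
        if (abs(r - er) <= tolerance and abs(g - eg) <= tolerance
                and abs(b - eb) <= tolerance):
            matches.append(name)
    return matches
```

The feature-amount path described in the same step would follow the identical compare-against-stored-values structure, with feature vectors and a similarity threshold in place of RGB triples and a per-channel tolerance.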
As a result of the image analysis, the visible light image analysis module 40 determines whether a plurality of individuals are present in the visible light image data (step S13). In step S13, the visible light image analysis module 40 makes this determination by judging whether a plurality of objects are present in the visible light image data, for example whether a plurality of individuals of one object are present or whether individuals of a plurality of types of objects are present.
When the visible light image analysis module 40 determines in step S13 that a plurality of individuals are not present (NO in step S13), that is, that only one individual is present in the visible light image data, the visible light image analysis module 40 specifies the regions corresponding to a plurality of predetermined parts of that individual (step S14). In step S14, a predetermined part is, for example, a part of the object such as the housing, the display, or the keyboard, or a part set in advance. The visible light image analysis module 40 specifies, for example, the regions corresponding to the housing, the display, and the keyboard: it extracts the housing, display, and keyboard present in the visible light image data from the feature amounts, and specifies the extracted locations as the regions corresponding to the predetermined parts. In this way the visible light image analysis module 40 specifies a plurality of regions, one for each of the plurality of predetermined parts. Alternatively, the visible light image analysis module 40 extracts the housing, display, and keyboard present in the visible light image data from the RGB values and specifies the extracted locations as the regions corresponding to the predetermined parts.
The regions corresponding to predetermined parts specified by the visible light image analysis module 40 will be described with reference to FIG. 8. FIG. 8 is a diagram schematically showing an example of a state in which the visible light image analysis module 40 has specified predetermined parts. In FIG. 8, the visible light image analysis module 40 specifies, based on feature amounts and colors, the regions of the visible light image 100 where predetermined parts such as the housing, display, and keyboard are located. That is, the visible light image analysis module 40 specifies the regions of the object 110 corresponding to housings 300 to 302, displays 310 to 312, and keyboards 320 to 322. In FIG. 8, the specified regions are indicated by hatching for convenience. Each region here points to a part of the corresponding part, but it may instead point to the entire part. The number, types, and positions of the parts to be specified can be changed as appropriate.
On the other hand, when the visible light image analysis module 40 determines in step S13 that a plurality of individuals are present (YES in step S13), that is, that a plurality of individuals such as a first individual, a second individual, and a third individual are present in the visible light image data, the visible light image analysis module 40 identifies each of the plurality of individuals (step S15). In the following description, the first individual and the second individual are assumed to be present in the visible light image data.
The visible light image analysis module 40 specifies the positional relationship of each of the plurality of individuals (step S16). In step S16, the visible light image analysis module 40 specifies the positional relationship between the first individual and the second individual based on their positions in the visible light image, specifying their relative or absolute positions. The positional relationship is, for example, which individual is closer to the imaging point, or the coordinates of each individual in the visible light image. The processing of step S16 is not limited to the positional relationship between the first individual and the second individual and may cover the positional relationships with other individuals.
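One way the positional relationship of step S16 could be expressed; the disclosure does not state how closeness to the imaging point is judged, so the larger-apparent-size heuristic below is purely an assumption, as are the bounding-box format and names:

```python
def positional_relationship(box_first, box_second):
    """Compare two individuals by their bounding boxes (x, y, w, h) in the
    visible light image: report each one's centroid coordinates and, as a
    simple assumed heuristic, treat the larger apparent area as closer to
    the imaging point."""
    def centroid(box):
        x, y, w, h = box
        return (x + w / 2, y + h / 2)

    def area(box):
        return box[2] * box[3]

    closer = "first" if area(box_first) > area(box_second) else "second"
    return {
        "first_centroid": centroid(box_first),
        "second_centroid": centroid(box_second),
        "closer_to_camera": closer,
    }
```

The returned coordinates serve as the "coordinates in the visible light image" form of the positional relationship, and the `closer_to_camera` field as the "which is closer to the imaging point" form.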
The visible light image analysis module 40 specifies, for each of the plurality of individuals, the regions corresponding to the predetermined parts (step S17). The processing of step S17 applies the processing of step S14 described above to each object present in the visible light image data.
 The infrared image analysis module 41 identifies the region in the infrared image corresponding to the identified region in the visible light image (step S18). In step S18, the infrared image analysis module 41 compares the visible light image data with the infrared image data to identify the region of the infrared image data corresponding to the identified region of the object's part. The infrared image analysis module 41 acquires the position of the region in the visible light image as coordinates and identifies the position matching the acquired coordinates as the region in the infrared image corresponding to the identified region in the visible light image.
 The region in the infrared image corresponding to the region in the visible light image identified by the infrared image analysis module 41 will be described with reference to FIG. 9. FIG. 9 is a diagram schematically showing an example of a state in which the infrared image analysis module 41 has identified regions in the infrared image. In FIG. 9, regions in the infrared image 200 corresponding to the parts of the casings 300 to 302, displays 310 to 312, and keyboards 320 to 322 of the object 110 identified in the visible light image 100 described above are identified. These are identified by comparing positions in the visible light image 100 with positions in the infrared image 200. The infrared image analysis module 41 acquires the position coordinates of each part in the visible light image 100 and identifies the positions in the infrared image corresponding to the acquired position coordinates as the regions in the infrared image corresponding to the regions in the visible light image. The infrared image analysis module 41 thus identifies the parts of the casings 400 to 402, displays 410 to 412, and keyboards 420 to 422 of the object 210. In FIG. 9, the identified regions are indicated by hatching for convenience. Each region refers to a part or the whole of the part identified in the visible light image described above. Note that the number of parts to be identified and their positions can be changed as appropriate in accordance with the visible light image.
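The coordinate matching of step S18 can be sketched as follows. This is an illustrative sketch only (the publication specifies no implementation, and all function names here are hypothetical); it assumes the visible light image and the infrared image are pixel-aligned views of the same scene, possibly at different resolutions:

```python
# Hypothetical sketch: map a region identified in the visible light image to
# the corresponding region in the infrared image by reusing its coordinates,
# scaled when the two images differ in resolution.

def map_region_to_infrared(visible_region, visible_size, infrared_size):
    """Scale an (x, y, w, h) region from visible-light to infrared coordinates."""
    vx, vy = visible_size
    ix, iy = infrared_size
    x, y, w, h = visible_region
    sx, sy = ix / vx, iy / vy  # per-axis scale factors
    return (round(x * sx), round(y * sy), round(w * sx), round(h * sy))
```

When the two images have identical dimensions, as step S18 assumes, the mapped region is simply the same coordinates.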
 The temperature analysis module 42 analyzes the temperature of the identified region in the infrared image data (step S19). In step S19, the temperature analysis module 42 acquires the temperature of each region based on the infrared image data.
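As an illustrative sketch of step S19 (hypothetical names, not part of the publication), the infrared image data can be modeled as a two-dimensional grid of per-pixel temperatures, and a region temperature derived from it:

```python
# Hypothetical sketch: derive the temperature of an (x, y, w, h) region from
# infrared image data, modeled here as a 2-D grid of per-pixel temperatures
# in degrees Celsius.

def region_temperature(thermal_grid, region):
    """Return the mean temperature over the given region."""
    x, y, w, h = region
    values = [thermal_grid[row][col]
              for row in range(y, y + h)
              for col in range(x, x + w)]
    return sum(values) / len(values)
```

The mean is one plausible choice; a maximum over the region could equally serve where hot spots matter.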
 The temperature analysis module 42 acquires a plurality of reference temperatures, each corresponding to one of a plurality of parts of the object, from the reference temperature database stored in the storage module 30 (step S20). In step S20, the storage module 30 has stored in advance a plurality of reference temperatures corresponding to the respective parts, and the temperature analysis module 42 acquires these stored reference temperatures. At this time, the reference temperature of the part corresponding to the identified region in the infrared image data is acquired.
 [Reference temperature database]
 The reference temperature database stored in the storage module 30 will be described with reference to FIG. 10. FIG. 10 is a diagram showing an example of the reference temperature database stored in the storage module 30. In FIG. 10, the storage module 30 stores the name of each part in association with the reference temperature of that part. That is, the storage module 30 stores "housing" in association with "40", "display" in association with "30", and "keyboard" in association with "20". The storage module 30 stores this reference temperature database for each type of object. Note that the storage module 30 may store a reference temperature database for each individual object rather than for each type of object. In this case, the storage module 30 may acquire a reference temperature for each part of each individual in advance and store each part in association with its reference temperature.
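The per-type database of FIG. 10 can be sketched as a nested mapping. This is an illustrative assumption (the publication does not prescribe a storage format), using the example values from FIG. 10:

```python
# Hypothetical sketch of the reference temperature database of FIG. 10:
# object type -> part name -> reference temperature in degrees Celsius.

REFERENCE_TEMPERATURES = {
    "computer": {"housing": 40, "display": 30, "keyboard": 20},
}

def get_reference_temperature(object_type, part):
    """Look up the reference temperature for a part of a given object type."""
    return REFERENCE_TEMPERATURES[object_type][part]
```

A per-individual database would simply key on an individual identifier instead of the object type.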
 The diagnosis module 43 diagnoses the object based on the temperature of the identified region in the infrared image (step S21). In step S21, the diagnosis module 43 diagnoses the object using any one or a combination of the acquired temperature, the reference temperature, the temperature of another individual different from the one individual, the positional relationship between the first individual and the second individual, and the environmental information.
 The case where the diagnosis module 43 diagnoses the object using the acquired temperature will be described. The diagnosis module 43 determines whether the temperature of the identified region in the infrared image is an abnormal value; if it is not, the diagnosis module 43 determines that no failure has occurred. On the other hand, if the diagnosis module 43 determines that the value is abnormal, it determines that a failure has occurred.
 The case where the diagnosis module 43 diagnoses the object using the reference temperature will be described. The diagnosis module 43 compares the temperature of the identified region in the infrared image with the reference temperature acquired from the storage module 30 and calculates the temperature difference between the region temperature and the reference temperature. The diagnosis module 43 determines whether the calculated temperature difference is within a predetermined range (for example, within 0.5°C, within 1°C, within 2°C, etc.); if it is within the predetermined range, the diagnosis module 43 determines that no failure has occurred. On the other hand, if the calculated temperature difference is not within the predetermined range, the diagnosis module 43 determines that a failure has occurred.
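The reference-temperature check above reduces to a tolerance comparison. A minimal sketch (hypothetical names; the tolerance values are the examples given in the text):

```python
# Hypothetical sketch of the reference-temperature diagnosis: a failure is
# suspected when the measured region temperature deviates from the reference
# by more than a configurable tolerance (e.g. 0.5, 1 or 2 degrees C).

def diagnose_against_reference(measured, reference, tolerance=1.0):
    """Return True when a failure is suspected, False otherwise."""
    return abs(measured - reference) > tolerance
```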
 The case where the diagnosis module 43 diagnoses the object based on one individual and the temperature of another individual different from that one individual will be described. The diagnosis module 43 compares the acquired temperature of the region in the infrared image of the one individual with the temperature of the corresponding region of the same part in the infrared image of the other individual, and calculates the difference between these temperatures. The diagnosis module 43 determines whether the calculated temperature difference is within a predetermined range (for example, within 0.5°C, within 1°C, within 2°C, etc.); if it is within the predetermined range, the diagnosis module 43 determines that no failure has occurred. On the other hand, if the calculated temperature difference is not within the predetermined range, the diagnosis module 43 determines that a failure has occurred. In this case, the diagnosis module 43 may determine whether a failure has occurred by comparing the one individual or the other individual with the reference temperature described above, in addition to calculating the temperature difference between the one individual and the other individual. That is, the diagnosis module 43 may determine whether a failure has occurred in either or both of the one individual and the other individual based on the reference temperature and the temperature difference.
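The cross-individual comparison can be sketched as follows. This is an illustrative sketch under stated assumptions (hypothetical names; the publication does not specify how the suspect individual is chosen, so this sketch picks the one farther from the reference temperature when one is available):

```python
# Hypothetical sketch: compare the temperature of the same part on two
# individuals; a large difference suggests a failure in one of them, and the
# reference temperature, when supplied, narrows down which one is suspect.

def diagnose_pair(temp_a, temp_b, reference=None, tolerance=1.0):
    """Return whether a failure is suspected and, if possible, in which individual."""
    if abs(temp_a - temp_b) <= tolerance:
        return {"failure": False, "suspect": None}
    if reference is None:
        return {"failure": True, "suspect": None}  # divergence detected, cause unknown
    # Assumption: the individual farther from the reference is the suspect.
    suspect = "first" if abs(temp_a - reference) > abs(temp_b - reference) else "second"
    return {"failure": True, "suspect": suspect}
```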
 The case where the diagnosis module 43 diagnoses the object based on the positional relationship between the first individual and the second individual will be described. The diagnosis module 43 compares the positional relationship between the position of the first individual and that of the second individual, different from the first, among the acquired plurality of individuals, and determines which is more strongly affected by an air conditioner, lighting, or the like. This identifies whether the temperature of the first individual or of the second individual has become lower or higher due to the influence of an air conditioner, lighting, or the like. The diagnosis module 43 corrects the temperatures of the first and second individuals for the influence of the air conditioner, lighting, and the like by acquiring environmental information such as air temperature and illuminance. The diagnosis module 43 compares the corrected temperatures of the first and second individuals with the reference temperature described above and calculates the respective temperature differences. The diagnosis module 43 determines whether each temperature difference is within a predetermined range (for example, within 0.5°C, within 1°C, within 2°C, etc.); if it is within the predetermined range, the diagnosis module 43 determines that no failure has occurred. On the other hand, if the diagnosis module 43 determines that a calculated temperature difference is not within the predetermined range, it determines that a failure has occurred. Note that the diagnosis module 43 may determine whether a failure has occurred without using the reference temperature. For example, it may determine whether a failure has occurred based on whether the corrected temperatures of the first and second individuals are at a predetermined temperature.
 The case where the diagnosis module 43 diagnoses the object based on environmental information acquired from a sensor will be described. The diagnosis module 43 corrects the acquired temperature of the individual based on the environmental information. For example, the diagnosis module 43 acquires humidity, air temperature, atmospheric pressure, and the like as environmental information and corrects the acquired temperature of the individual using the acquired environmental information. The diagnosis module 43 diagnoses the object based on the corrected temperature of the individual. Note that the diagnosis module 43 may determine whether a failure has occurred in the object based on the corrected temperature of the individual and the reference temperature.
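One way to realize such a correction is a simple linear compensation on ambient temperature. The model and its coefficient below are illustrative assumptions only (the publication does not specify a correction formula, and all names are hypothetical):

```python
# Hypothetical sketch of an environment correction: adjust a measured
# temperature using the ambient temperature before comparing it against a
# reference. A hotter room is assumed to inflate the reading linearly.

def correct_for_environment(measured, ambient, reference_ambient=25.0, k=0.5):
    """Compensate a reading for deviation of the ambient from a reference ambient."""
    return measured - k * (ambient - reference_ambient)
```

Corrections for humidity or atmospheric pressure would add further terms of the same form, with coefficients calibrated per installation.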
 The diagnosis module 43 outputs a diagnosis result (step S22). In step S22, the diagnosis module 43 outputs, as the diagnosis result, the content of the failure (for example, the name of the failure, countermeasure methods, etc.). When only one individual exists in the visible light image data and the infrared image data, the diagnosis module 43 outputs the diagnosis result for that one individual. When a plurality of individuals exist in the visible light image data and the infrared image data, the diagnosis module 43 outputs a diagnosis result for each individual. At this time, the diagnosis module 43 outputs each diagnosis result together with various information that can uniquely identify the individual, such as its name, identifier, and position information.
 Note that, in the above description, the diagnosis module 43 diagnoses the object using one piece of visible light image data and one piece of infrared image data, but it may diagnose the object based on a plurality of pieces of visible light image data and infrared image data acquired within a predetermined period. In this case, the diagnosis of the object may be executed based on the amount of change, the range of change, or the change itself in the temperature of the individual acquired from each piece of infrared image data. The diagnosis module 43 may also diagnose the object using the average of the temperatures of the individual acquired from the plurality of pieces of infrared image data. For example, the diagnosis module 43 may compare the average temperature of the individual with the reference temperature to calculate the temperature difference between the average and the reference temperature, and diagnose the object based on whether this temperature difference is within a predetermined range.
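The averaged multi-frame variant described above can be sketched as follows (hypothetical names, illustrative only):

```python
# Hypothetical sketch of the multi-frame diagnosis: average the region
# temperature over several infrared captures taken within a predetermined
# period, then compare the average against the reference temperature.

def diagnose_over_period(temperatures, reference, tolerance=1.0):
    """Return True when the period-averaged temperature is out of tolerance."""
    average = sum(temperatures) / len(temperatures)
    return abs(average - reference) > tolerance
```

Diagnosing on the amount or range of change instead would replace the average with, for example, `max(temperatures) - min(temperatures)`.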
 The diagnosis module 43 determines, based on the output diagnosis result, whether a failure has occurred in the individual (step S23).
 If the diagnosis module 43 determines in step S23 that no failure has occurred in the individual (NO in step S23), the process ends. At this time, the diagnosis module 43 may transmit a notification to an external terminal device (not shown) to the effect that no failure has occurred in the individual.
 If the diagnosis module 43 determines in step S23 that a failure has occurred in the individual (YES in step S23), the environment adjustment module 44 creates an adjustment instruction for adjusting the installation environment based on information representing the result of the diagnosis (step S24). In step S24, the environment adjustment module 44 acquires the processing necessary for the diagnosed failure content from an adjustment database or the like that stores failure contents in association with processing. The environment adjustment module 44 creates an adjustment instruction for executing the acquired processing. This adjustment instruction includes the necessary processing and information that can uniquely identify the environment adjustment device that executes this processing, such as its identifier or device ID.
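The adjustment database lookup of step S24 can be sketched as follows. All entries and device identifiers are hypothetical examples (the publication names no concrete failures or devices):

```python
# Hypothetical sketch of step S24: look up the corrective processing for a
# diagnosed failure in an adjustment database and build an adjustment
# instruction addressed to a specific environment adjustment device.

ADJUSTMENT_DATABASE = {
    "overheating": {"action": "lower_room_temperature", "device_id": "hvac-01"},
    "overcooling": {"action": "raise_room_temperature", "device_id": "hvac-01"},
}

def create_adjustment_instruction(failure_name):
    """Return the instruction payload sent to the environment adjustment device."""
    entry = ADJUSTMENT_DATABASE[failure_name]
    return {"device_id": entry["device_id"], "action": entry["action"]}
```

The `device_id` field corresponds to the information in the text that uniquely identifies the environment adjustment device; transmission (step S25) would route the instruction by this field.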
 The adjustment instruction transmission module 22 transmits the adjustment instruction created by the environment adjustment module 44 in step S24 described above to the environment adjustment device (step S25). The adjustment instruction transmission module 22 transmits the adjustment instruction to the target environment adjustment device based on the information, included in the instruction, that can uniquely identify that device.
 The environment adjustment device receives this adjustment instruction and executes the necessary processing included in it. For example, the environment adjustment device turns lighting on or off, performs humidity or temperature control, performs watering, sprays chemicals, and so on.
 The above is the object diagnosis process.
 In the above-described embodiment, the object is described as being a computer, but the object is not limited to this example and may be another article. For example, in addition to a mobile phone, personal digital assistant, or tablet terminal, it may be an electrical appliance such as a netbook terminal, slate terminal, electronic book terminal, or portable music player; a wearable terminal such as smart glasses or a head-mounted display; or an IoT (Internet of Things) device such as a sensor or robot. It may also be factory equipment such as factory pipes, piping equipment, drainage equipment, power receiving and transforming equipment, power transmission equipment, pump equipment, firefighting equipment, boiler equipment, high-pressure gas equipment, or high-pressure air equipment. Furthermore, it may be a moving body such as a vehicle, airplane, ship, or bus, or a building such as a house, hospital, clinic, station, airport, office building, government office, police station, fire station, police box, stadium, ballpark, hotel, warehouse, school, public toilet, store, or restaurant.
 The means and functions described above are realized by a computer (including a CPU, an information processing apparatus, and various terminals) reading and executing a predetermined program. The program is provided, for example, in a form supplied from a computer via a network (SaaS: Software as a Service). The program may also be provided in a form recorded on a computer-readable recording medium such as a flexible disk, a CD (CD-ROM, etc.), or a DVD (DVD-ROM, DVD-RAM, etc.). In this case, the computer reads the program from the recording medium, transfers it to an internal or external storage device, stores it, and executes it. The program may also be recorded in advance on a storage device (recording medium) such as a magnetic disk, optical disk, or magneto-optical disk, and provided from the storage device to the computer via a communication line.
 Although embodiments of the present invention have been described above, the present invention is not limited to these embodiments. The effects described in the embodiments of the present invention merely list the most preferable effects arising from the present invention, and the effects of the present invention are not limited to those described in the embodiments.
 1 Object diagnosis system, 10 Computer

Claims (11)

  1.  A computer system comprising:
      first acquisition means for acquiring a visible light image and an infrared image captured by a camera;
      first image processing means for identifying, in the visible light image, a region corresponding to a predetermined part of an object imaged by the camera;
      second image processing means for identifying a region in the infrared image corresponding to the identified region in the visible light image; and
      diagnosis means for diagnosing the object based on the temperature of the identified region in the infrared image.
  2.  The computer system according to claim 1, further comprising:
      storage means for storing a plurality of reference temperatures, each corresponding to one of a plurality of locations of the object,
      wherein the first image processing means identifies a plurality of regions, each corresponding to one of a plurality of predetermined parts, and
      the diagnosis means compares the temperatures in the plurality of regions identified by the second image processing means with the respective reference temperatures.
  3.  The computer system according to claim 1, wherein
      the first image processing means identifies each of a plurality of individuals imaged by the camera, and
      the diagnosis means outputs a diagnosis result for each individual.
  4.  The computer system according to claim 3, wherein the diagnosis means uses the temperature of another individual when diagnosing one of the plurality of individuals.
  5.  The computer system according to claim 3, wherein
      the first image processing means identifies a positional relationship between a first individual and a second individual, and
      the diagnosis means uses the positional relationship.
  6.  The computer system according to claim 1, further comprising:
      second acquisition means for acquiring environmental information indicating an installation environment of the object,
      wherein the diagnosis means uses the environmental information.
  7.  The computer system according to claim 1, wherein
      the first acquisition means acquires the visible light image and the infrared image at each of a plurality of time points, and
      the diagnosis means uses a plurality of the visible light images and a plurality of the infrared images acquired within a predetermined period.
  8.  The computer system according to claim 1, further comprising adjustment means for adjusting the installation environment based on information representing the result of the diagnosis output by the diagnosis means.
  9.  The computer system according to claim 1, wherein the object is an IoT device or factory equipment.
  10.  An object diagnosis method comprising the steps of:
      acquiring a visible light image and an infrared image captured by a camera;
      identifying, in the visible light image, a region corresponding to a predetermined part of an object imaged by the camera;
      identifying a region in the infrared image corresponding to the identified region in the visible light image; and
      diagnosing the object based on the temperature of the identified region in the infrared image.
  11.  A program for causing a computer system to execute the steps of:
      acquiring a visible light image and an infrared image captured by a camera;
      identifying, in the visible light image, a region corresponding to a predetermined part of an object imaged by the camera;
      identifying a region in the infrared image corresponding to the identified region in the visible light image; and
      diagnosing the object based on the temperature of the identified region in the infrared image.
PCT/JP2016/080872 2016-10-18 2016-10-18 Computer system, object diagnosis method, and program WO2018073900A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/080872 WO2018073900A1 (en) 2016-10-18 2016-10-18 Computer system, object diagnosis method, and program

Publications (1)

Publication Number Publication Date
WO2018073900A1 true WO2018073900A1 (en) 2018-04-26

Family

ID=62019294

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110288578A (en) * 2019-06-24 2019-09-27 国网上海市电力公司 A kind of power equipments defect infrared image recognizing system of high discrimination
CN115306718A (en) * 2022-07-15 2022-11-08 嘉洋智慧安全生产科技发展(北京)有限公司 Method, apparatus, device, medium and program product for detecting screw compressor failure

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5754048A (en) * 1980-09-09 1982-03-31 Agency Of Ind Science & Technol Abnormality monitor employing infrared picture image
JPH11120458A (en) * 1997-10-14 1999-04-30 Hitachi Eng & Service Co Ltd Fire detector
JP2002132341A (en) * 2000-10-26 2002-05-10 Toshiba Corp Field inspection device
JP2008045888A (en) * 2006-08-11 2008-02-28 Chugoku Electric Power Co Inc:The Device for diagnosing overheating


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16919505

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16919505

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP