WO2018073900A1 - Computer system, method, and program for diagnosing a subject - Google Patents


Info

Publication number
WO2018073900A1
Authority
WO
WIPO (PCT)
Prior art keywords
visible light
light image
region
temperature
infrared image
Prior art date
Application number
PCT/JP2016/080872
Other languages
English (en)
Japanese (ja)
Inventor
俊二 菅谷
Original Assignee
株式会社オプティム
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社オプティム
Priority to PCT/JP2016/080872
Publication of WO2018073900A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00 Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J5/48 Thermography; Techniques using wholly visual means
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N25/00 Investigating or analyzing materials by the use of thermal means
    • G01N25/72 Investigating presence of flaws

Definitions

  • the present invention relates to a computer system, an object diagnosis method, and a program that diagnose an object by imaging it.
  • a configuration for diagnosing an object by measuring its temperature using a spectrometer or infrared thermography is disclosed (see Non-Patent Document 1).
  • however, each part of an object has a different temperature, so accurately grasping the state of the object requires accurately measuring the temperature of each part to be measured.
  • although the outline and approximate parts of an object can be identified from a thermographic image, it is difficult to specify the exact position of each part, depending on the distance from the camera to the subject and other factors. As a result, there was a limit to improving the accuracy of the temperature measured for each part.
  • the present invention provides the following solutions.
  • the present invention provides a computer system comprising: first acquisition means for acquiring a visible light image and an infrared image captured by a camera; first image processing means for specifying, in the visible light image, a region corresponding to a predetermined part of an object imaged by the camera; second image processing means for identifying a region in the infrared image corresponding to the identified region in the visible light image; and diagnostic means for diagnosing the object based on the temperature of the region in the identified infrared image.
  • according to the present invention, the computer system acquires a visible light image and an infrared image captured by a camera, specifies in the visible light image a region corresponding to a predetermined part of the object captured by the camera, identifies the region in the infrared image corresponding to the identified region in the visible light image, and diagnoses the object based on the temperature of the region in the identified infrared image.
  • the present invention is described in the category of a computer system, but the same actions and effects are also exhibited in other categories, such as an object diagnosis method and a program.
  • FIG. 1 is a diagram showing an outline of the object diagnostic system 1.
  • FIG. 2 is an overall configuration diagram of the object diagnostic system 1.
  • FIG. 3 is a functional block diagram of the computer 10.
  • FIG. 4 is a flowchart showing object diagnosis processing executed by the computer 10.
  • FIG. 5 is a flowchart illustrating object diagnosis processing executed by the computer 10.
  • FIG. 6 is a diagram schematically showing an example of visible light image data acquired by the computer 10.
  • FIG. 7 is a diagram schematically showing an example of infrared image data acquired by the computer 10.
  • FIG. 8 is a diagram illustrating an example schematically showing a state in which the computer 10 specifies a predetermined part in a visible light image.
  • FIG. 9 is a diagram schematically illustrating an example in which the computer 10 specifies a region in an infrared image.
  • FIG. 10 is a diagram illustrating an example of a reference temperature database stored in the computer 10.
  • FIG. 1 is a diagram for explaining an outline of an object diagnosis system 1 which is a preferred embodiment of the present invention.
  • the object diagnosis system 1 includes a computer 10; it acquires an image of an object, such as an IoT device (a computer, terminal, sensor, or robot), factory equipment (factory pipes and piping), a moving object (a car, airplane, or bus), or a building (a house or the factory itself), and diagnoses that object. In the following description, the object is assumed to be a computer.
  • the computer 10 is a computing device that is communicably connected to a visible light camera, an infrared camera, a sensor, an environment adjustment device, etc. (not shown).
  • the object diagnosis system 1 obtains a visible light image from a visible light camera, an infrared image from an infrared camera, and environment information about the object's installation location (such as illuminance, wind direction, wind speed, air temperature, humidity, and atmospheric pressure) from a sensor, and transmits instructions to the environment adjustment device.
  • the computer 10 acquires a visible light image and an infrared image captured by a camera (not shown) (step S01).
  • the computer 10 acquires a visible light image such as a moving image or a still image of an object captured by the visible light camera.
  • the computer 10 acquires an infrared image such as a moving image or a still image of an object captured by the infrared camera.
  • the visible light camera and the infrared camera are installed side by side or in close proximity, and both image the same object. That is, the visible light camera and the infrared camera image the same object from substantially the same imaging point.
  • the computer 10 specifies a visible light image region that is a region corresponding to a predetermined part of the object imaged by the camera in the visible light image (step S02).
  • as the predetermined part of the object, the computer 10 specifies, for example, a part of the object such as the housing, display, or keyboard, or a part that is set in advance.
  • the computer 10 specifies an area corresponding to the predetermined part in the visible light image by image analysis.
  • the computer 10 extracts a feature amount existing in the visible light image, and specifies a predetermined part based on the feature amount.
  • the computer 10 extracts the color of the visible light image and identifies a predetermined part based on this color.
  • the computer 10 specifies an infrared image region that is a region in the infrared image corresponding to the region in the visible light image described above in the infrared image (step S03).
  • the computer 10 compares the visible light image and the infrared image, and identifies the region that matches the visible light image region in the infrared image as the infrared image region.
  • the computer 10 identifies an infrared image area at the same position as the visible light image area as the infrared image area.
  • the computer 10 diagnoses an object based on the temperature of the infrared image area (step S04).
  • the computer 10 diagnoses whether or not a failure has occurred in the object, for example, by comparing the temperature of the infrared image region with the temperature at the time of the failure of the object.
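The four steps above (S01 to S04) can be summarized in a minimal sketch. All function names, data structures, and the toy camera data below are illustrative assumptions, not taken from the patent itself:

```python
# Sketch of the diagnosis flow S01-S04 (all names are assumptions).

def find_part_region(visible_image, part):
    """S02: locate the region of a predetermined part in the visible image."""
    return visible_image["regions"][part]  # e.g. a bounding box (x, y, w, h)

def map_to_infrared(region):
    """S03: reuse the same coordinates in the infrared image, assuming both
    cameras image the object from substantially the same imaging point."""
    return region

def diagnose(infrared_image, region, failure_temp):
    """S04: report a failure when the region temperature reaches the
    temperature known to occur at the time of failure."""
    return infrared_image["temps"][region] >= failure_temp

# S01: toy data standing in for the acquired visible and infrared images
visible = {"regions": {"housing": (0, 0, 10, 10)}}
infrared = {"temps": {(0, 0, 10, 10): 55.0}}

region = map_to_infrared(find_part_region(visible, "housing"))
faulty = diagnose(infrared, region, failure_temp=50.0)
```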
  • FIG. 2 is a diagram showing a system configuration of the object diagnostic system 1 which is a preferred embodiment of the present invention.
  • the object diagnosis system 1 includes a computer 10 and a public network 5 (an Internet network, a third-generation or fourth-generation communication network, or the like); it acquires an image obtained by imaging an object and diagnoses that object.
  • the object diagnosis system 1 also includes cameras (a visible light camera that captures a visible light image of the object and an infrared camera that captures an infrared image of the object), sensors that detect the installation environment of the object (such as illuminance, wind direction, wind speed, air temperature, humidity, and atmospheric pressure),
  • and an environment adjustment device that adjusts the installation environment of the object, such as a lighting device (various lights), an air conditioning device (a blower or air conditioner), or a watering device.
  • the computer 10 obtains various information from these and transmits various instructions.
  • the computer 10 is the above-described computing device having the functions described later.
  • FIG. 3 is a functional block diagram of the computer 10.
  • the computer 10 includes, as the control unit 11, a CPU (Central Processing Unit), RAM (Random Access Memory), ROM (Read Only Memory), and the like; and, as the communication unit 12, a device for communicating with other equipment (cameras, various sensors, environment adjustment devices, etc.), for example a WiFi (Wireless Fidelity) compatible device compliant with IEEE 802.11.
  • the computer 10 also includes a data storage unit such as a hard disk, a semiconductor memory, a recording medium, or a memory card as the storage unit 13. Further, the computer 10 includes, as the processing unit 14, a device for executing various processes such as image processing and failure diagnosis.
  • in the computer 10, when the control unit 11 reads a predetermined program, the image data acquisition module 20, the environment information acquisition module 21, and the adjustment instruction transmission module 22 are realized in cooperation with the communication unit 12. Likewise, the control unit 11 reads a predetermined program to realize the storage module 30 in cooperation with the storage unit 13, and to realize the visible light image analysis module 40, the infrared image analysis module 41, the temperature analysis module 42, the diagnosis module 43, and the environment adjustment module 44 in cooperation with the processing unit 14.
  • FIG. 4 and FIG. 5 are flowcharts showing the object diagnosis processing executed by the computer 10. The processing executed by each module described above is explained together with this processing.
  • the image data acquisition module 20 acquires image data of a visible light image and an infrared image of an object (step S10).
  • the image data acquisition module 20 acquires visible light image data that is a visible light image captured by the visible light camera and infrared image data that is an infrared image captured by the infrared camera.
  • the image data acquisition module 20 acquires visible light image data and infrared image data at a predetermined time interval or at a plurality of time points such as a preset time.
  • the visible light image data and the infrared image data acquired by the image data acquisition module 20 are captured from the same imaging viewpoint and show the same object.
  • the computer 10 executes diagnosis of an object based on visible light image data and infrared image data at a predetermined time.
  • FIG. 6 is a diagram schematically illustrating an example of visible light image data acquired by the image data acquisition module 20.
  • the image data acquisition module 20 acquires the visible light image 100 indicated by the visible light image data.
  • an object 110 is shown.
  • the visible light image 100 may include scenery other than the object 110, natural objects, artificial objects, and the like, which are omitted for the sake of simplicity.
  • the visible light image 100 may include a plurality of objects 110 and objects different from the objects 110.
  • FIG. 7 is a diagram schematically showing an example of infrared image data acquired by the image data acquisition module 20.
  • the image data acquisition module 20 acquires an infrared image 200 indicated by the infrared image data.
  • An object 210 is shown in the infrared image 200.
  • the infrared image 200 may include scenery, natural objects, artifacts, and the like other than the object 210, but these are omitted for the sake of simplicity.
  • each temperature is indicated by hatching for convenience.
  • the infrared image 200 may include a plurality of objects 210 or an object different from the object 210.
  • the environment information acquisition module 21 acquires environment information indicating the installation environment of the object (step S11).
  • the environment information acquisition module 21 acquires illuminance, wind direction, wind speed, air temperature, humidity, atmospheric pressure, and the like as environment information.
  • the environment information acquisition module 21 acquires environment information from various sensors (not shown) such as an illuminance sensor, a wind direction / wind speed sensor, a temperature sensor, a humidity sensor, and a pressure sensor.
  • the environment information acquisition module 21 acquires environment information at the same timing as the acquisition of the visible light image data and the infrared image data. These various sensors are installed on the object or in the vicinity of the place where the object is installed.
  • the various sensors may be sensors that detect environmental information other than the examples described above.
  • the installation position of various sensors is not limited to the above-described example, and can be appropriately changed to a position where the installation environment of the object can be detected.
  • the process of step S11 may be omitted. In this case, the process of step S12 described later may simply be performed after the process of step S10 described above.
  • the visible light image analysis module 40 performs image analysis on the acquired visible light image data (step S12).
  • in step S12, the visible light image analysis module 40 extracts the feature amounts and colors of the visible light image data and identifies an object present in the visible light image data.
  • step S12 for example, the visible light image analysis module 40 compares the feature amount extracted from the visible light image data with the feature amount of the object stored in advance in the storage module 30, and extracts an object having a matching feature amount. The extracted object is identified as existing in the visible light image data. Further, for example, the visible light image analysis module 40 compares the RGB value extracted from the visible light image data with the RGB value of the object stored in advance in the storage module 30, and extracts an object having a matching or similar RGB value. The extracted object is identified as existing in the visible light image data.
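The color-based identification described above can be illustrated with a minimal sketch; the stored RGB table, the per-channel tolerance, and the function name are assumptions made for illustration:

```python
# Hypothetical pre-registered object colors, standing in for the RGB
# values the storage module 30 is described as holding per object.
STORED_RGB = {
    "computer": (40, 40, 45),
    "pipe":     (120, 110, 100),
}

def identify_object(sampled_rgb, tolerance=20):
    """Return the first stored object whose RGB value matches the sampled
    value within `tolerance` on every channel, or None if nothing matches."""
    for name, ref in STORED_RGB.items():
        if all(abs(s - r) <= tolerance for s, r in zip(sampled_rgb, ref)):
            return name
    return None
```

A real system would sample many pixels and combine this with feature-amount matching; the single-value comparison here only illustrates the matching rule.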
  • the visible light image analysis module 40 determines whether there are a plurality of individuals in the visible light image data as a result of the image analysis (step S13). In step S13, the visible light image analysis module 40 makes this determination by checking whether a plurality of objects exist in the visible light image data: whether there are a plurality of individuals of one type of object, individuals of a plurality of types of objects, and so on.
  • in step S13, when the visible light image analysis module 40 determines that there are not a plurality of individuals (step S13: NO), that is, when only one individual is present in the visible light image data, the visible light image analysis module 40 identifies regions corresponding to a plurality of predetermined parts of that one individual (step S14).
  • the predetermined part is, for example, a part of an object such as a housing, a display, or a keyboard, a part that is set in advance, or the like.
  • the visible light image analysis module 40 specifies areas corresponding to, for example, a housing, a display, and a keyboard.
  • the visible light image analysis module 40 extracts the casing, display, and keyboard present in the visible light image data from the feature amount, and specifies the extracted location as an area corresponding to a predetermined part. At this time, the visible light image analysis module 40 specifies a plurality of regions corresponding to each of the plurality of predetermined parts.
  • the visible light image analysis module 40 extracts a housing, a display, and a keyboard that are present in the visible light image data from the RGB values, and specifies the extracted location as an area corresponding to a predetermined part.
  • FIG. 8 is a diagram illustrating an example schematically showing a state in which the visible light image analysis module 40 specifies a predetermined part.
  • the visible light image analysis module 40 specifies a region in the visible light image 100 where a predetermined part such as a housing, a display, or a keyboard is located based on the feature amount and the color.
  • the visible light image analysis module 40 identifies the region of the object 110 corresponding to the parts of the casings 300 to 302, the displays 310 to 312 and the keyboards 320 to 322.
  • the specified area is indicated by hatching for convenience. This area indicates a part of each part, but may indicate the entire corresponding part. Note that the number, type, and position of the parts to be specified can be changed as appropriate.
  • in step S13, when the visible light image analysis module 40 determines that there are a plurality of individuals (step S13: YES), that is, when a plurality of individuals such as a first individual, a second individual, and a third individual are present, the visible light image analysis module 40 identifies each of the plurality of individuals (step S15). In the following description, the first individual and the second individual are assumed to exist in the visible light image data.
  • the visible light image analysis module 40 identifies the positional relationship of each of the plurality of individuals (step S16).
  • in step S16, the visible light image analysis module 40 identifies the positional relationship between the first individual and the second individual based on their positions in the visible light image.
  • the visible light image analysis module 40 specifies a relative position or an absolute position between the first individual and the second individual.
  • the positional relationship is, for example, which individual is closer to the imaging point, or their coordinates in the visible light image. Note that the processing in step S16 is not limited to the positional relationship between the first individual and the second individual; it may also cover positional relationships with other individuals.
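One way the "which individual is closer to the imaging point" relationship of step S16 might be estimated is sketched below. The heuristic (a larger bounding box in the visible light image suggests the individual is closer to the camera) and all names are assumptions, not the patent's stated method:

```python
def relative_position(box_a, box_b):
    """Estimate which of two individuals appears closer to the imaging
    point. Each box is (x, y, w, h) in visible-image coordinates; the
    individual with the larger projected area is taken to be closer."""
    def area(box):
        return box[2] * box[3]
    return "first" if area(box_a) > area(box_b) else "second"
```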
  • the visible light image analysis module 40 specifies a region corresponding to a predetermined part for each of the plurality of individuals (step S17).
  • in step S17, the process of step S14 described above is performed for each object present in the visible light image data.
  • the infrared image analysis module 41 identifies a region in the infrared image corresponding to the region in the identified visible light image (step S18).
  • in step S18, the infrared image analysis module 41 compares the visible light image data with the infrared image data to identify the region of the infrared image data corresponding to the identified region of the object.
  • the infrared image analysis module 41 acquires the position of the region in the visible light image as a coordinate, and specifies the position that matches the acquired coordinate as the region in the infrared image corresponding to the region in the identified visible light image.
  • FIG. 9 is a diagram illustrating an example schematically showing a state in which the infrared image analysis module 41 specifies a region in the infrared image.
  • regions in the infrared image 200 corresponding to parts such as the casings 300 to 302 are identified by comparing positions in the visible light image 100 with positions in the infrared image 200.
  • the infrared image analysis module 41 acquires the position coordinates of each part in the visible light image 100, and specifies the position in the infrared image corresponding to the acquired coordinates as the region in the infrared image corresponding to the region in the visible light image.
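The coordinate transfer of step S18 can be sketched as a simple rescaling, under the assumption (consistent with the shared imaging point described above) that the two cameras see the same scene but may differ in resolution; the function name and region format are illustrative:

```python
def map_region(region, visible_size, infrared_size):
    """Rescale an (x, y, w, h) region from visible-image coordinates
    into infrared-image coordinates, assuming the two images cover the
    same field of view and differ only in resolution."""
    sx = infrared_size[0] / visible_size[0]
    sy = infrared_size[1] / visible_size[1]
    x, y, w, h = region
    return (round(x * sx), round(y * sy), round(w * sx), round(h * sy))
```

When both images have the same resolution the scale factors are 1 and the coordinates transfer unchanged, which matches the "same position" case in the description above.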
  • the infrared image analysis module 41 specifies the parts of the casings 400 to 402, the displays 410 to 412, and the keyboards 420 to 422 of the object 210.
  • the specified area is indicated by hatching for convenience. This area refers to a part or the whole part depending on the part specified in the above-described visible light image. Note that the number of positions to be specified and their positions can be changed as appropriate in accordance with the visible light image.
  • the temperature analysis module 42 analyzes the temperature of the area in the specified infrared image data (step S19). In step S19, the temperature analysis module 42 acquires the temperature of each region based on the infrared image data.
  • the temperature analysis module 42 acquires a plurality of reference temperatures corresponding to each of a plurality of parts of the object from the reference temperature database stored in the storage module 30 (step S20).
  • the storage module 30 stores a plurality of reference temperatures corresponding to the respective parts in advance, and the temperature analysis module 42 acquires the stored reference temperatures.
  • the reference temperature of the part corresponding to the region in the specified infrared image data is acquired.
  • FIG. 10 is a diagram illustrating an example of a reference temperature database stored in the storage module 30.
  • the storage module 30 stores the name of the part and the reference temperature of the part in association with each other. That is, the storage module 30 stores “housing” and “40” in association with each other, “display” and “30” in association with each other, and “keyboard” and “20” in association with each other.
  • the storage module 30 stores this reference temperature database for each type of object.
  • the storage module 30 may store a reference temperature database for each object, not for each object type. In this case, the storage module 30 may acquire a reference temperature for each part of each individual in advance and store the part and the reference temperature in association with each other.
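The reference temperature database of FIG. 10 could be represented as follows. The values for the computer's parts follow the figure; the keying by object type follows the description above, while the concrete data structure and function name are assumptions:

```python
# In-memory sketch of the reference temperature database: one table of
# part -> reference temperature (in degrees C) per object type.
REFERENCE_TEMPS = {
    "computer": {"housing": 40, "display": 30, "keyboard": 20},
}

def reference_temp(object_type, part):
    """Look up the reference temperature stored for a part of an object."""
    return REFERENCE_TEMPS[object_type][part]
```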
  • the diagnosis module 43 diagnoses an object based on the temperature of the region in the specified infrared image (step S21).
  • the diagnosis module 43 executes object diagnosis using any one of, or a combination of, the acquired temperature, the reference temperature, the temperature of another individual different from the one individual, the positional relationship between the first individual and the second individual, and the environment information.
  • the diagnosis module 43 executes object diagnosis based on the acquired temperature.
  • the diagnosis module 43 determines whether or not the temperature of the region in the identified infrared image is an abnormal value, and determines that no failure has occurred if the temperature is not an abnormal value. On the other hand, if the diagnosis module 43 determines that the value is an abnormal value, it determines that a failure has occurred.
  • the diagnosis module 43 executes object diagnosis based on the reference temperature.
  • the diagnosis module 43 compares the temperature of the region in the identified infrared image with the reference temperature acquired from the storage module 30, and calculates the temperature difference between the region temperature and the reference temperature.
  • the diagnosis module 43 determines whether or not the calculated temperature difference is within a predetermined range (for example, within 0.5°C, 1°C, or 2°C); if it is, the diagnosis module 43 determines that no failure has occurred. On the other hand, if the calculated temperature difference is not within the predetermined range, the diagnosis module 43 determines that a failure has occurred.
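This threshold check reduces to a single comparison. In the sketch below the default range of 2°C is one of the example values given in the text; the function name is an assumption:

```python
def has_failure(measured, reference, allowed_diff=2.0):
    """Report a failure when the measured region temperature deviates
    from the reference temperature by more than the allowed range."""
    return abs(measured - reference) > allowed_diff
```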
  • the diagnosis module 43 executes object diagnosis based on one individual and the temperature of another individual different from the one individual.
  • the diagnosis module 43 compares the temperature of the region in the infrared image of the one individual with the temperature of the corresponding region of the same part in the infrared image of another individual different from the one individual, and calculates the temperature difference between them.
  • the diagnosis module 43 determines whether or not the calculated temperature difference is within a predetermined range (for example, within 0.5°C, 1°C, or 2°C); if it is, the diagnosis module 43 determines that no failure has occurred. On the other hand, if the calculated temperature difference is not within the predetermined range, the diagnosis module 43 determines that a failure has occurred.
  • the diagnosis module 43 may also compare the reference temperature described above with the temperature of the one individual or the other individual and calculate the temperature difference between the two individuals, thereby determining whether a failure has occurred. In other words, the diagnosis module 43 may determine, based on the reference temperature and the temperature difference, whether a failure has occurred in one or both of the one individual and the other individual.
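A sketch of how the reference temperature might be combined with the inter-individual comparison to decide which individual is failing. The decision rule used here (the individual whose temperature is farther from the reference is the suspect) is an assumption for illustration, as are the names:

```python
def locate_failure(temp_a, temp_b, reference, allowed_diff=1.0):
    """When the same part's temperatures on two individuals differ beyond
    the allowed range, use the reference temperature to guess which one
    is failing; return None when the gap is within range."""
    if abs(temp_a - temp_b) <= allowed_diff:
        return None
    return "first" if abs(temp_a - reference) > abs(temp_b - reference) else "second"
```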
  • a case where the diagnosis module 43 executes diagnosis of an object based on the positional relationship between the first individual and the second individual will now be described.
  • the diagnosis module 43 compares the positions of the first individual and the second individual among the plurality of acquired individuals, and determines which of them is more strongly affected by an air conditioner, illumination, or the like. This specifies whether the temperature of the first individual or the second individual is lower or higher due to the influence of the air conditioner, illumination, or the like.
  • the diagnosis module 43 corrects the temperatures of the first and second individuals for the influence of the air conditioner and lighting by acquiring environment information such as air temperature and illuminance.
  • the diagnosis module 43 compares the corrected temperatures of the first individual and the second individual with the reference temperature described above, and calculates the temperature difference between them.
  • the diagnosis module 43 determines whether or not the temperature difference is within a predetermined range (for example, within 0.5°C, 1°C, or 2°C); if it is, the diagnosis module 43 determines that no failure has occurred. On the other hand, if the calculated temperature difference is not within the predetermined range, the diagnosis module 43 determines that a failure has occurred. The diagnosis module 43 may also determine whether a failure has occurred without using the reference temperature; for example, based on whether or not the corrected temperatures of the first individual and the second individual reach a predetermined temperature.
  • the diagnosis module 43 executes diagnosis of an object based on environmental information acquired from a sensor.
  • the diagnosis module 43 corrects the acquired individual temperature based on the environmental information. For example, the diagnosis module 43 acquires humidity, air temperature, atmospheric pressure, and the like as environment information, and corrects the acquired individual temperature based on the acquired environment information.
  • the diagnosis module 43 diagnoses an object based on the corrected individual temperature.
  • the diagnosis module 43 may determine whether or not a failure has occurred in the object based on the corrected individual temperature and the reference temperature.
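The environmental correction above could be sketched as a simple linear adjustment before the threshold check. The linear model, the nominal ambient temperature, and the coefficient are all assumptions for illustration; the patent does not specify a correction formula:

```python
def correct_for_environment(measured, ambient, nominal_ambient=25.0, coeff=0.5):
    """Remove the estimated contribution of an unusually hot or cold
    installation environment: shift the measured temperature by a fraction
    of the ambient deviation from a nominal ambient temperature."""
    return measured - coeff * (ambient - nominal_ambient)
```

The corrected value would then be compared against the reference temperature exactly as in the uncorrected case.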
  • the diagnosis module 43 outputs a diagnosis result (step S22).
  • the diagnosis module 43 outputs the content of the failure (for example, the name of the failure, a countermeasure method, etc.) as the diagnosis result.
  • the diagnosis module 43 outputs the diagnosis result of the one individual.
  • the diagnosis module 43 outputs a diagnosis result for each individual.
  • the diagnosis module 43 outputs a diagnosis result together with various information capable of uniquely specifying the individual such as the name, identifier, and position information of each individual.
  • in the above description, the diagnosis module 43 diagnoses the object using one piece of visible light image data and one piece of infrared image data, but the diagnosis may instead be executed based on a plurality of pieces of visible light image data and infrared image data acquired within a predetermined period.
  • in that case, the diagnosis of the object may be executed based on the amount of change, the width of change, or the change itself of the individual's temperature acquired from each piece of infrared image data.
  • the diagnosis module 43 may also perform the diagnosis based on the average value of the individual's temperature acquired from the plurality of pieces of infrared image data. For example, the diagnosis module 43 compares the average temperature of the individual with the reference temperature, calculates the temperature difference between them, and executes the diagnosis based on whether or not the temperature difference is within a predetermined range.
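The averaging variant reduces to computing a mean before the same threshold check, as sketched below; the 2°C default reuses one of the example ranges from the text, and the function name is an assumption:

```python
def diagnose_average(temps, reference, allowed_diff=2.0):
    """Average the temperatures taken from several infrared images and
    compare the average with the reference temperature; return a
    (failure_detected, average) pair."""
    avg = sum(temps) / len(temps)
    return abs(avg - reference) > allowed_diff, avg
```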
  • the diagnosis module 43 determines whether a failure has occurred in the individual based on the output diagnosis result (step S23).
  • in step S23, when the diagnosis module 43 determines that no failure has occurred in the individual (step S23: NO), the process ends. At this time, the diagnosis module 43 may transmit a notification that no failure has occurred in the individual to an external terminal device (not shown).
  • in step S23, when the diagnosis module 43 determines that a failure has occurred in the individual (step S23: YES), the environment adjustment module 44 creates an adjustment instruction for adjusting the installation environment based on information representing the result of the diagnosis (step S24).
  • In step S24, the environment adjustment module 44 looks up the processing required for the diagnosed failure in an adjustment database that stores failure contents and processing in association with each other, and creates an adjustment instruction for executing the acquired processing.
  • This adjustment instruction includes the required processing and information that uniquely identifies the environment adjustment device that is to execute it, such as the device's identifier or device ID.
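As a sketch, the lookup in step S24 can be modeled as a simple mapping from failure content to required processing. The failure names, processing names, and device ID below are hypothetical; the patent does not enumerate the contents of the adjustment database.

```python
# Hypothetical adjustment database associating failure contents with
# the processing needed to correct them, as described in step S24.
ADJUSTMENT_DATABASE = {
    "overheating": "temperature control",
    "dry air": "humidity control",
}

def create_adjustment_instruction(failure_content, device_id):
    """Build an adjustment instruction for a diagnosed failure.

    The instruction pairs the required processing with information that
    uniquely identifies the environment adjustment device (here, a device ID).
    """
    processing = ADJUSTMENT_DATABASE[failure_content]
    return {"device_id": device_id, "processing": processing}

instruction = create_adjustment_instruction("overheating", device_id="env-adj-01")
```

Keeping the failure-to-processing mapping in a database, rather than hard-coding it, lets new failure types be handled by adding entries without changing the diagnosis logic.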
  • The adjustment instruction transmission module 22 transmits the adjustment instruction created by the environment adjustment module 44 in step S24 to the environment adjustment device (step S25).
  • The adjustment instruction transmission module 22 transmits the adjustment instruction to the target environment adjustment device based on the information in the instruction that uniquely identifies that device.
  • The environment adjustment device receives the adjustment instruction and executes the processing it contains, for example, turning lighting on or off, controlling humidity and temperature, watering, or spraying medicine.
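Step S25 and the device-side execution might look like the following sketch, assuming each environment adjustment device is registered under its device ID. The class, the registry, and the processing name are illustrative only and are not specified by the patent.

```python
class EnvironmentAdjustmentDevice:
    """Stand-in for an environment adjustment device addressed by a device ID."""

    def __init__(self, device_id):
        self.device_id = device_id
        self.executed = []

    def receive(self, instruction):
        # The device executes the necessary processing contained in the
        # adjustment instruction (e.g. lighting, humidity/temperature
        # control, watering, medicine spraying).
        self.executed.append(instruction["processing"])

def transmit_adjustment_instruction(instruction, devices):
    """Route the instruction to the device it uniquely identifies (step S25)."""
    target = devices[instruction["device_id"]]
    target.receive(instruction)
    return target

# Hypothetical registry keyed by device ID.
devices = {"env-adj-01": EnvironmentAdjustmentDevice("env-adj-01")}
transmit_adjustment_instruction(
    {"device_id": "env-adj-01", "processing": "watering"}, devices
)
```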
  • The above is the object diagnosis process.
  • In the embodiment above, the object is described as being a computer, but the object is not limited to this example and may be another article.
  • For example, the object may be an electronic product such as a netbook terminal, slate terminal, electronic book terminal, or portable music player, or a wearable terminal such as smart glasses or a head-mounted display; IoT (Internet of Things) equipment with various sensors, such as a robot; or factory equipment such as factory pipes, piping equipment, drainage equipment, power receiving/transforming equipment, power transmission equipment, pump equipment, fire-fighting equipment, boiler equipment, high-pressure gas equipment, or high-pressure air equipment.
  • The object may also be a moving body such as a vehicle, airplane, ship, or bus, or a building such as a house, hospital, clinic, station, airport, office building, government office, police station, fire station, police box, stadium, hotel, warehouse, school, public toilet, store, or restaurant.
  • The means and functions described above are realized by a computer (including a CPU, an information processing apparatus, and various terminals) reading and executing a predetermined program.
  • The program may be provided, for example, from a computer via a network (SaaS: Software as a Service), or in a form recorded on a computer-readable recording medium such as a flexible disk, CD (CD-ROM, etc.), or DVD (DVD-ROM, DVD-RAM, etc.). In the latter case, the computer reads the program from the recording medium, transfers and stores it in an internal or external storage device, and executes it.
  • The program may also be recorded in advance in a storage device (recording medium) such as a magnetic disk, optical disk, or magneto-optical disk, and provided from the storage device to the computer via a communication line.

Abstract

The problem addressed by the present invention is to provide a computer system, a method, and a program for diagnosing a subject that improve the accuracy of measuring the temperature of each region of the subject. The solution according to the invention is a computer system that acquires a visible light image and an infrared image captured by a camera, identifies a region in the visible light image that corresponds to a prescribed region of the subject imaged by the camera, identifies a region in the infrared image that corresponds to the region identified in the visible light image, and diagnoses the subject based on the temperature of the region identified in the infrared image.
PCT/JP2016/080872 2016-10-18 2016-10-18 Computer system, method, and program for diagnosing a subject WO2018073900A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/080872 WO2018073900A1 (fr) 2016-10-18 2016-10-18 Computer system, method, and program for diagnosing a subject

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/080872 WO2018073900A1 (fr) 2016-10-18 2016-10-18 Computer system, method, and program for diagnosing a subject

Publications (1)

Publication Number Publication Date
WO2018073900A1 true WO2018073900A1 (fr) 2018-04-26

Family

ID=62019294

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/080872 WO2018073900A1 (fr) 2016-10-18 2016-10-18 Computer system, method, and program for diagnosing a subject

Country Status (1)

Country Link
WO (1) WO2018073900A1 (fr)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5754048A * 1980-09-09 1982-03-31 Agency Of Ind Science & Technol Abnormality monitoring device using infrared images
JPH11120458A * 1997-10-14 1999-04-30 Hitachi Eng & Service Co Ltd Fire detection device
JP2002132341A * 2000-10-26 2002-05-10 Toshiba Corp On-site inspection device
JP2008045888A * 2006-08-11 2008-02-28 The Chugoku Electric Power Co., Inc. Overheating diagnosis device


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110288578A (zh) * 2019-06-24 2019-09-27 国网上海市电力公司 High-recognition-rate infrared image recognition system for power equipment defects
CN115306718A (zh) * 2022-07-15 2022-11-08 嘉洋智慧安全生产科技发展(北京)有限公司 Screw compressor fault detection method, apparatus, device, medium, and program product
CN115306718B (zh) * 2022-07-15 2023-08-18 嘉洋智慧安全科技(北京)股份有限公司 Screw compressor fault detection method, apparatus, device, medium, and program product

Similar Documents

Publication Publication Date Title
JP6441546B2 (ja) Computer system, object diagnosis method, and program
CN111157124B (zh) Human body temperature measurement method, apparatus, and system based on face recognition
JP6560833B2 (ja) Computer system, plant diagnosis method, and program
CN102410832B (zh) Position and orientation measurement device and position and orientation measurement method
US9295141B2 (en) Identification device, method and computer program product
KR101874926B1 (ko) Methods and systems for calibrating sensors using recognized objects
WO2018073900A1 (fr) Computer system, method, and program for diagnosing a subject
CN113299035A (zh) Fire recognition method and system based on artificial intelligence and binocular vision
US20170280131A1 (en) Method and system for recalibrating sensing devices without familiar targets
US10645297B2 (en) System, method, and program for adjusting angle of camera
KR101711156B1 (ko) Video security system and method using a terminal identifier
CN111521279A (zh) Pipeline leakage inspection method
WO2020136969A1 (fr) Measurement system, measurement device, measurement method, and program
CN116403359A (zh) Production safety early-warning system based on a multimodal image recognition algorithm
US11842452B2 (en) Portable display device with overlaid virtual information
WO2017213373A1 (fr) Electronic device, external server, and method for controlling same
Thapliyal et al. Development of data acquisition console and web server using Raspberry Pi for marine platforms
KR20220011902A (ko) Smart farm sensor failure detection system and method
WO2024005425A1 (fr) Positioning system based on wireless received signal strength indicator (RSSI)
WO2023032507A1 (fr) Position estimation system and position estimation method
WO2023032769A1 (fr) Position estimation system and position estimation method
US10573160B1 (en) Measuring method for high temperature thermal bridge effect and low temperature thermal bridge effect and measuring system thereof
KR101858989B1 (ko) Target characteristic analysis system
WO2024084601A1 (fr) Change detection method, change detection system, and change detection apparatus
KR20230048880A (ko) Map merging method for electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16919505

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16919505

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP