WO2023032771A1 - Control method, program, and control system - Google Patents

Control method, program, and control system Download PDF

Info

Publication number
WO2023032771A1
Authority
WO
WIPO (PCT)
Prior art keywords
person
thermal image
human body
temperature
air
Prior art date
Application number
PCT/JP2022/031812
Other languages
French (fr)
Japanese (ja)
Inventor
千人 浦
仁 吉澤
達雄 古賀
Original Assignee
パナソニックIpマネジメント株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by パナソニックIpマネジメント株式会社 filed Critical パナソニックIpマネジメント株式会社
Priority to JP2023545485A priority Critical patent/JPWO2023032771A1/ja
Publication of WO2023032771A1 publication Critical patent/WO2023032771A1/en

Classifications

    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F24HEATING; RANGES; VENTILATING
    • F24FAIR-CONDITIONING; AIR-HUMIDIFICATION; VENTILATION; USE OF AIR CURRENTS FOR SCREENING
    • F24F11/00Control or safety arrangements
    • F24F11/30Control or safety arrangements for purposes related to the operation of the system, e.g. for safety or monitoring
    • F24F11/46Improving electric energy efficiency or saving
    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F24HEATING; RANGES; VENTILATING
    • F24FAIR-CONDITIONING; AIR-HUMIDIFICATION; VENTILATION; USE OF AIR CURRENTS FOR SCREENING
    • F24F11/00Control or safety arrangements
    • F24F11/62Control or safety arrangements characterised by the type of control or by internal processing, e.g. using fuzzy logic, adaptive control or estimation of values
    • F24F11/63Electronic processing
    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F24HEATING; RANGES; VENTILATING
    • F24FAIR-CONDITIONING; AIR-HUMIDIFICATION; VENTILATION; USE OF AIR CURRENTS FOR SCREENING
    • F24F11/00Control or safety arrangements
    • F24F11/89Arrangement or mounting of control or safety devices
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis

Definitions

  • the present invention relates to an air conditioner control method, program, and control system.
  • a known technique measures the temperature difference between an occupant's nose and the area around the cheeks and/or forehead, determines the occupant's thermal state such as sensation and comfort (hereinafter, thermal sensation) from that difference, and controls an air conditioner mounted on a vehicle so that the occupant feels comfortable (see, for example, Patent Literature 1).
  • the present invention provides an air conditioner control method, program, and control system that estimate a person's thermal sensation and achieve a comfortable environmental temperature.
  • a control method according to one aspect of the present invention is an air conditioner control method executed by a computer. The computer acquires a thermal image showing the temperature distribution of an air-conditioned space detected by a sensor installed on the ceiling of the air-conditioned space, detects the contour of a person present in the air-conditioned space based on the acquired thermal image, identifies in the thermal image a human body region that is the area surrounded by the detected contour of the person, calculates a human body temperature including the temperature of the person's clothing based on the temperature distribution in the identified human body region, calculates the ambient temperature of the region other than the human body region in the thermal image, calculates the person's heat dissipation amount based on the difference value between the human body temperature and the ambient temperature, estimates the person's thermal sensation based on the calculated heat dissipation amount, and controls the air conditioner based on the estimated thermal sensation.
  • a program according to one aspect of the present invention is a program for causing a computer to execute the control method.
  • a control system according to one aspect of the present invention includes: an acquisition unit that acquires a thermal image showing the temperature distribution of an air-conditioned space detected by a sensor installed on the ceiling of the air-conditioned space; a detection unit that detects the contour of a person present in the air-conditioned space based on the acquired thermal image; an identification unit that identifies, in the thermal image, a human body region that is the area surrounded by the detected contour of the person; a calculation unit that calculates a human body temperature including the temperature of the person's clothing based on the temperature distribution in the human body region, calculates the ambient temperature of the region other than the human body region in the thermal image, and calculates the person's heat dissipation amount based on the difference value between the human body temperature and the ambient temperature; an estimation unit that estimates the person's thermal sensation based on the calculated heat dissipation amount; and a control unit that controls the air conditioner installed in the air-conditioned space based on the estimated thermal sensation.
  • an air conditioner control method and program that can estimate a person's thermal sensation and achieve a comfortable environmental temperature are realized.
  • FIG. 1 is a block diagram showing an example of a functional configuration of a control system according to an embodiment.
  • FIG. 2 is a diagram showing an air-conditioned space to which the control system according to the embodiment is applied.
  • FIG. 3 is a diagram for explaining a thermal image.
  • FIG. 4 is a flowchart of an operation example of the control system according to the embodiment.
  • each figure is a schematic diagram and is not necessarily strictly illustrated. Moreover, in each figure, substantially the same configurations are given the same reference signs, and overlapping descriptions may be omitted or simplified.
  • FIG. 1 is a block diagram showing an example of a functional configuration of a control system according to an embodiment.
  • FIG. 2 is a diagram showing an air-conditioned space to which the control system according to the embodiment is applied.
  • the control system 10 extracts the contour of the person 1 existing in the air-conditioned space based on a thermal image showing the temperature distribution of the air-conditioned space, identifies the human body region surrounded by the extracted contour of the person 1, calculates the temperature difference value between the identified human body region and the other regions, and calculates the heat release amount of the person 1 based on the calculated difference value. Then, the control system 10 estimates the thermal sensation of the person 1 based on the calculated heat release amount and controls the air conditioner 30 based on the estimated thermal sensation.
  • the air-conditioned space is, for example, an indoor space in a building.
  • a building may be, for example, an office, a commercial facility, a public facility, an educational facility, or a residence.
  • the control system 10 includes a sensor 20, an air conditioner 30, and a server device 40. Note that the control system 10 may include a plurality of sensors 20.
  • the sensor 20 is installed, for example, on the ceiling of the air-conditioned space, and generates a thermal image showing the temperature distribution when the air-conditioned space is viewed from the ceiling side (in other words, from above).
  • the sensor 20 may be installed directly on the ceiling of the air-conditioned space, or may be detachably connected to a power supply terminal of a lighting device (not shown) or a fire alarm (not shown) installed on the ceiling. In the latter case, the sensor 20 operates on power supplied from the lighting device or the fire alarm.
  • the power supply terminal is, for example, a USB (Universal Serial Bus) terminal.
  • the sensor 20 is, for example, an infrared sensor.
  • the infrared sensor is, for example, an infrared array sensor (in other words, a thermal image sensor) composed of an array of 8 × 8 infrared detection elements.
  • in other words, the thermal image generated by the infrared sensor has 8 × 8 pixels.
  • the thermal image shows the temperature distribution in the sensing range of the infrared sensor with an 8 × 8 resolution.
  • FIG. 3 is a diagram for explaining a thermal image.
  • FIG. 3A is a diagram schematically showing an image of the air-conditioned space detected by the sensor 20 (more specifically, the sensing range of the sensor 20 in the air-conditioned space) captured by a camera.
  • (b) of FIG. 3 is a diagram schematically showing a thermal image detected by the sensor 20.
  • each of the 8 × 8 small regions in (a) of FIG. 3 represents a pixel included in the image, and each of the 8 × 8 small regions in (b) of FIG. 3 represents a pixel included in the thermal image 100.
  • Numerical values in the pixels shown in FIG. 3B are pixel values, and specifically indicate temperatures.
  • the area surrounded by the thick line in FIG. 3B is the human body area 100a, and the area other than the human body area 100a in the thermal image 100 is the surrounding area 100b.
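  • As a concrete illustration (not taken from the patent itself), the 8 × 8 thermal image of (b) of FIG. 3 can be held as a small two-dimensional array of temperatures, with the human body region 100a marked by a boolean mask. The temperature values below are invented for the example.

```python
import numpy as np

# Hypothetical 8x8 thermal image: each entry is a temperature in degrees Celsius
# (pixel value = temperature value, as in the embodiment). Values are made up.
thermal_image = np.full((8, 8), 26.0)            # surrounding region around 26 degC
thermal_image[3:6, 3:5] = [[31.0, 30.5],
                           [32.0, 31.5],
                           [30.5, 30.0]]          # warmer pixels where the person is

# Boolean mask for the human body region 100a (True = inside the contour).
human_body_mask = np.zeros((8, 8), dtype=bool)
human_body_mask[3:6, 3:5] = True

print(thermal_image[human_body_mask])             # temperatures inside the body region
print(thermal_image[~human_body_mask].mean())     # mean temperature of the surrounding region 100b
```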
  • the sensor 20 may be installed on a wall or the like to generate a thermal image showing the temperature distribution when the air-conditioned space is viewed from the wall side (in other words, from the side).
  • the control system 10 may acquire a plurality of thermal images showing the temperature distribution when the air-conditioned space is viewed from above and from the sides.
  • the air conditioner 30 controls the temperature around the person 1 in the air-conditioned space (also called environmental temperature) based on the thermal sensation of the person 1 present in the air-conditioned space.
  • the air conditioner 30 includes, for example, a communication unit 31, a control unit 32, a storage unit 33, a louver 36, a compressor 37, and a fan 38.
  • the communication unit 31 is a communication circuit (communication module) for the air conditioner 30 to communicate with the sensor 20 and the server device 40.
  • the communication unit 31 may include a communication circuit (communication module) for communication via the wide area communication network 5 and a communication circuit (communication module) for communication via a local communication network (not shown).
  • the communication unit 31 is, for example, a wireless communication circuit that performs wireless communication, but may be a wired communication circuit that performs wired communication. Note that the communication standard for communication performed by the communication unit 31 is not particularly limited.
  • the control unit 32 performs various types of information processing based on the thermal image acquired from the sensor 20 and controls the operation of the air conditioner 30 . Specifically, the control unit 32 controls the louver 36, the compressor 37, and the fan 38 based on the thermal sensation of the person 1 estimated by the estimation unit 32e. For example, when it is estimated that the person 1 feels hot, the control unit 32 orients the louver 36 toward the person 1 and controls the compressor 37 and the fan 38 to generate cool air.
  • specifically, the control unit 32 includes: an acquisition unit 32a that acquires the thermal image received by the communication unit 31; a detection unit 32b that detects, in the thermal image, the contour of the person 1 including the skin 1a and the clothes 1b of the person 1; an identification unit 32c that identifies the human body region 100a (see (b) of FIG. 3), which is the area surrounded by the contour of the person 1; a calculation unit 32d that calculates the heat release amount of the person 1 based on the difference value between the human body temperature and the ambient temperature; an estimation unit 32e that estimates the thermal sensation of the person 1 based on the heat release amount; and an output unit 32f that outputs the estimation result of the thermal sensation of the person 1 or a control condition determined based on the estimation result.
  • the functions of the acquisition unit 32a, the detection unit 32b, the identification unit 32c, the calculation unit 32d, the estimation unit 32e, and the output unit 32f are realized by a processor or microcomputer constituting the control unit 32 executing a computer program stored in the storage unit 33. Details of the functions of these units are described later in the operation example.
  • the storage unit 33 is a storage device that stores a dedicated application program and the like for the control unit 32 to execute.
  • the storage unit 33 may store a learned machine learning model (hereinafter also referred to as a learned model) and the database 35 .
  • the machine learning model is learned by the learning unit 44 of the server device 40 .
  • the control unit 32 stores the learned machine learning model transmitted from the server device 40 in the storage unit 33 , thereby updating the learned machine learning model in the storage unit 33 .
  • the storage unit 33 is implemented by, for example, a semiconductor memory.
  • the machine learning model may have convolutional layers, for example a convolutional neural network (CNN), but is not limited to this. Also, for example, as shown in FIG. 1, the machine learning model may consist of a first machine learning model (e.g., the trained model 34a) and a second machine learning model (e.g., the trained model 34b).
  • the first machine learning model (for example, the trained model 34a) may receive as input a thermal image showing the temperature distribution of the air-conditioned space and output a super-resolution thermal image, and the second machine learning model (for example, the trained model 34b) may receive the super-resolution thermal image output from the first machine learning model and output the contour of a person present in the air-conditioned space. In this case, the storage unit 33 stores the two trained models 34a and 34b.
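  • A minimal sketch of how the two stored models could be chained at inference time, assuming each trained model is exposed as a callable (the type aliases and function name below are placeholders, not part of the patent):

```python
import numpy as np
from typing import Callable

# Placeholder types: in practice these would be trained networks (e.g. an
# SRGAN-style upscaler and a segmentation CNN) loaded from the storage unit 33.
SuperResolutionModel = Callable[[np.ndarray], np.ndarray]   # thermal image -> super-resolution image
ContourModel = Callable[[np.ndarray], np.ndarray]           # super-resolution image -> contour mask

def detect_contour(thermal_image: np.ndarray,
                   model_34a: SuperResolutionModel,
                   model_34b: ContourModel) -> np.ndarray:
    """Two-stage detection: super-resolve first, then extract the person's contour."""
    super_resolved = model_34a(thermal_image)   # first machine learning model
    contour_mask = model_34b(super_resolved)    # second machine learning model
    return contour_mask
```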
  • the database 35 may store, for example, the thermal sensation corresponding to a person's heat release amount during the cooling operation and during the heating operation, and the control conditions of the air conditioner 30 corresponding to each thermal sensation, in association with each other.
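  • The association held in the database 35 might look like the following small table; the operation modes follow the text above, but the numeric control values are invented for illustration only.

```python
# (operation mode, thermal sensation) -> control condition of the air conditioner 30.
# Set-point deltas and fan settings are placeholders, not taken from the patent.
CONTROL_CONDITIONS = {
    ("cooling", "hot"):  {"setpoint_delta_c": -1.0, "fan": "high"},
    ("cooling", "cold"): {"setpoint_delta_c": +1.0, "fan": "low"},
    ("heating", "hot"):  {"setpoint_delta_c": -1.0, "fan": "low"},
    ("heating", "cold"): {"setpoint_delta_c": +1.0, "fan": "high"},
}

def lookup_control_condition(mode: str, sensation: str) -> dict:
    """Return the stored control condition for the current mode and estimated sensation."""
    return CONTROL_CONDITIONS[(mode, sensation)]
```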
  • the server device 40 learns the machine learning model and updates the learned models 34a and 34b.
  • the server device 40 is a cloud computer provided outside the building having the air-conditioned space, but it may be an edge computer provided inside the building.
  • the server device 40 includes a communication section 41 , a control section 42 , a storage section 43 and a learning section 44 .
  • the communication unit 41 is a communication module (communication circuit) for the server device 40 to communicate with the air conditioner 30 .
  • the communication unit 41 transmits the learned models 45a and 45b to the air conditioner 30, for example.
  • the communication unit 41 is, for example, a communication circuit (communication module) for communicating via the wide area communication network 5, but may also include a communication circuit (communication module) for communicating via a local communication network (not shown).
  • the communication performed by the communication unit 41 may be wireless communication or wired communication.
  • the communication standard used for communication is also not particularly limited.
  • the control unit 42 performs various types of information processing in the server device 40 .
  • the controller 42 is specifically realized by a processor or a microcomputer.
  • the storage unit 43 is a storage device that stores the machine learning models (trained models 45a and 45b) trained by the learning unit 44, the teacher data 46 used for training, computer programs executed by the control unit 42, and the like. Specifically, the storage unit 43 is realized by a semiconductor memory, an HDD (Hard Disk Drive), or the like.
  • the learning unit 44 performs machine learning using the teacher data 46.
  • the learning unit 44 uses machine learning to generate a machine learning model (a so-called learned model) that outputs a contour of a person present in the air-conditioned space based on a thermal image representing the temperature distribution of the air-conditioned space.
  • the machine learning model, for example a CNN, may consist of a first machine learning model and a second machine learning model.
  • the learning unit 44 may use machine learning to generate a first machine learning model (for example, the trained model 45a) that receives a thermal image showing the temperature distribution of the air-conditioned space and outputs a super-resolution thermal image, and a second machine learning model (for example, the trained model 45b) that receives the super-resolution thermal image and outputs the contour of a person present in the air-conditioned space.
  • the trained models 45a, 45b include trained parameters adjusted by machine learning.
  • the generated trained models 45a and 45b are stored in the storage unit 43 and transmitted to the air conditioner 30 via the communication unit 41, for example.
  • the air conditioner 30 updates the learned model 34a in the storage unit 33 using the acquired learned model 45a, and updates the learned model 34b using the learned model 45b.
  • the learning unit 44 is realized by executing a program stored in the storage unit 43 by the processor, for example.
  • the teacher data 46 has a thermal image showing the temperature distribution of the air-conditioned space as input data, and the contours of people existing in the air-conditioned space as output data.
  • the teacher data 46 may be a single data set including pairs of a thermal image as input data and a person's contour as output data, or may be divided into a plurality of data sets (for example, first teacher data and second teacher data).
  • the first teacher data is teacher data used for training the first machine learning model (for example, the trained model 45a), and is a data set containing pairs of the thermal image as input data and a super-resolution thermal image obtained by super-resolving that thermal image as output data. The second teacher data is teacher data used for training the second machine learning model (for example, the trained model 45b), and is a data set containing pairs of the super-resolution thermal image as input data and the output data of the teacher data 46 (that is, the person's contour) as output data.
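  • A sketch of how the teacher data 46 could be split into the first and second teacher data, assuming a super-resolution routine `super_resolve` is available (a hypothetical helper, not an API defined in the patent):

```python
from typing import Callable, List, Tuple
import numpy as np

def build_teacher_data(teacher_data_46: List[Tuple[np.ndarray, np.ndarray]],
                       super_resolve: Callable[[np.ndarray], np.ndarray]) -> tuple:
    """teacher_data_46: list of (thermal_image, person_contour) pairs.

    Returns (first_teacher_data, second_teacher_data) where
      first:  (thermal_image, super_resolution_image)  -> trains model 45a
      second: (super_resolution_image, person_contour) -> trains model 45b
    """
    first_teacher_data = []
    second_teacher_data = []
    for thermal_image, person_contour in teacher_data_46:
        sr_image = super_resolve(thermal_image)
        first_teacher_data.append((thermal_image, sr_image))
        second_teacher_data.append((sr_image, person_contour))
    return first_teacher_data, second_teacher_data
```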
  • the air conditioner 30 may have the configuration of the server device 40 .
  • alternatively, the server device 40 may acquire the thermal image detected by the sensor 20, estimate the thermal sensation of the person 1, and output the estimation result to the air conditioner 30 for control.
  • FIG. 4 is a flow chart of an example operation of the control system 10 .
  • first, the control unit 32 of the air conditioner 30 acquires a thermal image 100 (for example, (b) of FIG. 3) showing the temperature distribution of the air-conditioned space (S11). More specifically, the acquisition unit 32a of the air conditioner 30 acquires the thermal image 100 from the sensor 20 via the communication unit 31. At this time, the acquisition unit 32a may store the acquired thermal image 100 in the storage unit 33.
  • next, the detection unit 32b detects the contour of the person 1 existing in the air-conditioned space based on the thermal image 100 acquired by the acquisition unit 32a in step S11 (S12). For example, the detection unit 32b groups a plurality of adjacent pixels having the same characteristics in at least a part of the thermal image 100 into one temperature distribution region, thereby dividing the thermal image 100 into two or more temperature distribution regions, and detects the contour of the person 1 including the clothing 1b of the person 1 (see FIG. 2) based on the two or more temperature distribution regions. For example, the detection unit 32b may perform segmentation on the thermal image 100 using a machine learning model to divide the thermal image 100 into an area where the person 1 appears and an area where the person 1 does not appear.
  • the machine learning model may have convolutional layers, such as, but not limited to, a convolutional neural network (CNN).
  • the machine learning model (for example, the learned model 34b in FIG. 1) may be applied to a super-resolution thermal image obtained by converting the thermal image 100 into super-resolution.
  • the detection unit 32b may generate a super-resolution thermal image having a higher resolution than the thermal image 100 by interpolating at least one pixel between adjacent pixels of the thermal image 100, and detect the contour of the person 1 based on the obtained super-resolution thermal image. For example, the detection unit 32b may super-resolve the thermal image 100 by a method that takes the average of the pixel values (that is, temperature values) of mutually adjacent pixels in the thermal image 100 and inserts a new pixel whose pixel value corresponds to that average.
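  • A minimal sketch of such averaging-based interpolation, roughly doubling the resolution by inserting the mean of neighboring pixels; the patent does not fix a specific interpolation formula, so the scheme below is only one plausible reading.

```python
import numpy as np

def super_resolve_by_averaging(thermal_image: np.ndarray) -> np.ndarray:
    """Insert pixels whose values are the mean of adjacent pixels.

    An 8x8 input becomes a 15x15 output: original pixels keep their values,
    and every inserted pixel is the average of its original neighbors.
    """
    h, w = thermal_image.shape
    out = np.zeros((2 * h - 1, 2 * w - 1), dtype=float)
    out[::2, ::2] = thermal_image                                             # original pixels
    out[::2, 1::2] = (thermal_image[:, :-1] + thermal_image[:, 1:]) / 2.0     # between horizontal neighbors
    out[1::2, ::2] = (thermal_image[:-1, :] + thermal_image[1:, :]) / 2.0     # between vertical neighbors
    out[1::2, 1::2] = (thermal_image[:-1, :-1] + thermal_image[:-1, 1:]
                       + thermal_image[1:, :-1] + thermal_image[1:, 1:]) / 4.0  # between diagonal neighbors
    return out
```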
  • the detection unit 32b may super-resolve the thermal image 100 using a learned machine learning model (for example, the learned model 34a in FIG. 1).
  • the trained model 34a may be, for example, an SRGAN (Super-Resolution Generative Adversarial Network), but is not limited to this.
  • the detection unit 32b may detect the contour of the person 1 from the thermal image 100 using, for example, the trained machine learning models (for example, the trained models 34a and 34b in FIG. 1) stored in the storage unit 33.
  • next, the identification unit 32c identifies, in the thermal image 100, the human body region 100a (see (b) of FIG. 3), which is the region surrounded by the contour of the person 1 detected by the detection unit 32b in step S12 (S13).
  • the calculation unit 32d calculates the human body temperature including the temperature of the clothing 1b of the person 1 based on the temperature distribution within the human body region 100a specified by the specifying unit 32c in step S13 (S14). Specifically, in step S14, the calculation unit 32d calculates an average value of the surface temperature of the skin 1a (see FIG. 2) of the person 1 and the surface temperature of the clothes 1b of the person 1 as the human body temperature. For example, the calculator 32d may calculate an average value of pixel values (ie, temperature values) of the human body region 100a as the human body temperature.
  • the calculator 32d calculates the ambient temperature of the person 1 based on the temperature distribution in the area other than the human body area 100a (so-called surrounding area 100b) (S15). For example, in step S15, the calculation unit 32d may calculate the average value (that is, the temperature value) of the pixel values of the surrounding area 100b as the ambient temperature. In addition, step S14 and step S15 may be performed in parallel.
  • the calculation unit 32d then calculates the amount of heat released by the person 1 based on the difference between the human body temperature and the ambient temperature (S16). Specifically, the calculation unit 32d calculates the difference value between the human body temperature calculated in step S14 and the ambient temperature calculated in step S15, and calculates the heat dissipation amount of the person 1 based on the calculated difference value.
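  • Steps S14 to S16 can be sketched as follows. The proportionality constant used to turn the temperature difference into a heat dissipation amount is an assumption for illustration, since the text only states that the amount is calculated from the difference value.

```python
import numpy as np

def calculate_heat_dissipation(thermal_image: np.ndarray,
                               human_body_mask: np.ndarray,
                               heat_transfer_coefficient: float = 1.0) -> tuple:
    """Return (human_body_temperature, ambient_temperature, heat_dissipation).

    human_body_mask: boolean array, True inside the human body region 100a.
    heat_dissipation is modeled here simply as coefficient * (body - ambient);
    the coefficient value is a placeholder, not a figure from the patent.
    """
    human_body_temperature = thermal_image[human_body_mask].mean()      # S14
    ambient_temperature = thermal_image[~human_body_mask].mean()        # S15
    difference = human_body_temperature - ambient_temperature
    heat_dissipation = heat_transfer_coefficient * difference           # S16
    return human_body_temperature, ambient_temperature, heat_dissipation
```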
  • the estimation unit 32e estimates the thermal sensation of the person 1 based on the amount of heat released by the person 1 calculated by the calculation unit 32d in step S16 (S17). For example, the estimation unit 32e estimates that the person 1 feels cold when the amount of heat dissipation is greater than the threshold, and estimates that the person 1 feels hot when the amount of heat dissipation is less than the threshold.
  • the threshold may include a threshold for cooling operation and a threshold for heating operation.
  • the thermal sensation of the person 1 corresponds to the amount of heat released by the person 1, but the thermal sensation of the person 1 may differ depending on the season even if the amount of heat released is the same. Therefore, by estimating the thermal sensation of the person 1 using the threshold for the cooling operation and the threshold for the heating operation, the thermal sensation of the person 1 can be estimated more accurately.
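  • Step S17 with separate thresholds for the cooling and heating operations could look like the sketch below; the threshold values are invented placeholders.

```python
def estimate_thermal_sensation(heat_dissipation: float,
                               operation_mode: str,
                               cooling_threshold: float = 6.0,
                               heating_threshold: float = 4.0) -> str:
    """Return "cold" or "hot" from the heat dissipation amount (S17).

    A dissipation larger than the threshold is interpreted as the person
    feeling cold; a smaller one as the person feeling hot. The two
    threshold values are placeholders, not figures from the patent.
    """
    threshold = cooling_threshold if operation_mode == "cooling" else heating_threshold
    return "cold" if heat_dissipation > threshold else "hot"
```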
  • the control unit 32 controls the air conditioner 30 based on the thermal sensation of the person 1 estimated by the estimation unit 32e in step S17 (S18). For example, when the estimation unit 32e estimates in step S17 that the person 1 feels cold, the control unit 32 controls the air conditioner 30 so as to raise the ambient temperature, and when the estimation unit 32e estimates that the person 1 feels hot, the control unit 32 controls the air conditioner 30 so as to lower the ambient temperature. The control unit 32 may determine the control conditions for the cooling or heating operation corresponding to the thermal sensation of the person 1 based on the database 35, and control the louver 36, the compressor 37, and the fan 38 based on the determined control conditions. Since the database 35 has already been described, its description is omitted here.
  • the control unit 32 may control the air conditioner 30 based on the position of the human body region 100a and the thermal sensation of the person 1.
  • in this case, the calculation unit 32d calculates the position of the person 1 in the air-conditioned space based on the position of the human body region 100a in the thermal image 100. For example, the storage unit 33 stores table information indicating the correspondence between pixel positions in the thermal image and coordinates in the air-conditioned space, and the calculation unit 32d may apply this table information to the pixel position at which the person 1 is detected to calculate the position (specifically, the coordinates) of the person 1 in the air-conditioned space. The coordinates of the person 1 calculated by the calculation unit 32d are, for example, two-dimensional coordinates when the air-conditioned space is viewed from above.
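  • The table information mapping pixel positions to room coordinates could be as simple as a dictionary keyed by (row, column); the coordinate values and pixel pitch below are hypothetical.

```python
# Hypothetical correspondence between thermal-image pixel positions and
# two-dimensional coordinates (in metres) of the air-conditioned space seen from above.
PIXEL_TO_COORDINATES = {
    (row, col): (0.5 * col, 0.5 * row)   # e.g. each pixel covers a 0.5 m x 0.5 m patch (assumed)
    for row in range(8) for col in range(8)
}

def position_of_person(human_body_pixels) -> tuple:
    """Average the coordinates of the pixels in the human body region 100a."""
    coords = [PIXEL_TO_COORDINATES[p] for p in human_body_pixels]
    xs = [c[0] for c in coords]
    ys = [c[1] for c in coords]
    return sum(xs) / len(xs), sum(ys) / len(ys)
```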
  • control system 10 can estimate the thermal sensation of the person 1 present in the air-conditioned space and control the air conditioner 30 based on the estimated thermal sensation. Therefore, the control system 10 can estimate the thermal sensation of the person 1 and achieve a comfortable environmental temperature.
  • the control unit 32 controls the air conditioner 30 based on the estimation result (the thermal sensation of the person 1) estimated by the estimation unit 32e, but may output the estimation result.
  • the output unit 32f may output the result estimated by the estimation unit 32e to an information terminal (not shown) possessed by the person 1.
  • as a result, the person 1 can confirm the estimation result displayed on the display unit of the information terminal. Further, the person 1 can input the actual thermal sensation to the information terminal to correct the estimation result. Note that both the estimation result and the corrected estimation result are stored in the storage unit 33.
  • the detection unit 32b may perform information processing on the thermal image based on a rule-based algorithm that does not use a machine learning model. For example, the detection unit 32b may perform processing for detecting a pixel having a maximum pixel value among a plurality of pixels included in the thermal image.
  • here, a pixel having a maximum pixel value means a pixel whose pixel value is a local maximum in the two-dimensional arrangement of pixels; in other words, a pixel having a higher pixel value than its surrounding pixels when pixel values at the same time are compared.
  • when the detection unit 32b detects a pixel that has a maximum pixel value and a pixel value equal to or higher than a predetermined value (for example, 30°C or higher), the detection unit 32b may determine that the person 1 is present in the air-conditioned space and detect the contour of the person 1 based on the detected pixel.
  • the calculation unit 32d can also calculate the position (that is, the coordinates) of the person 1 in the air-conditioned space by applying the above-described table information to the position of a pixel that has a maximum pixel value and a pixel value equal to or higher than the predetermined value (for example, 30°C or higher).
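  • A rule-based sketch of this detection: find pixels that are local temperature maxima and at or above the predetermined value (30 °C is the example given above; the neighborhood definition is an assumption).

```python
import numpy as np

def detect_person_pixels(thermal_image: np.ndarray, minimum_temperature: float = 30.0) -> list:
    """Return (row, col) positions of pixels that are local temperature maxima
    and at least `minimum_temperature` degrees Celsius."""
    h, w = thermal_image.shape
    candidates = []
    for y in range(h):
        for x in range(w):
            value = thermal_image[y, x]
            if value < minimum_temperature:
                continue
            neighbours = [thermal_image[ny, nx]
                          for ny in range(max(0, y - 1), min(h, y + 2))
                          for nx in range(max(0, x - 1), min(w, x + 2))
                          if (ny, nx) != (y, x)]
            if all(value >= nv for nv in neighbours):   # at least as warm as every neighbour
                candidates.append((y, x))
    return candidates
```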
  • another example of information processing based on a rule-based algorithm is processing that detects temporal changes in the pixel values (temperatures) of the plurality of pixels included in the thermal image. Assuming that there is no heat source other than a person in the air-conditioned space, the pixel values (temperatures) of the plurality of pixels included in the thermal image change slowly over time while no person is present. In this state, when the person 1 enters the air-conditioned space, the pixel values of the pixels in the portion of the thermal image where the person 1 appears change (rise) abruptly.
  • therefore, the detection unit 32b may perform processing that monitors the temporal change in the pixel value of each of the plurality of pixels and, when a pixel value rises sharply, estimates that the person 1 exists in the air-conditioned space and detects the contour of the person 1.
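  • The temporal-change rule can be sketched as a frame-to-frame comparison; the rise threshold of 2 °C per frame is an assumed value, not one given in the patent.

```python
import numpy as np

def detect_sudden_rise(previous_frame: np.ndarray,
                       current_frame: np.ndarray,
                       rise_threshold: float = 2.0) -> np.ndarray:
    """Return a boolean mask of pixels whose temperature rose sharply between
    two consecutive thermal images; such pixels suggest that a person has
    entered the sensing range."""
    return (current_frame - previous_frame) >= rise_threshold
```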
  • as described above, the control method according to the embodiment is a control method for the air conditioner 30 executed by a computer such as the control system 10. The computer acquires a thermal image 100 showing the temperature distribution of the air-conditioned space detected by the sensor 20 installed on the ceiling of the air-conditioned space (step S11 in FIG. 4), detects the contour of the person 1 existing in the air-conditioned space based on the acquired thermal image 100 (step S12), identifies in the thermal image 100 the human body region 100a that is the region surrounded by the detected contour of the person 1 (step S13), calculates the human body temperature including the temperature of the clothes 1b of the person 1 based on the temperature distribution in the identified human body region 100a (step S14), calculates the ambient temperature of the region other than the human body region 100a in the thermal image 100 (the surrounding region 100b) (step S15), calculates the heat dissipation amount of the person 1 based on the difference value between the human body temperature and the ambient temperature (step S16), estimates the thermal sensation of the person 1 based on the calculated heat dissipation amount (step S17), and controls the air conditioner 30 based on the estimated thermal sensation (step S18).
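  • Tying the sketches above together, one pass of steps S11 to S18 might read as follows, reusing the hypothetical helpers introduced earlier; the sensor and air-conditioner interfaces are likewise placeholders, and the contour mask is assumed to be returned at the resolution of the input thermal image.

```python
def control_cycle(sensor, air_conditioner, model_34a, model_34b, operation_mode: str) -> None:
    """One pass of steps S11-S18 using the helper sketches defined above."""
    thermal_image = sensor.read_thermal_image()                             # S11 (placeholder API)
    human_body_mask = detect_contour(thermal_image, model_34a, model_34b)   # S12, S13
    _, _, heat_dissipation = calculate_heat_dissipation(thermal_image,
                                                        human_body_mask)    # S14-S16
    sensation = estimate_thermal_sensation(heat_dissipation, operation_mode)  # S17
    condition = lookup_control_condition(operation_mode, sensation)
    air_conditioner.apply(condition)                                         # S18 (placeholder API)
```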
  • Such a control method can realize an environmental temperature that the person 1 feels comfortable by estimating the thermal sensation of the person 1 existing in the air-conditioned space.
  • for example, the computer estimates that the person 1 feels cold when the heat release amount is greater than the threshold and controls the air conditioner 30 to raise the ambient temperature based on the estimated thermal sensation, and estimates that the person 1 feels hot when the heat release amount is smaller than the threshold and controls the air conditioner 30 to lower the ambient temperature based on the estimated thermal sensation.
  • Such a control method can estimate the thermal sensation of person 1 from the amount of heat released by person 1 based on the threshold.
  • in detecting the contour of the person 1, the computer groups a plurality of adjacent pixels having the same characteristics in at least a part of the thermal image 100 into one temperature distribution region, thereby dividing the thermal image 100 into two or more temperature distribution regions (for example, the human body region 100a and the surrounding region 100b), and detects the contour of the person 1 including the clothing 1b of the person 1 based on the two or more divided temperature distribution regions.
  • such a control method divides the thermal image into a region corresponding to the person 1 (the human body region 100a) and a region corresponding to everything other than the person 1 (the surrounding region 100b). Thereby, the control method can detect the contour of the person 1 based on the pixel values of the plurality of pixels included in the region corresponding to the person 1.
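  • One way to realize the grouping of adjacent pixels with similar temperatures into temperature distribution regions is a simple region-growing pass; the patent does not specify the criterion for "same characteristics", so the temperature tolerance below is an assumption.

```python
from collections import deque
import numpy as np

def group_temperature_regions(thermal_image: np.ndarray, tolerance: float = 1.0) -> np.ndarray:
    """Label connected pixels whose temperatures differ by at most `tolerance`.

    Returns an integer label image the same shape as the thermal image;
    pixels sharing a label belong to one temperature distribution region.
    """
    h, w = thermal_image.shape
    labels = np.full((h, w), -1, dtype=int)
    current = 0
    for sy in range(h):
        for sx in range(w):
            if labels[sy, sx] != -1:
                continue
            labels[sy, sx] = current
            queue = deque([(sy, sx)])
            while queue:
                y, x = queue.popleft()
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if (0 <= ny < h and 0 <= nx < w and labels[ny, nx] == -1
                            and abs(thermal_image[ny, nx] - thermal_image[y, x]) <= tolerance):
                        labels[ny, nx] = current
                        queue.append((ny, nx))
            current += 1
    return labels
```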
  • the computer interpolates at least one pixel between adjacent pixels in a plurality of pixels in the thermal image 100 to generate a super-resolution thermal image having a higher resolution than the thermal image 100. Then, the contour of the person 1 is detected based on the super-resolution thermal image.
  • the computer detects the contour of the person 1 from the thermal image 100 using a learned machine learning model (for example, the learned models 34a and 34b).
  • Such a control method can quickly and accurately detect the contour of the person 1 from the thermal image 100 by using a machine learning model.
  • the machine learning model is learned using teacher data 46, and the teacher data 46 has a thermal image as input data and a human contour as output data.
  • Such a control method can detect the contour of the person 1 by inputting the thermal image 100 into a trained machine learning model.
  • the machine learning model is composed of a first machine learning model (for example, the trained model 34a) and a second machine learning model (for example, the trained model 34b). The first machine learning model is trained using first teacher data including pairs of the input data of the teacher data 46 as input data and a super-resolution thermal image of that input data as output data, and the second machine learning model is trained using second teacher data including pairs of the super-resolution thermal image as input data and the output data of the teacher data 46 as output data.
  • such a control method can detect the contour of the person 1 from the thermal image 100 using the first machine learning model (for example, the trained model 34a), which receives the thermal image 100 as input and outputs a super-resolution thermal image of the thermal image 100, and the second machine learning model (for example, the trained model 34b), which receives the super-resolution thermal image as input and outputs a person's contour. Therefore, since the control method can detect the contour of the person 1 from a super-resolution thermal image with finer resolution than the thermal image 100, the contour of the person 1 can be detected with higher accuracy.
  • the program is a program for causing the computer to execute any of the above control methods.
  • Such a program can realize an environmental temperature that the person 1 feels comfortable by estimating the thermal sensation of the person 1 existing in the air-conditioned space.
  • the control system 10 also includes: the acquisition unit 32a that acquires the thermal image 100 showing the temperature distribution of the air-conditioned space detected by the sensor 20 installed on the ceiling of the air-conditioned space; the detection unit 32b that detects the contour of the person 1 existing in the air-conditioned space based on the acquired thermal image 100; the identification unit 32c that identifies, in the thermal image 100, the human body region 100a surrounded by the detected contour of the person 1; the calculation unit 32d that calculates the human body temperature including the temperature of the clothes 1b of the person 1 based on the temperature distribution in the identified human body region 100a, calculates the ambient temperature of the region other than the human body region 100a in the thermal image 100 (the surrounding region 100b), and calculates the heat dissipation amount of the person 1 based on the difference value between the human body temperature and the ambient temperature; the estimation unit 32e that estimates the thermal sensation of the person 1 based on the calculated heat dissipation amount; and the control unit 32 that controls the air conditioner 30 installed in the air-conditioned space based on the estimated thermal sensation.
  • Such a control system 10 can realize an environmental temperature that the person 1 feels comfortable by estimating the thermal sensation of the person 1 existing in the air-conditioned space.
  • control system was implemented by a plurality of devices, but it may be implemented as a single device.
  • control system may be implemented as a single device that corresponds to the server device.
  • each component included in the control system may be distributed to the plurality of devices in any way.
  • processing executed by a specific processing unit may be executed by another processing unit.
  • order of multiple processes may be changed, and multiple processes may be executed in parallel.
  • each component may be realized by executing a software program suitable for each component.
  • Each component may be realized by reading and executing a software program recorded in a recording medium such as a hard disk or a semiconductor memory by a program execution unit such as a CPU or processor.
  • each component may be realized by hardware.
  • each component may be a circuit (or integrated circuit). These circuits may form one circuit as a whole, or may be separate circuits. These circuits may be general-purpose circuits or dedicated circuits.
  • the present invention may be implemented in a system, apparatus, method, integrated circuit, computer program, or recording medium such as a computer-readable CD-ROM.
  • any combination of systems, devices, methods, integrated circuits, computer programs and recording media may be implemented.
  • the present invention may be implemented as a control method executed by a computer such as the control system.
  • the present invention may also be implemented as a program for causing a computer to execute the control method, or as a computer-readable non-transitory recording medium storing such a program.
  • Invention 1: A control method for an air conditioner executed by a computer, wherein the computer: acquires a thermal image showing the temperature distribution of an air-conditioned space detected by a sensor installed on the ceiling of the air-conditioned space; detects the contour of a person existing in the air-conditioned space based on the acquired thermal image; identifies, in the thermal image, a human body region that is the region surrounded by the detected contour of the person; calculates a human body temperature including the temperature of the person's clothes based on the temperature distribution in the identified human body region; calculates the ambient temperature of the region other than the human body region in the thermal image; calculates the heat radiation amount of the person based on the difference value between the human body temperature and the ambient temperature; estimates the thermal sensation of the person based on the calculated heat radiation amount; and controls the air conditioner based on the estimated thermal sensation.
  • Invention 2: The computer estimates that the person feels cold when the heat release amount is greater than a threshold and controls the air conditioner to raise the ambient temperature based on the estimated thermal sensation, and estimates that the person feels hot when the heat release amount is smaller than the threshold and controls the air conditioner to lower the ambient temperature based on the estimated thermal sensation.
  • Such a control method can estimate a person's thermal sensation from the amount of heat released by the person based on the threshold value.
  • Invention 3: The control method according to Invention 1 or 2, wherein, in detecting the contour of the person, the computer groups a plurality of adjacent pixels having the same characteristics in at least a part of the thermal image into one temperature distribution region, thereby dividing the thermal image into two or more temperature distribution regions, and detects the contour of the person including the clothing of the person based on the two or more divided temperature distribution regions.
  • Such a control method divides the thermal image into a region corresponding to the person (the so-called human body region) and a region other than the person (the so-called surrounding region) by grouping a plurality of adjacent pixels having the same characteristics in at least a part of the thermal image into one temperature distribution region. Thereby, the control method can detect the contour of the person based on the pixel values of the plurality of pixels included in the region corresponding to the person.
  • Invention 4: The control method according to any one of Inventions 1 to 3, wherein the computer generates a super-resolution thermal image having a higher resolution than the thermal image by interpolating at least one pixel between adjacent pixels of the thermal image, and detects the contour of the person based on the super-resolution thermal image.
  • Invention 5: The computer detects the contour of the person from the thermal image using a trained machine learning model.
  • Such a control method can quickly and accurately detect the contour of a person from a thermal image by using a machine learning model.
  • Invention 6: The control method according to Invention 5, wherein the machine learning model is trained using teacher data, and the teacher data has the thermal image as input data and the contour of the person as output data.
  • Such a control method can detect the contour of a person by inputting a thermal image into a trained machine learning model.
  • Invention 7: The machine learning model comprises a first machine learning model and a second machine learning model; the first machine learning model is trained using first teacher data including pairs of the input data of the teacher data and a super-resolution thermal image of that input data, and the second machine learning model is trained using second teacher data including pairs of the super-resolution thermal image and the output data of the teacher data.
  • Such a control method can detect the contour of the person from the thermal image using a first machine learning model that takes the thermal image as input and outputs a super-resolution thermal image of that thermal image, and a second machine learning model that takes the super-resolution thermal image as input and outputs the person's contour.
  • Invention 8: A program for causing a computer to execute the control method according to any one of Inventions 1 to 7.
  • 10 control system, 20 sensor, 30 air conditioner, 32 control unit, 32a acquisition unit, 32b detection unit, 32c identification unit, 32d calculation unit, 32e estimation unit, 34a trained model, 34b trained model, 45a trained model, 45b trained model, 46 teacher data, 100 thermal image, 100a human body region, 100b surrounding region

Abstract

A computer for, inter alia, a control system (10) for executing a control method performs the following: acquiring (S11) a thermal image (100) showing a temperature distribution of a to-be-air conditioned space detected by a sensor (20) installed on the ceiling of the to-be-air conditioned space; detecting (S12) the outline of a person (1) present in the to-be-air conditioned space on the basis of the thermal image (100); identifying (S13) a human body region(100a), which is a region surrounded by the outline of the person (1), in the thermal image (100); calculating (S14) a human body temperature including the temperature of clothing (1b) of the person (1) on the basis of the temperature distribution inside the human body region (100a); calculating (S15) an ambient temperature of a region (peripheral region (100b)) other than the human body region (100a) in the thermal image (100); calculating (S16) a heat dissipation amount of the person (1) on the basis of a differential value between the human body temperature and the ambient temperature; estimating (S17) a warm/cold sensation of the person (1) on the basis of the heat dissipation amount; and controlling (S18) an air conditioner (30) on the basis of the warm/cold sensation.

Description

Control method, program, and control system
 The present invention relates to an air conditioner control method, program, and control system.
 A technique is known that measures the temperature difference between the temperature of an occupant's nose and the area around the cheeks and/or forehead, determines the occupant's thermal state such as sensation and comfort (hereinafter, thermal sensation) based on the temperature difference, and controls an air conditioner mounted on a vehicle so that the occupant feels comfortable (see, for example, Patent Literature 1).
Patent Literature 1: JP 2017-197195 A
 However, the technique described in Patent Literature 1 estimates a person's thermal sensation only from the temperature around the person's face, and it is difficult to say that it realizes a comfortable environmental temperature by estimating the person's thermal sensation.
 Therefore, the present invention provides an air conditioner control method, program, and control system that estimate a person's thermal sensation and realize a comfortable environmental temperature.
 A control method according to one aspect of the present invention is an air conditioner control method executed by a computer. The computer acquires a thermal image showing the temperature distribution of an air-conditioned space detected by a sensor installed on the ceiling of the air-conditioned space, detects the contour of a person present in the air-conditioned space based on the acquired thermal image, identifies in the thermal image a human body region that is the area surrounded by the detected contour of the person, calculates a human body temperature including the temperature of the person's clothing based on the temperature distribution in the identified human body region, calculates the ambient temperature of the region other than the human body region in the thermal image, calculates the person's heat dissipation amount based on the difference value between the human body temperature and the ambient temperature, estimates the person's thermal sensation based on the calculated heat dissipation amount, and controls the air conditioner based on the estimated thermal sensation.
 A program according to one aspect of the present invention is a program for causing a computer to execute the control method.
 A control system according to one aspect of the present invention includes: an acquisition unit that acquires a thermal image showing the temperature distribution of an air-conditioned space detected by a sensor installed on the ceiling of the air-conditioned space; a detection unit that detects the contour of a person present in the air-conditioned space based on the acquired thermal image; an identification unit that identifies, in the thermal image, a human body region that is the area surrounded by the detected contour of the person; a calculation unit that calculates a human body temperature including the temperature of the person's clothing based on the temperature distribution in the human body region, calculates the ambient temperature of the region other than the human body region in the thermal image, and calculates the person's heat dissipation amount based on the difference value between the human body temperature and the ambient temperature; an estimation unit that estimates the person's thermal sensation based on the calculated heat dissipation amount; and a control unit that controls the air conditioner installed in the air-conditioned space based on the estimated thermal sensation.
 According to the present invention, an air conditioner control method and program that can estimate a person's thermal sensation and realize a comfortable environmental temperature are realized.
FIG. 1 is a block diagram showing an example of the functional configuration of the control system according to the embodiment. FIG. 2 is a diagram showing an air-conditioned space to which the control system according to the embodiment is applied. FIG. 3 is a diagram for explaining a thermal image. FIG. 4 is a flowchart of an operation example of the control system according to the embodiment.
 Hereinafter, embodiments will be described with reference to the drawings. The embodiments described below each show a comprehensive or specific example. The numerical values, shapes, materials, components, arrangement positions and connection forms of the components, steps, order of steps, and the like shown in the following embodiments are examples and are not intended to limit the present invention. Further, among the components in the following embodiments, components not described in the independent claims are described as optional components.
 Each figure is a schematic diagram and is not necessarily strictly illustrated. In each figure, substantially the same configurations are given the same reference signs, and overlapping descriptions may be omitted or simplified.
 (Embodiment)
 [Configuration]
 First, the configuration of the control system according to the embodiment will be described. FIG. 1 is a block diagram showing an example of the functional configuration of the control system according to the embodiment. FIG. 2 is a diagram showing an air-conditioned space to which the control system according to the embodiment is applied.
 The control system 10 extracts the contour of the person 1 existing in the air-conditioned space based on a thermal image showing the temperature distribution of the air-conditioned space, identifies the human body region surrounded by the extracted contour of the person 1, calculates the temperature difference value between the identified human body region and the other regions, and calculates the heat release amount of the person 1 based on the calculated difference value. Then, the control system 10 estimates the thermal sensation of the person 1 based on the calculated heat release amount of the person 1, and controls the air conditioner 30 based on the estimated thermal sensation. The air-conditioned space is, for example, an indoor space in a building. The building may be, for example, an office, a commercial facility, a public facility, an educational facility, or a residence. As shown in FIG. 1, the control system 10 includes a sensor 20, an air conditioner 30, and a server device 40. Note that the control system 10 may include a plurality of sensors 20.
 [Sensor]
 The sensor 20 is installed, for example, on the ceiling of the air-conditioned space, and generates a thermal image showing the temperature distribution when the air-conditioned space is viewed from the ceiling side (in other words, from above).
 For example, as shown in FIG. 2, the sensor 20 may be installed directly on the ceiling of the air-conditioned space, or may be detachably connected to a power supply terminal of a lighting device (not shown) or a fire alarm (not shown) installed on the ceiling. In the latter case, the sensor 20 operates on power supplied from the lighting device or the fire alarm. The power supply terminal is, for example, a USB (Universal Serial Bus) terminal.
 The sensor 20 is, for example, an infrared sensor. The infrared sensor is, for example, an infrared array sensor (in other words, a thermal image sensor) composed of an array of 8 × 8 infrared detection elements. In other words, the thermal image generated by the infrared sensor has 8 × 8 pixels. The thermal image shows the temperature distribution in the sensing range of the infrared sensor with an 8 × 8 resolution. FIG. 3 is a diagram for explaining a thermal image. (a) of FIG. 3 schematically shows an image, captured by a camera, of the air-conditioned space detected by the sensor 20 (more specifically, the sensing range of the sensor 20 in the air-conditioned space). (b) of FIG. 3 schematically shows a thermal image detected by the sensor 20. Each of the 8 × 8 small regions in (a) of FIG. 3 represents a pixel included in the image, and each of the 8 × 8 small regions in (b) of FIG. 3 represents a pixel included in the thermal image 100. The numerical values in the pixels shown in (b) of FIG. 3 are pixel values, and specifically indicate temperatures. The area surrounded by the thick line in (b) of FIG. 3 is the human body region 100a, and the area other than the human body region 100a in the thermal image 100 is the surrounding region 100b. In the following embodiment, for simplicity of explanation, pixel value = temperature value.
 Note that the sensor 20 may instead be installed on a wall or the like to generate a thermal image showing the temperature distribution of the air-conditioned space as viewed from the wall side (in other words, from the side). In this case, the control system 10 may acquire a plurality of thermal images showing the temperature distribution of the air-conditioned space as viewed from above and from the side.
 [Air conditioner]
 The air conditioner 30 controls the temperature around the person 1 in the air-conditioned space (also referred to as the environmental temperature) based on the thermal sensation of the person 1 present in the air-conditioned space. As shown in FIG. 1, the air conditioner 30 includes, for example, a communication unit 31, a control unit 32, a storage unit 33, a louver 36, a compressor 37, and a fan 38.
 The communication unit 31 is a communication circuit (communication module) that allows the air conditioner 30 to communicate with the sensor 20 and the server device 40. The communication unit 31 may include a communication circuit (communication module) for communicating via the wide area communication network 5 and a communication circuit (communication module) for communicating via a local communication network (not shown). The communication unit 31 is, for example, a wireless communication circuit that performs wireless communication, but it may be a wired communication circuit that performs wired communication. The communication standard used by the communication unit 31 is not particularly limited.
 The control unit 32 performs various kinds of information processing based on the thermal image acquired from the sensor 20 and controls the operation of the air conditioner 30. Specifically, the control unit 32 controls the louver 36, the compressor 37, and the fan 38 based on the thermal sensation of the person 1 estimated by the estimation unit 32e. For example, when it is estimated that the person 1 feels hot, the control unit 32 orients the louver 36 toward the person 1 and operates the compressor 37 and the fan 38 to generate cool air.
 Specifically, the control unit 32 includes an acquisition unit 32a that acquires the thermal image received by the communication unit 31, a detection unit 32b that detects the contour of the person 1 in the thermal image, including the skin 1a and the clothing 1b of the person 1, an identification unit 32c that identifies the human body region 100a (see FIG. 3(b)), which is the region surrounded by the contour of the person 1, a calculation unit 32d that calculates the heat release amount of the person 1 based on the difference value between the human body temperature and the ambient temperature, an estimation unit 32e that estimates the thermal sensation of the person 1 based on the heat release amount, and an output unit 32f that outputs the estimation result of the thermal sensation of the person 1 or a control condition determined based on the estimation result. The functions of the acquisition unit 32a, the detection unit 32b, the identification unit 32c, the calculation unit 32d, the estimation unit 32e, and the output unit 32f are realized by a processor or microcomputer constituting the control unit 32 executing a computer program stored in the storage unit 33. Details of these functions are described later in the operation example.
 The storage unit 33 is a storage device that stores, among other things, a dedicated application program to be executed by the control unit 32. The storage unit 33 may also store trained machine learning models (hereinafter also referred to as learned models) and a database 35. The machine learning models are trained by the learning unit 44 of the server device 40. The control unit 32 stores a trained machine learning model transmitted from the server device 40 in the storage unit 33, thereby updating the trained machine learning model held in the storage unit 33. The storage unit 33 is realized by, for example, a semiconductor memory.
 The machine learning model only needs to include a convolutional layer and may be, for example, a convolutional neural network (CNN), but is not limited to this. Moreover, as shown in FIG. 1, for example, the machine learning model may be composed of a first machine learning model (for example, the learned model 34a) and a second machine learning model (for example, the learned model 34b). In this case, the first machine learning model (for example, the learned model 34a) receives a thermal image showing the temperature distribution of the air-conditioned space and outputs a super-resolution thermal image, and the second machine learning model (for example, the learned model 34b) receives the super-resolution thermal image output from the first machine learning model and outputs the contour of a person present in the air-conditioned space. In the example of FIG. 1, the storage unit 33 stores the two learned models 34a and 34b, but a single learned model in which the learned models 34a and 34b are concatenated may be stored in the storage unit 33 instead.
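 As a hedged illustration of this two-stage arrangement, the sketch below chains a small super-resolution network and a small segmentation network in PyTorch. The class names SuperResolutionNet and SegmentationNet, the layer sizes, and the ×4 upscaling factor are all assumptions; the description only requires that the models contain convolutional layers and that the first outputs a super-resolution thermal image while the second outputs the person contour.

    import torch
    import torch.nn as nn

    class SuperResolutionNet(nn.Module):      # stand-in for learned model 34a
        def __init__(self, scale=4):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
                nn.Upsample(scale_factor=scale, mode="bilinear", align_corners=False),
                nn.Conv2d(16, 1, 3, padding=1),
            )

        def forward(self, x):
            return self.net(x)

    class SegmentationNet(nn.Module):         # stand-in for learned model 34b
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
                nn.Conv2d(16, 1, 3, padding=1), nn.Sigmoid(),  # per-pixel person probability
            )

        def forward(self, x):
            return self.net(x)

    sr_model, seg_model = SuperResolutionNet(), SegmentationNet()
    thermal = torch.rand(1, 1, 8, 8) * 10 + 24        # dummy 8x8 thermal image (deg C)
    person_mask = seg_model(sr_model(thermal)) > 0.5  # region where the person is estimated to be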
 The database 35 may store, for example, thermal sensations corresponding to the heat release amount of a person during cooling operation and during heating operation in association with the control conditions of the air conditioner 30 corresponding to those thermal sensations.
 [Server device]
 The server device 40 trains the machine learning models and updates the learned models 34a and 34b. In the example of FIG. 1, the server device 40 is a cloud computer provided outside the building containing the air-conditioned space, but it may instead be an edge computer provided inside the building. The server device 40 includes a communication unit 41, a control unit 42, a storage unit 43, and a learning unit 44.
 The communication unit 41 is a communication module (communication circuit) that allows the server device 40 to communicate with the air conditioner 30. The communication unit 41 transmits, for example, the learned models 45a and 45b to the air conditioner 30. In the example of FIG. 1, the communication unit 41 is a communication circuit (communication module) for communicating via the wide area communication network 5, but it may also include a communication circuit (communication module) for communicating via a local communication network (not shown). The communication performed by the communication unit 41 may be wireless or wired, and the communication standard used is not particularly limited.
 The control unit 42 performs various kinds of information processing in the server device 40. The control unit 42 is specifically realized by a processor or a microcomputer.
 The storage unit 43 is a storage device that stores the machine learning models trained by the learning unit 44 (the learned models 45a and 45b), the teacher data 46 used for training, the computer programs executed by the control unit 42, and the like. The storage unit 43 is specifically realized by a semiconductor memory, an HDD (Hard Disk Drive), or the like.
 The learning unit 44 performs machine learning using the teacher data 46. Through machine learning, the learning unit 44 generates a machine learning model (a so-called learned model) that receives a thermal image showing the temperature distribution of the air-conditioned space and outputs the contour of a person present in that air-conditioned space. As described above, the machine learning model is, for example, a CNN and may be composed of a first machine learning model and a second machine learning model. In this case, the learning unit 44 may generate, through machine learning, a first machine learning model (for example, the learned model 45a) that receives a thermal image showing the temperature distribution of the air-conditioned space and outputs a super-resolution thermal image, and a second machine learning model (for example, the learned model 45b) that receives the super-resolution thermal image and outputs the contour of a person present in the air-conditioned space. The learned models 45a and 45b include learned parameters adjusted by machine learning. The generated learned models 45a and 45b are, for example, stored in the storage unit 43 and transmitted to the air conditioner 30 via the communication unit 41. The air conditioner 30 then updates the learned model 34a in the storage unit 33 with the acquired learned model 45a and updates the learned model 34b with the learned model 45b. The learning unit 44 is realized by, for example, a processor executing a program stored in the storage unit 43.
 The teacher data 46 has, as input data, thermal images showing the temperature distribution of the air-conditioned space and, as output data, contours of persons present in that air-conditioned space. Specifically, the teacher data 46 may be a data set containing pairs of a thermal image as input data and a person contour as output data, or may include a plurality of data sets (for example, first teacher data and second teacher data). The first teacher data is, for example, teacher data used for training the first machine learning model (for example, the learned model 45a) and is a data set containing pairs of the input data of the teacher data 46 (that is, a thermal image) as input data and a super-resolution thermal image obtained by increasing the resolution of that thermal image as output data. The second teacher data is, for example, teacher data used for training the second machine learning model (for example, the learned model 45b) and is a data set containing pairs of the super-resolution thermal image as input data and the output data of the teacher data 46 (that is, the person contour) as output data.
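 A minimal sketch of how the two data sets could be assembled from the base teacher data 46 is given below. The container layout (lists of input/output pairs) and the trivial nearest-neighbour upscaler used in the example are assumptions, not details taken from the description.

    import numpy as np

    def make_teacher_data(thermal_images, contour_masks, upscale):
        # thermal_images: input data of teacher data 46 (low-resolution thermal images)
        # contour_masks:  output data of teacher data 46 (person-contour labels)
        # upscale:        function producing a super-resolution thermal image
        first_teacher_data = []    # (thermal image, super-resolution thermal image)
        second_teacher_data = []   # (super-resolution thermal image, person contour)
        for img, contour in zip(thermal_images, contour_masks):
            sr_img = upscale(img)
            first_teacher_data.append((img, sr_img))
            second_teacher_data.append((sr_img, contour))
        return first_teacher_data, second_teacher_data

    # Example with dummy data and a nearest-neighbour upscaler.
    images = [np.random.rand(8, 8) * 10 + 24 for _ in range(4)]
    contours = [np.zeros((32, 32), dtype=bool) for _ in range(4)]
    first, second = make_teacher_data(images, contours,
                                      upscale=lambda im: np.kron(im, np.ones((4, 4))))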
 Although FIG. 1 illustrates an example in which a trained machine learning model is generated by the server device 40 and the generated trained machine learning model is transmitted to the air conditioner 30 to update its machine learning model, the configuration is not limited to this example. For example, the air conditioner 30 may include the configuration of the server device 40. Alternatively, for example, the server device 40 may acquire the thermal image detected by the sensor 20, estimate the thermal sensation of the person 1, and output the estimation result of the thermal sensation of the person 1 to the control of the air conditioner 30.
 [Example of operation]
 Next, an operation example of the control system 10 is described. FIG. 4 is a flowchart of an operation example of the control system 10.
 The control unit 32 of the air conditioner 30 acquires the thermal image 100 (for example, FIG. 3(b)) showing the temperature distribution of the air-conditioned space detected by the sensor 20 (more specifically, of the space within the sensing range of the sensor 20 in the air-conditioned space) (S11). More specifically, the acquisition unit 32a of the air conditioner 30 acquires the thermal image 100 from the sensor 20 via the communication unit 31. At this time, the acquisition unit 32a may store the acquired thermal image 100 in the storage unit 33.
 Next, the detection unit 32b detects the contour of the person 1 present in the air-conditioned space based on the thermal image 100 acquired by the acquisition unit 32a in step S11 (S12). For example, the detection unit 32b divides at least a part of the thermal image 100 into two or more temperature distribution regions, treating adjacent pixels that share the same feature as one temperature distribution region, and detects the contour of the person 1, including the clothing 1b of the person 1 (see FIG. 2), based on the two or more divided temperature distribution regions. For example, the detection unit 32b may perform segmentation on the thermal image 100 using a machine learning model, thereby dividing the thermal image 100 into a region in which the person 1 appears and a region in which the person 1 does not appear. As described above, the machine learning model only needs to include a convolutional layer and may be, for example, a convolutional neural network (CNN), but is not limited to this. Note that the machine learning model (for example, the learned model 34b in FIG. 1) may be applied to a super-resolution thermal image obtained by increasing the resolution of the thermal image 100.
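 The following rule-based sketch is one possible stand-in for this region splitting (it is not the trained model itself): adjacent pixels that share the "warmer than the background" feature are grouped into one temperature distribution region with connected-component labelling. The 1.5 °C margin is an assumed value.

    import numpy as np
    from scipy import ndimage

    def split_into_regions(thermal_image, margin=1.5):
        ambient_guess = float(np.median(thermal_image))   # rough background level
        warm = thermal_image > ambient_guess + margin     # pixels sharing the same feature
        labels, n_regions = ndimage.label(warm)           # adjacent warm pixels -> one region
        return labels, n_regions                          # 0 = background, 1..n = regions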
 The detection unit 32b may generate a super-resolution thermal image having a higher resolution than the thermal image 100 by interpolating at least one pixel between adjacent pixels of the thermal image 100, and may detect the contour of the person 1 based on the generated super-resolution thermal image. For example, when generating the super-resolution thermal image, the detection unit 32b may increase the resolution of the thermal image 100 by computing the average of the pixel values (that is, temperature values) of mutually adjacent pixels in the thermal image 100 and inserting, between those adjacent pixels, a new pixel having a pixel value equal to that average. Alternatively, for example, the detection unit 32b may increase the resolution of the thermal image 100 using a trained machine learning model (for example, the learned model 34a in FIG. 1). The learned model 34a may be, for example, an SRGAN (Generative Adversarial Network for Super-Resolution), but is not limited to this.
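 A sketch of the averaging-based interpolation described in this paragraph is shown below: between every pair of adjacent pixels a new pixel holding their mean is inserted, first along the rows and then along the columns, so an 8×8 thermal image becomes a 15×15 one. This is only an illustration of the stated idea, not code from the patent.

    import numpy as np

    def interpolate_thermal_image(img):
        rows = []
        for r in range(img.shape[0]):
            rows.append(img[r])
            if r + 1 < img.shape[0]:
                rows.append((img[r] + img[r + 1]) / 2.0)   # new row between neighbours
        tall = np.stack(rows)                              # shape (2H-1, W)
        cols = []
        for c in range(tall.shape[1]):
            cols.append(tall[:, c])
            if c + 1 < tall.shape[1]:
                cols.append((tall[:, c] + tall[:, c + 1]) / 2.0)
        return np.stack(cols, axis=1)                      # shape (2H-1, 2W-1)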
 In step S12, the detection unit 32b may, for example, detect the contour of the person 1 from the thermal image 100 using the trained machine learning models stored in the storage unit 33 (for example, the learned models 34a and 34b in FIG. 1).
 Next, the identification unit 32c identifies, within the thermal image 100, the human body region 100a (see FIG. 3(b)), which is the region surrounded by the contour of the person 1 detected by the detection unit 32b in step S12 (S13).
 Next, the calculation unit 32d calculates the human body temperature, which includes the temperature of the clothing 1b of the person 1, based on the temperature distribution within the human body region 100a identified by the identification unit 32c in step S13 (S14). Specifically, in step S14, the calculation unit 32d calculates the average of the surface temperature of the skin 1a of the person 1 (see FIG. 2) and the surface temperature of the clothing 1b of the person 1 as the human body temperature. For example, the calculation unit 32d may calculate the average of the pixel values (that is, temperature values) of the human body region 100a as the human body temperature.
 Next, the calculation unit 32d calculates the ambient temperature of the person 1 based on the temperature distribution within the region other than the human body region 100a (the so-called surrounding region 100b) (S15). For example, in step S15, the calculation unit 32d may calculate the average of the pixel values (that is, temperature values) of the surrounding region 100b as the ambient temperature. Note that steps S14 and S15 may be performed in parallel.
 Next, the calculation unit 32d calculates the heat release amount of the person 1 based on the difference value between the human body temperature and the ambient temperature (S16). Specifically, the calculation unit 32d calculates the difference value between the human body temperature calculated in step S14 and the ambient temperature calculated in step S15, and calculates the heat release amount of the person 1 based on the calculated difference value.
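 A compact sketch of steps S14 to S16 under the "pixel value = temperature" simplification is given below. The description states only that the heat release amount is computed from the difference between the human body temperature and the ambient temperature; the linear scaling by heat_transfer_coeff is an assumed placeholder, not a formula from the patent.

    import numpy as np

    def estimate_heat_release(thermal_image, body_mask, heat_transfer_coeff=1.0):
        body_temperature = thermal_image[body_mask].mean()      # S14: skin + clothing average
        ambient_temperature = thermal_image[~body_mask].mean()  # S15: surrounding region 100b
        difference = body_temperature - ambient_temperature     # S16: difference value
        return heat_transfer_coeff * difference                 # larger gap -> more heat released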
 Next, the estimation unit 32e estimates the thermal sensation of the person 1 based on the heat release amount of the person 1 calculated by the calculation unit 32d in step S16 (S17). For example, the estimation unit 32e estimates that the person 1 feels cold when the heat release amount is larger than a threshold, and estimates that the person 1 feels hot when the heat release amount is smaller than the threshold. The threshold may include a threshold for cooling operation and a threshold for heating operation. The thermal sensation of the person 1 corresponds to the heat release amount of the person 1, but even for the same heat release amount, the thermal sensation of the person 1 may differ depending on the season. Therefore, by estimating the thermal sensation of the person 1 using the threshold for cooling operation and the threshold for heating operation, the thermal sensation of the person 1 can be estimated more accurately.
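 The sketch below illustrates step S17 with separate thresholds for cooling and heating operation, as suggested above. The numeric thresholds and the "neutral" fallback for the exactly-equal case are assumptions; the description defines only the larger-than and smaller-than cases.

    def estimate_thermal_sensation(heat_release, mode):
        threshold = {"cooling": 4.0, "heating": 6.0}[mode]   # assumed per-mode thresholds
        if heat_release > threshold:
            return "cold"      # much heat released -> person 1 feels cold
        if heat_release < threshold:
            return "hot"       # little heat released -> person 1 feels hot
        return "neutral"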
 Next, the control unit 32 controls the air conditioner 30 based on the thermal sensation of the person 1 estimated by the estimation unit 32e in step S17 (S18). For example, when the estimation unit 32e estimates in step S17 that the person 1 feels cold, the control unit 32 controls the air conditioner 30 to raise the ambient temperature, and when the estimation unit 32e estimates that the person 1 feels hot, the control unit 32 controls the air conditioner 30 to lower the ambient temperature. The control unit 32 may determine, based on the database 35, the control condition for cooling or heating operation corresponding to the thermal sensation of the person 1, and may control the operation of the louver 36, the compressor 37, and the fan 38 based on the determined control condition. Since the database 35 has already been described, its description is omitted here.
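 As a hedged illustration of step S18, the sketch below looks up the estimated thermal sensation in a small stand-in for the database 35 and returns a control condition. All of the condition values are assumptions for illustration; the actual control conditions are whatever the database 35 holds.

    CONTROL_CONDITIONS = {     # stand-in for database 35
        "hot":     {"setpoint_delta": -2.0, "fan": "high", "louver": "toward_person"},
        "cold":    {"setpoint_delta": +2.0, "fan": "low",  "louver": "away_from_person"},
        "neutral": {"setpoint_delta":  0.0, "fan": "auto", "louver": "auto"},
    }

    def control_air_conditioner(sensation):
        condition = CONTROL_CONDITIONS[sensation]
        # In the real system the control unit 32 would drive the louver 36,
        # compressor 37 and fan 38 according to this condition; the sketch
        # simply returns it.
        return condition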
 At this time, the control unit 32 may control the air conditioner 30 based on the position of the human body region 100a and the thermal sensation of the person 1. Specifically, the calculation unit 32d calculates the position of the person 1 in the air-conditioned space based on the position of the human body region in the thermal image. For example, the storage unit 33 stores table information indicating the correspondence between pixel positions in the thermal image and coordinates in the air-conditioned space, and the calculation unit 32d may use this table information to calculate the position (specifically, the coordinates) of the person 1 in the air-conditioned space from the position (pixel position) at which the person 1 is detected in the thermal image 100.
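 One possible form of the table lookup described above is sketched below: each pixel of the 8×8 thermal image is mapped to two-dimensional room coordinates on a regular grid. The 4 m × 4 m sensing range is an assumed value; an actual table would be calibrated for the particular installation.

    SENSING_RANGE_M = 4.0   # assumed side length of the square sensing range
    GRID = 8                # 8x8 thermal image

    def pixel_to_room_coordinates(row, col):
        cell = SENSING_RANGE_M / GRID
        x = (col + 0.5) * cell    # metres from one edge of the sensing range
        y = (row + 0.5) * cell
        return x, y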
 Note that when the thermal image shows the temperature distribution of the air-conditioned space as viewed from above (that is, from the ceiling side), as in the present embodiment, the coordinates of the person 1 calculated by the calculation unit 32d are two-dimensional coordinates of the air-conditioned space as viewed from above.
 As described above, the control system 10 can estimate the thermal sensation of the person 1 present in the air-conditioned space and control the air conditioner 30 based on the estimated thermal sensation. Therefore, the control system 10 can estimate the thermal sensation of the person 1 and achieve a comfortable environmental temperature.
 [Modification]
 The control unit 32 controls the air conditioner 30 based on the estimation result (the thermal sensation of the person 1) produced by the estimation unit 32e, but it may also output the estimation result. The output unit 32f may output the result estimated by the estimation unit 32e to an information terminal (not shown) carried by the person 1. This allows the person 1 to check the estimation result displayed on the display unit of the information terminal, so that when the thermal sensation the person 1 actually feels (hereinafter referred to as the actual thermal sensation) differs from the estimation result, the person 1 can correct the estimation result by entering the actual thermal sensation into the information terminal. Both the estimation result and the corrected estimation result are stored in the storage unit 33.
 Note that the detection unit 32b may apply rule-based information processing to the thermal image, without using a machine learning model. For example, the detection unit 32b may perform processing for detecting, among the plurality of pixels included in the thermal image, a pixel whose pixel value is a local maximum. Here, a pixel whose pixel value is a local maximum means a pixel whose pixel value is a local maximum in the two-dimensional arrangement of pixels; in other words, a pixel whose pixel value is higher than those of the surrounding pixels when pixel values at the same time are compared in the two-dimensional arrangement of pixels. A single thermal image may contain a plurality of pixels whose pixel values are local maxima.
 Here, assuming that no heat source hotter than a person exists in the air-conditioned space, if a pixel whose pixel value is a local maximum nevertheless has a pixel value smaller than a predetermined value (that is, a low temperature), it can be considered that no person is present at the position corresponding to that pixel. Accordingly, when the detection unit 32b detects a pixel whose pixel value is a local maximum and is equal to or greater than the predetermined value (for example, 30°C or higher), it may estimate that the person 1 is present in the air-conditioned space and detect the contour of the person 1 based on the detected pixel.
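 A rule-based sketch of this local-maximum check is shown below: a pixel is treated as a person candidate when it is at least as warm as all of its neighbours and at or above the 30 °C value mentioned above. The 3×3 neighbourhood is an assumed choice.

    import numpy as np
    from scipy import ndimage

    def find_person_candidates(thermal_image, min_temp=30.0):
        local_max = thermal_image == ndimage.maximum_filter(thermal_image, size=3)
        candidates = np.argwhere(local_max & (thermal_image >= min_temp))
        return candidates    # (row, col) positions of pixels likely to be a person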
 Moreover, the calculation unit 32d can calculate the position (that is, the coordinates) of the person 1 in the air-conditioned space by applying the above-described table information to the position of a pixel whose pixel value is a local maximum and is equal to or greater than the predetermined value (for example, 30°C or higher).
 Another example of rule-based information processing is processing that detects the temporal change in the pixel value (temperature) of each of the plurality of pixels included in the thermal image. Assuming that there is no heat source other than a person in the air-conditioned space, while no person is present in the air-conditioned space, the pixel value (temperature) of each pixel in the thermal image changes only gradually over time. If the person 1 then enters the air-conditioned space, the pixel values of the pixels in the portion of the thermal image in which the person 1 appears change (rise) sharply. By detecting the temporal change in the pixel value of each pixel, the detection unit 32b may, for example, estimate that the person 1 is present in the air-conditioned space when pixel values rise sharply, and then perform the process of detecting the contour of the person 1.
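 The sketch below illustrates this temporal-change rule: a sudden per-pixel rise between two consecutive thermal images is taken as a sign that a person has entered the sensing range. The 2.0 °C jump threshold is an assumed value, not one given in the description.

    import numpy as np

    def person_entered(previous, current, jump_threshold=2.0):
        rise = current - previous                   # per-pixel change since the last frame
        return bool((rise > jump_threshold).any())  # True if any pixel rose sharply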
 [Effects, etc.]
 As described above, the control method is a control method for the air conditioner 30 executed by a computer such as the control system 10. The computer acquires the thermal image 100 showing the temperature distribution of the air-conditioned space detected by the sensor 20 installed on the ceiling of the air-conditioned space (step S11 in FIG. 4), detects the contour of the person 1 present in the air-conditioned space based on the acquired thermal image 100 (step S12), identifies, within the thermal image 100, the human body region 100a, which is the region surrounded by the detected contour of the person 1 (step S13), calculates the human body temperature including the temperature of the clothing 1b of the person 1 based on the temperature distribution within the identified human body region 100a (step S14), calculates the ambient temperature of the region other than the human body region 100a in the thermal image 100 (the surrounding region 100b) (step S15), calculates the heat release amount of the person 1 based on the difference value between the human body temperature and the ambient temperature (step S16), estimates the thermal sensation of the person 1 based on the calculated heat release amount (step S17), and controls the air conditioner 30 based on the estimated thermal sensation (step S18).
 Such a control method can achieve an environmental temperature at which the person 1 feels comfortable by estimating the thermal sensation of the person 1 present in the air-conditioned space.
 Moreover, for example, the computer estimates that the person 1 feels cold when the heat release amount is larger than a threshold and controls the air conditioner 30 to raise the ambient temperature based on the estimated thermal sensation, and estimates that the person 1 feels hot when the heat release amount is smaller than the threshold and controls the air conditioner 30 to lower the ambient temperature based on the estimated thermal sensation.
 Such a control method can estimate the thermal sensation of the person 1 from the heat release amount of the person 1 based on the threshold.
 Moreover, for example, in detecting the contour of the person 1, the computer divides at least a part of the thermal image 100 into two or more temperature distribution regions (for example, the human body region 100a and the surrounding region 100b), treating adjacent pixels that share the same feature as one temperature distribution region, and detects the contour of the person 1, including the clothing 1b of the person 1, based on the two or more divided temperature distribution regions.
 Such a control method can partition the thermal image into a region corresponding to the person 1 (the human body region 100a) and a region corresponding to everything other than the person 1 (the surrounding region 100b) by treating adjacent pixels that share the same feature in at least a part of the thermal image as one temperature distribution region. The control method can thereby detect the contour of the person 1 based on the pixel values of the pixels included in the region corresponding to the person 1.
 Moreover, for example, the computer generates a super-resolution thermal image having a higher resolution than the thermal image 100 by interpolating at least one pixel between adjacent pixels of the thermal image 100, and detects the contour of the person 1 based on the super-resolution thermal image.
 Since such a control method detects the contour of the person 1 with a finer resolution than that of the thermal image 100, the contour of the person 1 can be detected more accurately.
 Moreover, for example, the computer detects the contour of the person 1 from the thermal image 100 using trained machine learning models (for example, the learned models 34a and 34b).
 By using a machine learning model, such a control method can detect the contour of the person 1 from the thermal image 100 quickly and accurately.
 Moreover, for example, the machine learning model is trained using the teacher data 46, and the teacher data 46 has a thermal image as input data and a person contour as output data.
 Such a control method can detect the contour of the person 1 by inputting the thermal image 100 into the trained machine learning model.
 Moreover, for example, the machine learning model is composed of a first machine learning model (for example, the learned model 34a) and a second machine learning model (for example, the learned model 34b). The first machine learning model is trained using first teacher data containing pairs of the input data of the teacher data 46 as input data and a super-resolution thermal image of that input data as output data, and the second machine learning model is trained using second teacher data containing pairs of the super-resolution thermal image as input data and the output data of the teacher data 46 as output data.
 Such a control method can detect the contour of the person 1 from the thermal image 100 using the first machine learning model (for example, the learned model 34a), which receives the thermal image 100 and outputs a super-resolution thermal image of the thermal image 100, and the second machine learning model (for example, the learned model 34b), which receives the super-resolution thermal image and outputs the person contour. Since the control method can detect the contour of the person 1 from a super-resolution thermal image having a finer resolution than the thermal image 100, it can detect the contour of the person 1 more accurately.
 Moreover, the program is a program for causing the computer to execute any one of the control methods described above.
 Such a program can achieve an environmental temperature at which the person 1 feels comfortable by estimating the thermal sensation of the person 1 present in the air-conditioned space.
 Moreover, the control system 10 includes: an acquisition unit 32a that acquires the thermal image 100 showing the temperature distribution of the air-conditioned space detected by the sensor 20 installed on the ceiling of the air-conditioned space; a detection unit 32b that detects the contour of the person 1 present in the air-conditioned space based on the acquired thermal image 100; an identification unit 32c that identifies, within the thermal image 100, the human body region 100a, which is the region surrounded by the detected contour of the person 1; a calculation unit 32d that calculates the human body temperature including the temperature of the clothing 1b of the person 1 based on the temperature distribution within the identified human body region 100a, calculates the ambient temperature of the region other than the human body region 100a in the thermal image 100 (the so-called surrounding region 100b), and calculates the heat release amount of the person 1 based on the difference value between the human body temperature and the ambient temperature; an estimation unit 32e that estimates the thermal sensation of the person 1 based on the calculated heat release amount; and a control unit 32 that controls the air conditioner 30 installed in the air-conditioned space based on the estimated thermal sensation.
 Such a control system 10 can achieve an environmental temperature at which the person 1 feels comfortable by estimating the thermal sensation of the person 1 present in the air-conditioned space.
 (Other embodiments)
 The control system and the control method according to the embodiment have been described above, but the present invention is not limited to the above embodiment.
 In the above embodiment, the control system is realized by a plurality of devices, but it may instead be realized as a single device. For example, the control system may be realized as a single device corresponding to the server device. When the control system is realized by a plurality of devices, the components of the control system may be distributed among the plurality of devices in any manner.
 In the above embodiment, processing executed by a particular processing unit may instead be executed by another processing unit. The order of a plurality of processes may be changed, and a plurality of processes may be executed in parallel.
 In the above embodiment, each component may be realized by executing a software program suitable for that component. Each component may be realized by a program execution unit such as a CPU or processor reading and executing a software program recorded on a recording medium such as a hard disk or a semiconductor memory.
 Each component may also be realized by hardware. For example, each component may be a circuit (or an integrated circuit). These circuits may constitute a single circuit as a whole or may be separate circuits, and each of them may be a general-purpose circuit or a dedicated circuit.
 General or specific aspects of the present invention may be realized as a system, a device, a method, an integrated circuit, a computer program, or a recording medium such as a computer-readable CD-ROM, or as any combination of systems, devices, methods, integrated circuits, computer programs, and recording media. For example, the present invention may be realized as a control method executed by a computer such as the control system. The present invention may also be realized as a program for causing a computer to execute the control method, or as a computer-readable non-transitory recording medium on which such a program is stored.
 In addition, forms obtained by applying various modifications conceivable by a person skilled in the art to each embodiment, and forms realized by arbitrarily combining the components and functions of each embodiment without departing from the spirit of the present invention, are also included in the present invention.
 (Appendix)
 The inventions obtained from the disclosure of this specification are exemplified below, and the effects obtained from these inventions are described.
 [Invention 1]
 A control method for an air conditioner executed by a computer, wherein the computer:
 acquires a thermal image showing the temperature distribution of an air-conditioned space detected by a sensor installed on the ceiling of the air-conditioned space;
 detects the contour of a person present in the air-conditioned space based on the acquired thermal image;
 identifies, within the thermal image, a human body region, which is the region surrounded by the detected contour of the person;
 calculates a human body temperature including the temperature of the person's clothing based on the temperature distribution within the identified human body region;
 calculates the ambient temperature of the region other than the human body region in the thermal image;
 calculates the heat release amount of the person based on the difference value between the human body temperature and the ambient temperature;
 estimates the thermal sensation of the person based on the calculated heat release amount; and
 controls the air conditioner based on the estimated thermal sensation.
 [Effect of Invention 1]
 Such a control method can achieve an environmental temperature at which a person feels comfortable by estimating the thermal sensation of the person present in the air-conditioned space.
 [Invention 2]
 The control method according to Invention 1, wherein the computer:
 estimates that the person feels cold when the heat release amount is larger than a threshold, and controls the air conditioner to raise the ambient temperature based on the estimated thermal sensation; and
 estimates that the person feels hot when the heat release amount is smaller than the threshold, and controls the air conditioner to lower the ambient temperature based on the estimated thermal sensation.
 [Effect of Invention 2]
 Such a control method can estimate a person's thermal sensation from the person's heat release amount based on the threshold.
 [Invention 3]
 The control method according to Invention 1 or 2, wherein, in detecting the contour of the person, the computer:
 divides at least a part of the thermal image into two or more temperature distribution regions, treating adjacent pixels that share the same feature as one temperature distribution region; and
 detects the contour of the person, including the person's clothing, based on the two or more divided temperature distribution regions.
 [Effect of Invention 3]
 Such a control method can partition the thermal image into a region corresponding to the person (the so-called human body region) and a region corresponding to everything other than the person (the so-called surrounding region) by treating adjacent pixels that share the same feature in at least a part of the thermal image as one temperature distribution region. The control method can thereby detect the contour of the person based on the pixel values of the pixels included in the region corresponding to the person.
 [Invention 4]
 The control method according to any one of Inventions 1 to 3, wherein the computer:
 generates a super-resolution thermal image having a higher resolution than the thermal image by interpolating at least one pixel between adjacent pixels of the thermal image; and
 detects the contour of the person based on the super-resolution thermal image.
 [Effect of Invention 4]
 Since such a control method detects the contour of the person with a finer resolution than that of the thermal image, the contour of the person can be detected more accurately.
 [Invention 5]
 The control method according to any one of Inventions 1 to 4, wherein the computer detects the contour of the person from the thermal image using a trained machine learning model.
 [Effect of Invention 5]
 By using a machine learning model, such a control method can detect the contour of a person from a thermal image quickly and accurately.
 [Invention 6]
 The control method according to Invention 5, wherein the machine learning model is trained using teacher data, and the teacher data has the thermal image as input data and the contour of the person as output data.
 [Effect of Invention 6]
 Such a control method can detect the contour of a person by inputting a thermal image into the trained machine learning model.
 [Invention 7]
 The control method according to Invention 6, wherein the machine learning model is composed of a first machine learning model and a second machine learning model, the first machine learning model is trained using first teacher data containing pairs of the input data of the teacher data and a super-resolution thermal image of the input data, and the second machine learning model is trained using second teacher data containing pairs of the super-resolution thermal image and the output data of the teacher data.
 [Effect of Invention 7]
 Such a control method can detect the contour of a person from a thermal image using the first machine learning model, which receives the thermal image and outputs a super-resolution thermal image of the thermal image, and the second machine learning model, which receives the super-resolution thermal image and outputs the contour of the person. Since the control method can detect the contour of a person from a super-resolution thermal image having a finer resolution than the thermal image, it can detect the contour of the person more accurately.
 [Invention 8]
 A program for causing a computer to execute the control method according to any one of Inventions 1 to 7.
 [Effect of Invention 8]
 Such a program can achieve an environmental temperature at which a person feels comfortable by estimating the thermal sensation of the person present in the air-conditioned space.
 [Invention 9]
 A control system comprising:
 an acquisition unit that acquires a thermal image showing the temperature distribution of an air-conditioned space detected by a sensor installed on the ceiling of the air-conditioned space;
 a detection unit that detects the contour of a person present in the air-conditioned space based on the acquired thermal image;
 an identification unit that identifies, within the thermal image, a human body region, which is the region surrounded by the detected contour of the person;
 a calculation unit that calculates a human body temperature including the temperature of the person's clothing based on the temperature distribution within the identified human body region, calculates the ambient temperature of the region other than the human body region in the thermal image, and calculates the heat release amount of the person based on the difference value between the human body temperature and the ambient temperature;
 an estimation unit that estimates the thermal sensation of the person based on the calculated heat release amount; and
 a control unit that controls an air conditioner installed in the air-conditioned space based on the estimated thermal sensation.
 [Effect of Invention 9]
 Such a control system can achieve an environmental temperature at which a person feels comfortable by estimating the thermal sensation of the person present in the air-conditioned space.
 1 person
 1b clothing
 10 control system
 20 sensor
 30 air conditioner
 32 control unit
 32a acquisition unit
 32b detection unit
 32c identification unit
 32d calculation unit
 32e estimation unit
 34a learned model
 34b learned model
 45a learned model
 45b learned model
 46 teacher data
 100 thermal image
 100a human body region
 100b surrounding region

Claims (9)

  1.  A control method for an air conditioner executed by a computer, wherein the computer:
      acquires a thermal image showing the temperature distribution of an air-conditioned space detected by a sensor installed on the ceiling of the air-conditioned space;
      detects the contour of a person present in the air-conditioned space based on the acquired thermal image;
      identifies, within the thermal image, a human body region, which is the region surrounded by the detected contour of the person;
      calculates a human body temperature including the temperature of the person's clothing based on the temperature distribution within the identified human body region;
      calculates the ambient temperature of the region other than the human body region in the thermal image;
      calculates the heat release amount of the person based on the difference value between the human body temperature and the ambient temperature;
      estimates the thermal sensation of the person based on the calculated heat release amount; and
      controls the air conditioner based on the estimated thermal sensation.
  2.  The control method according to claim 1, wherein the computer:
      estimates that the person feels cold when the heat release amount is larger than a threshold, and controls the air conditioner to raise the ambient temperature based on the estimated thermal sensation; and
      estimates that the person feels hot when the heat release amount is smaller than the threshold, and controls the air conditioner to lower the ambient temperature based on the estimated thermal sensation.
  3.  The control method according to claim 1 or 2, wherein, in detecting the contour of the person, the computer:
      divides at least a part of the thermal image into two or more temperature distribution regions, treating adjacent pixels that share the same feature as one temperature distribution region; and
      detects the contour of the person, including the person's clothing, based on the two or more divided temperature distribution regions.
  4.  The control method according to claim 1 or 2, wherein the computer:
      generates a super-resolution thermal image having a higher resolution than the thermal image by interpolating at least one pixel between adjacent pixels of the thermal image; and
      detects the contour of the person based on the super-resolution thermal image.
  5.  The control method according to claim 1 or 2, wherein the computer detects the contour of the person from the thermal image using a trained machine learning model.
  6.  The control method according to claim 5, wherein the machine learning model is trained using teacher data, and
     the teacher data includes the thermal image as input data and the contour of the person as output data.
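Claims 5 and 6 specify only the supervised setup: the teacher data pairs a thermal image (input) with the person's contour (output), and a model trained on such pairs is then used for detection. As a deliberately simple stand-in for that model, the sketch below trains a pixel-wise logistic regression on 3x3 neighbourhoods; the feature choice and the scikit-learn classifier are assumptions made for the example, not the claimed model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression


def pixel_features(frame: np.ndarray) -> np.ndarray:
    """3x3 neighbourhood of every pixel (edge-padded) as its feature vector."""
    padded = np.pad(frame, 1, mode="edge")
    feats = [padded[y:y + frame.shape[0], x:x + frame.shape[1]].ravel()
             for y in range(3) for x in range(3)]
    return np.stack(feats, axis=1)              # shape (H*W, 9)


def train_contour_model(frames: list, masks: list) -> LogisticRegression:
    """Teacher data: thermal images as input, 0/1 contour masks as output (claim 6)."""
    X = np.vstack([pixel_features(f) for f in frames])
    y = np.concatenate([m.ravel() for m in masks])   # 1 = inside the person contour
    return LogisticRegression(max_iter=1000).fit(X, y)


def predict_mask(model: LogisticRegression, frame: np.ndarray) -> np.ndarray:
    """Detect the person region in a new thermal image (claim 5)."""
    return model.predict(pixel_features(frame)).reshape(frame.shape)
```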
  7.  The control method according to claim 6, wherein the machine learning model includes a first machine learning model and a second machine learning model,
     the first machine learning model is trained using first teacher data including pairs of the input data of the teacher data and a super-resolution thermal image of the input data, and
     the second machine learning model is trained using second teacher data including pairs of the super-resolution thermal image and the output data of the teacher data.
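Claim 7 splits the model into two stages trained separately: the first maps a low-resolution thermal image to its super-resolution counterpart, the second maps the super-resolution image to the person's contour. At inference time the two are simply chained; the callable signatures below are assumptions for the example.

```python
from typing import Callable

import numpy as np


def detect_contour_two_stage(thermal_image: np.ndarray,
                             first_model: Callable[[np.ndarray], np.ndarray],
                             second_model: Callable[[np.ndarray], np.ndarray]) -> np.ndarray:
    """Chain the two models of claim 7 (sketch)."""
    # Stage 1: trained on (low-resolution frame, super-resolution frame) pairs.
    super_resolved = first_model(thermal_image)
    # Stage 2: trained on (super-resolution frame, labeled contour) pairs.
    return second_model(super_resolved)
```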
  8.  A program for causing a computer to execute the control method according to claim 1 or 2.
  9.  A control system comprising:
     an acquisition unit that acquires a thermal image showing a temperature distribution of an air-conditioned space detected by a sensor installed on a ceiling of the air-conditioned space;
     a detection unit that detects a contour of a person present in the air-conditioned space based on the acquired thermal image;
     an identification unit that identifies, in the thermal image, a human body region that is a region enclosed by the detected contour of the person;
     a calculation unit that calculates a human body temperature, including a temperature of clothing of the person, based on the temperature distribution in the identified human body region, calculates an ambient temperature of a region other than the human body region in the thermal image, and calculates a heat dissipation amount of the person based on a difference value between the human body temperature and the ambient temperature;
     an estimation unit that estimates a thermal sensation of the person based on the calculated heat dissipation amount; and
     a control unit that controls an air conditioner installed in the air-conditioned space based on the estimated thermal sensation.
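The control system of claim 9 is defined by its functional units rather than any particular implementation. One possible way to arrange those units in software is sketched below; the class layout, the AirConditioner interface, and the thresholds are assumptions made for the example, and the detection/identification step is again reduced to a temperature threshold.

```python
from dataclasses import dataclass

import numpy as np


class AirConditioner:
    """Hypothetical device interface assumed for the example."""

    def set_target_temperature(self, celsius: float) -> None:
        print(f"target temperature -> {celsius:.1f} C")


@dataclass
class ControlSystem:
    air_conditioner: AirConditioner
    target_c: float = 25.0
    body_threshold_c: float = 28.0          # assumed body/ambient split
    dissipation_threshold: float = 6.0      # assumed comfort threshold

    def step(self, sensor_frame: np.ndarray) -> None:
        # Acquisition unit: take one ceiling-sensor frame (2D array, deg C).
        frame = np.asarray(sensor_frame, dtype=float)
        # Detection + identification units, reduced to a threshold for the sketch.
        body_mask = frame > self.body_threshold_c
        if not body_mask.any() or body_mask.all():
            return
        # Calculation unit: body temperature, ambient temperature, dissipation.
        dissipation = float(frame[body_mask].mean() - frame[~body_mask].mean())
        # Estimation + control units: nudge the setpoint toward comfort.
        self.target_c += 1.0 if dissipation > self.dissipation_threshold else -1.0
        self.air_conditioner.set_target_temperature(self.target_c)


system = ControlSystem(AirConditioner())
demo = np.full((8, 8), 24.0)
demo[3:5, 3:5] = 31.0
system.step(demo)   # prints an adjusted target temperature
```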
PCT/JP2022/031812 2021-08-31 2022-08-24 Control method, program, and control system WO2023032771A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2023545485A JPWO2023032771A1 (en) 2021-08-31 2022-08-24

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021141878 2021-08-31
JP2021-141878 2021-08-31

Publications (1)

Publication Number Publication Date
WO2023032771A1 (en) 2023-03-09

Family

ID=85412533

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/031812 WO2023032771A1 (en) 2021-08-31 2022-08-24 Control method, program, and control system

Country Status (2)

Country Link
JP (1) JPWO2023032771A1 (en)
WO (1) WO2023032771A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013124833A (en) * 2011-12-15 2013-06-24 Samsung Yokohama Research Institute Co Ltd Air conditioning device
JP2015052435A (en) * 2013-09-09 2015-03-19 日立アプライアンス株式会社 Air conditioner
JP2016208408A (en) * 2015-04-27 2016-12-08 パナソニックIpマネジメント株式会社 Detection method, detection device and control method
JP2017203576A (en) * 2016-05-10 2017-11-16 株式会社ノーリツ Air conditioner
JP2018185108A (en) * 2017-04-26 2018-11-22 パナソニックIpマネジメント株式会社 Warm/cold sensation estimation device

Also Published As

Publication number Publication date
JPWO2023032771A1 (en) 2023-03-09

Similar Documents

Publication Publication Date Title
US11536480B2 (en) Air conditioner, sensor system, and thermal sensation estimation method
CN111247375B (en) Air conditioner control device
CN213501729U (en) Thermal management system for a motor vehicle passenger compartment
US10372990B2 (en) System and method for identification of personal thermal comfort
US20140148706A1 (en) Method and device for detecting thermal comfort
JP5238679B2 (en) Air conditioning control device, air conditioning control method, and radiation temperature measuring device
JP7217058B2 (en) Detecting the presence of one or more human occupants in the built space in real time using one or more thermographic cameras and one or more RGB-D sensors to estimate thermal comfort
JP6340344B2 (en) Air conditioner and control method of air conditioner
JP2011174665A (en) System and method for air conditioning control
JP2017015384A (en) Air conditioning control device
US20200256581A1 (en) Air conditioning control device
JP6668010B2 (en) Air conditioning control device, air conditioning control method, and air conditioning control program
US11015832B2 (en) Thermographic sensing of human thermal conditions to improve thermal comfort
JP6386950B2 (en) Air conditioner and control method of air conditioner
JP2015111019A (en) Air-conditioning system, air-conditioner control device, control method, and program
KR20180025407A (en) Method, apparatus and computer program for controlling heating and cooling using by predicted mean vote
US20210260958A1 (en) System and method for estimating climate needs
WO2023032771A1 (en) Control method, program, and control system
JPWO2021130960A5 (en)
WO2017159632A1 (en) Air conditioner
JP2017219247A (en) Air conditioner control device
JP2020196407A (en) Control device, information recording device, control method and information recording method
WO2023188798A1 (en) Environment control system, environment control method, and program
JP2021124246A (en) Air conditioning system and control method for air conditioner
JP7080294B2 (en) Information processing equipment, information processing programs and information processing systems

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22864355

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023545485

Country of ref document: JP