WO2024019220A1 - Object behavior detection system using thermal imaging - Google Patents

Object behavior detection system using thermal imaging

Info

Publication number
WO2024019220A1
Authority
WO
WIPO (PCT)
Prior art keywords
interest
brightness value
thermal imaging
region
detection system
Application number
PCT/KR2022/013087
Other languages
French (fr)
Korean (ko)
Inventor
이용권
강연준
Original Assignee
주식회사 럭스로보
Application filed by 주식회사 럭스로보
Publication of WO2024019220A1

Classifications

    • G PHYSICS
      • G06 COMPUTING; CALCULATING OR COUNTING
        • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
          • G06N3/00 Computing arrangements based on biological models
            • G06N3/02 Neural networks
              • G06N3/04 Architecture, e.g. interconnection topology
                • G06N3/044 Recurrent networks, e.g. Hopfield networks
                • G06N3/0464 Convolutional networks [CNN, ConvNet]
              • G06N3/08 Learning methods
        • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
          • G06T7/00 Image analysis
            • G06T7/90 Determination of colour characteristics
      • G08 SIGNALLING
        • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
          • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
            • G08B21/02 Alarms for ensuring the safety of persons
              • G08B21/04 Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
                • G08B21/0407 Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons, based on behaviour analysis
                  • G08B21/0423 Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons, based on behaviour analysis detecting deviation from an expected pattern of behaviour or schedule
                • G08B21/0438 Sensor means for detecting
                  • G08B21/0469 Presence detectors to detect unsafe condition, e.g. infrared sensor, microphone
            • G08B21/18 Status alarms
              • G08B21/182 Level alarms, e.g. alarms responsive to variables exceeding a threshold
    • H ELECTRICITY
      • H04 ELECTRIC COMMUNICATION TECHNIQUE
        • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
          • H04N5/00 Details of television systems
            • H04N5/30 Transforming light or analogous information into electric information
              • H04N5/33 Transforming infrared radiation
        • H04W WIRELESS COMMUNICATION NETWORKS
          • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
            • H04W4/80 Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication

Definitions

  • The present invention relates to a system for detecting the behavior of an object using thermal imaging, and more particularly to a system that analyzes a thermal image of an object to detect whether the object is in a dangerous situation.
  • Registered Patent Publication No. 10-2052883 (Prior Document 1) discloses a method of controlling a fall prediction system using a thermal imaging camera that determines the risk of a fall based on the x-y axis histogram data of a captured thermal image and generates a notification.
  • Registered Patent Publication No. 10-1916631 (Prior Document 2) discloses a fall detection device including a thermal imaging camera that determines whether a patient has fallen from imaging data captured by the thermal imaging camera.
  • Registered Patent Publication No. 10-1927220 (Prior Document 3) discloses a method and device that use a thermal image to sense the temperature of an object of interest located within a target area and determine the state and situation of the object of interest from the sensed temperature.
  • One purpose of the present invention is to provide an object behavior detection system using thermal images that can accurately detect object behavior from those images.
  • Another purpose of the present invention is to provide an object behavior detection system using thermal images in which a plurality of thermal imaging cameras distributed across several spaces can communicate with one another, so that detection information from the cameras can be collected rapidly through camera-to-camera communication.
  • A further purpose of the present invention is to provide an object behavior detection system using thermal images that can quickly and accurately detect a dangerous situation of an object, without infringing on individual privacy, by acquiring and analyzing a thermal image of the object through a thermal imaging camera.
  • An object behavior detection system using thermal imaging according to an embodiment of the present invention includes: a thermal imaging camera that includes a communication module, is installed in each of a plurality of spaces, and acquires a thermal image of an object in the space where it is installed; and a server that receives the thermal image from the thermal imaging camera and analyzes the received thermal image to determine whether the object is in a dangerous situation.
  • The server extracts a region of interest from the thermal image, sets a brightness value corresponding to the color according to the temperature of the object for each of the plurality of cells constituting the thermal image, calculates a representative brightness value for the region of interest, and can recognize whether the object is in danger using (1) the moving speed of the region of interest, (2) the moving distance of the region of interest, (3) the rate of change of the representative brightness value of the immediately preceding region of interest, and (4) the duration of the representative brightness value of the currently moved region of interest.
  • In this embodiment, the region of interest may include the facial area of the object.
  • When the region of interest corresponds to one cell, the brightness value of that cell can be used as the representative brightness value.
  • When the region of interest corresponds to two or more cells, the average of the brightness values of those cells can be used as the representative brightness value.
  • Alternatively, when the region of interest corresponds to two or more cells, the largest brightness value among those cells can be used as the representative brightness value.
  • The communication modules of the plurality of thermal imaging cameras can transmit and receive the thermal images to one another through mesh Wi-Fi communication.
  • At least one of the plurality of thermal imaging cameras is connected to a gateway and can transmit thermal images to the server through the gateway.
  • If the moving speed of the region of interest is greater than or equal to a preset first reference value (first condition), the server may recognize that a dangerous situation has occurred for the object.
  • If the moving distance between regions of interest is greater than or equal to a preset second reference value (second condition), the server may recognize that a dangerous situation has occurred for the object.
  • If the rate of change of the representative brightness value in the immediately preceding region of interest is greater than or equal to a preset third reference value (third condition), the server may recognize that a dangerous situation has occurred for the object.
  • If the duration of the representative brightness value in the moved region of interest is greater than or equal to a preset fourth reference value (fourth condition), the server may recognize that a dangerous situation has occurred for the object.
  • The server may also recognize that a dangerous situation has occurred for the object when at least one of the first condition (moving speed of the region of interest at or above the first reference value), the second condition (moving distance between regions of interest at or above the second reference value), and the third condition (rate of change of the representative brightness value in the preceding region of interest at or above the third reference value) is satisfied, and the duration of the representative brightness value in the moved region of interest is at or above the preset fourth reference value (fourth condition).
  • The object behavior detection system using thermal imaging according to the present invention has one or more of the following effects.
  • Because the temperature in a thermal image is subdivided into numerical values and the behavior of an object is detected from changes in those values, behavior can be detected accurately from the thermal image.
  • A plurality of thermal imaging cameras can be distributed across multiple spaces, making it possible to detect object behavior occurring in those spaces.
  • Even when the thermal imaging cameras are distributed across several spaces, the cameras can communicate with one another, so detection information from the plurality of cameras can be collected and processed quickly, enabling rapid detection.
  • By acquiring and analyzing a thermal image of an object through a thermal imaging camera, a dangerous situation of the object can be detected quickly and accurately without infringing on privacy.
  • Figure 1 is a configuration diagram of an object behavior detection system using thermal imaging images according to an embodiment of the present invention.
  • Figure 2 is a diagram for explaining the mesh Wi-Fi connection of a thermal imaging camera according to an embodiment of the present invention.
  • Figure 3 is a detailed configuration diagram of a thermal imaging camera constituting an object behavior detection system using thermal imaging images according to an embodiment of the present invention.
  • Figure 4 is a detailed configuration diagram of a server constituting an object behavior detection system using thermal images according to an embodiment of the present invention.
  • Figure 5 is an example diagram of a thermal image acquired by a thermal imaging camera according to an embodiment of the present invention.
  • Figures 6a and 6b are examples of converting the temperature of each part of an object into a color value in a thermal image according to an embodiment of the present invention.
  • Figure 7 is a flowchart showing the process of detecting the behavior of an object using a thermal image according to an embodiment of the present invention.
  • The present invention provides a system and method for detecting the behavior of an object by analyzing a thermal image of the object.
  • Thermal imaging cameras are distributed across several spaces, each acquiring a thermal image of its own space.
  • The cameras exchange thermal images with one another through mesh Wi-Fi communication, so the thermal images from all cameras can be collected quickly.
  • The change in the color brightness value according to the temperature of the cells that make up the thermal image is then used to detect the behavior of the object and recognize a dangerous situation.
  • In this way, the present invention enables fast and accurate detection of object behavior.
  • Figure 1 is a schematic overall configuration diagram of an object behavior detection system using thermal imaging images according to an embodiment of the present invention.
  • Referring to Figure 1, an object behavior detection system using thermal imaging according to an embodiment of the present invention may include at least one thermal imaging camera 100.
  • The thermal imaging cameras 100 may be distributed and installed in a plurality of separate spaces, and each may acquire a thermal image of its space, in particular a thermal image of an object in that space.
  • The thermal imaging camera 100 may include an infrared (IR) sensor capable of measuring the temperature of an object; the IR sensor measures the temperature of the object by detecting radiant heat (mainly infrared) emitted from the object.
  • The temperature of the object measured in this way may be displayed, in a color corresponding to the temperature, on a display connected to the thermal imaging camera 100.
  • The thermal imaging camera 100 has a communication module and can communicate with other thermal imaging cameras, or connect to the AP device 10 and/or the network 20.
  • The thermal imaging camera 100 may be equipped with different types of communication modules, or with a plurality of communication modules.
  • For example, the thermal imaging camera 100 may include a Wi-Fi communication module, an NFC communication module, a Zigbee communication module, a Bluetooth™ communication module, and the like.
  • In this embodiment, the thermal imaging cameras 100 are connected to one another over mesh Wi-Fi, so mesh Wi-Fi communication is possible between them and data can be transmitted and received directly between the cameras.
  • The thermal imaging camera 100 may be connected to the gateway 200 through its communication module.
  • The gateway 200 is a device that connects the network 300 and the home network 1, and it can transmit the thermal images delivered from the thermal imaging camera 100 to a server 400, described later, through the network 300.
  • The network 300 may be a cellular communication network such as LTE or 5G, or a typical Internet network.
  • The server 400 may receive the thermal image of the object transmitted from the thermal imaging camera 100 and store it in an internal storage unit.
  • The server 400 can analyze the behavior of the object from the thermal image.
  • The server 400 can transmit the thermal image and the analysis results of the object's behavior to the user terminal 500.
  • The user terminal 500 can display the thermal image and analysis results received from the server 400 on its screen, so the user can check them on the user terminal 500.
  • The user terminal 500 may be, for example, a portable mobile terminal such as a smartphone or tablet PC.
  • The server 400 may also transmit an alarm to a related agency server (not shown) according to the analysis results.
  • That is, if the analysis of the thermal image confirms a dangerous or emergency situation, a notification can be sent to the servers of related organizations such as medical institutions, police stations, and public health centers.
  • Figure 2 is a diagram for explaining mesh Wi-Fi connection of a plurality of thermal imaging cameras according to an embodiment of the present invention.
  • Each thermal imaging camera 100 includes a communication module 110 for mesh Wi-Fi communication.
  • The thermal imaging cameras 100 can be installed in a plurality of spaces 10, 20, and 30, respectively, and the cameras can communicate wirelessly with one another using their mesh Wi-Fi communication modules 110.
  • At least one of the plurality of thermal imaging cameras 100 may be connected to the gateway 200.
  • In this example, the communication module 110 of the thermal imaging camera 100 in the second space is connected to the gateway 200.
  • The thermal imaging cameras 100 in the first, second, and third spaces can each acquire a thermal image of their space; if an object is present in a space, a thermal image of the object is naturally obtained as well.
  • The thermal imaging cameras 100 in the first and third spaces can acquire thermal images of their spaces and transmit them, through their communication modules 110, to the communication module 110 of the thermal imaging camera 100 in the second space.
  • The thermal imaging camera 100 in the second space can then transmit the thermal image of its own space together with the thermal images of the first and third spaces to the gateway 200 through its communication module 110.
  • The gateway 200 can transmit all thermal images of the first, second, and third spaces to the server 400 through the network 300.
  • In the mesh Wi-Fi communication method according to this embodiment, two-way communication is possible between the communication modules 110 of the thermal imaging cameras 100, and compared to a conventional extender-based Wi-Fi scheme, the data transmission speed increases as the number of communication modules grows.
  • In practice, the communication module 110 may also communicate with a mobile communication terminal; even in that case, mesh Wi-Fi communication between the communication modules 110 is carried out over a separate communication path, so communication of the mobile terminal is not affected.
  • Because the mesh Wi-Fi communication in this embodiment uses a communication path different from that of the mobile communication terminal, communication speed can remain optimized even while communicating with the mobile terminal.
  • In addition, since the plurality of communication modules 110 are set to a single SSID (service set identifier), uninterrupted Wi-Fi communication is possible even when a thermal imaging camera 100 is moved.
  • Figure 3 is a detailed configuration diagram of a thermal imaging camera constituting an object behavior detection system according to an embodiment of the present invention.
  • The thermal imaging camera 100 includes a communication module 110 that communicates with external devices, a thermal image sensor 120 that acquires a thermal image of an object, a power supply unit 130 that supplies power, a storage unit 140 that stores data and programs, and a camera control unit 150 that controls the overall operation of the thermal imaging camera 100.
  • The communication module 110 can communicate with the communication modules 110 of other thermal imaging cameras 100; as described above, it can perform mesh Wi-Fi communication.
  • The communication module 110 can transmit and receive thermal images.
  • The thermal image may include metadata indicating which thermal imaging camera 100 acquired it and when.
  • The metadata may include information such as the identification information of the thermal imaging camera 100 that captured the thermal image, the time the image was captured, and the location where it was captured.
  • The communication module 110 may also communicate with the gateway 200.
  • The gateway 200 may transmit the thermal image received from the communication module 110 to the server 400.
  • The thermal image sensor 120 may be configured as, for example, an infrared (IR) sensor.
  • The thermal image sensor 120 can acquire thermal images of its surroundings.
  • In particular, it can acquire thermal images of the space where the thermal imaging camera 100 is placed and of objects (e.g., people) in that space.
  • The power supply unit 130 may include a battery and may supply power to the thermal imaging camera 100.
  • The power supply unit 130 may include a charging device (not shown) that can charge the battery.
  • Alternatively, the power supply unit 130 may receive commercial power, convert it to a voltage suitable for the thermal imaging camera 100, and supply that power.
  • The thermal image acquired by the thermal image sensor 120 may be stored in the storage unit 140.
  • The camera control unit 150 can control the operations of the communication module 110, the thermal image sensor 120, and the power supply unit 130.
  • Specifically, the camera control unit 150 can control the thermal image sensor 120 to acquire thermal images of the surroundings and store them in the storage unit 140, can control the communication module 110 to transmit the thermal images to other thermal imaging cameras 100 and/or to the gateway 200, and can control the power supply unit 130 to supply power.
  • FIG. 4 is a detailed configuration diagram of a server constituting a behavior detection system according to an embodiment of the present invention.
  • The server 400 may be configured to include a server communication unit 410, a database (DB) 420, and a server control unit 430.
  • The server communication unit 410 can connect to the network 300 and communicate with the thermal imaging cameras 100 and/or the user terminal 500.
  • The server communication unit 410 may also communicate with servers (not shown) of related organizations such as medical institutions, police stations, emergency rooms, and administrative agencies.
  • The database (DB) 420 can store the various programs and data necessary for the operation of the server 400.
  • In particular, the DB 420 can store the thermal images transmitted from the plurality of thermal imaging cameras 100.
  • In the drawing, the DB 420 is shown installed inside the server 400, but alternatively it may be built separately outside the server 400 and operate in conjunction with the server 400.
  • The server control unit 430 can control the overall operation of the server 400.
  • In particular, the server control unit 430 can analyze the behavior of an object using the thermal images.
  • Specifically, the server control unit 430 sets a color brightness value according to the temperature of the object for each of the plurality of cells constituting the thermal image and analyzes the object's behavior using the change in brightness value in specific cells.
  • The server control unit 430 can recognize whether an object is in a dangerous situation by analyzing the thermal image.
  • That is, by setting a color brightness value according to the temperature of the object for each of the cells constituting the thermal image and using the change of the brightness value in specific cells, the server control unit 430 can recognize whether the object is performing a routine action, making a dangerous movement, or is in a dangerous situation.
  • In addition, the server control unit 430 can determine whether an object is in a dangerous situation by checking the change in brightness values, and the speed of that change, across a plurality of cells.
  • The server control unit 430 may be equipped with an artificial neural network (ANN) pre-trained through machine learning and can detect the behavior of objects and recognize dangerous situations based on the thermal images.
  • The server control unit 430 may include a deep neural network (DNN), such as a convolutional neural network (CNN), a recurrent neural network (RNN), or a deep belief network (DBN), trained through deep learning.
  • Deep learning can refer to a set of machine learning algorithms that extract key information from a large amount of data by passing it sequentially through hidden layers.
  • The deep learning structure can be composed of deep neural networks (DNN) such as CNN, RNN, and DBN.
  • A deep neural network (DNN) may include an input layer, a hidden layer, and an output layer.
  • The input to each node may be the outputs of the nodes in the previous layer with weights applied.
  • A weight may refer to the strength of the connection between nodes.
  • The deep learning process can therefore be seen as a process of finding appropriate weights.
  • Learning in an artificial neural network is accomplished by adjusting the weights of the connections between nodes so that a desired output is produced for a given input, and the network can continuously update its weight values through learning.
  • The server control unit 430 may transmit a notification to the user terminal 500 and/or a related organization server (not shown) through the server communication unit 410.
  • Figure 5 is an example diagram showing a thermal image obtained from a thermal imaging camera according to an embodiment of the present invention.
  • A thermal image of an object acquired by the thermal imaging camera 100 may be displayed on a display connected to the thermal imaging camera 100 by wire or wirelessly.
  • In the thermal images illustrated in Figure 5, (a) shows a person raising a hand, (b) shows a person sitting on a chair, and (c) shows a person sitting on the floor.
  • The server control unit 430 can detect the behavior of the object using values corresponding to the colors according to the temperature of each part of the object in the thermal image.
  • Figures 6a and 6b are examples of converting the temperature of each part of an object into a color value in a thermal image according to an embodiment of the present invention.
  • FIG. 6A is an example of converting the temperature of a plurality of cells constituting the thermal image of the object shown in (b) of FIG. 5 into color values.
  • FIG. 6B is an example of converting the temperature of a plurality of cells constituting the thermal image of the object shown in (c) of FIG. 5 into color values.
  • A thermal image may consist of a plurality of cells, and a color corresponding to the temperature of the object may be displayed in each cell. Different temperatures are displayed in different colors, and the higher the temperature in a cell, the brighter the color that may be displayed.
  • The color values shown in the drawings may be the relative brightness values of the colors corresponding to the temperature of the object: the higher the temperature, the brighter the color, and the brighter the color, the higher the brightness value that can be set.
  • The color of a cell can be displayed as a combination of red (R), green (G), and blue (B).
  • R, G, and B can each have color values between 0 and 254, and these color values can serve as the brightness value of each cell; the brighter the cell, the closer the value is to 254, and the darker the cell, the closer it is to 0.
  • The brightness value of the color can also be set to an arbitrary range.
  • In this embodiment, the color brightness value for each cell is classified into levels 0 to 100.
  • The color of a cell corresponding to an area where the temperature of the object is relatively high is given a relatively large brightness value, and conversely, the color of a cell corresponding to an area where the temperature is relatively low is given a relatively small brightness value.
  • In this example, the thermal image consists of 10 × 15 cells.
  • The thermal image displays colors of different brightness depending on the temperature of the object, and Figures 6a and 6b show an example in which the temperature of each cell corresponding to a part of the object is converted into a brightness value.
  • In the example of Figure 6a, the brightness value of the darkest part is 0 and the brightness value of the brightest part is 81.
  • In the example of Figure 6b, the brightness value of the darkest part is 0 and the brightness value of the brightest part is 79.
  • The server control unit 430 may extract a region of interest from the thermal image.
  • A region of interest may be an area set for determining a specific part of the object.
  • The region of interest may be the upper body of the object, preferably the face; in another embodiment, it may be the brightest part of the thermal image, or an area including the brightest part and its surroundings.
  • The server control unit 430 can then calculate a representative brightness value for the region of interest.
  • If the region of interest corresponds to one cell, the brightness value of that cell can be used as the representative brightness value.
  • If the region of interest corresponds to two or more cells, the average of the brightness values of those cells can be used as the representative brightness value, or the highest brightness value among them can be used.
  • In Figure 6a, the region of interest in the thermal image may be the brightest region M, and the corresponding region of interest in the cell grid may be region N.
  • In that case, when the largest brightness value is used, the representative brightness value of region N is 81.
  • As long as the object does not move, the brightness value in region N will not change, and even if it changes, the amount of change will be small.
  • If the object moves, however, the region of interest may change from region N to region N'.
  • In that case the region of interest in the thermal image may be the brightest region M', and the corresponding region of interest in the cell grid may be region N' corresponding to region M'.
  • When the representative brightness value is taken as the average, the representative brightness value of region N' is 46, and when it is taken as the largest brightness value, it is 71.
  • The server control unit 430 can detect the behavior of the object by detecting the change of the region of interest from region N to region N', and can also detect a dangerous situation of the object.
  • For example, if the region of interest includes the object's face, the object's action can be detected from the movement of that region.
  • That is, when the object's face moves from region N to region N', it can be detected that the object has moved.
  • The server control unit 430 can further determine whether the object is in danger using the characteristics of the movement of the region of interest.
  • Specifically, the server control unit 430 can determine a dangerous situation of the object using (1) the moving speed of the region of interest, (2) the moving distance of the region of interest, (3) the rate of change of the representative brightness value of the immediately preceding region of interest, and (4) the duration of the representative brightness value of the moved region of interest.
  • When a dangerous situation occurs, the speed at which the region of interest moves can be relatively fast compared to normal everyday movement; for example, if the object suddenly falls or suddenly sits down, the region of interest will move quickly.
  • Accordingly, the server control unit 430 may recognize a dangerous situation when the region of interest moves faster than a certain speed. That is, if the speed at which the region of interest moves from region N to region N' is greater than or equal to the preset first reference value (first condition), it can be recognized that a dangerous situation has occurred for the object.
  • Likewise, the moving distance of the region of interest may be relatively long compared to normal everyday movement; for example, when an object suddenly falls while standing, the distance between the initial region of interest (e.g., region N) and the moved region of interest (e.g., region N') increases.
  • The server control unit 430 may therefore recognize a dangerous situation when the moving distance between regions of interest is more than a certain distance. That is, if the distance the region of interest moves from region N to region N' is greater than or equal to the preset second reference value (second condition), it can be recognized that a dangerous situation has occurred for the object.
  • In a dangerous situation the region of interest can also move abruptly; as in the example of Figures 6a and 6b, when an object that was standing falls down, the region of interest moves from region N to region N'.
  • In Figure 6a, where the object is standing, the representative brightness value of region N, the region of interest, is 81.
  • In Figure 6b, where the object has sat down, the region of interest has moved to region N', and the representative brightness value of region N, which was the region of interest immediately before, becomes 8.
  • The server control unit 430 can therefore recognize a dangerous situation of the object by detecting that the representative brightness value in the preceding region of interest changes faster than a certain rate. That is, if the rate of change of the representative brightness value in region N, which was the region of interest, is greater than or equal to the preset third reference value (third condition), it can be recognized that a dangerous situation has occurred for the object.
  • In other words, if the rate of decrease of the representative brightness value in the preceding region of interest is greater than or equal to the third reference value, it can be recognized that a dangerous situation has occurred for the object.
  • Conditions 1, 2, and 3 can be met not only when a dangerous situation occurs, but also when the object moves quickly for another reason; for example, even when an object quickly moves from sitting to standing, any one of the first, second, and third conditions may be satisfied. In such cases, the reliability of recognizing the object's dangerous situation may decrease.
  • Reliability in determining whether a dangerous situation has occurred can therefore be increased by also using the duration of the representative brightness value in the changed region of interest.
  • The fact that the representative brightness value of the moved region of interest persists for more than a certain period of time may mean that the object does not move for more than that period after moving.
  • For example, when the object falls, the region of interest may move from a higher position to a lower one. After the region of interest moves downward, if the representative brightness value in the moved region does not change, or persists for more than a certain period of time, it may mean that the object is not moving, which can indicate a dangerous situation for the object.
  • The server control unit 430 can therefore check whether the representative brightness value in the moved region of interest persists for more than a certain period of time and thereby recognize a dangerous situation of the object. That is, if the time for which the representative brightness value persists in region N', the moved region of interest, is greater than or equal to the preset fourth reference value (fourth condition), it can be recognized that a dangerous situation has occurred for the object.
  • The server 400 can recognize that a dangerous situation has occurred for an object when at least one of the first, second, third, and fourth conditions is satisfied.
  • Alternatively, the server 400 may recognize that a dangerous situation has occurred for an object when at least one of the first, second, and third conditions is satisfied and the fourth condition is satisfied at the same time.
  • Because the fourth condition must also be satisfied in this case, determining that a dangerous situation has occurred under these circumstances improves accuracy.
  • Figure 7 is a flowchart showing the process of detecting the behavior of an object using a thermal image according to an embodiment of the present invention.
  • Referring to Figure 7, in the object behavior detection method, the thermal imaging cameras 100 installed in a plurality of spaces acquire thermal images of objects (S101). The thermal images can be acquired in real time or at set intervals.
  • Each thermal imaging camera 100 transmits its acquired thermal image, through mesh Wi-Fi communication, to the specific thermal imaging camera 100 connected to the gateway 200 (S102).
  • That specific thermal imaging camera 100 transmits the thermal image it acquired itself, together with the thermal images received from the other thermal imaging cameras 100, to the server 400 through the gateway 200 and the network 300 (S103).
  • The server 400 analyzes the received thermal images (S104) and determines whether at least one of the first, second, third, and fourth conditions occurs (S105).
  • If such a condition occurs, the server 400 recognizes that a dangerous situation has occurred for the object (S106); if not, the server 400 returns to step S101 and repeats the subsequent process.
  • The server 400 may then report the occurrence of the dangerous situation to the user terminal 500 and/or a related agency server (S107). A simplified sketch of this overall flow is shown after this list.
  • According to the present invention, it is possible to determine whether an object is in danger by capturing a thermal image of the object with a thermal imaging camera and having the server receive and analyze that image.
  • In addition, since the plurality of thermal imaging cameras can communicate with one another through mesh Wi-Fi, all thermal images can be quickly collected and transmitted to the server, which speeds up processing.
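The following is a minimal Python sketch of the S101 to S107 loop summarized above, as referenced at the end of the list. It is illustrative only: ThermalFrame, collect_frames_via_mesh, danger_detected, and notify are hypothetical names, and the actual image acquisition, mesh relay, and condition analysis are reduced to placeholders.

```python
# Hypothetical sketch of the S101-S107 loop in Figure 7 (all names are illustrative).
import time
from dataclasses import dataclass
from typing import List


@dataclass
class ThermalFrame:
    camera_id: str           # which thermal imaging camera captured the frame
    captured_at: float       # capture time (epoch seconds)
    cells: List[List[int]]   # per-cell brightness values (e.g. 0-100 levels)


def collect_frames_via_mesh() -> List[ThermalFrame]:
    """S101-S103: each camera acquires a frame and relays it over mesh Wi-Fi
    to the gateway-connected camera, which forwards everything to the server.
    Here we only simulate receiving the already-collected frames."""
    return []  # placeholder: frames received through the gateway


def danger_detected(frames: List[ThermalFrame]) -> bool:
    """S104-S105: analyze the thermal images and check whether at least one
    of the first to fourth conditions described above occurs."""
    return False  # placeholder for the condition checks


def notify(frames: List[ThermalFrame]) -> None:
    """S106-S107: report the recognized dangerous situation to the user
    terminal and/or related agency servers."""
    print("dangerous situation reported for", [f.camera_id for f in frames])


def detection_loop(poll_interval_s: float = 1.0) -> None:
    while True:                                 # acquisition can be real-time or periodic
        frames = collect_frames_via_mesh()      # S101-S103
        if danger_detected(frames):             # S104-S105
            notify(frames)                      # S106-S107
        time.sleep(poll_interval_s)             # otherwise return to S101
```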

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Emergency Management (AREA)
  • Business, Economics & Management (AREA)
  • Signal Processing (AREA)
  • Gerontology & Geriatric Medicine (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Physics (AREA)
  • Molecular Biology (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Social Psychology (AREA)
  • Human Computer Interaction (AREA)
  • Psychology (AREA)
  • Psychiatry (AREA)
  • Alarm Systems (AREA)

Abstract

The present invention relates to an object behavior detection system for detecting the behavior of an object by analyzing a thermal image of the object. The object behavior detection system of the present invention may comprise: a thermal imaging camera that includes a communication module, is installed in each of a plurality of spaces, and acquires a thermal image of an object in each of the installed spaces; and a server that receives a thermal image from the thermal imaging camera and analyzes the received thermal image to determine whether the object is in a dangerous situation.

Description

Object behavior detection system using thermal imaging
The present invention relates to a system for detecting the behavior of an object using thermal imaging, and more particularly to a system that analyzes a thermal image of an object to detect whether the object is in a dangerous situation.
Recently, as human lifespan has increased, interest in individual health management has grown along with the desire to live a long and healthy life.
Owing to the trend toward nuclear families, people spend more time alone at home, which makes it difficult to respond promptly when an emergency occurs.
When an elderly or ill person is alone at home and a dangerous situation such as a fall occurs, it is important to detect it quickly and take action.
Recently, systems have been used in which ordinary cameras or CCTV are installed in apartments or other residential spaces and an emergency call is sent to a guardian or an emergency center when an event such as a dangerous situation occurs.
However, such systems raise privacy concerns, and because no one other than the guardian can monitor the dangerous situation, emergency response is limited.
To solve these problems, detection systems using thermal imaging cameras have been introduced.
Registered Patent Publication No. 10-2052883 (Prior Document 1) discloses a method of controlling a fall prediction system using a thermal imaging camera that determines the risk of a fall based on the x-y axis histogram data of a captured thermal image and generates a notification.
Registered Patent Publication No. 10-1916631 (Prior Document 2) discloses a fall detection device including a thermal imaging camera that determines whether a patient has fallen from imaging data captured by the thermal imaging camera.
Registered Patent Publication No. 10-1927220 (Prior Document 3) discloses a method and device that use a thermal image to sense the temperature of an object of interest located within a target area and determine the state and situation of the object of interest from the sensed temperature.
However, conventional detection systems using thermal images, including Prior Documents 1 to 3, have limitations in detecting fine movements, which reduces the accuracy of determining dangerous situations.
In addition, in the prior art, when a plurality of thermal imaging cameras are installed in several spaces within a home, the cameras are distributed across those spaces, so a time delay occurs while collecting detection information through communication with each camera. This slows the analysis of the thermal images, which can have serious consequences for a person who needs prompt emergency treatment.
Accordingly, the technical field requires accurate detection performance when detecting behavior from thermal images, and the development of technology that enables rapid processing of detection information even when thermal imaging cameras are distributed and installed across multiple spaces.
One purpose of the present invention is to provide an object behavior detection system using thermal images that can accurately detect object behavior from those images.
Another purpose of the present invention is to provide an object behavior detection system using thermal images in which a plurality of thermal imaging cameras distributed across several spaces can communicate with one another, so that detection information from the cameras can be collected rapidly through camera-to-camera communication.
A further purpose of the present invention is to provide an object behavior detection system using thermal images that can quickly and accurately detect a dangerous situation of an object, without infringing on individual privacy, by acquiring and analyzing a thermal image of the object through a thermal imaging camera.
An object behavior detection system using thermal imaging according to an embodiment of the present invention includes: a thermal imaging camera that includes a communication module, is installed in each of a plurality of spaces, and acquires a thermal image of an object in the space where it is installed; and a server that receives the thermal image from the thermal imaging camera and analyzes the received thermal image to determine whether the object is in a dangerous situation. The server extracts a region of interest from the thermal image, sets a brightness value corresponding to the color according to the temperature of the object for each of the plurality of cells constituting the thermal image, calculates a representative brightness value for the region of interest, and can recognize whether the object is in danger using (1) the moving speed of the region of interest, (2) the moving distance of the region of interest, (3) the rate of change of the representative brightness value of the immediately preceding region of interest, and (4) the duration of the representative brightness value of the currently moved region of interest.
In this embodiment, the region of interest may include the facial area of the object.
In this embodiment, when the region of interest corresponds to one cell, the brightness value of that cell can be used as the representative brightness value.
In this embodiment, when the region of interest corresponds to two or more cells, the average of the brightness values of those cells can be used as the representative brightness value.
In this embodiment, when the region of interest corresponds to two or more cells, the largest brightness value among those cells can be used as the representative brightness value.
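As a rough illustration of the three ways of computing the representative brightness value just described, here is a minimal Python sketch; the function name, the mode parameter, and the example values are assumptions made for illustration and are not taken from the patent.

```python
# Minimal sketch of the representative brightness value of a region of interest (ROI).
from statistics import mean
from typing import List


def representative_brightness(roi_cells: List[int], mode: str = "max") -> float:
    """roi_cells holds the brightness values of the cells covered by the ROI."""
    if len(roi_cells) == 1:
        return float(roi_cells[0])   # single cell: use its own brightness value
    if mode == "mean":
        return mean(roi_cells)       # two or more cells: average brightness
    return float(max(roi_cells))     # two or more cells: largest brightness


# Arbitrary example values, not taken from the figures:
print(representative_brightness([46]))                       # 46.0
print(representative_brightness([30, 46, 71], mode="mean"))  # 49
print(representative_brightness([30, 46, 71], mode="max"))   # 71.0
```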
In this embodiment, the communication modules of the plurality of thermal imaging cameras can transmit and receive the thermal images to one another through mesh Wi-Fi communication.
In this embodiment, at least one of the plurality of thermal imaging cameras is connected to a gateway and can transmit thermal images to the server through the gateway.
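To illustrate the relay path implied by these two paragraphs (cameras exchanging images over mesh Wi-Fi, with one gateway-connected camera forwarding everything), the following is a hedged sketch; the Camera class, its fields, and the send method are invented for illustration and do not correspond to the actual device firmware or any specific mesh Wi-Fi API.

```python
# Hypothetical illustration of the relay: cameras in spaces 1 and 3 send frames
# to the camera in space 2, which alone is connected to the gateway.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Camera:
    camera_id: str
    gateway_connected: bool = False
    neighbor: Optional["Camera"] = None       # next hop over mesh Wi-Fi
    outbox: List[dict] = field(default_factory=list)

    def send(self, frame: dict) -> None:
        if self.gateway_connected:
            self.outbox.append(frame)         # would be pushed to the gateway/server
        elif self.neighbor is not None:
            self.neighbor.send(frame)         # relay over the mesh to the next camera


cam2 = Camera("space-2", gateway_connected=True)
cam1 = Camera("space-1", neighbor=cam2)
cam3 = Camera("space-3", neighbor=cam2)

cam1.send({"camera_id": "space-1", "cells": [[0] * 15] * 10})
cam3.send({"camera_id": "space-3", "cells": [[0] * 15] * 10})
print(len(cam2.outbox))  # 2: both frames reached the gateway-connected camera
```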
In this embodiment, if the moving speed of the region of interest is greater than or equal to a preset first reference value (first condition), the server may recognize that a dangerous situation has occurred for the object.
In this embodiment, if the moving distance between regions of interest is greater than or equal to a preset second reference value (second condition), the server may recognize that a dangerous situation has occurred for the object.
In this embodiment, if the rate of change of the representative brightness value in the immediately preceding region of interest is greater than or equal to a preset third reference value (third condition), the server may recognize that a dangerous situation has occurred for the object.
In this embodiment, if the duration of the representative brightness value in the moved region of interest is greater than or equal to a preset fourth reference value (fourth condition), the server may recognize that a dangerous situation has occurred for the object.
In this embodiment, the server may also recognize that a dangerous situation has occurred for the object when at least one of the following is satisfied: the moving speed of the region of interest is at or above the preset first reference value (first condition), the moving distance between regions of interest is at or above the preset second reference value (second condition), or the rate of change of the representative brightness value in the preceding region of interest is at or above the preset third reference value (third condition); and, at the same time, the duration of the representative brightness value in the moved region of interest is at or above the preset fourth reference value (fourth condition).
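The combined decision rule of this embodiment (at least one of conditions 1 to 3 together with condition 4) could be sketched as follows; the function name, the threshold parameters, and their default values are placeholders chosen for illustration, not values disclosed in the patent.

```python
# Hypothetical sketch of the first to fourth conditions. Thresholds are arbitrary.
def is_dangerous(roi_speed: float,               # (1) speed at which the ROI moved
                 roi_distance: float,            # (2) distance the ROI moved
                 brightness_change_rate: float,  # (3) rate of change of the previous
                                                 #     ROI's representative brightness
                 hold_time_s: float,             # (4) time the new ROI's representative
                                                 #     brightness has persisted
                 t_speed: float = 1.0,
                 t_distance: float = 1.0,
                 t_change_rate: float = 1.0,
                 t_hold_s: float = 30.0) -> bool:
    cond1 = roi_speed >= t_speed
    cond2 = roi_distance >= t_distance
    cond3 = brightness_change_rate >= t_change_rate
    cond4 = hold_time_s >= t_hold_s
    # Stricter variant described in the text: a fast/large movement (any of
    # conditions 1-3) followed by no further movement (condition 4).
    return (cond1 or cond2 or cond3) and cond4


print(is_dangerous(roi_speed=2.0, roi_distance=0.2,
                   brightness_change_rate=0.5, hold_time_s=45.0))  # True
```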
The object behavior detection system using thermal imaging according to an embodiment of the present invention has one or more of the following effects.
According to the present invention, the temperature in a thermal image is subdivided into numerical values and the behavior of an object is detected from changes in those values, so behavior can be detected accurately from the thermal image.
According to the present invention, a plurality of thermal imaging cameras can be distributed across multiple spaces, making it possible to detect object behavior occurring in those spaces.
According to the present invention, even when the thermal imaging cameras are distributed across several spaces, the cameras can communicate with one another, so detection information from the plurality of cameras can be collected and processed quickly, enabling rapid detection.
According to the present invention, by acquiring and analyzing a thermal image of an object through a thermal imaging camera, a dangerous situation of the object can be detected quickly and accurately without infringing on privacy.
Figure 1 is a configuration diagram of an object behavior detection system using thermal images according to an embodiment of the present invention.
Figure 2 is a diagram for explaining the mesh Wi-Fi connection of thermal imaging cameras according to an embodiment of the present invention.
Figure 3 is a detailed configuration diagram of a thermal imaging camera constituting an object behavior detection system using thermal images according to an embodiment of the present invention.
Figure 4 is a detailed configuration diagram of a server constituting an object behavior detection system using thermal images according to an embodiment of the present invention.
Figure 5 is an example diagram of a thermal image acquired by a thermal imaging camera according to an embodiment of the present invention.
Figures 6a and 6b are example diagrams of converting the temperature of each part of an object into a color value in a thermal image according to an embodiment of the present invention.
Figure 7 is a flowchart showing the process of detecting the behavior of an object using a thermal image according to an embodiment of the present invention.
Hereinafter, some embodiments of the present invention are described in detail with reference to the illustrative drawings. In assigning reference numerals to the components of each drawing, the same components are given the same reference numerals wherever possible, even when they appear in different drawings. In addition, in describing the embodiments of the present invention, detailed descriptions of related known configurations or functions are omitted when they would impede understanding of the embodiments.
The present invention provides a system and method for detecting the behavior of an object by analyzing a thermal image of the object. Thermal imaging cameras are distributed across several spaces, each acquiring its own thermal image, and the cameras exchange thermal images with one another through mesh Wi-Fi communication so that the images from all cameras can be collected quickly. In addition, the change in the color brightness value according to the temperature of the cells that make up the thermal image is used to detect the behavior of the object and recognize a dangerous situation. In this way, the present invention enables fast and accurate detection of object behavior.
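As a hedged illustration of the cell-based representation just described, the sketch below maps per-cell temperatures onto brightness levels 0 to 100 and picks the brightest cell as a candidate region of interest; the grid size follows the 10 × 15 example in Figures 6a and 6b, while the temperature range and function names are assumptions made for illustration.

```python
# Hypothetical mapping from cell temperatures to brightness levels (0-100),
# plus selection of the brightest cell as a candidate region of interest.
from typing import List, Tuple


def to_brightness(temp_c: float, t_min: float = 15.0, t_max: float = 40.0) -> int:
    """Scale a cell temperature into a 0-100 brightness level (assumed range)."""
    clipped = min(max(temp_c, t_min), t_max)
    return round(100 * (clipped - t_min) / (t_max - t_min))


def brightness_grid(temps: List[List[float]]) -> List[List[int]]:
    return [[to_brightness(t) for t in row] for row in temps]


def brightest_cell(grid: List[List[int]]) -> Tuple[int, int]:
    """Row/column index of the brightest cell, a simple ROI candidate."""
    return max(((r, c) for r in range(len(grid)) for c in range(len(grid[0]))),
               key=lambda rc: grid[rc[0]][rc[1]])


temps = [[20.0] * 15 for _ in range(10)]    # 10 x 15 grid as in Figures 6a/6b
temps[2][7] = 36.5                          # warm face-like spot
grid = brightness_grid(temps)
print(brightest_cell(grid), grid[2][7])     # (2, 7) 86
```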
Hereinafter, an object behavior detection system using thermal imaging according to an embodiment of the present invention is described in detail with reference to the attached drawings.
Figure 1 is a schematic overall configuration diagram of an object behavior detection system using thermal images according to an embodiment of the present invention.
Referring to Figure 1, an object behavior detection system using thermal imaging according to an embodiment of the present invention may include at least one thermal imaging camera 100.
The thermal imaging cameras 100 may be distributed and installed in a plurality of separate spaces, and each may acquire a thermal image of its space, in particular a thermal image of an object in that space.
The thermal imaging camera 100 may include an infrared (IR) sensor capable of measuring the temperature of an object; the IR sensor measures the temperature of the object by detecting radiant heat (mainly infrared) emitted from the object.
The temperature of the object measured in this way may be displayed, in a color corresponding to the temperature, on a display connected to the thermal imaging camera 100.
The thermal imaging camera 100 has a communication module and can communicate with other thermal imaging cameras, or connect to the AP device 10 and/or the network 20.
The thermal imaging camera 100 may be equipped with different types of communication modules, or with a plurality of communication modules. For example, the thermal imaging camera 100 may include a Wi-Fi communication module, an NFC communication module, a Zigbee communication module, a Bluetooth™ communication module, and the like.
In this embodiment, as one example, the thermal imaging cameras 100 are connected to one another over mesh Wi-Fi, so mesh Wi-Fi communication is possible between them and data can be transmitted and received directly between the cameras.
열화상 카메라(100)는 통신모듈을 통해 게이트웨이(200)와 연결될 수 있다. The thermal imaging camera 100 may be connected to the gateway 200 through a communication module.
게이트웨이(200)는 네트워크(300)와 댁내망(홈네트워크)(1)를 연결하는 장비로서, 열화상 카메라(100)로부터 전달되는 열화상 영상을 네트워크(300)를 통해 후술하는 서버(400)로 전송할 수 있다.The gateway 200 is a device that connects the network 300 and the in-home network (home network) 1, and can transmit thermal images received from the thermal imaging camera 100 to a server 400, described later, through the network 300.
네트워크(300)는 LTE, 5G 등과 같은 셀룰러 통신망이나 통상적인 인터넷망이 될 수 있다.The network 300 may be a cellular communication network such as LTE, 5G, etc., or a typical Internet network.
서버(400)는 열화상 카메라(100)에서 전달되는 객체의 열화상 영상을 수신하여 내부의 저장부에 저장할 수 있다.The server 400 may receive a thermal image of an object transmitted from the thermal imaging camera 100 and store it in an internal storage unit.
서버(400)는 열화상 영상으로부터 객체의 행동을 분석할 수 있다.The server 400 can analyze the behavior of objects from thermal imaging images.
서버(400)는 열화상 영상과 객체의 행동의 분석결과를 사용자단말(500)로 전송할 수 있다.The server 400 can transmit the thermal image image and the analysis results of the object's behavior to the user terminal 500.
사용자단말(500)은 서버(400)로부터 전송받은 열화상 영상과 분석결과를 화면에 표시할 수 있다. 이로써 사용자는 사용자단말(500)에서 이를 확인할 수 있다.The user terminal 500 can display the thermal image and analysis results received from the server 400 on the screen. As a result, the user can check this on the user terminal 500.
사용자단말(500)은 예컨대 스마트폰(smart phone), 태블릿 PC 등 휴대가능한 이동단말기가 될 수 있다.The user terminal 500 may be, for example, a portable mobile terminal such as a smart phone or a tablet PC.
서버(400)는 상기 분석결과에 따라 유관기관 서버(미도시)로 알람을 전송할 수도 있다. 즉, 열화상 영상의 분석결과에서 위험상황 또는 응급상황임이 확인되면 의료기관, 경찰서, 보건소 등 유관기관 서버로 알림을 보낼 수 있다.The server 400 may also transmit an alarm to a related agency server (not shown) according to the analysis results. That is, if the analysis of the thermal image confirms a dangerous or emergency situation, a notification can be sent to the servers of related organizations such as medical institutions, police stations, and public health centers.
도 2는 본 발명의 일 실시예에 따른 복수의 열화상 카메라의 메시 와이파이 연결을 설명하기 위한 도면이다.Figure 2 is a diagram for explaining mesh Wi-Fi connection of a plurality of thermal imaging cameras according to an embodiment of the present invention.
도 2를 참조하면, 각 열화상 카메라(100)는 메시 와이파이(mesh wi-fi) 통신을 위한 통신모듈(110)을 포함한다.Referring to FIG. 2, each thermal imaging camera 100 includes a communication module 110 for mesh Wi-Fi communication.
열화상 카메라(100)는 복수의 공간(10,20,30)에 각각 설치될 수 있으며, 각 열화상 카메라(100)는 메시 와이파이 통신모듈(110)을 이용하여 상호 간에 무선통신을 수행할 수 있다. The thermal imaging cameras 100 may be installed in a plurality of spaces 10, 20, and 30, respectively, and each thermal imaging camera 100 can perform wireless communication with the other cameras using the mesh Wi-Fi communication module 110.
복수의 열화상 카메라(100) 중에서 적어도 하나는 게이트웨이(200)에 연결될 수 있다. 도면에는 일례로 제2공간에 있는 열화상 카메라(100)의 통신모듈(110)이 게이트웨이(200)와 연결된다.At least one of the plurality of thermal imaging cameras 100 may be connected to the gateway 200. In the drawing, for example, the communication module 110 of the thermal imaging camera 100 in the second space is connected to the gateway 200.
제1,2,3공간의 열화상 카메라(100)는 각 공간의 열화상 영상을 획득할 수 있다. 해당 공간에 객체가 존재한다면 당연히 객체의 열화상 영상을 획득할 수 있다.The thermal imaging cameras 100 of the first, second, and third spaces can acquire thermal image images of each space. If an object exists in that space, you can naturally obtain a thermal image of the object.
제1,3공간에 있는 열화상 카메라(100)는 해당 공간에 대한 열화상 영상을 획득하여 통신모듈(110)을 통해 제2공간의 열화상 카메라(100)의 통신모듈(110)로 전송할 수 있다.The thermal imaging cameras 100 in the first and third spaces can acquire thermal images of their respective spaces and transmit them, through their communication modules 110, to the communication module 110 of the thermal imaging camera 100 in the second space.
제2공간에 있는 열화상 카메라(100)는 자신의 공간의 열화상 영상과 제1,3공간의 열화상 영상을 통신모듈(110)을 통해 게이트웨이(200)로 전송할 수 있다.The thermal imaging camera 100 in the second space can transmit thermal imaging images of its own space and thermal imaging images of the first and third spaces to the gateway 200 through the communication module 110.
게이트웨이(200)는 네트워크(300)를 통해 서버(400)로 제1,2,3공간의 열화상 영상을 모두 전송할 수 있다.The gateway 200 can transmit all thermal image images of spaces 1, 2, and 3 to the server 400 through the network 300.
본 실시예에 따른 메시 와이파이 통신방식에서는 열화상 카메라(100)의 통신모듈(110) 간에 양방향 통신이 가능하다. 이는 기존의 확장형 와이파이 통신방식에 비해 통신모듈의 개수가 증가할수록 데이터의 전송속도가 빠르다.In the mesh Wi-Fi communication method according to this embodiment, two-way communication is possible between the communication modules 110 of the thermal imaging camera 100. Compared to the existing expandable Wi-Fi communication method, the data transmission speed becomes faster as the number of communication modules increases.
즉, 복수의 통신모듈(110)을 하나의 통신망으로 구축하여 각 통신모듈(110) 간에 양방향 통신이 이루어지기 때문에 통신속도가 빨라지는 것이다.In other words, by constructing a plurality of communication modules 110 into one communication network, two-way communication occurs between each communication module 110, thereby increasing the communication speed.
통신모듈(110)은 실생활에서 이동통신 단말기와도 통신할 수 있다. 이 경우라도 통신모듈(110) 간에는 별도의 통신 루트를 통해 메시 와이파이 통신이 이루어지기 때문에 이동통신 단말기의 통신에는 영향을 주지 않는다. The communication module 110 can also communicate with a mobile communication terminal in real life. Even in this case, mesh Wi-Fi communication is performed between the communication modules 110 through a separate communication route, so communication of the mobile communication terminal is not affected.
즉, 본 실시예의 메시 와이파이 통신은 이동통신 단말기와는 다른 통신 루트로 통신이 되므로 이동통신 단말기와 통신하는 경우에도 통신속도를 최적화할 수 있는 것이다.In other words, mesh Wi-Fi communication in this embodiment communicates through a communication route different from that of a mobile communication terminal, so communication speed can be optimized even when communicating with a mobile communication terminal.
메시 와이파이 통신에서는 복수의 통신모듈(110)이 하나의 SSID(service set identifier)로 설정되므로 열화상 카메라(100)의 이동시에도 끊김 없는 와이파이 통신이 가능하다.In mesh Wi-Fi communication, a plurality of communication modules 110 are set to one SSID (service set identifier), so uninterrupted Wi-Fi communication is possible even when the thermal imaging camera 100 is moved.
도 3은 본 발명의 실시예에 따른 행동 감시 시스템을 구성하는 열화상 카메라의 상세 구성도이다.Figure 3 is a detailed configuration diagram of a thermal imaging camera constituting a behavior monitoring system according to an embodiment of the present invention.
도 3을 참조하면, 열화상 카메라(100)는 외부장치와 통신을 수행하는 통신모듈(110), 객체의 열화상 영상을 획득하는 열화상센서(120), 전원을 공급하는 전원공급부(130), 데이터와 프로그램을 저장하는 저장부(140), 열화상 카메라(100)의 전반적인 동작을 제어하는 카메라제어부(150)를 포함할 수 있다.Referring to FIG. 3, the thermal imaging camera 100 may include a communication module 110 that communicates with external devices, a thermal image sensor 120 that acquires a thermal image of an object, a power supply unit 130 that supplies power, a storage unit 140 that stores data and programs, and a camera control unit 150 that controls the overall operation of the thermal imaging camera 100.
통신모듈(110)은 다른 열화상 카메라(100)의 통신모듈(110)과 통신할 수 있다. 상술한 바와 같이 통신모듈(110)은 메시 와이파이 통신을 수행할 수 있다. The communication module 110 can communicate with the communication module 110 of another thermal imaging camera 100. As described above, the communication module 110 can perform mesh Wi-Fi communication.
통신모듈(110)은 열화상 영상을 송수신할 수 있다. The communication module 110 can transmit and receive thermal images.
열화상 영상에 어느 열화상 카메라(100)에서 언제 획득한 영상인지를 확인할 수 있는 메타데이터 정보가 포함될 수 있다.The thermal image may include metadata information that can be used to identify which thermal imaging camera 100 acquired the image and when it was acquired.
메타데이터에는 열화상 영상을 촬영한 열화상 카메라(100)의 식별정보, 열화상 영상을 촬영한 시간, 열화상 영상이 촬영된 위치 등의 정보가 포함될 수 있다.Metadata may include information such as identification information of the thermal imaging camera 100 that captured the thermal imaging image, the time the thermal imaging image was captured, and the location where the thermal imaging image was captured.
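As a rough illustration only, such per-frame metadata could be carried as a small record attached to each transmitted thermal image; the field names and values below are hypothetical and are not taken from the specification.

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class ThermalFrameMeta:
        camera_id: str         # identification information of the thermal imaging camera 100
        captured_at: datetime  # time at which the thermal image was captured
        location: str          # place where the thermal image was captured

    # Example record attached to one frame (placeholder values)
    meta = ThermalFrameMeta(camera_id="camera-2", captured_at=datetime.now(), location="room 2")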
통신모듈(110)은 게이트웨이(200)와 통신할 수도 있다. 게이트웨이(200)는 통신모듈(110)로부터 전달된 열화상 영상을 서버(400)로 전송할 수 있다.The communication module 110 may communicate with the gateway 200. The gateway 200 may transmit the thermal image received from the communication module 110 to the server 400.
열화상센서(120)는 예컨대 적외선(IR)센서로 구성될 수 있다.The thermal image sensor 120 may be configured as, for example, an infrared (IR) sensor.
열화상센서(120)는 주변에 대한 열화상 영상을 획득할 수 있다. 특히 본 실시예에서는 열화상 카메라(100)가 배치된 공간 및 그 공간에 있는 객체(예:사람)에 대한 열화상 영상을 획득할 수 있다.The thermal image sensor 120 can acquire thermal image images of the surroundings. In particular, in this embodiment, thermal image images of the space where the thermal imaging camera 100 is placed and objects (eg, people) in the space can be acquired.
전원공급부(130)는 배터리를 포함할 수 있으며, 열화상 카메라(100)로 전원을 공급할 수 있다. 전원공급부(130)는 배터리를 충전할 수 있는 충전장치(미도시)를 포함할 수 있다. The power supply unit 130 may include a battery and may supply power to the thermal imaging camera 100. The power supply unit 130 may include a charging device (not shown) that can charge the battery.
또는, 전원공급부(130)는 상용전원을 공급받아 열화상 카메라(100)에 적합한 전압으로 전환하여 전원을 공급할 수도 있다.Alternatively, the power supply unit 130 may receive commercial power, convert it to a voltage suitable for the thermal imaging camera 100, and supply power.
저장부(140)에는 열화상 카메라(100)의 동작에 필요한 각종 데이터 및 프로그램이 저장될 수 있다. 특히, 저장부(140)에 열화상센서(120)에서 획득된 열화상 영상이 저장될 수 있다.Various data and programs necessary for the operation of the thermal imaging camera 100 may be stored in the storage unit 140. In particular, a thermal image acquired by the thermal image sensor 120 may be stored in the storage unit 140.
카메라제어부(150)는 통신모듈(110), 열화상센서(120), 전원공급부(130)의 동작을 제어할 수 있다. The camera control unit 150 can control the operations of the communication module 110, the thermal image sensor 120, and the power supply unit 130.
카메라제어부(150)는 열화상 센서(120)가 주변에 대한 열화상 영상을 획득하여 저장부(140)에 저장하도록 제어할 수 있고, 통신모듈(110)를 통해 다른 열화상 카메라(100) 및/또는 게이트웨이(200)로 열화상 영상을 전송하도록 제어할 수 있으며, 전원공급부(130)로부터 전원이 공급되도록 제어할 수 있다.The camera control unit 150 can control the thermal image sensor 120 to acquire thermal images of the surroundings and store them in the storage unit 140, can control the communication module 110 to transmit the thermal images to other thermal imaging cameras 100 and/or the gateway 200, and can control the power supply unit 130 to supply power.
도 4는 본 발명의 실시예에 따른 행동 감지 시스템을 구성하는 서버의 상세 구성도이다.Figure 4 is a detailed configuration diagram of a server constituting a behavior detection system according to an embodiment of the present invention.
도 4를 참조하면, 본 발명의 실시예에 따른 서버(400)는 서버통신부(410), 데이터베이스(DB)(420) 및 서버제어부(430)를 포함하여 구성될 수 있다.Referring to FIG. 4, the server 400 according to an embodiment of the present invention may be configured to include a server communication unit 410, a database (DB) 420, and a server control unit 430.
서버통신부(410)는 네트워크(300)에 접속할 수 있고 열화상 카메라(100) 및/또는 사용자단말(500)과 통신할 수 있다. 선택적으로 서버통신부(410)는 의료기관, 경찰서, 응급실, 행정기관 등 유관기관들의 서버(미도시)와 통신할 수도 있다.The server communication unit 410 can connect to the network 300 and communicate with the thermal imaging camera 100 and/or the user terminal 500. Optionally, the server communication unit 410 may communicate with servers (not shown) of related organizations such as medical institutions, police stations, emergency rooms, and administrative agencies.
데이터베이스(DB)(420)는 서버(400)의 동작에 필요한 각종 프로그램 및 데이터를 저장할 수 있다. 특히 DB(420)는 복수의 열화상 카메라(100)로부터 전송된 열화상 영상을 저장할 수 있다.The database (DB) 420 can store various programs and data necessary for the operation of the server 400. In particular, the DB 420 can store thermal image images transmitted from a plurality of thermal imaging cameras 100.
도면에는 일례로 DB(420)가 서버(400)의 내부에 설치된 것으로 도시되어 있으나, 다르게는 DB(420)가 서버(400)의 외부에 별도로 구축되어 서버(400)와 연동되도록 동작할 수도 있다.In the drawing, for example, the DB 420 is shown as installed inside the server 400, but alternatively, the DB 420 may be built separately outside the server 400 and operate in conjunction with the server 400.
서버제어부(430)는 서버(400)의 전반적인 동작을 제어할 수 있다.The server control unit 430 can control the overall operation of the server 400.
서버제어부(430)는 열화상 영상을 이용하여 객체의 행동을 분석할 수 있다.The server control unit 430 can analyze the behavior of objects using thermal imaging images.
본 실시예에서 서버제어부(430)는 열화상 영상을 구성하는 복수의 셀마다 객체의 온도에 따른 색상의 밝기값을 설정하고 특정 셀에서의 밝기값 변화를 이용하여 객체의 행동을 분석할 수 있다.In this embodiment, the server control unit 430 sets a color brightness value according to the temperature of the object for each of the plurality of cells constituting the thermal image and can analyze the object's behavior using the change in brightness value in specific cells.
예컨대, 열화상 영상을 구성하는 복수의 셀에서 밝기값의 변화를 통해 객체가 앉아 있는지, 서 있는지, 누워 있는지, 객체가 앉아 있다가 일어서는지, 서 있다가 눕는지 등에 대한 행동을 분석할 수 있다.For example, through changes in brightness values in the plurality of cells that make up a thermal image, it is possible to analyze behavior such as whether an object is sitting, standing, or lying down, whether the object is sitting and then standing up, or standing and then lying down, and so on.
또한, 서버제어부(430)는 열화상 영상을 분석하여 객체의 위험상황 여부를 인지할 수 있다.Additionally, the server control unit 430 can recognize whether an object is in a dangerous situation by analyzing the thermal image.
본 실시예에서 서버제어부(430)는 열화상 영상을 구성하는 복수의 셀마다 객체의 온도에 따른 색상의 밝기값을 설정하고 특정 셀에서의 밝기값 변화를 이용하여 객체가 일상적인 행동을 하는지, 위험한 동작을 하는지, 위험상황에 처해 있는지 등을 인지할 수 있다.In this embodiment, the server control unit 430 sets the color brightness value according to the temperature of the object for each of the plurality of cells constituting the thermal image and uses the change in brightness value in a specific cell to determine whether the object performs routine actions. You can recognize whether you are making a dangerous move or are in a dangerous situation.
예컨대, 서버제어부(430)는 복수의 셀에서 밝기값 변화, 밝기값 변화의 속도 등을 확인함으로써 객체가 위험상황에 있는지를 확인할 수 있다.For example, the server control unit 430 can determine whether an object is in a dangerous situation by checking the change in brightness value and the speed of change in brightness value in a plurality of cells.
본 실시예에서 서버제어부(430)는 머신러닝(machine learning)으로 기학습된 인공신경망(Artificial Neural Networks:ANN)이 탑재되어 열화상 기반 객체의 행동 감지 및 위험상황 인지 등을 수행할 수 있다.In this embodiment, the server control unit 430 is equipped with an artificial neural network (ANN) pre-trained through machine learning and can detect the behavior of objects and recognize dangerous situations based on thermal images.
인공신경망(ANN)은 소프트웨어 형태로 구현되거나 칩(chip) 등 하드웨어 형태로 구현될 수 있다. 예컨대, 서버제어부(430)는 딥러닝(Deep Learning)으로 학습된 CNN(Convolutional Neural Network), RNN(Recurrent Neural Network), DBN(Deep Belief Network) 등 심층신경망(Deep Neural Network: DNN)을 포함할 수 있다.An artificial neural network (ANN) may be implemented in software form or in hardware form such as a chip. For example, the server control unit 430 may include a deep neural network (DNN) such as a convolutional neural network (CNN), a recurrent neural network (RNN), or a deep belief network (DBN) trained through deep learning.
딥러닝(Deep learning)은 히든 레이어들을 차례로 거치면서 복수의 데이터들로부터 핵심적인 데이터를 추출하는 머신러닝(Machine Learning) 알고리즘의 집합을 나타낼 수 있다.Deep learning can represent a set of machine learning algorithms that extract key data from a plurality of data by sequentially going through hidden layers.
딥러닝 구조는, CNN, RNN, DBN 등 심층신경망(DNN)으로 구성될 수 있다.The deep learning structure can be composed of deep neural networks (DNN) such as CNN, RNN, and DBN.
심층신경망(DNN)은 입력 레이어(Input Layer), 히든 레이어(Hidden Layer) 및 출력 레이어(Output Layer)를 포함할 수 있다.A deep neural network (DNN) may include an input layer, a hidden layer, and an output layer.
다중 히든 레이어(hidden layer)를 갖는 것을 DNN(Deep Neural Network)이라 할 수 있다. 각 레이어는 복수의 노드들을 포함하고, 각 레이어는 다음 레이어와 연관되어 있다. 노드들은 웨이트(weight)를 가지고 서로 연결될 수 있다.One with multiple hidden layers can be called a Deep Neural Network (DNN). Each layer contains multiple nodes, and each layer is related to the next layer. Nodes can be connected to each other with weights.
제1 히든 레이어(Hidden Layer 1)에 속한 임의의 노드로부터의 출력은, 제2 히든 레이어(Hidden Layer 2)에 속하는 적어도 하나의 노드의 입력이 된다. 이때 각 노드의 입력은 이전 레이어의 노드의 출력에 웨이트(weight)가 적용된 값일 수 있다. The output from any node belonging to the first hidden layer (Hidden Layer 1) becomes an input to at least one node belonging to the second hidden layer (Hidden Layer 2). Here, the input of each node may be a value obtained by applying a weight to the output of a node in the previous layer.
웨이트(weight)는 노드간의 연결 강도를 의미할 수 있다. 딥러닝 과정은 적절한 웨이트(weight)를 찾아내는 과정으로도 볼 수 있다.Weight may refer to the strength of the connection between nodes. The deep learning process can also be seen as a process of finding appropriate weights.
한편, 인공신경망의 학습은 주어진 입력에 대하여 원하는 출력이 나오도록 노드간 연결선의 웨이트(weight)를 조정함으로써 이루어질 수 있다. 또한, 인공신경망은 학습에 의해 웨이트(weight) 값을 지속적으로 업데이트시킬 수 있다.Meanwhile, learning of an artificial neural network can be accomplished by adjusting the weight of the connection lines between nodes to produce a desired output for a given input. Additionally, artificial neural networks can continuously update weight values through learning.
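Purely as a sketch of the layer-and-weight structure described above (the layer sizes, activation function, and random weights are illustrative assumptions, not values from the patent), a forward pass through a small fully connected network could look like this:

    import numpy as np

    rng = np.random.default_rng(0)
    W1 = rng.normal(size=(150, 32))  # input layer (e.g., 10x15 cell values) -> hidden layer 1
    W2 = rng.normal(size=(32, 16))   # hidden layer 1 -> hidden layer 2
    W3 = rng.normal(size=(16, 2))    # hidden layer 2 -> output layer (e.g., normal / danger scores)

    def relu(x):
        return np.maximum(0.0, x)

    def forward(cell_values):
        h1 = relu(cell_values @ W1)  # each node's input is the weighted output of the previous layer
        h2 = relu(h1 @ W2)
        return h2 @ W3               # training would repeatedly adjust the weights W1, W2, W3

    scores = forward(rng.normal(size=150))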
서버제어부(430)는 객체의 위험상황인 것으로 인지하면 서버통신부(410)를 통해 사용자단말(500) 및/또는 유관기관 서버(미도시)로 알림을 전송할 수도 있다.If the server control unit 430 recognizes that an object is in a dangerous situation, it may transmit a notification to the user terminal 500 and/or a related organization server (not shown) through the server communication unit 410.
도 5는 본 발명의 실시예에 따른 열화상 카메라에서 획득된 열화상 영상이 표시된 예시도이다.Figure 5 is an example diagram showing a thermal image obtained from a thermal imaging camera according to an embodiment of the present invention.
도 5를 참조하면, 열화상 카메라(100)에 의해 획득된 객체의 열화상 영상은 열화상 카메라(100)에 유무선으로 연결된 디스플레이에 표시될 수 있다.Referring to FIG. 5 , a thermal image of an object acquired by the thermal imaging camera 100 may be displayed on a display connected to the thermal imaging camera 100 by wire or wirelessly.
열화상 영상이 디스플레이에 표시될 때 객체의 부위별 온도에 따른 색상이 표시될 수 있다. 온도가 높은 부위와 낮은 부위는 다른 색상으로 표시될 수 있다.When a thermal image is displayed on a display, colors depending on the temperature of each part of the object may be displayed. Areas with high and low temperatures may be displayed in different colors.
도 5에 예시된 열화상 영상 중 (a)는 사람이 손을 드는 행동, (b)는 사람이 의자에 앉아 있는 행동, (c)는 사람이 바닥에 앉아 있는 행동을 나타낸다. Among the thermal image images illustrated in Figure 5, (a) represents a person raising a hand, (b) represents a person sitting on a chair, and (c) represents a person sitting on the floor.
서버제어부(430)는 열화상 영상에서 객체의 부위별 온도에 따른 색상에 대응하는 수치를 이용하여 객체의 행동을 감지할 수 있다.The server control unit 430 can detect the behavior of an object using values corresponding to colors according to the temperature of each part of the object in the thermal image.
도 6a 및 도 6b는 본 발명의 실시예에 따른 열화상 영상에서 객체의 부위별 온도를 색상값으로 변환하는 예시도이다.Figures 6a and 6b are examples of converting the temperature of each part of an object into a color value in a thermal image according to an embodiment of the present invention.
도 6a는 일례로 도 5의 (b)에 도시된 객체의 열화상 영상에 대해 열화상 영상을 구성하는 복수의 셀별 온도를 색상값으로 변환하는 예이다.FIG. 6A is an example of converting the temperature of a plurality of cells constituting the thermal image of the object shown in (b) of FIG. 5 into color values.
도 6b는 일례로 도 5의 (c)에 도시된 객체의 열화상 영상에 대해 열화상 영상을 구성하는 복수의 셀별 온도를 색상값으로 변환하는 예이다.FIG. 6B is an example of converting the temperature of a plurality of cells constituting the thermal image of the object shown in (c) of FIG. 5 into color values.
열화상 영상은 복수개의 셀로 구성될 수 있다. 그리고 각각의 셀마다 객체의 온도에 대응하는 색상이 표시될 수 있다. 온도가 다르면 다른 색상으로 표시되는 것이다. 셀에서 온도가 높을수록 상대적으로 더 밝은 색상으로 표시될 수 있다.A thermal image may consist of a plurality of cells. Additionally, a color corresponding to the temperature of the object may be displayed in each cell. Different temperatures are displayed in different colors. The higher the temperature in the cell, the relatively brighter the color may be displayed.
도면에 변환된 색상값은 객체의 온도에 대응하는 색상의 상대적인 밝기값이 될 수 있다. 즉, 온도가 높을수록 더 밝은 색상이 되고, 색상이 밝을수록 더 높은 밝기값이 설정될 수 있다.The converted color values shown in the drawing may be relative brightness values of the colors corresponding to the temperature of the object. That is, the higher the temperature, the brighter the color, and the brighter the color, the higher the brightness value that can be set.
셀의 색상은 레드(R), 그린(G), 블루(B)의 조합으로 표시될 수 있다. R,G,B는 예컨대 각각 0~254 사이의 색상값을 가질 수 있으며, 이러한 색상값은 각 셀의 밝기값이 될 수 있다. 밝을수록 254에 가깝고, 어두울수록 0에 가깝다.The color of the cell can be displayed as a combination of red (R), green (G), and blue (B). For example, R, G, and B can each have color values between 0 and 254, and these color values can be the brightness value of each cell. The brighter it is, the closer it is to 254, and the darker it is, the closer it is to 0.
색상의 밝기값은 임의의 범위로 설정할 수 있다. 본 실시예에서는 각 셀별로 색상의 밝기값을 0~100 레벨로 구분한다. The brightness value of the color can be set to an arbitrary range. In this embodiment, the color brightness value for each cell is classified into levels 0 to 100.
객체의 온도가 상대적으로 높은 부위에 대응하는 셀에서의 색상은 상대적으로 큰 밝기값이 설정되며, 반대로 객체의 온도가 상대적으로 낮은 부위에 대응하는 셀에서의 색상은 상대적으로 작은 밝기값이 설정되는 것이다.A relatively large brightness value is set for the color of a cell corresponding to a part of the object with a relatively high temperature, and conversely, a relatively small brightness value is set for the color of a cell corresponding to a part with a relatively low temperature.
도 6a 및 도 6b의 일례를 보면, 열화상 영상은 10×15개의 셀로 구성된다.Looking at the example of FIGS. 6A and 6B, the thermal image consists of 10×15 cells.
물론, 이는 일례에 불과하며, 열화상 카메라(100)의 성능, 열화상 영상의 해상도에 따라 M×N(M,N≥2인 정수)개의 셀로 구성될 수 있다.Of course, this is only an example, and may be composed of M×N (M, N ≥ 2 integer) cells depending on the performance of the thermal imaging camera 100 and the resolution of the thermal imaging image.
좌측에는 열화상 영상에 객체의 온도에 따라 밝기가 다른 색상이 표시되고, 우측에는 좌측의 열화상 영상에 대응하는 10×15개의 셀에서 객체의 부위에 대응하는 셀마다 온도를 색상의 밝기에 대응하는 밝기값으로 전환한 예가 도시된다.On the left, the thermal image is shown with colors whose brightness differs according to the temperature of the object; on the right, for the 10×15 cells corresponding to the thermal image on the left, an example is shown in which the temperature of each cell corresponding to a part of the object is converted into a brightness value corresponding to the brightness of its color.
도 6a에서는 가장 어두운 부위의 밝기값은 0이고 가장 밝은 부위의 밝기값은 81이다. 도 6b에서는 가장 어두운 부위의 밝기값은 0이고 가장 밝은 부위의 밝기값은 79이다.In Figure 6a, the brightness value of the darkest part is 0 and the brightness value of the brightest part is 81. In Figure 6b, the brightness value of the darkest part is 0 and the brightness value of the brightest part is 79.
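A minimal sketch of this cell-wise conversion is given below, assuming a per-pixel temperature map as input; the 10×15 grid matches the example above, while the 20-40 °C normalization bounds are an illustrative assumption, not values from the specification.

    import numpy as np

    def to_brightness_grid(temps, rows=10, cols=15, t_min=20.0, t_max=40.0):
        # Average the temperature map into rows x cols cells, then map each cell
        # temperature linearly to a 0-100 brightness level (hotter -> brighter).
        h, w = temps.shape
        bh, bw = h // rows, w // cols
        cells = temps[:rows * bh, :cols * bw].reshape(rows, bh, cols, bw).mean(axis=(1, 3))
        levels = np.clip((cells - t_min) / (t_max - t_min), 0.0, 1.0) * 100.0
        return levels.round().astype(int)

    grid = to_brightness_grid(np.full((120, 180), 22.0))  # a cool, uniform scene -> low brightness levels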
이하에서, 도 6a 및 도 6b를 참조하여 본 발명에 따른 서버에서 객체의 행동 및 위험상황을 인지하는 동작을 설명한다. Below, the operation of recognizing object behavior and dangerous situations in the server according to the present invention will be described with reference to FIGS. 6A and 6B.
서버(400)에서 서버제어부(430)가 열화상 영상에서 관심영역을 추출할 수 있다. 관심영역은 객체의 특정 부위를 판단하기 위해 설정되는 영역이 될 수 있다. In the server 400, the server control unit 430 may extract a region of interest from the thermal image. A region of interest may be an area set to determine a specific part of an object.
본 실시예에서 관심영역이 객체의 상체가 될 수 있다, 바람직하게는 얼굴이 될 수 있다. 다른 실시예에서는 열화상 영상에서 가장 밝은 부위나 가장 밝은 부위와 그 주변 부위까지 포함된 영역이 될 수 있다.In this embodiment, the region of interest may be the upper body of the object, preferably the face. In another embodiment, it may be the brightest part of the thermal image, or an area including the brightest part and surrounding areas.
서버제어부(430)는 관심영역에서의 대표밝기값을 산출할 수 있다. 일 실시예에서, 관심영역이 하나의 셀에 대응되는 경우 그 셀의 밝기값을 대표밝기값으로 산출할 수 있다. 다른 실시예에서, 관심영역이 둘 이상의 셀에 해당되는 경우는 각 셀의 밝기값의 평균치를 대표밝기값으로 산출하거나, 또는 가장 높은 밝기값을 대표밝기값으로 산출할 수 있다.The server control unit 430 can calculate a representative brightness value in the region of interest. In one embodiment, when the region of interest corresponds to one cell, the brightness value of that cell can be calculated as the representative brightness value. In another embodiment, when the region of interest corresponds to two or more cells, the average of the brightness values of each cell can be calculated as the representative brightness value, or the highest brightness value can be calculated as the representative brightness value.
도 6a에서 열화상 영상에서 관심영역은 가장 밝은 M 영역이 될 수 있고, 이에 대응하는 셀에서의 관심영역은 M 영역에 대응되는 N 영역이 될 수 있다.In FIG. 6A, the region of interest in the thermal image may be the brightest M region, and the region of interest in the corresponding cell may be the N region corresponding to the M region.
N 영역에는 4개의 셀이 있으므로 N 영역의 대표밝기값은 4개의 셀의 밝기값의 평균값인 70.5(=(55+79+67+81)/4)이 될 수 있고, 또는 다르게는 4개의 셀 중 가장 큰 밝기값인 81이 대표밝기값이 될 수도 있다.Since there are four cells in region N, the representative brightness value of region N can be 70.5 (=(55+79+67+81)/4), the average of the brightness values of the four cells, or alternatively the largest brightness value among the four cells, 81, may be used as the representative brightness value.
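The two ways of computing the representative brightness value can be reproduced with a short helper; the 2×2 region size and the brute-force search for the brightest block below are illustrative assumptions about how the region of interest might be picked.

    import numpy as np

    def brightest_region(grid, size=2):
        # Return the top-left index of the size x size block of cells with the
        # highest mean brightness -- one simple way to pick the region of interest.
        best, best_rc = -1.0, (0, 0)
        for r in range(grid.shape[0] - size + 1):
            for c in range(grid.shape[1] - size + 1):
                m = grid[r:r + size, c:c + size].mean()
                if m > best:
                    best, best_rc = m, (r, c)
        return best_rc

    def representative_brightness(cells, mode="mean"):
        cells = np.asarray(cells, dtype=float)
        return cells.mean() if mode == "mean" else cells.max()

    n_region = [55, 79, 67, 81]                        # the four cells of region N in Fig. 6a
    print(representative_brightness(n_region))         # 70.5 (average of the four cells)
    print(representative_brightness(n_region, "max"))  # 81.0 (largest brightness value)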
만약, 객체가 움직이지 않는다면 N 영역에서의 밝기값은 변하지 않을 것이고, 혹여 변하더라도 변화량은 작을 것이다.If the object does not move, the brightness value in area N will not change, and even if it changes, the amount of change will be small.
도 6a에서의 자세에서 도 6b의 자세로 객체가 움직이게 되면 관심영역은 N 영역에서 N' 영역으로 변경될 수 있다. 도 6b에서는 열화상 영상의 관심영역은 가장 밝은 M' 영역이 될 수 있고, 이에 대응하는 셀에서의 관심영역은 M' 영역에 대응되는 N' 영역이 되는 것이다.When the object moves from the posture in FIG. 6A to the posture in FIG. 6B, the region of interest may change from area N to area N'. In FIG. 6B, the region of interest in the thermal image may be the brightest M' region, and the region of interest in the cell corresponding to this may be the N' region corresponding to the M' region.
대표밝기값을 평균값으로 할 경우 N' 영역에서의 대표밝기값은 46이고, 대표밝기값을 가장 큰 밝기값으로 할 경우는 N' 영역에서의 대표밝기값은 71이 된다.When the representative brightness value is set as the average value, the representative brightness value in the N' area is 46, and when the representative brightness value is set as the largest brightness value, the representative brightness value in the N' area is 71.
서버제어부(430)는 관심영역이 N 영역에서 N' 영역으로 변하는 것을 감지하여 객체의 행동을 감지할 수 있으며, 객체의 위험상황도 감지할 수 있다.The server control unit 430 can detect the behavior of an object by detecting a change in the area of interest from the N area to the N' area, and can also detect a dangerous situation of the object.
관심영역이 이동하게 되면 객체가 행동하게 됨을 감지할 수 있다. 즉, 객체의 얼굴이 N 영역에서 N' 영역으로 이동하게 되면 객체의 움직임이 발생한 것으로 감지할 수 있는 것이다.When the area of interest moves, the object's action can be detected. In other words, when the object's face moves from area N to area N', it can be detected that movement of the object has occurred.
한편, 서버제어부(430)는 관심영역의 이동에 따른 특징을 이용하여 객체의 위험상황 여부를 판단할 수 있다. Meanwhile, the server control unit 430 can determine whether an object is in danger using characteristics according to the movement of the area of interest.
본 실시예에서, 서버제어부(430)는, (1)관심영역의 이동속도, (2)관심영역의 이동거리, (3)이전 관심영역의 대표밝기값의 변화속도, (4)이동된 관심영역의 대표밝기값의 지속시간을 이용하여 객체의 위험상황을 판단할 수 있다.In this embodiment, the server control unit 430 can determine whether the object is in a dangerous situation using (1) the moving speed of the region of interest, (2) the moving distance of the region of interest, (3) the rate of change of the representative brightness value of the previous region of interest, and (4) the duration of the representative brightness value of the moved region of interest.
(1)관심영역의 이동속도(1) Movement speed of area of interest
객체에 위험상황이 발생하는 경우 일상적인 보통의 움직임에 비해 관심영역이 이동하는 속도는 상대적으로 빠를 수 있다. 예컨대, 객체가 갑자기 쓰러지거나 갑자기 주저앉는 경우 관심영역은 빠르게 이동할 것이다.When a dangerous situation occurs in an object, the speed at which the area of interest moves can be relatively fast compared to normal, everyday movement. For example, if an object suddenly falls or suddenly sits down, the region of interest will move quickly.
서버제어부(430)는 관심영역이 일정속도 이상으로 빠르게 이동하는 경우에는 위험상황으로 인지할 수 있다. 즉, 관심영역이 N 영역에서 N' 영역으로 이동하는 속도가 기설정된 제1기준치 이상이면(제1조건), 객체에 위험상황이 발생한 것으로 인지할 수 있다.The server control unit 430 may recognize a dangerous situation when the area of interest moves faster than a certain speed. That is, if the speed at which the area of interest moves from area N to area N' is greater than or equal to the preset first reference value (first condition), it can be recognized that a dangerous situation has occurred in the object.
(2)관심영역의 이동거리(2) Movement distance of area of interest
또한, 객체에 위험상황이 발생하는 경우 일상적인 보통의 움직임에 비해 관심영역의 이동거리가 상대적으로 길 수 있다. 예컨대, 객체가 서 있다가 갑자기 쓰러지는 경우 처음의 관심영역(예:N영역)에서 이동한 관심영역(예:N'영역) 간의 거리는 커진다.Additionally, when a dangerous situation occurs in an object, the moving distance of the area of interest may be relatively long compared to normal, everyday movement. For example, when an object suddenly falls while standing, the distance between the initial area of interest (e.g. area N) and the moved area of interest (e.g. area N') increases.
서버제어부(430)는 관심영역 간의 이동거리가 일정거리 이상일 경우에는 위험상황으로 인지할 수 있다. 즉, 관심영역이 N 영역에서 N' 영역으로 이동한 거리가 기설정된 제2기준치 이상이면(제2조건), 객체에 위험상황이 발생한 것으로 인지할 수 있다.The server control unit 430 may recognize a dangerous situation when the moving distance between areas of interest is more than a certain distance. In other words, if the distance that the area of interest moves from area N to area N' is more than the preset second standard value (second condition), it can be recognized that a dangerous situation has occurred in the object.
(3)이전 관심영역의 대표밝기값의 변화속도(3) Speed of change of representative brightness value of previous area of interest
또한, 객체에 위험상황이 발생하면 관심영역은 빠르게 이동할 수 있다. 도 6a 및 도 6b의 예시와 같이, 객체가 서 있다가 주저 앉는 경우 관심영역은 N 영역에서 N' 영역으로 이동한다. Additionally, if a dangerous situation occurs in an object, the area of interest can move quickly. As in the example of FIGS. 6A and 6B, when an object stands and then falls down, the region of interest moves from area N to area N'.
대표밝기값을 밝기값이 가장 큰 값으로 하는 경우, 객체가 서 있는 도 6a에서는 관심영역인 N 영역의 대표밝기값은 81이다. 객체가 주저 앉은 도 6b에서는 관심영역이 N' 영역으로 이동함으로써, 직전에 관심영역이었던 N 영역의 대표밝기값은 8이 된다. When the representative brightness value is set as the value with the largest brightness value, the representative brightness value of area N, which is the area of interest, in Figure 6a where the object is standing is 81. In Figure 6b, where the object is sitting down, the area of interest moves to area N', and the representative brightness value of area N, which was the area of interest just before, becomes 8.
N 영역은 관심영역일 때는 대표밝기값이 81이었다가 갑자기 8로 변하게 된 것이다. 이는 객체의 움직임이 있었다는 것이며, 객체에 위험상황이 발생하면 그 움직임은 빨라진다. When area N was an area of interest, the representative brightness value was 81, but suddenly changed to 8. This means that there was movement of the object, and when a dangerous situation occurs in the object, the movement becomes faster.
서버제어부(430)는 관심영역에서의 대표밝기값이 일정속도 이상으로 빨라짐을 감지하여 객체의 위험상황을 인지할 수 있다. 즉, 관심영역이었던 N 영역에서 대표밝기값의 변화속도가 기설정된 제3기준치 이상이면(제3조건), 객체에 위험상황이 발생한 것으로 인지할 수 있다. The server control unit 430 can recognize a dangerous situation of the object by detecting that the representative brightness value in the region of interest changes faster than a certain rate. That is, if the rate of change of the representative brightness value in region N, which was the region of interest, is greater than or equal to a preset third reference value (third condition), it can be recognized that a dangerous situation has occurred for the object.
바람직하게는, 관심영역에서 대표밝기값의 감속속도가 제3기준치 이상이면 객체에 위험상황이 발생한 것으로 인지할 수 있다. Preferably, if the rate at which the representative brightness value in the region of interest decreases is greater than or equal to the third reference value, it can be recognized that a dangerous situation has occurred for the object.
(4)이동된 관심영역의 대표밝기값의 지속시간(4) Duration of representative brightness value of the moved region of interest
한편, 제1,2,3조건은 객체에 위험상황이 발생한 경우뿐만 아니라 객체가 다른 목적에 의해 빠르게 움직이는 경우에도 발생될 수 있다. 예컨대, 객체가 앉았다 일어나는 운동을 빠르게 하는 경우에도 제1,2,3조건 중 어느 하나가 만족할 수도 있다. 이 경우에는 객체의 위험상황 인지에 대한 신뢰성이 저하될 우려가 있다.Meanwhile, conditions 1, 2, and 3 can occur not only when a dangerous situation occurs in an object, but also when the object moves quickly for another purpose. For example, even when an object quickly moves from sitting to standing, any one of the first, second, and third conditions may be satisfied. In this case, there is a risk that the reliability of the object's recognition of dangerous situations may decrease.
이에, 본 발명에서는 변경된 관심영역에서 대표밝기값의 지속시간을 이용함으로써 객체에 위험상황이 발생하였는지를 판단하는데 신뢰성을 높일 수 있다.Accordingly, in the present invention, reliability can be increased in determining whether a dangerous situation has occurred in an object by using the duration of the representative brightness value in the changed area of interest.
즉, 이동된 관심영역의 대표밝기값이 일정시간 이상 지속된다는 것은 객체의 움직임이 있은 후 일정시간 이상 움직임이 없다는 것을 의미할 수 있다. In other words, the fact that the representative brightness value of the moved region of interest persists for more than a certain period of time may mean that there is no movement for more than a certain period of time after the object moves.
예컨대, 사람이 서 있다가 쓰러지게 되면 관심영역이 위에 있다가 아래로 이동할 수 있다. 관심영역이 아래로 이동한 이후에, 이동된 관심영역에서 대표밝기값의 변화가 없거나 또는 일정시간 이상 대표밝기값이 지속된다면 객체에 움직임이 없다는 것을 의미할 수 있다. 이는 객체에 위험상황이 될 수 있다.For example, if a person falls from standing, the area of interest may move from above to below. After the region of interest moves downward, if there is no change in the representative brightness value in the moved region of interest or if the representative brightness value persists for more than a certain period of time, it may mean that there is no movement in the object. This can be a dangerous situation for the object.
따라서, 서버제어부(430)는 이동된 관심영역에서의 대표밝기값이 일정시간 이상 지속되는지 판단하여 객체의 위험상황을 인지할 수 있다. 즉, 이동된 관심영역인 N' 영역에서 대표밝기값이 지속되는 시간이 기설정된 제4기준치 이상이면(제4조건), 객체에 위험상황이 발생한 것으로 인지할 수 있다. Accordingly, the server control unit 430 can determine whether the representative brightness value in the moved area of interest continues for more than a certain period of time and recognize the dangerous situation of the object. In other words, if the time for which the representative brightness value lasts in area N', which is the moved area of interest, is more than the preset fourth standard value (fourth condition), it can be recognized that a dangerous situation has occurred in the object.
한편, 본 발명에서 서버(400)는 제1,2,3,4조건 중 적어도 하나의 조건을 만족하면 객체에 위험상황이 발생한 것으로 인지할 수 있다.Meanwhile, in the present invention, the server 400 can recognize that a dangerous situation has occurred in an object when at least one of the first, second, third, and fourth conditions is satisfied.
또는 다르게는, 서버(400)는 제1,2,3조건 중 적어도 하나의 조건과 제4조건을 동시에 만족하는 경우에 객체에 위험상황이 발생한 것으로 인지할 수 있다.Alternatively, the server 400 may recognize that a dangerous situation has occurred in an object when at least one of the first, second, and third conditions and the fourth condition are simultaneously satisfied.
예컨대 객체가 쓰러져서 움직이지 못하는 경우는 제4조건을 만족하게 되므로 이 경우에 객체에 위험상황이 발생한 것으로 판단한다면 정확성이 향상될 것이다.For example, if an object falls and cannot move, the fourth condition is satisfied, so if it is determined that a dangerous situation has occurred in the object in this case, accuracy will be improved.
하지만, 제1,2,3조건 중 적어도 어느 한 조건만 만족하더라도 객체의 위험상황을 판단한다면 빠른 판단이 일어날 수 있을 것이다.However, if the object's dangerous situation is judged when only at least one of the first, second, and third conditions is satisfied, the judgment can be made more quickly.
따라서, 열화상 카메라(100)가 설치되는 장소와 객체의 행동 특성에 맞게 제1-4조건을 적절하게 조합하여 객체의 위험상황 여부를 판단할 수 있다.Therefore, it is possible to determine whether the object is in a dangerous situation by appropriately combining the first to fourth conditions according to the location where the thermal imaging camera 100 is installed and the behavioral characteristics of the object.
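One way conditions 1 to 4 could be combined is sketched below; the threshold values, the Euclidean distance in cell units, and the particular combination of "(condition 1, 2 or 3) together with condition 4" are illustrative assumptions within what the text allows, not the definitive implementation.

    from dataclasses import dataclass

    @dataclass
    class RoiSample:
        t: float      # time of the frame, in seconds
        row: int      # position of the region of interest, in cell coordinates
        col: int
        rep: float    # representative brightness value of the region of interest

    def is_dangerous(prev, curr, prev_region_rep_after_move, hold_seconds,
                     v1=3.0, v2=4.0, v3=30.0, v4=10.0):
        # prev / curr: region-of-interest samples before and after the movement.
        # prev_region_rep_after_move: representative brightness of the cells of the
        # previous region, measured after the move (e.g., 81 -> 8 in Figs. 6a/6b).
        # v1..v4 are placeholder thresholds standing in for the first to fourth reference values.
        dt = max(curr.t - prev.t, 1e-6)
        dist = ((curr.row - prev.row) ** 2 + (curr.col - prev.col) ** 2) ** 0.5
        cond1 = dist / dt >= v1                                        # condition 1: moving speed
        cond2 = dist >= v2                                             # condition 2: moving distance
        cond3 = abs(prev.rep - prev_region_rep_after_move) / dt >= v3  # condition 3: brightness change rate
        cond4 = hold_seconds >= v4                                     # condition 4: new region stays unchanged
        return (cond1 or cond2 or cond3) and cond4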
도 7은 본 발명의 실시예에 따른 열화상 영상을 이용하여 객체의 행동을 감지하는 과정을 보여주는 흐름도이다.Figure 7 is a flowchart showing the process of detecting the behavior of an object using a thermal image according to an embodiment of the present invention.
도 7을 참조하면, 본 발명에 따른 객체 행동 감지 방법은, 복수의 공간에 각각 설치된 열화상 카메라(100)에 의해 객체의 열화상 영상을 획득한다(S101). 열화상 영상의 획득은 실시간 또는 설정주기로 이루어질 수 있다.Referring to FIG. 7, the object behavior detection method according to the present invention acquires a thermal image of the object by thermal imaging cameras 100 installed in a plurality of spaces (S101). Acquisition of thermal imaging images can be done in real time or at set intervals.
열화상 카메라(100)는 각각 획득된 열화상 영상을 메시 와이파이 통신을 통해 게이트웨이(200)와 연결된 특정 열화상 카메라(100)로 전송한다(S102).The thermal imaging camera 100 transmits each acquired thermal imaging image to a specific thermal imaging camera 100 connected to the gateway 200 through mesh Wi-Fi communication (S102).
특정 열화상 카메라(100)는 자신이 획득한 열화상 영상과 다른 열화상 카메라(100)로부터 전송받은 열화상 영상을 게이트웨이(200) 및 네트워크(300)를 통해 서버(400)로 전송한다(S103).A specific thermal imaging camera 100 transmits a thermal imaging image acquired by itself and a thermal imaging image transmitted from another thermal imaging camera 100 to the server 400 through the gateway 200 and the network 300 (S103).
서버(400)는 수신된 열화상 영상을 분석하여(S104), 상기 제1,2,3,4조건 중 적어도 한 조건이 발생하는지를 판단한다(S105).The server 400 analyzes the received thermal image (S104) and determines whether at least one of the first, second, third, and fourth conditions occurs (S105).
서버(400)는 제1-4조건 중 적어도 한 조건이 발생하면 객체에 위험상황이 발생한 것으로 인지하고(S106), 상기 조건이 발생하지 않으면 다시 S101 단계로 진행하여 이후 과정을 반복할 수 있다. If at least one of the first to fourth conditions occurs, the server 400 recognizes that a dangerous situation has occurred in the object (S106). If the above condition does not occur, the server 400 proceeds to step S101 again and repeats the subsequent process.
서버(400)는 객체의 위험상황 발생을 사용자단말(500) 및/또는 유관기관 서버로 전송할 수 있다(S107).The server 400 may transmit the occurrence of a dangerous situation of an object to the user terminal 500 and/or a related agency server (S107).
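On the server side, the flow of steps S101 to S107 could be organized as a simple polling loop; the callables below (receive_frames, analyze, notify) are assumed interfaces, not names from the specification.

    import time

    def monitor(receive_frames, analyze, notify, period_s=1.0):
        # S101-S103: frames captured by the cameras arrive via the mesh network,
        # the gateway-connected camera, the gateway, and the network.
        # S104-S106: each frame is analyzed against conditions 1-4.
        # S107: a recognized dangerous situation is reported to the user terminal and related agencies.
        while True:
            for frame in receive_frames():
                result = analyze(frame)
                if result.danger:
                    notify(result)
            time.sleep(period_s)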
이상에서 설명한 바와 같이, 본 발명에서는 열화상 카메라에서 객체에 대한 열화상 영상을 촬영하고, 서버에서 열화상 영상을 수신하여 분석함으로써 객체의 위험상황 여부를 판단할 수 있다. 또한, 본 발명에서는 복수의 열화상 카메라가 메시 와이파이 통신으로 상호간에 통신이 가능하므로 모든 열화상 영상을 빠르게 모아서 서버로 전송할 수 있게 되어 처리속도가 빨라지는 장점이 있다. As described above, in the present invention, it is possible to determine whether an object is in danger by capturing a thermal image of an object with a thermal imaging camera and receiving and analyzing the thermal image from a server. In addition, in the present invention, since a plurality of thermal imaging cameras can communicate with each other through mesh Wi-Fi communication, all thermal imaging images can be quickly collected and transmitted to the server, which has the advantage of faster processing speed.
이상 첨부된 도면을 참조하여 본 발명의 실시예들을 설명하였으나, 본 발명은 상기 실시예들에 한정되는 것이 아니라 서로 다른 다양한 형태로 제조될 수 있으며, 본 발명이 속하는 기술분야에서 통상의 지식을 가진 자는 본 발명의 기술적 사상이나 필수적인 특징을 변경하지 않고서 다른 구체적인 형태로 실시될 수 있다는 것을 이해할 수 있을 것이다. 그러므로 이상에서 기술한 실시예들은 모든 면에서 예시적인 것이며 한정적이 아닌 것으로 이해해야만 한다.Although embodiments of the present invention have been described above with reference to the attached drawings, the present invention is not limited to these embodiments and may be implemented in various different forms, and those of ordinary skill in the art to which the present invention pertains will understand that it can be practiced in other specific forms without changing its technical spirit or essential features. Therefore, the embodiments described above should be understood in all respects as illustrative and not restrictive.

Claims (12)

  1. 통신모듈을 포함하고 복수의 공간마다 설치되며 상기 설치된 공간의 객체에 대한 열화상 영상을 획득하는 열화상 카메라;A thermal imaging camera that includes a communication module and is installed in each of a plurality of spaces and acquires thermal imaging images of objects in the installed spaces;
    상기 열화상 카메라로부터 열화상 영상을 수신하고 상기 수신된 열화상 영상을 분석하여 객체의 위험상황을 판단하는 서버를 포함하고,It includes a server that receives a thermal image from the thermal image camera and analyzes the received thermal image to determine a dangerous situation of the object,
    상기 서버는,wherein the server
    상기 열화상 영상에서 관심영역을 추출하고, 상기 열화상 영상을 구성하는 복수의 셀에서의 객체의 온도에 따른 색상에 대응하는 밝기값을 설정하여 상기 관심영역에서의 대표밝기값을 산출하고, (1)상기 관심영역의 이동속도, (2)상기 관심영역의 이동거리, (3)직전 관심영역의 대표밝기값의 변화속도, (4)현재 이동된 관심영역의 대표밝기값의 지속시간 중 적어도 하나를 이용하여 상기 객체의 위험상황 여부를 판단하는 객체 행동 감지 시스템.extracts a region of interest from the thermal image, calculates a representative brightness value in the region of interest by setting a brightness value corresponding to the color according to the temperature of the object in the plurality of cells constituting the thermal image, and determines whether the object is in a dangerous situation using at least one of (1) the moving speed of the region of interest, (2) the moving distance of the region of interest, (3) the rate of change of the representative brightness value of the immediately previous region of interest, and (4) the duration of the representative brightness value of the currently moved region of interest; an object behavior detection system.
  2. 청구항 1에 있어서, 상기 관심영역은 상기 객체의 얼굴 부위를 포함하는 객체 행동 감지 시스템.The object behavior detection system according to claim 1, wherein the region of interest includes a facial area of the object.
  3. 청구항 1에 있어서, 상기 관심영역이 하나의 셀에 대응하는 경우 상기 하나의 셀의 밝기값을 상기 대표밝기값으로 산출하는 객체 행동 감지 시스템.The object behavior detection system according to claim 1, wherein when the region of interest corresponds to one cell, the brightness value of the one cell is calculated as the representative brightness value.
  4. 청구항 1에 있어서, 상기 관심영역이 둘 이상의 셀에 대응하는 경우 상기 둘 이상의 셀의 밝기값의 평균을 상기 대표밝기값으로 산출하는 객체 행동 감지 시스템.The object behavior detection system according to claim 1, wherein, when the region of interest corresponds to two or more cells, an average of the brightness values of the two or more cells is calculated as the representative brightness value.
  5. 청구항 1에 있어서, 상기 관심영역이 둘 이상의 셀에 대응하는 경우 상기 둘 이상의 셀의 밝기값 중 가장 큰 밝기값을 상기 대표밝기값으로 산출하는 객체 행동 감지 시스템.The object behavior detection system according to claim 1, wherein when the region of interest corresponds to two or more cells, the representative brightness value is calculated as the largest brightness value among the brightness values of the two or more cells.
  6. 청구항 1에 있어서, 상기 복수의 열화상 카메라의 통신모듈은 상호 간에 메시 와이파이 통신을 통해 상기 열화상 영상을 송수신하는 객체 행동 감지 시스템. The object behavior detection system according to claim 1, wherein the communication modules of the plurality of thermal imaging cameras transmit and receive the thermal images to and from each other through mesh Wi-Fi communication.
  7. 청구항 1에 있어서, 상기 복수의 열화상 카메라 중 적어도 하나는 게이트웨이와 연결되며 상기 게이트웨이를 통해 상기 서버로 상기 열화상 영상을 전송하는 객체 행동 감지 시스템.The object behavior detection system according to claim 1, wherein at least one of the plurality of thermal imaging cameras is connected to a gateway and transmits the thermal imaging image to the server through the gateway.
  8. 청구항 1에 있어서, 상기 서버는 상기 관심영역이 이동하는 속도가 기설정된 제1기준치 이상이면(제1조건), 상기 객체에 위험상황이 발생한 것으로 인지하는 객체 행동 감지 시스템.The object behavior detection system according to claim 1, wherein the server recognizes that a dangerous situation has occurred in the object when the moving speed of the area of interest is greater than or equal to a preset first standard (first condition).
  9. 청구항 1에 있어서, 상기 서버는 상기 관심영역 간의 이동거리가 기설정된 제2기준치 이상이면(제2조건), 상기 객체에 위험상황이 발생한 것으로 인지하는 객체 행동 감지 시스템.The object behavior detection system according to claim 1, wherein the server recognizes that a dangerous situation has occurred in the object when the moving distance between the areas of interest is greater than a preset second standard (second condition).
  10. 청구항 1에 있어서, 상기 서버는 상기 직전의 관심영역에서의 대표밝기값의 변화속도가 기설정된 제3기준치 이상이면(제3조건), 상기 객체에 위험상황이 발생한 것으로 인지하는 객체 행동 감지 시스템.The object behavior detection system according to claim 1, wherein the server recognizes that a dangerous situation has occurred in the object when the rate of change of the representative brightness value in the immediately preceding region of interest is greater than or equal to a preset third standard (third condition).
  11. 청구항 1에 있어서, 상기 서버는 상기 이동된 관심영역에서의 대표밝기값의 지속시간이 기설정된 제4기준치 이상이면(제4조건), 상기 객체에 위험상황이 발생한 것으로 인지하는 객체 행동 감지 시스템.The object behavior detection system according to claim 1, wherein the server recognizes that a dangerous situation has occurred in the object when the duration of the representative brightness value in the moved region of interest is greater than or equal to a preset fourth standard value (fourth condition).
  12. 청구항 1에 있어서, 상기 서버는 상기 관심영역이 이동하는 속도가 기설정된 제1기준치 이상(제1조건), 상기 관심영역 간의 이동거리가 기설정된 제2기준치 이상(제2조건), 상기 직전의 관심영역에서의 대표밝기값의 변화속도가 기설정된 제3기준치 이상(제3조건) 중 적어도 하나의 조건을 만족하면서 상기 이동된 관심영역에서의 대표밝기값의 지속시간이 기설정된 제4기준치 이상이면(제4조건), 상기 객체에 위험상황이 발생한 것으로 인지하는 객체 행동 감지 시스템.The object behavior detection system according to claim 1, wherein the server recognizes that a dangerous situation has occurred in the object when the duration of the representative brightness value in the moved region of interest is greater than or equal to a preset fourth reference value (fourth condition) while at least one of the following is satisfied: the speed at which the region of interest moves is greater than or equal to a preset first reference value (first condition), the moving distance between the regions of interest is greater than or equal to a preset second reference value (second condition), or the rate of change of the representative brightness value in the immediately previous region of interest is greater than or equal to a preset third reference value (third condition).
PCT/KR2022/013087 2022-07-19 2022-09-01 Object behavior detection system using thermal imaging WO2024019220A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2022-0089028 2022-07-19
KR1020220089028A KR102636116B1 (en) 2022-07-19 2022-07-19 System for detecting action of object by using infrared thermography image

Publications (1)

Publication Number Publication Date
WO2024019220A1 true WO2024019220A1 (en) 2024-01-25

Family

ID=89617994

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2022/013087 WO2024019220A1 (en) 2022-07-19 2022-09-01 Object behavior detection system using thermal imaging

Country Status (2)

Country Link
KR (1) KR102636116B1 (en)
WO (1) WO2024019220A1 (en)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101927220B1 (en) 2016-09-30 2018-12-11 (주)혜진시스 Method and Apparatus for Detecting Object of Interest By Using Infrared Thermography Image
KR102052883B1 (en) 2017-05-19 2019-12-10 한국과학기술원 Prediction system using a thermal imagery camera and fall prediction method using a thermal imagery camera
KR101916631B1 (en) 2018-05-30 2018-11-08 이운명 Fall Sensing Device Having Thermal Camera

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101837027B1 (en) * 2017-09-25 2018-03-09 건국대학교 산학협력단 Device and method for tracking object using superpixel based on thermal image
KR101852057B1 (en) * 2017-11-23 2018-04-25 주식회사 아이티아이비전 unexpected accident detecting system using images and thermo-graphic image
KR102054213B1 (en) * 2017-11-24 2019-12-10 연세대학교산학협력단 Respiratory measurement system using thermovision camera
KR102199020B1 (en) * 2020-05-08 2021-01-06 성균관대학교산학협력단 Ceiling aihealth monitoring apparatusand remote medical-diagnosis method using the same
KR102295045B1 (en) * 2021-03-03 2021-08-31 주식회사 누리온 Gateway-based situation monitoring system

Also Published As

Publication number Publication date
KR102636116B1 (en) 2024-02-13
KR20240011512A (en) 2024-01-26


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22952069

Country of ref document: EP

Kind code of ref document: A1