WO2019163211A1 - Monitoring system and control method for monitoring system - Google Patents


Info

Publication number
WO2019163211A1
WO2019163211A1 (PCT/JP2018/041401)
Authority
WO
WIPO (PCT)
Prior art keywords
specific temperature
distance
image
alarm
monitoring system
Prior art date
Application number
PCT/JP2018/041401
Other languages
French (fr)
Japanese (ja)
Inventor
哲 細木
晃志 生田目
Original Assignee
コニカミノルタ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by コニカミノルタ株式会社
Priority to JP2019515549A (patent JP6544501B1)
Publication of WO2019163211A1


Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/87 - Combinations of systems using electromagnetic waves other than radio waves
    • G01S 17/88 - Lidar systems specially adapted for specific applications
    • G01S 17/89 - Lidar systems specially adapted for specific applications for mapping or imaging

Definitions

  • The present invention relates to a monitoring system and a control method for a monitoring system.
  • An object of the present invention is to provide a monitoring system capable of reliably capturing the positional relationship between the position of a high-temperature object in three-dimensional space and other objects, and a control method for such a monitoring system.
  • a lidar that outputs a distance image in which the distribution of distance values, obtained by scanning a laser beam toward a first region, is represented in a three-dimensional coordinate system;
  • an infrared camera that images a second region at least partially overlapping the first region and outputs an infrared image represented in a two-dimensional coordinate system; and
  • a control unit that acquires the distance image from the lidar, detects objects from the acquired distance image, and acquires the infrared image from the infrared camera;
  • that, when the infrared image has a specific temperature portion falling within a predetermined temperature range, identifies the object at the position corresponding to the specific temperature portion among the detected objects as a specific temperature object and specifies its position in the three-dimensional coordinate system;
  • that, when the detected objects include another object different from the specific temperature object, determines whether or not the distance between the specific temperature object and the other object is within a predetermined distance; and
  • that outputs an alarm signal when the distance between the specific temperature object and the other object is within the predetermined distance; a monitoring system having the above.
  • The control unit sets the range of the predetermined distance from the specific temperature object as an alarm area around the specific temperature object in the three-dimensional coordinate system, and outputs the alarm signal when the other object is within the alarm area;
  • the monitoring system according to (1) above.
  • The control unit moves the warning area to follow the movement of the specific temperature object.
  • The control unit obtains a relative distance and a relative speed between the specific temperature object and the other object from a plurality of the distance images acquired in time series from the lidar, and changes the length of the predetermined distance according to the obtained relative distance and relative speed;
  • the monitoring system according to any one of (1) to (4).
  • the monitoring system according to any one of (1) to (6), further including an alarm device that receives the alarm signal and emits sound and/or light.
  • a lidar that outputs a distance image in which the distribution of distance values, obtained by scanning a laser beam toward a first region, is represented in a three-dimensional coordinate system; and
  • an infrared camera that images a second region at least partially overlapping the first region and outputs an infrared image represented in a two-dimensional coordinate system; the method comprising: acquiring the distance image from the lidar and detecting objects from the acquired distance image;
  • acquiring the infrared image from the infrared camera and, when the infrared image has a specific temperature portion falling within a predetermined temperature range, identifying the object corresponding to the specific temperature portion among the detected objects as a specific temperature object;
  • a control method for a monitoring system comprising the above.
  • The range of the predetermined distance from the specific temperature object is set as an alarm area around the specific temperature object in the three-dimensional coordinate system, and an alarm is issued when the other object is within the alarm area; the control method of the monitoring system according to (8) above.
  • A relative distance and a relative speed between the specific temperature object and the other object are obtained from a plurality of the distance images acquired in time series from the lidar, and the length of the predetermined distance is changed according to the obtained relative distance and relative speed;
  • the monitoring system control method according to any one of (8) to (11).
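The alarm decision recited in the claims above (outputting an alarm signal when the distance between the specific temperature object and another object falls within the predetermined distance) can be sketched as follows. This is an illustrative sketch only: the function name, the data layout, and the use of Euclidean distance in the three-dimensional coordinate system are assumptions, not the claimed implementation.

```python
import math

def alarm_signal(specific_pos, other_positions, predetermined_distance):
    """specific_pos: (x, y, z) of the specific temperature object.
    other_positions: (x, y, z) positions of the other detected objects.
    Returns True when any other object lies within the alarm area, i.e.
    within the predetermined distance of the specific temperature object."""
    return any(math.dist(specific_pos, p) <= predetermined_distance
               for p in other_positions)

# A person 3 m from the hot object, with a 5 m predetermined distance:
triggered = alarm_signal((0.0, 0.0, 10.0), [(3.0, 0.0, 10.0)], 5.0)
```

In a fuller implementation the predetermined distance would itself vary with the relative speed obtained from successive distance images, as described in the claims above.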
  • FIG. 1 is a block diagram illustrating a configuration of a monitoring system according to the first embodiment.
  • the monitoring system 100 includes a monitoring unit 110, a control unit 120, a display 130, and an alarm device 140.
  • The monitoring unit 110 is installed at a position where objects to be monitored (for example, a high-temperature object, or a person, a vehicle, or other objects) can be captured.
  • A lidar 102 (LiDAR: Light Detection and Ranging) and an infrared camera 104 are attached to and integrated with the same casing.
  • The lidar 102 and the infrared camera 104 have the same optical axis direction (the Z direction; see FIGS. 2 and 3 described later).
  • The lidar 102 and the infrared camera 104 are arranged adjacent to each other in the Y direction (vertical direction; see FIGS. 2 and 3 described later), and their optical axes are aligned in the X direction (lateral direction).
  • The lidar 102 scans a laser beam toward the space of the first region and measures, from the reflected light, the distance to objects existing in the scanned space.
  • The obtained distribution of distance values is also referred to as point cloud data; from it, the distance from the installation position of the lidar 102 to an object, and the size and shape of the object, can be determined.
  • As a result, an image is obtained whose distance-value distribution gives the distance to objects existing in the space, or an infinite distance where there is no reflected light.
  • Such an image output from the lidar 102 is referred to as a distance image because it includes information on the distance to objects (it is sometimes also called a lidar image).
  • The distance image is output from the lidar 102 to the control unit 120 as an image in a three-dimensional coordinate system.
  • the infrared camera 104 captures an object in the space of the second region, and outputs the temperature distribution in the captured space as an infrared image in a two-dimensional coordinate system using monochrome shades.
  • The output value of a pixel capturing a high-temperature portion is high, and that of a pixel capturing a low-temperature portion is low; that is, the output gradation value increases with the temperature of the captured portion.
  • The first region scanned by the lidar 102 and the second region photographed by the infrared camera 104 overlap at least partially.
  • This overlapping region serves as the monitoring range of the monitoring system 100.
  • The distance image from the lidar 102 and the infrared image from the infrared camera 104 are used to grasp the three-dimensional spatial position and the temperature information of objects existing in the overlapping region.
  • The scanning interval for one frame of the lidar 102 and the imaging interval of the infrared camera 104 need not be completely synchronized.
  • For example, the lidar 102 has a scanning interval of about 10 frames/second, whereas the infrared camera 104 can take images at intervals from a fraction of a second down to a few thousandths of a second.
  • Therefore, by using an infrared image captured at approximately the same time as a distance image, the three-dimensional position of an object grasped from the distance image can be matched with that object's temperature information.
  • the control unit 120 is a computer.
  • the control unit 120 includes a CPU (Central Processing Unit) 121, a ROM (Read Only Memory) 122, a RAM (Random Access Memory) 123, an HDD (Hard Disk Drive) 124, and the like.
  • the CPU 121 calls a program corresponding to the processing content from the HDD 124 to control the operations of the rider 102 and the infrared camera 104, and performs detection of the three-dimensional position of the object, temperature of the object, alarm operation, display of temperature information, and the like.
  • The HDD 124, together with the RAM 123, serves as a storage unit and stores the programs and data necessary for each process.
  • a nonvolatile semiconductor memory such as a flash memory may be used instead of the HDD 124.
  • the control unit 120 includes an input device 125 such as a touch panel, buttons, and a mouse, and a network interface 126 (NIF: Network Interface) for connecting an external device such as a server.
  • the monitoring system 100 includes a display 130 and an alarm device 140.
  • the display 130 can be provided separately from the control unit 120 in order to install the display 130 in a factory monitoring room, for example. Of course, it may be integrated with the control unit 120. Further, the display 130 and the alarm device 140 may be integrated depending on the monitoring environment.
  • The alarm device 140 issues an alarm by means that people can recognize, for example sound, or light such as a flashing or rotating lamp. Note that the monitoring system 100 may trigger other processing instead of the alarm by the alarm device 140.
  • the other processing is processing for starting recording of images obtained from the infrared camera 104 and the rider 102, for example. By starting the recording, the movement of the object in the overlapping area and the display on the display 130 can be reliably recorded. Other processing includes, for example, stopping automatic machines such as robots, machine tools, and transport vehicles.
  • In the above, the case where the lidar 102 and the infrared camera 104 are integrated has been illustrated.
  • However, the lidar 102 and the infrared camera 104 may be installed separately, each connected to the control unit 120 via a dedicated line or a network.
  • In that case as well, the range scanned by the lidar 102 and the range captured by the infrared camera 104 are made to overlap.
  • The control unit 120 may be a general-purpose computer instead of a dedicated computer. Conversely, the infrared camera 104, the lidar 102, and the control unit 120 may be integrated. In addition, although the control unit 120 is shown here as mainly composed of a CPU, RAM, and ROM, it may instead be configured by an integrated circuit such as an FPGA (Field-Programmable Gate Array) or an ASIC (Application Specific Integrated Circuit).
  • the operation of the monitoring system 100 will be described.
  • the operation of the monitoring system 100 is roughly divided into an initial setting operation and a monitoring operation.
  • FIG. 2 is an explanatory diagram for explaining a distance image obtained by the lidar 102 scanning the first region.
  • FIG. 3 is an explanatory diagram for explaining an infrared image obtained by photographing the second region by the infrared camera 104.
  • FIG. 4 is an explanatory diagram for explaining an example in which the distance image of FIG. 2 and the infrared image of FIG. 3 are simply superimposed. In FIG. 2 and FIG. 3, the same region (space) is captured.
  • From the reflected light from objects ob1 to ob4 (things, people, and the like existing in the first region) and the ground gr (indoors, the floor; the same applies hereinafter), and from portions that return no reflection (such as the sky), the lidar 102 outputs a distance image Im1 in a three-dimensional coordinate system as shown in the figure.
  • This distance image Im1 is an image constituted by three-dimensional point group data, and has a three-dimensional coordinate system (X1, Y1, Z1) as shown in the figure. Therefore, the position of each point constituting the distance image Im1 is specified in a three-dimensional coordinate system including the X, Y, and Z axes.
  • the objects ob1 to ob3 are people
  • ob4 is an object that becomes a specific temperature object described later.
  • The infrared camera 104 captures the infrared rays emitted by objects ob1 to ob4 (things and people existing in the second region) and the ground gr, and outputs an infrared image Im2 as shown in the figure.
  • the objects ob1 to ob4 are all objects having a temperature higher than the ambient temperature (here, the temperature of the ground gr, the background, etc.).
  • the position (coordinate value) of each pixel constituting the infrared image Im2 is specified in a two-dimensional coordinate system including the X axis and the Y axis.
  • the distance image Im1 has a three-dimensional coordinate system (X1, Y1, Z1)
  • the infrared image Im2 has a two-dimensional coordinate system (X2, Y2). Since the two coordinate systems do not match as they are, simply superimposing the images displaces the objects ob1 to ob4, which are actually at the same positions, as shown in FIG. 4.
  • processing is performed to make the X axis and Y axis of the two-dimensional coordinate system (X2, Y2) correspond to the X axis and Y axis of the three-dimensional coordinate system (X1, Y1, Z1).
  • Such processing is referred to herein as coordinate conversion, and a coefficient necessary for making it correspond is referred to as a coordinate conversion coefficient.
  • the initial setting operation is an operation for calculating the coordinate conversion coefficient.
  • FIG. 5 is a flowchart showing a processing procedure for calculating the coordinate conversion coefficient.
  • 6 and 7 are explanatory diagrams for calculating the coordinate conversion coefficient.
  • This coordinate conversion coefficient calculation (initial setting operation) process is performed by the control unit 120 executing a program for calculating the coordinate conversion coefficient.
  • The control unit 120 causes the lidar 102 to scan the first region, including the portion that becomes the overlapping region, and obtains the output distance image Im1 (S1).
  • In the overlapping region, at least two reference points are set in advance.
  • As a reference point, for example, a rod is used whose head (tip) carries a heating element such as an incandescent bulb or a heater, or an infrared-emitting object such as an infrared LED.
  • The reference point is preferably a stationary object, but as already described, a reference point may be set on a moving object as long as the scanning time of the lidar 102 and the photographing time of the infrared camera 104 are matched (synchronized).
  • An example of the acquired distance image is shown in FIG. 6. As shown in the figure, reference points P1 and P2 appear in the distance image Im1.
  • Next, the control unit 120 causes the infrared camera 104 to photograph the second region, including the portion that becomes the overlapping region, and obtains the output infrared image Im2 (S2).
  • An example of the acquired infrared image is shown in FIG. 7, in which reference points PP1 and PP2 appear.
  • This is because an infrared radiator is attached to the tip of the rod serving as each reference point, and these portions have high luminance (gradation values) in the infrared image Im2. Note that the processing order of S1 and S2 may be reversed (or they may be performed simultaneously).
  • the control unit 120 calculates a conversion coefficient for matching the two-dimensional coordinate system of the obtained infrared image Im2 with the three-dimensional coordinate system of the distance image Im1 (S3).
  • the distance image Im1 and the infrared image Im2 are images obtained by scanning or photographing the same region (space). For this reason, the actual size and distance (distance between reference points) of the objects present in both images are the same. However, these images have different scales for each image due to different scanning and photographing equipment. For this reason, if both are simply overlapped (see FIG. 4), the position of the object is shifted or the size is different.
  • To correct this, the two coordinate systems may be converted so as to have the same scale.
  • the scales of the X axis and the Y axis of the infrared image Im2 are adjusted to the X axis and the Y axis of the distance image Im1.
  • In this example, two reference points are provided.
  • the distances in the X-axis and Y-axis directions in the images of the reference points P1 and P2 shown in the distance image Im1 are obtained.
  • The distance in the X-axis direction between P1 and P2 is Δ(P1−P2)x.
  • The distance in the Y-axis direction between P1 and P2 is Δ(P1−P2)y.
  • Similarly, the distances in the X-axis and Y-axis directions between the reference points PP1 and PP2 appearing in the infrared image Im2 are obtained:
  • the distance in the X-axis direction between PP1 and PP2 is Δ(PP1−PP2)x, and
  • the distance in the Y-axis direction between PP1 and PP2 is Δ(PP1−PP2)y.
  • From these, the conversion coefficients for the X axis and the Y axis are obtained as αx = Δ(P1−P2)x / Δ(PP1−PP2)x and αy = Δ(P1−P2)y / Δ(PP1−PP2)y.
  • the obtained conversion coefficient is stored in the RAM 123 or HDD 124.
  • the number of reference points is not particularly limited.
  • an object existing in the space may be used as the reference point.
  • an object corner or the like is designated as a reference point so that it can be easily distinguished in an image.
  • In this case, the reference point needs to be an infrared-emitting object so that it appears in the infrared image.
  • This completes the process of calculating the coordinate conversion coefficients. Thereafter, using these conversion coefficients, the coordinate values of an object (or the entire screen) in the two-dimensional coordinate system (X2, Y2) of the infrared image can be converted into the same coordinate system as the XY plane of the three-dimensional coordinate system of the distance image.
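As a concrete illustration of S3, the scale coefficients can be computed from the two reference points as described above. This is a minimal sketch assuming a pure scale conversion between the two coordinate systems; the function names and numeric values are illustrative.

```python
# Hypothetical sketch of the initial-setting computation (S1-S3): scale
# coefficients that map the infrared image's 2D coordinates onto the XY
# plane of the lidar's 3D coordinate system, derived from two reference
# points visible in both images. All names and values are illustrative.

def conversion_coefficients(p1, p2, pp1, pp2):
    """p1, p2: reference points P1, P2 in the distance image (x, y).
    pp1, pp2: the same reference points PP1, PP2 in the infrared image.
    Returns (alpha_x, alpha_y) so that an infrared pixel (x, y) maps to
    (alpha_x * x, alpha_y * y) on the distance image's XY plane."""
    alpha_x = (p1[0] - p2[0]) / (pp1[0] - pp2[0])
    alpha_y = (p1[1] - p2[1]) / (pp1[1] - pp2[1])
    return alpha_x, alpha_y

def to_lidar_xy(pixel, alpha_x, alpha_y):
    # Coordinate conversion of a single infrared pixel (also used in S14).
    return (alpha_x * pixel[0], alpha_y * pixel[1])

# Example: the distance image spans twice the infrared image's scale.
ax, ay = conversion_coefficients((40.0, 30.0), (10.0, 10.0),
                                 (20.0, 15.0), (5.0, 5.0))
```

A real system would also handle translation between the two images; the text restricts itself to scale because the lidar and camera share an optical axis.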
  • image distortion of the infrared camera 104 is also corrected.
  • The infrared camera 104 uses a lens like a normal camera, so the image is distorted by the slight difference in refractive index between the edge of the lens and its optical center. Because of this distortion, even the same object at the same distance appears with slightly different size and position in the infrared image depending on whether it is captured at the periphery or at the center.
  • the image may be corrected based on the refractive index of the entire lens obtained from the lens design data.
  • Alternatively, an object of known actual size may be photographed at the periphery and at the center of the lens, and the two infrared images compared to derive the correction.
  • Alternatively, distortion may be avoided by using only the central part of the lens, which is free of it.
  • For example, the infrared camera 104 is adjusted so that the range captured by the distortion-free center of the lens matches the scanning range of the lidar 102 (or, as a configuration of the infrared camera itself, a large-aperture lens is used and the infrared image sensor (bolometer) receives light only from the distortion-free center of the lens).
  • This coordinate conversion coefficient calculation is performed at predetermined times, for example when the monitoring system 100 is installed on site, at regular maintenance, or at any time the user decides (such as when a defect is found).
  • Although an orthogonal coordinate system is used here for both the three-dimensional and the two-dimensional coordinate system, a polar coordinate system may be used instead.
  • FIG. 8 is a flowchart illustrating the processing procedure of the monitoring operation by the control unit 120.
  • the current frame refers to a frame acquired at the current time point
  • the previous frame refers to the frame immediately before the current frame in time series. Since this procedure is repetitive, for convenience of explanation a step that uses the result of a later step may be described first.
  • In the following, the scanning interval of the lidar 102 and the imaging interval of the infrared camera 104 are assumed to be synchronized.
  • The control unit 120 acquires a distance image for one frame at the current time point from the lidar 102 and similarly acquires an infrared image for one frame at the current time point from the infrared camera 104 (S11). Note that either the distance image or the infrared image may be acquired first (or both simultaneously).
  • the control unit 120 clusters the objects detected in the distance image using the background difference method (S12).
  • The background subtraction method compares an image registered in advance as a background image with the acquired frame image (here, the frame image acquired in S11); any part that differs from the background image is detected as a newly appearing object.
  • As the background image, it is preferable to store a distance image obtained by scanning the range covered by the lidar 102 (the space of the first region) in a state where no objects are present.
  • the background image is stored in, for example, the HDD 124 and read out to the RAM 123 for use.
  • Clustering makes it easier to track a detected object in subsequent processing, and a known method can be used. For example, based on the number of pixels of a detected object and the size of the object obtained from its coordinate values in the three-dimensional coordinate system (its length in the X, Y, and Z directions, area, volume, and so on), each object in the distance image is treated as one cluster of points. For each cluster, for example, the coordinate value of the cluster center and the coordinate values of the cluster outline are stored in the RAM 123 as its position.
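The detection and clustering of S12 might be sketched as follows, on a small grid of distance values. The tolerance and the 4-neighbour flood-fill grouping are illustrative choices, since the text only says that a known clustering method can be used.

```python
# Hypothetical sketch of S12: background subtraction on a distance image
# followed by simple clustering of the changed cells. Grid layout,
# tolerance, and connectivity rule are assumptions, not the patent's
# actual algorithm.

def detect_objects(background, frame, tol=0.5):
    """background, frame: 2D grids of distance values (same shape).
    Returns clusters of (row, col) cells whose distance differs from the
    background by more than tol, grouped by 4-neighbour adjacency."""
    rows, cols = len(frame), len(frame[0])
    changed = {(r, c) for r in range(rows) for c in range(cols)
               if abs(frame[r][c] - background[r][c]) > tol}
    clusters = []
    while changed:
        seed = changed.pop()
        cluster, stack = {seed}, [seed]
        while stack:  # flood fill over 4-neighbours
            r, c = stack.pop()
            for n in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if n in changed:
                    changed.remove(n)
                    cluster.add(n)
                    stack.append(n)
        clusters.append(cluster)
    return clusters

INF = 100.0  # background distance where there is no reflection
bg = [[INF] * 5 for _ in range(3)]
cur = [row[:] for row in bg]
cur[0][0] = cur[0][1] = 12.0   # one object...
cur[2][4] = 7.0                # ...and another, not adjacent to it
```

A production system would cluster the 3D point cloud directly rather than a 2D grid, but the background-difference-then-group structure is the same.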
  • the control unit 120 performs moving body tracking for the clustered object (S13).
  • In moving object tracking, it is searched whether an object of the same cluster as one clustered in the distance image of the current frame was present in the previous frame. If so, the position in the previous frame and the position in the current frame are compared, and the moving distance, moving direction, and speed of the object are obtained (the speed is obtained by dividing the moving distance by the time between frames). The moving distance, moving direction, and speed thus found are stored in the RAM 123 for each object. If an object not present in the previous frame is detected in the current frame, its coordinate value (position) is stored in the RAM 123 as an object that appeared in the current frame.
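A minimal sketch of the moving body tracking of S13, under the assumption that clusters have already been matched across frames by a shared id and that the frame interval is known (about 10 frames/second for the lidar, per the text):

```python
# Hypothetical sketch of S13: comparing cluster centres between the
# previous and current frames to derive moving distance and speed.
# Matching by shared id is an illustrative assumption.

import math

def track(prev, cur, frame_interval=0.1):
    """prev, cur: dicts mapping object id -> (x, y, z) cluster centre.
    frame_interval: seconds between frames. Returns id -> (distance,
    speed); ids new in the current frame get (None, None)."""
    result = {}
    for oid, pos in cur.items():
        if oid in prev:
            d = math.dist(prev[oid], pos)
            result[oid] = (d, d / frame_interval)
        else:
            result[oid] = (None, None)  # appeared in the current frame
    return result

moves = track({"ob1": (0.0, 0.0, 10.0)},
              {"ob1": (3.0, 0.0, 14.0), "ob2": (1.0, 1.0, 5.0)})
```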
  • The control unit 120 associates portions of the infrared image that are hotter than their surroundings (ground, background, etc.) with objects detected in the distance image (S14).
  • An object in the infrared image and an object in the distance image can be associated by means of the coordinate conversion.
  • The control unit 120 first extracts the coordinate values, in the two-dimensional coordinate system, of the pixels occupying the portions of the infrared image hotter than the surroundings (for example, ob1 to ob4 shown in FIG. 3). For example, if the coordinate value of one pixel in the two-dimensional coordinate system is (x1, y1), conversion using the already obtained conversion coefficients αx and αy yields (αx·x1, αy·y1). The same conversion is performed for the other pixels.
  • This conversion may be performed only for the pixels showing a higher temperature than the surroundings, that is, pixels whose gradation value is non-zero or at or above a predetermined threshold; alternatively, all pixels of the infrared image may be converted.
  • The control unit 120 then matches each converted pixel to the object in the distance image that it overlaps. If the coordinate range of the point cloud data shown as an object in the distance image and the converted coordinate values of a high-temperature portion overlap even slightly, that high-temperature portion of the infrared image is taken to correspond to that object in the distance image.
  • the floor surface and the periphery of the object become hot due to radiant heat from the object, and infrared rays may be emitted from the periphery of the object.
  • the periphery of the high-temperature object is also shown as a high-temperature part.
  • In the case of a person, the face appears as a higher-temperature part,
  • and the clothed body appears as a part with a lower temperature than the face.
  • For these reasons, the size of the high-temperature portion in the infrared image may not match the size of the object's point cloud (the actual object size) in the distance image acquired from the lidar. Therefore, in the present embodiment, a high-temperature portion in the infrared image and an object in the distance image are treated as corresponding if they overlap at least in part; there is no restriction on the overlap ratio. For example, in the case of a person, although it varies with how much skin is exposed from clothing, the portion that appears hot (such as the face) is about 1 to 20% of the whole person, so an overlap of 1% or more is treated as a correspondence.
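The association of S14 (coordinate conversion of the hot pixels followed by the any-overlap test) can be sketched as below. All names and coordinates are illustrative, and objects are represented as sets of cells on the XY plane for simplicity.

```python
# Hypothetical sketch of S14: convert the infrared image's hot pixels
# into the distance image's XY coordinates and associate them with an
# object when they overlap it even slightly, as described in the text.
# Coefficients and data are illustrative assumptions.

def associate(hot_pixels, objects, alpha_x, alpha_y):
    """hot_pixels: set of (x, y) infrared pixels above the threshold.
    objects: dict id -> set of (x, y) cells the object occupies on the
    XY plane of the distance image. Returns ids with any overlap."""
    converted = {(alpha_x * x, alpha_y * y) for x, y in hot_pixels}
    return {oid for oid, cells in objects.items() if cells & converted}

objs = {"ob1": {(2.0, 2.0), (2.0, 4.0)},   # a person: only the face hot
        "ob2": {(8.0, 2.0)}}               # no hot pixel overlaps this
hot = {(1.0, 1.0), (3.0, 3.0)}             # infrared coordinates
matched = associate(hot, objs, alpha_x=2.0, alpha_y=2.0)
```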
  • the associated temperature is stored in the RAM 123 as the temperature of the object.
  • the highest temperature in the temperature distribution may be stored as the temperature of the object (this stored temperature is used for image display described later).
  • The control unit 120 then causes the display 130 to display each object, colored according to its temperature and distance, based on the infrared image (S15).
  • The display at this time is based on the infrared image (two-dimensional coordinate system), and, so that the position of each object obtained from the distance image (three-dimensional coordinate system) can be recognized,
  • a frame is drawn at the part corresponding to that position.
  • The position of an object in the infrared image can be obtained by converting the coordinates of that object on the XY plane of the three-dimensional coordinate system of the distance image into infrared image coordinates by the coordinate conversion already described.
  • the outline of the cluster in the XY plane of the distance image of the three-dimensional coordinate system is extracted, and the frame is displayed according to the coordinate value of the extracted outline.
  • the frame line attached to the object is the first related information image.
  • the first related information image is not limited to a frame line, and may be, for example, an arrow or a triangle indicating an object. Also, numerical values such as the distance to the object and the temperature may be displayed together with the frame line.
  • the displayed object is colored based on position information and temperature information.
  • the position information is information obtained from a distance image of a three-dimensional coordinate system obtained from the rider 102.
  • The frame line described above (the first related information image) is one such element; in addition, the color applied to an object is changed according to its distance from the installation position of the monitoring unit 110 (that is, the installation position of the lidar 102).
  • the temperature information is a temperature obtained from an infrared image of a two-dimensional coordinate system, and the color applied to the object is also changed by this temperature information.
  • The object is thus color-coded based on position information and temperature information. Specifically, for example, colors such as blue, yellow, and red are used from the lowest to the highest temperature, and the closer an object is to the monitoring unit 110 the brighter its color (the higher the gradation values of the displayed pixels), while the farther it is the darker its color (the lower the gradation values).
  • FIG. 9 is a screen example showing a display example.
  • the displayed colors are (R, G, B) gradation values, and each color is 0 to 255.
  • The high-temperature object ob4 (the object that becomes the specific temperature object described later) is displayed in red (150, 0, 0) of medium brightness because its temperature is high but its distance is long. Since the objects ob1 to ob3 other than the specific temperature object ob4 are people, whose temperature is lower than that of ob4, they are displayed in colors close to yellow, with brightness varying according to distance.
  • The closest object ob1 is a bright yellow (224, 250, 0),
  • the intermediate object ob2 a medium-bright yellow (180, 190, 0),
  • and the farthest object ob3 a dark yellow (100, 120, 0).
  • For each object, the single color corresponding to the temperature stored as the temperature of that object is applied to the entire object. This makes it easier to recognize on the screen an object, such as a person, whose temperature is only partially high.
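The coloring rule described above might be sketched as follows. The temperature bands and the brightness scaling are assumptions that only roughly mimic the screen example of FIG. 9, not values taken from the patent.

```python
# Hypothetical sketch of the S15 coloring: one (R, G, B) value per
# object, hue chosen from the object's stored temperature and brightness
# scaled down with distance from the monitoring unit. Bands and scaling
# factors are illustrative assumptions.

def object_color(temp_c, distance_m, max_distance_m=40.0):
    """Returns an (R, G, B) tuple, each channel 0-255."""
    if temp_c >= 50.0:
        base = (255, 0, 0)    # red: specific-temperature range
    elif temp_c >= 25.0:
        base = (255, 230, 0)  # yellow: e.g. people
    else:
        base = (0, 0, 255)    # blue: low temperature
    # Nearer objects are brighter: scale gradation values with distance,
    # keeping at least 40% brightness at the far end of the range.
    scale = 1.0 - 0.6 * min(distance_m, max_distance_m) / max_distance_m
    return tuple(int(round(c * scale)) for c in base)
```

For example, a hot object nearby would be full red, while the same object at the 40 m distance line would be a dark red, matching the bright-near/dark-far convention described above.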
  • the objects ob1 to ob4 are displayed with the frame line fb indicating that they are objects as described above, regardless of the temperature. For this reason, it becomes easy to understand that the object is an object, and in particular, the visibility of the moving object is improved.
  • a distance line (0 m to 40 m) representing the distance from the installation position of the monitoring unit 110 is also shown.
  • the control unit 120 determines whether or not there is a specific temperature portion in the infrared image (S16).
  • The specific temperature portion is a portion whose temperature falls within a predetermined temperature range (including the case of being equal to or higher than a predetermined temperature). For example, when monitoring so that a person does not approach an object hot enough to cause harm, and issuing an alarm on approach, the predetermined temperature range defining the specific temperature portion may be 50 °C or higher (in this case, the upper limit may be, for example, the temperature at which the pixels of the infrared camera saturate).
  • The temperature range for the specific temperature portion is arbitrary and may be determined from the temperature of the monitored object (for example, a hot object that is harmful to humans) or of the environment (indoors, outdoors, and so on).
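The specific-temperature-portion test of S16 amounts to thresholding the per-pixel temperature map. A minimal sketch, assuming the infrared image is available as a 2-D array of temperatures in °C and using the 50 °C example above:

```python
import numpy as np

def find_specific_temperature_pixels(ir_image, t_low=50.0, t_high=None):
    """Boolean mask of pixels in the predetermined temperature range.

    ir_image: 2-D array of per-pixel temperatures in deg C.  t_high, when given,
    models an upper bound such as the sensor's saturation temperature.
    """
    mask = ir_image >= t_low
    if t_high is not None:
        mask &= ir_image <= t_high
    return mask

def has_specific_temperature_portion(ir_image, t_low=50.0, t_high=None):
    """The S16 decision: does any pixel fall in the specific temperature range?"""
    return bool(find_specific_temperature_pixels(ir_image, t_low, t_high).any())
```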
  • If no specific temperature portion is found (S16: NO), the control unit 120 returns to S11 and acquires the next frame.
  • When no object exists, the association in S14 is naturally not performed; a screen with no objects is displayed in S15, S16 results in NO, and the process returns to S11 and continues.
  • When NO is determined in S16, if there is data indicating that an alarm area (described later) has been set, that data is cleared to indicate that no alarm area is set (this is required for the processing in S17, described later).
  • If the control unit 120 detects the specific temperature portion (S16: YES), it next judges whether the alarm area has already been set (S17).
  • The control unit 120 selects, as the specific temperature object, the object corresponding to the specific temperature portion detected in S16 from among the objects in the distance image (three-dimensional coordinate system) (S18).
  • the objects in the distance image are already clustered and associated with a portion having a temperature higher than the ambient temperature.
  • The moving objects are tracked (S12 to S14 described above). Therefore, in S18 it suffices to search the infrared image for the specific temperature portion and specify the coordinate values (position) of the corresponding object.
  • In some cases, however, the specific temperature portion cannot be associated with any object being tracked in S13 or any object newly appearing in the current frame.
  • This occurs with a stationary object (an object that does not move) that was low in temperature when the background image was acquired (stored) but whose temperature subsequently rises.
  • In this case, the specific temperature portion may be associated with an object existing in the background image, that is, an object that cannot be detected by the background subtraction method in S13 and thus has not been detected as an object (hereinafter simply referred to as a stationary object).
  • the coordinate value of the stationary object is stored in the RAM 123 as the coordinate value of the specific temperature object.
  • For example, suppose there is a boiler (a grounded type that does not move) in the first area. The boiler naturally appears in the background image as well, so the background subtraction method cannot detect it as an object in S12.
  • If the boiler, which was stopped when the background image was acquired, starts operating at some point during the monitoring operation and becomes hot, it appears in the infrared image as a specific temperature portion.
  • By associating this specific temperature portion with the point cloud data representing the stationary boiler, the boiler can subsequently be recognized as the specific temperature object.
  • the control unit 120 sets an alarm area around the specific temperature object ob4 (S19).
  • The size of the alarm area is determined by identifying the temperature of the specific temperature portion (in particular, the hottest portion when the specific temperature object ob4 has a temperature distribution) from the infrared image in which it was detected; the size of the alarm area varies with this temperature.
  • FIG. 10 is an explanatory diagram for explaining the specific temperature object.
  • FIG. 10 shows the three-dimensional distance image superimposed with the two-dimensional infrared image after coordinate conversion.
  • The specific temperature object is represented by the eight vertices (xmin, ymin, zmin), (xmax, ymin, zmin), (xmax, ymax, zmin), (xmin, ymax, zmin), (xmin, ymin, zmax), (xmax, ymin, zmax), (xmax, ymax, zmax), (xmin, ymax, zmax).
  • If the specific temperature object is a grounded object and the origin (0) of the Y axis of the three-dimensional coordinate system is taken as the ground (floor surface), the lower end (ymin) in the Y-axis direction is 0 (zero).
  • FIG. 11 and 12 are explanatory diagrams for explaining an alarm region provided around a specific temperature object.
  • FIG. 11 shows the case where the temperature of the specific temperature object is T1,
  • and FIG. 12 shows the case where it is T2.
  • The temperatures satisfy T1 < T2.
  • The predetermined distance for setting the alarm region is made to correspond to the temperature, so that D1 < D2.
  • The alarm region m1 is set around the specific temperature object ob4 so as to extend the predetermined distance D1 from the outer periphery of the specific temperature object ob4. Specifically, the alarm area m1 is the range extending the distance D1 along each of the X, Y, and Z axes from the coordinate values of the outer peripheral edge of the specific temperature object ob4.
  • The alarm area m1 is represented by the coordinate values (x, y, z) of the eight vertices (xmin-D1, ymin-D1, zmin-D1), (xmax+D1, ymin-D1, zmin-D1), (xmax+D1, ymax+D1, zmin-D1), (xmin-D1, ymax+D1, zmin-D1), (xmin-D1, ymin-D1, zmax+D1), (xmax+D1, ymin-D1, zmax+D1), (xmax+D1, ymax+D1, zmax+D1), (xmin-D1, ymax+D1, zmax+D1).
  • the alarm region m2 is set around the specific temperature object ob4 to be within a predetermined distance D2 from the outer periphery of the specific temperature object ob4.
  • the alarm area m2 is a range of distance D2 for each of the X, Y, and Z axes from the coordinate value of the outer peripheral end of the specific temperature object ob4.
  • The higher the temperature of the specific temperature object, the wider the alarm area is set.
  • Such a relationship between the temperature and the predetermined distance may be stored in advance in the HDD 124 as table data of the temperature versus the predetermined distance and read out to the RAM 123 for use.
  • In S19, the predetermined distance is extracted by referring to the table data with the temperature of the specific temperature portion detected from the infrared image, and the alarm region separated by the extracted distance is set.
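The table lookup and box expansion described above can be sketched as follows; the `TEMP_TO_DISTANCE` table values are hypothetical placeholders for the table data stored in the HDD 124:

```python
import bisect

# Hypothetical temperature-to-distance table (deg C -> m); the real table data
# would be read from storage as described in the text.
TEMP_TO_DISTANCE = [(50.0, 1.0), (100.0, 2.0), (200.0, 3.5), (400.0, 5.0)]

def alarm_distance(temp_c):
    """Predetermined distance for the highest table temperature not above temp_c."""
    temps = [t for t, _ in TEMP_TO_DISTANCE]
    i = max(bisect.bisect_right(temps, temp_c) - 1, 0)
    return TEMP_TO_DISTANCE[i][1]

def alarm_region(bbox, temp_c):
    """Inflate the object's axis-aligned bounding box by the alarm distance.

    bbox = (xmin, ymin, zmin, xmax, ymax, zmax) in the lidar's 3-D coordinates.
    """
    d = alarm_distance(temp_c)
    xmin, ymin, zmin, xmax, ymax, zmax = bbox
    return (xmin - d, ymin - d, zmin - d, xmax + d, ymax + d, zmax + d)
```

A hotter specific temperature object thus yields a longer distance and a wider alarm box, matching the D1 < D2 relation for T1 < T2.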
  • Although the specific temperature object has been described as a rectangular parallelepiped in the three-dimensional coordinate system, its shape is not limited to this and may be any other shape.
  • the warning area may be set as a range of a predetermined distance (D1, D2, etc.) from the outer periphery of the specific temperature object according to the shape of the specific temperature object.
  • In the above description, the alarm region is set as the range of a predetermined distance (D1, D2, etc.) from the outer peripheral edge of the specific temperature object; instead, for example, it may be set as the range of a predetermined distance from the center of the specific temperature object (provided that this distance is longer than the distance from the center to the outer shape).
  • For example, a sphere of the predetermined radius around the cluster center of the clustered specific temperature object may be set as the alarm region, which simplifies the calculation (speeds up the processing).
  • Alternatively, the range may be a predetermined distance from the position corresponding to the highest-temperature portion of the specific temperature object; then, even when the specific temperature object has a temperature distribution, the alarm region can be set around the hot portion.
  • When the specific temperature object moves, the alarm area can be moved in accordance with the movement (see S20 described later).
  • In addition to such setting methods, for example, when it is known that the specific temperature object does not move (as with the stationary object) or when its moving range is known, a fixed range of a predetermined distance around the specific temperature object may be used as the alarm area. When such a fixed alarm region is set and the specific temperature object is a moving object, the predetermined distance may, for example, be short in the directions it does not move and long in its moving direction.
  • The control unit 120 updates the screen so that the specific temperature object ob4 and the alarm area m1 (or m2) are displayed on the display 130 with a frame, line, or mark distinguished by color or line type (S21).
  • the frame line indicating the warning area is set as the second related information image.
  • The second related information image is not limited to the illustrated frame line; it may be, for example, an arrow or triangle pointing at the object, or a light coloring of the entire alarm area (light enough that the color of the specific temperature object shows through).
  • The control unit 120 determines whether another object (such as a person) different from the specific temperature object is within the alarm region (S22). This determination compares the coordinate values of the outer shape of the cluster of each object clustered in S12 (that is, each object detected by the background subtraction method) with the coordinate values defining the alarm area. If the coordinate values of the outer shape of another object's cluster fall within the alarm area, the other object is determined to be within the alarm area.
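The containment test of S22 reduces to an axis-aligned box check against the cluster's outer-shape coordinates. A minimal sketch, assuming the alarm area is given by its min/max corner coordinates:

```python
def object_in_alarm_region(object_points, region):
    """True if any outer-shape point of the clustered object lies in the alarm box.

    object_points: iterable of (x, y, z) coordinates;
    region: (xmin, ymin, zmin, xmax, ymax, zmax) of the alarm area.
    """
    xmin, ymin, zmin, xmax, ymax, zmax = region
    return any(
        xmin <= x <= xmax and ymin <= y <= ymax and zmin <= z <= zmax
        for x, y, z in object_points
    )
```

Because this is only coordinate comparison, no per-pair distance computation is needed, which is the processing-time advantage the text describes.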
  • If no other object is within the alarm region (S22: NO), the control unit 120 returns to S11, acquires each image of the next frame, and continues the subsequent processing.
  • If another object is within the alarm region (S22: YES), the control unit 120 outputs an alarm signal to the alarm device 140 (S23), whereupon the alarm device 140 that received the signal emits an alarm sound.
  • At this time, the display 130 may also issue the alarm visually, for example by blinking the object determined to be in the alarm area (or the frame or mark surrounding it), changing the color of the entire screen, flashing the screen, or displaying a warning text.
  • Various other alarm operations may also be performed, such as turning on a rotating lamp, changing the color of a color-coded stacked indicator lamp from blue to red, or turning on and blinking other lamps.
  • S20 is reached when the alarm area has already been set up to the previous frame.
  • Through the moving-object tracking in S13, if the specific temperature object is a moving object, its moving distance, direction, and speed are known. Therefore, in S20, the coordinate values of the already-set alarm region are moved using the moving distance and direction of the specific temperature object.
  • Since the alarm area has already been set up to the previous frame, even if the specific temperature object is a moving object it is only necessary to move the alarm area in accordance with the movement. This calculation is simpler (and thus faster) than setting the alarm region anew from the coordinate values of the specific temperature object in the current frame as in S18 to S19.
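Moving the already-set alarm area (S20) is a pure translation by the displacement obtained from the moving-object tracking, which is why it is cheaper than recomputing the box each frame. A sketch:

```python
def move_alarm_region(region, displacement):
    """Translate an already-set alarm box by the tracked displacement (dx, dy, dz).

    region: (xmin, ymin, zmin, xmax, ymax, zmax) of the alarm area.
    """
    dx, dy, dz = displacement
    xmin, ymin, zmin, xmax, ymax, zmax = region
    return (xmin + dx, ymin + dy, zmin + dz, xmax + dx, ymax + dy, zmax + dz)
```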
  • control unit 120 proceeds to S21, updates the screen so as to display the moved alarm area and the like, and continues the subsequent processing.
  • the monitoring operation is executed as a repeated process.
  • a predetermined distance range around the specific temperature object is set as the alarm area.
  • the length of the predetermined distance for setting the alarm region may be changed according to the relative distance and relative speed between the specific temperature object and another object.
  • the moving direction and speed of the object are obtained in the step S13 as already described.
  • The movement of the specific temperature object is likewise known from the moving-object tracking in S13. Even when the specific temperature object is a stationary object, its position is known (from when the stationary object was identified as the specific temperature object in S18).
  • If, from these moving directions and speeds, the specific temperature object and the other object are found to be approaching each other and the relative speed (approach speed) is high, the already-set alarm area is widened. As a result, when at least one of the specific temperature object and the other object is a moving object, an alarm signal can be issued quickly in consideration not only of the temperature of the specific temperature object but also of its moving speed.
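Widening the predetermined distance with the approach speed could look like the following; the lead-time margin is an illustrative assumption, since the specification gives no concrete formula:

```python
def adjusted_alarm_distance(base_distance, closing_speed, lead_time=0.5):
    """Lengthen the predetermined distance when the objects close quickly.

    closing_speed: relative approach speed in m/s (positive means approaching);
    lead_time: hypothetical safety margin in seconds (not from the specification).
    """
    if closing_speed <= 0.0:
        return base_distance  # receding or stationary: keep the base distance
    return base_distance + closing_speed * lead_time
```

The fast-approaching case then trips the alarm region earlier, which is the behavior the text describes.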
  • As described above, in the present embodiment, the coordinate system of the two-dimensional infrared image, which can detect object temperatures, is combined with that of the distance image of the lidar 102, which captures three-dimensional positions in space as a three-dimensional coordinate system. The high-temperature specific temperature portion is detected from the infrared image, while the position of the corresponding specific temperature object is specified from the distance image. Since an alarm is issued based on the positional relationship between the specific temperature object and other objects, safety can be ensured by detecting when another object comes close to the high-temperature specific temperature object.
  • When the specific temperature portion is, for example, a high-temperature portion dangerous to a person and the other object is a person, an alarm can be issued to prevent the person and the high-temperature object from coming close to each other.
  • Since the alarm area is set around the specific temperature object, whether an object has entered the alarm area can be determined without calculating the distance between the specific temperature object and the other object. This reduces the time (calculation processing time) required for the risk determination.
  • Because the alarm area is moved in accordance with the movement of the specific temperature object, even when the specific temperature object is moving, the risk can be judged by the simple calculation of whether an object has entered the alarm area.
  • Because the size of the alarm area is changed based on the direction and speed at which an object approaches it, when the approach speed is high the danger is notified earlier, more reliably preventing the object from approaching the specific temperature object.
  • Because the size of the alarm area is changed according to the temperature of the specific temperature portion, the danger posed when an object approaches can be avoided more reliably.
  • FIG. 13 is a block diagram illustrating a configuration of the monitoring system according to the second embodiment.
  • FIG. 14 is a bird's eye view showing the arrangement of the monitoring units.
  • the monitoring system 200 includes two monitoring units.
  • the two monitoring units are a first monitoring unit 211 and a second monitoring unit 212.
  • the first monitoring unit 211 and the second monitoring unit 212 are arranged to monitor the same monitoring target space from different directions.
  • The internal configurations of the first monitoring unit 211 and the second monitoring unit 212 are the same as in the first embodiment; each includes the infrared camera 104 and the lidar 102.
  • The control unit 220 has the same configuration as in the first embodiment except that it controls both the first monitoring unit 211 and the second monitoring unit 212, which are therefore both connected to the control unit 220. Since the other configurations are the same as in the first embodiment, their description is omitted.
  • the coordinate conversion operation and the monitoring operation by the control unit 220 are performed for each of the first monitoring unit 211 and the second monitoring unit 212, but the processing procedure is the same as that of the first embodiment, and thus the description thereof is omitted.
  • However, the display processing stage (the processes of S15 and S21) differs from that of the first embodiment.
  • the first monitoring unit 211 and the second monitoring unit 212 monitor the same area (space) from different directions. For this reason, even if it is the same object, the visible part (surface) differs.
  • The temperature detected by the infrared camera 104 may differ depending on which surface of the object is seen. In other words, a single object may have one surface at a high temperature and another at a low temperature (a low-temperature surface here also includes a surface covered by an infrared-shielding object).
  • For example, in the object ob5 shown in FIG. 14, the temperature of one surface ho is higher than that of the other surface co.
  • the infrared camera 104 of the first monitoring unit 211 captures the high-temperature surface ho
  • the infrared camera 104 of the second monitoring unit 212 captures the low-temperature surface co.
  • Each lidar 102 scans the same object ob5 from its own position and outputs a distance image.
  • In the display processing stage, the image from the monitoring unit capturing the hotter surface ho (the first monitoring unit 211 in FIG. 14) is displayed on the display 130.
  • FIG. 15 is a subroutine flowchart showing the procedure of the display processing stage (S15 and S21 (S23) in FIG. 8) in the second embodiment.
  • The control unit 220 executes the processes from S11 shown in FIG. 8 in parallel, using the distance image and infrared image acquired from the first monitoring unit 211 and those acquired from the second monitoring unit 212.
  • The control unit 220 extracts the maximum temperature in the infrared image captured by the infrared camera 104 of the first monitoring unit 211 and sets this as the first screen temperature St1 (S31).
  • control unit 220 extracts the maximum temperature in the infrared image captured by the infrared camera 104 of the second monitoring unit 212 and sets this as the second screen temperature St2 (S32). Note that the order of the processing of S31 and S32 may be reversed (or may be simultaneous).
  • The control unit 220 displays the screen showing the higher maximum temperature. For this, it determines whether or not St1 ≥ St2 (S33).
  • If St1 ≥ St2 (S33: YES), the control unit 220 causes the display 130 to display the image on the first monitoring unit 211 side.
  • the displayed image uses the distance image acquired from the first monitoring unit 211 and the infrared image to display the object with a color or a frame as described in the first embodiment.
  • If St1 < St2 (S33: NO), the control unit 220 causes the display 130 to display the image on the second monitoring unit 212 side.
  • the displayed image uses the distance image acquired from the second monitoring unit 212 and the infrared image to display the object with a color or a frame as described in the first embodiment.
  • the subroutine in the second embodiment is completed, and the process returns to the main routine (the flowcharts shown in FIGS. 8 and 9).
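Steps S31 to S33 of this subroutine amount to comparing the per-image maximum temperatures and picking the unit that sees the hotter surface (the first unit winning ties, per the St1 ≥ St2 test). A minimal sketch, assuming the infrared images are available as 2-D temperature arrays:

```python
import numpy as np

def select_display_unit(ir_image_1, ir_image_2):
    """Return 1 or 2: the monitoring unit whose infrared image is hotter overall."""
    st1 = float(np.max(ir_image_1))  # first screen temperature (S31)
    st2 = float(np.max(ir_image_2))  # second screen temperature (S32)
    return 1 if st1 >= st2 else 2    # S33: first unit wins ties
```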
  • The detection of the specific temperature object, the setting of the alarm area, the detection of other objects in the alarm area, and the alarm stages (S18 to S23) are likewise performed for each pair of distance image and infrared image, from the first monitoring unit 211 and from the second monitoring unit 212. Even if only one of the two units sets an alarm area, the monitoring operation for that alarm area is executed and an alarm or the like is performed.
  • According to the second embodiment, when the monitoring operation is performed using two or more monitoring units, the image from the monitoring unit capturing the hotter portion can be displayed. For example, in the case of a person, the image capturing the direction the face is turned toward can be shown on the display 130 (the temperature of the face is usually higher than that of the back of the head).
  • two monitoring units are used. However, more monitoring units may be provided.
  • the two monitoring units are controlled by one control unit 220.
  • Alternatively, a control unit 220 may be provided for each of the two monitoring units, and a further control unit (computer) dedicated solely to switching the screen display may be provided.
  • For the steps of detecting the specific temperature object, setting the alarm area, detecting the presence or absence of other objects in the alarm area, and alarming, it may first be determined from each infrared image of the first monitoring unit 211 and the second monitoring unit 212 whether a specific temperature portion exists, and thereafter the processing may be executed using only the distance image and infrared image in which the specific temperature portion was detected.
  • Alternatively, an image color-coded by distance and temperature may simply be provided as an image showing the hotter surface. In this case, the detection of the specific temperature object, the setting of the alarm area, the detection of other objects in the alarm area, and the alarm steps (S18 to S23) need not be performed.
  • the coordinate conversion coefficient for converting the two-dimensional coordinate system of the infrared image into the three-dimensional coordinate system of the distance image is obtained as an initial setting.
  • Instead, the angle of view of the infrared camera may be made the same as that of the distance image obtained by the lidar's scanning.
  • That is, the angle of view of the infrared camera is matched to that of the distance image by changing the focal length (or magnification) of the infrared camera's lens. In this way too, the two-dimensional coordinate system of the infrared image and the X-Y plane coordinate system of the three-dimensional distance image can be made to coincide.
  • In the embodiments described above, an alarm region is set around the specific temperature object; alternatively, after the specific temperature object is detected, the distance between the specific temperature object and each other object may be calculated one by one, and an alarm signal output when that distance is determined to be equal to or less than the predetermined distance.

Abstract

[Problem] To provide a monitoring system that makes it possible to reliably obtain the positional relationship between the position of a high-temperature object in three-dimensional space and another object and issue an alarm. [Solution] A monitoring system 100 that has: lidar 102 that scans a first region and outputs a three-dimensional coordinate system range image; an infrared camera 104 that photographs a second region that overlaps the first region and outputs a two-dimensional coordinate system infrared image; a control part 120 that acquires the range image and the infrared image and, when there is a specific temperature section in the infrared image, specifies the three-dimensional coordinate system position of a specific temperature object that corresponds to the specific temperature section, and when another object has been detected in the range image, outputs an alarm signal when the distance between the specific temperature object and the other object is within a prescribed distance; and an alarm 140 that receives the alarm signal from the control part 120 and issues an alarm.

Description

Monitoring system and control method for monitoring system
 The present invention relates to a monitoring system and a control method for a monitoring system.
 Conventionally, as a technique for ensuring safety within a predetermined area, there is one that uses a radar image acquired from a lidar (also called a laser radar) and a camera image acquired from a surveillance camera. In this technique, the coordinate system of the radar image and that of the camera image are superimposed, and auxiliary information identifying a moving object is displayed over the camera image. This display makes it easy to grasp the situation within the monitored area (Japanese Patent Laid-Open No. 2004-212129).
 Also, as a technique for detecting the movement and position of a hot object in space, there is one that uses an infrared sensor composed of a plurality of infrared detection elements. This technique makes it possible to easily grasp the movement and position of a high-temperature object (Japanese Patent Laid-Open No. 9-297057).
 The technique described in Japanese Patent Laid-Open No. 2004-212129 can grasp the status of moving objects in the monitored area. However, it cannot recognize a high-temperature object among the many objects present in the area, grasp its movement, or determine the distance between the high-temperature object and other objects. Nor can it monitor the safety of other objects based on the temperature of the high-temperature object.
 The technique described in Japanese Patent Laid-Open No. 9-297057 can grasp the movement and position of a high-temperature object in two dimensions, but it cannot grasp the movement and position of a high-temperature object moving in three dimensions.
 Accordingly, an object of the present invention is to provide a monitoring system, and a control method for the monitoring system, capable of reliably capturing the positional relationship between the position of a high-temperature object in three-dimensional space and another object and issuing an alarm.
 The above object is achieved by the following means.
 (1) A monitoring system comprising:
 a lidar that outputs a distance image in which the distribution of distance values, obtained by scanning laser light toward a first region, is expressed in a three-dimensional coordinate system;
 an infrared camera that photographs a second region at least partially overlapping the first region and outputs an infrared image expressed in a two-dimensional coordinate system; and
 a control unit that acquires the distance image from the lidar, detects objects from the acquired distance image, and acquires the infrared image from the infrared camera,
 wherein, when the infrared image contains a specific temperature portion falling within a predetermined temperature range, the control unit identifies, among the detected objects, the object corresponding to the specific temperature portion as a specific temperature object and specifies its position in the three-dimensional coordinate system, and, when the detected objects include another object different from the specific temperature object, determines whether the distance between the specific temperature object and the other object is within a predetermined distance and outputs an alarm signal when that distance is within the predetermined distance.
 (2) The monitoring system according to (1) above, wherein the control unit sets, around the specific temperature object in the three-dimensional coordinate system, the range within the predetermined distance from the specific temperature object as an alarm area, and outputs the alarm signal when the other object is in the alarm area.
 (3) The monitoring system according to (2) above, wherein, when the specific temperature object is determined to be a moving object from a plurality of the distance images acquired in time series from the lidar, the control unit moves the alarm area in accordance with the movement of the specific temperature object.
 (4) The monitoring system according to (2) above, wherein the control unit sets the alarm area fixed around the specific temperature object.
 (5) The monitoring system according to any one of (1) to (4) above, wherein the control unit obtains a relative distance and a relative speed between the specific temperature object and the other object from a plurality of the distance images acquired in time series from the lidar, and changes the length of the predetermined distance according to the obtained relative distance and relative speed.
 (6) The monitoring system according to any one of (1) to (5) above, wherein the control unit lengthens the predetermined distance as the temperature of the specific temperature portion in the infrared image captured by the infrared camera becomes higher.
 (7) The monitoring system according to any one of (1) to (6) above, further comprising an alarm device that receives the alarm signal and emits sound and/or light.
 (8) A control method for a monitoring system having a lidar that outputs a distance image in which the distribution of distance values, obtained by scanning laser light toward a first region, is expressed in a three-dimensional coordinate system, and an infrared camera that photographs a second region at least partially overlapping the first region and outputs an infrared image expressed in a two-dimensional coordinate system, the method comprising:
 (a) acquiring the distance image from the lidar and detecting objects from the acquired distance image;
 (b) acquiring the infrared image from the infrared camera and, when the infrared image contains a specific temperature portion falling within a predetermined temperature range, identifying, among the detected objects, the object corresponding to the specific temperature portion as a specific temperature object and specifying its position in the three-dimensional coordinate system; and
 (c) when the detected objects include another object different from the specific temperature object, determining whether the distance between the specific temperature object and the other object is within a predetermined distance and issuing an alarm when that distance is within the predetermined distance.
 (9) The control method for a monitoring system according to (8) above, wherein, in the step (c), a range within the predetermined distance from the specific temperature object is set as an alarm area around the specific temperature object in the three-dimensional coordinate system, and an alarm is issued when the other object is within the alarm area.
 (10) The control method for a monitoring system according to (9) above, wherein, when the specific temperature object is determined to be a moving object from a plurality of the distance images acquired in time series from the LiDAR, the alarm area is moved to follow the movement of the specific temperature object.
 (11) The control method for a monitoring system according to (9) above, wherein the alarm area is set fixed around the specific temperature object.
 (12) The control method for a monitoring system according to any one of (8) to (11) above, wherein a relative distance and a relative speed between the specific temperature object and the other object are obtained from a plurality of the distance images acquired in time series from the LiDAR, and the length of the predetermined distance is changed according to the obtained relative distance and relative speed.
 (13) The control method for a monitoring system according to any one of (8) to (12) above, wherein the size of the predetermined distance is changed according to the temperature of the specific temperature portion in the image captured by the infrared camera.
 (14) The control method for a monitoring system according to any one of (8) to (13) above, wherein the alarm is given by sound and/or light.
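As an illustration only (not part of the claims), the distance test of step (c) above — issuing an alarm when another object comes within the predetermined distance of the specific temperature object — can be sketched in Python as follows. The function name, the argument names, and the use of Euclidean distance in the three-dimensional coordinate system are assumptions made for this sketch.

```python
import math

def check_alarm(specific_obj, other_objs, predetermined_distance):
    """Step (c) sketch: return True (issue an alarm) if any other object
    lies within the predetermined distance of the specific temperature
    object.  Positions are (x, y, z) tuples in the LiDAR's 3-D
    coordinate system."""
    sx, sy, sz = specific_obj
    for ox, oy, oz in other_objs:
        d = math.sqrt((sx - ox) ** 2 + (sy - oy) ** 2 + (sz - oz) ** 2)
        if d <= predetermined_distance:
            return True
    return False
```

In the terms of claims (5), (6), (12), and (13), `predetermined_distance` would not be a constant but a value recomputed from the relative distance and speed of the objects, or from the temperature of the specific temperature portion.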
FIG. 1 is a block diagram showing the configuration of the monitoring system of Embodiment 1.
FIG. 2 is an explanatory diagram for explaining a distance image obtained by the LiDAR scanning the first region.
FIG. 3 is an explanatory diagram for explaining an infrared image obtained by the infrared camera photographing the second region.
FIG. 4 is an explanatory diagram for explaining an example in which the distance image and the infrared image are simply superimposed.
FIG. 5 is a flowchart showing the processing procedure for coordinate conversion coefficient calculation.
FIG. 6 is an explanatory diagram of the coordinate conversion coefficient calculation.
FIG. 7 is an explanatory diagram of the coordinate conversion coefficient calculation.
FIG. 8 is a flowchart showing the processing procedure of the monitoring operation by the control unit.
FIG. 9 is an example screen showing a display example.
FIG. 10 is an explanatory diagram for explaining a specific temperature object.
FIG. 11 is an explanatory diagram for explaining an alarm area provided around a specific temperature object.
FIG. 12 is an explanatory diagram for explaining an alarm area provided around a specific temperature object.
FIG. 13 is a block diagram showing the configuration of the monitoring system of Embodiment 2.
FIG. 14 is a bird's-eye view showing the arrangement of the monitoring units.
FIG. 15 is a subroutine flowchart showing the procedure of the display processing stage in Embodiment 2.
 Hereinafter, embodiments of the present invention will be described with reference to the attached drawings. The present invention is not limited to the following embodiments. In the description of the drawings, the same elements are denoted by the same reference numerals, and redundant description is omitted. In addition, because the drawings are prepared for the purpose of facilitating understanding of the present invention, they are exaggerated, and the dimensional ratios in the drawings may differ from the actual dimensional ratios.
 (Embodiment 1)
 (Configuration of the monitoring system)
 FIG. 1 is a block diagram showing the configuration of the monitoring system according to Embodiment 1.
 The monitoring system 100 includes a monitoring unit 110, a control unit 120, a display 130, and an alarm device 140. The monitoring unit 110 is installed at a position from which it can capture objects (for example, a high-temperature object, or an object such as a person, a vehicle, or another article).
 In the monitoring unit 110, a LiDAR (Light Detection And Ranging) 102 and an infrared camera 104 are attached to the same housing and integrated. In this configuration, the LiDAR 102 and the infrared camera 104 have their optical axes oriented in the same direction (the Z direction; see FIGS. 2 and 3 described later). The LiDAR 102 and the infrared camera 104 are arranged adjacent to each other in the Y direction (vertical direction; see FIGS. 2 and 3), and their optical axes coincide in the X direction (horizontal direction).
 The LiDAR 102 scans a laser beam toward the space of the first region and, from the reflected light, measures the distance to each object present in the scanned space. The resulting distribution of distance values is also called point cloud data, from which the distance from the installation position of the LiDAR 102 to an object, as well as the size and shape of the object, can be determined. When the LiDAR 102 scans the laser beam for one frame, an image is obtained consisting of the distribution of distance values: the distance to each object present in the space, or an infinite distance where there is no reflected light. Because an image output from the LiDAR 102 in this way includes information on the distance to objects, it is called a distance image (it is also sometimes called a LiDAR image). The LiDAR 102 outputs this distance image to the control unit 120 as an image in a three-dimensional coordinate system.
 The infrared camera 104 photographs objects in the space of the second region and outputs the temperature distribution in the photographed space as an infrared image in a two-dimensional coordinate system expressed in monochrome shades. In the infrared camera 104, the output value of a pixel capturing a high-temperature portion is high, and the output value of a pixel capturing a low-temperature portion is low. In the case of output as digital data, the higher the temperature of the portion captured by a pixel, the higher its output gradation value.
 The first region scanned by the LiDAR 102 and the second region photographed by the infrared camera 104 overlap at least partially. This overlapping region is the monitoring range of the monitoring system 100. To minimize waste, it is preferable to make the first region and the second region overlap as much as possible.
 The distance image from the LiDAR 102 and the infrared image from the infrared camera 104 are used to determine the three-dimensional spatial position and temperature information of objects present in the overlapping region. The one-frame scanning interval of the LiDAR 102 and the imaging interval of the infrared camera 104 need not be completely synchronized. For example, the LiDAR 102 has a scanning interval of about 10 frames per second, whereas the infrared camera 104 can capture images at intervals of about a fraction of a second to a few thousandths of a second. Preferably, however, they are synchronized, so that the three-dimensional position of an object determined from a distance image acquired at a given time can be matched with the temperature information of that object acquired at the same time.
 Arranging a plurality of distance images output from the LiDAR 102 in time series produces a moving image. Similarly, arranging a plurality of infrared images from the infrared camera 104 in time series produces a moving image. One image in a moving image is called a frame.
 The control unit 120 is a computer. The control unit 120 includes a CPU (Central Processing Unit) 121, a ROM (Read Only Memory) 122, a RAM (Random Access Memory) 123, an HDD (Hard Disk Drive) 124, and the like. The CPU 121 calls a program corresponding to the processing content from the HDD 124 to control the operations of the LiDAR 102 and the infrared camera 104, and performs detection of the three-dimensional position of objects, determination of object temperature, alarm operation, display of temperature information, and the like. The HDD 124, together with the RAM 123, serves as a storage unit and stores the programs and data necessary for each process. Although the HDD 124 is used in FIG. 1, a nonvolatile semiconductor memory such as a flash memory may be used instead of the HDD 124.
 The control unit 120 has an input device 125 such as a touch panel, buttons, or a mouse, and a network interface (NIF) 126 for connecting external equipment such as a server.
 The monitoring system 100 includes the display 130 and the alarm device 140. In this embodiment, the display 130 can be provided separately from the control unit 120 so that it can be installed, for example, in a factory monitoring room. Of course, it may instead be integrated with the control unit 120. The display 130 and the alarm device 140 may also be integrated, depending on the monitored environment. The alarm device 140 issues an alarm by, for example, sound, light such as a flashlight or rotating beacon, or any other method that a person can perceive. Note that the monitoring system 100 may perform other processing instead of the alarm by the alarm device 140. Such other processing is, for example, processing to start recording the images obtained from the infrared camera 104 and the LiDAR 102. By starting recording, the movement of objects in the overlapping region and the display on the display 130 can be reliably recorded. Other processing also includes, for example, stopping automatic machines such as robots, machine tools, and transport vehicles.
 This embodiment has illustrated the case where the LiDAR 102 and the infrared camera 104 are integrated. However, the LiDAR 102 and the infrared camera 104 may be installed separately, each connected to the control unit 120 via a dedicated line or a network. When the LiDAR 102 and the infrared camera 104 are separate, however, the range scanned by the LiDAR 102 and the range photographed by the infrared camera 104 are made to overlap.
 The control unit 120 may also be a general-purpose computer rather than a dedicated computer. Conversely, the infrared camera 104, the LiDAR 102, and the control unit 120 may be integrated. Although the control unit 120 is shown here as built mainly around a CPU, RAM, and ROM, it may instead be configured by an integrated circuit such as an FPGA (Field-Programmable Gate Array) or an ASIC (Application Specific Integrated Circuit).
 The operation of the monitoring system 100 will now be described. The operation of the monitoring system 100 is roughly divided into an initial setting operation and a monitoring operation.
 (Initial setting operation) The initial setting operation will be described. FIG. 2 is an explanatory diagram for explaining a distance image obtained by the LiDAR 102 scanning the first region. FIG. 3 is an explanatory diagram for explaining an infrared image obtained by the infrared camera 104 photographing the second region. FIG. 4 is an explanatory diagram for explaining an example in which the distance image of FIG. 2 and the infrared image of FIG. 3 are simply superimposed. Note that FIGS. 2 and 3 capture the same region (space).
 As shown in FIG. 2, from the light reflected by objects ob1 to ob4, such as articles and people present in the first region, and by the ground gr (indoors, the floor; the same applies hereinafter), the LiDAR 102 outputs a distance image Im1 in a three-dimensional coordinate system as shown in the figure (portions with no reflection, such as the sky, are at infinite distance; the same applies hereinafter). This distance image Im1 is an image composed of three-dimensional point cloud data and has a three-dimensional coordinate system (X1, Y1, Z1) as shown in the figure. Therefore, the position of each point constituting the distance image Im1 is specified in a three-dimensional coordinate system consisting of the X, Y, and Z axes. Here, the objects ob1 to ob3 are people, and ob4 is an object that becomes a specific temperature object, described later.
 As shown in FIG. 3, the infrared camera 104 captures the infrared rays emitted by objects ob1 to ob4, such as articles and people present in the second region, and by the ground gr, and outputs an infrared image Im2 as shown in the figure. In this infrared image Im2, the objects ob1 to ob4 are all objects whose temperature is higher than the ambient temperature (here, the temperature of the ground gr, the background, and so on). The position (coordinate values) of each pixel constituting the infrared image Im2 is specified in a two-dimensional coordinate system consisting of the X axis and the Y axis.
 Thus, the distance image Im1 has a three-dimensional coordinate system (X1, Y1, Z1), and the infrared image Im2 has a two-dimensional coordinate system (X2, Y2). As they stand, the two coordinate systems do not match, so if the images are simply superimposed, the objects ob1 to ob4, which are actually at the same positions, become displaced, as shown in FIG. 4.
 Therefore, processing is performed to make the X axis and Y axis of the two-dimensional coordinate system (X2, Y2) correspond to the X axis and Y axis of the three-dimensional coordinate system (X1, Y1, Z1). Such processing is herein called coordinate conversion, and the coefficients required for the correspondence are called coordinate conversion coefficients. The initial setting operation is an operation for calculating these coordinate conversion coefficients.
 The coordinate conversion coefficients are calculated as follows. FIG. 5 is a flowchart showing the processing procedure for calculating the coordinate conversion coefficients. FIGS. 6 and 7 are explanatory diagrams of the coordinate conversion coefficient calculation. This coordinate conversion coefficient calculation (initial setting operation) is performed by the control unit 120 executing a program for calculating the coordinate conversion coefficients.
 First, the control unit 120 acquires the distance image Im1 that the LiDAR 102 outputs by scanning the first region, including the portion that becomes the overlapping region (S1). In the first region scanned by the LiDAR 102, objects serving as at least two reference points are placed in advance.
 Here, a heating element such as an incandescent bulb or a heater, or an infrared-emitting object such as an infrared LED, is attached to the head (tip) of a rod as a reference point.
 The reference points are preferably stationary objects, but as already described, a reference point may be set on a moving object as long as the scanning time of the LiDAR 102 and the photographing time of the infrared camera 104 are matched (synchronized).
 An example of the acquired distance image is shown in FIG. 6. As illustrated, the reference points appear in this distance image Im1 as points P1 and P2 on the image.
 Subsequently, the control unit 120 acquires the infrared image Im2 that the infrared camera 104 outputs by photographing the second region, including the portion that becomes the overlapping region (S2). An example of the acquired infrared image is shown in FIG. 7. As illustrated, the reference points PP1 and PP2 appear in this infrared image Im2. Because an infrared emitter is attached to the tip of each rod serving as a reference point, these portions appear in the infrared image Im2 as portions of high luminance (gradation value). Note that the processing order of S1 and S2 may be reversed (they may also be simultaneous).
 Subsequently, the control unit 120 calculates conversion coefficients for matching the two-dimensional coordinate system of the obtained infrared image Im2 to the three-dimensional coordinate system of the distance image Im1 (S3). The distance image Im1 and the infrared image Im2 are images obtained by scanning or photographing the same region (space). Therefore, the actual sizes of the objects, and the distances (such as the distance between the reference points), are the same in both images. In the images themselves, however, the scale differs from image to image because the scanning and photographing equipment differ. For this reason, if the two are simply superimposed (see FIG. 4), the positions of objects are shifted and their sizes differ.
 To match the two-dimensional coordinate system of the infrared image Im2 to the three-dimensional coordinate system of the distance image Im1, it suffices to convert the two coordinate systems so that their scales agree. Here, the scales of the X axis and Y axis of the infrared image Im2 are matched to the X axis and Y axis of the distance image Im1. It is for this purpose that two reference points are provided.
 Specifically, first, the distances in the X-axis and Y-axis directions between the reference points P1 and P2 appearing in the distance image Im1 are obtained. Let the distance between P1 and P2 in the X-axis direction be Δ(P1-P2)x. Similarly, let the distance between P1 and P2 in the Y-axis direction be Δ(P1-P2)y.
 Similarly, the distances in the X-axis and Y-axis directions between the reference points PP1 and PP2 appearing in the infrared image Im2 are obtained: the distance between PP1 and PP2 in the X-axis direction is Δ(PP1-PP2)x, and the distance between PP1 and PP2 in the Y-axis direction is Δ(PP1-PP2)y.
 Then, the conversion coefficients for the X axis and Y axis are obtained. Letting the X-axis conversion coefficient be αx, αx = Δ(P1-P2)x / Δ(PP1-PP2)x. Letting the Y-axis conversion coefficient be αy, αy = Δ(P1-P2)y / Δ(PP1-PP2)y.
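A minimal sketch of this coefficient calculation, and of applying the resulting coefficients to a point in the infrared image, might look as follows in Python. The function names and the representation of each reference point as a simple (x, y) pair are assumptions for illustration; the text describes only the scale matching, so any offset alignment between the two coordinate systems is omitted here.

```python
def conversion_coefficients(p1, p2, pp1, pp2):
    """Compute the X- and Y-axis coordinate conversion coefficients
    alpha_x = delta(P1-P2)x / delta(PP1-PP2)x and
    alpha_y = delta(P1-P2)y / delta(PP1-PP2)y
    from reference points P1, P2 in the distance image and
    PP1, PP2 in the infrared image (each an (x, y) pair)."""
    alpha_x = (p1[0] - p2[0]) / (pp1[0] - pp2[0])
    alpha_y = (p1[1] - p2[1]) / (pp1[1] - pp2[1])
    return alpha_x, alpha_y

def convert_point(pt, alpha_x, alpha_y):
    """Scale an infrared-image (x, y) point so that it lies on the
    X-Y plane of the distance image's 3-D coordinate system."""
    return pt[0] * alpha_x, pt[1] * alpha_y
```

For example, if the reference points are twice as far apart in the infrared image as in the distance image, both coefficients come out to 0.5 and every infrared-image coordinate is halved.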
 Thus, in this embodiment, the conversion coefficients for the X axis and the Y axis are obtained in order to convert the two-dimensional coordinate system of the infrared image Im2 into the three-dimensional coordinate system of the distance image Im1. The obtained conversion coefficients are stored in the RAM 123 or the HDD 124.
 Note that the number of reference points is not particularly limited. Also, instead of placing reference points in the space that becomes the overlapping region for coordinate conversion, an object already present in the space may be used as a reference point. For example, a corner of an object is designated as a reference point so that it can be easily identified in the images. However, for the reference point to appear in the infrared image, the object used as the reference point must emit infrared radiation.
 After the conversion coefficients are obtained, the coordinate conversion coefficient calculation (initial setting operation) ends. Thereafter, using these conversion coefficients, the coordinate values of an object (or the entire screen) in the two-dimensional coordinate system (X-Y) of the infrared image can be converted into the same coordinate system as the X-Y plane of the three-dimensional coordinate system of the distance image.
 In this initial setting operation, distortion of the image of the infrared camera 104 is also corrected. The infrared camera 104 uses a lens, like an ordinary camera. Consequently, a slight difference in refractive index between the edge of the lens and its optical center inevitably distorts the image. Because of such distortion, even if the same object is at the same distance, its size and position in the captured infrared image differ subtly depending on whether it appears at the periphery or at the center.
 Therefore, such lens-induced distortion in the infrared image is also corrected in the initial setting operation. As the correction itself, the image may be corrected based on, for example, the refractive index of the entire lens obtained from the lens design data. Alternatively, using an infrared image photographed so that the reference points used for the coordinate conversion described above appear at the center of the image and another photographed so that they appear at the periphery, the two infrared images may be compared and the correction derived from the differences in the positions of the reference points and the distances between them.
 Of course, the correction may be omitted by using only the distortion-free central part of the lens. In this case, for example, the infrared camera 104 is adjusted so that the range captured by the central part of the lens, where no distortion occurs, coincides with the scanning range of the LiDAR 102 (or, as the configuration of the infrared camera itself, a large-aperture lens is used so that the infrared image sensor (bolometer) receives light only from the central part of the lens, where no distortion occurs).
 This coordinate conversion coefficient calculation (initial setting operation) is performed at predetermined times, for example, when the monitoring system 100 is installed on site, during periodic maintenance, or at any time determined by the user (such as when a problem is found).
 Although orthogonal coordinate systems are used here for both the three-dimensional and two-dimensional coordinate systems, polar coordinate systems may be used instead.
 (Monitoring operation) The monitoring operation will be described. FIG. 8 is a flowchart showing the processing procedure of the monitoring operation by the control unit 120. In the following description, the current frame means the frame acquired at the current time, and the previous frame means the frame immediately preceding the current frame in time series. Because this procedure includes repeated processing, for convenience of explanation, processing that uses the result of a later step may be described first.
 Here, the scanning interval of the LiDAR 102 and the photographing interval of the infrared camera 104 are synchronized.
 First, the control unit 120 acquires a distance image for one frame at the current time from the LiDAR 102, and likewise acquires an infrared image for one frame at the current time from the infrared camera 104 (S11). Either the distance image or the infrared image may be acquired first (they may also be acquired simultaneously).
 Subsequently, the control unit 120 clusters the objects detected in the distance image using the background subtraction method (S12). As is well known, the background subtraction method compares an image registered in advance as a background image with the acquired frame image (here, the frame image acquired in S11), and if there is a portion that differs from the background image, detects that portion as a newly appearing object. As the background image, a distance image acquired by scanning the range scanned by the LiDAR 102 (the space of the first region) with no objects present may be stored. The background image is stored in, for example, the HDD 124 and read into the RAM 123 for use.
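The background subtraction on distance images described here can be sketched as follows. The array representation (2-D NumPy arrays of distance values, with directions that return no reflection stored as infinity) and the `tol` tolerance parameter are assumptions made for this illustration.

```python
import numpy as np

def detect_foreground(frame, background, tol=0.1):
    """Background subtraction (S12) sketch: return a boolean mask that
    is True where the current distance frame differs from the stored
    background distance image, i.e. where a new object has appeared."""
    fg = np.zeros(frame.shape, dtype=bool)
    finite = np.isfinite(frame)
    # Foreground where the frame now sees a finite distance but the
    # background held no reflection (infinite distance) ...
    fg[finite & ~np.isfinite(background)] = True
    # ... or where both distances are finite and differ by more than
    # the tolerance.
    both = finite & np.isfinite(background)
    fg[both] = np.abs(frame[both] - background[both]) > tol
    return fg
```

The True pixels of the returned mask would then be grouped into clusters in the subsequent step.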
 Clustering is performed so that a detected object can be tracked easily in subsequent processing, and a known method can be used. For example, clustering groups each object in the distance image into a single cluster based on the number of pixels of the detected object and the size of the object obtained from its coordinate values in the three-dimensional coordinate system (such as the object's length in the X, Y, and Z directions, its area, or its volume). For each cluster, its position, for example the coordinate value of the cluster center or the coordinate values of the cluster outline, is stored in the RAM 123.
 Subsequently, the control unit 120 performs moving object tracking on the clustered objects (S13). In moving object tracking, it is searched whether an object of the same cluster as an object clustered in the distance image of the current frame was present in the previous frame. If an object of the same cluster exists in the previous frame, the position of that object in the previous frame is compared with its position in the current frame to obtain the object's movement distance, movement direction, and speed (the speed is obtained by dividing the distance by the time between frames). The movement distance, movement direction, and speed are thus known for each object and are stored per object in the RAM 123. If an object does not exist in the previous frame but is detected in the current frame, its coordinate values (position) are stored in the RAM 123 as an object that has appeared in the current frame.
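The per-object update of S13 can be sketched as below. This is an illustrative assumption, not text from the patent: each cluster is reduced to its center coordinate, and speed is the Euclidean displacement divided by the inter-frame time, exactly as the paragraph describes.

```python
import math

# Hypothetical sketch of moving object tracking (S13): given a cluster
# center in the previous and current frames, derive movement distance,
# direction (as a displacement vector), and speed.
def track_step(prev_pos, cur_pos, frame_interval):
    dx, dy, dz = (c - p for c, p in zip(cur_pos, prev_pos))
    distance = math.sqrt(dx * dx + dy * dy + dz * dz)
    direction = (dx, dy, dz)             # movement direction
    speed = distance / frame_interval    # distance over time between frames
    return distance, direction, speed

d, vec, v = track_step((0.0, 0.0, 0.0), (3.0, 0.0, 4.0), 0.5)
```

With a 0.5 s frame interval, a 5 m displacement yields a speed of 10 m/s; these three values would be stored in the RAM 123 per object.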
 The processing that uses the movement distance, movement direction, and speed of an object will be described later. If these are not used, the processing of S13 need not be performed, and it is sufficient to simply obtain the coordinate values (position) of each object in the three-dimensional coordinate system in the current frame along with the clustering.
 Subsequently, the control unit 120 associates portions of the infrared image whose temperature is higher than their surroundings (the ground, the background, etc.) with the objects detected in the distance image (S14). As already described, an object in the infrared image and an object in the distance image can be associated with each other by coordinate transformation.
 Specifically, the control unit 120 extracts the coordinate values, in the two-dimensional coordinate system, of the pixels occupying the portions of the infrared image that are hotter than their surroundings (for example, ob1 to ob4 shown in FIG. 3). For example, if the coordinate value of one pixel in the two-dimensional coordinate system is (x1, y1), converting it using the previously obtained conversion coefficients αx and αy yields (αx × x1, αy × y1). The same conversion is applied to each of the other pixels.
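The scaling by the conversion coefficients can be sketched in a few lines. The coefficient values here are placeholders chosen for illustration; in the system they would be the αx, αy obtained during the calibration described earlier.

```python
# Hypothetical sketch of the pixel-coordinate conversion in S14: scale
# each 2D infrared-image coordinate by the conversion coefficients
# alpha_x and alpha_y, i.e. (x1, y1) -> (alpha_x * x1, alpha_y * y1).
def convert_pixels(pixels, alpha_x, alpha_y):
    return [(alpha_x * x, alpha_y * y) for x, y in pixels]

converted = convert_pixels([(10, 20), (11, 20)], 0.5, 0.5)
```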
 Such image conversion may be applied only to the pixels in the infrared image that indicate portions hotter than their surroundings, that is, pixels whose gradation value is nonzero or equal to or greater than a predetermined threshold, or it may be applied to all pixels of the infrared image.
 The control unit 120 then associates each object in the distance image with the pixels of the converted coordinate values that overlap it. At this time, if the range of coordinate values of the point cloud data representing an object in the distance image overlaps, even slightly, with the pixels of the converted coordinate values of a high-temperature portion, the high-temperature portion in the infrared image and the object in the distance image are regarded as corresponding to each other.
 This is because, in the case of a high-temperature object, the floor surface and the area around the object also become hot due to radiant heat from the object, so infrared rays may be emitted from the object's surroundings as well. In such a case, the infrared image captured by the infrared camera shows not only the high-temperature object but also its surroundings as a high-temperature portion. Conversely, in an infrared image of an object such as a person, the face appears as a high-temperature portion while the body appears as a portion cooler than the face. In these cases, the size of the high-temperature portion in the infrared image may not match the size of the point cloud of the object (the actual size of the object) in the distance image acquired from the lidar. Therefore, in the present embodiment, if a high-temperature portion in the infrared image and an object in the distance image overlap even partially, they are regarded as corresponding. There is no restriction on the overlap ratio. For example, in the case of a person, although it varies with the degree of skin exposed from clothing, the portions that appear hot (such as the face) amount to roughly 1 to 20% of the whole person, so an overlap of 1% or more is regarded as a correspondence.
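The correspondence test can be sketched as an overlap-ratio check. The 1% threshold follows the person example in the text; representing regions as pixel-coordinate sets is an assumption made here for brevity.

```python
# Hypothetical sketch of the S14 correspondence test: a hot region in the
# infrared image corresponds to a distance-image cluster if the overlap,
# measured relative to the cluster, reaches min_ratio (1% by default).
def corresponds(hot_pixels, cluster_pixels, min_ratio=0.01):
    overlap = len(set(hot_pixels) & set(cluster_pixels))
    return overlap / len(cluster_pixels) >= min_ratio

person = [(x, y) for x in range(10) for y in range(10)]  # 100-pixel cluster
face   = [(4, 0), (5, 0)]                                # 2 hot pixels (the face)
```

A 2% overlap is enough to associate the hot "face" with the "person" cluster, while a hot region elsewhere is not associated.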
 For each associated object, the associated temperature is stored in the RAM 123 as the temperature of that object. When there is a temperature distribution within a single object, for example, a person's face is at a higher temperature than the body, so the person as a whole has a temperature distribution. In such a case, the highest temperature in the distribution is preferably stored as the temperature of the object (this stored temperature is used for the image display described later).
 Subsequently, the control unit 120 causes the display 130 to display the objects, based on the infrared image, colored according to temperature and distance (S15). The display at this time is based on the infrared image (two-dimensional coordinate system), and a frame line is drawn at the portion of the infrared image corresponding to the position of each object so that the position of the object obtained from the distance image (three-dimensional coordinate system) can be recognized. The position of an object in the infrared image is obtained, for a given object in the distance image, by converting the coordinates of the X-Y plane of the three-dimensional coordinate system into coordinates of the infrared image using the coordinate transformation already described. For a clustered object, the outline of the cluster in the X-Y plane of the distance image in the three-dimensional coordinate system is extracted, and the frame line is displayed to match the coordinate values of the extracted outline.
 The frame line attached to an object is a first related-information image; attaching such a frame line makes the presence of the object easier to recognize visually. Moving objects in particular may be lost from view after the operator looks away from the screen because the object is moving, but attaching a frame line in this way makes them easier to follow. The first related-information image is not limited to a frame line; it may be, for example, an arrow or a triangular mark pointing at the object. Numerical values such as the distance to the object and its temperature may also be displayed together with the frame line.
 Furthermore, in this display, the displayed objects are colored based on position information and temperature information. The position information is information obtained from the distance image in the three-dimensional coordinate system acquired from the lidar 102. The frame line described above (the first related-information image) is one use of it; in addition, the color applied to an object is changed according to the distance from the installation position of the monitoring unit 110 (that is, the installation position of the lidar 102).
 The temperature information, on the other hand, is the temperature obtained from the infrared image in the two-dimensional coordinate system, and the color applied to an object is also changed according to this temperature information.
 As a result, objects are displayed color-coded based on position information and temperature information. Specifically, for example, colors such as blue, yellow, and red are used in order from lower to higher temperature. Furthermore, the closer the object is to the monitoring unit 110, the higher the lightness of its color (that is, the higher the gradation values of the displayed pixels), and the farther away it is, the lower the lightness (the lower the gradation values of the displayed pixels).
 FIG. 9 is an example screen showing a display example. The displayed colors are expressed as (R, G, B) gradation values, each ranging from 0 to 255. In the figure, a high-temperature object ob4 (an object that becomes a specific temperature object, described later) is displayed in red of medium lightness (150, 0, 0) because its temperature is high but it is far away. The objects ob1 to ob3 other than the specific temperature object ob4 are people, whose temperature is lower than that of the specific temperature object ob4, so they are displayed in a color close to yellow, with the lightness of those colors varying according to their respective distances. For example, the closest object ob1 is displayed in bright yellow (224, 250, 0), the object ob2 at a medium distance in yellow of medium brightness (180, 190, 0), and the farthest object ob3 in dark yellow (100, 120, 0). At this time, if the infrared image corresponding to the positions (pixels) clustered into one object has a temperature distribution within the range of that object, the same color, determined from the temperature stored as the temperature of the whole object as described above, is applied to the whole object. This makes even an object that is only partially hot, such as a person, easier to recognize on the screen. In addition, the objects ob1 to ob4 are displayed with a frame line fb indicating that they are objects, as described above, regardless of their temperature. This makes it easy to recognize them as objects and, in particular, improves the visibility of moving objects.
 Here, distance lines (0 m to 40 m) representing the distance from the installation position of the monitoring unit 110 are also shown.
 Subsequently, the control unit 120 determines whether there is a specific temperature portion in the infrared image (S16). A specific temperature portion is a portion that falls within a predetermined temperature range (including the case of being at or above a predetermined temperature). For example, when monitoring is performed so that people do not approach an object at a temperature that could harm them, and an alarm is issued when someone does approach, 50°C or higher may be set as the predetermined temperature range constituting a specific temperature portion (in this case, the upper limit may be, for example, the temperature at which each pixel of the infrared camera saturates). Of course, the temperature range used for the specific temperature portion is arbitrary, and may be determined from the temperature of the monitored object (for example, a high-temperature object that could harm people) and the temperature of the environment (indoor or outdoor, etc.).
 Here, if there is no specific temperature portion, the control unit 120 returns to S11 to acquire the next frame (S16: NO). Although details are omitted from the figure, if no object is detected in S12, the association in S14 is naturally not performed, a screen with no objects is displayed in S15, S16 then results in NO, and the process returns to S11 and continues. If the determination in S16 is NO and there is data indicating that an alarm region (described later) has been set, that data is cleared so as to indicate that no alarm region is set (this is required for the processing of S17, described later).
 If the control unit 120 detects a specific temperature portion (S16: YES), it then determines whether an alarm region already exists (S17). The presence or absence of an alarm region is determined by checking the data, stored when an alarm region is set in S19 (described later), indicating that an alarm region has been set. If an alarm region has already been set, the process proceeds to S20. The processing of S20 will be described later.
 If no alarm region exists in S17 (S17: NO), the control unit 120 identifies, among the objects in the distance image (three-dimensional coordinate system), the object corresponding to the specific temperature portion detected in S16 as the specific temperature object (S18). At this stage, the objects in the distance image have already been clustered and associated with the portions hotter than the ambient temperature, and any moving objects have been tracked (S12 to S14 described above). Therefore, in S18 it suffices to search the infrared image for the specific temperature portion and identify the coordinate values (position) of the object corresponding to it.
 However, at this time, there may be cases where the specific temperature portion cannot be associated with any object tracked in S13 or any object that newly appeared in the current frame. An example is a stationary object (an object that does not move) that was cool at the time the background image was acquired (stored) but whose temperature subsequently rose. In such a case, it is assumed that some object present in the background image (an object that cannot be detected by the background subtraction method) has begun generating heat, and the specific temperature portion is associated with a stationary object that was not detected as an object in S12 (hereinafter simply referred to as a stationary object) from the distance image of the three-dimensional coordinate system acquired in S11. At this time, the coordinate values of the stationary object are stored in the RAM 123 as the coordinate values of the specific temperature object.
 As an example of such a case, if there is a boiler (a grounded type that does not move) in the first area, the boiler naturally appears in the background image when it is acquired. Therefore, the background subtraction method cannot detect the boiler as an object in S12. On the other hand, if the boiler, which was stopped when the background image was acquired, starts operating at some point during the monitoring operation and becomes hot, it will appear in the infrared image as a specific temperature portion. In such a case, as described above, if the specific temperature portion is associated with the point cloud data representing the stationary boiler, the stationary boiler can thereafter also be recognized as a specific temperature object.
 Subsequently, the control unit 120 sets an alarm region around the specific temperature object ob4 (S19). The size of this alarm region is made variable according to temperature: the temperature of the specific temperature portion (the hottest part, if there is a temperature distribution within the specific temperature object ob4) is identified from the infrared image in which the specific temperature portion was detected, and the size of the alarm region is set according to that temperature.
 FIG. 10 is an explanatory diagram for explaining the specific temperature object. FIG. 10 shows the distance image in the three-dimensional coordinate system superimposed on the infrared image in the two-dimensional coordinate system after coordinate transformation. As shown in FIG. 10, the specific temperature object can be represented, for example, by the coordinate values (x, y, z) of the corners of its shape: (xmin, ymin, zmin), (xmax, ymin, zmin), (xmax, ymax, zmin), (xmin, ymax, zmin), (xmin, ymin, zmax), (xmax, ymin, zmax), (xmax, ymax, zmax), (xmin, ymax, zmax). Here, since the specific temperature object is a grounded object, if the origin (0) of the Y axis of the three-dimensional coordinate system is taken at the ground (floor surface), the lower end in the Y-axis direction (ymin) is 0 (zero).
 FIGS. 11 and 12 are explanatory diagrams for explaining the alarm region provided around the specific temperature object. Here, only the area around the specific temperature object in the three-dimensional coordinate system is shown. FIG. 11 shows the case where the specific temperature object is at temperature T1, and FIG. 12 the case where it is at temperature T2, where T1 < T2. The predetermined distances for setting the alarm regions correspond to these temperatures, with distances D1 < D2.
 When the temperature of the specific temperature object is T1, as shown in FIG. 11, the alarm region m1 is set around the specific temperature object ob4 so as to extend a predetermined distance D1 from the outer periphery of the specific temperature object ob4. Specifically, the alarm region m1 covers the range of distance D1 along each of the X, Y, and Z axes from the coordinate values of the outer peripheral edge of the specific temperature object ob4. Therefore, expressed in coordinate values (x, y, z), the alarm region m1 is the range bounded by (xmin-D1, ymin-D1, zmin-D1), (xmax+D1, ymin-D1, zmin-D1), (xmax+D1, ymax+D1, zmin-D1), (xmin-D1, ymax+D1, zmin-D1), (xmin-D1, ymin-D1, zmax+D1), (xmax+D1, ymin-D1, zmax+D1), (xmax+D1, ymax+D1, zmax+D1), (xmin-D1, ymax+D1, zmax+D1). However, since the specific temperature object here is a grounded object as described above, if the Y-axis origin (0) is taken at the ground, there is no need to set the alarm region in the negative direction of the Y axis. Accordingly, ymin-D1 = 0 in each of the coordinate values.
 When the temperature of the specific temperature object ob4 is T2, as shown in FIG. 12, the alarm region m2 is set around the specific temperature object ob4 so as to extend a predetermined distance D2 from the outer periphery of the specific temperature object ob4. Specifically, the alarm region m2 covers the range of distance D2 along each of the X, Y, and Z axes from the coordinate values of the outer peripheral edge of the specific temperature object ob4. Therefore, expressed in coordinate values (x, y, z), the alarm region m2 is the range bounded by (xmin-D2, ymin-D2, zmin-D2), (xmax+D2, ymin-D2, zmin-D2), (xmax+D2, ymax+D2, zmin-D2), (xmin-D2, ymax+D2, zmin-D2), (xmin-D2, ymin-D2, zmax+D2), (xmax+D2, ymin-D2, zmax+D2), (xmax+D2, ymax+D2, zmax+D2), (xmin-D2, ymax+D2, zmax+D2). However, as described above, since the specific temperature object is a grounded object, if the Y-axis origin (0) is taken at the ground, there is no need to set the alarm region in the negative direction of the Y axis. Accordingly, ymin-D2 = 0 in each of the coordinate values.
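The alarm-region construction of S19 reduces to expanding an axis-aligned bounding box, which can be sketched as follows. Representing the box by its two extreme corners is an assumption made here for compactness; the clamping of the lower Y bound to zero follows the grounded-object discussion above.

```python
# Hypothetical sketch of S19: expand the specific temperature object's
# axis-aligned bounding box by the predetermined distance d on all sides,
# clamping the lower Y bound to the ground plane (y = 0) for grounded objects.
def alarm_region(bbox, d):
    (xmin, ymin, zmin), (xmax, ymax, zmax) = bbox
    return ((xmin - d, max(ymin - d, 0.0), zmin - d),
            (xmax + d, ymax + d, zmax + d))

# object footprint 1x1.5x1 m resting on the ground, margin d = 0.5 m
region = alarm_region(((1.0, 0.0, 2.0), (2.0, 1.5, 3.0)), 0.5)
```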
 As described above, in the present embodiment, the alarm region is set so that its range is wider the higher the temperature of the specific temperature object. The relationship between temperature and predetermined distance may, for example, be stored in advance in the HDD 124 as table data of temperature versus predetermined distance and read into the RAM 123 for use. During processing, the predetermined distance is extracted by referring to the table data based on the temperature of the specific temperature portion detected from the infrared image in S19, and an alarm region separated by the extracted predetermined distance is set.
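The table lookup could take the following form. The threshold and distance values are entirely hypothetical placeholders; only the monotonic hotter-means-wider relationship comes from the text.

```python
# Hypothetical temperature-to-distance table (as the text suggests could
# be stored on the HDD 124): scanning from hottest to coolest, the first
# row whose threshold the temperature reaches gives the margin distance.
TEMP_TO_DISTANCE = [(200.0, 3.0), (100.0, 2.0), (50.0, 1.0)]  # (deg C, m)

def margin_for(temp_c):
    for threshold, dist in TEMP_TO_DISTANCE:
        if temp_c >= threshold:
            return dist
    return 0.0  # below every threshold: no alarm region is required
```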
 In S19, after the alarm region is set, data indicating that the alarm region has been set is stored in the RAM 123.
 In the above description, the case where the specific temperature object is grounded (resting on the ground, including a floor surface) has been described as an example. However, when the specific temperature object is in the air (for example, when it is being transported while suspended), the alarm region is set so as to extend below the specific temperature object as well (in the negative direction of the Y axis). In this case too, if the origin of the Y axis is taken at the ground, ymin - D2 may be set to 0 whenever it would otherwise be less than 0.
 Here, the case where the specific temperature object is a rectangular parallelepiped in the three-dimensional coordinate system has been described as an example, but the shape of the specific temperature object is not limited to a rectangular parallelepiped and may be any other shape. In that case, the alarm region may be set, according to the shape of the specific temperature object, as the range within a predetermined distance (D1, D2, etc.) from the outer periphery of the specific temperature object.
 Also, the alarm region has here been set as the range within a predetermined distance (D1, D2, etc.) from the outer peripheral edge of the specific temperature object, but instead it may, for example, be the range within a predetermined distance from the center of the specific temperature object (provided the predetermined distance is longer than the distance from the center of the specific temperature object to its outer shape). For a sphere or a nearly spherical shape, a spherical range centered on the cluster center of the clustered specific temperature object can then be set as the alarm region, which simplifies the calculation (and speeds up the processing).
 Also, for example, when the specific temperature portion has a temperature distribution, the alarm region may be the range within a predetermined distance from the position corresponding to the hottest part of the specific temperature object. This allows the alarm region to be set centered on the high-temperature part even when there is a temperature distribution within the specific temperature object.
 By thus setting the alarm region so as to be separated from the specific temperature object by a fixed distance or by a distance corresponding to its temperature, when the specific temperature object is a moving object, the alarm region can be moved along with its movement (see S20 described later).
 Besides this setting method, when it is known that the specific temperature object does not move (as with the stationary object above), or when its range of movement is known, a fixed range at a predetermined distance around the specific temperature object may accordingly be used as the alarm region. When such a fixed alarm region is set, for example, if the specific temperature object is a moving object, the predetermined distance may be made shorter in the directions in which it does not move and longer in the directions in which it moves.
 Returning to the flowchart, the description continues. After the alarm region is set, the control unit 120 updates the screen displayed on the display 130 so that the specific temperature object ob4 and the alarm region m1 (or m2) are displayed with frames, lines, or marks distinguished by color or line type (S21). The frame line indicating the alarm region here serves as a second related-information image. This makes an object in the predetermined temperature range (the specific temperature object) visually easy to identify. Therefore, even when another object (a person or a thing) approaches the specific temperature object, the distance between the specific temperature object and the other object is easy to grasp on the screen. The second related-information image is not limited to the illustrated frame line; it may be, for example, an arrow or a triangular mark pointing at the object, or a light tint applied to the entire alarm region (for example, light enough that the specific temperature object shows through the color).
 Subsequently, the control unit 120 determines whether another object (a person or some other thing) different from the specific temperature object is within the alarm region (S22). This comparison compares the coordinate values of the cluster outline of each object clustered in S12 (that is, each object detected by the background subtraction method) with the range enclosed by the coordinate values indicating the alarm region. If the coordinate values of the cluster outline of another object fall within the alarm region, it is determined that the other object is within the alarm region.
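The intrusion check of S22 can be sketched as a point-in-box test. Representing the cluster outline as a list of 3D points and the alarm region by its two extreme corners are assumptions made here; the decision rule (any outline coordinate inside the region triggers) follows the text.

```python
# Hypothetical sketch of S22: another object is inside the alarm region
# if any of its cluster-outline points falls within the region's
# axis-aligned bounds.
def in_alarm_region(outline_points, region):
    (xmin, ymin, zmin), (xmax, ymax, zmax) = region
    return any(xmin <= x <= xmax and ymin <= y <= ymax and zmin <= z <= zmax
               for x, y, z in outline_points)

region = ((0.0, 0.0, 0.0), (5.0, 3.0, 5.0))
person = [(4.5, 1.7, 2.0), (6.2, 1.7, 2.0)]   # one outline point inside
```

A `True` result here would correspond to S22: YES, leading to the alarm signal output of S23.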
 If no other object is determined to be in the alarm region (S22: NO), the control unit 120 returns to S11, acquires the images of the next frame, and continues processing.
 If, on the other hand, another object is determined to be in the alarm region (S22: YES), the control unit 120 outputs an alarm signal to the alarm device 140 (S23), and the alarm device 140 sounds an alarm on receiving the signal. The display 130 may also show the alarm visually, for example by blinking the object judged to have entered the alarm region (or the frame or mark surrounding it), changing or blinking the color of the entire screen, or displaying a warning message. In addition to the alarm device 140, various other alarm actions may be performed, such as lighting a rotating beacon, changing a color-coded stack light from blue to red, or lighting or blinking other lamps.
 Processing then returns to S12 and continues. After returning from S23 to S12, the alarm signal may be stopped when the specific temperature portion disappears (S16: NO) or when the other object leaves the alarm region (S22: NO). Alternatively, once the alarm signal has been output, the alarm may continue to sound until it is turned off, for example manually.
 The processing of S20 is as follows. At S20, the alarm region has already been set in a previous frame, and moving-object tracking (S13) has also been performed on the current frame. Therefore, if the specific temperature object is a moving object, its movement distance, direction, and speed are known. In S20, the coordinate values of the already-set alarm region are translated by the movement distance and direction of the specific temperature object. Thus, once the alarm region has been set in a previous frame, it suffices to translate the region along with the specific temperature object even while that object is moving. This is computationally simpler (and therefore faster) than setting the alarm region anew from the coordinate values of the specific temperature object in the current frame, as in S18 and S19.
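The translation in S20 amounts to shifting the stored region coordinates by the displacement obtained from tracking, rather than recomputing the region from the object outline. A minimal sketch, with assumed tuple layouts (six-value box, 3D displacement):

```python
def move_alarm_region(alarm_bbox, displacement):
    """Shift an alarm region (xmin, ymin, zmin, xmax, ymax, zmax) by the
    tracked displacement (dx, dy, dz) of the specific temperature object.

    This reuses the region set in a previous frame instead of rebuilding
    it from the object's coordinates, as described for S20.
    """
    x0, y0, z0, x1, y1, z1 = alarm_bbox
    dx, dy, dz = displacement
    return (x0 + dx, y0 + dy, z0 + dz, x1 + dx, y1 + dy, z1 + dz)

region = (0.0, 0.0, 0.0, 2.0, 2.0, 2.0)
# The hot object moved 0.5 m along x between frames:
print(move_alarm_region(region, (0.5, 0.0, 0.0)))  # → (0.5, 0.0, 0.0, 2.5, 2.0, 2.0)
```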
 After S20, the control unit 120 proceeds to S21, updates the screen to display the moved alarm region and so on, and continues with the subsequent processing.
 In this way, the monitoring operation is executed as an iterative process.
 In the description above, a predetermined fixed distance around the specific temperature object is set as the alarm region. Alternatively, the length of the predetermined distance used to set the alarm region may be varied according to the relative distance and relative speed between the specific temperature object and the other object.
 As already described, the movement direction and speed of each object are obtained at S13. For the specific temperature object they are likewise known from the moving-object tracking at S13, and even when the specific temperature object is stationary, its position is known (when a stationary object has been identified as the specific temperature object in S18).
 Based on these movement directions and speeds, when the specific temperature object and the other object are moving toward each other and their relative (approach) speed is high, the already-set alarm region is enlarged. Thus, when at least one of the specific temperature object and the other object is moving, an alarm signal can be issued promptly, taking into account not only the temperature of the specific temperature object but also the objects' speeds.
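One way to vary the alarm distance with the approach speed is to add a margin proportional to the closing speed. This is an illustrative assumption: the linear margin formula and the `reaction_time` constant are not taken from the embodiment, which leaves the exact enlargement rule open.

```python
def alarm_distance(base_distance, closing_speed, reaction_time=2.0):
    """Lengthen the alarm distance when the two objects are closing fast.

    base_distance: the predetermined distance in meters.
    closing_speed: relative approach speed in m/s (positive = approaching).
    reaction_time: desired seconds of advance warning (hypothetical constant).
    """
    # Only an approaching pair widens the region; separation leaves it as-is.
    margin = max(0.0, closing_speed) * reaction_time
    return base_distance + margin

print(alarm_distance(1.0, 0.0))  # → 1.0 (not approaching: base distance kept)
print(alarm_distance(1.0, 1.5))  # → 4.0 (fast approach widens the region)
```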
 Embodiment 1 described above provides the following effects.
 In Embodiment 1, the coordinate systems of the infrared image, a two-dimensional coordinate system in which the temperature of an object can be detected, and of the distance image of the lidar 102, which captures three-dimensional positions in space in a three-dimensional coordinate system, are aligned; a high-temperature specific temperature portion is then detected from the infrared image, while the position of the specific temperature object corresponding to that portion is identified from the distance image. Because an alarm is issued based on the positional relationship between the specific temperature object and other objects, the approach of another object to the hot specific temperature object can be reliably detected and an alarm issued to ensure safety.
 The specific temperature portion is, for example, a portion hot enough to be dangerous to a person; when the other object is, for example, a person, an alarm can be issued to prevent the person from approaching the hot object.
 Also, in Embodiment 1, an alarm region is set around the specific temperature object, so whether an object has entered the alarm region can be determined without computing the distance between the specific temperature object and the other object at every step. This reduces the time (computation time) required to judge danger.
 Because the alarm region is moved along with the specific temperature object, danger can be judged by the simple computation of whether an object has entered the alarm region even while the specific temperature object is moving.
 Also, in Embodiment 1, the size of the alarm region is changed according to the direction and speed with which an object approaches it. When an object approaches quickly, the danger is therefore signaled earlier, and keeping the object away from the specific temperature object can be achieved more reliably.
 Also, in Embodiment 1, the size of the alarm region is changed according to the temperature of the specific temperature portion, so danger can be avoided more reliably when an object approaches the alarm region.
 (Embodiment 2)
 (Configuration of the monitoring system)
 FIG. 13 is a block diagram showing the configuration of the monitoring system according to Embodiment 2. FIG. 14 is a bird's-eye view showing the arrangement of the monitoring units.
 The monitoring system 200 of Embodiment 2 includes two monitoring units: a first monitoring unit 211 and a second monitoring unit 212, arranged to monitor the same target space from different directions. The internal configuration of each is the same as in Embodiment 1; each has an infrared camera 104 and a lidar 102.
 The control unit 220 has the same configuration as in Embodiment 1, except that it controls the first monitoring unit 211 and the second monitoring unit 212 together; both monitoring units are therefore connected to the control unit 220. The other components are the same as in Embodiment 1, so their description is omitted.
 The coordinate conversion operation and the monitoring operation by the control unit 220 are performed separately for the first monitoring unit 211 and for the second monitoring unit 212, but the processing procedure is the same as in Embodiment 1, so its description is omitted.
 Embodiment 2 differs from Embodiment 1 in the display-processing stage, that is, in steps S15 and S21 of the monitoring procedure described in Embodiment 1 (including the case where an alarm is shown on the display at S23; the same applies hereinafter).
 In Embodiment 2, as shown in FIG. 14, the first monitoring unit 211 and the second monitoring unit 212 monitor the same region (space) from different directions, so even for the same object the visible portion (face) differs. In particular, the temperature detected by the infrared camera 104 can differ from face to face of an object: even for a single object, one face may be hot while another is cool (a cool face includes the case where a face of the object is covered by an infrared shield).
 For example, for the object ob5 shown in FIG. 14, the temperature of face ho is higher than that of the other face co. In this case, the infrared camera 104 of the first monitoring unit 211 images the hot face ho, while the infrared camera 104 of the second monitoring unit 212 images the cool face co. Meanwhile, each lidar 102 scans the same object ob5 from its own position and outputs a distance image.
 In such a case, displaying the object as a hot object rather than as a cool one makes the object's actual temperature easier to understand on the screen. In Embodiment 2, therefore, at the display-processing stage, the image from the monitoring unit that captures the hot face ho (the first monitoring unit 211 in FIG. 14) is shown on the display 130.
 To this end, in Embodiment 2 the control unit 220 performs processing different from Embodiment 1 at the display-processing steps S15 and S21 (S23). FIG. 15 is a subroutine flowchart showing the procedure of the display-processing stage (S15 and S21 (S23) in FIG. 8) in Embodiment 2.
 First, as in Embodiment 1, the control unit 220 carries out the processing from S11 in FIG. 8 in parallel, using the distance image and infrared image acquired from the first monitoring unit 211 and the distance image and infrared image acquired from the second monitoring unit 212.
 When processing reaches S15 or S21 (or S23), it moves to the subroutine shown in FIG. 15: the control unit 220 extracts the maximum temperature in the infrared image captured by the infrared camera 104 of the first monitoring unit 211 and takes it as the first screen temperature St1 (S31).
 Next, the control unit 220 extracts the maximum temperature in the infrared image captured by the infrared camera 104 of the second monitoring unit 212 and takes it as the second screen temperature St2 (S32). The order of S31 and S32 may be reversed (or they may be performed simultaneously).
 The control unit 220 then displays the screen with the higher maximum temperature. To do so, it determines whether St1 ≥ St2 (S33).
 If St1 ≥ St2 (S33: YES), the control unit 220 shows the image from the first monitoring unit 211 on the display 130. The displayed image uses the distance image and infrared image acquired from the first monitoring unit 211, with objects shown with colors and frames as described in Embodiment 1.
 If St1 ≥ St2 does not hold (S33: NO), the control unit 220 shows the image from the second monitoring unit 212 on the display 130, likewise using the distance image and infrared image acquired from the second monitoring unit 212 and displaying objects with colors and frames as described in Embodiment 1. The subroutine of Embodiment 2 then ends, and processing returns to the main routine (the flowcharts shown in FIGS. 8 and 9).
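The selection in S31 through S33 reduces to comparing the per-image maxima. A sketch assuming each infrared image is a 2D array of temperature values (the function name and data layout are illustrative, not from the embodiment):

```python
def select_display(ir_image_1, ir_image_2):
    """Return the index (1 or 2) of the monitoring unit whose infrared
    image contains the higher maximum temperature.

    The tie case St1 == St2 selects unit 1, matching the St1 >= St2
    branch of S33.
    """
    st1 = max(max(row) for row in ir_image_1)  # first screen temperature (S31)
    st2 = max(max(row) for row in ir_image_2)  # second screen temperature (S32)
    return 1 if st1 >= st2 else 2

hot_face = [[20.0, 85.0], [22.0, 30.0]]    # unit 1 sees the hot face ho
cool_face = [[20.0, 25.0], [21.0, 24.0]]   # unit 2 sees the cool face co
print(select_display(hot_face, cool_face))  # → 1
```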
 The stages of detecting the specific temperature object, setting the alarm region, detecting the presence of other objects in the alarm region, and alarming (S18 to S23) are likewise performed using both the distance image and infrared image acquired from the first monitoring unit 211 and those acquired from the second monitoring unit 212. Even if an alarm region is set by only one of the first monitoring unit 211 and the second monitoring unit 212, the monitoring operation for that alarm region should be executed and alarms issued accordingly.
 Embodiment 2 provides the following effect in addition to those of Embodiment 1.
 In Embodiment 2, when the monitoring operation is performed with two or more monitoring units, the image from the monitoring unit that captures the hotter portion can be displayed. In the case of a person, for example, the image that captures the face (a face is usually hotter than the back of the head) can be shown on the display 130.
 Although Embodiment 2 uses two monitoring units, more may be provided. Also, although in Embodiment 2 one control unit 220 controls both monitoring units, a control unit 220 may instead be provided for each monitoring unit, with an additional control unit (computer) dedicated solely to switching the screen display.
 Also, in Embodiment 2, the stages of detecting the specific temperature object, setting the alarm region, detecting other objects in the alarm region, and alarming (S18 to S23) may first determine, from the infrared images of the first monitoring unit 211 and the second monitoring unit 212, whether a specific temperature portion exists, and thereafter execute the processing using the distance image and infrared image of whichever unit detected the specific temperature portion.
 Also, in Embodiment 2, the image color-coded by distance and temperature may simply be provided as an image that reveals the hot face. In that case, the stages of detecting the specific temperature object, setting the alarm region, detecting other objects in the alarm region, and alarming (S18 to S23) need not be performed.
 Embodiments to which the present invention is applied have been described above, but the present invention is not limited to these embodiments.
 For example, the embodiments above describe the case where one specific temperature portion is detected; when multiple specific temperature portions are detected, the identification of specific temperature objects, the setting of alarm regions, monitoring, and alarming should be performed for each of them.
 Also, for example, in the embodiments above, a coordinate conversion coefficient for converting the two-dimensional coordinate system of the infrared image into the three-dimensional coordinate system of the distance image is obtained as an initial setting. Instead of such a coordinate conversion, the angle of view of the infrared camera may be made to match the angle of view of the distance image obtained by the lidar's scan, for example by changing the focal length (or magnification) of the infrared camera's lens. This also makes the two-dimensional coordinate system of the infrared image coincide with the coordinate system of the X-Y plane within the three-dimensional distance image.
 Also, for example, in the embodiments above, an alarm region is set around the specific temperature object; instead, after the specific temperature object is detected, the distance between the specific temperature object and the other object may be calculated at each step, and the alarm signal output by determining whether that distance is within the predetermined distance.
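This alternative, computing the distance every frame instead of maintaining an alarm region, can be sketched as a distance check between 3D positions. A simplification under stated assumptions: the document leaves the exact distance definition open, and reducing each object to a single representative point is an illustrative choice.

```python
import math

def should_alarm(hot_pos, other_pos, threshold):
    """Decide each frame whether to output the alarm signal, from the 3D
    positions of the specific temperature object and another object.

    Assumed simplification: each object is represented by one point
    (for example, a cluster centroid) in the lidar coordinate system.
    """
    dist = math.dist(hot_pos, other_pos)  # Euclidean distance in 3D
    return dist <= threshold

print(should_alarm((0, 0, 0), (1.0, 2.0, 2.0), 3.0))  # → True  (distance is 3.0)
print(should_alarm((0, 0, 0), (4.0, 0.0, 0.0), 3.0))  # → False
```

Compared with the region-membership test of the embodiments, this costs a square root per object pair per frame, which is why the document notes that the alarm-region approach reduces computation time.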
 In addition, the present invention can be modified in various ways based on the configuration described in the claims, and such modifications also fall within the scope of the present invention.
 This application is based on Japanese Patent Application No. 2018-029921 filed on February 22, 2018, the disclosure of which is incorporated herein by reference in its entirety.
 100, 200 monitoring system,
 102 lidar,
 104 infrared camera,
 110, 211, 212 monitoring unit,
 120, 220 control unit,
 130 display,
 140 alarm device,
 211 first monitoring unit,
 212 second monitoring unit.

Claims (14)

  1.  A monitoring system comprising:
     a lidar that outputs a distance image in which a distribution of distance values, obtained by scanning laser light toward a first region, is expressed in a three-dimensional coordinate system;
     an infrared camera that images a second region at least partly overlapping the first region and outputs an infrared image expressed in a two-dimensional coordinate system; and
     a control unit that acquires the distance image from the lidar, detects objects from the acquired distance image, and acquires the infrared image from the infrared camera; when the infrared image contains a specific temperature portion falling within a predetermined temperature range, identifies, among the detected objects, the object corresponding to the specific temperature portion as a specific temperature object and determines its position in the three-dimensional coordinate system; and, when the detected objects include another object different from the specific temperature object, determines whether the distance between the specific temperature object and the other object is within a predetermined distance and outputs an alarm signal when the distance between the specific temperature object and the other object is within the predetermined distance.
  2.  The monitoring system according to claim 1, wherein the control unit sets, around the specific temperature object in the three-dimensional coordinate system, the range within the predetermined distance of the specific temperature object as an alarm region, and outputs the alarm signal when the other object is in the alarm region.
  3.  The monitoring system according to claim 2, wherein, when the specific temperature object is determined to be a moving object from a plurality of the distance images acquired in time series from the lidar, the control unit moves the alarm region in accordance with the movement of the specific temperature object.
  4.  The monitoring system according to claim 2, wherein the control unit sets the alarm region fixed around the specific temperature object.
  5.  The monitoring system according to any one of claims 1 to 4, wherein the control unit obtains a relative distance and a relative speed between the specific temperature object and the other object from a plurality of the distance images acquired in time series from the lidar, and changes the length of the predetermined distance according to the obtained relative distance and relative speed between the specific temperature object and the other object.
  6.  The monitoring system according to any one of claims 1 to 5, wherein the control unit lengthens the predetermined distance as the temperature of the specific temperature portion in the infrared image captured by the infrared camera is higher.
  7.  The monitoring system according to any one of claims 1 to 6, further comprising an alarm device that receives the alarm signal and emits sound and/or light.
  8.  A control method for a monitoring system having a lidar that outputs a distance image in which a distribution of distance values, obtained by scanning laser light toward a first region, is expressed in a three-dimensional coordinate system, and an infrared camera that images a second region at least partly overlapping the first region and outputs an infrared image expressed in a two-dimensional coordinate system, the method comprising:
     (a) acquiring the distance image from the lidar and detecting objects from the acquired distance image;
     (b) acquiring the infrared image from the infrared camera and, when the infrared image contains a specific temperature portion falling within a predetermined temperature range, identifying, among the detected objects, the object corresponding to the specific temperature portion as a specific temperature object and determining its position in the three-dimensional coordinate system; and
     (c) when the detected objects include another object different from the specific temperature object, determining whether the distance between the specific temperature object and the other object is within a predetermined distance and issuing an alarm when the distance between the specific temperature object and the other object is within the predetermined distance.
  9.  The control method for a monitoring system according to claim 8, wherein, in step (c), the range within the predetermined distance of the specific temperature object is set as an alarm region around the specific temperature object in the three-dimensional coordinate system, and an alarm is issued when the other object is in the alarm region.
  10.  The control method for a monitoring system according to claim 9, wherein, when the specific temperature object is determined to be a moving object from a plurality of the distance images acquired in time series from the lidar, the alarm region is moved in accordance with the movement of the specific temperature object.
  11.  The control method for a monitoring system according to claim 9, wherein the alarm region is set fixed around the specific temperature object.
  12.  The control method for a monitoring system according to any one of claims 8 to 11, wherein a relative distance and a relative speed between the specific temperature object and the other object are obtained from a plurality of the distance images acquired in time series from the lidar, and the length of the predetermined distance is changed according to the obtained relative distance and relative speed between the specific temperature object and the other object.
  13.  The control method for a monitoring system according to any one of claims 8 to 12, wherein the magnitude of the predetermined distance is changed according to the temperature of the specific temperature portion in the image captured by the infrared camera.
  14.  The control method for a monitoring system according to any one of claims 8 to 13, wherein the alarm is given by sound and/or light.
PCT/JP2018/041401 2018-02-22 2018-11-07 Monitoring system and control method for monitoring system WO2019163211A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2019515549A JP6544501B1 (en) 2018-02-22 2018-11-07 Monitoring system and control method of monitoring system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018029921 2018-02-22
JP2018-029921 2018-02-22

Publications (1)

Publication Number Publication Date
WO2019163211A1 true WO2019163211A1 (en) 2019-08-29

Family

ID=67687040

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/041401 WO2019163211A1 (en) 2018-02-22 2018-11-07 Monitoring system and control method for monitoring system

Country Status (1)

Country Link
WO (1) WO2019163211A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021043932A (en) * 2019-12-19 2021-03-18 ニューラルポケット株式会社 Information process system, information processing device, server device, program, or method
US20210287356A1 (en) * 2020-03-10 2021-09-16 Nec Corporation Abnormal part display apparatus, abnormal part display system, abnormal part display method, and abnormal part display program
US11258987B2 (en) 2018-09-21 2022-02-22 Microsoft Technology Licensing, Llc Anti-collision and motion control systems and methods
CN116679319A (en) * 2023-07-28 2023-09-01 深圳市镭神智能系统有限公司 Multi-sensor combined tunnel early warning method, system, device and storage medium
US11815598B2 (en) 2019-06-10 2023-11-14 Microsoft Technology Licensing, Llc Anti-collision and motion monitoring, control, and alerting systems and methods

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000132763A (en) * 1998-10-22 2000-05-12 Mitsubishi Electric Corp Fire detector
JP2005114588A (en) * 2003-10-08 2005-04-28 Mitsubishi Heavy Ind Ltd Tracking device
JP2010170930A (en) * 2009-01-26 2010-08-05 Panasonic Corp Induction heating cooker
US20140192184A1 (en) * 2011-06-09 2014-07-10 Guangzhou Sat Infrared Technology Co., Ltd. Forest fire early-warning system and method based on infrared thermal imaging technology
JP2017097702A (en) * 2015-11-26 2017-06-01 株式会社日立国際八木ソリューションズ Monitor system and monitor control device of the same

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11258987B2 (en) 2018-09-21 2022-02-22 Microsoft Technology Licensing, Llc Anti-collision and motion control systems and methods
US11815598B2 (en) 2019-06-10 2023-11-14 Microsoft Technology Licensing, Llc Anti-collision and motion monitoring, control, and alerting systems and methods
JP2021043932A (en) * 2019-12-19 2021-03-18 ニューラルポケット株式会社 Information processing system, information processing device, server device, program, or method
JP7042508B2 (en) 2019-12-19 2022-03-28 ニューラルポケット株式会社 Information processing system, information processing device, server device, program, or method
US20210287356A1 (en) * 2020-03-10 2021-09-16 Nec Corporation Abnormal part display apparatus, abnormal part display system, abnormal part display method, and abnormal part display program
US11869179B2 (en) * 2020-03-10 2024-01-09 Nec Corporation Abnormal part display apparatus, abnormal part display system, abnormal part display method, and abnormal part display program
CN116679319A (en) * 2023-07-28 2023-09-01 深圳市镭神智能系统有限公司 Multi-sensor combined tunnel early warning method, system, device and storage medium
CN116679319B (en) * 2023-07-28 2023-11-10 深圳市镭神智能系统有限公司 Multi-sensor combined tunnel early warning method, system, device and storage medium

Similar Documents

Publication Publication Date Title
WO2019163212A1 (en) Monitoring system and control method for monitoring system
WO2019163211A1 (en) Monitoring system and control method for monitoring system
US10805535B2 (en) Systems and methods for multi-camera placement
Fofi et al. A comparative survey on invisible structured light
JPWO2020121973A1 (en) Learning methods for object identification systems, arithmetic processing devices, automobiles, lamps for vehicles, and classifiers
JP4985651B2 (en) Light source control device, light source control method, and light source control program
US20140307100A1 (en) Orthographic image capture system
US20060238617A1 (en) Systems and methods for night time surveillance
US20070229850A1 (en) System and method for three-dimensional image capture
US20120288145A1 (en) Environment recognition device and environment recognition method
US8855367B2 (en) Environment recognition device and environment recognition method
JP2005324297A (en) Robot
WO2015186570A1 (en) Human detection system for construction machine
CN114137511B (en) Airport runway foreign matter fusion detection method based on multi-source heterogeneous sensor
GB2586712A (en) Image processing device, image processing method, and image processing program
JP5799232B2 (en) Lighting control device
JP5955292B2 (en) Filtering device
CN102609152A (en) Large-field-angle detection image acquisition method for electronic white board and device
JP6544501B1 (en) Monitoring system and control method of monitoring system
EP3660452B1 (en) Positioning system and positioning method
EP4071578A1 (en) Light source control method for vision machine, and vision machine
JP4804202B2 (en) Stereo monitoring device
JP2000050145A (en) Automatic tracking device
KR102017949B1 (en) Apparatus and method for calibrating camera using rectangular parallelepiped with led
CN106254736B (en) Combined imaging device and its control method based on array image sensor

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2019515549

Country of ref document: JP

Kind code of ref document: A

121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 18906859

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 18906859

Country of ref document: EP

Kind code of ref document: A1