WO2019163211A1 - Monitoring system and control method for monitoring system - Google Patents

Monitoring system and control method for monitoring system

Info

Publication number
WO2019163211A1
WO2019163211A1 (application PCT/JP2018/041401)
Authority
WO
WIPO (PCT)
Prior art keywords
specific temperature
distance
image
alarm
monitoring system
Prior art date
Application number
PCT/JP2018/041401
Other languages
English (en)
Japanese (ja)
Inventor
哲 細木
晃志 生田目
Original Assignee
コニカミノルタ株式会社 (Konica Minolta, Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by コニカミノルタ株式会社 (Konica Minolta, Inc.)
Priority to JP2019515549A (patent JP6544501B1)
Publication of WO2019163211A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/87 Combinations of systems using electromagnetic waves other than radio waves
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging

Definitions

  • the present invention relates to a monitoring system and a monitoring system control method.
  • an object of the present invention is to provide a monitoring system capable of reliably capturing the positional relationship between the position of a high-temperature object in a three-dimensional space and another object, and a control method for the monitoring system.
  • a lidar that outputs a distance image in which the distribution of distance values, obtained by scanning a laser beam toward a first region, is expressed in a three-dimensional coordinate system;
  • an infrared camera that images a second region at least partly overlapping the first region and outputs an infrared image expressed in a two-dimensional coordinate system;
  • a control unit that acquires the distance image from the lidar, detects objects from the acquired distance image, and acquires the infrared image from the infrared camera;
  • when the infrared image has a specific temperature portion falling within a predetermined temperature range, the control unit identifies, among the detected objects, the object corresponding to the specific temperature portion as a specific temperature object and specifies its position in the three-dimensional coordinate system;
  • when the detected objects include another object different from the specific temperature object, the control unit determines whether or not the distance between the specific temperature object and the other object is within a predetermined distance;
  • the control unit outputs an alarm signal when the distance between the specific temperature object and the other object is within the predetermined distance. A monitoring system having the above.
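The alarm decision described above reduces to a Euclidean distance test between two positions in the three-dimensional coordinate system. A minimal Python sketch, non-authoritative; the function and parameter names are illustrative and not from the publication:

```python
import math

def should_alarm(hot_pos, other_pos, alarm_distance):
    """Return True when the other object lies within the predetermined
    distance of the specific temperature object (3-D Euclidean distance)."""
    dx, dy, dz = (a - b for a, b in zip(hot_pos, other_pos))
    return math.sqrt(dx * dx + dy * dy + dz * dz) <= alarm_distance
```

For example, an object 5 m away from a 1000 °C melt with a 5 m predetermined distance would trigger the alarm signal, while one slightly farther would not.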
  • the control unit sets the range within the predetermined distance from the specific temperature object as an alarm area around the specific temperature object in the three-dimensional coordinate system, and outputs the alarm signal when the other object is within the alarm area;
  • the monitoring system according to (1) above.
  • the control unit moves the alarm area so as to follow the movement of the specific temperature object.
  • the control unit obtains a relative distance and a relative speed between the specific temperature object and the other object from a plurality of the distance images acquired in time series from the lidar, and changes the length of the predetermined distance according to the obtained relative distance and relative speed;
  • the monitoring system according to any one of (1) to (4) above.
  • the monitoring system according to any one of (1) to (6) above, further including an alarm device that receives the alarm signal and emits sound and/or light.
  • a lidar that outputs a distance image in which the distribution of distance values, obtained by scanning a laser beam toward a first region, is expressed in a three-dimensional coordinate system;
  • an infrared camera that captures a second region at least partly overlapping the first region and outputs an infrared image expressed in a two-dimensional coordinate system; acquiring the distance image from the lidar and detecting objects from the acquired distance image;
  • acquiring the infrared image from the infrared camera and, when the infrared image has a specific temperature portion falling within a predetermined temperature range, identifying, among the detected objects, the object corresponding to the specific temperature portion as a specific temperature object;
  • a control method for a monitoring system comprising the above.
  • the range within the predetermined distance from the specific temperature object is set as an alarm area around the specific temperature object in the three-dimensional coordinate system, and an alarm is issued when the other object is within the alarm area; the control method for a monitoring system according to (8) above.
  • a relative distance and a relative speed between the specific temperature object and the other object are obtained from a plurality of the distance images acquired in time series from the lidar, and the length of the predetermined distance is changed according to the obtained relative distance and relative speed;
  • the monitoring system control method according to any one of (8) to (11) above.
  • FIG. 1 is a block diagram illustrating a configuration of a monitoring system according to the first embodiment.
  • the monitoring system 100 includes a monitoring unit 110, a control unit 120, a display 130, and an alarm device 140.
  • the monitoring unit 110 is installed at a position where objects (for example, a high-temperature object, or a person, a vehicle, or another object) can be captured.
  • a lidar 102 (LiDAR: Light Detection and Ranging) and an infrared camera 104 are attached to and integrated with the same casing.
  • the lidar 102 and the infrared camera 104 have the same optical axis direction (the Z direction (see FIGS. 2 and 3 described later) is the same for both).
  • the lidar 102 and the infrared camera 104 are arranged adjacent to each other in the Y direction (the vertical direction; see FIGS. 2 and 3 described later), with their optical axes aligned in the X direction (the lateral direction).
  • the lidar 102 scans a laser beam toward the space of the first region and measures, from the reflected light, the distance to objects existing in the scanned space.
  • the obtained distribution of distance values is also referred to as point cloud data; from it, the distance from the installation position of the lidar 102 to an object, and the size and shape of the object, can be known.
  • in this way, an image is obtained whose distance value distribution gives the distance to objects existing in the space, or an infinite distance where there is no reflected light.
  • such an image output from the lidar 102 is referred to as a distance image because it includes information on the distance to objects (it is sometimes also called a lidar image).
  • the distance image is output from the lidar 102 to the control unit 120 as an image in a three-dimensional coordinate system.
  • the infrared camera 104 captures an object in the space of the second region, and outputs the temperature distribution in the captured space as an infrared image in a two-dimensional coordinate system using monochrome shades.
  • the output value of a pixel that captures a portion with a high temperature is high, and the output value of a pixel that captures a portion with a low temperature is low.
  • the output gradation value increases as the pixel captures a portion with a higher temperature.
  • the first region scanned by the lidar 102 and the second region photographed by the infrared camera 104 overlap at least partially.
  • this overlapping region becomes the monitoring range of the monitoring system 100.
  • the distance image from the lidar 102 and the infrared image from the infrared camera 104 are used to grasp the three-dimensional spatial position and the temperature information of objects existing in the overlapping region.
  • the scanning interval for one frame of the lidar 102 and the imaging interval of the infrared camera 104 need not be completely synchronized.
  • the lidar 102 has a scanning interval of about 10 frames/second, whereas the infrared camera 104 can capture images at intervals from a fraction of a second down to a few thousandths of a second.
  • therefore, by using the infrared image captured at substantially the same time as each distance image, the three-dimensional position of an object can be grasped from the distance image and matched with the object's temperature information.
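Since the camera runs much faster than the lidar, the loose synchronization described above can be realized by pairing each lidar scan with the infrared frame captured closest in time. A hypothetical Python sketch; the function name and timestamp representation are assumptions, not from the publication:

```python
def pair_frames(lidar_times, ir_times):
    """For each lidar scan timestamp, pick the index of the infrared frame
    captured closest in time; an exactly synchronized trigger is unnecessary
    because the camera's frame interval is much shorter than the lidar's."""
    return [min(range(len(ir_times)), key=lambda i: abs(ir_times[i] - t))
            for t in lidar_times]
```

With a 10 frames/second lidar and a camera running at hundreds of frames per second, the residual timing error of this pairing is at most half a camera frame interval.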
  • the control unit 120 is a computer.
  • the control unit 120 includes a CPU (Central Processing Unit) 121, a ROM (Read Only Memory) 122, a RAM (Random Access Memory) 123, an HDD (Hard Disk Drive) 124, and the like.
  • the CPU 121 calls a program corresponding to the processing content from the HDD 124 to control the operations of the rider 102 and the infrared camera 104, and performs detection of the three-dimensional position of the object, temperature of the object, alarm operation, display of temperature information, and the like.
  • the HDD 124, together with the RAM 123, serves as a storage unit and stores the programs and data necessary for each process.
  • a nonvolatile semiconductor memory such as a flash memory may be used instead of the HDD 124.
  • the control unit 120 includes an input device 125 such as a touch panel, buttons, and a mouse, and a network interface 126 (NIF: Network Interface) for connecting an external device such as a server.
  • the monitoring system 100 includes a display 130 and an alarm device 140.
  • the display 130 can be provided separately from the control unit 120 in order to install the display 130 in a factory monitoring room, for example. Of course, it may be integrated with the control unit 120. Further, the display 130 and the alarm device 140 may be integrated depending on the monitoring environment.
  • the alarm device 140 issues an alarm by recognizable means, for example sound, or light such as a flashing or rotating lamp. Note that the monitoring system 100 may cause other processing to be performed instead of the alarm by the alarm device 140.
  • the other processing is processing for starting recording of images obtained from the infrared camera 104 and the rider 102, for example. By starting the recording, the movement of the object in the overlapping area and the display on the display 130 can be reliably recorded. Other processing includes, for example, stopping automatic machines such as robots, machine tools, and transport vehicles.
  • here, the case where the lidar 102 and the infrared camera 104 are integrated is illustrated.
  • however, the lidar 102 and the infrared camera 104 may be installed separately, each connected to the control unit 120 via a dedicated line or a network.
  • even in that case, the range scanned by the lidar 102 and the range captured by the infrared camera 104 are made to overlap.
  • the control unit 120 may be a general-purpose computer instead of a dedicated one. Conversely, the infrared camera 104, the lidar 102, and the control unit 120 may be integrated. In addition, although the control unit 120 is shown here as a form mainly composed of a CPU, a RAM, and a ROM, it may instead be configured by an integrated circuit such as an FPGA (Field-Programmable Gate Array) or an ASIC (Application Specific Integrated Circuit).
  • the operation of the monitoring system 100 will be described.
  • the operation of the monitoring system 100 is roughly divided into an initial setting operation and a monitoring operation.
  • FIG. 2 is an explanatory diagram for explaining a distance image obtained by the lidar 102 scanning the first region.
  • FIG. 3 is an explanatory diagram for explaining an infrared image obtained by photographing the second region by the infrared camera 104.
  • FIG. 4 is an explanatory diagram for explaining an example in which the distance image of FIG. 2 and the infrared image of FIG. 3 are simply superimposed. In FIG. 2 and FIG. 3, the same region (space) is captured.
  • the lidar 102 receives reflected light from the objects ob1 to ob4 (objects and people) and the ground (indoors, the floor; the same applies hereinafter) existing in the first region, while portions such as the sky return no reflected light.
  • the distance image Im1 is output in a three-dimensional coordinate system as shown in the figure.
  • This distance image Im1 is an image constituted by three-dimensional point group data, and has a three-dimensional coordinate system (X1, Y1, Z1) as shown in the figure. Therefore, the position of each point constituting the distance image Im1 is specified in a three-dimensional coordinate system including the X, Y, and Z axes.
  • the objects ob1 to ob3 are people
  • ob4 is an object that becomes a specific temperature object described later.
  • the infrared camera 104 captures infrared rays emitted from the objects ob1 to ob4 (objects and people) existing in the second region and from the ground gr, and outputs an infrared image Im2 as shown in the figure.
  • the objects ob1 to ob4 are all objects having a temperature higher than the ambient temperature (here, the temperature of the ground gr, the background, etc.).
  • the position (coordinate value) of each pixel constituting the infrared image Im2 is specified in a two-dimensional coordinate system including the X axis and the Y axis.
  • the distance image Im1 has a three-dimensional coordinate system (X1, Y1, Z1)
  • the infrared image Im2 has a two-dimensional coordinate system (X2, Y2). Since the two coordinate systems do not match as they are, simply superimposing the images displaces the objects ob1 to ob4, which are actually at the same positions, as shown in FIG. 4.
  • processing is performed to make the X axis and Y axis of the two-dimensional coordinate system (X2, Y2) correspond to the X axis and Y axis of the three-dimensional coordinate system (X1, Y1, Z1).
  • such processing is referred to herein as coordinate conversion, and the coefficients necessary for this correspondence are referred to as coordinate conversion coefficients.
  • the initial setting operation is an operation for calculating the coordinate conversion coefficient.
  • FIG. 5 is a flowchart showing a processing procedure for calculating the coordinate conversion coefficient.
  • 6 and 7 are explanatory diagrams for calculating the coordinate conversion coefficient.
  • This coordinate conversion coefficient calculation (initial setting operation) process is performed by the control unit 120 executing a program for calculating the coordinate conversion coefficient.
  • first, the control unit 120 obtains a distance image Im1 output by the lidar 102 scanning the first region, which includes the portion that becomes the overlapping region (S1).
  • at least two reference points are set in advance.
  • as each reference point, a heating element such as an incandescent bulb or a heater, or an infrared radiation source such as an infrared LED, is attached to the head (tip) of a rod.
  • the reference point is preferably a stationary object, but as already described, a reference point may be set on a moving object as long as the scanning time of the lidar 102 and the photographing time of the infrared camera 104 are matched (synchronized).
  • an example of the acquired distance image is shown in FIG. 6. As shown in the figure, the reference points P1 and P2 appear in the distance image Im1.
  • next, the control unit 120 obtains an infrared image Im2 output by the infrared camera 104 photographing the second region, which includes the portion that becomes the overlapping region (S2).
  • an example of the acquired infrared image is shown in FIG. 7.
  • in it, reference points PP1 and PP2 appear. This is because an infrared radiator is attached to the tip of each rod serving as a reference point, and these portions appear with high luminance (high gradation value) in the infrared image Im2. Note that the processing order of S1 and S2 may be reversed (or they may be simultaneous).
  • the control unit 120 calculates a conversion coefficient for matching the two-dimensional coordinate system of the obtained infrared image Im2 with the three-dimensional coordinate system of the distance image Im1 (S3).
  • the distance image Im1 and the infrared image Im2 are images obtained by scanning or photographing the same region (space). For this reason, the actual size and distance (distance between reference points) of the objects present in both images are the same. However, these images have different scales for each image due to different scanning and photographing equipment. For this reason, if both are simply overlapped (see FIG. 4), the position of the object is shifted or the size is different.
  • the two coordinate systems, which have different scales, may therefore be converted to a common scale.
  • the scales of the X axis and the Y axis of the infrared image Im2 are adjusted to the X axis and the Y axis of the distance image Im1.
  • two reference points were provided.
  • the distances in the X-axis and Y-axis directions in the images of the reference points P1 and P2 shown in the distance image Im1 are obtained.
  • the distance in the X-axis direction between P1 and P2 is Δ(P1-P2)x.
  • the distance in the Y-axis direction between P1 and P2 is Δ(P1-P2)y.
  • similarly, the distances in the X-axis and Y-axis directions in the image between the reference points PP1 and PP2 reflected in the infrared image Im2 are obtained.
  • the distance in the X-axis direction between PP1 and PP2 is Δ(PP1-PP2)x.
  • the distance in the Y-axis direction between PP1 and PP2 is Δ(PP1-PP2)y.
  • from these, the conversion coefficients of the X axis and the Y axis are obtained as αx = Δ(P1-P2)x / Δ(PP1-PP2)x and αy = Δ(P1-P2)y / Δ(PP1-PP2)y.
  • the obtained conversion coefficient is stored in the RAM 123 or HDD 124.
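The coefficient calculation of S3 can be sketched as the ratio of the reference-point distances in the two images, consistent with the per-pixel scaling conversion applied later in S14. A minimal Python sketch; function names are illustrative assumptions:

```python
def conversion_coefficients(p1, p2, pp1, pp2):
    """Per-axis scale factors mapping infrared-image 2-D coordinates onto
    the X-Y plane of the distance image, from one pair of reference points.
    p1/p2: reference points in the distance image; pp1/pp2: the same
    points as seen in the infrared image."""
    ax = (p1[0] - p2[0]) / (pp1[0] - pp2[0])
    ay = (p1[1] - p2[1]) / (pp1[1] - pp2[1])
    return ax, ay

def convert_pixel(ax, ay, x, y):
    # An infrared pixel (x1, y1) maps to (ax * x1, ay * y1).
    return ax * x, ay * y
```

Note this pure scaling assumes the two optical axes coincide; an offset term would be needed if they did not.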
  • the number of reference points is not particularly limited.
  • an object existing in the space may be used as the reference point.
  • an object corner or the like is designated as a reference point so that it can be easily distinguished in an image.
  • the reference point needs to be an infrared radiation object so that the reference point is reflected in the infrared image.
  • with this, the process of calculating the coordinate conversion coefficients ends. Thereafter, using these conversion coefficients, the coordinate values of objects in the two-dimensional (X-Y) coordinate system of the infrared image (or the entire screen) can be converted into the same coordinate system as the X-Y plane of the three-dimensional coordinate system of the distance image.
  • image distortion of the infrared camera 104 is also corrected.
  • the infrared camera 104 uses a lens like a normal camera. For this reason, the image is distorted by the slight difference in refractive index between the lens edge and the optical center of the lens. Due to such distortion, even the same object at the same distance appears with a slightly different size and position in the infrared image depending on whether it is captured at the periphery or at the center.
  • the image may be corrected based on the refractive index of the entire lens obtained from the lens design data.
  • alternatively, the same object may be photographed at the periphery and at the center of the lens, and a correction derived by comparing the two infrared images.
  • alternatively, only the central part of the lens, which is free of this distortion, may be used.
  • in that case, the infrared camera 104 is adjusted so that the range captured by the distortion-free center of the lens becomes the scanning range of the lidar 102 (or, as a configuration of the infrared camera itself, a large-aperture lens is used so that the infrared image sensor (bolometer) receives light only from the distortion-free center of the lens).
  • this coordinate conversion coefficient calculation is performed at predetermined times, for example when the monitoring system 100 is installed on site, at regular maintenance, or at any time determined by the user (such as when a defect is found).
  • although an orthogonal coordinate system is used here for the three-dimensional and two-dimensional coordinate systems, a polar coordinate system may be used.
  • FIG. 8 is a flowchart illustrating the processing procedure of the monitoring operation by the control unit 120.
  • the current frame refers to a frame acquired at the current time point
  • the previous frame refers to a frame immediately before the current frame in time series. Since this procedure includes repetitive processing, for convenience of explanation, processing using the result of processing at a later stage may be described first.
  • here, the scanning interval of the lidar 102 and the imaging interval of the infrared camera 104 are synchronized.
  • the control unit 120 acquires a distance image for one frame at the current time point from the lidar 102, and similarly acquires an infrared image for one frame at the current time point from the infrared camera 104 (S11). Note that the distance image and the infrared image may be acquired in either order (or simultaneously).
  • the control unit 120 clusters the objects detected in the distance image using the background difference method (S12).
  • the background subtraction method compares an image registered in advance as a background image with the acquired frame image (here, the frame image acquired in S11); if there is a portion that differs from the background image, that portion is detected as a newly appearing object.
  • as the background image, it is preferable to store a distance image obtained by scanning the range scanned by the lidar 102 (the space of the first region) in a state where no objects are present.
  • the background image is stored in, for example, the HDD 124 and read out to the RAM 123 for use.
  • clustering makes it easier to track a detected object in subsequent processing, and a known method can be used. For example, based on the number of pixels of the detected object and the size of the object obtained from the coordinate values in the three-dimensional coordinate system (the object's length in the X, Y, and Z directions, its area, its volume, etc.), each detected object is grouped into a cluster in the distance image. For each cluster, for example, the coordinate value of the cluster center and the coordinate values of the cluster outline are stored in the RAM 123 as its position.
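Steps S12 can be sketched as follows under simplifying assumptions: the distance image is a 2-D grid of distance values, a fixed tolerance decides what differs from the background, and 4-neighbour connected components stand in for the "known method" of clustering. All names and thresholds are illustrative, not from the publication:

```python
def detect_objects(frame, background, tol=0.5):
    """Background-difference detection followed by simple 4-neighbour
    connected-component clustering. frame/background are equally sized
    2-D grids of distance values; returns a list of clusters, each a
    list of (row, col) pixel coordinates."""
    rows, cols = len(frame), len(frame[0])
    # Pixels whose distance value differs from the background by more than tol.
    changed = {(r, c) for r in range(rows) for c in range(cols)
               if abs(frame[r][c] - background[r][c]) > tol}
    clusters = []
    while changed:
        stack = [changed.pop()]        # flood-fill one connected component
        cluster = []
        while stack:
            r, c = stack.pop()
            cluster.append((r, c))
            for nb in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if nb in changed:
                    changed.remove(nb)
                    stack.append(nb)
        clusters.append(cluster)
    return clusters
```

A production system would of course cluster the 3-D point cloud directly, but the overall flow (diff against a stored empty-scene image, then group the changed points) is the same.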
  • the control unit 120 performs moving body tracking for the clustered object (S13).
  • in moving object tracking, it is searched whether an object of the same cluster as an object clustered in the distance image of the current frame existed in the previous frame. If an object of the same cluster exists in the previous frame, the position in the previous frame and the position in the current frame are compared, and the moving distance, moving direction, and speed of the object are obtained (the speed is obtained by dividing the moving distance by the interval between frames). The moving distance, moving direction, and speed thus found are stored in the RAM 123 for each object. If an object does not exist in the previous frame but is detected in the current frame, its coordinate value (position) is stored in the RAM 123 as an object that appeared in the current frame.
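The tracking of S13 can be sketched as nearest-neighbour matching of cluster centers between consecutive frames. The `max_jump` gate and the returned fields are illustrative assumptions; the publication does not specify the matching rule:

```python
import math

def track(prev_centers, curr_centers, frame_dt, max_jump=2.0):
    """Associate each current cluster center with the nearest previous one
    (within max_jump metres) and derive moving distance and speed;
    unmatched clusters are reported as newly appeared objects."""
    results = []
    for cc in curr_centers:
        best = None
        for pc in prev_centers:
            d = math.dist(pc, cc)
            if d <= max_jump and (best is None or d < best[1]):
                best = (pc, d)
        if best is None:
            results.append({"center": cc, "new": True})
        else:
            results.append({"center": cc, "new": False,
                            "distance": best[1],
                            "speed": best[1] / frame_dt})
    return results
```

With the roughly 0.1 s frame interval mentioned earlier, a cluster that moved 1 m between frames would be reported at 10 m/s.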
  • next, the control unit 120 associates portions of the infrared image that are hotter than their surroundings (ground, background, etc.) with objects detected in the distance image (S14).
  • an object in the infrared image and an object in the distance image can be associated by the coordinate conversion.
  • the control unit 120 extracts the two-dimensional coordinate values of the pixels occupying portions of the infrared image hotter than their surroundings (for example, ob1 to ob4 shown in FIG. 3). For example, if the coordinate value of one pixel in the two-dimensional coordinate system is (x1, y1), conversion using the already obtained conversion coefficients αx, αy yields (αx × x1, αy × y1). The same conversion is performed for the other pixels.
  • this conversion may be performed only for pixels showing a higher temperature than the surroundings, that is, pixels whose gradation value is nonzero or at least a predetermined threshold; alternatively, all pixels of the infrared image may be converted.
  • the control unit 120 then matches each converted pixel with the object in the distance image that it overlaps. At this time, if the coordinate-value range of the point cloud data shown as an object in the distance image overlaps, even slightly, the converted coordinate values of the high-temperature portion, the high-temperature portion of the infrared image is regarded as corresponding to that object in the distance image.
  • the floor surface and the periphery of the object become hot due to radiant heat from the object, and infrared rays may be emitted from the periphery of the object.
  • the periphery of the high-temperature object is also shown as a high-temperature part.
  • the face appears as a part having a higher temperature
  • the body appears as a part having a lower temperature than the face.
  • for this reason, the size of the high-temperature portion in the infrared image may not match the size of the object's point cloud (the actual object size) in the distance image acquired from the lidar. Therefore, in the present embodiment, if the high-temperature portion in the infrared image and the object in the distance image overlap at least partly, they are regarded as corresponding; there is no lower limit on the overlap ratio. For example, in the case of a person, although it varies with how much skin is exposed from clothing, the portions that appear hot (such as the face) are about 1 to 20% of the whole person, so an overlap of 1% or more is regarded as a correspondence.
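The "overlap even a little" rule of S14 amounts to a non-empty intersection test between the converted hot pixels and each cluster's pixel set. A minimal sketch; the function name and pixel representation are assumptions:

```python
def associate(hot_pixels, clusters):
    """Return the indices of distance-image clusters that overlap the
    (coordinate-converted) high-temperature pixels at all; no minimum
    overlap ratio is required beyond a single shared pixel."""
    hot = set(hot_pixels)
    return [i for i, cluster in enumerate(clusters)
            if hot.intersection(cluster)]
```

So a person whose face alone reads hot still gets associated with the full person-sized cluster from the distance image.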
  • the associated temperature is stored in the RAM 123 as the temperature of the object.
  • the highest temperature in the temperature distribution may be stored as the temperature of the object (this stored temperature is used for image display described later).
  • next, based on the infrared image, the control unit 120 causes the display 130 to display each object in a color according to its temperature and distance (S15).
  • the display at this time is based on the infrared image (two-dimensional coordinate system); so that the position of the object obtained from the distance image (three-dimensional coordinate system) can be understood, a frame line is drawn on the part corresponding to that position.
  • the position of the object in the infrared image is obtained by converting the X-Y plane coordinates of the object in the distance image into infrared image coordinates by the coordinate conversion already described.
  • the outline of the cluster in the XY plane of the distance image of the three-dimensional coordinate system is extracted, and the frame is displayed according to the coordinate value of the extracted outline.
  • the frame line attached to the object is the first related information image.
  • the first related information image is not limited to a frame line, and may be, for example, an arrow or a triangle indicating an object. Also, numerical values such as the distance to the object and the temperature may be displayed together with the frame line.
  • the displayed object is colored based on position information and temperature information.
  • the position information is information obtained from a distance image of a three-dimensional coordinate system obtained from the rider 102.
  • the above-mentioned frame line (first related information image) is one of them; in addition, the color applied to the object is changed according to the distance from the installation position of the monitoring unit 110 (that is, the installation position of the lidar 102).
  • the temperature information is a temperature obtained from an infrared image of a two-dimensional coordinate system, and the color applied to the object is also changed by this temperature information.
  • the object is color-coded based on the position information and the temperature information. Specifically, for example, colors from blue through yellow to red are used from the lowest to the highest temperature. Further, the closer the object is to the monitoring unit 110, the higher the brightness of the color (that is, the higher the gradation value of the displayed pixels), and the farther it is, the lower the brightness (the lower the gradation value of the displayed pixels).
  • FIG. 9 is a screen example showing a display example.
  • the displayed colors are (R, G, B) gradation values, and each color is 0 to 255.
  • the high-temperature object ob4 (the object that becomes the specific temperature object described later) is displayed in a red of medium brightness (150, 0, 0) because its temperature is high but its distance is long. Since the objects ob1 to ob3 other than the specific temperature object ob4 are humans, whose temperature is lower than that of ob4, they are displayed in colors close to yellow, and the brightness of those colors varies with distance.
  • the closest object ob1 is bright yellow (224, 250, 0)
  • the medium object ob2 is medium bright yellow (180, 190, 0)
  • the farthest object ob3 is dark yellow (100, 120, 0).
  • the same color, based on the temperature stored as the temperature of the object, is applied to the entire object. This makes it easier to recognize on screen an object, such as a person, whose temperature is only partially high.
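The color coding illustrated for FIG. 9 (hue from temperature, brightness from distance) might be sketched as below. The temperature band boundaries and the linear brightness falloff are illustrative assumptions; only the 0 to 40 m distance range follows the distance lines described for the screen:

```python
def object_color(temp_c, distance_m, max_distance_m=40.0):
    """Pick a display color: hue from temperature (blue -> yellow -> red)
    and brightness decreasing with distance from the monitoring unit."""
    if temp_c >= 50.0:
        base = (255, 0, 0)       # red: high-temperature object
    elif temp_c >= 25.0:
        base = (255, 255, 0)     # yellow: e.g. people (assumed band)
    else:
        base = (0, 0, 255)       # blue: near-ambient (assumed band)
    scale = max(0.0, 1.0 - distance_m / max_distance_m)
    return tuple(int(c * scale) for c in base)
```

This reproduces the qualitative behaviour of the example: a hot but distant object renders as a dimmer red, while a near person renders as a bright yellow.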
  • the objects ob1 to ob4 are displayed with the frame line fb indicating that they are objects as described above, regardless of the temperature. For this reason, it becomes easy to understand that the object is an object, and in particular, the visibility of the moving object is improved.
  • a distance line (0 m to 40 m) representing the distance from the installation position of the monitoring unit 110 is also shown.
  • the control unit 120 determines whether or not there is a specific temperature portion in the infrared image (S16).
  • the specific temperature portion is a portion within a predetermined temperature range (including the case of being at or above a predetermined temperature). For example, when monitoring is performed so that a person does not come close to an object hot enough to cause harm, and an alarm is issued on approach, the predetermined temperature range defining the specific temperature portion may be 50 °C or higher (in this case, the upper limit may be, for example, the temperature at which each pixel of the infrared camera saturates).
  • the temperature range for the specific temperature portion is arbitrary, and may be determined according to the temperature of the monitored object (for example, a hot object that is harmful to humans) or the temperature of the environment (indoors, outdoors, and so on).
  • when no specific temperature portion is detected (S16: NO), the control unit 120 returns to S11 to acquire the next frame.
  • if no object exists in the current frame, the association in S14 is of course not performed, a screen in which no object exists is displayed in S15, the determination in S16 is NO, and the process returns to S11 and continues.
  • when NO is determined in S16, if there is data indicating that an alarm area (described later) has been set, that data is cleared so as to indicate that no alarm area is set (this is required for the processing of S17 described later).
  • if the control unit 120 detects a specific temperature portion (S16: YES), it next determines whether an alarm area has already been set (S17).
  • if no alarm area has been set yet, the control unit 120 selects, as the specific temperature object, the object in the distance image (three-dimensional coordinate system) that corresponds to the specific temperature portion detected in S16 (S18).
  • the objects in the distance image are already clustered and associated with a portion having a temperature higher than the ambient temperature.
  • and the moving objects are tracked (S12 to S14 described above). Therefore, in S18 it suffices to search the infrared image for the specific temperature portion and specify the coordinate values (position) of the corresponding object.
  • in some cases, however, the specific temperature portion cannot be associated with any object being tracked in S13 or any object newly appearing in the current frame.
  • this occurs, for example, with a stationary object (an object that does not move) that was at a low temperature when the background image was acquired (stored) but whose temperature subsequently increased.
  • such an object already exists in the background image and therefore cannot be detected by the background subtraction method in S13.
  • in this case, the specific temperature portion may be associated with the stationary object (hereinafter simply referred to as a stationary object) that has not been detected as an object.
  • the coordinate value of the stationary object is stored in the RAM 123 as the coordinate value of the specific temperature object.
  • for example, when there is a boiler (a grounded type that does not move) in the first area, the boiler naturally appears in the background image as well. For this reason, the background subtraction method cannot detect the boiler as an object in S12.
  • if the boiler, which was stopped when the background image was acquired, starts operating at some point during the monitoring operation and becomes hot, it appears in the infrared image as a specific temperature portion.
  • if the specific temperature portion is then associated with the point cloud data representing the boiler, which is a stationary object, the boiler can subsequently be recognized as the specific temperature object.
  • the control unit 120 sets an alarm area around the specific temperature object ob4 (S19).
  • the size of the alarm area is determined by identifying, from the infrared image in which the specific temperature portion was detected, the temperature of the specific temperature portion (in particular the hottest part when the specific temperature object ob4 has a temperature distribution), and the size of the alarm area is varied according to that temperature.
  • FIG. 10 is an explanatory diagram for explaining the specific temperature object.
  • FIG. 10 shows the distance image in the three-dimensional coordinate system superimposed on the infrared image in the two-dimensional coordinate system after coordinate conversion.
  • the specific temperature object is represented by the eight vertices (xmin, ymin, zmin), (xmax, ymin, zmin), (xmax, ymax, zmin), (xmin, ymax, zmin), (xmin, ymin, zmax), (xmax, ymin, zmax), (xmax, ymax, zmax), (xmin, ymax, zmax).
  • since the specific temperature object is a grounded object, if the origin (0) of the Y axis of the three-dimensional coordinate system is taken as the ground (floor surface), the lower end (ymin) in the Y-axis direction is 0 (zero).
  • FIGS. 11 and 12 are explanatory diagrams illustrating the alarm area provided around the specific temperature object.
  • FIG. 11 shows the case where the temperature of the specific temperature object is T1,
  • and FIG. 12 shows the case where it is T2,
  • where T1 < T2.
  • the predetermined distance used to set the alarm area is made to correspond to the temperature, so that D1 < D2.
  • in FIG. 11, the alarm area m1 is set around the specific temperature object ob4 so as to extend within the predetermined distance D1 from its outer periphery. Specifically, the alarm area m1 is obtained by extending the range by the distance D1 along each of the X, Y, and Z axes from the coordinate values of the outer peripheral edge of the specific temperature object ob4.
  • that is, the alarm area m1 is represented by the coordinate values (x, y, z) of the eight vertices (xmin - D1, ymin - D1, zmin - D1), (xmax + D1, ymin - D1, zmin - D1), (xmax + D1, ymax + D1, zmin - D1), (xmin - D1, ymax + D1, zmin - D1), (xmin - D1, ymin - D1, zmax + D1), (xmax + D1, ymin - D1, zmax + D1), (xmax + D1, ymax + D1, zmax + D1), (xmin - D1, ymax + D1, zmax + D1).
  • similarly, in FIG. 12, the alarm area m2 is set around the specific temperature object ob4 so as to extend within the predetermined distance D2 from its outer periphery.
  • that is, the alarm area m2 extends by the distance D2 along each of the X, Y, and Z axes from the coordinate values of the outer peripheral edge of the specific temperature object ob4.
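The box-shaped alarm area construction described above (expanding the object's outer periphery by the predetermined distance D along each axis) can be sketched as follows. The function name and the (min, max)-corner representation are illustrative conveniences, not from the patent, which enumerates the eight vertices explicitly:

```python
def alarm_area(obj_min, obj_max, d):
    """Expand an object's axis-aligned bounding box by distance d on every side.

    obj_min, obj_max: (x, y, z) corners of the specific temperature object's box.
    Returns the (min, max) corners of the alarm area; the eight vertices of the
    patent's description are just the combinations of these coordinates.
    """
    lo = tuple(v - d for v in obj_min)  # xmin - D, ymin - D, zmin - D
    hi = tuple(v + d for v in obj_max)  # xmax + D, ymax + D, zmax + D
    return lo, hi
```

For a grounded object with ymin = 0, the lower Y bound could additionally be clamped to the floor, but that refinement is not spelled out in the text.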
  • thus, the higher the temperature of the specific temperature object, the wider the alarm area that is set.
  • Such a relationship between the temperature and the predetermined distance may be stored in advance in the HDD 124 as table data of the temperature versus the predetermined distance and read out to the RAM 123 for use.
  • in S19, a predetermined distance is extracted by referring to this table data using the temperature of the specific temperature portion detected from the infrared image, and the alarm area separated by the extracted predetermined distance is set.
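The temperature-versus-predetermined-distance table lookup might look like the sketch below. The specific thresholds and distances are invented for illustration; the patent says only that such table data is stored in advance (e.g. on the HDD 124) and read into the RAM 123 for use:

```python
# Hypothetical table: (temperature lower bound in deg C, predetermined distance in m).
# Entries must be sorted by ascending temperature for the scan below.
TEMP_TO_DISTANCE_M = [
    (50.0, 1.0),
    (100.0, 2.0),
    (200.0, 4.0),
]

def predetermined_distance(temp_c: float) -> float:
    """Return the alarm-area distance for a detected specific temperature portion."""
    distance = 0.0
    for lower_bound, d in TEMP_TO_DISTANCE_M:
        if temp_c >= lower_bound:
            distance = d  # higher temperature -> wider alarm area
    return distance
```

A temperature below the lowest bound yields 0, i.e. no alarm margin, which corresponds to the case where no specific temperature portion exists.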
  • in the above description, the specific temperature object is treated as a rectangular parallelepiped in the three-dimensional coordinate system.
  • however, the shape of the specific temperature object is not limited to a rectangular parallelepiped and may be another shape.
  • in that case as well, the alarm area may be set as the range within a predetermined distance (D1, D2, etc.) from the outer periphery of the specific temperature object, according to its shape.
  • in the above description, the alarm area is set as the range within a predetermined distance (D1, D2, etc.) from the outer peripheral edge of the specific temperature object. Instead, it may be set, for example, as the range within a predetermined distance from the center of the specific temperature object (provided that this predetermined distance is longer than the distance from the center of the specific temperature object to its outer shape).
  • for example, a sphere of that radius centered on the cluster center of the clustered specific temperature object may be set as the alarm area, which simplifies the calculation (speeds up the processing).
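A sphere-shaped alarm area reduces the in-area test to a single center-to-point distance comparison, which is what makes the calculation fast. A minimal sketch (the function name is assumed):

```python
import math

def in_spherical_alarm_area(center, radius, point):
    """True if a 3-D point lies within a spherical alarm area.

    center: (x, y, z) cluster center of the specific temperature object.
    radius: predetermined distance from the center (must exceed the distance
            from the center to the object's outer shape, as noted above).
    """
    return math.dist(center, point) <= radius
```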
  • alternatively, the range within a predetermined distance from the position corresponding to the highest-temperature portion of the specific temperature object may be used. In this way, even when the specific temperature object has a temperature distribution, the alarm area can be set around its hottest portion.
  • when the specific temperature object is a moving object, the alarm area can be moved in accordance with its movement (see S20 described later).
  • in addition to such setting methods, when it is known that the specific temperature object does not move (the case of a stationary object), or when its moving range is known, a fixed range within a predetermined distance around the specific temperature object may be used as the alarm area. When such a fixed alarm area is set for a specific temperature object that is a moving object, for example, the predetermined distance may be made short in directions in which it does not move and long in its direction of movement.
  • subsequently, the control unit 120 causes the display 130 to display the specific temperature object ob4 and the alarm area m1 (or m2) with frames, lines, or marks distinguished by color or line type, and updates the screen (S21).
  • here, the frame line indicating the alarm area serves as the second related information image.
  • the second related information image is not limited to the illustrated frame line; it may be, for example, an arrow or triangle pointing to the object, or a light tint over the entire alarm area (pale enough that the color of the specific temperature object shows through).
  • next, the control unit 120 determines whether another object (such as a person or some other object) different from the specific temperature object is within the alarm area (S22). This determination compares the coordinate values of the outer shape of each cluster obtained in S12 (that is, each object detected by the background subtraction method) with the coordinate values representing the alarm area. If the coordinate values of the outer shape of another object's cluster fall within the alarm area, it is determined that the other object is within the alarm area.
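The comparison in S22 can be sketched as an axis-aligned box intersection test. The patent describes comparing the cluster's outer-shape coordinates with the alarm area's coordinates; the helper below is one hedged way to express that, with both regions represented by their (min, max) corners:

```python
def box_overlaps(a_min, a_max, b_min, b_max):
    """True if two axis-aligned 3-D boxes intersect.

    a_min/a_max: corners of the alarm area.
    b_min/b_max: corners of another object's cluster bounding box.
    The boxes intersect exactly when their extents overlap on all three axes.
    """
    return all(a_min[i] <= b_max[i] and b_min[i] <= a_max[i] for i in range(3))
```

Because no point-to-point distance is computed, this check stays cheap per frame, matching the processing-time advantage claimed for the alarm-area approach.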
  • if no other object is within the alarm area, the control unit 120 returns to S11, acquires each image of the next frame, and continues the subsequent processing.
  • if another object is within the alarm area, the control unit 120 outputs an alarm signal to the alarm device 140 (S23). The alarm device 140 that receives the alarm signal then emits an alarm sound.
  • at this time, the display 130 may blink the object (or the frame or mark surrounding it) that is determined to be in the alarm area, change the color of the entire screen, blink the screen, or display a warning text, so that the alarm is also issued visually.
  • in addition, various other alarm operations may be performed, such as turning on a rotating warning lamp, changing a color-coded stacked indicator lamp from blue to red, or lighting and blinking other lamps.
  • in S20, the alarm area has already been set up to the previous frame.
  • from the moving object tracking in S13, if the specific temperature object is a moving object, its movement distance, direction, and speed are known. Therefore, in S20, the already-set coordinate values of the alarm area are moved using the movement distance and direction of the specific temperature object.
  • because the alarm area has already been set up to the previous frame, even if the specific temperature object is a moving object it is only necessary to move the alarm area in accordance with that movement. This is computationally simpler (and therefore faster) than setting the alarm area anew from the coordinate values of the specific temperature object in the current frame, as in S18 to S19.
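Translating the already-set alarm area by the tracked displacement of the specific temperature object, as in S20, is a simple vector addition. A sketch with assumed names, again representing the area by its (min, max) corners:

```python
def move_alarm_area(area_min, area_max, displacement):
    """Translate an alarm area by the specific temperature object's per-frame
    displacement (dx, dy, dz) obtained from the moving object tracking."""
    lo = tuple(v + d for v, d in zip(area_min, displacement))
    hi = tuple(v + d for v, d in zip(area_max, displacement))
    return lo, hi
```

Six additions per frame replace the full re-derivation of the area from the object's vertices, which is the speed-up the text describes.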
  • the control unit 120 then proceeds to S21, updates the screen so as to display the moved alarm area and the like, and continues the subsequent processing.
  • the monitoring operation is executed as a repeated process.
  • a predetermined distance range around the specific temperature object is set as the alarm area.
  • the length of the predetermined distance for setting the alarm region may be changed according to the relative distance and relative speed between the specific temperature object and another object.
  • the moving direction and speed of the object are obtained in the step S13 as already described.
  • the moving direction and speed of the specific temperature object are likewise known from the moving object tracking in S13. Even when the specific temperature object is a stationary object, its position is known (when the stationary object has been identified as the specific temperature object in S18).
  • if, from these moving directions and speeds, the specific temperature object and the other object are moving so as to approach each other, and the relative speed (approach speed) is high, the already-set alarm area is widened. As a result, when at least one of the specific temperature object and the other object is a moving object, an alarm signal can be issued quickly, taking into account not only the temperature of the specific temperature object but also its moving speed.
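One possible rule for widening the alarm distance with approach speed is to add roughly the distance the objects would close during a fixed reaction margin. The patent states only that the predetermined distance may be changed according to relative distance and speed; the proportional rule, the function name, and the 1-second margin below are all assumptions:

```python
def widen_for_closing_speed(base_distance_m: float,
                            closing_speed_mps: float,
                            margin_s: float = 1.0) -> float:
    """Return a widened alarm distance.

    closing_speed_mps: relative approach speed of the specific temperature
        object and the other object (positive when they are approaching).
    margin_s: assumed reaction-time margin; the extra width is the distance
        closed during that margin.
    """
    if closing_speed_mps <= 0.0:
        return base_distance_m  # not approaching: keep the temperature-based distance
    return base_distance_m + closing_speed_mps * margin_s
```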
  • as described above, in the first embodiment, the coordinate system of the two-dimensional infrared image, which can detect the temperature of an object, is combined with that of the distance image of the lidar 102, which captures three-dimensional positions in space as a three-dimensional coordinate system. A high-temperature specific temperature portion is detected from the infrared image, while the position of the specific temperature object corresponding to it is specified from the distance image. Since an alarm is issued based on the positional relationship between the specific temperature object and other objects, close proximity between the high-temperature specific temperature object and another object can be grasped and an alarm issued to ensure safety.
  • when the specific temperature portion is, for example, a portion hot enough to be dangerous to a person, and the other object is a person, an alarm can be issued to prevent the person and the high-temperature object from coming close to each other.
  • since the alarm area is set around the specific temperature object, it can be determined whether an object has entered the alarm area without calculating the distance between the specific temperature object and each other object. This reduces the time (calculation processing time) required for risk determination.
  • since the alarm area is moved in accordance with the movement of the specific temperature object, even when the specific temperature object is moving, the risk can be judged by the simple calculation of whether an object has entered the alarm area.
  • since the size of the alarm area is changed based on the direction and speed at which an object approaches it, a fast-approaching object triggers an early danger notification, making it possible to more reliably prevent the object from approaching the specific temperature object.
  • the size of the alarm area is changed according to the temperature of the specific temperature portion, it is possible to more reliably avoid the danger when the object approaches the alarm area.
  • FIG. 13 is a block diagram illustrating a configuration of the monitoring system according to the second embodiment.
  • FIG. 14 is a bird's eye view showing the arrangement of the monitoring units.
  • the monitoring system 200 includes two monitoring units.
  • the two monitoring units are a first monitoring unit 211 and a second monitoring unit 212.
  • the first monitoring unit 211 and the second monitoring unit 212 are arranged to monitor the same monitoring target space from different directions.
  • the internal configurations of the first monitoring unit 211 and the second monitoring unit 212 are the same as those in the first embodiment, and each includes the infrared camera 104 and the lidar 102.
  • the control unit 220 has the same configuration as in the first embodiment, except that it controls both the first monitoring unit 211 and the second monitoring unit 212, which are therefore connected to it. Since the other configurations are the same as in the first embodiment, their description is omitted.
  • the coordinate conversion operation and the monitoring operation by the control unit 220 are performed for each of the first monitoring unit 211 and the second monitoring unit 212, but the processing procedure is the same as that of the first embodiment, and thus the description thereof is omitted.
  • however, the display processing stage, which includes the processes of S15 and S21, differs from the first embodiment.
  • the first monitoring unit 211 and the second monitoring unit 212 monitor the same area (space) from different directions. For this reason, even if it is the same object, the visible part (surface) differs.
  • the temperature detected by the infrared camera 104 may differ depending on which surface of the object is visible. In other words, a single object may have a high temperature on one surface and a low temperature on another (a low-temperature surface also includes a surface where the object is covered by an infrared-shielding object).
  • for example, in the object ob5 shown in FIG. 14, the temperature of one surface ho is higher than that of the other surface co.
  • the infrared camera 104 of the first monitoring unit 211 captures the high-temperature surface ho
  • the infrared camera 104 of the second monitoring unit 212 captures the low-temperature surface co.
  • meanwhile, each lidar 102 scans the same object ob5 from its own position and outputs a distance image.
  • in the second embodiment, at the display processing stage, the image from the monitoring unit capturing the higher-temperature surface ho (the first monitoring unit 211 in FIG. 14) is displayed on the display 130.
  • FIG. 15 is a subroutine flowchart showing the procedure of the display processing stage (S15 and S21 (S23) in FIG. 8) in the second embodiment.
  • the control unit 220 executes the processes from S11 shown in FIG. 8 in parallel, using the distance image and infrared image acquired from the first monitoring unit 211 and the distance image and infrared image acquired from the second monitoring unit 212.
  • in the display processing stage, the control unit 220 extracts the maximum temperature in the infrared image captured by the infrared camera 104 of the first monitoring unit 211 and sets it as the first screen temperature St1 (S31). It likewise extracts the maximum temperature in the infrared image captured by the infrared camera 104 of the second monitoring unit 212 and sets it as the second screen temperature St2 (S32). Note that the order of the processing of S31 and S32 may be reversed (or they may be simultaneous).
  • next, in order to display the screen with the higher maximum temperature, the control unit 220 determines whether or not St1 ≥ St2 (S33).
  • if St1 ≥ St2, the control unit 220 causes the display 130 to display the image on the first monitoring unit 211 side.
  • the displayed image uses the distance image acquired from the first monitoring unit 211 and the infrared image to display the object with a color or a frame as described in the first embodiment.
  • otherwise, the control unit 220 causes the display 130 to display the image on the second monitoring unit 212 side.
  • the displayed image uses the distance image acquired from the second monitoring unit 212 and the infrared image to display the object with a color or a frame as described in the first embodiment.
  • the subroutine in the second embodiment is completed, and the process returns to the main routine (the flowcharts shown in FIGS. 8 and 9).
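The display-selection logic above (computing St1 and St2 and comparing them) condenses to comparing per-image maximum temperatures. The sketch below treats each infrared image as a flat iterable of per-pixel temperatures, a simplification of real image data; the function name and the tie-breaking toward the first unit are assumptions:

```python
def select_display_unit(infrared_image_1, infrared_image_2) -> int:
    """Return 1 or 2: which monitoring unit's image to display."""
    st1 = max(infrared_image_1)    # first screen temperature St1 (S31)
    st2 = max(infrared_image_2)    # second screen temperature St2 (S32)
    return 1 if st1 >= st2 else 2  # display the view capturing the hotter surface
```

For the face example in the text, the unit viewing the face (hotter than the back of the head) would win the comparison and provide the displayed image.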
  • the detection of the specific temperature object, the setting of the alarm area, the detection of the presence or absence of other objects in the alarm area, and the alarm stages (S18 to S23) are likewise executed for each of the distance image and infrared image acquired from the first monitoring unit 211 and the distance image and infrared image acquired from the second monitoring unit 212. In this case, even if only one of the first monitoring unit 211 and the second monitoring unit 212 has set an alarm area, it is sufficient to execute the monitoring operation for that alarm area and issue an alarm or the like.
  • according to the second embodiment, when the monitoring operation is performed using two or more monitoring units, the image from the monitoring unit capturing the higher-temperature part can be displayed. For example, in the case of a person, an image capturing the direction the face is pointing can be displayed on the display 130 (usually, the temperature of the face is higher than that of the back of the head).
  • two monitoring units are used. However, more monitoring units may be provided.
  • in the second embodiment, the two monitoring units are controlled by one control unit 220.
  • instead, a control unit 220 may be provided for each of the two monitoring units, and a further control unit (computer) dedicated solely to switching the screen display may also be provided.
  • as for the steps of detecting the specific temperature object, setting the alarm area, detecting the presence or absence of other objects in the alarm area, and alarming, it may first be determined from each infrared image of the first monitoring unit 211 and the second monitoring unit 212 whether a specific temperature portion exists, and thereafter the process may be executed using the distance image and infrared image in which the specific temperature portion was detected.
  • alternatively, the image displayed with color coding based on distance and temperature may simply be provided as an image showing the higher-temperature surface. In this case, the detection of the specific temperature object, the setting of the alarm area, the detection of the presence or absence of other objects in the alarm area, and the alarm steps (S18 to S23) need not be performed.
  • the coordinate conversion coefficient for converting the two-dimensional coordinate system of the infrared image into the three-dimensional coordinate system of the distance image is obtained as an initial setting.
  • instead, the angle of view of the infrared camera may be made the same as that of the distance image obtained by the lidar's scanning, by changing the focal length (or magnification) of the infrared camera's lens. In this way as well, the two-dimensional coordinate system of the infrared image and the XY-plane coordinate system of the three-dimensional distance image can be made to coincide.
  • in the embodiments described above, the alarm area is set around the specific temperature object. Instead, after the specific temperature object is detected, the distance between the specific temperature object and each other object may be calculated one by one, and an alarm signal may be output when that distance is equal to or less than a predetermined distance.

Abstract

The problem addressed by the present invention is to provide a monitoring system that can reliably obtain the positional relationship between the position of a high-temperature object in three-dimensional space and another object, and issue an alarm. To this end, the invention relates to a monitoring system 100 comprising: a lidar 102 that scans a first region and outputs a distance image in a three-dimensional coordinate system; an infrared camera 104 that photographs a second region overlapping the first region and outputs an infrared image in a two-dimensional coordinate system; a control unit that acquires the distance image and the infrared image and, when there is a specific temperature section in the infrared image, specifies the three-dimensional coordinate position of a specific temperature object corresponding to the specific temperature section, and, when another object has been detected in the distance image, outputs an alarm signal when the distance between the specific temperature object and the other object is within a prescribed distance; and an alarm device that receives the alarm signal from the control unit 120 and issues an alarm.
PCT/JP2018/041401 2018-02-22 2018-11-07 Système de surveillance et procédé de commande pour système de surveillance WO2019163211A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2019515549A JP6544501B1 (ja) 2018-02-22 2018-11-07 監視システムおよび監視システムの制御方法

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-029921 2018-02-22
JP2018029921 2018-02-22

Publications (1)

Publication Number Publication Date
WO2019163211A1 true WO2019163211A1 (fr) 2019-08-29

Family

ID=67687040

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/041401 WO2019163211A1 (fr) 2018-02-22 2018-11-07 Système de surveillance et procédé de commande pour système de surveillance

Country Status (1)

Country Link
WO (1) WO2019163211A1 (fr)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000132763A (ja) * 1998-10-22 2000-05-12 Mitsubishi Electric Corp 火気検知装置
JP2005114588A (ja) * 2003-10-08 2005-04-28 Mitsubishi Heavy Ind Ltd 追尾装置
JP2010170930A (ja) * 2009-01-26 2010-08-05 Panasonic Corp 誘導加熱調理器
US20140192184A1 (en) * 2011-06-09 2014-07-10 Guangzhou Sat Infrared Technology Co., Ltd. Forest fire early-warning system and method based on infrared thermal imaging technology
JP2017097702A (ja) * 2015-11-26 2017-06-01 株式会社日立国際八木ソリューションズ 監視システムとその監視制御装置


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11258987B2 (en) 2018-09-21 2022-02-22 Microsoft Technology Licensing, Llc Anti-collision and motion control systems and methods
US11815598B2 (en) 2019-06-10 2023-11-14 Microsoft Technology Licensing, Llc Anti-collision and motion monitoring, control, and alerting systems and methods
JP2021043932A (ja) * 2019-12-19 2021-03-18 ニューラルポケット株式会社 情報処理システム、情報処理装置、サーバ装置、プログラム、又は方法
JP7042508B2 (ja) 2019-12-19 2022-03-28 ニューラルポケット株式会社 情報処理システム、情報処理装置、サーバ装置、プログラム、又は方法
US20210287356A1 (en) * 2020-03-10 2021-09-16 Nec Corporation Abnormal part display apparatus, abnormal part display system, abnormal part display method, and abnormal part display program
US11869179B2 (en) * 2020-03-10 2024-01-09 Nec Corporation Abnormal part display apparatus, abnormal part display system, abnormal part display method, and abnormal part display program
CN116679319A (zh) * 2023-07-28 2023-09-01 深圳市镭神智能系统有限公司 多传感器联合的隧道预警方法,系统,装置和存储介质
CN116679319B (zh) * 2023-07-28 2023-11-10 深圳市镭神智能系统有限公司 多传感器联合的隧道预警方法,系统,装置和存储介质

Similar Documents

Publication Publication Date Title
WO2019163212A1 (fr) Système de surveillance et procédé de commande de système de surveillance
WO2019163211A1 (fr) Système de surveillance et procédé de commande pour système de surveillance
US12010431B2 (en) Systems and methods for multi-camera placement
Fofi et al. A comparative survey on invisible structured light
JP4985651B2 (ja) 光源制御装置、光源制御方法および光源制御プログラム
US20060238617A1 (en) Systems and methods for night time surveillance
US20070229850A1 (en) System and method for three-dimensional image capture
US20120288145A1 (en) Environment recognition device and environment recognition method
US8855367B2 (en) Environment recognition device and environment recognition method
KR102436730B1 (ko) 가상 스크린의 파라미터 추정 방법 및 장치
JP2005324297A (ja) ロボット
WO2015186570A1 (fr) Système de détection de personne pour engin de chantier
CN114137511B (zh) 一种基于多源异构传感器的机场跑道异物融合探测方法
GB2586712A (en) Image processing device, image processing method, and image processing program
JP5799232B2 (ja) 照明制御装置
JP5955292B2 (ja) フィルタリング装置
CN102609152A (zh) 大视场角图像检测电子白板图像采集方法及装置
JP6544501B1 (ja) 監視システムおよび監視システムの制御方法
EP3660452B1 (fr) Système de positionnement et procédé de positionnement
EP4071578A1 (fr) Procédé de commande de source de lumière de machine de visualisation, et machine de visualisation
JP4804202B2 (ja) ステレオ式監視装置
JP2000050145A (ja) 自動追尾装置
KR102017949B1 (ko) 발광 다이오드가 설치된 직육면체를 이용한 카메라의 캘리브레이션 장치 및 방법
CN106254736B (zh) 基于面阵图像传感器的组合成像装置及其控制方法
KR102373572B1 (ko) 서라운드 뷰 모니터링 시스템 및 그 방법

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2019515549

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18906859

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18906859

Country of ref document: EP

Kind code of ref document: A1