WO2023079593A1 - Stain determination device, stain determination method, and stain determination program - Google Patents


Publication number
WO2023079593A1
Authority
WO
WIPO (PCT)
Prior art keywords
contamination
degree
dirt
determination
threshold
Prior art date
Application number
PCT/JP2021/040401
Other languages
French (fr)
Japanese (ja)
Inventor
健太 久瀬
恭平 濱田
拓弥 橋口
猛 大佐賀
淳志 堀
Original Assignee
三菱電機ビルソリューションズ株式会社
三菱電機株式会社
Priority date
Filing date
Publication date
Application filed by 三菱電機ビルソリューションズ株式会社 and 三菱電機株式会社
Priority to PCT/JP2021/040401 priority Critical patent/WO2023079593A1/en
Priority to JP2023541612A priority patent/JP7455284B2/en
Publication of WO2023079593A1 publication Critical patent/WO2023079593A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present disclosure relates to a dirt determination device, a dirt determination method, and a dirt determination program.
  • Patent Literature 1 discloses a detection device that uses an identification model to determine whether photographed data captured by a surveillance camera is defective photographed data containing defects. This identification model extracts feature amounts from photographed data and detects defective photographed data based on the extracted feature amounts. Defective photographed data is photographed data that appears to have been captured while the camera lens was dirty.
  • An object of the present invention is to provide a contamination determination device, a contamination determination method, and a contamination determination program.
  • a contamination determination device includes an acquisition unit, an extraction unit, an estimation unit, and a determination unit.
  • the acquisition unit periodically acquires image data captured by the surveillance camera.
  • the extraction unit extracts the feature amount of the acquired photographed data.
  • the estimation unit inputs information based on the extracted feature amount to the learned model and outputs the degree of contamination of the lens of the surveillance camera.
  • the determination unit determines that the lens is in a dirty state, that is, soiled to a certain degree, when the degree of dirt becomes equal to or greater than a threshold and does not fall below the threshold even after a predetermined time has elapsed since the degree of dirt reached the threshold.
  • a dirt determination method includes the steps of: periodically acquiring photographed data captured by a surveillance camera; extracting a feature amount of the acquired photographed data; inputting information based on the extracted feature amount into a learned model and outputting the degree of contamination of the lens of the surveillance camera; and determining that the lens is in a dirty state when the degree of contamination becomes equal to or greater than a threshold and does not fall below the threshold even after a predetermined time has elapsed.
  • a dirt determination program causes a computer to execute the steps of: periodically acquiring photographed data captured by a surveillance camera; extracting a feature amount of the acquired photographed data; inputting information based on the extracted feature amount into a learned model and outputting the degree of contamination of the lens of the surveillance camera; and determining that the lens is in a dirty state when the degree of contamination becomes equal to or greater than a threshold and does not fall below the threshold even after a predetermined time has elapsed.
  • FIG. 12 is a diagram for explaining the relationship between an image of the interior of the car captured in the first process and the degree of contamination.
  • FIG. 13 is a diagram for explaining the relationship between an image of the interior of the car captured in the second process and the degree of contamination.
  • FIG. 14 is a flowchart of the second process performed by the dirt determination device.
  • FIG. 15 is a diagram for explaining the relationship between an image of the interior of the car and the degree of contamination.
  • FIG. 1 is a diagram showing an example of the hardware configuration of the contamination determination system 1.
  • the dirt determination system 1 includes a video recording device 100, an elevator car 401, a monitoring camera 400 installed in the car 401, a dirt determination device 200, and a terminal 300.
  • Video recording device 100 can communicate with surveillance camera 400 and contamination determination device 200 .
  • Terminal 300 can communicate with contamination determination device 200 .
  • a monitoring camera 400 is a camera for monitoring the inside of an elevator car 401 installed in a building.
  • Surveillance camera 400 generates, for example, 30 frame images per second.
  • the continuously generated frame images are played back in succession and thereby reproduced as a moving image.
  • the photographed data in this embodiment corresponds to a frame image.
  • the video recording device 100 is a device that records photographed data (frame images) photographed by the monitoring camera 400 and determines the state of dirt on the lens of the monitoring camera 400 .
  • a monitor camera 400 is installed for each car.
  • the video recording device 100 determines the dirt condition of the lens of the monitoring camera 400 for each car.
  • the monitoring camera 400 is not limited to being installed in the elevator car 401, and may be installed to monitor elevator halls, escalators, and other building equipment.
  • the video recording device 100 includes a CPU (Central Processing Unit) 111, a ROM (Read Only Memory) 112, a RAM (Random Access Memory) 113, a storage unit 114, a communication interface 115, and an I/O interface 116. These are communicably connected to each other via a bus.
  • the CPU 111 controls the video recording device 100 as a whole.
  • the CPU 111 loads a program stored in the ROM 112 into the RAM 113 and executes it.
  • the ROM 112 stores a program in which processing procedures for processing performed by the video recording apparatus 100 are described.
  • the RAM 113 serves as a work area when the CPU 111 executes programs, and temporarily stores programs and data used when executing the programs.
  • the storage unit 114 is a non-volatile storage device, such as an HDD (Hard Disk Drive) or an SSD (Solid State Drive).
  • the video recording device 100 can communicate with the contamination determination device 200 via the communication interface 115 .
  • I/O interface 116 is an interface for CPU 111 to connect with surveillance camera 400 .
  • the contamination determination device 200 has a CPU 211, a ROM 212, a RAM 213, a storage unit 214, a communication interface 215, and an I/O interface 216. These are communicably connected to each other via a bus.
  • the CPU 211 controls the contamination determination device 200 as a whole.
  • the CPU 211 loads a program stored in the ROM 212 into the RAM 213 and executes it.
  • the ROM 212 stores a program in which processing procedures for processing performed by the contamination determination device 200 are described.
  • the RAM 213 serves as a work area when the CPU 211 executes the program, and temporarily stores the program and data used when executing the program.
  • the storage unit 214 is a non-volatile storage device such as an HDD or an SSD.
  • the contamination determination device 200 can communicate with the video recording device 100 and the terminal 300 via the communication interface 215 .
  • the I/O interface 216 is an interface for the CPU 211 to connect with a display device or an input device.
  • the display device is, for example, a monitor, on which the photographed data and the state of dirt on the lens can be checked.
  • the input devices are, for example, a keyboard and a mouse; instructions can be given to the contamination determination device 200 by operating them.
  • the terminal 300 also has a CPU, a ROM, a RAM, a storage unit, a communication interface, and an I/O interface, similar to the contamination determination device 200 .
  • the terminal 300 may be a personal computer, a tablet terminal, a smartphone, or the like. On the terminal 300 as well, the photographed data recorded from the monitoring camera 400, the dirt state of the lens, and the like can be checked.
  • FIG. 2 is a diagram showing an example of a functional block diagram of the contamination determination system 1.
  • the video recording device 100 has an acquisition unit 121 .
  • Acquisition unit 121 acquires image data captured by surveillance camera 400 and stores it as image data 122 in storage unit 114 .
  • the contamination determination device 200 includes an acquisition unit 131, an extraction unit 132, a normalization unit 133, an estimation unit 141, a model generation unit 134, a determination unit 143, a prediction unit 145, and a notification unit 144.
  • the dirt determination device 200 uses the photographed data 122 captured by the monitoring camera 400 to generate the learned model 135. It also acquires the photographed data 122 captured by the monitoring camera 400 and finally notifies the terminal 300 of the dirt determination result.
  • the storage unit 214 stores the learned model 135 generated by the model generation unit 134 and the estimation result 142 generated by the estimation unit 141 .
  • the flow of processing will be described below using flowcharts and the like.
  • the contamination determination device 200 can set execution modes including a first mode and a second mode.
  • the determination unit 143 makes a determination every time T1.
  • the contamination determination device 200 activates the first process when the first mode is set, and activates the second process when the second mode is set. In this embodiment, it is assumed that both the first mode and the second mode are set.
  • FIG. 3 is a flowchart of the first process executed by the contamination determination device 200.
  • the first process is a process of judging contamination with a period of time T1.
  • hereinafter, a step is also simply referred to as "S".
  • the contamination determination device 200 determines whether or not time T1 has elapsed. When the contamination determination device 200 determines that the time T1 has elapsed (YES in S11), the process proceeds to S12. If the contamination determination device 200 does not determine that the time T1 has elapsed (NO in S11), the process returns to S11.
  • the contamination determination device 200 performs determination processing (see FIG. 4).
  • the determination process is a process of determining the dirty state of the lens of the monitoring camera 400 based on the image data 122 captured by the monitoring camera 400 .
  • the notification unit 144 notifies the terminal 300 of the determination result of the determination unit 143, and returns the process to S11. In this way, the first process performs the determination process each time the time T1 elapses.
  • FIG. 4 is a flowchart of determination processing executed by the contamination determination device 200.
  • the acquisition unit 131 acquires the photographed data 122 photographed by the monitoring camera 400, and the process proceeds to S22.
  • the photographed data 122 is data obtained by photographing the inside of the elevator car 401 .
  • the extraction unit 132 extracts the feature amount of the acquired photographing data 122, and advances the process to S23.
  • the normalization unit 133 normalizes the extracted feature amount, and the process proceeds to S24.
  • the processing of the extraction unit 132 and the normalization unit 133 is the same as the processing during learning, which will be described later with reference to FIG.
  • the estimation unit 141 inputs information based on the extracted feature amount to the learned model 135 and outputs the degree of contamination of the lens of the surveillance camera 400 .
  • the degree of dirt indicates how dirty the lens of the monitoring camera 400 is by using a numerical value from 0 to 1.
  • in S24, the estimation unit 141 inputs the normalized data, as information based on the extracted feature amount, into all of the learned models 135 stored in the storage unit 214 (learned models A to E and so on, shown in FIG. 5 and described later), outputs the degree of contamination from each model, and advances the process to S25.
  • the estimating unit 141 selects the lowest contamination degree from among all the output contamination degrees, and advances the process to S26.
  • the estimation unit 141 selects one of all the learned models 135 (learned models A to E, etc.) based on the output degrees of contamination. Specifically, it selects the model with the smallest output from among the plurality of models; the model with the smallest output (dirt degree) is considered to be the optimum model, for reasons described later with reference to FIG. 5.
  • the determination unit 143 determines whether the lens is in a dirty state based on whether the degree of contamination, after becoming equal to or greater than the threshold, falls below the threshold within a predetermined period of time (e.g., 8 minutes) (S26 to S28 below).
  • elements unrelated to dirt may be erroneously detected as "dirt". Since such elements are expected to stop being detected with the passage of time, the transition of the degree of contamination is observed until the predetermined period of time has elapsed.
  • the “threshold” is a threshold set in advance regarding the degree of contamination, and is set to "0.3" in the present embodiment.
  • the “dirt state” is a state in which the lens of the monitoring camera 400 is soiled to a certain degree, and specifically refers to a state in which the degree of dirt is equal to or greater than a threshold.
  • in S26, the estimation unit 141 determines whether the degree of contamination became equal to or greater than the threshold and did not fall below the threshold even after the predetermined period of time elapsed.
  • steps S21 to S25 are repeated every specific period of time (eg, 30 seconds) until a predetermined period of time (8 minutes) elapses. Then, among the contamination degrees obtained until the predetermined period (8 minutes) has passed, the smallest contamination degree is selected as the correct contamination degree. In S26, it is determined whether or not the value selected as the correct degree of contamination is less than the threshold value.
  • when the estimation unit 141 determines that the degree of contamination became equal to or greater than the threshold and did not fall below the threshold even after the predetermined period of time elapsed (YES in S26), the process proceeds to S27.
  • the determination unit 143 determines that the dirt exists, and terminates the determination process.
  • when the estimation unit 141 does not make that determination, that is, when the degree of contamination fell below the threshold before the predetermined time elapsed (NO in S26), the process proceeds to S28. In S28, the determination unit 143 determines that there is no contamination and terminates the determination process.
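The S21 to S28 flow can be sketched in a few lines of Python. This is a hypothetical illustration, not the patented implementation: the function name `judge_dirty` and passing the pre-sampled degrees as a list are assumptions; the 30-second sampling and 8-minute window follow the example values in the text.

```python
def judge_dirty(sampled_degrees, threshold=0.3):
    """Judge the dirty state from dirt degrees sampled (e.g. every 30 s)
    from the moment the degree first reached the threshold until the
    predetermined period (e.g. 8 minutes) elapsed.

    The smallest sampled degree is taken as the correct degree, because
    transient factors (passengers, door state) are expected to disappear
    within the window and pull the minimum back down.  The lens is
    judged dirty only if even that minimum stays at or above the
    threshold.
    """
    correct_degree = min(sampled_degrees)
    return correct_degree >= threshold
```

For the scenes of FIG. 6 described later, `judge_dirty([0.9, 0.1])` returns False (the 0.9 was transient), while `judge_dirty([0.8, 0.4])` returns True (the degree never fell below the threshold).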
  • the elevator maintenance staff or building manager cleans the lens of the monitoring camera 400 to remove the dirt.
  • FIG. 5 is a diagram for explaining various learned models.
  • the learned model 135 is a model that has been subjected to learning processing in order to output the degree of contamination when information based on the feature amount is input.
  • the trained model 135 includes trained models A to E and the like.
  • Each of the trained models A to E, etc. is a model that has undergone learning processing using a group of photographed data 122 photographed in one of a plurality of photographing environments.
  • the learned model A is a model that has undergone learning processing using a group of photographed data 122 taken in the morning as the shooting environment.
  • the learned model B is a model that has undergone learning processing using a group of photographed data 122 taken in the daytime as the shooting environment.
  • the learned model C is a model that has undergone learning processing using a group of photographed data 122 taken at night as the shooting environment.
  • the brightness in the car 401 differs in the morning, daytime, and night depending on how light enters.
  • the learned model D is a model that has undergone learning processing using a group of photographed data 122 taken with the door of the car 401 open (door-open state) as the shooting environment.
  • the trained model E is a model in which learning processing is performed using a group of photographed data 122 photographed in a state in which the door of the car 401 is closed (closed door state) as the photographing environment.
  • learning processing is performed using a set of photographed data (a group of photographed data) whose degree of contamination has been determined in advance.
  • Acquisition unit 131 acquires photographed data 122 for which the degree of contamination has been determined in advance as learning data.
  • the imaging data 122 for the learning process may be stored in the storage unit 114 as teaching data after linking the degree of dirt with the imaging environment in advance.
  • the extraction unit 132 extracts the feature amount of the acquired photographed data 122.
  • the feature amount may be extracted based on the color characteristics of the photographed data 122 .
  • since the photographed data 122 is a color image, each pixel has color characteristic data, specifically saturation, lightness, hue, and the like. The photographed data as a whole also has characteristics such as coloration, variation, and structure.
  • a feature amount that shows a difference between a low degree of contamination and a high degree of contamination is extracted.
  • when the degree of dirt is high, the image of the dirt adhering to the lens (the imaged area of the dirt portion) is presumed to be dark, that is, to have lowered brightness. The feature amount is therefore expected to differ greatly from the case where no dirt is attached. It is sufficient to extract feature amounts such that a difference appears when the degree of contamination is high.
  • the normalization unit 133 normalizes the feature quantity extracted by the extraction unit 132 .
  • the model generating unit 134 performs learning processing based on the normalized photographing data 122 and the associated degree of contamination to generate a learned model.
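As a concrete illustration of the extraction and normalization steps above, the following Python sketch computes two plausible color-characteristic features and min-max normalizes a feature value. The specific features (mean lightness, saturation variance), the function names, and the pixel format are assumptions made for illustration; the patent does not fix a particular feature set.

```python
import statistics

def extract_features(pixels):
    """Extract simple color-characteristic features from an image given
    as a list of (hue, saturation, lightness) tuples per pixel.
    Hypothetical choices: mean lightness (dirt darkens the image) and
    saturation variance (dirt flattens the coloration)."""
    lightness = [p[2] for p in pixels]
    saturation = [p[1] for p in pixels]
    return [statistics.mean(lightness), statistics.pvariance(saturation)]

def normalize(value, lo, hi):
    """Min-max normalize one feature value into the range [0, 1]."""
    return (value - lo) / (hi - lo) if hi > lo else 0.0
```

The normalized feature vector, paired with the pre-assigned degree of contamination, then serves as one training sample for the model generation unit.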
  • when the shooting environment of the photographed data to be determined is close to the shooting environment of a learned model, the determination accuracy of the determination unit 143 can be expected to be high. For example, as shown in FIG. 7, if the acquired photographed data was taken with the door open, using the learned model D, which underwent learning processing using a group of photographed data 122 taken with the door open, can be expected to improve determination accuracy.
  • the model with the lowest calculated degree of contamination is regarded as the model with the highest determination accuracy of the determination unit 143.
  • in contrast, between photographed data taken with the door open and the learned model E, which used photographed data taken with the door closed, the state of the door differs, so the possibility that the degree of contamination is erroneously detected is high.
  • when the shooting environments match, the degree of contamination in the region of the door is less likely to be erroneously detected.
  • the estimating unit 141 uses all the learned models 135 (learned models A to E, etc.) to output the respective contamination degrees. Then, the model with the smallest degree of contamination is selected as the optimum model (the smallest degree of contamination is determined to be the correct degree of contamination).
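The select-the-smallest-output rule can be sketched as follows; the function name `select_degree` and the representation of each learned model as a callable are assumptions made for illustration:

```python
def select_degree(models, features):
    """Run the features through every learned model and adopt the
    smallest output as the correct dirt degree.  The model matching the
    current shooting environment is expected to report the least
    spurious contamination, so it is treated as the optimum model.

    `models` maps a model name to a callable returning a degree in
    [0, 1]; `features` is the normalized feature information.
    """
    outputs = {name: model(features) for name, model in models.items()}
    best = min(outputs, key=outputs.get)
    return best, outputs[best]
```

For example, if models A, D, and E output 0.5, 0.1, and 0.7 for a door-open image, model D (trained on door-open data) is selected and 0.1 is adopted as the correct degree.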
  • the learned models are not limited to the above-mentioned learned models A to E; a model that has undergone learning processing while passengers are on board, or a model that has undergone learning processing in an environment combining multiple shooting environments, may also be used.
  • the contamination level may be output by the estimating unit 141 using only one trained model. In this case, the learning process becomes simple.
  • the estimation unit 141 may also output the degree of contamination for each divided region of the image.
  • in this case, the estimation unit 141 calculates "degree of contamination" × "proportion of the divided region in the entire image" for each region, and sums the results to calculate the overall degree of contamination.
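The per-region weighting works out as a simple weighted sum. The function below is an illustrative sketch; its name and the (degree, area-fraction) pair representation are assumptions:

```python
def overall_degree(regions):
    """Combine per-region dirt degrees into one image-level degree:
    the sum over regions of (region degree x region's share of the
    whole image).  `regions` is a list of (degree, area_fraction)
    pairs whose fractions sum to 1."""
    return sum(degree * fraction for degree, fraction in regions)
```

For example, a quarter of the image with degree 0.8 and the remaining three quarters clean gives 0.8 × 0.25 + 0.0 × 0.75 = 0.2.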
  • FIG. 6 is a diagram for explaining the temporal change in the degree of contamination of the lens.
  • the vertical axis indicates the degree of contamination.
  • the horizontal axis indicates time.
  • at first, the degree of contamination is 0.
  • the threshold is 0.3.
  • scene 1 is time t1, at which the contamination level is 0.
  • scene 2 is time t2 when the degree of contamination rises sharply to 0.9.
  • the degree of contamination is equal to or greater than the threshold value (0.3).
  • scene 3 is time t3 when the degree of contamination drops sharply and becomes 0.1.
  • the degree of contamination (0.1) is below the threshold (0.3). In such a case, it is determined that there is no contamination.
  • scene 4 is time t4 when the degree of contamination rises sharply to 0.8.
  • the degree of contamination (0.8) is equal to or greater than the threshold (0.3).
  • scene 5 is time t5 when the degree of contamination drops sharply to 0.4.
  • the contamination level (0.4) remains equal to or greater than the threshold value (0.3). In such a case, it is judged to be in a dirty state.
  • scene 2 and scene 3 are the same in that the lenses are dirty, but differ in that scene 2 has the door closed and there are four passengers. Therefore, the contamination degree of 0.9 in Scene 2 is considered to include erroneous detection by the open/closed state of the door of car 401 and passengers inside car 401 in addition to dirt on the lens.
  • at first, the degree of contamination is calculated as 0. After that, dirt with a dirt degree of 0.1 adheres to the lens, and when four passengers board and the door is closed, the dirt degree increases to 0.9.
  • the degree of contamination is equal to or greater than the threshold value (0.3)
  • when the estimation unit 141 determines that the degree of contamination became equal to or greater than the threshold but fell below the threshold within the predetermined period of time, it is determined that there is no contamination (S26, S28).
  • scene 4 and scene 5 are the same in that the lens is dirty, but differ in that scene 4 has the door closed and there are two passengers. Therefore, the contamination level of 0.8 in scene 4 is considered to include erroneous detection by the open/closed state of the door of car 401 and the passengers inside car 401 in addition to the dirt on the lens.
  • the determination unit 143 determines that the lens is in a dirty state when the degree of dirt is equal to or greater than the threshold and does not fall below the threshold even after the predetermined time has elapsed since the degree of dirt reached the threshold (S26, S27).
  • methods other than estimation using a learned model are also used to exclude elements other than dirt, thereby increasing the accuracy of estimating the degree of dirt.
  • this takes advantage of the fact that the degree of contamination caused by "dirt" does not decrease over time, whereas the degree of contamination caused by "factors other than dirt" decreases over time.
  • for example, the "door open/closed state", an element other than dirt, returns to the door-open state with the lapse of time, and the number of passengers returns to 0. In this way, the threshold may be exceeded temporarily due to factors other than contamination.
  • FIG. 12 is a diagram for explaining the relationship between the image of the inside of the car 401 photographed in the first process and the degree of contamination.
  • the determination process is performed every T1 (10 minutes).
  • the determination process is performed every T2 (one day) as will be described later with reference to the flowchart of FIG.
  • when the degree of contamination is determined to be equal to or greater than the threshold as in scene 4 (t4), and the degree of contamination remains equal to or greater than the threshold in scene 5 (t5) even after the predetermined time has passed, it is determined to be in a dirty state.
  • since the determination process is performed every 10 minutes, sudden stains, such as when the lens is soiled by mischief, can be detected.
  • FIG. 13 is a diagram for explaining the relationship between the image of the inside of the car 401 photographed in the second process and the degree of contamination.
  • the doorway 51 is in the open state and there is no passenger 53 in the car 401.
  • the lens of the surveillance camera 400 is not dirty, and the degree of dirt is zero.
  • image 80g is the state after 30 days have passed since the state of image 80f (determination processing has been performed 30 times).
  • the doorway 51 is in the open state and no passenger 53 is inside the car 401 .
  • the lens of the monitoring camera 400 is dirty, and the degree of dirt is 0.1.
  • if passengers are present when the determination process is executed and the resulting degree of contamination is equal to or greater than the threshold, it is judged whether the degree of contamination falls below the threshold within the predetermined period. In this example, the degree of contamination drops to 0.1 at the timing when the passengers get off.
  • An image 80h is a state in which 30 days have passed since the state of the image 80g (30 determination processes have been performed). Counting from the state of the image 80f, 60 days have passed. In the image 80h, the entrance/exit 51 is still in the open state, and there is no passenger 53 inside the car 401 . However, the lens of the surveillance camera 400 is further contaminated, and the degree of dirt is 0.2.
  • An image 80i is a state in which 30 days have passed since the state of the image 80h (30 determination processes have been performed). Counting from the state of the image 80f, 90 days have passed. In the image 80i as well, the doorway 51 is in the open state and no passenger 53 is inside the car 401. However, the lens of the surveillance camera 400 is further contaminated, and the degree of dirt is 0.3. In this case, since the degree of contamination has reached the threshold value, it is determined to be in the contamination state.
  • in this example, the degree of contamination increases by 0.1 every 30 days. From the change in the degree of contamination over days 0 to 60, it can therefore be predicted that the degree of contamination will reach the threshold on day 90, that is, 30 days later. This allows maintenance personnel to know when to clean the lens.
  • the second process will be described below using a flowchart.
  • the first process is the process of judging contamination with a cycle of time T1 (10 minutes).
  • in the second process, dirt is determined with a period of time T2 (one day).
  • the second process estimates future lens contamination.
  • FIG. 14 is a flowchart of the second process executed by the contamination determination device 200.
  • the contamination determination device 200 determines whether or not time T2 has elapsed. When the contamination determination device 200 determines that the time T2 has elapsed (YES in S31), the process proceeds to S32. If the contamination determination device 200 does not determine that the time T2 has elapsed (NO in S31), the process returns to S31.
  • the contamination determination device 200 performs determination processing, and advances the processing to S33. That is, the contamination determination device 200 performs determination processing and the like each time the time T2 elapses.
  • in S33, the prediction unit 145 of the contamination determination device 200 stores the degree of contamination in the storage unit 214, and advances the process to S34.
  • in S34, the prediction unit 145 estimates the time at which it will be determined that there is dirt, based on the time-series data of the degree of dirt stored in the storage unit 214, and advances the process to S35.
  • for example, the contamination level is 0 on the 0th day, 0.1 on the 30th day, and 0.2 on the 60th day. Assume that it is now the 60th day.
  • the storage unit 214 stores dirt degree data from the 0th day to the 60th day.
  • Since the degree of contamination increases by 0.1 every 30 days, the prediction unit 145 predicts that the degree of contamination will reach the threshold of 0.3 on day 90, that is, 30 days from now. In other words, the time at which the contamination state will be determined is predicted to be 30 days later.
  • The prediction unit 145 may instead generate a prediction model from the time-series data of the degree of contamination using the least-squares method or the like, and use that prediction model to predict the time at which the contamination state will be determined.
  • In S35, the notification unit 144 transmits the determination result of the determination unit 143 and the estimation result of the prediction unit 145 to the terminal 300, and returns the process to S31.
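The least-squares prediction performed in S34 can be sketched as follows. This is an illustrative sketch only (the patent does not give an implementation); the function name is invented, and the sample values are the day-0/30/60 figures from the example above:

```python
def predict_threshold_day(days, degrees, threshold=0.3):
    """Fit degree = a*day + b by least squares, then solve for the day
    on which the fitted degree of contamination reaches the threshold."""
    n = len(days)
    mean_x = sum(days) / n
    mean_y = sum(degrees) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(days, degrees)) \
        / sum((x - mean_x) ** 2 for x in days)
    intercept = mean_y - slope * mean_x
    if slope <= 0:
        return None  # degree is not increasing; no crossing is predicted
    return (threshold - intercept) / slope

# Degree-of-contamination samples from the example: day 0, day 30, day 60.
day = predict_threshold_day([0, 30, 60], [0.0, 0.1, 0.2])
print(round(day))  # 90: the threshold 0.3 is predicted to be reached on day 90
```

With a strictly linear series the fit reproduces the "0.1 per 30 days" arithmetic exactly; with noisy real measurements, the least-squares slope smooths out day-to-day variation before extrapolating.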
  • As described above, the degree of contamination changes according to at least one of dirt on the lens, the loading amount in the car 401, and the open/closed state of the door of the car 401.
  • The various factors that change the degree of contamination are described in more detail with reference to FIG. 15.
  • FIG. 15 is a diagram for explaining the relationship between the image of the interior of the car 401 and the degree of contamination.
  • In an image 80j, the doorway 51 is open and there are no passengers inside the car 401.
  • the "loading amount in the car 401" includes not only the number of passengers but also the amount of luggage. Note that the "loading amount in the car 401" may indicate the total area of passengers and luggage estimated from the image 80k.
  • In an image 80k, the degree of contamination has increased to 0.2 because luggage has been loaded into the car 401 together with passengers.
  • In this way, the degree of contamination varies according to the loading amount in the car 401, but it returns to 0 once the passengers and luggage leave the car 401 on arrival at the destination floor. Therefore, the contamination state is not determined based on passengers and luggage.
  • the above-mentioned "door open/closed state of car 401” includes “state of landing”.
  • In an image 80l, the door is open as in the image 80j, but the state of the landing has changed. In the image 80j the landing is bright with daytime sunlight, whereas in the image 80l the landing is dark at night, and the degree of contamination increases to 0.15.
  • As described above, the degree of contamination differs between the door-open state and the door-closed state, and in the door-open state it also differs depending on the time of day.
  • the "state of the boarding area” includes the congestion status of the boarding area.
  • The degree of contamination also differs between peak hours, such as the start and end of the working day (multiple waiting passengers appear while the door is open), and off-peak hours (no one is at the landing). In this case as well, using different learning models for the shooting environments of peak hours and other hours can further improve the accuracy of determining the degree of contamination.
  • The degree of contamination also changes depending on the "state of the monitoring camera 400 installed in the car 401". For example, in an image 80m of the inside of the car 401, the installation direction of the surveillance camera 400 has been shifted by a passenger's mischief or the like. As a result, the difference between the image 80j and the image 80m changes the degree of contamination to 0.4.
  • In this case, since the degree of contamination does not decrease even after the predetermined time has passed, the contamination state is erroneously detected.
  • In response, the maintenance staff of the elevator or the manager of the building checks the installation state of the monitoring camera 400 and restores the monitoring camera 400 to its normal installation state. As a result, the degree of contamination returns to 0, and the degree of contamination can again be detected normally.
  • The "state of the monitoring camera 400 installed in the car 401" may be the installation direction or installation position of the monitoring camera 400 as described above, or a state in which the monitoring camera 400 is out of focus.
  • As described above, the contamination determination device 200 includes an acquisition unit 131, an extraction unit 132, an estimation unit 141, and a determination unit 143.
  • The acquisition unit 131 periodically acquires the photographed data 122 captured by the surveillance camera 400.
  • The extraction unit 132 extracts the feature amount of the acquired photographed data 122.
  • The estimation unit 141 inputs information based on the extracted feature amount into the learned model 135 and outputs the degree of contamination of the lens of the surveillance camera 400.
  • The determination unit 143 determines that the lens is in a contamination state with a certain amount of dirt when the degree of contamination becomes equal to or greater than the threshold for the degree of contamination and the degree of contamination does not fall below the threshold even after a predetermined time has elapsed since it became equal to or greater than the threshold. By waiting for the predetermined time, erroneous detection due to factors other than dirt is reduced, and it is possible to accurately determine whether or not the lens of the surveillance camera is in a contamination state with a certain degree of dirt.
  • The learned model 135 includes a plurality of models (learned models A to E, etc.). Each of the plurality of models has undergone learning processing, using a group of photographed data 122 captured in one of a plurality of shooting environments, so that it outputs the degree of contamination when information based on the feature amount is input.
  • the estimating unit 141 selects one model from among a plurality of models, and outputs the degree of contamination using the selected model.
  • the estimation unit 141 selects the model with the smallest output result from among the multiple models.
  • the contamination determination device 200 can set execution modes including a first mode and a second mode.
  • When the first mode is set, the determination unit 143 performs the determination every first cycle.
  • When the second mode is set, the determination unit 143 performs the determination every second cycle, which is longer than the first cycle.
  • the photographed data 122 is data in which the inside of the elevator car 401 is photographed.
  • the degree of contamination changes according to at least one of the contamination of the lenses, the amount of load (the number of passengers and the amount of luggage) in the car 401, and the door opening/closing state of the car 401.
  • As a result, it is possible to distinguish, with high accuracy, dirt on the lens, for which the degree of contamination does not decrease even after the predetermined time has passed, from the loading amount and the door open/closed state, for which the degree of contamination may decrease before the predetermined time has passed.
  • The dirt determination method includes a step of periodically acquiring the photographed data 122 captured by the monitoring camera 400; a step of extracting the feature amount of the acquired photographed data 122; a step of inputting information based on the extracted feature amount into the learned model 135 and outputting the degree of contamination of the lens of the monitoring camera 400; and a step of determining that the lens is in a contamination state with a certain amount of dirt when the degree of contamination becomes equal to or greater than the threshold and does not fall below the threshold even after the predetermined time has elapsed. By waiting for the predetermined time, erroneous detection due to factors other than dirt is reduced, and it is possible to accurately determine whether or not the lens of the surveillance camera is in a contamination state with a certain degree of dirt.
  • The dirt determination program causes a computer to execute a step of periodically acquiring the photographed data 122 captured by the monitoring camera 400; a step of extracting the feature amount of the acquired photographed data 122; a step of inputting information based on the extracted feature amount into the learned model 135 and outputting the degree of contamination of the lens of the monitoring camera 400; and a step of determining that the lens is in a contamination state with a certain amount of dirt when the degree of contamination becomes equal to or greater than the threshold and does not fall below the threshold even after the predetermined time has elapsed.
  • 1 dirt determination system, 51, 52 doorway, 53 passenger, 80a to 80m images, 100 video recording device, 111, 211 CPU, 112, 212 ROM, 113, 213 RAM, 114, 214 storage unit, 115, 215 communication interface, 116, 216 I/O interface, 121 acquisition unit, 122 photographed data, 131 acquisition unit, 132 extraction unit, 133 normalization unit, 134 model generation unit, 135 learned model, 141 estimation unit, 142 estimation result, 143 determination unit, 144 notification unit, 145 prediction unit, 200 dirt determination device, 300 terminal, 400 surveillance camera.


Abstract

An acquisition unit (131) regularly acquires captured-image data (122) captured by a monitoring camera (400). An extraction unit (132) extracts feature amounts of the acquired captured-image data (122). An estimation unit (141) inputs information based on the extracted feature amounts to a trained model (135) and outputs the degree of stain on a lens of the monitoring camera (400). A determination unit (143) determines a stained state where the lens is stained to a given extent, in the case where the degree of stain has reached or exceeded a threshold relating to the degree of stain, and the degree of stain does not fall below the threshold after lapse of a prescribed period since the degree of stain reached or exceeded the threshold.

Description

Dirt Determination Device, Dirt Determination Method, and Dirt Determination Program
The present disclosure relates to a dirt determination device, a dirt determination method, and a dirt determination program.
Japanese Patent No. 6616906 (Patent Literature 1) discloses a detection device that uses an identification model to determine whether photographed data captured by a surveillance camera is defective photographed data containing a defect. The identification model extracts feature amounts from photographed data and detects defective photographed data based on the extracted features. Defective photographed data is photographed data that looks as if it was captured while the lens of the camera was dirty.
Japanese Patent No. 6616906
When a determination is made using the above identification model, changes in the image that are not defects may be recognized as defect data. For example, when the inside of an elevator car is photographed, although the aim is to detect dirt on the lens of the surveillance camera, elements other than dirt, such as people or objects appearing in the image, or things that change periodically, such as the opening and closing of the door, may be detected as dirt.
In this case, estimation accuracy can be raised by increasing the amount of learning data captured with elements other than dirt excluded and training on it. However, when the probability that such a state occurs is low, it is difficult to collect learning data, and it becomes difficult to raise estimation accuracy.
The present disclosure has been made to solve this problem, and its object is to provide a dirt determination device, a dirt determination method, and a dirt determination program capable of accurately determining whether or not the lens of a surveillance camera is in a dirty state with a certain amount of dirt.
A dirt determination device according to the present disclosure includes an acquisition unit, an extraction unit, an estimation unit, and a determination unit. The acquisition unit periodically acquires photographed data captured by a surveillance camera. The extraction unit extracts the feature amount of the acquired photographed data. The estimation unit inputs information based on the extracted feature amount into a learned model and outputs the degree of dirt on the lens of the surveillance camera. The determination unit determines that the lens is in a dirty state with a certain amount of dirt when the degree of dirt becomes equal to or greater than a threshold for the degree of dirt and the degree of dirt does not fall below the threshold even after a predetermined time has elapsed since it became equal to or greater than the threshold.
A dirt determination method according to the present disclosure includes a step of periodically acquiring photographed data captured by a surveillance camera; a step of extracting the feature amount of the acquired photographed data; a step of inputting information based on the extracted feature amount into a learned model and outputting the degree of dirt on the lens of the surveillance camera; and a step of determining that the lens is in a dirty state with a certain amount of dirt when the degree of dirt becomes equal to or greater than a threshold for the degree of dirt and does not fall below the threshold even after a predetermined time has elapsed since it became equal to or greater than the threshold.
A dirt determination program according to the present disclosure causes a computer to execute a step of periodically acquiring photographed data captured by a surveillance camera; a step of extracting the feature amount of the acquired photographed data; a step of inputting information based on the extracted feature amount into a learned model and outputting the degree of dirt on the lens of the surveillance camera; and a step of determining that the lens is in a dirty state with a certain amount of dirt when the degree of dirt becomes equal to or greater than a threshold for the degree of dirt and does not fall below the threshold even after a predetermined time has elapsed since it became equal to or greater than the threshold.
According to the present disclosure, it is possible to accurately determine whether or not the lens of a surveillance camera is in a dirty state with a certain amount of dirt.
FIG. 1 is a diagram showing an example of the hardware configuration of the dirt determination system.
FIG. 2 is a diagram showing an example of a functional block diagram of the dirt determination system.
FIG. 3 is a flowchart of the first process executed by the dirt determination device.
FIG. 4 is a flowchart of the determination process executed by the dirt determination device.
FIG. 5 is a diagram for explaining the various learned models.
FIG. 6 is a diagram for explaining the change over time in the degree of dirt on the lens.
FIGS. 7 to 11 are diagrams for explaining the relationship between images of the interior of the car and the degree of dirt.
FIG. 12 is a diagram for explaining the relationship between an image of the interior of the car and the degree of dirt in the first process.
FIG. 13 is a diagram for explaining the relationship between an image of the interior of the car and the degree of dirt in the second process.
FIG. 14 is a flowchart of the second process executed by the dirt determination device.
FIG. 15 is a diagram for explaining the relationship between an image of the interior of the car and the degree of dirt.
Embodiments are described below with reference to the drawings. In the following description, the same parts are given the same reference numerals. Their names and functions are also the same; therefore, detailed description of them will not be repeated.
[Dirt determination system 1]

First, the dirt determination system 1 including the dirt determination device 200 according to the present embodiment is described. FIG. 1 is a diagram showing an example of the hardware configuration of the dirt determination system 1.
The dirt determination system 1 includes a video recording device 100, an elevator car 401, a monitoring camera 400 installed in the car 401, a dirt determination device 200, and a terminal 300. The video recording device 100 can communicate with the monitoring camera 400 and the dirt determination device 200. The terminal 300 can communicate with the dirt determination device 200.

The monitoring camera 400 is a camera for monitoring the inside of the elevator car 401 installed in a building. The monitoring camera 400 generates, for example, 30 frame images per second. The continuously generated frame images are reproduced in succession as a moving image.

The photographed data in this embodiment corresponds to a frame image. The video recording device 100 records the photographed data (frame images) captured by the monitoring camera 400 and determines the state of dirt on the lens of the monitoring camera 400.

Note that a plurality of cars may be installed in the building. In this case, a monitoring camera 400 is installed in each car, and the video recording device 100 determines the lens dirt state for each car. The monitoring camera 400 is not limited to one installed in the elevator car 401; it may be installed to monitor elevator landings, escalators, or other building equipment.
The video recording device 100 includes a CPU (Central Processing Unit) 111, a ROM (Read Only Memory) 112, a RAM (Random Access Memory) 113, a storage unit 114, a communication interface 115, and an I/O interface 116. These are communicably connected to each other via a bus.

The CPU 111 comprehensively controls the video recording device 100 as a whole. The CPU 111 loads a program stored in the ROM 112 into the RAM 113 and executes it. The ROM 112 stores a program describing the processing procedures performed by the video recording device 100.

The RAM 113 serves as a work area when the CPU 111 executes programs and temporarily stores the programs and the data used when executing them. The storage unit 114 is a non-volatile storage device, for example an HDD (Hard Disk Drive) or an SSD (Solid State Drive).

The video recording device 100 can communicate with the dirt determination device 200 via the communication interface 115. The I/O interface 116 is an interface for connecting the CPU 111 to the monitoring camera 400.
The dirt determination device 200 includes a CPU 211, a ROM 212, a RAM 213, a storage unit 214, a communication interface 215, and an I/O interface 216. These are communicably connected to each other via a bus.

The CPU 211 comprehensively controls the dirt determination device 200 as a whole. The CPU 211 loads a program stored in the ROM 212 into the RAM 213 and executes it. The ROM 212 stores a program describing the processing procedures performed by the dirt determination device 200.

The RAM 213 serves as a work area when the CPU 211 executes programs and temporarily stores the programs and the data used when executing them. The storage unit 214 is a non-volatile storage device, for example an HDD or an SSD.

The dirt determination device 200 can communicate with the video recording device 100 and the terminal 300 via the communication interface 215. The I/O interface 216 is an interface for connecting the CPU 211 to a display device or an input device.

The display device is, for example, a display. On the display device, the photographed data and the state of dirt on the lens can be checked. The input device is, for example, a keyboard or a mouse; operating it gives instructions to the dirt determination device 200.

Although not shown, the terminal 300, like the dirt determination device 200, also has a CPU, a ROM, a RAM, a storage unit, a communication interface, and an I/O interface. The terminal 300 may be a personal computer, a tablet terminal, a smartphone, or the like. On the terminal 300 as well, the photographed data recorded from the monitoring camera 400 and the state of dirt on the lens can be checked.
FIG. 2 is a diagram showing an example of a functional block diagram of the dirt determination system 1. The video recording device 100 includes an acquisition unit 121. The acquisition unit 121 acquires the data captured by the monitoring camera 400 and stores it in the storage unit 114 as the photographed data 122.

The dirt determination device 200 includes an acquisition unit 131, an extraction unit 132, a normalization unit 133, an estimation unit 141, a model generation unit 134, a determination unit 143, a prediction unit 145, and a notification unit 144.

Through the series of processes performed by these units, the dirt determination device 200 generates the learned model 135 using the photographed data 122 captured by the monitoring camera 400. It also acquires newly photographed data 122 and finally notifies the terminal 300 of the dirt determination result.

On the terminal 300, the determination result notified by the dirt determination device 200 can be confirmed. The storage unit 214 stores the learned model 135 generated by the model generation unit 134 and the estimation result 142 generated by the estimation unit 141. The flow of processing is described below using flowcharts.
[Flowchart of the first process]

The dirt determination device 200 can set execution modes including a first mode and a second mode. When the first mode is set, the determination unit 143 performs the determination every time T1. When the second mode is set, the determination unit 143 performs the determination every time T2, which is longer than time T1. In this embodiment, time T1 = 10 minutes and time T2 = 1 day.

Specifically, the dirt determination device 200 starts the first process when the first mode is set, and starts the second process when the second mode is set. In this embodiment, it is assumed that both the first mode and the second mode are set.
FIG. 3 is a flowchart of the first process executed by the dirt determination device 200. The first process determines dirt with a cycle of time T1. Hereinafter, "step" is also abbreviated as "S".

As shown in FIG. 3, when the first process starts, in S11 the dirt determination device 200 determines whether time T1 has elapsed. If it determines that time T1 has elapsed (YES in S11), the process proceeds to S12; otherwise (NO in S11), the process returns to S11.

In S12, the dirt determination device 200 performs the determination process (see FIG. 4). The determination process determines the dirt state of the lens of the monitoring camera 400 based on the photographed data 122 captured by the monitoring camera 400.

In S13, the notification unit 144 notifies the terminal 300 of the determination result of the determination unit 143 and returns the process to S11. In this way, the first process performs the determination process each time the time T1 elapses.
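The S11 to S13 loop can be sketched as a periodic scheduler. This is a hypothetical illustration, not the patent's implementation; `determine`, `notify`, and the injected clock are invented stand-ins for the determination unit 143, the notification unit 144, and real elapsed time:

```python
import time

T1 = 10 * 60  # first-mode determination cycle: 10 minutes, in seconds

def first_process(determine, notify, now=time.monotonic, cycles=None):
    """Sketch of S11-S13: each time T1 elapses, run the determination
    process and notify the terminal of the result.  `cycles` bounds
    the loop so the sketch can be exercised without running forever."""
    last = now()
    done = 0
    while cycles is None or done < cycles:
        if now() - last >= T1:      # S11: has time T1 elapsed?
            result = determine()    # S12: determination process (FIG. 4)
            notify(result)          # S13: notify the terminal, back to S11
            last = now()
            done += 1

# Simulated clock advancing T1 seconds per call, so the loop fires on
# every iteration instead of waiting in real time.
t = [0.0]
def fake_clock():
    t[0] += T1
    return t[0]

results = []
first_process(lambda: 0.2, results.append, now=fake_clock, cycles=3)
print(results)  # [0.2, 0.2, 0.2]
```

Injecting the clock and the two callables keeps the timing logic testable; a real deployment would use the default monotonic clock and wire in the actual units.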
FIG. 4 is a flowchart of the determination process executed by the dirt determination device 200. As shown in FIG. 4, when the determination process starts, in S21 the acquisition unit 131 acquires the photographed data 122 captured by the monitoring camera 400 and advances the process to S22. The photographed data 122 is data in which the inside of the elevator car 401 is photographed.

In S22, the extraction unit 132 extracts the feature amount of the acquired photographed data 122 and advances the process to S23. In S23, the normalization unit 133 normalizes the extracted feature amount and advances the process to S24. The processing of the extraction unit 132 and the normalization unit 133 is the same as the processing during learning, described later with reference to FIG. 5.

The estimation unit 141 inputs information based on the extracted feature amount into the learned model 135 and outputs the degree of dirt on the lens of the monitoring camera 400. The degree of dirt indicates how dirty the lens of the monitoring camera 400 is, using a numerical value from 0 to 1.

Specifically, in S24, the estimation unit 141 inputs the normalized data, as information based on the extracted feature amount, into all the learned models 135 stored in the storage unit 214 (learned models A to E, etc., in FIG. 5, described later), has each output a degree of dirt, and advances the process to S25. In S25, the estimation unit 141 selects the smallest of all the output degrees of dirt and advances the process to S26.

In this way, the estimation unit 141 selects one of the learned models 135 (learned models A to E, etc.) based on the output results; specifically, it selects the model whose output is the smallest among the plurality of models. As described later with reference to FIG. 5, this is because the model with the smallest output (degree of dirt) is considered to be the optimal model.
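The S24 to S25 model selection can be sketched as follows. This is illustrative only; the model names and their fixed outputs are invented stand-ins for the learned models A to E, each trained on a different shooting environment:

```python
def estimate_dirt_degree(features, models):
    """Sketch of S24-S25: feed the normalized feature data to every
    learned model and keep the smallest output as the degree of dirt.
    `models` maps a model name to a callable stand-in for one of the
    learned models 135."""
    outputs = {name: model(features) for name, model in models.items()}
    best = min(outputs, key=outputs.get)  # model best matched to the scene
    return best, outputs[best]

# Hypothetical models for different shooting environments; the fixed
# return values below are illustrative only.
models = {
    "A_door_open":   lambda f: 0.45,
    "B_door_closed": lambda f: 0.05,
    "C_crowded":     lambda f: 0.30,
}
print(estimate_dirt_degree([0.1, 0.8], models))  # ('B_door_closed', 0.05)
```

The intuition matches the text: a model trained on a mismatched environment tends to read the mismatch itself as dirt and so outputs a high degree, while the model matched to the current scene attributes as little as possible to dirt; taking the minimum therefore picks the best-matched model.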
 判定部143は、汚れ度合が閾値以上になり、かつ、汚れ度合が閾値以上になってから所定時間(たとえば、8分)が経過するまでに汚れ度合が閾値未満になった場合に、汚れ有状態であると判定する(以下のS26~S28)。 Determination unit 143 determines that there is contamination when the degree of contamination becomes equal to or greater than the threshold and the degree of contamination becomes less than the threshold within a predetermined period of time (e.g., 8 minutes) after the degree of contamination becomes equal to or greater than the threshold. state (S26 to S28 below).
 Details, including specific examples, will be described later with reference to FIGS. 6 to 10. In the present embodiment, elements unrelated to contamination may be erroneously detected as "contamination". Since such elements are expected to disappear from detection with the passage of time, the transition of the degree of contamination is monitored until the predetermined time has elapsed.
 Here, the "threshold" is a value set in advance for the degree of contamination; in the present embodiment it is set to 0.3. The "contaminated state" is a state in which a certain amount of dirt is present on the lens of the surveillance camera 400; specifically, it refers to a state in which the degree of contamination is equal to or greater than the threshold.
 In S26, the estimation unit 141 determines whether the degree of contamination has become equal to or greater than the threshold and has not fallen below the threshold even after the predetermined time has elapsed since it first reached the threshold.
 Specifically, when the degree of contamination is equal to or greater than the threshold, steps S21 to S25 are repeated at a specific interval (for example, every 30 seconds) until the predetermined period (8 minutes) elapses. Then, the smallest of the contamination degrees obtained during the predetermined period is selected as the correct degree of contamination. In S26, it is determined whether this selected value remains at or above the threshold.
 If the estimation unit 141 determines that the degree of contamination became equal to or greater than the threshold and did not fall below the threshold even after the predetermined time elapsed (YES in S26), the process advances to S27. In S27, the determination unit 143 determines that the contaminated state exists, and ends the determination process.
 If the estimation unit 141 does not make that determination (that is, if it determines that the degree of contamination fell below the threshold before the predetermined time elapsed) (NO in S26), the process advances to S28. In S28, the determination unit 143 determines that the contaminated state does not exist, and ends the determination process.
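Steps S26 to S28 can be summarized by the following sketch. The names and timing constants are taken from the example above (30-second sampling over an 8-minute period); `sample_degree` is a hypothetical stand-in for one pass through S21 to S25, and real timing would of course wait between samples rather than loop immediately.

```python
def judge_contaminated(sample_degree, threshold=0.3,
                       period_s=8 * 60, interval_s=30):
    """Return True (contaminated state, S27) if the degree of contamination
    is at or above the threshold and never falls below it during the whole
    observation period; otherwise return False (S28).

    sample_degree -- callable returning the current degree of contamination
                     (the minimum over all learned models, per S21-S25)
    """
    if sample_degree() < threshold:
        return False                      # never reached the threshold
    for _ in range(period_s // interval_s):
        # In the real device, interval_s seconds elapse between samples.
        if sample_degree() < threshold:
            return False                  # dropped below in time -> S28
    return True                           # stayed at/above threshold -> S27

# Scene 2 -> scene 3: a spike caused by passengers decays to 0.1,
# so the contaminated state is not declared.
readings = iter([0.9, 0.9, 0.1] + [0.1] * 20)
print(judge_contaminated(lambda: next(readings)))  # False
```

Returning as soon as one sample drops below the threshold is equivalent, for the threshold decision, to taking the minimum over the whole period as the text describes.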
 An elevator maintenance worker or building manager who confirms the contaminated state on the terminal 300 cleans the lens of the surveillance camera 400 to remove the adhering dirt.
 [Learned models]
 FIG. 5 is a diagram for explaining the various learned models. The learned model 135 is a model on which learning processing has been performed so that it outputs a degree of contamination when information based on the feature amount is input.
 As shown in FIG. 5, the learned models 135 include learned models A to E and others. Each of learned models A to E is a model on which learning processing has been performed using a group of photographed data 122 captured in one of a plurality of photographing environments.
 Learned model A was trained using a group of photographed data 122 captured in the morning; learned model B, using a group captured in the daytime; and learned model C, using a group captured at night. The brightness inside the car 401 differs in the morning, daytime, and night depending on how light enters.
 Learned model D was trained using a group of photographed data 122 captured with the door of the car 401 open (door-open state). Learned model E was trained using a group captured with the door closed (door-closed state).
 In the present embodiment, learning processing is performed using sets of photographed data (photographed data groups) whose degree of contamination has been determined in advance. The acquisition unit 131 acquires, as learning data, photographed data 122 whose degree of contamination has been determined in advance.
 For example, when generating learned model D (door-open state), photographed data 122 whose degree of contamination has been determined in advance and which was captured in the door-open state is acquired as learning data. The photographed data 122 for the learning process may be stored in the storage unit 114 as teaching data after each item is linked in advance to its degree of contamination and photographing environment.
 Next, the extraction unit 132 extracts feature amounts from the acquired photographed data 122. For example, feature amounts may be extracted based on the color characteristics of the photographed data 122. When the photographed data 122 is a color image, each pixel carries color characteristic data, specifically saturation, lightness, hue, and so on. The photographed data as a whole also has characteristics such as color distribution, variation, and structure. By combining these color characteristics, feature amounts are extracted that show a difference between a low and a high degree of contamination. When the degree of contamination is high, the image of the dirt adhering to the lens (the photographed region of the dirty portion) is presumed to be black, that is, to have reduced brightness, so the feature amounts are expected to differ greatly from those of a clean lens. Feature amounts should therefore be extracted so that a difference appears when the degree of contamination is high.
 Next, the normalization unit 133 normalizes the feature amounts extracted by the extraction unit 132. The model generation unit 134 performs learning processing based on the normalized photographed data 122 and the linked degree of contamination, and generates a learned model.
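As an illustration of the extraction and normalization steps, brightness-based features might be computed as follows. This is a sketch under assumptions only: the patent does not fix a concrete feature formula, and the choice of mean/variance of the value (brightness) channel and min-max normalization here is hypothetical.

```python
def extract_features(pixels):
    """Extract simple color-characteristic features from one image.

    pixels -- list of (hue, saturation, value) tuples in [0, 1]; a dirty
              lens region is expected to lower the value (brightness).
    """
    values = [v for _, _, v in pixels]
    mean_v = sum(values) / len(values)
    # Spread of brightness across the image (dark dirt patches increase it).
    var_v = sum((v - mean_v) ** 2 for v in values) / len(values)
    return [mean_v, var_v]

def normalize(feature, lo, hi):
    """Min-max normalize one feature into [0, 1] given expected bounds."""
    return max(0.0, min(1.0, (feature - lo) / (hi - lo)))

clean = [(0.1, 0.2, 0.9)] * 100                           # uniformly bright
dirty = [(0.1, 0.2, 0.9)] * 70 + [(0.0, 0.0, 0.1)] * 30   # dark dirt patch
f_clean, f_dirty = extract_features(clean), extract_features(dirty)
print(f_clean[0] > f_dirty[0])  # True: dirt lowers mean brightness
```

The comparison at the end mirrors the text's reasoning: a dark adhering stain both lowers the mean brightness and increases its spread, so these features separate low- and high-contamination images.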
 When the photographing environment of the data to be judged is close to that of a learned model, the determination accuracy of the determination unit 143 can be expected to be high. For example, as in FIG. 7, if the acquired photographed data was captured in the door-open state, using learned model D, which was trained on photographed data 122 captured in the door-open state, can be expected to yield higher determination accuracy.
 The model that yields the smallest calculated degree of contamination can then be expected to be the model with the highest determination accuracy. For example, between photographed data captured in the door-open state and learned model E trained on door-closed data, the difference in door state is likely to be erroneously detected as contamination. By contrast, between door-open photographed data and learned model D, likewise trained on door-open data, the door state is the same, so erroneous detection of contamination in the door region is less likely.
 For this reason, as described above, the estimation unit 141 outputs a degree of contamination from every learned model 135 (learned models A to E, etc.) and selects the model yielding the smallest degree as the best-matched model (the smallest degree of contamination is judged to be the correct degree).
 The learned models are not limited to models A to E; they may also include a model trained with passengers on board, or a model trained in an environment combining multiple photographing environments. Alternatively, the estimation unit 141 may output the degree of contamination using only a single learned model, which simplifies the learning process.
 The image may also be divided into regions, grouping similar areas together. In this case, the estimation unit 141 outputs a degree of contamination for each divided region, calculates "degree of contamination" × "proportion of the whole image occupied by the region" for each region, and sums these to obtain the overall degree of contamination.
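The region-weighted calculation just described can be written out directly. The region degrees and area fractions below are illustrative values, not figures from the patent.

```python
def overall_degree(regions):
    """Combine per-region contamination degrees into one overall degree.

    regions -- list of (degree, area_fraction) pairs; the fractions of all
               divided regions should sum to 1.0 (they cover the image).
    """
    return sum(degree * fraction for degree, fraction in regions)

# E.g. a small dirty patch (degree 0.8 over 10% of the image) on an
# otherwise clean frame (degree 0.05 over the remaining 90%).
print(overall_degree([(0.8, 0.1), (0.05, 0.9)]))  # ~0.125
```

Because each region's degree is weighted by the area it occupies, a small but heavily soiled patch contributes proportionally rather than dominating the whole-image score.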
 [Temporal change in the degree of contamination]
 FIG. 6 is a diagram for explaining the temporal change in the degree of contamination of the lens. In FIG. 6, the vertical axis indicates the degree of contamination and the horizontal axis indicates time.
 As shown in FIG. 6, when the lens is not dirty at all (also referred to as the "normal state"), the degree of contamination is 0. The threshold is 0.3.
 At time t1 (scene 1), the degree of contamination is 0. Scene 2 is time t2, when the degree of contamination jumps to 0.9; at this point it is at or above the threshold (0.3). Scene 3 is time t3, when the degree of contamination drops sharply to 0.1; at this point the degree of contamination (0.1) is below the threshold (0.3). In such a case, it is determined that the contaminated state does not exist.
 Next, scene 4 is time t4, when the degree of contamination jumps to 0.8, at or above the threshold (0.3). Scene 5 is time t5, when the degree of contamination drops sharply to 0.4, still at or above the threshold (0.3). In such a case, it is determined that the contaminated state exists.
 The situations in scenes 1 to 5 are explained below using images captured inside the car 401. FIGS. 7 to 11 are diagrams for explaining the relationship between images captured inside the car 401 and the degree of contamination.
 In scene 1, as shown in FIG. 7, in an image 80a (photographed data) of the interior of the car 401, the doorway 51 is in the door-open state. There are no passengers inside the car 401. No dirt adheres to the lens of the surveillance camera 400, and a degree of contamination of 0 is output.
 In scene 2, as shown in FIG. 8, in an image 80b of the interior of the car 401, four passengers 53 have boarded and the doorway 51 is in the door-closed state. Dirt also adheres to the lens. At this point the degree of contamination has risen to 0.9.
 In scene 3, as shown in FIG. 9, in an image 80c of the interior of the car 401, the door of the doorway 51 is open (door-open state) and all the passengers have gotten off. The lens carries about the same amount of dirt as in scene 2, yet the degree of contamination has fallen to 0.1.
 In both scene 1 and scene 3, the door is open and there are zero passengers. Since the only difference between the two images is the dirt on the lens, the degree of contamination of 0.1 can be considered to be due purely to the dirt on the lens.
 On the other hand, scenes 2 and 3 are alike in that the lens is dirty, but differ in that in scene 2 the door is closed and there are four passengers. The degree of contamination of 0.9 in scene 2 is therefore considered to include, in addition to the dirt on the lens, erroneous detection caused by the door state of the car 401 and the passengers inside it.
 As described above, in the present embodiment, when the car 401 is in the door-open state with zero passengers and no dirt on the lens (scene 1), the degree of contamination is calculated as 0. Dirt corresponding to a degree of 0.1 then adheres to the lens, and when four passengers board and the door closes, the degree of contamination rises to 0.9.
 In this case, although the degree of contamination is at or above the threshold (0.3), it has risen not because of dirt but merely because of image changes caused by the closed door and the boarding passengers. Accordingly, in scene 3, where the door is open again and there are zero passengers, only the pure lens dirt (degree of contamination = 0.1) is detected.
 In this way, the degree of contamination changes according to at least one of the dirt on the lens, the load inside the car 401 (for example, the number of passengers), and the door state of the car 401. For this reason, in the present embodiment, when the estimation unit 141 determines that the degree of contamination became equal to or greater than the threshold but fell below the threshold before the predetermined time elapsed, the contaminated state is determined not to exist (S26, S28). The various factors that change the degree of contamination are described in more detail with reference to FIG. 15.
 Next, in scene 4, as shown in FIG. 10, in an image 80d of the interior of the car 401, two passengers 53 have boarded and the door is closed. In addition, the lower left of the lens of the surveillance camera 400 has been soiled by a passenger's mischief. At this point the degree of contamination has risen to 0.8.
 In scene 5, as shown in FIG. 11, in an image 80e of the interior of the car 401, the door is open and all the passengers have gotten off. At this point the degree of contamination has fallen to 0.4.
 In both scene 3 and scene 5, the door is open and there are zero passengers. The only difference between the two images is the dirt on the lower left of the lens left by the mischievous passenger 53. As a result, the degree of contamination has risen from 0.1 to 0.4, exceeding the threshold (0.3).
 On the other hand, scenes 4 and 5 are alike in that the lens is dirty, but differ in that in scene 4 the door is closed and there are two passengers. The degree of contamination of 0.8 in scene 4 is therefore considered to include, in addition to the dirt on the lens, erroneous detection caused by the door state of the car 401 and the passengers inside it.
 Thus, with the door open, zero passengers, and a dirty lens (scene 3), the degree of contamination is calculated as 0.1. Two passengers then board, the door closes, and the lens is further soiled by mischief, raising the degree of contamination to 0.8.
 In this case, the degree of contamination exceeds the threshold (0.3), but it also includes the increase caused by image changes due to the closed door and boarding passengers. Accordingly, in scene 5, where the door is open again and there are zero passengers, only the pure lens dirt (degree of contamination = 0.4) is detected. Even so, because the degree of contamination (0.4) does not fall below the threshold (0.3), the contaminated state is determined to exist.
 In this way, the determination unit 143 determines that the contaminated state exists when the degree of contamination becomes equal to or greater than the threshold and does not fall below the threshold even after the predetermined time has elapsed (S26, S27).
 When a learned model is used to estimate the degree of contamination, erroneous detection (inflated values) may occur because of elements other than dirt (for example, the door state or passengers). One way to make dirt distinguishable from such elements would be to raise the estimation accuracy through more accurate learning processing, which in turn would require more learning data captured in states free of non-dirt elements. However, when such states occur with low probability, collecting learning data is difficult, and raising the estimation accuracy becomes hard.
 For this reason, the present embodiment combines estimation by the learned models with another technique to exclude non-dirt elements, thereby raising the accuracy of contamination estimation. As described above, the degree of contamination caused by dirt does not decrease over time, whereas the degree caused by non-dirt elements does: for example, the door, a non-dirt element, returns to the open state over time, and the number of passengers returns to zero. Because non-dirt elements can thus push the degree above the threshold, waiting for the predetermined period reduces erroneous detection caused by them, so that as far as possible only the degree of contamination due to dirt itself is detected. In this way, whether the contaminated state exists can be determined accurately without adding learning data.
 Next, the relationship between images captured inside the car 401 and the degree of contamination in the first and second processes will be described. FIG. 12 is a diagram for explaining this relationship for the first process.
 In the first process, as described with reference to the flowcharts of FIGS. 3 and 4, the determination process is performed every T1 (10 minutes). In the second process, by contrast, the determination process is performed every T2 (one day), as described later with reference to the flowchart of FIG. 14.
 First, an example in which the determination process is performed every 10 minutes in the first process will be described. As shown in FIG. 12, in an image 80j, the doorway 51 is in the door-closed state and there are no passengers 53 inside the car 401. No dirt adheres to the lens of the surveillance camera 400, and the degree of contamination is 0.
 From this state onward, the determination process is performed every 10 minutes. When the degree of contamination is determined to be at or above the threshold, it is further judged whether it remains at or above the threshold after the predetermined time elapses. As explained with reference to FIGS. 6 to 11, even when the degree of contamination is judged to be at or above the threshold as in scene 2 (t2), if it falls below the threshold before the predetermined time elapses, as in scene 3 (t3), the contaminated state is judged not to exist.
 On the other hand, when the degree of contamination is judged to be at or above the threshold as in scene 4 (t4) and remains at or above the threshold even after the predetermined time has elapsed, as in scene 5 (t5), the contaminated state is judged to exist.
 Suppose that, in the state of image 80j, the lens of the surveillance camera 400 is soiled by a passenger's mischief. Image 80k shows the state 10 minutes after image 80j. In image 80k, the doorway 51 is in the door-closed state and there are no passengers 53 inside the car 401, but dirt adheres to the right side of the lens and the degree of contamination is 0.4. Since this is at or above the threshold (0.3), the contaminated state is judged to exist.
 In this way, because the first process performs the determination every 10 minutes, it can detect sudden contamination, such as when the lens is soiled by mischief.
 Next, an example in which the determination process is performed once a day in the second process will be described. FIG. 13 is a diagram for explaining the relationship between images captured inside the car 401 and the degree of contamination in the second process. As shown in FIG. 13, in an image 80f, the doorway 51 is in the door-open state and there are no passengers 53 inside the car 401. No dirt adheres to the lens of the surveillance camera 400, and the degree of contamination is 0.
 The determination process is executed once a day; image 80g shows the state 30 days (30 determination runs) after image 80f. In image 80g as well, the doorway 51 is open and there are no passengers 53 inside the car 401, but dirt adheres to the lens and the degree of contamination is 0.1.
 If passengers happen to be present at the timing the determination process runs, pushing the degree to or above the threshold, it is judged whether the degree falls below the threshold within the predetermined period. In this example, the degree of contamination drops to 0.1 once the passengers get off.
 Image 80h shows the state 30 more days (30 more determination runs) after image 80g, that is, 60 days after image 80f. In image 80h the doorway 51 is still open and there are no passengers 53 inside the car 401, but more dirt has accumulated on the lens and the degree of contamination is 0.2.
 Image 80i shows the state 30 more days (30 more determination runs) after image 80h, that is, 90 days after image 80f. In image 80i as well, the doorway 51 is open and there are no passengers 53 inside the car 401, but still more dirt has accumulated on the lens and the degree of contamination is 0.3. In this case, since the degree of contamination has reached the threshold, the contaminated state is judged to exist.
 According to images 80f to 80i, the degree of contamination increases by 0.1 every 30 days. From the change over days 0 to 60, for example, it can therefore be predicted that the degree of contamination will reach the threshold 30 days later, on day 90. This lets maintenance personnel know when the lens should be cleaned. The second process is described below using a flowchart.
 [Flowchart of the second process]
 As described above, the first process judges contamination at a period of T1 (10 minutes). The second process, by contrast, judges contamination at a period of T2 (one day). In addition, the second process estimates future contamination of the lens.
 The second process is described below. FIG. 14 is a flowchart of the second process executed by the contamination determination device 200.
 As shown in FIG. 14, when the second process starts, in S31 the contamination determination device 200 determines whether time T2 has elapsed. If it determines that T2 has elapsed (YES in S31), the process advances to S32; otherwise (NO in S31), the process returns to S31.
 In S32, the contamination determination device 200 performs the determination process and advances the process to S33. That is, the contamination determination device 200 performs the determination process and related steps each time the time T2 elapses.
 In S33, the prediction unit 145 of the contamination determination device 200 stores the degree of contamination in the storage unit 214 and advances the process to S34.
 In S34, the prediction unit 145 estimates, based on the time-series data of the degree of contamination stored in the storage unit 214, when the dirty state will be determined to exist, and advances the process to S35.
 In the example shown in FIG. 13, the degree of contamination is 0 on day 0, 0.1 on day 30, and 0.2 on day 60. Assume that it is now day 60. The storage unit 214 stores the contamination-degree data from day 0 through day 60.
 According to these data, the degree of contamination increases by 0.1 every 30 days, so the prediction unit 145 predicts that the degree of contamination will reach the threshold of 0.3 on day 90 (30 days later). That is, it predicts that the dirty state will be determined to exist 30 days from now.
 For example, the prediction unit 145 may generate a prediction model from the time-series data of the degree of contamination by the method of least squares or the like, and use that prediction model to predict when the dirty state will be determined to exist.
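As an illustrative sketch only (the disclosure does not specify an implementation), the least-squares prediction described above could be realized as follows; the function name and parameters are assumptions for illustration:

```python
import numpy as np

def predict_dirty_day(days, degrees, threshold=0.3):
    """Fit a least-squares line to the contamination time series and
    return the day on which the degree is predicted to reach the threshold."""
    slope, intercept = np.polyfit(days, degrees, 1)  # linear least squares
    if slope <= 0:
        return None  # contamination is not increasing; no crossing predicted
    return (threshold - intercept) / slope

# Values from the description: degree 0.0 on day 0, 0.1 on day 30, 0.2 on day 60
crossing = predict_dirty_day([0, 30, 60], [0.0, 0.1, 0.2])  # ≈ day 90
```

With the linear trend of the example, the fitted line predicts the threshold of 0.3 is reached on day 90, matching the description.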
 In S35, the notification unit 144 transmits the determination result of the determination unit 143 and the estimation result of the prediction unit 145 to the terminal 300, and returns the process to S31.
 [Various factors that change the degree of contamination]
 As described with reference to FIGS. 6 to 11 and elsewhere, the degree of contamination changes according to at least one of the dirt on the lens, the load in the car 401, and the open/closed state of the door of the car 401. Here, the various factors that change the degree of contamination are described in more detail with reference to FIG. 15.
 FIG. 15 is a diagram for explaining the relationship between images of the interior of the car 401 and the degree of contamination. As shown in FIG. 15, in image 80j of the interior of the car 401, the doorway 51 is open and there are no passengers in the car 401. In addition, no dirt adheres to the lens of the surveillance camera 400, and a degree of contamination of 0 is output.
 The examples of FIGS. 6 to 11 described passengers riding in the car 401, but as shown in image 80k of the interior of the car 401, large luggage may also be loaded in the car 401 together with passengers. The "load in the car 401" therefore includes not only the number of passengers but also the amount of luggage. Note that the "load in the car 401" may instead indicate the total area of passengers and luggage estimated from the image 80k.
 In the example of image 80k, the degree of contamination has risen to 0.2 because luggage is loaded in the car 401 together with passengers. Here the degree of contamination changes according to the load in the car 401, but when the car arrives at the destination floor the passengers and luggage leave the car 401, so the degree of contamination returns to 0. For this reason, the dirty state is not determined on the basis of passengers and luggage.
 The "open/closed state of the door of the car 401" described above also encompasses the "state of the landing". In image 80l of the interior of the car 401, the door is open as in image 80j, but the state of the landing has changed. Specifically, in image 80j the landing is bright with daytime sunlight, whereas in image 80l it is night and the landing is dark, so the degree of contamination has risen to 0.15. As described above, the degree of contamination differs between the door-open and door-closed states, and in the door-open state it also differs depending on the time of day.
 In the example of FIG. 5, different trained models were used for the morning, daytime, and night shooting environments (trained models A to C) and for the door-open and door-closed states (trained models D and E), thereby improving the accuracy of determining the degree of contamination. In the present example, the accuracy of determining the degree of contamination can be improved further by, for instance, also using different trained models for different times of day within the door-open state.
 The "state of the landing" also includes how crowded the landing is. For example, the degree of contamination differs between peak hours such as the start and end of the working day (when multiple waiting passengers appear in the image while the door is open) and quiet periods (when nobody is at the landing). In this case as well, using different trained models for peak hours and other hours as part of the shooting environment can further improve the accuracy of determining the degree of contamination.
 Beyond the above, the degree of contamination also changes with the "state of the surveillance camera 400 installed in the car 401". For example, in image 80m of the interior of the car 401, the orientation of the surveillance camera 400 installed in the car 401 has shifted due to passenger mischief or the like, so that a view somewhat to the left of the car 401 is captured compared with image 80j. As a result, the difference between image 80j and image 80m changes the degree of contamination to 0.4.
 In this case, the degree of contamination does not decrease even after the predetermined time has elapsed, so the dirty state is erroneously detected. Elevator maintenance personnel or the building manager then check the installation state of the surveillance camera 400 and restore it to its normal installation state. The degree of contamination thereby returns to 0 and is detected normally again. The "state of the surveillance camera 400 installed in the car 401" may be the installation direction or position of the surveillance camera 400 as above, or a state in which the surveillance camera 400 is out of focus.
 [Main configuration and effects]
 The main configuration and effects of the embodiment described above are summarized below.
 (1) The contamination determination device 200 includes an acquisition unit 131, an extraction unit 132, an estimation unit 141, and a determination unit 143. The acquisition unit 131 periodically acquires photographed data 122 captured by the surveillance camera 400. The extraction unit 132 extracts a feature amount from the acquired photographed data 122. The estimation unit 141 inputs information based on the extracted feature amount into the trained model 135 and outputs the degree of contamination of the lens of the surveillance camera 400. The determination unit 143 determines that the lens is in a dirty state with a certain level of dirt when the degree of contamination becomes equal to or greater than a threshold relating to the degree of contamination and does not fall below the threshold even after a predetermined time has elapsed since it reached the threshold. Waiting for the predetermined period reduces false detections caused by factors other than dirt, making it possible to accurately determine whether the lens of the surveillance camera is in a dirty state with a certain level of dirt.
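The hold-time judgment of configuration (1) can be sketched as follows; this is an illustrative simplification of the claimed logic, and the function name, sample representation, and hold count are assumptions:

```python
def is_persistently_dirty(samples, threshold=0.3, hold_count=3):
    """Return True only if the contamination degree has stayed at or above
    the threshold for the last `hold_count` periodic samples, i.e. it has
    not dropped below the threshold during the waiting period."""
    if len(samples) < hold_count:
        return False
    return all(degree >= threshold for degree in samples[-hold_count:])

# A passenger standing close to the lens raises the degree only briefly,
# whereas accumulated dirt keeps it at or above the threshold:
transient = [0.0, 0.4, 0.0, 0.0]    # spike drops back below the threshold
persistent = [0.1, 0.3, 0.35, 0.4]  # stays at or above the threshold
```

The transient series is not judged dirty, while the persistent one is, which mirrors how waiting the predetermined time filters out non-dirt factors.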
 (2) The trained model 135 includes a plurality of models (trained models A to E and so on). Each of the plurality of models has undergone learning processing so as to output the degree of contamination when information based on the feature amount is input, using a group of photographed data 122 captured in one of a plurality of shooting environments. The estimation unit 141 selects one of the plurality of models and outputs the degree of contamination using the selected model. Using a model appropriate to the shooting environment reduces false detections caused by changes in the surroundings when determining whether the dirty state exists.
 (3) The estimation unit 141 selects, from among the plurality of models, the model whose output is the smallest. Using a model appropriate to the shooting environment reduces false detections caused by changes in the surroundings when determining whether the dirty state exists.
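The selection rule of configuration (3) amounts to taking the minimum over the environment-specific models, on the reasoning that the model matching the current shooting environment reports the least spurious contamination. A minimal sketch, with stand-in models in place of the trained networks (all names here are illustrative assumptions):

```python
def estimate_degree(models, features):
    """Run every environment-specific model on the extracted features and
    keep the smallest contamination degree among the outputs."""
    return min(model(features) for model in models)

# Toy stand-ins for three environment models (e.g. morning / daytime / night);
# the actual trained models would map image feature amounts to a degree.
models = [lambda f: 0.45, lambda f: 0.15, lambda f: 0.30]
degree = estimate_degree(models, features=None)  # 0.15 from the matching model
```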
 (4) The contamination determination device 200 can be set to an execution mode that includes a first mode and a second mode. When the first mode is set, the determination unit 143 performs the determination every first period; when the second mode is set, it performs the determination every second period, which is longer than the first period. The first mode can thus detect dirt that occurs suddenly, while the second mode can detect dirt that accumulates over time. Detecting dirt that accumulates over time makes it possible to predict when the dirty state will arise and the lens will need cleaning.
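Using the example periods from the description (T1 = 10 minutes, T2 = one day), the two modes could be configured as below; the dictionary keys and helper are illustrative assumptions, not terms from the disclosure:

```python
from datetime import timedelta

# Example periods from the description: the first mode targets sudden
# contamination, the second mode targets dirt that accumulates slowly.
PERIODS = {
    "first_mode": timedelta(minutes=10),  # T1: detect sudden dirt
    "second_mode": timedelta(days=1),     # T2: track gradual build-up
}

def judgments_per_day(mode):
    """Number of judgment cycles executed per day in the given mode."""
    return timedelta(days=1) // PERIODS[mode]
```

With these values, the first mode performs 144 judgments per day and the second mode performs one, which is why only the second mode's sparse series is suited to trend prediction.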
 (5) The photographed data 122 is data in which the interior of the elevator car 401 is photographed. The degree of contamination changes according to at least one of the dirt on the lens, the load in the car 401 (the number of passengers and the amount of luggage), and the open/closed state of the door of the car 401. This makes it possible to distinguish lens dirt, whose degree of contamination does not decrease after the predetermined time, from the load and the door state, which may lower the degree of contamination before the predetermined time elapses, and thus to accurately determine whether the dirty state exists.
 (6) The contamination determination method includes: periodically acquiring photographed data 122 captured by the surveillance camera 400; extracting a feature amount from the acquired photographed data 122; inputting information based on the extracted feature amount into the trained model 135 and outputting the degree of contamination of the lens of the surveillance camera 400; and determining that the lens is in a dirty state with a certain level of dirt when the degree of contamination becomes equal to or greater than a threshold relating to the degree of contamination and does not fall below the threshold even after a predetermined time has elapsed since it reached the threshold. Waiting for the predetermined period reduces false detections caused by factors other than dirt, making it possible to accurately determine whether the lens of the surveillance camera is in a dirty state with a certain level of dirt.
 (7) The contamination determination program causes a computer to execute: periodically acquiring photographed data 122 captured by the surveillance camera 400; extracting a feature amount from the acquired photographed data 122; inputting information based on the extracted feature amount into the trained model 135 and outputting the degree of contamination of the lens of the surveillance camera 400; and determining that the lens is in a dirty state with a certain level of dirt when the degree of contamination becomes equal to or greater than a threshold relating to the degree of contamination and does not fall below the threshold even after a predetermined time has elapsed since it reached the threshold. Waiting for the predetermined period reduces false detections caused by factors other than dirt, making it possible to accurately determine whether the lens of the surveillance camera is in a dirty state with a certain level of dirt.
 The embodiments disclosed herein should be considered illustrative in all respects and not restrictive. The scope of the present disclosure is indicated by the claims rather than by the above description, and is intended to include all modifications within the meaning and scope equivalent to the claims.
 1 contamination determination system, 51, 52 doorway, 53 passenger, 80a to 80i image, 100 video recording device, 111, 211 CPU, 112, 212 ROM, 113, 213 RAM, 114, 214 storage unit, 115, 215 communication interface, 116, 216 I/O interface, 121 acquisition unit, 122 photographed data, 131 acquisition unit, 132 extraction unit, 133 normalization unit, 134 model generation unit, 135 trained model, 141 estimation unit, 142 estimation result, 143 determination unit, 144 notification unit, 200 contamination determination device, 300 terminal, 400 surveillance camera.

Claims (7)

  1.  A contamination determination device comprising:
     an acquisition unit that periodically acquires photographed data captured by a surveillance camera;
     an extraction unit that extracts a feature amount from the acquired photographed data;
     an estimation unit that inputs information based on the extracted feature amount into a trained model and outputs a degree of contamination of a lens of the surveillance camera; and
     a determination unit that determines that the lens is in a dirty state with a certain level of dirt when the degree of contamination becomes equal to or greater than a threshold relating to the degree of contamination and the degree of contamination does not fall below the threshold even after a predetermined time has elapsed since the degree of contamination became equal to or greater than the threshold.
  2.  The contamination determination device according to claim 1, wherein
     the trained model includes a plurality of models,
     each of the plurality of models is a model that has undergone learning processing so as to output the degree of contamination when information based on the feature amount is input, using a group of photographed data captured in one of a plurality of shooting environments, and
     the estimation unit selects one of the plurality of models based on the output results of the degree of contamination.
  3.  The contamination determination device according to claim 2, wherein the estimation unit selects, from among the plurality of models, the model whose output result of the degree of contamination is the smallest.
  4.  The contamination determination device according to any one of claims 1 to 3, wherein
     the contamination determination device can be set to an execution mode that includes a first mode and a second mode, and
     the determination unit
      performs the determination every first period when the first mode is set, and
      performs the determination every second period, which is longer than the first period, when the second mode is set.
  5.  The contamination determination device according to any one of claims 1 to 4, wherein
     the photographed data is data in which an interior of an elevator car is photographed, and
     the degree of contamination changes according to at least one of dirt on the lens, a load in the car, and an open/closed state of a door of the car.
  6.  A contamination determination method comprising:
     periodically acquiring photographed data captured by a surveillance camera;
     extracting a feature amount from the acquired photographed data;
     inputting information based on the extracted feature amount into a trained model and outputting a degree of contamination of a lens of the surveillance camera; and
     determining that the lens is in a dirty state with a certain level of dirt when the degree of contamination becomes equal to or greater than a threshold relating to the degree of contamination and the degree of contamination does not fall below the threshold even after a predetermined time has elapsed since the degree of contamination became equal to or greater than the threshold.
  7.  A contamination determination program causing a computer to execute:
     periodically acquiring photographed data captured by a surveillance camera;
     extracting a feature amount from the acquired photographed data;
     inputting information based on the extracted feature amount into a trained model and outputting a degree of contamination of a lens of the surveillance camera; and
     determining that the lens is in a dirty state with a certain level of dirt when the degree of contamination becomes equal to or greater than a threshold relating to the degree of contamination and the degree of contamination does not fall below the threshold even after a predetermined time has elapsed since the degree of contamination became equal to or greater than the threshold.
PCT/JP2021/040401 2021-11-02 2021-11-02 Stain determination device, stain determination method, and stain determination program WO2023079593A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2021/040401 WO2023079593A1 (en) 2021-11-02 2021-11-02 Stain determination device, stain determination method, and stain determination program
JP2023541612A JP7455284B2 (en) 2021-11-02 2021-11-02 Dirt determination device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/040401 WO2023079593A1 (en) 2021-11-02 2021-11-02 Stain determination device, stain determination method, and stain determination program

Publications (1)

Publication Number Publication Date
WO2023079593A1 true WO2023079593A1 (en) 2023-05-11

Family

ID=86240779

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/040401 WO2023079593A1 (en) 2021-11-02 2021-11-02 Stain determination device, stain determination method, and stain determination program

Country Status (2)

Country Link
JP (1) JP7455284B2 (en)
WO (1) WO2023079593A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003189294A (en) * 2001-12-13 2003-07-04 Secom Co Ltd Image monitoring device
JP2019029897A (en) * 2017-08-01 2019-02-21 パナソニックIpマネジメント株式会社 Image monitor, image monitoring method and image monitoring program
JP2019096320A (en) * 2017-11-24 2019-06-20 フィコサ アダス,ソシエダッド リミタダ ユニペルソナル Determination of clear or dirty captured image
WO2019146097A1 (en) * 2018-01-29 2019-08-01 三菱電機ビルテクノサービス株式会社 Sensing device and sensing system for defective photographic data
JP2020190181A (en) * 2019-05-17 2020-11-26 株式会社Lixil Determination device, determination method and program
WO2020250296A1 (en) * 2019-06-11 2020-12-17 三菱電機ビルテクノサービス株式会社 Image processing device, image processing method and program


Also Published As

Publication number Publication date
JPWO2023079593A1 (en) 2023-05-11
JP7455284B2 (en) 2024-03-25


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21963191

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023541612

Country of ref document: JP