WO2022195681A1 - Cleaning status detection system - Google Patents

Cleaning status detection system

Info

Publication number
WO2022195681A1
Authority
WO
WIPO (PCT)
Prior art keywords
cleaning
cleaning status
detection system
image
status detection
Prior art date
Application number
PCT/JP2021/010396
Other languages
French (fr)
Japanese (ja)
Inventor
秀昭 打越
Original Assignee
株式会社日立国際電気
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社日立国際電気
Priority to JP2023506405A (JP7448722B2)
Priority to PCT/JP2021/010396
Publication of WO2022195681A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present invention relates to a cleaning status detection system, and more particularly to a cleaning status detection system that can automatically determine the cleaning status.
  • Patent Document 1 discloses a system that monitors forest fires with multiple visible cameras.
  • Patent Document 1, however, relates to forest fire monitoring, and its technology cannot be applied as-is to the detection of cleaning status.
  • an object of the present invention is to provide a cleaning status detection system that can automatically determine the cleaning status.
  • one representative cleaning status detection system of the present invention comprises an imaging device and an image processing device, and the image processing device comprises an image determination unit that identifies a cleaner based on an image captured by the imaging device and determines the cleaning status in the image by specifying the cleaned range from the behavior of the cleaner.
  • according to the present invention, the cleaning status detection system can automatically determine the cleaning status using an image. It is therefore possible to solve problems such as overlooked visual checks of whether cleaning has been performed and the inability to properly confirm the cleaning status. Problems, configurations, and effects other than those described above will be clarified by the following embodiments.
  • FIG. 1 is a block diagram showing one embodiment of the cleaning state detection system of the present invention.
  • FIG. 2 is a block diagram showing an example of the image processing device of FIG.
  • FIG. 3 shows a first example of a display screen on the display device in the cleaning state detection system of the present invention.
  • FIG. 4 shows a second example of the display screen on the display device in the cleaning state detection system of the present invention.
  • FIG. 5 is a flow chart showing an example of overall processing in the cleaning state detection system of the present invention.
  • FIG. 6 is a flow chart showing an example of cleaning condition determination processing in the cleaning condition detection system of the present invention.
  • FIG. 7 is a conceptual diagram showing an example of detection in the cleaning state detection system of the present invention.
  • FIG. 8 is a conceptual diagram showing an example of detection in the determination process in the cleaning state detection system of the present invention.
  • FIG. 9 is a conceptual diagram showing an example of non-detection in the determination process in the cleaning condition detection system of the present invention.
  • FIG. 1 is a block diagram showing one embodiment of the cleaning status detection system of the present invention.
  • the cleaning status detection system shown in FIG. 1 includes an imaging device 1, an image processing device 10, an image recording device 21, a display device 22, and a printing device 23, each of which is connected to a network 30.
  • the imaging device 1 has a camera configuration and can be placed in various locations; for example, it may be installed at a monitoring point as a surveillance camera.
  • the imaging device 1 can employ a camera configuration in which incident light is imaged onto an imaging element via a lens and a diaphragm to obtain information.
  • examples of the imaging element include a CCD (Charge-Coupled Device) image sensor and a CMOS (Complementary Metal Oxide Semiconductor) image sensor.
  • the imaging device 1 captures video at, for example, three frames per second (3 fps) or more, and the information is sent to the image processing device 10 and the image recording device 21.
  • a plurality of imaging devices 1 can be installed depending on the situation.
  • the image processing apparatus 10 has the functions of a control unit 11, a user interface unit 12, a report creation unit 13, an image determination unit 14, and an image determination recording unit 15.
  • the control unit 11 can exchange information with the network 30 and controls the user interface unit 12, the report creation unit 13, the image determination unit 14, and the image determination recording unit 15.
  • the user interface unit 12 performs processing such as accepting information that the user inputs or operations the user makes on the display device 22 or the like.
  • the report creation unit 13 performs processing for creating a report such as a daily report.
  • the image determination unit 14 performs image determination processing, described later, using information captured by the imaging device 1.
  • the image determination recording unit 15 records the results of the determination processing of the image determination unit 14. For example, when the user interface unit 12 receives a signal regarding cleaning completion or report creation from the display device 22, the report creation unit 13 automatically creates a report based on the determination result of the image determination unit 14 and records it in the image determination recording unit 15.
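  • As an illustration only (the structure below is an assumption, not taken from the patent), this division of roles among the control unit 11, report creation unit 13, and image determination recording unit 15 could be sketched in Python along the following lines, with hypothetical class and method names:

        from dataclasses import dataclass, field
        from datetime import datetime

        @dataclass
        class ImageDeterminationRecorder:      # image determination recording unit 15
            records: list = field(default_factory=list)

            def record(self, entry: dict) -> None:
                self.records.append(entry)

        class ReportCreator:                   # report creation unit 13
            def create(self, determination: dict) -> dict:
                # Compile a daily-report style summary from the latest determination result.
                return {"created_at": datetime.now().isoformat(), **determination}

        class ControlUnit:                     # control unit 11
            def __init__(self, reporter: ReportCreator, recorder: ImageDeterminationRecorder):
                self.reporter = reporter
                self.recorder = recorder

            def on_ui_signal(self, signal: str, determination: dict) -> None:
                # On a cleaning-completion or report-creation signal from the display
                # device 22, create a report automatically and record it.
                if signal in ("cleaning_finished", "create_report"):
                    self.recorder.record(self.reporter.create(determination))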
  • the image recording device 21 is a device that records images captured by the imaging device 1. It may also record images and results after processing by the image processing apparatus 10. As the image recording device 21, a suitable format such as an HDD (Hard Disk Drive), SSD (Solid State Drive), or DDS (Digital Data Storage) can be applied as needed.
  • the display device 22 is a device that can display the content processed by the image processing device 10. For example, the information is displayed using a liquid crystal display (LCD), an organic EL (OEL) display, a touch panel, or the like.
  • as many display devices 22 as necessary can be used, and the display device 22 may be fitted with operation buttons.
  • the printing device 23 is a device capable of printing the content displayed on the display device 22 and the information in the image processing device 10, and is provided as necessary.
  • the network 30 is a line capable of data communication that connects the devices. Any type of line can be used, such as a dedicated line, an intranet, or an IP network such as the Internet.
  • FIG. 2 is a block diagram showing an example of the image processing device of FIG.
  • the computer system 300 of FIG. 2 will be described as a specific hardware configuration example of the image processing apparatus 10.
  • the major components of computer system 300 include one or more processors 302, memory 304, a terminal interface 312, a storage interface 314, an I/O (input/output) device interface 316, and a network interface 318. These components may be interconnected via a memory bus 306, an I/O bus 308, a bus interface 309, and an I/O bus interface 310.
  • Computer system 300 may include one or more processing units 302A and 302B, collectively referred to as processor 302.
  • each processor 302 executes instructions stored in memory 304 and may include an on-board cache.
  • as the processing unit, for example, a CPU (Central Processing Unit), FPGA (Field-Programmable Gate Array), GPU (Graphics Processing Unit), or DSP (Digital Signal Processor) can be applied.
  • Memory 304 may include random access semiconductor memory, storage devices, or storage media (either volatile or non-volatile) for storing data and programs. Memory 304 also represents the entire virtual memory of computer system 300 and may include the virtual memory of other computer systems connected to computer system 300 over a network. Memory 304 may conceptually be viewed as a single entity, but may be more complex arrangements, such as hierarchies of caches and other memory devices.
  • the memory 304 may store all or part of the programs, modules, and data structures that implement the functions described in this embodiment.
  • for example, memory 304 may store application 350.
  • Application 350 may include instructions or descriptions that perform the functions described below on processor 302, or may include instructions or descriptions that are interpreted by other instructions or descriptions.
  • Application 350 may be implemented in hardware via semiconductor devices, chips, logic gates, circuits, circuit cards, and/or other physical hardware devices instead of or in addition to a processor-based system.
  • Application 350 may include data other than instructions or descriptions.
  • Other data input devices, such as cameras and sensors, may also be provided in direct communication with bus interface 309, processor 302, or other hardware of computer system 300.
  • Computer system 300 may include bus interface 309, which provides communication between processor 302, memory 304, display system 324, and I/O bus interface 310.
  • I/O bus interface 310 may be coupled to I/O bus 308 for transferring data to and from the various I/O units.
  • I/O bus interface 310 may communicate via I/O bus 308 with a plurality of I/O interfaces 312, 314, 316, and 318, also known as I/O processors (IOPs) or I/O adapters (IOAs).
  • Display system 324 may include a display controller, display memory, or both. The display controller can provide video data, audio data, or both to display device 326.
  • Computer system 300 may also include one or more sensors or other devices configured to collect data and provide such data to processor 302 .
  • the display system 324 may be connected to a display device 326 such as a single display screen, television, tablet, or handheld device.
  • Display device 326 may include speakers for rendering audio.
  • alternatively, speakers for rendering audio may be connected to the I/O interface.
  • the functionality provided by display system 324 may be implemented by an integrated circuit that includes processor 302 .
  • similarly, the functionality provided by bus interface 309 may be implemented by an integrated circuit that includes processor 302.
  • the I/O interfaces have the ability to communicate with various storage or I/O devices.
  • for example, terminal interface 312 allows attachment of a user I/O device 320 such as a user output device (a video display, a speaker television, etc.) or a user input device (a keyboard, mouse, keypad, touchpad, trackball, buttons, light pen, or other pointing device).
  • a user may operate the user input device using the user interface to input data and instructions to the user I/O device 320 and computer system 300, and may receive output data from computer system 300.
  • the user interface may be, for example, displayed on a display device or played through a speaker via the user I/O device 320.
  • the storage interface 314 allows attachment of one or more disk drives or direct-access storage devices 322.
  • Storage device 322 may be implemented as any secondary storage device.
  • the contents of memory 304 may be stored in storage device 322 and read from storage device 322 as needed.
  • I/O device interface 316 may provide an interface to other I/O devices.
  • Network interface 318 may provide a communication path so that computer system 300 and other devices can communicate with each other. This communication path may be, for example, network 330.
  • although computer system 300 includes a bus structure that provides a direct communication path between processor 302, memory 304, bus interface 309, display system 324, and I/O bus interface 310, computer system 300 may instead include point-to-point links in hierarchical, star, or web configurations, multiple hierarchical buses, or parallel or redundant communication paths.
  • although I/O bus interface 310 and I/O bus 308 are shown as single units, computer system 300 may in practice include multiple I/O bus interfaces 310 or multiple I/O buses 308.
  • although multiple I/O interfaces are shown separating I/O bus 308 from the various communication paths leading to the various I/O devices, some or all of the I/O devices may instead be connected directly to a single system I/O bus.
  • Computer system 300 may be a device that receives requests from other computer systems (clients) that do not have a direct user interface, such as a multi-user mainframe computer system, a single-user system, or a server computer.
  • when the computer system 300 of FIG. 2 is applied to the image processing apparatus 10 of FIG. 1, the display system 324 and the display device 326 are optional components and may or may not be provided.
  • the display device 326 may be provided as an additional display device similar to the display device 22, or in place of the display device 22.
  • the storage device 322 can serve as the image determination recording unit 15, and may also be provided in place of the image recording device 21.
  • the network 330 can serve as the network 30.
  • FIG. 3 shows a first example of a display screen on the display device in the cleaning status detection system of the present invention.
  • FIG. 4 shows a second example of the display screen on the display device in the cleaning state detection system of the present invention.
  • the display screen 100 is the display screen on the display device 22 of FIG. 1.
  • the examples of FIGS. 3 and 4 show the display when a touch panel configuration is applied to the display screen 100.
  • a cleaning status screen 120 is displayed on the display screen 100. The display screen 100 also displays a cleaning start button 101, a cleaning end button 102, an imaging device area button 103, a cleaning status update button 104, and a report creation button 105 as button displays. When these buttons are touched, the information is sent to the user interface unit 12 of the image processing apparatus 10 via the network 30 shown in FIG. 1, and the image processing apparatus 10 performs processing according to the sent information.
  • the display screen 100 can display a cleaning start date and time 111, a cleaning end date and time 112, camera identification information 113, and a report creation date and time 114 as information displays.
  • when the cleaner starts cleaning and touches the cleaning start button 101, the time is recorded and the date and time are displayed in the cleaning start date and time 111. Likewise, when the cleaner finishes cleaning and touches the cleaning end button 102, the time is recorded and the date and time are displayed in the cleaning end date and time 112. This information is also sent to the image processing device 10.
  • the camera identification information 113 displays the identification information of the imaging device 1 capturing the image shown on the cleaning status screen 120. The identification information can be represented by a code such as a number and specifies which camera captured the image. Touching the imaging device area button 103 switches to another imaging device 1; in that case, the code shown in the camera identification information 113 changes, and the cleaning status screen 120 is displayed based on the image from the switched imaging device 1.
  • when the cleaner touches the cleaning status update button 104, the current status is reflected on the cleaning status screen 120.
  • the report creation button 105 is a button for creating a report. When the cleaner (or administrator) touches this button, the time is recorded and the date and time are displayed in the report creation date and time 114. This information is also sent to the image processing device 10, and the report is created containing the information on the display screen 100 at that time.
  • the cleaning status screen 120 displays an image captured by the imaging device 1, with the image partially processed so that the cleaned range 121 and the uncleaned range 122 can be distinguished.
  • for example, the cleaned range 121 is shown in a light color, lightly shaded, or surrounded by a frame of a specific color so that a viewer can recognize it, while the uncleaned range 122 is left unprocessed. Of course, the uncleaned range 122 may instead be given its own marking to distinguish it from the cleaned range 121. This image processing can be performed using information based on the determination results of the image processing apparatus 10, described later.
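  • As a rough sketch of this kind of overlay (assuming OpenCV/NumPy-style image arrays; the function and tint choice are illustrative, not part of the patent), the cleaned range 121 could be lightly tinted while the uncleaned range 122 is left untouched:

        import numpy as np

        def shade_cleaned(frame: np.ndarray, cleaned_mask: np.ndarray, alpha: float = 0.3) -> np.ndarray:
            # frame: H x W x 3 color image; cleaned_mask: H x W boolean, True where cleaned.
            out = frame.copy()
            tint = np.zeros_like(frame)
            tint[..., 1] = 255  # light green tint for the cleaned range 121
            out[cleaned_mask] = ((1 - alpha) * frame[cleaned_mask]
                                 + alpha * tint[cleaned_mask]).astype(frame.dtype)
            return out  # pixels of the uncleaned range 122 are returned unprocessed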
  • FIG. 3 shows the state in which the cleaning start button 101 has been touched and the cleaner is performing the cleaning work; accordingly, no information is displayed in the cleaning end date and time 112 or the report creation date and time 114.
  • FIG. 4 shows the state in which the cleaner has finished the cleaning work and touched the cleaning end button 102, after which the cleaner (or administrator) has touched the report creation button 105 to create a report; accordingly, the cleaning end date and time 112 and the report creation date and time 114 display their date and time information.
  • FIG. 5 is a flow chart showing an example of the overall processing in the cleaning status detection system of the present invention. The processing here is performed by the image processing apparatus 10 of FIG. 1.
  • when the process starts (S101), the cleaning status determination process is first performed (S102).
  • the process is started when, for example, information that the cleaning start button 101 of FIGS. 3 and 4 has been touched is sent from the display device 22 to the image processing device 10 and the user interface unit 12 recognizes it. The cleaning status determination process is described with reference to FIG. 6.
  • next, when the cleaning status determination process ends, the cleaning status update process (S103) is performed. This updates the cleaning status information according to the result of the cleaning status determination process.
  • in S104, it is determined whether or not cleaning has ended. If it is determined that cleaning has been completed, the process proceeds to S105; if not, the process returns to the cleaning status determination process of S102. This determination is made based on, for example, whether or not information that the cleaning end button 102 of FIGS. 3 and 4 has been touched has been sent from the display device 22 and recognized by the user interface unit 12. That is, if the cleaning end button 102 has been touched, it can be determined that cleaning has been completed; if it has not been touched, it can be determined that cleaning has not been completed.
  • in S105, it is determined whether or not report creation is incomplete. If report creation is incomplete, the process proceeds to S106; if a report has already been created, the process proceeds to S107. This determination can be made based on whether or not a report has been created before. For example, suppose the cleaner touches the cleaning start button 101 again after finishing cleaning and pressing the report creation button 105 of FIGS. 3 and 4, in order to resume additional cleaning work. If the cleaner touches the cleaning end button 102 after completing the additional cleaning, a report has already been created, so the process proceeds to S107. This determination can also be made based on, for example, whether the report was created within a predetermined time.
  • if report creation is incomplete, report creation processing is performed (S106), and the processing then ends (S109). This processing is performed by the report creation unit 13 of the image processing apparatus 10 and creates, for example, a report containing the information on the display screen 100.
  • in S107, it is determined whether or not to update the report. If it is determined to update the report, the process proceeds to the report update processing of S108; if not, the process proceeds to S109 and ends. For this determination, for example, if a report has already been created, it is displayed and the user is asked to confirm whether or not to update it. When the cleaner touches the corresponding button, the information is sent from the display device 22 to the image processing device 10, and when the user interface unit 12 recognizes it, it is determined that the report is to be updated.
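  • For orientation, the control flow of FIG. 5 can be summarized in Python-like pseudocode (a hedged sketch; the callables are assumptions standing in for the units of the image processing apparatus 10):

        def overall_process(ui, determine_status, update_status, create_report, update_report):
            while True:
                result = determine_status()        # S102: cleaning status determination
                update_status(result)              # S103: cleaning status update
                if ui.cleaning_end_signaled():     # S104: cleaning end button touched?
                    break
            if not ui.report_already_created():    # S105: is report creation incomplete?
                create_report(result)              # S106: report creation processing
            elif ui.confirm_report_update():       # S107: update the existing report?
                update_report(result)              # S108: report update processing
            # S109: end of processing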
  • FIG. 6 is a flowchart showing an example of the cleaning status determination process in the cleaning status detection system of the present invention. It shows the specific processing of the cleaning status determination process of S102 in FIG. 5. Unless otherwise specified, these processes are performed by the image determination unit 14 of the image processing apparatus 10 of FIG. 1.
  • first, the person detection process is performed (S202). This process identifies the cleaner based on the image captured by the imaging device 1.
  • for this, a known person detection mechanism can be used to determine whether a person is the relevant cleaner, or inference processing by AI (Artificial Intelligence) can be performed based on the person's characteristics and movements.
  • the inference processing here is, for example, processing that extracts feature amounts using a neural network, deep learning, or the like.
  • for example, the cleaner can be identified from movement in the video.
  • alternatively, a process of identifying the uniform worn by the cleaner may be performed.
  • the determination may also combine features such as movement and uniform.
  • for the identified cleaner, a skeleton image may be acquired from the image, as described later with reference to FIG. 7. This makes it easier to identify the movement of the cleaner.
  • the identification of uniforms and the like may also be performed by the AI inference processing described above.
  • next, the detection area determination process is performed (S203). Here, the area in which cleaning is to be determined is decided.
  • for example, the range may be defined in advance. If the imaging device 1 is a fixed camera such as a surveillance camera, the detection area can be determined in advance because the imaging range does not change.
  • alternatively, identification processing may be performed on the image of the imaging device 1 to determine the detection area. Handrails, desk tops, tables, and the like that require cleaning can be identified as detection areas based on their characteristics, and walls, passages, and the like can be identified similarly.
  • the cleaning range may also be determined from past cleaning records: the cleaner records the range cleaned in the past, and the detection area can be specified from that content. These identification processes may likewise use the AI inference processing described above.
  • next, the detection feature amount extraction process is performed (S204). This process extracts the feature amounts necessary for the subsequent cleaning determination process. For example, the movement of the identified person can be extracted as a feature amount and used in the cleaning determination process. Furthermore, extracting the cleaning tool as a feature amount can increase the accuracy of the cleaning determination process. Examples of cleaning tools include dust cloths, mops, brooms, brushes, sponges, vacuum cleaners, and various other tools used for cleaning.
  • next, the cleaning determination process is performed (S205).
  • here, the cleaned range is determined using the feature amounts extracted in S204 for the detection area specified in S203. Whether or not cleaning is being performed can be determined by setting a predetermined condition; for example, the AI inference processing described above can be used. In this case, based on the feature amounts extracted in S204, cleaning is determined to be in progress if the degree of certainty that the action performed by the cleaner identified in S202 corresponds to cleaning is at or above a predetermined value. The extracted feature amounts are compared with the feature amounts of a previously learned data set to determine whether or not the detection area has been cleaned.
  • in S205, it is also determined whether or not the detection area has been cleaned. If cleaning has been completed for a detection area that requires cleaning, it is determined that cleaning has been completed for this imaging range. If an uncleaned portion remains in a detection area that requires cleaning, that portion is identified and determined to be uncleaned.
  • finally, the judgment result recording process is performed (S206). Here, the result of the cleaning determination process of S205 is recorded in the image determination recording unit 15. This content is, for example, the content of the cleaning status screen 120 shown in FIGS. 3 and 4.
  • these determination results can also be reflected on map information of a specified area, such as the inside of a building, and left as a record. This makes it possible to visually ascertain which locations have been cleaned and which have not.
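  • The steps S202 to S206 above can be pictured as the following pipeline (a minimal sketch; all helper names on the hypothetical model object are assumptions, not the patent's terminology):

        def determine_cleaning_status(frame, model, recorder):
            cleaner = model.detect_cleaner(frame)              # S202: person detection
            if cleaner is None:
                return None
            area = model.determine_detection_area(frame)       # S203: detection area determination
            features = model.extract_features(frame, cleaner)  # S204: movement / tool features
            # S205: judged as cleaning only when the certainty that the action
            # corresponds to cleaning is at or above a predetermined threshold.
            cleaned = model.cleaning_confidence(features, area) >= model.threshold
            result = {"area": area, "cleaned": cleaned}
            recorder.record(result)                            # S206: judgment result recording
            return result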
  • FIG. 7 is a conceptual diagram showing an example of detection in the cleaning status detection system of the present invention.
  • first, the cleaner C001 is identified by the person detection process of S202 in FIG. 6. As shown in FIG. 7, forming a skeletal model through the person detection process facilitates the analysis for identifying a person.
  • the skeleton model is a simplified model in which the joints are treated as rotatable nodes and are connected to each other by lines.
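  • Such a skeleton model might be represented, purely as an assumed example, by joint positions and the lines connecting them:

        # Hypothetical joint coordinates (pixels) and connecting lines (bones).
        joints = {"head": (120, 40), "shoulder": (110, 80), "elbow": (130, 105), "hand": (150, 130)}
        bones = [("head", "shoulder"), ("shoulder", "elbow"), ("elbow", "hand")]
        # Tracking the "hand" joint across frames yields the movement direction F001.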
  • the movement direction F001 and the cleaning tool K001 are identified by the detection feature amount extraction processing in S204 of FIG.
  • the example of FIG. 7 shows a diagram in which the cleaning tool K001 held by the cleaner C001 is moved in the moving direction F001 in the uncleaned range 122a.
  • the image determination unit 14 of the image processing apparatus 10 detects this state and determines that the uncleaned range 122a has been cleaned. It is also possible to detect and judge the movement direction F001 of the hand of the cleaner C001 without detecting the cleaning tool K001.
  • FIG. 8 is a conceptual diagram showing an example of detection of determination processing in the cleaning status detection system of the present invention.
  • FIG. 9 is a conceptual diagram showing an example of non-detection in the determination process in the cleaning condition detection system of the present invention.
  • FIGS. 8 and 9 show an example of the cleaning determination process in S205 of FIG.
  • in FIGS. 8 and 9, the detection area 150, which is a range requiring cleaning, is subdivided by grid lines or the like, resulting in a plurality of divided small areas 151.
  • the number of small areas 151 may be, for example, 4 or more, or 10 or more, depending on the situation.
  • in the cleaning determination process, whether or not cleaning has been performed is determined based on two factors: the cleaning direction and the cleaned range. That is, a small area 151 is determined to have been cleaned when the movement direction of the cleaning tool K001 is within a predetermined range of directions set in advance and the entire small area 151 has been covered.
  • the predetermined range of directions is, for example, one in which the cleaning tool K001 is moved consistently in a fixed direction.
  • in FIG. 8, the moving direction F001 of the cleaning tool K001 follows the predetermined direction, and all small areas 151 to the left of the detection small area A001 have been covered. These areas are therefore all determined to have been cleaned, and the image processing apparatus 10 detects that cleaning has been performed.
  • in FIG. 9, the moving direction F001 of the cleaning tool K001 is not constant but meanders, so the moving direction F001 does not satisfy the predetermined condition. Therefore, even though there are small areas 151 through which the cleaning tool K001 has passed, all the small areas 151 to the left of the detection small area A001 are determined to be uncleaned, and the image processing apparatus 10 does not detect cleaning.
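  • The two-factor judgment of FIGS. 8 and 9 (direction and range) could be sketched as follows; the angle tolerance and the track representation are assumptions made for illustration only:

        import math

        def direction_ok(track, allowed_deg, tolerance_deg=15.0):
            # track: successive (x, y) positions of the cleaning tool K001.
            for (x0, y0), (x1, y1) in zip(track, track[1:]):
                angle = math.degrees(math.atan2(y1 - y0, x1 - x0))
                diff = (angle - allowed_deg + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
                if abs(diff) > tolerance_deg:
                    return False   # meandering movement, as in FIG. 9
            return True            # consistent movement, as in FIG. 8

        def small_area_cleaned(track, covered_cells, total_cells, allowed_deg):
            # Cleaned only if the direction is consistent AND the whole small area 151 is covered.
            return direction_ok(track, allowed_deg) and covered_cells == total_cells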
  • as described above, according to the present embodiment, the cleaning status can be accurately grasped.
  • more accurate determination is possible by using AI inference processing.
  • since the imaging device 1 can be shared with an existing surveillance camera, the system can be constructed at low cost.
  • since the report can be created based on the display on the display device 22, the burden on the cleaner can be reduced.
  • since the cleaning status can be shared on the display screen 100, the cleaning status can be grasped remotely. In addition, uncleaned locations can be displayed in an easy-to-understand manner on the cleaning status screen 120 of the display screen 100, making the status easy to grasp.
  • the cleaning determination process can be facilitated by using a skeleton model. Furthermore, as shown in FIGS. 8 and 9, accurate cleaning determination can be performed based on the cleaning direction and range. In this case, it is also possible, for example, to accurately determine whether cleaning follows a specified method, such as a cleaning direction intended to prevent the spread of viruses.
  • an infrared camera may be used as the imaging device 1 to measure the surface temperature of the object to be cleaned.
  • immediately after a surface is wiped, its surface temperature is likely to decrease due to the heat of vaporization.
  • therefore, the cleaning determination process can be performed with information about the surface temperature of the object to be cleaned at the time of cleaning added to the feature amounts, enabling more accurate cleaning determination.
  • similarly, a sensor that can measure the moisture content of the object to be cleaned may be used, and moisture information may be added and used for the feature amount extraction in S204 of FIG. 6 and the cleaning determination process in S205.
  • in this case as well, performing the cleaning determination process with this factor added to the feature amounts enables more accurate cleaning determination.
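  • If such sensors are available, the extra cues could be appended to the feature amounts in the manner sketched below (the sensor interfaces are assumptions; wiping with a damp cloth would be expected to lower the surface temperature slightly and raise the moisture reading):

        def augmented_features(base_features, ir_camera, moisture_sensor, region):
            # Temperature drop relative to a pre-cleaning baseline (heat of vaporization).
            temp_drop = ir_camera.baseline_temp(region) - ir_camera.current_temp(region)
            moisture = moisture_sensor.read(region)
            # Extend the S204 feature amounts so the S205 determination can use them.
            return list(base_features) + [temp_drop, moisture]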
  • the present invention is not limited to the above-described embodiments, and includes various modifications.
  • the above-described embodiments have been described in detail in order to explain the present invention in an easy-to-understand manner, and are not necessarily limited to those having all the configurations described.
  • for example, in FIG. 1, the image processing device 10, the image recording device 21, the display device 22, and the printing device 23 are connected via the network 30, but they may be directly connected. The image processing device 10 and the display device 22 may also be integrated, or the image processing device 10 may be built into the imaging device 1.
  • although FIGS. 3 and 4 show a configuration using a touch panel, the configuration is not limited to this; instead of displayed buttons, dedicated operation switches, a keyboard, or the like may be used.

Landscapes

  • Business, Economics & Management (AREA)
  • Tourism & Hospitality (AREA)
  • Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • General Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • Physics & Mathematics (AREA)
  • Economics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)

Abstract

An objective of the present invention is to provide a cleaning status detection system capable of automatically determining a cleaning status. The system comprises an image capturing device 1 and an image processing device 10. The image processing device 10 comprises an image determination unit 14 that identifies a cleaner on the basis of an image captured by the image capturing device 1 and determines the cleaning status in the image by specifying a cleaned range from the behavior of the cleaner. In addition, the cleaning status is determined to be uncleaned when a portion of a detection region set in the image has not been cleaned.

Description

Cleaning status detection system
The present invention relates to a cleaning status detection system, and more particularly to a cleaning status detection system that can automatically determine the cleaning status.
In the past, the main method of grasping the cleaning status was for the cleaner or a manager to visually check the cleaning work. Meanwhile, in recent years, behavior analysis technologies using deep learning and the like have been developed. By automatically analyzing people's behavior in surveillance camera footage, they are used to support monitoring work through abnormal behavior detection and to watch over the elderly.
In addition, Patent Document 1 discloses a system that monitors forest fires with multiple visible-light cameras.
JP 2013-196655 A
However, when visually confirming cleaning work, it is difficult to judge by appearance whether cleaning has been performed, and there are cases where checks are missed or detailed cleaning content cannot be confirmed. Meanwhile, in conventional deep learning technology, detection of human behavior has mainly targeted abnormal behavior, and there has been no example of a detection system that grasps the cleaning status. Moreover, the technology of Patent Document 1 is for forest fire monitoring and cannot be applied as-is to the detection of cleaning status.
In view of the above problems, an object of the present invention is to provide a cleaning status detection system that can automatically determine the cleaning status.
To achieve the above object, one representative cleaning status detection system of the present invention comprises an imaging device and an image processing device, and the image processing device comprises an image determination unit that identifies a cleaner based on an image captured by the imaging device and determines the cleaning status in the image by specifying the cleaned range from the behavior of the cleaner.
According to the present invention, the cleaning status detection system can automatically determine the cleaning status using an image. It is therefore possible to solve problems such as overlooked visual checks of whether cleaning has been performed and the inability to properly confirm the cleaning status.
Problems, configurations, and effects other than those described above will be clarified by the following embodiments.
FIG. 1 is a block diagram showing one embodiment of the cleaning status detection system of the present invention. FIG. 2 is a block diagram showing an example of the image processing device of FIG. 1. FIG. 3 shows a first example of a display screen on the display device in the cleaning status detection system of the present invention. FIG. 4 shows a second example of the display screen on the display device in the cleaning status detection system of the present invention. FIG. 5 is a flow chart showing an example of the overall processing in the cleaning status detection system of the present invention. FIG. 6 is a flow chart showing an example of the cleaning status determination process in the cleaning status detection system of the present invention. FIG. 7 is a conceptual diagram showing an example of detection in the cleaning status detection system of the present invention. FIG. 8 is a conceptual diagram showing an example of the detection case in the determination process in the cleaning status detection system of the present invention. FIG. 9 is a conceptual diagram showing an example of the non-detection case in the determination process in the cleaning status detection system of the present invention.
A mode for carrying out the present invention will be described below.
FIG. 1 is a block diagram showing one embodiment of the cleaning status detection system of the present invention. The cleaning status detection system shown in FIG. 1 includes an imaging device 1, an image processing device 10, an image recording device 21, a display device 22, and a printing device 23, each of which is connected to a network 30.
The imaging device 1 has a camera configuration and can be placed in various locations, for example at a monitoring point as a surveillance camera. The imaging device 1 can employ a camera configuration in which incident light is imaged onto an imaging element via a lens and a diaphragm to obtain information. Examples of the imaging element include a CCD (Charge-Coupled Device) image sensor and a CMOS (Complementary Metal Oxide Semiconductor) image sensor. The imaging device 1 captures video at, for example, three frames per second (3 fps) or more, and the information is sent to the image processing device 10 and the image recording device 21. A plurality of imaging devices 1 can be installed depending on the situation.
The image processing apparatus 10 has the functions of a control unit 11, a user interface unit 12, a report creation unit 13, an image determination unit 14, and an image determination recording unit 15.
The control unit 11 can exchange information with the network 30 and controls the user interface unit 12, the report creation unit 13, the image determination unit 14, and the image determination recording unit 15. The user interface unit 12 performs processing such as accepting information that the user inputs or operations the user makes on the display device 22 or the like. The report creation unit 13 performs processing for creating a report such as a daily report. The image determination unit 14 performs image determination processing, described later, using information captured by the imaging device 1. The image determination recording unit 15 records the results of the determination processing of the image determination unit 14. For example, when the user interface unit 12 receives a signal regarding cleaning completion or report creation from the display device 22, the report creation unit 13 automatically creates a report based on the determination result of the image determination unit 14 and records it in the image determination recording unit 15.
The image recording device 21 is a device that records images captured by the imaging device 1. It may also record images and results after processing by the image processing apparatus 10. As the image recording device 21, a suitable format such as an HDD (Hard Disk Drive), SSD (Solid State Drive), or DDS (Digital Data Storage) can be applied as needed.
The display device 22 is a device that can display the content processed by the image processing device 10. For example, the information is displayed using a liquid crystal display (LCD), an organic EL (OEL) display, a touch panel, or the like. As many display devices 22 as necessary can be used, and the display device 22 may be fitted with operation buttons.
The printing device 23 is a device capable of printing the content displayed on the display device 22 and the information in the image processing device 10, and is provided as necessary.
The network 30 is a line capable of data communication that connects the devices. Any type of line can be used, such as a dedicated line, an intranet, or an IP network such as the Internet.
FIG. 2 is a block diagram showing an example of the image processing device of FIG. 1. The computer system 300 of FIG. 2 will be described as a specific hardware configuration example of the image processing apparatus 10.
The major components of the computer system 300 include one or more processors 302, a memory 304, a terminal interface 312, a storage interface 314, an I/O (input/output) device interface 316, and a network interface 318. These components may be interconnected via a memory bus 306, an I/O bus 308, a bus interface 309, and an I/O bus interface 310.
The computer system 300 may include one or more processing units 302A and 302B, collectively referred to as the processor 302. Each processor 302 executes instructions stored in the memory 304 and may include an on-board cache. As the processing unit, for example, a CPU (Central Processing Unit), FPGA (Field-Programmable Gate Array), GPU (Graphics Processing Unit), or DSP (Digital Signal Processor) can be applied.
The memory 304 may include random-access semiconductor memory, storage devices, or storage media (either volatile or non-volatile) for storing data and programs. The memory 304 also represents the entire virtual memory of the computer system 300 and may include the virtual memory of other computer systems connected to the computer system 300 over a network. The memory 304 may conceptually be viewed as a single entity, but may have a more complex arrangement, such as a hierarchy of caches and other memory devices.
The memory 304 may store all or part of the programs, modules, and data structures that implement the functions described in this embodiment. For example, the memory 304 may store an application 350. The application 350 may include instructions or descriptions that perform the functions described below on the processor 302, or may include instructions or descriptions that are interpreted by other instructions or descriptions. The application 350 may be implemented in hardware via semiconductor devices, chips, logic gates, circuits, circuit cards, and/or other physical hardware devices instead of or in addition to a processor-based system. The application 350 may include data other than instructions or descriptions. Other data input devices, such as cameras and sensors, may also be provided in direct communication with the bus interface 309, the processor 302, or other hardware of the computer system 300.
The computer system 300 may include the bus interface 309, which provides communication between the processor 302, the memory 304, the display system 324, and the I/O bus interface 310. The I/O bus interface 310 may be coupled to the I/O bus 308 for transferring data to and from the various I/O units. The I/O bus interface 310 may communicate via the I/O bus 308 with a plurality of I/O interfaces 312, 314, 316, and 318, also known as I/O processors (IOPs) or I/O adapters (IOAs). The display system 324 may include a display controller, display memory, or both. The display controller can provide video data, audio data, or both to a display device 326. The computer system 300 may also include one or more sensors or other devices configured to collect data and provide that data to the processor 302. The display system 324 may be connected to a display device 326 such as a standalone display screen, a television, a tablet, or a handheld device. The display device 326 may include speakers for rendering audio, or speakers for rendering audio may be connected to the I/O interface. Alternatively, the functions provided by the display system 324 may be realized by an integrated circuit including the processor 302. Similarly, the functions provided by the bus interface 309 may be realized by an integrated circuit including the processor 302.
The I/O interfaces have the ability to communicate with various storage or I/O devices. For example, the terminal interface 312 allows attachment of a user I/O device 320 such as a user output device (a video display, a speaker television, etc.) or a user input device (a keyboard, mouse, keypad, touchpad, trackball, buttons, light pen, or other pointing device). A user may operate the user input device using the user interface to input data and instructions to the user I/O device 320 and the computer system 300, and may receive output data from the computer system 300. The user interface may be, for example, displayed on a display device or played through a speaker via the user I/O device 320.
The storage interface 314 allows attachment of one or more disk drives or direct-access storage devices 322. The storage device 322 may be implemented as any secondary storage device. The contents of the memory 304 may be stored in the storage device 322 and read from the storage device 322 as needed. The I/O device interface 316 may provide an interface to other I/O devices. The network interface 318 may provide a communication path so that the computer system 300 and other devices can communicate with each other; this communication path may be, for example, the network 330.
Although the computer system 300 includes a bus structure that provides a direct communication path between the processor 302, the memory 304, the bus interface 309, the display system 324, and the I/O bus interface 310, the computer system 300 may instead include point-to-point links in hierarchical, star, or web configurations, multiple hierarchical buses, or parallel or redundant communication paths. Furthermore, although the I/O bus interface 310 and the I/O bus 308 are shown as single units, the computer system 300 may in practice include multiple I/O bus interfaces 310 or multiple I/O buses 308. Also, although multiple I/O interfaces are shown separating the I/O bus 308 from the various communication paths leading to the various I/O devices, some or all of the I/O devices may instead be connected directly to a single system I/O bus.
The computer system 300 may be a device that receives requests from other computer systems (clients) without a direct user interface, such as a multi-user mainframe computer system, a single-user system, or a server computer.
When the computer system 300 of FIG. 2 is applied to the image processing apparatus 10 of FIG. 1, the display system 324 and the display device 326 are optional and may or may not be provided. The display device 326 may be provided as an additional display device similar to the display device 22, or in place of the display device 22. The storage device 322 can serve as the image determination recording unit 15 and may be provided in place of the image recording device 21. The network 330 can serve as the network 30.
FIG. 3 shows a first example of a display screen on the display device in the cleaning status detection system of the present invention. FIG. 4 shows a second example of the display screen on the display device in the cleaning status detection system of the present invention.
The display screen 100 is the display screen on the display device 22 of FIG. 1. The examples of FIGS. 3 and 4 show the display when a touch panel configuration is applied to the display screen 100.
A cleaning status screen 120 is displayed on the display screen 100. The display screen 100 also displays a cleaning start button 101, a cleaning end button 102, an imaging device area button 103, a cleaning status update button 104, and a report creation button 105 as button displays. When these buttons are touched, the information is sent to the user interface unit 12 of the image processing apparatus 10 via the network 30 shown in FIG. 1, and the image processing apparatus 10 performs processing according to the sent information. The display screen 100 can display a cleaning start date and time 111, a cleaning end date and time 112, camera identification information 113, and a report creation date and time 114 as information displays.
When the cleaner starts cleaning and touches the cleaning start button 101, the time is recorded and the date and time are displayed in the cleaning start date and time 111. Likewise, when the cleaner finishes cleaning and touches the cleaning end button 102, the time is recorded and the date and time are displayed in the cleaning end date and time 112. This information is also sent to the image processing device 10.
The camera identification information 113 displays the identification information of the imaging device 1 capturing the image shown on the cleaning status screen 120. The identification information can be represented by a code such as a number and specifies which camera captured the image. Touching the imaging device area button 103 switches to another imaging device 1; in that case, the code shown in the camera identification information 113 changes, and the cleaning status screen 120 is displayed based on the image from the switched imaging device 1.
 When the cleaner touches the cleaning status update button 104, the current status is reflected on the cleaning status screen 120. The report creation button 105 is used to create a report: when the cleaner (or an administrator) touches it, that time is recorded and displayed as the report creation date and time 114, and the information is also sent to the image processing device 10. The report is created so as to contain the information shown on the display screen 100 at that moment.
 The cleaning status screen 120 displays the image captured by the imaging device 1, partially processed so that a cleaned range 121 and an uncleaned range 122 can be distinguished. For example, the cleaned range 121 may be tinted with a light color, lightly shaded, or surrounded by a frame of a specific color so that a viewer can recognize it, while the uncleaned range 122 is left unprocessed. Of course, the uncleaned range 122 may instead be marked to distinguish it from the cleaned range 121. This image processing can be performed using information based on the determination results of the image processing device 10, described later.
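 Purely as an illustration of such overlay processing (the patent does not prescribe any implementation; the function name, tint color, and blending factor below are assumptions), a semi-transparent tint over the cleaned range could be rendered with OpenCV as follows:

    import cv2

    def draw_cleaning_status(frame, cleaned_mask, alpha=0.35):
        # frame: BGR image captured by imaging device 1
        # cleaned_mask: uint8 array, nonzero where the area is judged cleaned
        overlay = frame.copy()
        overlay[cleaned_mask > 0] = (0, 200, 0)  # tint the cleaned range green
        # blend the tint into the original image so the scene remains visible
        return cv2.addWeighted(overlay, alpha, frame, 1.0 - alpha, 0.0)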
 FIG. 3 shows the state after the cleaning start button 101 has been touched and the cleaner is performing cleaning work; accordingly, no information is yet displayed for the cleaning end date and time 112 or the report creation date and time 114. FIG. 4 shows the state after the cleaner has finished the cleaning work and touched the cleaning end button 102, and the cleaner (or administrator) has then touched the report creation button 105 to create a report; accordingly, the cleaning end date and time 112 and the report creation date and time 114 show the corresponding dates and times.
 FIG. 5 is a flowchart showing an example of the overall processing in the cleaning status detection system of the present invention. This processing is performed by the image processing device 10 of FIG. 1.
 When the processing starts (S101), cleaning status determination processing is performed first (S102). The processing is started, for example, when information that the cleaning start button 101 of FIGS. 3 and 4 has been touched is sent from the display device 22 to the image processing device 10 and recognized by the user interface unit 12. The cleaning status determination processing is described with reference to FIG. 6.
 Next, when the cleaning status determination processing ends, cleaning status update processing (S103) is performed. This updates the cleaning status information according to the result of the cleaning status determination processing.
 Next, it is determined whether cleaning has ended (S104). If cleaning is judged to have ended, the processing proceeds to S105; otherwise it returns to the cleaning status determination processing of S102. This determination is made, for example, according to whether information that the cleaning end button 102 of FIGS. 3 and 4 has been touched has been sent from the display device 22 to the image processing device 10 and recognized by the user interface unit 12: if the cleaning end button 102 has been touched, cleaning is judged to have ended; if not, cleaning is judged not to have ended.
 In S105, it is determined whether report creation is still incomplete. If report creation is incomplete, the processing proceeds to S106; if a report has already been created, it proceeds to S107. This determination can be made according to whether a report has been created earlier. For example, suppose the cleaner finishes cleaning, presses the report creation button 105 of FIGS. 3 and 4, and then touches the cleaning start button 101 again to resume additional cleaning work. When the cleaner touches the cleaning end button 102 after the additional cleaning, a report has already been created, so the processing proceeds to S107. This can be judged, for example, by whether a report was created within a certain predetermined time.
 In S106, report creation processing is performed, after which the processing ends (S109). This processing is performed by the report creation unit 13 of the image processing device 10 and can produce, for example, the report shown in FIG. 4.
 In S107, it is determined whether to update the report. If the report is to be updated, the processing proceeds to S108; otherwise it proceeds to S109 and ends. For this determination, for example, if a report has already been created, it is displayed and the user is asked to confirm whether to update it. When the cleaner touches the corresponding button, that information is sent from the display device 22 to the image processing device 10, and when the user interface unit 12 recognizes it, the report is judged to be updated.
 In S108, report creation processing is performed, after which the processing ends (S109). The processing here is the same as S106, except that an updated report is created.
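 Read as code, the control flow of FIG. 5 might be outlined as below (a minimal Python sketch only; the callback names and the UI object are placeholders, not part of the disclosure):

    def overall_processing(ui, determine_status, update_status, create_report,
                           report_created=False):
        # S101: entered when the cleaning start button 101 is recognized
        while True:
            determine_status()                    # S102: FIG. 6 processing
            update_status()                       # S103: reflect the result
            if ui.cleaning_end_button_touched():  # S104: cleaning ended?
                break
        if not report_created:                    # S105: no report yet
            create_report(update=False)           # S106
        elif ui.confirm_report_update():          # S107: update existing report?
            create_report(update=True)            # S108
        # S109: end of processing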
 FIG. 6 is a flowchart showing an example of the cleaning status determination processing in the cleaning status detection system of the present invention, namely the specific processing of S102 in FIG. 5. Unless otherwise stated, this processing is performed by the image determination unit 14 of the image processing device 10 of FIG. 1.
 When the cleaning status determination processing starts (S201), person detection processing is performed first (S202). This is processing for identifying the person who is the cleaner, based on the video captured by the imaging device 1. Whether a person is the relevant person can be determined using a known person detection mechanism, or by inference processing using AI (Artificial Intelligence) based on the person's features and movements. The inference processing here is, for example, processing that extracts feature values using a neural network, deep learning, or the like. For example, the cleaner can be identified from movement in the video. In addition, to make the cleaner easier to identify, processing that identifies the uniform worn by the cleaner may be performed; the person is identified as a cleaner from the overall design of the uniform or from distinguishing features such as armbands or insignia. Movement and features such as the uniform may also be combined in the determination. For the identified cleaner, a skeleton image within the picture may then be acquired, for example as described later with reference to FIG. 7; this makes the cleaner's movements easier to track. The identification of uniforms and the like may also be performed by the AI inference processing described above.
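 As one concrete possibility for the known person detection mechanism mentioned above (an assumption for illustration; the patent leaves the detector open), OpenCV's HOG pedestrian detector could supply candidate person regions:

    import cv2

    def detect_people(frame):
        # Known person-detection mechanism: HOG features + linear SVM
        hog = cv2.HOGDescriptor()
        hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())
        # boxes are (x, y, w, h) rectangles; weights are detection scores
        boxes, weights = hog.detectMultiScale(frame, winStride=(8, 8))
        return list(zip(boxes, weights))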
 Next, detection area determination processing is performed (S203). Here, the area used to judge cleaning is determined. This area may, for example, be defined in advance: when the imaging device 1 is a fixed camera such as a surveillance camera, the imaging range does not move, so the area can be predetermined. Alternatively, identification processing may be performed on the image from the imaging device 1 to determine the detection area. Handrails, desk tops, tables, and the like can be identified from their features as detection areas that require cleaning, and walls, passages, and the like can likewise be identified as detection areas. The cleaning range may also be determined from past cleaning conditions: by recording the ranges a cleaner has cleaned in the past, the detection area can be identified from that record. These identification processes may also be performed by the AI inference processing described above.
 Next, detection feature extraction processing is performed (S204). This extracts the feature values needed for the subsequent cleaning determination processing. For example, the movement of the identified person can be extracted as a feature value and used in the cleaning determination processing. Extracting the cleaning tool as a feature value can further raise the accuracy of the cleaning determination processing. Cleaning tools include the various tools used for cleaning, such as dust cloths, mops, brooms, brushes, sponges, and vacuum cleaners.
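 One simple movement feature, sketched here under the assumption that a hand or tool position is tracked per frame (the tracking itself is outside this sketch), is the sequence of frame-to-frame displacement vectors:

    import numpy as np

    def movement_features(points):
        # points: (x, y) position of the tracked hand or tool, one per frame
        p = np.asarray(points, dtype=float)
        d = np.diff(p, axis=0)                 # displacement vector per frame
        speed = np.linalg.norm(d, axis=1)      # movement magnitude per frame
        angle = np.arctan2(d[:, 1], d[:, 0])   # movement direction per frame
        return d, speed, angle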
 Next, cleaning determination processing is performed (S205). Here, the cleaned range within the detection area identified in S203 is determined using the feature values extracted in S204. Whether cleaning is being performed can be judged against predetermined conditions, for example using the AI inference processing described above. In that case, based on the feature values extracted in S204, cleaning is judged to be in progress when the confidence that the action being performed by the cleaner identified in S202 corresponds to cleaning is at or above a predetermined level. At this time, whether the detection area has been cleaned is determined by comparing the extracted feature values with the feature values of a previously learned data set. Whether a specific cleaning tool is being used can also feed into the determination: if the tool used at a cleaning location differs from the specified one, it can be judged that cleaning has not been performed. The movement direction of the cleaning tool may also be taken into account. Specific examples of identifying the cleaning range are described later with reference to FIGS. 8 and 9.
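 A minimal sketch of such a confidence check, assuming feature vectors and a pre-learned reference set are already available (both hypothetical here, as is the threshold value), could compare the observed features against the learned "cleaning" features with a similarity threshold:

    import numpy as np

    def is_cleaning(feature, learned_set, threshold=0.8):
        # feature: feature vector extracted in S204
        # learned_set: rows are feature vectors from the pre-learned data set
        f = np.asarray(feature, dtype=float)
        f = f / np.linalg.norm(f)
        ref = np.asarray(learned_set, dtype=float)
        ref = ref / np.linalg.norm(ref, axis=1, keepdims=True)
        confidence = float(np.max(ref @ f))    # best cosine similarity
        return confidence >= threshold, confidence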
 Furthermore, in S205 it is determined whether cleaning of the detection area is complete. If cleaning has been completed for the detection area that requires cleaning, cleaning within this imaging range is judged to be complete. If there are uncleaned spots within the detection area that requires cleaning, those spots are identified and judged to be uncleaned.
 Next, determination result recording processing is performed (S206). The result of the cleaning determination processing of S205 is recorded in the image determination recording unit 15. This content is, for example, the content of the cleaning status screen 120 shown in FIGS. 3 and 4. These determination results can also be reflected on a map: when the location of each imaging device 1 is specified in advance, the results can be matched against map information of a specific area, such as the inside of a building, and kept as a record. This makes it possible to grasp visually, across the whole area, where cleaning has been completed and where uncleaned spots remain.
 Next, it is determined whether the determination has finished (S207). If the determination processing for the image from the current imaging device 1 has finished, the processing proceeds to S208 and the cleaning status determination processing ends; otherwise it returns to S202.
 FIG. 7 is a conceptual diagram showing an example of detection in the cleaning status detection system of the present invention.
 The cleaner C001 is identified by the person detection processing of S202 in FIG. 6. As shown in FIG. 7, forming a skeleton model in the person detection processing makes the analysis for identifying the person easier. The skeleton model is a simplified model in which the joints are treated as rotatable nodes and connected to one another by lines.
 The movement direction F001 and the cleaning tool K001 are identified by the detection feature extraction processing of S204 in FIG. 6. The example of FIG. 7 shows the cleaning tool K001 held by the cleaner C001 being moved in the movement direction F001 within the uncleaned range 122a. The image determination unit 14 of the image processing device 10 detects this state and determines that the uncleaned range 122a is being cleaned. Note that the determination can also be made by detecting the movement direction F001 of the cleaner C001's hand without detecting the cleaning tool K001.
 By making these determinations, the cleaned range 121 and the uncleaned range 122 can be determined as shown in FIG. 7, and the result can be displayed on the cleaning status screen 120 of the display device 22.
 FIG. 8 is a conceptual diagram showing an example in which the determination processing of the cleaning status detection system of the present invention detects cleaning, and FIG. 9 is a conceptual diagram showing an example in which it does not.
 FIGS. 8 and 9 show an example of the cleaning determination processing of S205 in FIG. 6. The detection area 150, which is the range requiring cleaning, is subdivided by grid lines or the like, yielding a plurality of divided small areas 151. The number of small areas 151 may be chosen according to the situation, for example four or more, or even ten or more.
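 Such a grid subdivision might be represented as follows (an illustrative sketch only; the cell bookkeeping is an assumption, not taken from the disclosure):

    import numpy as np

    def make_grid(area_w, area_h, cols, rows):
        # Subdivide detection area 150 into rows x cols small areas 151;
        # each cell is returned as (x0, y0, x1, y1) in pixel coordinates.
        xs = np.linspace(0, area_w, cols + 1).astype(int)
        ys = np.linspace(0, area_h, rows + 1).astype(int)
        return [(xs[c], ys[r], xs[c + 1], ys[r + 1])
                for r in range(rows) for c in range(cols)]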
 In the examples of FIGS. 8 and 9, the cleaning determination processing judges whether cleaning has been performed from two elements: the cleaning direction and the range. That is, a small area 151 is judged to have been cleaned when the movement direction of the cleaning tool K001 falls within a predetermined range of directions set in advance and the whole of that small area 151 is judged to have been cleaned. Here, the predetermined direction is, for example, a direction in which the cleaning tool K001 is moved in one fixed direction. These conditions can be set in advance or decided by the AI inference processing described above.
 In FIG. 8, the movement direction F001 of the cleaning tool K001 follows the predetermined direction, and all the small areas 151 to the left of the detection small area A001 have been cleaned. These areas are therefore all judged to have been cleaned, and the image processing device 10 detects that cleaning has taken place.
 In FIG. 9, the movement direction F001 of the cleaning tool K001 is not constant but meanders, so it does not satisfy the predetermined condition. For this reason, even if there are small areas 151 through which the cleaning tool K001 has passed, all the small areas 151 to the left of the detection small area A001 are judged to be uncleaned, and the image processing device 10 does not detect cleaning.
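 Combining the two elements, a small area 151 might be judged cleaned only when every tool displacement observed inside it stays within a tolerance of the predetermined direction and the whole cell has been swept, as in this hedged sketch (the tolerance and coverage values are assumptions):

    import numpy as np

    def cell_cleaned(displacements, coverage, ref_angle, tol_deg=20.0):
        # displacements: (dx, dy) tool movements observed inside one small area
        # coverage: fraction of the cell's surface the tool passed over (0..1)
        # ref_angle: the predetermined cleaning direction, in radians
        d = np.asarray(displacements, dtype=float)
        angles = np.arctan2(d[:, 1], d[:, 0])
        # deviation from the reference direction, wrapped into [-pi, pi]
        dev = np.abs(np.angle(np.exp(1j * (angles - ref_angle))))
        direction_ok = bool(np.all(dev <= np.deg2rad(tol_deg)))
        return direction_ok and coverage >= 0.99   # whole cell must be swept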
(Effects)
 As described above, in the present embodiment the cleaning status can be grasped accurately by analyzing the cleaner's behavior and the like based on information from the imaging device 1, and using AI inference processing enables even more accurate determination. Since the imaging device 1 can be shared with an existing surveillance camera, the system is inexpensive and easy to build. Furthermore, because reports can be created based on the display of the display device 22, the cleaner's reporting burden can be reduced. The cleaning status can also be shared on the display screen 100, so it can be grasped remotely; at that time, uncleaned spots can be shown clearly on the cleaning status screen 120 of the display screen 100, making the situation easy to understand.
 Furthermore, as shown in FIG. 7, using a skeleton model makes the cleaning determination processing easier to perform. As shown in FIGS. 8 and 9, accurate cleaning determination processing based on the cleaning direction and range also becomes possible; in this case it can be determined accurately whether the cleaning follows the prescribed method appropriate to the task, such as a cleaning direction chosen so as not to spread viruses.
(Modification of the Embodiment)
 As a modification of the embodiment, an infrared camera may be used as the imaging device 1 to measure the surface temperature of the object being cleaned. When alcohol, water, or the like is used during cleaning, the surface temperature is likely to drop due to the heat of vaporization. Information on such surface temperature changes is factored into the feature extraction of S204 and the cleaning determination processing of S205 in FIG. 6; for example, information on the surface temperature of the cleaning target during cleaning can be added to the feature values. This enables more accurate cleaning determination.
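 For instance, a per-area temperature-drop feature might be computed as below (an illustrative sketch; the drop threshold is an assumption):

    import numpy as np

    def temperature_drop_feature(temp_before, temp_after, drop_threshold=1.5):
        # temp_before / temp_after: per-area surface temperatures (deg C)
        # read from the infrared camera before and after the wiping pass
        drop = (np.asarray(temp_before, dtype=float)
                - np.asarray(temp_after, dtype=float))
        # a drop consistent with evaporative cooling supports "cleaned"
        return drop, drop >= drop_threshold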
 In addition, a sensor that can measure the moisture of the cleaning target may be used, and moisture information factored into the feature extraction of S204 and the cleaning determination processing of S205 in FIG. 6. In this case, the amount of moisture on the surface being cleaned is likely to rise once during cleaning and then fall as it evaporates, so this factor can be added to the feature values used in the cleaning determination processing. This enables more accurate cleaning determination.
 Although embodiments of the present invention have been described above, the present invention is not limited to the embodiments described and includes various modifications. For example, the embodiments above have been described in detail to explain the present invention clearly, and the invention is not necessarily limited to configurations having all the features described. It is also possible to add, delete, or replace part of the configuration of each embodiment with another configuration.
 For example, although FIG. 1 shows the image processing device 10, the image recording device 21, the display device 22, and the printing device 23 connected via the network 30, they may each be connected directly. The image processing device 10 and the display device 22 may also be integrated, and the image processing device 10 may alternatively be built into the imaging device 1.
 Although FIGS. 3 and 4 show a touch-panel configuration, the configuration is not limited to this; dedicated operation switches, a keyboard, or the like may be used instead of the button displays.
Reference Signs List: 1... imaging device; 10... image processing device; 11... control unit; 12... user interface unit; 13... report creation unit; 14... image determination unit; 15... image determination recording unit; 21... image recording device; 22... display device; 23... printing device; 30... network; 100... display screen; 101... cleaning start button; 102... cleaning end button; 103... imaging device area button; 104... cleaning status update button; 105... report creation button; 111... cleaning start date and time; 112... cleaning end date and time; 113... camera identification information; 114... report creation date and time; 120... cleaning status screen; 121... cleaned range; 122, 122a... uncleaned range; 150... detection area; 151... small area; 300... computer system; 302... processor; 302A, 302B... processing device; 304... memory; 306... memory bus; 308... I/O bus; 309... bus interface; 310... I/O bus interface; 312... terminal interface; 314... storage interface; 316... I/O device interface; 318... network interface; 320... user I/O device; 322... storage device; 324... display system; 326... display device; 330... network; 350... application; A001... detection small area; C001... cleaner; F001... movement direction; K001... cleaning tool

Claims (7)

  1.  A cleaning status detection system comprising an imaging device and an image processing device,
      wherein the image processing device comprises an image determination unit that determines a cleaning status within an image captured by the imaging device by identifying a cleaner and identifying a cleaned range from the behavior of the cleaner.
  2.  The cleaning status detection system according to claim 1,
      wherein the determination of the cleaning status determines that a detection area set within the image is uncleaned when there is a spot within it that has not been cleaned.
  3.  The cleaning status detection system according to claim 1,
      wherein the determination of the cleaning status identifies a cleaning tool and determines that a range is uncleaned when the cleaning tool has not moved in a predetermined direction within that range.
  4.  The cleaning status detection system according to claim 1, further comprising a display device,
      wherein the display device displays the determination result of the cleaning status by the image processing device using the captured image.
  5.  The cleaning status detection system according to claim 4,
      wherein the image processing device comprises a report creation unit that automatically creates a report based on the determination result of the cleaning status by the image processing device when a signal concerning the end of cleaning or report creation is acquired from the display device.
  6.  The cleaning status detection system according to claim 1,
      wherein the determination of the cleaning status is performed by inference processing by AI (Artificial Intelligence) using a neural network and deep learning, and at least the behavior of the cleaner is treated as a feature value.
  7.  The cleaning status detection system according to claim 1,
      wherein the imaging device has an infrared camera function, and the image processing device uses temperature information acquired from the imaging device in the determination of the cleaning status.
PCT/JP2021/010396 2021-03-15 2021-03-15 Cleaning status detection system WO2022195681A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2023506405A JP7448722B2 (en) 2021-03-15 2021-03-15 Cleaning status detection system
PCT/JP2021/010396 WO2022195681A1 (en) 2021-03-15 2021-03-15 Cleaning status detection system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/010396 WO2022195681A1 (en) 2021-03-15 2021-03-15 Cleaning status detection system

Publications (1)

Publication Number Publication Date
WO2022195681A1 2022-09-22

Family

ID=83320038

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/010396 WO2022195681A1 (en) 2021-03-15 2021-03-15 Cleaning status detection system

Country Status (2)

Country Link
JP (1) JP7448722B2 (en)
WO (1) WO2022195681A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012519343A (en) * 2009-03-02 2012-08-23 ディバーシー・インコーポレーテッド Hygiene condition monitoring management system and method
JP2016149024A (en) * 2015-02-12 2016-08-18 富士通株式会社 Evaluation method of cleaning state, evaluation program of cleaning state and evaluation apparatus of cleaning state


Also Published As

Publication number Publication date
JPWO2022195681A1 (en) 2022-09-22
JP7448722B2 (en) 2024-03-12

Similar Documents

Publication Publication Date Title
CN103576976B (en) Information processing apparatus and control method thereof
GB2498299B (en) Evaluating an input relative to a display
US6765555B2 (en) Passive optical mouse using image sensor with optional dual mode capability
JP2011522332A (en) Multiple pointer ambiguity and occlusion resolution
WO2021117616A1 (en) Cleaning area estimation apparatus and cleaning area estimation method
TW201214243A (en) Optical touch system and object detection method therefor
TW201124878A (en) Device for operation and control of motion modes of electrical equipment
CN112083801A (en) Gesture recognition system and method based on VR virtual office
WO2022195681A1 (en) Cleaning status detection system
TWI547158B (en) Integrate multiple images in a single summary window
CN107168637A (en) A kind of intelligent terminal for by scaling gesture show scaling
JP6977573B2 (en) Information terminal equipment, information processing system and display control program
CN116086462B (en) Track data processing method, device, medium and computing equipment
WO2012157611A1 (en) Similar image search system
CN106021922A (en) Three-dimensional medical image control equipment, method and system
JP6112436B1 (en) MONITORING SYSTEM, VIDEO DISPLAY METHOD, AND COMPUTER PROGRAM
JP6099025B1 (en) MONITORING SYSTEM, MASK PROCESSING SETTING METHOD, AND COMPUTER PROGRAM
JP2008199145A (en) Electronic equipment
TWM352850U (en) Digital monitoring and recording device with man-machine interface of touch panel
WO2022045049A1 (en) Information processing device, information processing method, and program
CN113711164A (en) Method and apparatus for user control of applications and corresponding devices
CN113990517A (en) Crowd detection method and device, electronic equipment and storage medium
TW200834417A (en) A screen sharing system
TWI455007B (en) Non-contact instruction-inputting method for electronic apparatus with camera
JP6099027B1 (en) MONITORING SYSTEM, VIDEO DISPLAY METHOD, AND COMPUTER PROGRAM

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21931436

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2023506405

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21931436

Country of ref document: EP

Kind code of ref document: A1