WO2024070335A1 - Monitoring system and monitoring method - Google Patents

Monitoring system and monitoring method

Info

Publication number
WO2024070335A1
Authority
WO
WIPO (PCT)
Prior art keywords
image, unit, display, information, acquired
Application number
PCT/JP2023/030078
Other languages
English (en), Japanese (ja)
Inventor
和氣勝, 古賀太一郎, 陰山杏奈, 長澤正佳, 神澤博之, 市川純一, 柳内久和
Original Assignee
グローリー株式会社
Application filed by グローリー株式会社
Publication of WO2024070335A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • This disclosure relates to a monitoring system and a monitoring method.
  • Patent Document 1 describes a monitoring system for monitoring a person being monitored.
  • In this monitoring system (referred to as a "monitoring system for monitored persons" in Patent Document 1), an image is acquired by an imaging unit, and the image showing the person being monitored is displayed on a mobile communication terminal.
  • In addition, the area of the displayed image in which the person being monitored is located can be pixelated before display.
  • However, Patent Document 1 does not mention setting, for each monitored person, whether or not to pixelate the image when multiple monitored persons are monitored.
  • The objective of the present disclosure is therefore to provide a monitoring system and a monitoring method capable of processing images acquired based on imaging by an imaging unit in a manner that respects the privacy of the monitored person, with the processing differing from one monitored person to another.
  • The feature of the monitoring system according to the present disclosure is that it comprises: a plurality of imaging units corresponding to a plurality of monitored persons, each of which is configured to capture an image of a monitored area in which the corresponding monitored person is present; a display unit capable of displaying an acquired image, which is an image acquired based on imaging by the imaging units; a privacy information acquisition unit that acquires privacy information related to the acquired image stored for each monitored person; and a processing unit that performs processing related to at least one of displaying and storing the acquired image, wherein the processing unit processes the acquired image corresponding to a monitored person based on the privacy information corresponding to that monitored person.
  • The feature of the monitoring method according to the present disclosure is that it is a monitoring method in a monitoring system and includes: a plurality of imaging steps corresponding to a plurality of monitored persons, in each of which an image of a monitored area in which the corresponding monitored person is present is captured; a display step of displaying an acquired image, which is an image acquired based on the imaging in the imaging steps; a privacy information acquisition step of acquiring privacy information related to the acquired image stored for each monitored person; and a processing step of performing processing related to at least one of displaying and storing the acquired image, wherein, in the processing step, the acquired image corresponding to a monitored person is processed based on the privacy information corresponding to that monitored person.
  • With these features, privacy information regarding the acquired image is stored for each monitored person, and processing (e.g., mosaic processing) is performed on the acquired image corresponding to each monitored person based on the privacy information corresponding to that person.
  • As a result, images acquired based on imaging by the imaging unit can be processed in a manner that takes into consideration the privacy of the person being monitored, and different processing can be performed for each person being monitored.
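As an illustration of this per-person behavior, the following is a minimal Python sketch; the store layout, the person IDs, and the function names are assumptions made for illustration only and are not part of the disclosure.

```python
from typing import Callable, Dict

# Hypothetical per-person store: monitored person ID -> stored privacy information.
PRIVACY_STORE: Dict[str, str] = {
    "monitored_person_1": "photography OK",  # the unprocessed image may be shown
    "monitored_person_2": "privacy",         # only an abstracted image may be shown
}

def process_acquired_image(person_id: str, image, abstract: Callable):
    """Process the acquired image according to that person's privacy information."""
    if PRIVACY_STORE.get(person_id) == "privacy":
        return abstract(image)   # e.g., mosaic processing
    return image                 # displayed/stored as captured
```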
  • In the present disclosure, it is preferable that the processing unit be capable of generating a second image by performing an abstraction process on a first image, which is an image captured by the imaging unit (the first image and the second image both being acquired images), and that the processing unit switch between displaying the first image or the second image on the display unit based on the privacy information.
  • When the privacy information reflects the wishes of the monitored person, the display can thus be switched between the first image and the second image in accordance with those wishes.
  • In the present disclosure, it is preferable to include a storage unit capable of storing the acquired image, the storage unit storing the first image, and, when the second image is displayed, the processing unit generating the second image by performing the abstraction process on the first image acquired from the storage unit and displaying the second image on the display unit.
  • Alternatively, it is preferable that a storage unit capable of storing the acquired image be provided and that, when the second image is displayed, the processing unit store the second image in the storage unit and display the second image on the display unit.
  • With the latter configuration, when the second image is displayed on the display unit, the second image is stored in the storage unit and the first image is not stored. This makes it possible to realize a configuration in which the first image is never viewed by the user.
  • In the present disclosure, it is preferable to include a storage unit capable of storing the acquired image and an authority acquisition unit that acquires authority information, which is information indicating the user's authority regarding viewing of the acquired image, and that, when the privacy information indicates the display of the first image but the authority information indicates that the user is not permitted to view the first image and is permitted to view the second image, the storage unit store the first image and the processing unit generate the second image by performing the abstraction process on the first image acquired from the storage unit and cause the second image to be displayed on the display unit.
  • This makes it possible to switch whether or not the first image is displayed on the display unit depending on the user's authority regarding viewing of the acquired image. For example, if the monitored area is a room in a welfare facility for the elderly and the user is a staff member of the facility, a system can be realized in which only a limited number of staff members (e.g., the manager) can view the first image.
  • In the present disclosure, it is preferable to include an authority acquisition unit that acquires authority information, which is information indicating the user's authority regarding viewing of the acquired image, and that the processing unit perform processing regarding at least one of displaying and storing the acquired image based on the authority information.
  • With this configuration, the acquired image is processed (e.g., mosaic processing) based on the user's authority. This makes it possible to realize a system in which, for example, if the monitored area is a room in a welfare facility for the elderly and the user of the display unit is a staff member of the facility, the acquired image is processed depending on whether the staff member belongs to a limited group (e.g., managers).
  • In the present disclosure, it is preferable to include a status detection unit that detects the status of the monitored person based on the acquired image, and that the display unit be capable of displaying a detection image, which is the acquired image showing the status of the monitored area when a predetermined status is detected by the status detection unit; a current image, which is the acquired image showing the current status of the monitored area; and a past image, which is the acquired image showing the status of the monitored area at a past date and time specified by the user.
  • With this configuration, by viewing the detection image, the user can confirm the status of the monitored area when a specified state (for example, the monitored person having fallen) was detected; by viewing the current image, the current status of the monitored area; and by viewing the past image, the status of the monitored area at a past date and time specified by the user.
  • In the present disclosure, it is preferable that the privacy information include first information regarding the detection image, second information regarding the current image, and third information regarding the past image.
  • This configuration allows the processing unit to process the detection image based on the first information, the current image based on the second information, and the past image based on the third information, realizing a system in which different processing can be performed on each of the detection image, the current image, and the past image.
  • In the present disclosure, it is preferable that the processing unit be incapable of executing, based on the first information, a non-display process for preventing the detection image from being displayed on the display unit, while being capable of executing, based on the second information, a current non-display process for preventing the current image from being displayed on the display unit, and capable of executing, based on the third information, a past non-display process for preventing past images from being displayed on the display unit.
  • When the privacy information reflects the wishes of the monitored person, the current image and past images can thus be hidden on the display unit according to those wishes. This makes it possible to satisfy the wishes of each monitored person regarding whether or not to display the current image and past images on the display unit.
  • In the present disclosure, it is preferable to include a storage unit capable of storing the acquired images, and that, in the past non-display process, the acquired images for displaying the past images not be stored in the storage unit.
  • In the present disclosure, it is preferable to include a pattern setting unit capable of setting one or more privacy patterns, which are combination patterns of the first information, the second information, and the third information, and a number output unit that outputs information indicating the number of monitored persons corresponding to each of the privacy patterns.
  • With this configuration, the user can ascertain the number of monitored persons corresponding to each privacy pattern by checking the information output by the number output unit, and can take action such as deleting a privacy pattern that corresponds to a relatively small number of monitored persons (e.g., zero).
  • In the present disclosure, it is preferable that the processing unit switch the display mode of the detection image depending on the state of the monitored person detected by the state detection unit.
  • For example, a system can be realized in which an abstraction process (e.g., mosaic processing) is not applied to the detection image when the monitored person is in danger. This makes it easier to grasp the details of the monitored person's condition in an emergency while protecting the monitored person's privacy under normal circumstances.
  • In the present disclosure, it is preferable that an operation unit that accepts operation input by a user be provided, and that the processing unit switch the display mode of the acquired image on the display unit in response to the operation input to the operation unit.
  • With this configuration, the display mode of the acquired image can be switched in response to operational input by the user, realizing a system that allows the user to switch the display mode according to the situation or their own wishes.
  • In the present disclosure, it is preferable that the processing unit switch the display mode of the acquired image on the display unit depending on the time of image capture by the imaging unit.
  • With this configuration, a system can be realized in which an acquired image that has been subjected to abstraction processing (e.g., mosaic processing) is displayed on the display unit during times when consideration of privacy is required, and an unprocessed acquired image is displayed during other times.
  • In the present disclosure, it is preferable to include a skeletal information generating unit that generates, based on an image captured by the imaging unit, skeletal information indicating the skeleton of a person present in the monitored area, and that the processing unit be capable of displaying the acquired image and the skeletal information in a superimposed state on the display unit.
  • FIG. 1 is a diagram showing an overview of a monitoring system.
  • FIG. 2 is a block diagram of the monitoring system.
  • FIG. 3 is a diagram showing a home screen.
  • FIG. 4 is a diagram illustrating an example of skeletal information.
  • FIG. 5 is a diagram showing an example of a second image.
  • FIG. 6 is a diagram showing an example of a second image on which skeletal information is superimposed.
  • FIG. 7 is a diagram showing an example of a first image on which skeletal information is superimposed.
  • FIG. 8 is a diagram showing the home screen.
  • FIG. 9 is a diagram showing a pattern setting screen.
  • FIG. 10 is a diagram showing an individual setting screen.
  • FIG. 11 is a flowchart of a detection image storage flow.
  • FIG. 12 is a flowchart of a detection image display flow.
  • FIG. 13 is a flowchart of a current image display flow.
  • Further figures include flowcharts of a past image storage flow and a past image display flow, and a diagram illustrating a download screen.
  • The monitoring system A in this embodiment includes a plurality of monitoring devices 1 and a monitoring terminal 2.
  • The monitoring devices 1 and the monitoring terminal 2 are connected to each other via a predetermined network.
  • Each monitoring device 1 is installed in a different monitored area B, and a monitored person D exists in each monitored area B.
  • The monitoring device 1 monitors the monitored person D who is present in the monitored area B in which the monitoring device 1 is installed. With this configuration, there is a one-to-one correspondence between each monitoring device 1 and each monitored person D. In other words, the monitoring system A is equipped with multiple monitoring devices 1 corresponding to multiple monitored persons D.
  • An image showing the situation within the monitored area B is transmitted from the monitoring device 1 to the monitoring terminal 2, so that the user E of the monitoring terminal 2 can grasp the situation within the monitored area B while being outside it.
  • In this embodiment, the monitored area B is a room in a welfare facility for the elderly, the user E is a staff member of the facility, and the monitoring terminal 2 is an information terminal (such as a personal computer or smartphone) used by the staff member.
  • The monitored person D is a resident of the facility, and the staff member (user E) is located in a room other than the monitored room (for example, a waiting room).
  • The number of monitoring devices 1 included in the monitoring system A may be any number equal to or greater than two. The monitoring devices 1 may be installed in each room of the elderly welfare facility, or in areas other than the rooms (e.g., hallways).
  • The monitoring system A may include one monitoring terminal 2 or multiple monitoring terminals 2. Multiple users E may each use their own monitoring terminal 2, or may share one.
  • The monitoring device 1 has an imaging unit 3, which captures an image of the monitored area B (see FIG. 1).
  • Since the monitoring system A includes multiple monitoring devices 1, it includes multiple imaging units 3, and each imaging unit 3 corresponds one-to-one with each monitored person D. In other words, the monitoring system A has multiple imaging units 3 corresponding to multiple monitored persons D, and each imaging unit 3 is configured to capture an image of the monitored area B in which the corresponding monitored person D is present.
  • In this embodiment, the imaging unit 3 is configured with an infrared camera, specifically a ToF (Time of Flight) camera. However, the imaging unit 3 may be any device other than an infrared camera as long as it is capable of capturing images within the monitored area B.
  • Accordingly, the captured image acquired by the imaging unit 3 is a depth image, which is an image including depth information. The captured image may be a video or a still image.
  • FIG. 3 shows a home screen 91 displayed on the display 21 of the monitoring terminal 2 (see FIG. 2); the display 21 corresponds to the "display unit" according to the present disclosure.
  • A first area F1 is provided in the lower right portion of the home screen 91. In FIG. 3, the first image 11 is displayed in the first area F1.
  • The first image 11 is an image captured by the imaging unit 3 and shows the monitored person D, a bed, curtains, walls, etc.
  • The first image 11 corresponds to the "acquired image" according to this disclosure, the acquired image being an image acquired based on imaging by the imaging unit 3.
  • The monitoring device 1 has a control unit 4, which includes a skeletal information generation unit 41 and a state detection unit 42.
  • The first image 11 is sent from the imaging unit 3 to the control unit 4, and the skeletal information generation unit 41 generates skeletal information 10, as shown in FIG. 4, based on the first image 11.
  • The skeletal information 10 is information that indicates the skeleton of a person (the monitored person D) present in the monitored area B. In other words, the monitoring system A is equipped with a skeletal information generation unit 41 that generates, based on the image captured by the imaging unit 3, skeletal information 10 indicating the skeleton of a person present in the monitored area B.
  • The skeletal information 10 in this embodiment is specifically a skeletal model, generated by analyzing the first image 11 to recognize the positions of the joints of the monitored person D in three dimensions and connecting the joints with straight lines.
  • The state detection unit 42 detects the state of the monitored person D based on the skeletal information 10 generated by the skeletal information generation unit 41. This allows the state detection unit 42 to detect whether the monitored person D is in a predetermined state.
  • Although not particularly limited, the state detected by the state detection unit 42 may be the body position of the monitored person D (e.g., standing, lying, sitting), and the predetermined state may be a dangerous state (e.g., a fallen state).
  • The monitoring system A may also be configured to allow the user E to set the above-mentioned predetermined state, for example by selecting it from options such as falling, lying down, slipping down, boundary position, getting up, and getting out of bed.
  • In this way, the state detection unit 42 detects the state of the monitored person D based on the skeletal information 10 generated from the first image 11. In other words, the monitoring system A is equipped with a state detection unit 42 that detects the state of the monitored person D based on the acquired image.
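As an illustration only, the following Python sketch shows how a body position might be inferred from the joint positions of a skeletal model; the joint names, the coordinate convention (z as height above the floor, in metres), and the thresholds are all invented for this sketch and are not taken from the disclosure.

```python
from typing import Dict, Tuple

Joint = Tuple[float, float, float]  # (x, y, z) position recognized in three dimensions

def detect_state(joints: Dict[str, Joint]) -> str:
    """Classify a body position from joint heights (z = metres above the floor)."""
    head_z = joints["head"][2]
    hip_z = joints["hip"][2]
    if head_z < 0.4 and hip_z < 0.4:
        return "lying"       # whole body near the floor: possibly fallen
    if head_z > 1.2:
        return "standing"
    return "sitting"

def is_predetermined_state(state: str) -> bool:
    """The predetermined state could be user-selectable; here it is 'lying'."""
    return state == "lying"
```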
  • The monitoring system A includes a processing unit P, which has a first processing unit 43 and a second processing unit 22. The first processing unit 43 is included in the control unit 4, and the second processing unit 22 is included in the monitoring terminal 2.
  • The processing unit P is configured to be able to control the display content on the display 21.
  • The processing unit P is configured to be able to generate a second image 12, as shown in FIG. 5, by performing an abstraction process on the first image 11. That is, the processing unit P is able to generate the second image 12 by performing an abstraction process on the first image 11, which is an image captured by the imaging unit 3.
  • The abstraction process in this embodiment is specifically a mosaic process. The present disclosure is not limited to this, however, and the abstraction process may be, for example, any blurring process other than a mosaic process, a silhouette process, or a process of filling in a predetermined color (for example, gray).
  • The degree of abstraction may be, for example, such that the monitored person D cannot be personally identified while the positions of objects (e.g., beds, curtains, walls) in the monitored area B can still be determined.
  • In this embodiment, the second image 12 is generated by applying the abstraction process to the entire first image 11 shown in FIG. 3. However, the processing unit P may be configured to generate the second image 12 by applying the abstraction process to only a partial area of the first image 11.
  • For example, the processing unit P may determine, based on the skeletal information 10, the part of the first image 11 to which the abstraction process is applied, abstracting only the parts of the monitored person D's body, clothing, etc. that require consideration for privacy (e.g., the face). Alternatively, the processing unit P may generate the second image 12 by abstracting only an area of the first image 11 set in advance by the monitored person D or the user E.
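As a concrete illustration of mosaic processing, the following NumPy sketch pixelates an image by replacing each tile with its mean value; the block size is arbitrary, and, as noted above, the same function could be applied to only a partial region (e.g., a face region determined from the skeletal information 10). This is a sketch, not the disclosure's implementation.

```python
import numpy as np

def pixelate(img: np.ndarray, block: int = 16) -> np.ndarray:
    """Mosaic `img` by replacing each block x block tile with its mean value."""
    h, w = img.shape[:2]
    H, W = h - h % block, w - w % block   # crop to a multiple of the block size
    if H == 0 or W == 0:
        return img.copy()                 # image smaller than one block
    out = img.copy()
    region = out[:H, :W].astype(np.float32)
    tiles = region.reshape(H // block, block, W // block, block, -1)
    means = tiles.mean(axis=(1, 3), keepdims=True)   # one mean per tile
    out[:H, :W] = np.broadcast_to(means, tiles.shape).reshape(region.shape).astype(img.dtype)
    return out

# To abstract only a partial region (e.g., a face region inferred from the
# skeletal information 10), apply the same function to a slice:
#     img[y0:y1, x0:x1] = pixelate(img[y0:y1, x0:x1], block=16)
```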
  • In either case, the processing unit P acquires the second image 12 by generating it based on the first image 11. That is, the second image 12 is an image acquired based on imaging by the imaging unit 3, and it corresponds to the "acquired image" according to the present disclosure. In other words, the first image 11 and the second image 12 are both acquired images.
  • FIGS. 6 and 7 show enlarged views of the first area F1 described above. As shown in FIG. 6, the processing unit P is capable of displaying the second image 12 and the skeletal information 10 in a superimposed state in the first area F1; as shown in FIG. 7, it is likewise capable of displaying the first image 11 and the skeletal information 10 in a superimposed state. In other words, the processing unit P can display the acquired image and the skeletal information 10 on the display 21 in a superimposed state.
  • The monitoring system A thus includes the display 21 capable of displaying the acquired image, which is an image acquired based on imaging by the imaging unit 3.
  • The monitoring terminal 2 has an operation unit 23 configured to accept operation input by the user E. The operation unit 23 is not particularly limited, and may be, for example, a keyboard or a mouse. That is, the monitoring system A includes the operation unit 23 that accepts operation input by the user E.
  • A first button 31, a second button 32, and a third button 33 are displayed above the first area F1 on the home screen 91. The user E can operate these buttons by inputting an operation via the operation unit 23.
  • The second processing unit 22 switches the display mode of the acquired image in the first area F1 in response to the operation of the first button 31, the second button 32, and the third button 33: in one mode, the first image 11 without the skeletal information 10 superimposed is displayed in the first area F1, as shown in FIG. 3; in another, the first image 11 with the skeletal information 10 superimposed, as shown in FIG. 7; and in another, the second image 12 with the skeletal information 10 superimposed, as shown in FIG. 6.
  • In this way, the processing unit P switches the display mode of the acquired image on the display 21 in response to the operation input to the operation unit 23.
  • The processing unit P can display a detection image G1 in the first area F1 of the home screen 91. The detection image G1 is an acquired image showing the state of the monitored area B when a predetermined state is detected by the state detection unit 42. In other words, the display 21 can display the detection image G1.
  • Here, the expression "state of the monitored area B" includes the state of the monitored person D (the state of a partial area within the monitored area B). Accordingly, the detection image G1 may be an acquired image showing the state of the monitored person D when the predetermined state is detected by the state detection unit 42.
  • In this embodiment, the detection image G1 is a video. More specifically, the detection image G1 is an acquired image based on imaging by the imaging unit 3 during a period including the time when the predetermined state is detected by the state detection unit 42 (for example, the time when the monitored person D falls). Hereinafter, this period is referred to as the "detection image period."
  • The start point of the detection image period is not particularly limited, but may be a predetermined number of seconds (for example, 20 seconds) before the time when the predetermined state is detected by the state detection unit 42; likewise, the end point may be a predetermined number of seconds (for example, 20 seconds) after that time.
  • Depending on the operated button, the display mode of the detection image G1 is the first image 11 without the skeletal information 10 superimposed, the first image 11 with the skeletal information 10 superimposed, or the second image 12 with the skeletal information 10 superimposed.
  • A second area F2 is provided to the left of the first area F1 on the home screen 91. A list is displayed in the second area F2, showing in chronological order a plurality of detection images G1 stored in the storage unit 5 (see FIG. 2) of the monitoring device 1. When a detection image G1 is selected from the list, it is displayed (played back) in the first area F1.
  • This list includes a first column 34 and a second column 35. The first column 34 shows, for each detection image G1, the specific content of the "predetermined state" detected by the state detection unit 42, for example sitting up, getting out of bed, boundary position, slipping down, falling, or lying down.
  • In the second column 35, a tag (identification information) is displayed for each detection image G1 indicating the situation when the image was captured. For example, in the second column 35 in the bottom row of the list shown in FIG. 8, a tag "Multiple people present" is displayed, indicating that multiple people were present in the monitored area B when that detection image G1 was captured.
  • The monitoring system A (processing unit P) may be configured to detect the presence of multiple people within the monitored area B based on the skeletal information 10 and, when such presence is detected, to display the "Multiple people present" tag in the second column 35 for the detection image G1 corresponding to that point in time. The technology for detecting the presence of multiple people based on the skeletal information 10 is publicly known, so a description of it is omitted.
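A minimal sketch of such tagging, assuming the skeletal information 10 yields one skeleton per detected person (the representation and tag text placement are illustrative):

```python
def tags_for_frame(skeletons: list) -> list:
    """Attach the "Multiple people present" tag when more than one skeleton is found."""
    return ["Multiple people present"] if len(skeletons) > 1 else []
```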
  • The processing unit P can display a current image G2 in the first area F1 of the home screen 91. The current image G2 is an acquired image showing the current (real-time) situation of the monitored area B. In other words, the display 21 can display the current image G2.
  • As above, the expression "state of the monitored area B" includes the state of the monitored person D, so the current image G2 may be an acquired image showing the current state of the monitored person D.
  • In this embodiment, the current image G2 is a video. Depending on the operated button, the display mode of the current image G2 is the first image 11 without the skeletal information 10 superimposed, the first image 11 with the skeletal information 10 superimposed, or the second image 12 with the skeletal information 10 superimposed.
  • A display start button 36 is displayed above the first area F1 on the home screen 91. When the user E operates the display start button 36 by inputting an operation via the operation unit 23, the current image G2 is displayed in the first area F1.
  • The display may be configured so that, if no operation is performed by the user E for a predetermined period of time while the current image G2 is displayed in the first area F1, the display of the current image G2 is terminated (interrupted).
  • The processing unit P can display a past image G3 (see FIG. 16) in the first area F1 of the home screen 91. The past image G3 is an acquired image showing the state of the monitored area B at a past date and time specified by the user E. In other words, the display 21 can display the past image G3.
  • As above, the past image G3 may be an acquired image showing the state of the monitored person D at the past date and time specified by the user E.
  • In this embodiment, the past image G3 is a video. Depending on the operated button, the display mode of the past image G3 is the first image 11 without the skeletal information 10 superimposed, the first image 11 with the skeletal information 10 superimposed, or the second image 12 with the skeletal information 10 superimposed.
  • A date and time specification section 37 and a play button 38 are displayed above the second area F2 on the home screen 91. The user E can specify a past date and time by operating the date and time specification section 37 through operational input via the operation unit 23.
  • A time vector tab T is displayed below the first area F1 on the home screen 91. The time vector tab T is a strip extending left and right, and it indicates, by color, pattern, etc., the state of the monitored person D and the presence or absence of recorded information for images captured by the imaging unit 3; the further to the left a part of the time vector tab T is, the earlier the time it corresponds to.
  • For example, the time vector tab T shown in FIGS. 3 and 8 indicates that for the first time period T1 there is recorded information (past image G3) of images captured by the imaging unit 3, that for the second time period T2 there is no such recorded information, and that for the third time period T3 the monitored person D was absent.
  • A vertical line Ta is displayed overlapping the time vector tab T at a position corresponding to the capture time of the acquired image displayed in the first area F1.
  • The monitoring system A includes a management computer 6, which is connected to each monitoring device 1 and the monitoring terminal 2 via a predetermined network. As shown in FIG. 2, the management computer 6 includes a pattern setting unit 61.
  • FIG. 9 shows a pattern setting screen 92 displayed on the display 21 of the monitoring terminal 2. A list is displayed on the pattern setting screen 92, showing the privacy patterns that have been set by the pattern setting unit 61 in response to operations by the user E.
  • A privacy pattern is a combination pattern of first information, second information, and third information, all of which are included in the privacy information.
  • Privacy information is information related to the acquired image. More specifically, it is information related to protecting the privacy of the monitored person D, concerning at least one of displaying and storing the acquired image.
  • The first information is information regarding protection of the privacy of the monitored person D concerning at least one of displaying and storing the detection image G1. The first information is either "photography OK" or "privacy": "photography OK" means that the storage and display of the detection image G1 as the first image 11 is desired (permitted), while "privacy" means that storage and display as the first image 11 is rejected (prohibited) and storage and display as the second image 12 is desired (permitted).
  • The second information is information regarding protection of the privacy of the monitored person D concerning the display of the current image G2. The second information is one of "photography OK", "privacy", and "photography NG": "photography OK" means that display of the current image G2 as the first image 11 is desired (permitted); "privacy" means that display as the first image 11 is rejected (prohibited) and display as the second image 12 is desired (permitted); and "photography NG" means that display of the current image G2 is rejected (prohibited) regardless of the display mode.
  • The third information is information regarding protection of the privacy of the monitored person D concerning the storage and display of past images G3. The third information is one of "photography OK", "privacy", and "photography NG": "photography OK" means that storage and display of the past image G3 as the first image 11 is desired (permitted); "privacy" means that storage and display as the first image 11 is rejected (prohibited) and storage and display as the second image 12 is desired (permitted); and "photography NG" means that storage and display of the past image G3 is rejected (prohibited) regardless of the display mode.
  • In this way, the privacy information in this embodiment includes first information regarding the detection image G1, second information regarding the current image G2, and third information regarding the past image G3.
  • In the example shown in FIG. 9, the set privacy patterns include one named "Facility-specific plan 1", in which the first information is "photography OK", the second information is "photography NG", and the third information is "photography OK".
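The first, second, and third information and the privacy pattern can be modeled as follows; this Python sketch is illustrative only and simply encodes the value sets and the "Facility-specific plan 1" example described above.

```python
from dataclasses import dataclass
from enum import Enum

class Setting(Enum):
    PHOTOGRAPHY_OK = "photography OK"  # first image may be stored/displayed
    PRIVACY = "privacy"                # only the abstracted second image
    PHOTOGRAPHY_NG = "photography NG"  # no storage/display in either mode

@dataclass(frozen=True)
class PrivacyPattern:
    name: str
    first: Setting   # detection image G1 (no "photography NG" option exists for it)
    second: Setting  # current image G2
    third: Setting   # past image G3

facility_plan_1 = PrivacyPattern(
    name="Facility-specific plan 1",
    first=Setting.PHOTOGRAPHY_OK,
    second=Setting.PHOTOGRAPHY_NG,
    third=Setting.PHOTOGRAPHY_OK,
)
```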
  • When the user E performs a setting operation, the pattern setting unit 61 sets one or more privacy patterns in accordance with the corresponding signal; more specifically, it adds, changes, or deletes privacy patterns. The privacy patterns that have been set are stored in the pattern setting unit 61.
  • In other words, the monitoring system A is equipped with a pattern setting unit 61 that can set one or more privacy patterns, which are combination patterns of the first information, the second information, and the third information.
  • The pattern setting unit 61 may also be configured to preset one or more privacy patterns that cannot be changed or deleted; such privacy patterns are likewise stored in the pattern setting unit 61.
  • The management computer 6 includes a privacy storage unit 62.
  • FIG. 10 shows an individual setting screen 93 displayed on the display 21 of the monitoring terminal 2. The individual setting screen 93 displays a monitored person list 39, which indicates, for each monitored person D, the name of the corresponding monitored area B (room), the name of the corresponding privacy pattern, and so on.
  • In the example shown in FIG. 10, a privacy pattern named "Monitoring Emphasis Plan" is associated with the monitored person D corresponding to the monitored area B (room) named "Room A01"; a privacy pattern named "No Photography Plan" with the monitored person D corresponding to "Room 101"; and a privacy pattern named "Privacy Emphasis Plan" with the monitored person D corresponding to "Room 201".
  • In this way, a different privacy pattern can be selected (set) for each monitored person D, that is, for each monitored area B (room) and for each monitoring device 1 (imaging unit 3).
  • A third area F3 is provided to the right of the monitored person list 39. When the user E selects a row from the monitored person list 39 by operational input via the operation unit 23, the setting contents for the monitored person D corresponding to that row are displayed in the third area F3.
  • The third area F3 displays a plan selection section 14. The user E can select one of the privacy patterns displayed on the pattern setting screen 92 (see FIG. 9), that is, one of the pre-set privacy patterns, by operating the plan selection section 14 through operational input via the operation unit 23.
  • The selected privacy pattern is stored in the privacy storage unit 62 as privacy information related to the acquired image, linked to the monitored person D. In this way, the privacy pattern corresponding to each monitored person D is stored in the privacy storage unit 62 for each monitored person D as privacy information related to the acquired image.
  • The privacy pattern corresponding to a monitored person D may be selected by the user E based on the intention of the monitored person D when he or she moves in; in that case, the privacy information is information indicating the intention of the monitored person D regarding the acquired image. The privacy pattern may, however, also be selected regardless of the intention of the monitored person D.
  • The management computer 6 has a number output unit 63. The number output unit 63 acquires all privacy patterns stored in the pattern setting unit 61 as well as the privacy patterns corresponding to each monitored person D stored in the privacy storage unit 62. Based on these, the number output unit 63 outputs information indicating the number of monitored persons D corresponding to each privacy pattern.
  • The management computer 6 sends this information to the monitoring terminal 2, which, based on it, displays the number of monitored persons D corresponding to each privacy pattern in the usage count column 26 of the list on the pattern setting screen 92, as shown in FIG. 9. In the example shown in FIG. 9, the number of monitored persons D linked to "Facility-specific plan 1" is 0, and the number linked to "Facility-specific plan 2" is 3.
  • In this way, the monitoring system A is equipped with a number output unit 63 that outputs information indicating the number of monitored persons D corresponding to each privacy pattern.
  • The pattern setting screen 92 may be configured such that only privacy patterns corresponding to zero monitored persons D can be deleted, and privacy patterns corresponding to one or more monitored persons D cannot be deleted.
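A minimal sketch of the counting performed by the number output unit 63, assuming a mapping from monitored person to pattern name; the example values reproduce the counts described above (0 and 3), and the deletion check mirrors the zero-usage rule.

```python
from collections import Counter

def usage_counts(all_patterns: list, assignments: dict) -> dict:
    """Count, for every configured pattern, the monitored persons linked to it."""
    counts = Counter(assignments.values())
    return {name: counts.get(name, 0) for name in all_patterns}

def can_delete(pattern: str, counts: dict) -> bool:
    """Only patterns with zero linked monitored persons may be deleted."""
    return counts.get(pattern, 0) == 0

# Example reproducing the counts above:
# usage_counts(["Facility-specific plan 1", "Facility-specific plan 2"],
#              {"D1": "Facility-specific plan 2", "D2": "Facility-specific plan 2",
#               "D3": "Facility-specific plan 2"})
# -> {"Facility-specific plan 1": 0, "Facility-specific plan 2": 3}
```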
  • The third area F3 also displays a previous-seconds setting section 15, a following-seconds setting section 16, a first recording condition button 17, and a second recording condition button 18.
  • The user E can set the start point of the above-mentioned detection image period by operating the previous-seconds setting section 15 through operational input via the operation unit 23; more specifically, the start point is set as a number of seconds before the time when the predetermined state is detected by the state detection unit 42. Likewise, the user E can set the end point of the detection image period by operating the following-seconds setting section 16, as a number of seconds after the time when the predetermined state is detected.
  • The user E can change the conditions under which the acquired images for displaying past images G3 are stored by operating the first recording condition button 17 or the second recording condition button 18. When the user E operates the first recording condition button 17, the acquired image for displaying the past image G3 is stored at all times; when the user E operates the second recording condition button 18, it is stored only when the presence of a person is detected within the monitored area B.
  • The monitoring system A may be configured to detect the presence of a person in the monitored area B based on the first image 11 captured by the imaging unit 3; the technology for detecting the presence of a person based on an image is publicly known, and a description thereof is omitted.
  • The present disclosure is not limited to this; it may also be possible to set the acquired image for displaying the past image G3 to be stored only when a predetermined state is detected by the state detection unit 42, or only when the monitored person D is asleep.
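The recording-condition choices described above can be modeled as follows; the enum and function names are illustrative, and the last two members correspond to the optional variants just mentioned.

```python
from enum import Enum, auto

class RecordingCondition(Enum):
    ALWAYS = auto()          # first recording condition button 17
    PERSON_PRESENT = auto()  # second recording condition button 18
    STATE_DETECTED = auto()  # optional variant described above
    WHILE_ASLEEP = auto()    # optional variant described above

def should_record(cond: RecordingCondition, person_present: bool,
                  state_detected: bool, asleep: bool) -> bool:
    """Decide whether to store the acquired image for displaying past images G3."""
    return {
        RecordingCondition.ALWAYS: True,
        RecordingCondition.PERSON_PRESENT: person_present,
        RecordingCondition.STATE_DETECTED: state_detected,
        RecordingCondition.WHILE_ASLEEP: asleep,
    }[cond]
```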
  • The monitoring device 1 has a transmission/reception unit 7, and the control unit 4 has a privacy information acquisition unit 44.
  • The privacy information acquisition unit 44 of the monitoring device 1 acquires, from the privacy storage unit 62 via the transmission/reception unit 7, the privacy information (privacy pattern) associated with the monitored person D corresponding to the monitoring device 1 (in particular, to the imaging unit 3 of the monitoring device 1).
  • In other words, the monitoring system A is equipped with a privacy information acquisition unit 44 that acquires the privacy information related to the acquired images stored for each monitored person D.
  • The first processing unit 43 is configured to perform processing related to at least one of displaying and storing the acquired image. The first processing unit 43 processes the acquired image corresponding to the monitored person D based on the privacy information corresponding to that monitored person D; more specifically, it switches between displaying the first image 11 or the second image 12 on the display 21 based on the privacy information.
  • In other words, the monitoring system A is equipped with a processing unit P that performs processing related to at least one of displaying and storing the acquired image. The processing unit P processes the acquired image corresponding to the monitored person D based on the privacy information corresponding to that monitored person D, and switches between displaying the first image 11 or the second image 12 on the display 21 based on the privacy information.
  • The processing of the acquired image by the processing unit P is described in detail below.
  • The first processing unit 43 has a detection processing unit 45, which includes a detection processing determination unit 81, a detection image generation unit 82, and a detection abstraction unit 83. The storage unit 5 has a temporary storage unit 51 and a detection storage unit 52.
  • The detection processing unit 45 temporarily stores the first image 11, sent from the imaging unit 3 to the control unit 4, in the temporary storage unit 51. The period during which the first image 11 is retained in the temporary storage unit 51 coincides with the above-mentioned detection image period: the first image 11 obtained by imaging from a point in time one detection image period before the present (for example, 40 seconds before the present) up to the present is stored in the temporary storage unit 51, and first images 11 obtained by imaging before that are erased from it.
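A minimal sketch of such a rolling buffer, assuming timestamped frames; the 20-second pre/post values follow the examples given above, and all names are illustrative.

```python
import time
from collections import deque

PRE_SECONDS = 20    # example start offset of the detection image period
POST_SECONDS = 20   # example end offset of the detection image period

class TemporaryStore:
    """Rolling buffer sketch for the temporary storage unit 51."""

    def __init__(self, retention: float = PRE_SECONDS + POST_SECONDS):
        self.retention = retention
        self.frames = deque()  # (timestamp, first image 11) pairs

    def push(self, frame, now: float = None) -> None:
        now = time.time() if now is None else now
        self.frames.append((now, frame))
        # Erase frames captured before the retention window.
        while self.frames and now - self.frames[0][0] > self.retention:
            self.frames.popleft()

    def clip(self, detected_at: float) -> list:
        """Frames within [detected_at - PRE_SECONDS, detected_at + POST_SECONDS]."""
        return [f for t, f in self.frames
                if detected_at - PRE_SECONDS <= t <= detected_at + POST_SECONDS]
```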
  • The detection processing unit 45 is configured to perform processing related to the storage of the detection image G1 according to the detection image storage flow shown in FIG. 11.
  • In this flow, step S01 is executed first. In step S01, the detection processing determination unit 81 determines whether or not the predetermined state has been detected by the state detection unit 42. If it has not been detected ("No" in step S01 of FIG. 11), the detection image storage flow ends for the time being; if it has ("Yes" in step S01 of FIG. 11), the process proceeds to step S02.
  • In step S02, the privacy information acquired by the privacy information acquisition unit 44 as described above is sent from the privacy information acquisition unit 44 to the detection processing determination unit 81. The detection processing determination unit 81 determines whether or not the first information included in the privacy information indicates the display of the first image 11; more specifically, it determines, based on the first information, whether or not the storage and display of the detection image G1 as the first image 11 is desired (permitted).
  • If the first information indicates the display of the first image 11 ("Yes" in step S02 of FIG. 11), the process proceeds to step S03; if not, that is, if the storage and display of the detection image G1 as the first image 11 is refused (prohibited) ("No" in step S02 of FIG. 11), the process proceeds to step S04.
  • In step S03, the detection image generation unit 82 acquires the first image 11 stored in the temporary storage unit 51, generates a detection image G1 of the first image 11 based on it, and stores the generated detection image G1 in the detection storage unit 52. The detection image storage flow then ends for the time being.
  • In step S04, the detection abstraction unit 83 acquires the first image 11 stored in the temporary storage unit 51, generates a second image 12 by performing the abstraction process on it, and sends the second image 12 to the detection image generation unit 82. The detection image generation unit 82 generates a detection image G1 of the second image 12 based on the second image 12 and stores it in the detection storage unit 52. The detection image storage flow then ends for the time being.
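The flow of FIG. 11 can be summarized in the following sketch; the function and argument names are illustrative, and `abstract` stands in for the abstraction process (e.g., the pixelation sketch above).

```python
def store_detection_image(detected: bool, first_info: str, frames: list,
                          detection_store: list, abstract=lambda f: f) -> None:
    """Steps S01-S04 of FIG. 11 with illustrative names.

    `frames` are the first images 11 held in the temporary storage unit for
    the detection image period.
    """
    if not detected:                                   # S01: no predetermined state
        return                                         # flow ends for the time being
    if first_info == "photography OK":                 # S02 -> S03
        detection_store.append(("first_image", list(frames)))
    else:                                              # "privacy": S02 -> S04
        detection_store.append(("second_image", [abstract(f) for f in frames]))
```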
  • In this way, the detection storage unit 52 can store both a detection image G1 of the first image 11 and a detection image G1 of the second image 12. In other words, the monitoring system A is equipped with a storage unit 5 that can store acquired images, and the storage unit 5 stores the first image 11.
  • The transmission/reception unit 7 has a detection transmission unit 71, which is configured to acquire a detection image G1 stored in the detection storage unit 52 via the control unit 4 in response to a request from the monitoring terminal 2, and to transmit it to the monitoring terminal 2.
  • The management computer 6 has an authority storage unit 64, which stores authority information, that is, information indicating the authority of the user E regarding viewing of acquired images. In this embodiment, the authority information is information indicating whether viewing of the first image 11 is permitted for all of the detection image G1, the current image G2, and the past image G3.
  • The authority information may also be, for example, information indicating the gender of the user E, information indicating whether the monitored person D corresponding to the acquired image and the user E are of the same gender, or information indicating the relationship between the monitored person D corresponding to the acquired image and the user E (for example, whether the user E is a staff member in charge of the monitored person D).
  • The monitoring terminal 2 has an authority acquisition unit 24, which is configured to be able to acquire the authority information stored in the authority storage unit 64. In other words, the monitoring system A is equipped with an authority acquisition unit 24 that acquires authority information, which is information indicating the authority of the user E to view the acquired image.
  • The second processing unit 22 is configured to perform processing related to the display of the detection image G1 in accordance with the detection image display flow shown in FIG. 12.
  • In step S11, the second processing unit 22 determines whether or not an operation has been performed to view a detection image G1. For example, when the user E selects a detection image G1 from the list in the second area F2 of the home screen 91 shown in FIG. 8 by operating input via the operation unit 23, it is determined in step S11 that such an operation has been performed. If no such operation has been performed ("No" in step S11 of FIG. 12), the detection image display flow ends for the time being; if one has ("Yes" in step S11 of FIG. 12), the process proceeds to step S12.
  • In step S12, the second processing unit 22 acquires the detection image G1 stored in the detection storage unit 52 via the control unit 4 and the transmission/reception unit 7 (detection transmission unit 71). The process then proceeds to step S13.
  • In step S13, it is determined whether the detection image G1 acquired by the second processing unit 22 from the detection storage unit 52 is of the first image 11. If it is not (in other words, if it is of the second image 12) ("No" in step S13 of FIG. 12), the process proceeds to step S14; if it is ("Yes" in step S13 of FIG. 12), the process proceeds to step S15.
  • In step S14, the detection image G1 of the second image 12 acquired by the second processing unit 22 in step S12 or step S16 (described below) is displayed on the display 21 (first area F1) under the control of the second processing unit 22, and the second processing unit 22 makes the above-mentioned first button 31 and second button 32 inoperable. The detection image display flow then ends for the time being.
  • In step S15, the authority acquisition unit 24 acquires the authority information from the authority storage unit 64, and the second processing unit 22 determines, based on the authority information, whether or not the user E using the monitoring terminal 2 has the authority to view the detection image G1 of the first image 11. If the user E does not have this authority ("No" in step S15 of FIG. 12), the process proceeds to step S16; if the user E has it ("Yes" in step S15 of FIG. 12), the process proceeds to step S17.
  • In step S16, the second processing unit 22 generates a detection image G1 of the second image 12 by performing the abstraction process on the detection image G1 of the first image 11 acquired from the detection storage unit 52, thereby acquiring a detection image G1 of the second image 12. The process then proceeds to step S14.
  • In step S17, the second processing unit 22 executes a process to permit the display of the detection image G1 of the first image 11 on the display 21 (first area F1); this process makes the above-mentioned first button 31 and second button 32 operable. The process then proceeds to step S18.
  • In step S18, the detection image G1 of the first image 11 acquired by the second processing unit 22 in step S12 is displayed on the display 21 (first area F1) under the control of the second processing unit 22. The detection image display flow then ends for the time being.
  • Depending on the selected display mode, the detection image G1 displayed in step S18 may be the second image 12. Even then, the user E can view the detection image G1 of the first image 11 by operating the first button 31 or the second button 32.
  • As described above, in this embodiment, the second processing unit 22 is configured to perform processing related to the display of the acquired image based on the authority information. The present disclosure is not limited to this: the second processing unit 22 may be configured to perform processing related to the storage of the acquired image based on the authority information, or both processing related to the display and processing related to the storage of the acquired image. In other words, the processing unit P performs processing related to at least one of displaying and storing the acquired image based on the authority information.
  • In some cases, in the detection image storage flow of FIG. 11, the process proceeds from step S02 to step S03 and the detection image G1 of the first image 11 is stored in the detection storage unit 52, and then, in the detection image display flow of FIG. 12, the process proceeds from step S16 to step S14 and the detection image G1 of the second image 12 is displayed. In such cases, the second processing unit 22 generates a second image 12 by performing the abstraction process on the first image 11 acquired from the detection storage unit 52 and displays the second image 12 on the display 21.
  • In other words, when displaying the second image 12, the processing unit P generates the second image 12 by performing the abstraction process on the first image 11 obtained from the storage unit 5 and causes the second image 12 to be displayed on the display 21. Furthermore, if the privacy information indicates that the first image 11 is to be displayed, but the authority information indicates that the user E is not permitted to view the first image 11 and is permitted to view the second image 12, the storage unit 5 stores the first image 11, and the processing unit P generates the second image 12 by performing the abstraction process on the first image 11 obtained from the storage unit 5 and causes it to be displayed on the display 21.
  • the detection image generation unit 82 stores the second image 12 in the detection storage unit 52, and the second processing unit 22 displays the second image 12 on the display 21.
  • the processing unit P stores the second image 12 in the memory unit 5 and displays the second image 12 on the display 21.
  • the processing unit P is configured such that it is not possible to execute non-display processing for preventing the detection image G1 from being displayed on the display 21 based on the first information.
  • The first processing unit 43 has a current processing unit 46.
  • The current processing unit 46 includes a current processing determination unit 84 and a current abstraction unit 85.
  • The transmitting/receiving unit 7 has a current transmitting unit 72.
  • The current transmitting unit 72 is configured to obtain a current image G2 from the control unit 4 in response to a request from the monitoring terminal 2, and to transmit the current image G2 to the monitoring terminal 2.
  • The second processing unit 22 and the current processing unit 46 are configured to perform processing related to the display of the current image G2 according to the current image display flow shown in FIG. 13.
  • In step S21, the privacy information acquired by the privacy information acquisition unit 44 as described above is sent from the privacy information acquisition unit 44 to the current processing determination unit 84.
  • Then, the current processing determination unit 84 determines whether the second information included in the privacy information indicates that the current image G2 is not to be displayed. More specifically, the current processing determination unit 84 determines, based on the second information included in the privacy information, whether the display of the current image G2 is refused (prohibited) regardless of the display mode.
  • If the second information indicates that the current image G2 is not to be displayed (more specifically, that the display of the current image G2 is refused (prohibited) regardless of the display mode) ("Yes" in step S21 of FIG. 13), the process proceeds to step S22. If the second information does not indicate that the current image G2 is not to be displayed (more specifically, if the display of the current image G2 is not refused (prohibited)) ("No" in step S21 of FIG. 13), the process proceeds to step S23.
  • In step S22, the current processing unit 46 executes a current non-display process.
  • The current non-display process is a process for preventing the current image G2 from being displayed on the display 21.
  • The current non-display process may be, for example, a process for controlling the current transmission unit 72 so that the current transmission unit 72 does not transmit the current image G2 to the monitoring terminal 2.
  • The current non-display process may also be, for example, a process for making the above-mentioned display start button 36 inoperable. Thereafter, this current image display flow ends for the time being.
  • In this way, the processing unit P can execute a current non-display process, which is a non-display process for preventing the current image G2 from being displayed on the display 21, based on the second information.
  • In step S23, the current processing determination unit 84 determines whether an operation has been performed to view the current image G2. For example, when the user E operates the display start button 36 on the home screen 91 shown in FIG. 3 by an operation input via the operation unit 23, a predetermined signal is sent from the monitoring terminal 2 to the control unit 4 via the transmission/reception unit 7. When the control unit 4 (current processing determination unit 84) receives the signal, it is determined in step S23 that an operation has been performed to view the current image G2.
  • If no operation has been performed to view the current image G2 ("No" in step S23 of FIG. 13), this current image display flow ends for the time being. If an operation has been performed to view the current image G2 ("Yes" in step S23 of FIG. 13), the process proceeds to step S24.
  • In step S24, the current processing determination unit 84 determines whether the second information included in the privacy information sent to the current processing determination unit 84 in step S21 indicates the display of the first image 11. More specifically, the current processing determination unit 84 determines, based on the second information included in the privacy information, whether the display of the current image G2 by the first image 11 is desired (permitted).
  • If the second information does not indicate the display of the first image 11 (more specifically, if the display of the current image G2 by the first image 11 is refused (prohibited)) ("No" in step S24 of FIG. 13), the process proceeds to step S25. If the second information indicates the display of the first image 11 (more specifically, if the display of the current image G2 by the first image 11 is desired (permitted)) ("Yes" in step S24 of FIG. 13), the process proceeds to step S26.
  • In step S25, the second processing unit 22 renders the first button 31 and second button 32 inoperable.
  • Next, the current abstraction unit 85 acquires the first image 11 from the imaging unit 3.
  • The current abstraction unit 85 then performs an abstraction process on the first image 11 to generate a current image G2 of the second image 12.
  • The current abstraction unit 85 then sends the generated current image G2 of the second image 12 to the monitoring terminal 2 via the current transmission unit 72, and the current image G2 is displayed on the display 21. Thereafter, this current image display flow ends for the time being.
  • In step S26, the authority acquisition unit 24 acquires authority information from the authority storage unit 64. Then, the second processing unit 22 determines, based on the authority information, whether or not the user E using the monitoring terminal 2 has the authority to view the current image G2 of the first image 11.
  • If the user E does not have the authority to view the current image G2 of the first image 11 ("No" in step S26 of FIG. 13), the process proceeds to step S25. If the user E has the authority to view the current image G2 of the first image 11 ("Yes" in step S26 of FIG. 13), the process proceeds to step S27.
  • In step S27, the second processing unit 22 executes a process to permit the display of the current image G2 of the first image 11 on the display 21 (first area F1). This process makes the above-mentioned first button 31 and second button 32 operable. After that, the process proceeds to step S28.
  • In step S28, the first image 11 captured by the imaging unit 3 is sent to the monitoring terminal 2 via the control unit 4 and the current transmission unit 72 under the control of the current processing unit 46. Then, the first image 11 is displayed as the current image G2 on the display 21 (first area F1) under the control of the second processing unit 22. Thereafter, this current image display flow ends for the time being.
  • Note that the current image G2 displayed in step S28 may be the second image 12. Even if the current image G2 of the second image 12 is displayed in step S28, the user E can view the current image G2 of the first image 11 by operating the first button 31 or the second button 32.
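  • The gating of the live image in steps S21 through S28 can be condensed into a small decision function. This is an illustrative sketch only; the string encoding of the second information ("refuse", "second_only", "first_ok") is a hypothetical simplification and not part of the disclosure.

```python
def current_image_action(second_info: str, may_view_first: bool,
                         view_requested: bool) -> str:
    # second_info encodes the per-person privacy setting for the current image:
    #   "refuse"      -> no display in any mode (step S21 "Yes" -> step S22)
    #   "second_only" -> only the abstracted second image may be shown
    #   "first_ok"    -> the unprocessed first image may be shown
    if second_info == "refuse":
        return "current non-display process"    # step S22
    if not view_requested:
        return "idle"                           # step S23 "No"
    if second_info != "first_ok" or not may_view_first:
        return "show second image"              # step S25: abstracted live feed
    return "show first image"                   # steps S27-S28

assert current_image_action("refuse", True, True) == "current non-display process"
assert current_image_action("first_ok", False, True) == "show second image"
assert current_image_action("first_ok", True, True) == "show first image"
```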
  • The first processing unit 43 has a past processing unit 47.
  • The past processing unit 47 includes a past processing determination unit 86 and a past abstraction unit 87.
  • The storage unit 5 has a past storage unit 53.
  • The past processing unit 47 is configured to perform processing related to the storage of the past image G3 according to the past image storage flow shown in FIG. 14.
  • In step S31, the privacy information acquired by the privacy information acquisition unit 44 as described above is sent from the privacy information acquisition unit 44 to the past processing determination unit 86.
  • Then, the past processing determination unit 86 determines whether or not the third information included in the privacy information indicates that the past image G3 is not to be displayed. More specifically, the past processing determination unit 86 determines, based on the third information included in the privacy information, whether or not the storage and display of the past image G3 are refused (prohibited) regardless of the display mode.
  • If the third information does not indicate that the past image G3 is not to be displayed (more specifically, if the storage and display of the past image G3 are not refused (prohibited)) ("No" in step S31 of FIG. 14), the process proceeds to step S32. If the third information indicates that the past image G3 is not to be displayed (more specifically, if the storage and display of the past image G3 are refused (prohibited) regardless of the display mode) ("Yes" in step S31 of FIG. 14), the process proceeds to step S33.
  • In step S32, the past processing determination unit 86 determines whether the third information included in the privacy information sent to the past processing determination unit 86 in step S31 indicates the display of the first image 11. More specifically, the past processing determination unit 86 determines, based on the third information included in the privacy information, whether the storage and display of the past image G3 by the first image 11 are desired (permitted).
  • If the third information indicates the display of the first image 11 (more specifically, if the storage and display of the past image G3 by the first image 11 are desired (permitted)) ("Yes" in step S32 of FIG. 14), the process proceeds to step S34. If the third information does not indicate the display of the first image 11 (more specifically, if the storage and display of the past image G3 by the first image 11 are refused (prohibited)) ("No" in step S32 of FIG. 14), the process proceeds to step S35.
  • In step S34, the past processing unit 47 sends the first image 11 sent from the imaging unit 3 to the control unit 4 to the past storage unit 53, and stores it in the past storage unit 53 as the past image G3. After that, this past image storage flow ends for the time being.
  • In step S35, the past abstraction unit 87 acquires the first image 11 sent from the imaging unit 3 to the control unit 4.
  • The past abstraction unit 87 then generates a second image 12 by performing an abstraction process on the first image 11.
  • The past abstraction unit 87 sends the second image 12 to the past storage unit 53 and stores it in the past storage unit 53 as the past image G3. Thereafter, this past image storage flow ends for the time being.
  • The period during which the acquired images stored in the past storage unit 53 in steps S34 and S35 are retained spans a predetermined number of days. That is, acquired images based on imaging from a point in time a predetermined number of days before the present (e.g., 90 days before the present) up to the present are stored in the past storage unit 53 as past images G3, and acquired images based on earlier imaging are erased from the past storage unit 53.
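  • The storage decision of steps S33 through S35 together with the retention rule can be pictured as follows. The string encoding of the third information and the 90-day default are illustrative assumptions; the disclosure speaks only of "a predetermined number of days".

```python
import datetime as dt

def store_past_image(store: list, image: str, third_info: str,
                     taken_at: dt.datetime) -> None:
    # third_info: "refuse" -> step S33 (store nothing),
    #             "first_ok" -> step S34, anything else -> step S35.
    if third_info == "refuse":
        return
    if third_info == "first_ok":
        store.append((taken_at, image))                   # raw first image 11
    else:
        store.append((taken_at, f"abstracted({image})"))  # second image 12

def prune(store: list, now: dt.datetime, keep_days: int = 90) -> list:
    """Erase past images captured more than keep_days before the present."""
    cutoff = now - dt.timedelta(days=keep_days)
    return [(t, img) for (t, img) in store if t >= cutoff]

store: list = []
now = dt.datetime(2023, 8, 22)
store_past_image(store, "frame", "first_ok", now - dt.timedelta(days=100))
store_past_image(store, "frame", "second_only", now)
assert prune(store, now) == [(now, "abstracted(frame)")]
```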
  • In step S33, the past processing unit 47 executes a first past non-display process.
  • The first past non-display process is a specific example of a past non-display process.
  • The past non-display process is a non-display process for not displaying the past image G3 on the display 21.
  • The first past non-display process is a process for not storing the acquired image for displaying the past image G3 in the past storage unit 53. More specifically, the first past non-display process is a process for prohibiting the past processing unit 47 from sending the acquired image to the past storage unit 53. Thereafter, this past image storage flow ends for the time being.
  • In this way, the processing unit P can execute a past non-display process, which is a non-display process for not displaying the past image G3 on the display 21, based on the third information.
  • When the past non-display process is executed, the acquired image for displaying the past image G3 is not stored in the storage unit 5.
  • The transmission/reception unit 7 has a past transmission unit 73.
  • The past transmission unit 73 is configured to acquire the past image G3 stored in the past storage unit 53 via the control unit 4 in response to a request from the monitoring terminal 2, and to transmit the past image G3 to the monitoring terminal 2.
  • The second processing unit 22 and the past processing unit 47 are configured to perform processing related to the display of the past image G3 according to the past image display flow shown in FIG. 15.
  • In step S41, the privacy information acquired by the privacy information acquisition unit 44 as described above is sent from the privacy information acquisition unit 44 to the past processing determination unit 86.
  • Then, the past processing determination unit 86 determines whether or not the third information included in the privacy information indicates that the past image G3 is not to be displayed. More specifically, the past processing determination unit 86 determines, based on the third information included in the privacy information, whether or not the storage and display of the past image G3 are refused (prohibited) regardless of the display mode.
  • If the third information indicates that the past image G3 is not to be displayed (more specifically, that the storage and display of the past image G3 are refused (prohibited) regardless of the display mode) ("Yes" in step S41 of FIG. 15), the process proceeds to step S42. If the third information does not indicate that the past image G3 is not to be displayed (more specifically, if the storage and display of the past image G3 are not refused (prohibited)) ("No" in step S41 of FIG. 15), the process proceeds to step S43.
  • In step S42, the past processing unit 47 executes a second past non-display process.
  • The second past non-display process is a specific example of a past non-display process.
  • The second past non-display process may be, for example, a process of controlling the past transmission unit 73 so that the past transmission unit 73 does not transmit the past image G3 to the monitoring terminal 2.
  • The second past non-display process may also be, for example, a process of making the date and time specification unit 37 and the playback button 38 inoperable. Thereafter, this past image display flow ends for the time being.
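  • The two example realizations of the second past non-display process can be sketched as follows. `PastTransmitter` and `PlaybackControls` are hypothetical stand-ins for the past transmission unit 73 and the date-and-time specification unit 37 / playback button 38.

```python
class PastTransmitter:
    """Variant 1: the server side simply refuses to transmit G3."""
    def __init__(self) -> None:
        self.blocked = False
    def send(self, image: str):
        return None if self.blocked else image

class PlaybackControls:
    """Variant 2: the terminal side locks the controls that request G3."""
    def __init__(self) -> None:
        self.enabled = True

def second_past_non_display(tx: PastTransmitter, ui: PlaybackControls) -> None:
    tx.blocked = True    # never transmit the past image G3
    ui.enabled = False   # lock the date/time specification unit and play button

tx, ui = PastTransmitter(), PlaybackControls()
second_past_non_display(tx, ui)
assert tx.send("G3") is None and not ui.enabled
```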
  • In step S43, the second processing unit 22 determines whether an operation for viewing the past image G3 has been performed. For example, when the user E operates the date and time specification unit 37 and the playback button 38 on the home screen 91 shown in FIG. 3 by an operation input via the operation unit 23, it is determined in step S43 that an operation for viewing the past image G3 has been performed.
  • If no operation has been performed to view the past image G3 ("No" in step S43 of FIG. 15), this past image display flow ends for the time being. If an operation has been performed to view the past image G3 ("Yes" in step S43 of FIG. 15), the process proceeds to step S44.
  • In step S44, the second processing unit 22 acquires the past image G3 stored in the past storage unit 53 via the control unit 4 and the transmission/reception unit 7 (past transmission unit 73). After that, the process proceeds to step S45.
  • In step S45, it is determined whether the past image G3 acquired by the second processing unit 22 from the past storage unit 53 is the first image 11. If the past image G3 is not the first image 11 (in other words, if it is the second image 12) ("No" in step S45 of FIG. 15), the process proceeds to step S46. On the other hand, if the past image G3 is the first image 11 ("Yes" in step S45 of FIG. 15), the process proceeds to step S47.
  • In step S46, the past image G3 of the second image 12 acquired by the second processing unit 22 in step S44 or step S48 (described later) is displayed on the display 21 (first area F1) under the control of the second processing unit 22. Furthermore, the second processing unit 22 makes the first button 31 and the second button 32 inoperable. After that, this past image display flow ends for the time being.
  • In step S47, the authority acquisition unit 24 acquires authority information from the authority storage unit 64. Then, the second processing unit 22 determines, based on the authority information, whether or not the user E using the monitoring terminal 2 has the authority to view the past image G3 of the first image 11.
  • If the user E does not have the authority to view the past image G3 of the first image 11 ("No" in step S47 of FIG. 15), the process proceeds to step S48. If the user E has the authority to view the past image G3 of the first image 11 ("Yes" in step S47 of FIG. 15), the process proceeds to step S49.
  • In step S48, the second processing unit 22 performs an abstraction process on the past image G3 of the first image 11 acquired by the second processing unit 22 from the past storage unit 53, thereby generating a past image G3 of the second image 12. As a result, the second processing unit 22 acquires the past image G3 of the second image 12. After that, the process proceeds to step S46.
  • In step S49, the second processing unit 22 executes a process to permit the display of the past image G3 of the first image 11 on the display 21 (first area F1). This process makes the above-mentioned first button 31 and second button 32 operable. After that, the process proceeds to step S50.
  • In step S50, the past image G3 of the first image 11 acquired by the second processing unit 22 in step S44 is displayed on the display 21 (first area F1) under the control of the second processing unit 22. After that, this past image display flow ends for the time being.
  • Note that the past image G3 displayed in step S50 may be the second image 12. Even if the past image G3 of the second image 12 is displayed in step S50, the user E can view the past image G3 of the first image 11 by operating the first button 31 or the second button 32.
  • The storage unit 5 includes a skeleton storage unit 54.
  • The skeletal information 10 generated by the skeletal information generating unit 41 is a moving image.
  • The generated skeletal information 10 is stored in the skeleton storage unit 54.
  • The period during which the skeletal information 10 is retained in the skeleton storage unit 54 spans a predetermined number of days. That is, the skeletal information 10 based on image capture from a point in time a predetermined number of days before the present (for example, 90 days before the present) up to the present is stored in the skeleton storage unit 54, and the skeletal information 10 based on earlier image capture is erased from the skeleton storage unit 54.
  • The second processing unit 22 acquires the skeletal information 10 stored in the skeleton storage unit 54 via the control unit 4 and the transmission/reception unit 7.
  • The second processing unit 22 then displays the acquired image on the display 21 so that the imaging time of the acquired image and the imaging time of the skeletal information 10 match.
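  • One way to render the acquired image and the skeletal information 10 with matching imaging times is to pair each frame with the skeleton sample nearest in time. The sketch below is illustrative; the timestamped-list representation is an assumption, not the format used by the skeleton storage unit 54.

```python
import bisect

def align(frames, skeletons):
    """Pair each (timestamp, image) frame with the (timestamp, skeleton)
    sample whose imaging time is closest. Both lists are sorted by time."""
    times = [t for t, _ in skeletons]
    pairs = []
    for t, img in frames:
        i = bisect.bisect_left(times, t)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(times)]
        best = min(candidates, key=lambda j: abs(times[j] - t))
        pairs.append((img, skeletons[best][1]))
    return pairs

pairs = align([(1.0, "frame1"), (2.0, "frame2")],
              [(0.9, "skel1"), (2.1, "skel2")])
assert pairs == [("frame1", "skel1"), ("frame2", "skel2")]
```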
  • [Download past images] FIG. 16 shows a download screen 94 displayed on the display 21 of the monitoring terminal 2. On the download screen 94, it is possible to trim and download the past image G3.
  • A time vector tab T is displayed at the bottom of the download screen 94.
  • A fourth area F4 and a fifth area F5 are provided above the time vector tab T.
  • The past image G3 is displayed in the fourth area F4.
  • The first button 31, the second button 32, and the third button 33 are displayed above the fourth area F4.
  • A trimming operation section Tb is displayed overlapping the time vector tab T.
  • The user E can operate the trimming operation section Tb by inputting an operation via the operation unit 23.
  • The trimming range of the past image G3 is determined according to the operation of the trimming operation section Tb.
  • The user E can also input operations within the fifth area F5 via the operation unit 23.
  • The save button 40 is displayed at the right edge of the bottom of the download screen 94.
  • The user E can operate the save button 40 by inputting an operation via the operation unit 23.
  • When the save button 40 is operated, a video of the trimming range determined by the trimming operation section Tb from among the past images G3 is downloaded.
  • The video is given the notes and labels input by operations within the fifth area F5.
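  • A sketch of the trim-and-annotate operation: select the frames inside the range chosen on the time vector tab T, then attach the notes and labels from the fifth area F5. The `Download` container and the second-based timestamps are hypothetical conveniences, not part of the disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class Download:
    frames: list                      # trimmed portion of the past image G3
    notes: str = ""
    labels: list = field(default_factory=list)

def trim(frames, start_s: float, end_s: float):
    """Keep only frames whose timestamps fall inside the trimming range."""
    return [(t, f) for (t, f) in frames if start_s <= t <= end_s]

frames = [(0.0, "a"), (5.0, "b"), (10.0, "c")]
dl = Download(trim(frames, 4.0, 11.0), notes="fall review", labels=["room 3"])
assert [f for _, f in dl.frames] == ["b", "c"]
```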
  • Each functional unit included in the monitoring device 1, the monitoring terminal 2, and the management computer 6 may be a physical device such as a microcomputer, or may be a functional unit in software.
  • For example, a configuration may be used in which a program corresponding to each functional unit is stored in a ROM or non-volatile memory (not shown), and the program is loaded into and executed by a CPU so that the process corresponding to each functional unit is executed.
  • As described above, privacy information regarding the acquired image is stored for each monitored person D. Then, based on the privacy information corresponding to the monitored person D, processing (e.g., mosaic processing) is performed on the acquired image corresponding to that monitored person D.
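  • As a concrete illustration of mosaic processing, the following minimal sketch pixelates a 2-D grayscale image by averaging fixed-size tiles. This is one common realization of an abstraction process, not necessarily the one used in the monitoring system.

```python
def mosaic(pixels, block: int = 8):
    """Replace each block x block tile of a 2-D grayscale image
    (a list of rows of ints) with the tile's average value."""
    h, w = len(pixels), len(pixels[0])
    out = [row[:] for row in pixels]
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = [pixels[j][i]
                    for j in range(y, min(y + block, h))
                    for i in range(x, min(x + block, w))]
            avg = sum(tile) // len(tile)
            for j in range(y, min(y + block, h)):
                for i in range(x, min(x + block, w)):
                    out[j][i] = avg
    return out

img = [[0, 100], [50, 150]]
assert mosaic(img, block=2) == [[75, 75], [75, 75]]
```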
  • The monitoring method may be configured such that the operations performed by each element in the above embodiment are performed in one or more steps.
  • For example, the monitoring method may be configured to include an imaging step corresponding to the imaging unit 3, a display step corresponding to the display 21, a privacy information acquisition step corresponding to the privacy information acquisition unit 44, and a processing step corresponding to the processing unit P.
  • The processing unit P may be capable of displaying the second image 12 on the display 21 (first area F1) without the skeletal information 10 being superimposed.
  • The detection image G1, the current image G2, the past image G3, and the skeletal information 10 may be still images.
  • The authority information may be information indicating whether viewing of the first image 11 is permitted for each of the detection image G1, the current image G2, and the past image G3. In other words, whether viewing of the first image 11 is permitted may differ between the detection image G1, the current image G2, and the past image G3.
  • The authority information may include information indicating whether viewing of the second image 12 is permitted for all of the detection image G1, the current image G2, and the past image G3.
  • The authority information may also include information indicating whether viewing of the second image 12 is permitted for each of the detection image G1, the current image G2, and the past image G3. In other words, whether viewing of the second image 12 is permitted may differ between the detection image G1, the current image G2, and the past image G3. Furthermore, if viewing of the second image 12 is not permitted, the configuration may be such that the user E can view neither the first image 11 nor the second image 12.
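  • One possible shape for such per-image-type authority information is sketched below. The `Authority` dataclass and its field names are hypothetical; the disclosure does not prescribe a data format.

```python
from dataclasses import dataclass, field

@dataclass
class Authority:
    # Per image type ("detection", "current", "past"):
    # may this user view the unprocessed first image 11?
    first: dict = field(default_factory=dict)
    # Whether viewing of the abstracted second image 12 is permitted
    # (a single flag here; it could equally be held per image type).
    second: bool = True

def viewable(auth: Authority, image_type: str, want_first: bool) -> bool:
    # If the second image is not viewable, neither image is viewable.
    if want_first:
        return auth.second and auth.first.get(image_type, False)
    return auth.second

auth = Authority(first={"detection": True, "current": False, "past": False})
assert viewable(auth, "detection", want_first=True)
assert not viewable(auth, "current", want_first=True)
assert viewable(auth, "past", want_first=False)
```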
  • The processing unit P may be configured to perform only one of the process related to displaying the acquired image and the process related to storing it, or may be configured to perform both.
  • The processing unit P may be configured to switch the display mode of the detection image G1 depending on the state of the monitored person D detected by the state detection unit 42.
  • For example, the detection processing unit 45 may be configured to store the detection image G1 of the second image 12 in the detection storage unit 52 when the state of the monitored person D detected by the state detection unit 42 is a sitting-up state, and to store the detection image G1 of the first image 11 in the detection storage unit 52 when the detected state of the monitored person D is a fallen state (a sketch of this switching follows below).
  • In this configuration, switching the detection image G1 stored in the detection storage unit 52 between the first image 11 and the second image 12 corresponds to "switching the display mode."
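  • A minimal sketch of the state-dependent switching, assuming a two-state encoding ("sitting_up", "fallen"); the state names and the `abstract` helper are illustrative stand-ins for the output of the state detection unit 42 and the abstraction process.

```python
def abstract(image: str) -> str:
    return f"abstracted({image})"    # stand-in for the abstraction process

def detection_image_for_state(image: str, state: str) -> str:
    # Privacy-sensitive states store the second image; emergencies such as
    # a fall store the first image so that detail is preserved.
    if state == "fallen":
        return image                 # first image 11
    return abstract(image)           # second image 12 (e.g., sitting up)

assert detection_image_for_state("G1", "sitting_up") == "abstracted(G1)"
assert detection_image_for_state("G1", "fallen") == "G1"
```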
  • The processing unit P may also be configured to switch the display mode of the acquired image on the display 21 depending on the time of image capture by the imaging unit 3.
  • For example, the current processing unit 46 may be configured to cause the display 21 to display the current image G2 of the second image 12 during a preset time period, and to display the current image G2 of the first image 11 during other times.
  • The time period may be, for example, a time period during which consideration must be given to the privacy of the monitored person D.
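  • The time-dependent switching might look like the sketch below. The 21:00-06:00 window is purely an assumed example of "a preset time period"; the disclosure does not specify one.

```python
import datetime as dt

def current_image_mode(now: dt.time,
                       start: dt.time = dt.time(21, 0),
                       end: dt.time = dt.time(6, 0)) -> str:
    # Inside the privacy window, show the abstracted second image;
    # outside it, show the first image. The window may wrap past midnight.
    if start > end:
        in_window = now >= start or now < end
    else:
        in_window = start <= now < end
    return "second image 12" if in_window else "first image 11"

assert current_image_mode(dt.time(23, 0)) == "second image 12"
assert current_image_mode(dt.time(12, 0)) == "first image 11"
```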
  • The present disclosure can be used not only in elderly welfare facilities, but also in various other facilities such as medical facilities and public facilities, and can also be used outdoors.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Alarm Systems (AREA)

Abstract

The present invention relates to a monitoring system (A) comprising: a plurality of imaging units (3) corresponding to a plurality of monitored persons, each of the imaging units (3) being configured to capture an image of a monitored area in which the corresponding monitored person is present; a display unit (21) capable of displaying an acquired image, which is an image acquired based on imaging by the imaging unit (3); a privacy information acquisition unit (44) that acquires privacy information which relates to the acquired image and is stored for each monitored person; and a processing unit (P) that performs processing related to at least one of displaying and storing the acquired image, the processing unit (P) processing, based on the privacy information corresponding to a monitored person, the acquired image corresponding to that monitored person.
PCT/JP2023/030078 2022-09-29 2023-08-22 Système de surveillance et procédé de surveillance WO2024070335A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022156230A JP2024049782A (ja) 2022-09-29 2022-09-29 監視システム及び監視方法
JP2022-156230 2022-09-29

Publications (1)

Publication Number Publication Date
WO2024070335A1 (fr)

Family

ID=90477277

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/030078 WO2024070335A1 (fr) 2022-09-29 2023-08-22 Système de surveillance et procédé de surveillance

Country Status (2)

Country Link
JP (1) JP2024049782A (fr)
WO (1) WO2024070335A1 (fr)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004032459A (ja) * 2002-06-27 2004-01-29 Hitachi Ltd 監視システム、およびこれに用いるコントローラと監視端末
JP2007243267A (ja) * 2006-03-06 2007-09-20 Sony Corp 映像監視システムおよび映像監視プログラム
JP2009225398A (ja) * 2008-03-19 2009-10-01 Secom Co Ltd 画像配信システム
JP2017046196A (ja) * 2015-08-27 2017-03-02 キヤノン株式会社 画像情報生成装置、画像情報生成方法、画像処理システム及びプログラム
JP2020107069A (ja) * 2018-12-27 2020-07-09 コニカミノルタ株式会社 見守り管理装置、その制御方法および見守りシステム
JP2020088840A (ja) * 2019-04-11 2020-06-04 アースアイズ株式会社 監視装置、監視システム、監視方法、監視プログラム
JP2020190889A (ja) * 2019-05-21 2020-11-26 新生電子株式会社 要介護者見守りシステム

Also Published As

Publication number Publication date
JP2024049782A (ja) 2024-04-10

Similar Documents

Publication Publication Date Title
US20230291872A1 (en) Electronic patient sitter management system and method for implementing
US6298374B1 (en) Communication management apparatus with user modifiable symbol movable among virtual spaces shared by user terminals to direct current user position in real world and recording medium used therefor
US20040227817A1 (en) Motion detecting system, motion detecting method, motion detecting apparatus, and program for implementing the method
CN103491848B (zh) 医疗用内窥镜系统
US11548760B2 (en) Elevator display system
EP3308699B1 (fr) Dispositif et procédé de traitement central pour système de surveillance de personnes à surveiller, et système pour surveiller des personnes à surveiller
KR100859679B1 (ko) 카메라 기반의 시스템에서 모드 스위칭을 위한 방법 및 장치
JP2008085874A (ja) 人物監視システムおよび人物監視方法
US20110176025A1 (en) Video information processing apparatus, video information processing method, and computer-readable storage medium
US20130034338A1 (en) Reproduction apparatus and control method thereof
WO2024070335A1 (fr) Système de surveillance et procédé de surveillance
US7995844B2 (en) Control apparatus, control method, and program for implementing the method
JP4350389B2 (ja) 通信ネットワークシステム、携帯端末装置およびコンピュータプログラム
JP2008108151A (ja) 監視システム
CN107408330B (zh) 被监视者监视系统的显示装置及其显示方法以及被监视者监视系统
JP2008140319A (ja) 個人認証装置および個人認証システム
JP7234948B2 (ja) 見守りシステム、およびイベントリストの表示方法
JP3703229B2 (ja) カメラ制御システム及び方法並びに記憶媒体
JP6895090B2 (ja) 検知システムおよび検知システムの表示処理方法
JP6747603B2 (ja) 監視支援装置および監視支援システム
US9307142B2 (en) Imaging method and imaging apparatus
JP7398870B2 (ja) 監視システム、監視方法および監視プログラム
WO2018096806A1 (fr) Dispositif de réglage de dispositif de surveillance de sujet surveillé, son procédé de réglage et système de surveillance de sujet surveillé
JP2004046822A (ja) 監視・制御方法及びそれを用いたビデオカメラ装置
JP2023039508A (ja) 映像監視装置、表示制御方法及び映像監視システム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23871576

Country of ref document: EP

Kind code of ref document: A1