WO2022209347A1 - Information processing device, information processing method, computer-readable medium, and information processing system - Google Patents
Information processing device, information processing method, computer-readable medium, and information processing system
- Publication number
- WO2022209347A1 (PCT/JP2022/005425)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information processing
- determination
- person
- period
- warning
- Prior art date
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19602—Image analysis to detect motion of the intruder, e.g. by frame subtraction
- G08B13/19613—Recognition of a predetermined image pattern or behaviour pattern indicating theft or intrusion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
- G08B21/02—Alarms for ensuring the safety of persons
- G08B21/04—Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
- G08B21/0407—Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis
- G08B21/0423—Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis detecting deviation from an expected pattern of behaviour or schedule
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
- G08B21/02—Alarms for ensuring the safety of persons
- G08B21/04—Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
- G08B21/02—Alarms for ensuring the safety of persons
- G08B21/04—Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
- G08B21/0407—Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis
- G08B21/0415—Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis detecting absence of activity per se
Definitions
- the present disclosure relates to an information processing device, an information processing method, a program, and an information processing system.
- Patent Document 1 describes outputting an alarm when the same posture continues for a predetermined period of time, based on images of the user captured by a camera, in order to prevent a decrease in work efficiency due to continuation of the same work. Specifically, Patent Document 1 describes outputting an alarm when the user holds a posture determined to be keyboard operation for one hour or longer, or remains on the phone for one hour or longer.
- The technique of Patent Document 1, however, may fail to output warnings appropriately when, for example, monitoring the behavior of pedestrians.
- An object of the present disclosure is, in view of the above-described problems, to provide an information processing device, an information processing method, a program, and an information processing system capable of appropriately outputting an image-based warning.
- In one aspect, an information processing device includes an acquisition unit that acquires information detected based on an image captured by an imaging device; a determination unit that performs a first determination process of determining to output a warning if the type of the object detected at the start and end of a first period is a person, and a second determination process of determining to output a warning if the behavior of a person detected based on the image continues for a second period or longer; and an output unit that outputs a warning based on the determination result of the determination unit.
- In another aspect, an information processing method acquires information detected based on an image captured by an imaging device, performs a first determination process of determining to output a warning when the type of the object detected at the start and end of a first period is a person, performs a second determination process of determining to output a warning when the behavior of a person detected based on the image continues for a second period or longer, and outputs a warning based on the determination results.
- In another aspect, a computer-readable medium stores a program that causes an information processing device to acquire information detected based on an image captured by an imaging device, to perform a first determination process of determining to output a warning if the type of the object detected at the start and end of a first period is a person, and to perform a second determination process of determining to output a warning if the behavior of a person detected based on the image continues for a second period or longer.
- In another aspect, an information processing system includes an imaging device that captures an image and an information processing device. The information processing device includes an acquisition unit that acquires information detected based on an image captured by the imaging device; a determination unit that performs a first determination process of determining to output a warning if the type of the object detected at the start and end of a first period is a person, and a second determination process of determining to output a warning if the behavior of a person detected based on the image continues for a second period or longer; and an output unit that outputs a warning based on the determination result of the determination unit.
- According to the present disclosure, an image-based warning can be appropriately output.
- FIG. 1 is a diagram showing a configuration example of the information processing device according to the embodiment.
- FIG. 2 is a diagram showing a configuration example of the information processing system according to the embodiment.
- FIG. 3 is a diagram showing a hardware configuration example of the information processing device according to the embodiment.
- FIG. 4 is a flowchart showing an example of processing of the information processing device according to the embodiment.
- FIG. 5 is a diagram explaining an example of the first determination process according to the embodiment.
- FIG. 6 is a diagram explaining an example of the second determination process according to the embodiment.
- FIG. 7 is a diagram showing an example of an image captured by the imaging device according to the embodiment and an example of human behavior detected based on the image.
- FIG. 8 is a diagram explaining an example of the determination process according to the embodiment.
- FIG. 1 is a diagram showing an example of the configuration of an information processing device 10 according to an embodiment.
- The information processing device 10 includes an acquisition unit 11, a determination unit 12, and an output unit 13.
- Each of these units may be implemented by cooperation of one or more programs installed in the information processing device 10 and hardware such as the processor 101 and the memory 102 of the information processing device 10.
- The acquisition unit 11 acquires various types of information from a storage unit inside the information processing device 10 or from an external device.
- For example, the acquisition unit 11 acquires information detected based on an image captured by the imaging device 20.
- The acquisition unit 11 may itself detect (acquire) information based on the image captured by the imaging device 20.
- The acquisition unit 11 may also acquire information on a person's behavior detected by another module in the information processing device 10 or by an external device.
- The determination unit 12 makes various determinations regarding the recording of the image captured by the imaging device 20.
- The "image" of the present disclosure includes at least one of a moving image and a still image.
- For example, the determination unit 12 may determine that a warning should be output when the type of the object detected at the start and end of the first period is a person. Further, the determination unit 12 may determine that a warning should be output when, for example, the behavior of the person detected based on the image continues for the second period or longer.
- The output unit 13 outputs (notifies) a warning (alert, alarm) based on the determination result of the determination unit 12.
- FIG. 2 is a diagram showing a configuration example of the information processing system 1 according to the embodiment.
- the information processing system 1 has an information processing device 10 and an imaging device 20 .
- The information processing device 10 and the imaging device 20 are connected via a network N so that they can communicate with each other.
- The numbers of information processing devices 10 and imaging devices 20 are not limited to the example in FIG. 2.
- Examples of the network N include, for example, the Internet, mobile communication systems, wireless LANs (Local Area Networks), LANs, and buses.
- Examples of mobile communication systems include, for example, fifth generation mobile communication systems (5G), fourth generation mobile communication systems (4G), third generation mobile communication systems (3G), and the like.
- The information processing device 10 is, for example, a device such as a server, a cloud, a personal computer, a network video recorder, or a smartphone.
- the information processing device 10 outputs a warning based on the image captured by the imaging device 20 .
- the imaging device 20 is, for example, a device such as a network camera, a camera, or a smartphone.
- the imaging device 20 captures an image using a camera and outputs (transmits) the captured image to the information processing device 10 .
- FIG. 3 is a diagram showing a hardware configuration example of the information processing apparatus 10 according to the embodiment.
- The information processing device 10 (computer 100) includes a processor 101, a memory 102, and a communication interface 103. These units may be connected by a bus or the like.
- Memory 102 stores at least a portion of program 104 .
- Communication interface 103 includes interfaces necessary for communication with other network elements.
- Memory 102 may be of any type suitable for a local technology network. Memory 102 may be, as a non-limiting example, a non-transitory computer-readable storage medium. Also, memory 102 may be implemented using any suitable data storage technology, such as semiconductor-based memory devices, magnetic memory devices and systems, optical memory devices and systems, fixed and removable memory, and the like. Although only one memory 102 is shown in computer 100, there may be several physically different memory modules in computer 100.
- Processor 101 may be of any type.
- Processor 101 may include one or more of a general purpose computer, a special purpose computer, a microprocessor, a Digital Signal Processor (DSP), and a processor based on a multi-core processor architecture as non-limiting examples.
- Computer 100 may have multiple processors, such as an application-specific integrated circuit chip that is temporally dependent on a clock synchronized with the main processor.
- Embodiments of the present disclosure may be implemented in hardware or dedicated circuitry, software, logic, or any combination thereof. Some aspects may be implemented in hardware, while other aspects may be implemented in firmware or software, which may be executed by a controller, microprocessor or other computing device.
- the present disclosure also provides at least one computer program product tangibly stored on a non-transitory computer-readable storage medium.
- a computer program product comprises computer-executable instructions, such as those contained in program modules, to be executed on a device on a target real or virtual processor to perform the processes or methods of the present disclosure.
- Program modules include routines, programs, libraries, objects, classes, components, data structures, etc. that perform particular tasks or implement particular abstract data types.
- the functionality of the program modules may be combined or split between program modules as desired in various embodiments.
- Machine-executable instructions for program modules may be executed within local or distributed devices. In a distributed device, program modules can be located in both local and remote storage media.
- Program code for executing the methods of the present disclosure may be written in any combination of one or more programming languages. The program code is provided to a processor or controller of a general-purpose computer, special-purpose computer, or other programmable data processing apparatus, and when executed by the processor or controller, performs the functions/acts specified in the flowcharts and/or block diagrams. The program code may execute entirely on a machine, partly on a machine, as a stand-alone software package partly on a machine and partly on a remote machine, or entirely on a remote machine or server.
- Non-transitory computer-readable media include various types of tangible storage media.
- Examples of non-transitory computer-readable media include magnetic recording media, magneto-optical recording media, optical disc media, semiconductor memories, and the like.
- Magnetic recording media include, for example, flexible disks, magnetic tapes, hard disk drives, and the like.
- Magneto-optical recording media include, for example, magneto-optical disks.
- Optical disc media include, for example, Blu-ray discs, CD (Compact Disc)-ROM (Read Only Memory), CD-R (Recordable), CD-RW (ReWritable), and the like.
- Semiconductor memories include, for example, solid state drives, mask ROMs, PROMs (Programmable ROMs), EPROMs (Erasable PROMs), flash ROMs, RAMs (random access memories), and the like.
- the program may also be delivered to the computer by various types of transitory computer readable media. Examples of transitory computer-readable media include electrical signals, optical signals, and electromagnetic waves. Transitory computer-readable media can deliver the program to the computer via wired channels, such as wires and optical fibers, or wireless channels.
- FIG. 4 is a flowchart showing an example of processing of the information processing apparatus 10 according to the embodiment.
- FIG. 5 is a diagram explaining an example of the first determination process according to the embodiment.
- FIG. 6 is a diagram explaining an example of the second determination process according to the embodiment.
- FIG. 7 is a diagram showing an example of an image captured by the imaging device 20 according to the embodiment and an example of human behavior detected based on the image.
- FIG. 8 is a diagram illustrating an example of determination processing of the information processing apparatus 10 according to the embodiment.
- The information processing device 10 tracks the location and behavior of each person based on the position, moving direction, moving speed, characteristics (for example, surface color, height), and the like of each person in each frame captured by the imaging device 20 at each time point. The information processing device 10 then executes the following processing for each of the plurality of persons appearing in the image captured by the imaging device 20. Hereinafter, any one of the plurality of persons captured in the image is also referred to as the "determination target person" as appropriate.
- In step S1, the acquisition unit 11 of the information processing device 10 acquires information indicating the behavior of the person to be determined, which is detected based on the image captured by the imaging device 20.
- The process of detecting the behavior of the person to be determined may be performed by any of the information processing device 10, the imaging device 20, and an external device, for example.
- The information processing device 10 may detect (estimate, infer) the behavior by, for example, AI (Artificial Intelligence) using deep learning or the like.
- For example, the information processing device 10 may estimate the skeleton (connection state of each joint point) of the person to be determined based on the image captured by the imaging device 20.
- Based on the estimated skeleton, the information processing device 10 may judge that the person to be determined is taking a specified posture. In the example of FIG. 7, a fall is detected as the behavior of a person 711 in an image 700 captured by the imaging device 20.
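The skeleton-based posture check described above can be sketched as a simple heuristic. The joint names and the head-versus-hip rule below are illustrative assumptions for explanation, not the patent's actual classifier:

```python
def posture_is_fall(keypoints):
    """Crude fall heuristic over estimated joint points.

    keypoints maps a joint name -> (x, y) in image coordinates,
    with y increasing downward. If the head is not clearly above
    the hips (torso closer to horizontal than vertical), treat
    the posture as a fall. Joint names are hypothetical.
    """
    head_x, head_y = keypoints["head"]
    hip_x, hip_y = keypoints["hip"]
    horizontal = abs(head_x - hip_x)
    vertical = hip_y - head_y  # positive when the head is above the hip
    return vertical < horizontal

# An upright person: head well above the hips -> not a fall.
upright = {"head": (100, 50), "hip": (100, 150)}
# A person lying down: head and hip at similar heights -> a fall.
lying = {"head": (50, 100), "hip": (150, 105)}
```

A production system would instead feed the full joint-point graph to a trained model, as the deep-learning passage above suggests; this sketch only shows where a posture decision plugs into the pipeline.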
- The determination unit 12 of the information processing device 10 then determines whether a warning is necessary (step S2).
- (First determination process)
- The determination unit 12 may perform a first determination process of determining to output a warning when the type of the object detected at the start and end of the first period (for example, 10 seconds) is a person.
- The predetermined area may be, for example, an area in the image captured by the imaging device 20 corresponding to a prohibited (no-entry) zone.
- The predetermined area may be set in the information processing device 10 in advance by an operator or the like.
- In the example of FIG. 5, it is shown that a person is continuously detected in the predetermined region of the image during a period 511 from time t50 to time t51 and during a period 512 from time t52 to time t53. A person is therefore detected in the predetermined region both at the start of the first period tP1 (for example, time t50) and at its end (for example, time t50 + tP1), so the determination unit 12 determines to output a warning in the first determination process.
- Alternatively, the determination unit 12 may determine to output a warning when an object is continuously detected in the predetermined region of the image for the first period (for example, 10 seconds) and the type of the object detected at the start and end of the first period is a person. As a result, for example, when a person enters a restricted area, the warning can be output appropriately even if there is a time period in which only a vehicle or the like, rather than the person, is detected at the person's location. Note that in the example of FIG. 5, the period 501 during which the object is detected continues from time t50 to time t53 (> t50 + tP1), showing that an object is detected throughout the first period.
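As a rough sketch (not the claimed implementation), the first determination process can be expressed over timestamped detections in the restricted region. The 10-second first period follows the example in the text, while the data layout and function name are assumptions:

```python
FIRST_PERIOD = 10.0  # seconds; the example value of the first period

def first_determination(detections, start_time):
    """detections: time-sorted (timestamp, object_type) pairs for objects
    detected in the predetermined region. Returns True (output a warning)
    when an object is detected over the first period and its type is
    "person" at both the start and the end of the period."""
    end_time = start_time + FIRST_PERIOD
    window = [(t, kind) for t, kind in detections if start_time <= t <= end_time]
    if not window:
        return False
    # Simplification: check the detected type at (or nearest to) both
    # endpoints of the period, as in the text's example.
    return window[0][1] == "person" and window[-1][1] == "person"
```

Note that a mid-period detection typed as "vehicle" does not suppress the warning, which matches the restricted-area example above.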
- The determination unit 12 may perform a second determination process of determining to output a warning when a specific behavior of the person detected based on the image continues for a second period (for example, 10 seconds) or longer.
- The determination unit 12 may perform the second determination process, for example, on a region in the image that is set in the information processing device 10 in advance by an operator or the like. As a result, for example, when a person falls over, sits down, or the like for a certain period of time or longer, a warning can be output appropriately.
- In the example of FIG. 6, a period 601 during which the specific action by the person is detected continues from time t60 to time t62 (> t60 + tP2), showing that the specific action continues for the second period tP2 or longer. Therefore, the determination unit 12 determines to output a warning in the second determination process.
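The second determination process reduces to a duration check over detected action intervals. A minimal sketch, assuming intervals are given as (start, end) pairs and using the text's 10-second example value:

```python
SECOND_PERIOD = 10.0  # seconds; the example value of the second period

def second_determination(action_intervals):
    """action_intervals: (start, end) spans during which the specific
    action (e.g. a fall) is detected. Returns True (output a warning)
    when any continuous span lasts the second period or longer."""
    return any(end - start >= SECOND_PERIOD for start, end in action_intervals)
```

In the FIG. 6 example, a single span from t60 to t62 exceeding tP2 would satisfy this check.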
- The determination unit 12 may select, from among a plurality of determination processes including the first determination process and the second determination process, one or more determination processes to apply based on the behavior of the person detected from the image. In this case, one or more determination processes for determining whether a warning is necessary may be specified in the information processing device 10 by an operator or the like for each type of behavior.
- For example, the determination unit 12 may execute the first determination process when the behavior of the person detected based on the image is a first behavior (for example, entering a no-entry zone), and may execute the second determination process when the behavior is a second behavior (for example, falling, sitting down, crouching, etc.). This makes it possible to appropriately determine whether a warning is necessary according to the behavior of the person.
- The determination unit 12 may compare the total length of time during which the person to be determined performs the specific action within a specific period (hereinafter also referred to as the "action determination period" as appropriate; for example, 20 seconds) with a first threshold (hereinafter also referred to as the "action determination threshold" as appropriate; for example, 10 seconds). The determination unit 12 may then determine to output a warning when the total is equal to or greater than the action determination threshold. Thereby, for example, an image-based warning can be appropriately output.
- The determination unit 12 may determine not to output the warning when a period during which the specific action by the person to be determined is not detected continues within the action determination period for a second threshold (hereinafter also referred to as the "action continuation threshold" as appropriate; for example, 3 seconds) or longer. Thereby, for example, an image-based warning can be appropriately output.
- FIG. 8 shows that the total length of time (the specific action period) during which the person to be determined performs the specific action within the action determination period tP3 is t21 + t22 + t23. It also shows that the lengths of the periods t31 and t32, during which the person to be determined does not perform the specific action within the action determination period tP3, are each less than the action continuation threshold tC. In this case, the determination unit 12 determines to output a warning when the total (t21 + t22 + t23) is equal to or greater than the action determination threshold, and determines not to output a warning when it is not.
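The FIG. 8 logic (sum the action time, but abort if any gap reaches the continuation threshold) can be sketched as follows. The numeric values are the text's examples; the interval representation is an assumption:

```python
ACTION_DETERMINATION_PERIOD = 20.0   # t_P3 in FIG. 8 (for context)
ACTION_DETERMINATION_THRESHOLD = 10.0
ACTION_CONTINUATION_THRESHOLD = 3.0  # t_C, the maximum tolerated gap

def warn_by_total_action_time(intervals):
    """intervals: time-sorted (start, end) spans within one action
    determination period during which the specific action is detected.
    A gap at or above the continuation threshold cancels the warning;
    otherwise, warn when the summed action time (t21 + t22 + t23 in
    the FIG. 8 example) reaches the action determination threshold."""
    for (_, prev_end), (next_start, _) in zip(intervals, intervals[1:]):
        if next_start - prev_end >= ACTION_CONTINUATION_THRESHOLD:
            return False  # the action was interrupted for too long
    total = sum(end - start for start, end in intervals)
    return total >= ACTION_DETERMINATION_THRESHOLD
```

For example, spans of 4 s + 4 s + 4 s separated by 1-second gaps trigger the warning (total 12 s, gaps under tC), while a single 4-second gap suppresses it.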
- The determination unit 12 may determine at least one of the length of the action determination period, the action determination threshold, and the action continuation threshold based on a predetermined condition. Thereby, for example, the need for a warning can be determined more appropriately. This is because lengthening the action determination period or lowering the action determination threshold can reduce cases in which a warning fails to be output when, for example, the total period during which the behavior of the person to be determined cannot be detected becomes relatively long. Likewise, increasing the action continuation threshold can reduce cases in which the warning is not output when that undetected period becomes relatively long. Examples of the predetermined conditions are described below. Note that the determination unit 12 may determine at least one of the length of the action determination period, the action determination threshold, and the action continuation threshold by combining a plurality of the conditions below.
- For example, the determination unit 12 may determine at least one of the length of the action determination period, the action determination threshold, and the action continuation threshold based on the circumstances around the person to be determined, as determined from the image captured by the imaging device 20. In this case, the determination unit 12 may determine these values based on, for example, the degree of congestion around the person to be determined, such as other persons and moving objects (e.g., vehicles) existing around the person (e.g., within an image area of a predetermined range including the area in which the person is captured).
- When the degree of congestion around the person to be determined is low, the determination unit 12 may set the length of the action determination period to a first period length, the action determination threshold to a first action determination threshold, and the action continuation threshold to a first action continuation threshold.
- When the degree of congestion around the person to be determined is high, the determination unit 12 may set the length of the action determination period to a second period length longer than the first period length, the action determination threshold to a second action determination threshold smaller than the first action determination threshold, and the action continuation threshold to a second action continuation threshold larger than the first action continuation threshold. Thereby, even when the area around a person is crowded, as in the area 701 in FIG. 7, and the degree of congestion is high, the need for a warning can be determined more appropriately.
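The congestion-based parameter selection above can be sketched as a lookup. The congestion cutoff and the second-set numbers are illustrative assumptions; only the first-set values (20 s / 10 s / 3 s) come from the text's examples:

```python
CONGESTION_THRESHOLD = 3  # assumed cutoff for "crowded"; not from the text

def thresholds_for_congestion(nearby_objects):
    """nearby_objects: count of other persons / moving objects detected
    around the person to be determined. Returns (period_length,
    determination_threshold, continuation_threshold). When crowded,
    the period is lengthened, the determination threshold lowered, and
    the tolerated gap raised, since detection is more likely to drop out."""
    if nearby_objects >= CONGESTION_THRESHOLD:
        return (30.0, 8.0, 5.0)   # second set: more tolerant of gaps (illustrative)
    return (20.0, 10.0, 3.0)      # first set: the text's example values
```

The same pattern applies to the capture-time, foreground-object, action-type, and attribute conditions described below, each swapping in its own parameter sets.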
- The determination unit 12 may also determine at least one of the length of the action determination period, the action determination threshold, and the action continuation threshold based on, for example, the time at which the image was captured by the imaging device 20. In this case, if the time at which the image was captured is not within a predetermined time period, the determination unit 12 may set the length of the action determination period to a third period length, the action determination threshold to a third action determination threshold, and the action continuation threshold to a third action continuation threshold.
- The predetermined time period may be preset in the information processing device 10.
- If the time at which the image was captured is within the predetermined time period, the determination unit 12 may set the length of the action determination period to a fourth period length longer than the third period length, the action determination threshold to a fourth action determination threshold smaller than the third action determination threshold, and the action continuation threshold to a fourth action continuation threshold larger than the third action continuation threshold.
- Note that the determination unit 12 may determine initial values of the action determination period and the action determination threshold according to the imaging device 20.
- Each of the initial values may be preset in the information processing device 10 for each of the one or more imaging devices 20.
- The determination unit 12 may also determine at least one of the length of the action determination period, the action determination threshold, and the action continuation threshold based on the type of an object in front of the person to be determined (on the near side as viewed from the imaging device 20) in the image captured by the imaging device 20. In this case, when no person, automobile, or the like is in front of the person to be determined, the determination unit 12 may set the length of the action determination period to a fifth period length, the action determination threshold to a fifth action determination threshold, and the action continuation threshold to a fifth action continuation threshold.
- When the type of the object in front of the person to be determined is a person, an automobile, or the like, the determination unit 12 may set the length of the action determination period to a sixth period length longer than the fifth period length, the action determination threshold to a sixth action determination threshold smaller than the fifth action determination threshold, and the action continuation threshold to a sixth action continuation threshold larger than the fifth action continuation threshold.
- the determination unit 12 may determine at least one of the length of the action determination period, the action determination threshold, and the action continuation threshold based on, for example, the type of action performed by the person to be determined. In this case, if the type of action performed by the person to be determined is sneezing, coughing, not wearing a mask, or the like, the determination unit 12 may determine the length of the action determination period to be a seventh period length, the action determination threshold to be a seventh action determination threshold, and the action continuation threshold to be a seventh action continuation threshold.
- otherwise, the determination unit 12 may determine the length of the action determination period to be an eighth period length longer than the seventh period length, the action determination threshold to be an eighth action determination threshold larger than the seventh action determination threshold, and the action continuation threshold to be an eighth action continuation threshold smaller than the seventh action continuation threshold.
- the determination unit 12 may determine at least one of the length of the action determination period, the action determination threshold, and the action continuation threshold based on, for example, attributes of the person to be determined that are determined based on the image captured by the imaging device 20. Note that the process of detecting a person's attributes may be performed by any of, for example, the information processing device 10, the imaging device 20, and an external device. When the information processing device 10 detects the attributes of a person, it may detect (estimate or infer) the attributes of the person (for example, age, gender, and height) by AI using, for example, deep learning.
- in this case, the determination unit 12 may determine the length of the action determination period to be a ninth period length, the action determination threshold to be a ninth action determination threshold, and the action continuation threshold to be a ninth action continuation threshold.
- otherwise, the determination unit 12 may determine the length of the action determination period to be a tenth period length shorter than the ninth period length, the action determination threshold to be a tenth action determination threshold smaller than the ninth action determination threshold, and the action continuation threshold to be a tenth action continuation threshold larger than the ninth action continuation threshold.
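The parameter-adjustment rules above can be condensed into a small selection function. The sketch below is illustrative only: the condition names and numeric values are assumptions, not taken from the publication; only the direction of each adjustment (longer period, lower determination threshold, higher continuation threshold when detection is likely to be interrupted) follows the text.

```python
from dataclasses import dataclass

@dataclass
class DeterminationParams:
    period_s: float                  # length of the action determination period
    action_threshold_s: float        # action determination threshold
    continuation_threshold_s: float  # action continuation threshold

def select_params(crowded: bool, in_time_window: bool) -> DeterminationParams:
    # Defaults (hypothetical values standing in for the "first"/"third" set).
    period, act_th, cont_th = 20.0, 10.0, 5.0
    if crowded:
        # Occlusion is more likely: lengthen the period, lower the
        # determination threshold, raise the continuation threshold.
        period, act_th, cont_th = 30.0, 8.0, 8.0
    if in_time_window:
        # The predetermined time period further relaxes detection gaps.
        period += 10.0
        act_th -= 2.0
        cont_th += 2.0
    return DeterminationParams(period, act_th, cont_th)
```

Several conditions may be combined, as the text notes; here the congestion rule is applied first and the time-of-day rule adjusts on top of it.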
- the output unit 13 of the information processing device 10 outputs a warning based on the determination result of the determination unit 12 (step S3).
- the output unit 13 may display a warning on the display screen of the information processing device 10, for example.
- the output unit 13 may cause the speaker of the information processing device 10 to output a warning sound, for example.
- the output unit 13 may also transmit a warning message or the like to a terminal, such as a smartphone, carried by an observer (for example, a security guard), or to a server of a monitoring center.
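The output unit's fan-out to a screen, a speaker, or a remote terminal can be sketched as a list of pluggable sinks. This is a minimal illustration; the class and sink names are assumptions, and real display, audio, or messaging APIs are deliberately replaced by plain callables.

```python
from typing import Callable, List

class OutputUnit:
    """Sketch of the output unit (reference sign 13): fans a warning out to
    whatever sinks are registered (screen, speaker, guard's terminal,
    monitoring-center server)."""
    def __init__(self) -> None:
        self._sinks: List[Callable[[str], None]] = []

    def add_sink(self, sink: Callable[[str], None]) -> None:
        self._sinks.append(sink)

    def warn(self, message: str) -> None:
        # Deliver the same warning to every registered sink.
        for sink in self._sinks:
            sink(message)

out = OutputUnit()
received: List[str] = []
out.add_sink(received.append)                          # e.g. draw on the display
out.add_sink(lambda m: received.append("sent:" + m))   # e.g. push to a guard's phone
out.warn("person in restricted area")
```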
- FIG. 9 is a diagram showing an example of the configuration of the information processing device 10 according to the embodiment.
- the example in FIG. 9 is different from the example in FIG. 1 mainly in that the information processing apparatus 10 has a setting unit 14 .
- the setting unit 14 may be implemented by cooperation between one or more programs installed in the information processing device 10 and hardware such as the processor 101 and the memory 102 of the information processing device 10 .
- the setting unit 14 sets various types of information used in the processing of the determination unit 12 described above.
- the setting unit 14 may set information specified by an operator (user, administrator) of the information processing apparatus 10, for example.
- the setting unit 14 may set information specified by a setting file or the like set at the time of factory shipment of the information processing apparatus 10, for example.
- the setting unit 14 may set, for each area on the image captured by the imaging device 20, one or more determination processes for deciding whether a warning is necessary, selected from a plurality of determination processes including the first determination process, the second determination process, and the third determination process. Then, based on the information set by the setting unit 14, the determination unit 12 may perform the determination process corresponding to the area of the image in which the person to be determined appears. As a result, for example, the first determination process can be set for an area in which a no-entry area is captured, and the second determination process and the third determination process can be set for an area in which the entrance/exit of a facility such as a store is captured.
- the setting unit 14 may set, for each type of human action, one or more determination processes for deciding whether a warning is necessary, selected from a plurality of determination processes including the first determination process, the second determination process, and the third determination process.
- the determination unit 12 may perform determination processing according to the behavior type of the person to be determined.
- the second determination process and the third determination process can be set for the action types such as falling, sitting down, and crouching.
- the setting unit 14 may set, for each combination of an area on the image captured by the imaging device 20 and a type of human action, one or more determination processes for deciding whether a warning is necessary, selected from a plurality of determination processes including the first determination process, the second determination process, and the third determination process. Then, based on the information set by the setting unit 14, the determination unit 12 may perform the determination process corresponding to the area of the image in which the person to be determined appears and the type of action of that person. Thereby, for example, the first determination process can be set for an action such as entering a restricted area.
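The per-region and per-action configuration described above amounts to a lookup table keyed by (region, action type). The sketch below is illustrative; the region and action names are hypothetical, and only the mapping idea comes from the text.

```python
from typing import Dict, FrozenSet, Tuple

FIRST, SECOND, THIRD = "first", "second", "third"

# Hypothetical settings as the setting unit (reference sign 14) might hold:
# each (region, action-type) pair maps to the determination processes to run.
settings: Dict[Tuple[str, str], FrozenSet[str]] = {
    ("no_entry_zone", "intrusion"): frozenset({FIRST}),
    ("store_entrance", "fall"):     frozenset({SECOND, THIRD}),
    ("store_entrance", "crouch"):   frozenset({SECOND, THIRD}),
}

def processes_for(region: str, action: str) -> FrozenSet[str]:
    # The determination unit consults the settings for the person being judged;
    # an unconfigured combination gets no special processing.
    return settings.get((region, action), frozenset())
```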
- the information processing device 10 may be a device included in one housing, but the information processing device 10 of the present disclosure is not limited to this.
- Each unit of the information processing apparatus 10 may be implemented by cloud computing configured by one or more computers, for example.
- the information processing device 10 and the photographing device 20 may be accommodated in the same housing to constitute an integrated information processing device. At least part of the processing of each functional unit of the information processing device 10 may be executed by the imaging device 20 .
- such information processing devices 10 are also included in examples of the "information processing device" of the present disclosure.
- at least part of a person's body may be hidden behind another object when the other object moves in front of the person as seen from the imaging device 20, or when the person moves behind the other object as seen from the imaging device 20. While at least part of the person's body is hidden behind another object, the person's actions may not be detectable.
- a warning is output when the total length of time during which a person performs a specific action in a specific period is equal to or greater than a threshold. As a result, a warning based on images can be output appropriately.
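The total-time rule summarized above, combined with the cancellation rule of appendix 4 (no warning when an undetected gap within the period reaches a second threshold), can be sketched as follows. The interval representation and function signature are assumptions for illustration.

```python
from typing import List, Tuple

def should_warn(detected: List[Tuple[float, float]],
                period: Tuple[float, float],
                total_threshold: float,
                gap_threshold: float) -> bool:
    """Warn when the summed detection time of the specific action inside the
    action determination period reaches the action determination threshold,
    unless some continuous undetected gap reaches gap_threshold."""
    start, end = period
    # Clip detection intervals to the determination period.
    clipped = [(max(s, start), min(e, end)) for s, e in detected]
    clipped = sorted((s, e) for s, e in clipped if e > s)
    total = sum(e - s for s, e in clipped)
    # Gaps are the stretches between consecutive detections (and the edges).
    edges = [start] + [t for iv in clipped for t in iv] + [end]
    gaps = [edges[i + 1] - edges[i] for i in range(0, len(edges) - 1, 2)]
    longest_gap = max(gaps) if gaps else end - start
    return total >= total_threshold and longest_gap < gap_threshold
```

For example, with a 20-second period, a 10-second total threshold, and a 5-second gap threshold, detections covering 15 seconds with only short interruptions trigger a warning, while the same total interrupted by a 6-second gap does not.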
- (Appendix 1)
An information processing device comprising:
acquisition means for acquiring information detected based on an image captured by an imaging device;
determination means for performing a first determination process of determining that a warning is to be output when the type of an object detected at the start and at the end of a first period is a person, and a second determination process of determining that a warning is to be output when an action by a person detected based on the image has continued for a second period or longer; and
output means for outputting a warning based on a determination result of the determination means.
(Appendix 2)
The information processing device according to appendix 1, wherein the determination means executes the first determination process when the action of the person detected based on the image is a first action, and executes the second determination process when the action of the person detected based on the image is a second action.
- (Appendix 3)
The information processing device according to appendix 1 or 2, wherein the determination means determines to output a warning when, in a specific period, the total length of time during which a specific action by a person is detected based on the image is equal to or greater than a first threshold.
- (Appendix 4)
The information processing device according to appendix 3, wherein the determination means determines not to output the warning when, in the specific period, a period during which the specific action by the person is not continuously detected based on the image is equal to or greater than a second threshold.
- (Appendix 5)
The information processing device according to appendix 4, wherein the determination means determines at least one of the length of the specific period, the first threshold, and the second threshold based on a predetermined condition.
(Appendix 6)
The information processing device according to appendix 4 or 5, wherein the determination means determines at least one of the length of the specific period, the first threshold, and the second threshold based on at least one of a situation around the person determined based on the image and a type of an object in front of the person detected based on the image.
(Appendix 7)
The information processing device according to any one of appendices 4 to 6, wherein the determination means determines at least one of the length of the specific period, the first threshold, and the second threshold based on at least one of an attribute of the person determined based on the image, the time at which the image was captured, the place where the image was captured, and the type of the specific action.
- (Appendix 8)
The information processing device according to any one of appendices 1 to 7, further comprising setting means for setting a determination method according to at least one of each area on the image captured by the imaging device and a type of human action, wherein the determination means performs determination processing based on the determination method set by the setting means.
(Appendix 9)
An information processing method comprising:
acquiring information detected based on an image captured by an imaging device;
determining that a warning is to be output when the type of an object detected at the start and at the end of a first period is a person, and determining that a warning is to be output when an action by a person detected based on the image has continued for a second period or longer; and
outputting a warning based on the determination results.
- (Appendix 10)
A non-transitory computer-readable medium storing a program for causing an information processing device to execute:
a process of acquiring information detected based on an image captured by an imaging device;
a first determination process of determining that a warning is to be output when the type of an object detected at the start and at the end of a first period is a person, and a second determination process of determining that a warning is to be output when an action by a person detected based on the image has continued for a second period or longer; and
a process of outputting a warning based on the determination results.
(Appendix 11)
An information processing system including an imaging device that captures an image and an information processing device, wherein the information processing device comprises:
acquisition means for acquiring information detected based on the image captured by the imaging device;
determination means for performing a first determination process of determining that a warning is to be output when the type of an object detected at the start and at the end of a first period is a person, and a second determination process of determining that a warning is to be output when an action by a person detected based on the image has continued for a second period or longer; and
output means for outputting a warning based on a determination result of the determination means.
(Appendix 12)
The information processing system according to appendix 11, wherein the determination means executes the first determination process when the action of the person detected based on the image is a first action, and executes the second determination process when the action of the person detected based on the image is a second action.
- 1 information processing system, 10 information processing device, 11 acquisition unit, 12 determination unit, 13 output unit, 20 imaging device
Abstract
Description
(Embodiment 1)
<Configuration>
The configuration of the information processing device 10 according to the embodiment will be described with reference to FIG. 1. FIG. 1 is a diagram showing an example of the configuration of the information processing device 10 according to the embodiment. The information processing device 10 includes an acquisition unit 11, a determination unit 12, and an output unit 13. Each of these units may be implemented by cooperation between one or more programs installed in the information processing device 10 and hardware such as the processor 101 and the memory 102 of the information processing device 10.
Next, the configuration of the information processing system 1 according to the embodiment will be described with reference to FIG. 2.
<System configuration>
FIG. 2 is a diagram showing a configuration example of the information processing system 1 according to the embodiment. In FIG. 2, the information processing system 1 includes the information processing device 10 and the imaging device 20. In the example of FIG. 2, the information processing device 10 and the imaging device 20 are connected so that they can communicate via a network N. Note that the numbers of information processing devices 10 and imaging devices 20 are not limited to the example in FIG. 2.
FIG. 3 is a diagram showing a hardware configuration example of the information processing device 10 according to the embodiment. In the example of FIG. 3, the information processing device 10 (computer 100) includes a processor 101, a memory 102, and a communication interface 103. These units may be connected by a bus or the like. The memory 102 stores at least part of a program 104. The communication interface 103 includes interfaces necessary for communication with other network elements.
Next, an example of the processing of the information processing device 10 according to the embodiment will be described with reference to FIGS. 4 to 8. FIG. 4 is a flowchart showing an example of the processing of the information processing device 10 according to the embodiment. FIG. 5 is a diagram explaining an example of the first determination process according to the embodiment. FIG. 6 is a diagram explaining an example of the second determination process according to the embodiment. FIG. 7 is a diagram showing an example of an image captured by the imaging device 20 according to the embodiment and human actions detected based on the image. FIG. 8 is a diagram explaining an example of the determination processing of the information processing device 10 according to the embodiment.
(First determination process)
The determination unit 12 may perform a first determination process of determining that a warning is to be output when the type of an object detected at the start and at the end of a first period (for example, 10 seconds) is a person. Note that the predetermined area may be, for example, an area of the image captured by the imaging device 20 in which a no-entry (no-intrusion) location appears. The predetermined area may be set in the information processing device 10 in advance by an operator or the like. Thereby, for example, when a person enters a restricted area, a warning can be output appropriately even if there is a time period during which the person cannot be detected because a vehicle or the like passed in front of the person as seen from the imaging device 20.
The determination unit 12 may also perform a second determination process of determining that a warning is to be output when a specific action by a person detected based on the image has continued for a second period (for example, 10 seconds) or longer. Note that the determination unit 12 may perform the second determination process on, for example, an area of the image set in the information processing device 10 in advance by an operator or the like. Thereby, for example, a warning can be output appropriately when a person has been fallen, sitting down, or the like for a certain time or longer. In the example of FIG. 6, the period 601 during which the specific action by the person is detected continues from time t60 to time t62 (> t60 + tP2), which shows that the specific action has continued for the second period tP2 or longer. Therefore, in the second determination process, the determination unit 12 determines that a warning is to be output.
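The first and second determination processes described here can be sketched as two small predicates. This is a minimal illustration; the function names are assumptions, and the 10-second default mirrors the example values in the text.

```python
from typing import Set

def first_determination(types_at_start: Set[str], types_at_end: Set[str]) -> bool:
    """First determination process (sketch): warn when an object of type
    'person' is detected in the monitored area both at the start and at the
    end of the first period, even if it was occluded in between."""
    return "person" in types_at_start and "person" in types_at_end

def second_determination(action_start_t: float, now_t: float,
                         second_period_s: float = 10.0) -> bool:
    """Second determination process (sketch): warn when a specific action
    (e.g. a fall or sit-down) has continued for the second period or longer."""
    return now_t - action_start_t >= second_period_s
```

In the FIG. 6 example, an action that starts at t60 and is still ongoing at t62 > t60 + tP2 makes `second_determination` true.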
The determination unit 12 may also determine whether, in a specific period (hereinafter also referred to as the "action determination period" as appropriate; for example, 20 seconds), the total length of time during which the person to be determined performs a specific action is equal to or greater than a first threshold (hereinafter also referred to as the "action determination threshold" as appropriate; for example, 10 seconds). The determination unit 12 may then determine that a warning is to be output when the total is equal to or greater than the action determination threshold. Thereby, for example, a warning based on an image can be output appropriately.
The determination unit 12 may determine at least one of the length of the action determination period, the action determination threshold, and the action continuation threshold based on a predetermined condition. Thereby, for example, whether a warning is necessary can be determined more appropriately. This is because lengthening the action determination period and/or lowering the action determination threshold reduces cases where no warning is output even when, for example, the total period during which the action of the person to be determined cannot be detected becomes relatively long. Likewise, raising the action continuation threshold reduces cases where no warning is output even when the total undetectable period becomes relatively long. Examples of the predetermined condition are described below. Note that the determination unit 12 may combine a plurality of the following conditions to determine at least one of the length of the action determination period, the action determination threshold, and the action continuation threshold.
The determination unit 12 may determine at least one of the length of the action determination period, the action determination threshold, and the action continuation threshold based on, for example, the situation around the person to be determined, determined based on the image captured by the imaging device 20. In this case, the determination unit 12 may make this determination based on, for example, the degree of congestion around the person to be determined. The determination unit 12 may determine that the degree of congestion is higher as the number of other persons and moving objects (for example, vehicles) present around the person (for example, in an image area of a predetermined range including the area in which the person appears) is larger. Then, when the degree of congestion is not equal to or greater than a predetermined congestion threshold, the determination unit 12 may determine the length of the action determination period to be a first period length, the action determination threshold to be a first action determination threshold, and the action continuation threshold to be a first action continuation threshold.
The determination unit 12 may also determine at least one of the length of the action determination period, the action determination threshold, and the action continuation threshold based on, for example, at least one of the time at which the image was captured by the imaging device 20 and the place where the image was captured. In this case, if the time at which the image was captured is not within a predetermined time period, the determination unit 12 may determine the length of the action determination period to be a third period length, the action determination threshold to be a third action determination threshold, and the action continuation threshold to be a third action continuation threshold. Note that the predetermined time period may be set in the information processing device 10 in advance.
The determination unit 12 may also determine at least one of the length of the action determination period, the action determination threshold, and the action continuation threshold based on, for example, the type of object in front of the person to be determined (on the near side as seen from the imaging device 20) in the image captured by the imaging device 20. In this case, when the type of the foreground object is a person, an automobile, or the like, the determination unit 12 may determine the length of the action determination period to be a fifth period length, the action determination threshold to be a fifth action determination threshold, and the action continuation threshold to be a fifth action continuation threshold.
The determination unit 12 may also determine at least one of the length of the action determination period, the action determination threshold, and the action continuation threshold based on, for example, the type of action performed by the person to be determined. In this case, when the type of action performed by the person to be determined is sneezing, coughing, not wearing a mask, or the like, the determination unit 12 may determine the length of the action determination period to be a seventh period length, the action determination threshold to be a seventh action determination threshold, and the action continuation threshold to be a seventh action continuation threshold.
The determination unit 12 may also determine at least one of the length of the action determination period, the action determination threshold, and the action continuation threshold based on, for example, attributes of the person to be determined, determined based on the image captured by the imaging device 20. Note that the process of detecting a person's attributes may be performed by any of, for example, the information processing device 10, the imaging device 20, and an external device. When the information processing device 10 detects the attributes of a person, it may detect (estimate or infer) the attributes of the person (for example, age, gender, and height) by AI using, for example, deep learning.
<Configuration>
The configuration of the information processing device 10 according to the embodiment will be described with reference to FIG. 9. FIG. 9 is a diagram showing an example of the configuration of the information processing device 10 according to the embodiment. The example of FIG. 9 differs from the example of FIG. 1 mainly in that the information processing device 10 has a setting unit 14. The setting unit 14 may be implemented by cooperation between one or more programs installed in the information processing device 10 and hardware such as the processor 101 and the memory 102 of the information processing device 10.
The information processing device 10 may be a device contained in a single housing, but the information processing device 10 of the present disclosure is not limited to this. Each unit of the information processing device 10 may be implemented by, for example, cloud computing configured by one or more computers. The information processing device 10 and the imaging device 20 may also be accommodated in the same housing to constitute an integrated information processing device. At least part of the processing of each functional unit of the information processing device 10 may be executed by the imaging device 20. Such information processing devices 10 are also included in examples of the "information processing device" of the present disclosure.
At least part of a person's body may be hidden behind another object when the other object moves in front of the person as seen from the imaging device 20, or when the person moves behind the other object as seen from the imaging device 20. While at least part of the person's body is hidden behind another object, the person's actions may not be detectable.
(Appendix 1)
An information processing device comprising:
acquisition means for acquiring information detected based on an image captured by an imaging device;
determination means for performing a first determination process of determining that a warning is to be output when the type of an object detected at the start and at the end of a first period is a person, and a second determination process of determining that a warning is to be output when an action by a person detected based on the image has continued for a second period or longer; and
output means for outputting a warning based on a determination result of the determination means.
(Appendix 2)
The information processing device according to appendix 1, wherein the determination means executes the first determination process when the action of the person detected based on the image is a first action, and executes the second determination process when the action of the person detected based on the image is a second action.
(Appendix 3)
The information processing device according to appendix 1 or 2, wherein the determination means determines to output a warning when, in a specific period, the total length of time during which a specific action by a person is detected based on the image is equal to or greater than a first threshold.
(Appendix 4)
The information processing device according to appendix 3, wherein the determination means determines not to output the warning when, in the specific period, a period during which the specific action by the person is not continuously detected based on the image is equal to or greater than a second threshold.
(Appendix 5)
The information processing device according to appendix 4, wherein the determination means determines at least one of the length of the specific period, the first threshold, and the second threshold based on a predetermined condition.
(Appendix 6)
The information processing device according to appendix 4 or 5, wherein the determination means determines at least one of the length of the specific period, the first threshold, and the second threshold based on at least one of a situation around the person determined based on the image and a type of an object in front of the person detected based on the image.
(Appendix 7)
The information processing device according to any one of appendices 4 to 6, wherein the determination means determines at least one of the length of the specific period, the first threshold, and the second threshold based on at least one of an attribute of the person determined based on the image, the time at which the image was captured, the place where the image was captured, and the type of the specific action.
(Appendix 8)
The information processing device according to any one of appendices 1 to 7, further comprising setting means for setting a determination method according to at least one of each area on the image captured by the imaging device and a type of human action, wherein the determination means performs determination processing based on the determination method set by the setting means.
(Appendix 9)
An information processing method comprising:
acquiring information detected based on an image captured by an imaging device;
determining that a warning is to be output when the type of an object detected at the start and at the end of a first period is a person, and determining that a warning is to be output when an action by a person detected based on the image has continued for a second period or longer; and
outputting a warning based on the determination results.
(Appendix 10)
A non-transitory computer-readable medium storing a program for causing an information processing device to execute:
a process of acquiring information detected based on an image captured by an imaging device;
a first determination process of determining that a warning is to be output when the type of an object detected at the start and at the end of a first period is a person, and a second determination process of determining that a warning is to be output when an action by a person detected based on the image has continued for a second period or longer; and
a process of outputting a warning based on the determination results.
(Appendix 11)
An information processing system including an imaging device that captures an image and an information processing device, wherein the information processing device comprises:
acquisition means for acquiring information detected based on the image captured by the imaging device;
determination means for performing a first determination process of determining that a warning is to be output when the type of an object detected at the start and at the end of a first period is a person, and a second determination process of determining that a warning is to be output when an action by a person detected based on the image has continued for a second period or longer; and
output means for outputting a warning based on a determination result of the determination means.
(Appendix 12)
The information processing system according to appendix 11, wherein the determination means executes the first determination process when the action of the person detected based on the image is a first action, and executes the second determination process when the action of the person detected based on the image is a second action.
10 information processing device
11 acquisition unit
12 determination unit
13 output unit
20 imaging device
Claims (12)
- An information processing device comprising:
acquisition means for acquiring information detected based on an image captured by an imaging device;
determination means for performing a first determination process of determining that a warning is to be output when the type of an object detected at the start and at the end of a first period is a person, and a second determination process of determining that a warning is to be output when an action by a person detected based on the image has continued for a second period or longer; and
output means for outputting a warning based on a determination result of the determination means.
- The information processing device according to claim 1, wherein the determination means executes the first determination process when the action of the person detected based on the image is a first action, and executes the second determination process when the action of the person detected based on the image is a second action.
- The information processing device according to claim 1 or 2, wherein the determination means determines to output a warning when, in a specific period, the total length of time during which a specific action by a person is detected based on the image is equal to or greater than a first threshold.
- The information processing device according to claim 3, wherein the determination means determines not to output the warning when, in the specific period, a period during which the specific action by the person is not continuously detected based on the image is equal to or greater than a second threshold.
- The information processing device according to claim 4, wherein the determination means determines at least one of the length of the specific period, the first threshold, and the second threshold based on a predetermined condition.
- The information processing device according to claim 4 or 5, wherein the determination means determines at least one of the length of the specific period, the first threshold, and the second threshold based on at least one of a situation around the person determined based on the image and a type of an object in front of the person detected based on the image.
- The information processing device according to any one of claims 4 to 6, wherein the determination means determines at least one of the length of the specific period, the first threshold, and the second threshold based on at least one of an attribute of the person determined based on the image, the time at which the image was captured, the place where the image was captured, and the type of the specific action.
- The information processing device according to any one of claims 1 to 7, further comprising setting means for setting a determination method according to at least one of each area on the image captured by the imaging device and a type of human action, wherein the determination means performs determination processing based on the determination method set by the setting means.
- An information processing method comprising:
acquiring information detected based on an image captured by an imaging device;
determining that a warning is to be output when the type of an object detected at the start and at the end of a first period is a person, and determining that a warning is to be output when an action by a person detected based on the image has continued for a second period or longer; and
outputting a warning based on the determination results.
- A non-transitory computer-readable medium storing a program for causing an information processing device to execute:
a process of acquiring information detected based on an image captured by an imaging device;
a first determination process of determining that a warning is to be output when the type of an object detected at the start and at the end of a first period is a person, and a second determination process of determining that a warning is to be output when an action by a person detected based on the image has continued for a second period or longer; and
a process of outputting a warning based on the determination results.
- An information processing system including an imaging device that captures an image and an information processing device, wherein the information processing device comprises:
acquisition means for acquiring information detected based on the image captured by the imaging device;
determination means for performing a first determination process of determining that a warning is to be output when the type of an object detected at the start and at the end of a first period is a person, and a second determination process of determining that a warning is to be output when an action by a person detected based on the image has continued for a second period or longer; and
output means for outputting a warning based on a determination result of the determination means.
- The information processing system according to claim 11, wherein the determination means executes the first determination process when the action of the person detected based on the image is a first action, and executes the second determination process when the action of the person detected based on the image is a second action.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/270,990 US20240054815A1 (en) | 2021-03-31 | 2022-02-10 | Information processing apparatus, information processing method, computer-readable medium, and information processing system |
JP2023510605A JP7485204B2 (ja) | 2021-03-31 | 2022-02-10 | 情報処理装置、情報処理方法、及びプログラム |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021059305 | 2021-03-31 | ||
JP2021-059305 | 2021-03-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022209347A1 true WO2022209347A1 (ja) | 2022-10-06 |
Family
ID=83455852
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/005425 WO2022209347A1 (ja) | 2021-03-31 | 2022-02-10 | 情報処理装置、情報処理方法、コンピュータ可読媒体、及び情報処理システム |
Country Status (2)
Country | Link |
---|---|
US (1) | US20240054815A1 (ja) |
WO (1) | WO2022209347A1 (ja) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002154375A (ja) * | 2000-11-20 | 2002-05-28 | Nissan Motor Co Ltd | 車両用視界補助装置 |
JP2009015536A (ja) * | 2007-07-03 | 2009-01-22 | Securion Co Ltd | 不審者通報装置、不審者監視装置及びこれを用いた遠隔監視システム |
JP2012053658A (ja) * | 2010-09-01 | 2012-03-15 | Mitsubishi Material C.M.I. Corp | 防犯システム |
JP2019071578A (ja) * | 2017-10-11 | 2019-05-09 | パナソニックIpマネジメント株式会社 | 物体検知装置、物体検知システムおよび物体検知方法 |
JP2019152943A (ja) * | 2018-03-01 | 2019-09-12 | オムロン株式会社 | 危険度検知装置、危険度検知方法、及び危険度検知プログラム |
JP2021504814A (ja) * | 2018-10-19 | 2021-02-15 | シャンハイ センスタイム インテリジェント テクノロジー カンパニー リミテッド | 乗客状態分析方法及び装置、車両、電子機器並びに記憶媒体 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2022209347A1 (ja) | 2022-10-06 |
US20240054815A1 (en) | 2024-02-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3754618B1 (en) | Recording control device, recording control system, recording control method, and recording control program | |
JP2001216519A (ja) | 交通監視装置 | |
JP2016085487A (ja) | 情報処理装置、情報処理方法及びコンピュータプログラム | |
US20180350179A1 (en) | Information processing apparatus, information processing method, and recording medium | |
JP2013066016A (ja) | 輪郭抽出システム、輪郭抽出装置及び輪郭抽出プログラム | |
JP2005323046A (ja) | 監視システム、および監視カメラ | |
KR101454644B1 (ko) | 보행자 추적기를 이용한 서성거림을 탐지하는 방법 | |
JP5693147B2 (ja) | 撮影妨害検知方法、妨害検知装置及び監視カメラシステム | |
US20100021008A1 (en) | System and Method for Face Tracking | |
US20220189038A1 (en) | Object tracking apparatus, control method, and program | |
US10755107B2 (en) | Information processing apparatus, information processing method, and recording medium | |
WO2022209347A1 (ja) | 情報処理装置、情報処理方法、コンピュータ可読媒体、及び情報処理システム | |
JP2008035096A (ja) | 監視装置、監視方法及びプログラム | |
US20200279378A1 (en) | Collision determination server, program, and recording medium | |
JP7485204B2 (ja) | 情報処理装置、情報処理方法、及びプログラム | |
JP2024009906A (ja) | 監視装置、監視方法、およびプログラム | |
JP3649277B2 (ja) | 画像認識による速度測定システム及び速度測定方法 | |
JP2020194227A (ja) | 顔遮蔽判定装置、顔遮蔽判定方法、顔遮蔽判定プログラム及び乗員監視システム | |
KR20210023859A (ko) | 화상 처리 장치, 이동 장치 및 방법, 그리고 프로그램 | |
US20240089415A1 (en) | Information processing apparatus, information processing method, computer-readable medium, and information processing system | |
WO2022168402A1 (ja) | 情報処理装置、情報処理方法およびコンピュータ可読媒体 | |
JP2008178088A (ja) | 監視システム、監視方法、及びプログラム | |
CN114788255B (zh) | 记录控制装置、记录控制方法和存储介质 | |
WO2021152985A1 (ja) | 記録制御装置、記録制御方法、及びプログラム | |
US8965171B2 (en) | Recording control apparatus, recording control method, storage medium storing recording control program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22779575 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18270990 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 2023510605 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 22779575 Country of ref document: EP Kind code of ref document: A1 |