WO2022201282A1 - Determination device, determination method, determination system, and non-transitory computer-readable medium storing a program - Google Patents
Determination device, determination method, determination system, and non-transitory computer-readable medium storing a program
- Publication number
- WO2022201282A1 (PCT/JP2021/011852)
- Authority
- WO
- WIPO (PCT)
Classifications
- G06T7/73 — Determining position or orientation of objects or cameras using feature-based methods
- H04N7/18 — Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- G06T2207/20081 — Training; Learning (indexing scheme for image analysis or enhancement)
- G06T2207/30196 — Human being; Person (subject of image)
Definitions
- the present invention relates to a determination device, a determination method, a determination system, and a non-transitory computer-readable medium storing a program.
- Patent Document 1 discloses a posture estimating device.
- Patent Document 2 discloses a heavy equipment proximity monitoring system that performs a monitoring operation.
- Patent Document 3 discloses a video processing device having display control means for displaying, on a display unit, a trajectory indicating changes in the position of an object in a video.
- a suitable technique for determining whether a specific part of a person is intruding into a predetermined area is desired. There is also a demand to grasp the action history of a person who is determined to have entered a predetermined area.
- the present disclosure has been made in view of such problems, and aims to provide a determination device, determination method, determination system, and program that suitably perform intrusion detection.
- a determination device has an image data acquisition unit, a joint point estimation unit, a characteristic element setting unit, an intrusion determination unit, and an output unit.
- the image data acquisition unit acquires image data of a predetermined space captured by the imaging device.
- the joint point estimation unit estimates the joint points of the person included in the image data.
- the characteristic element setting unit sets characteristic elements of the person based on the joint points.
- the intrusion determination unit determines whether or not the person has entered a preset reference area based on the feature element.
- the output unit outputs information about the determination result of the determination performed by the intrusion determination unit.
- in a determination method according to an embodiment of the present disclosure, a computer executes the following steps.
- the computer acquires image data of a predetermined space captured by the imaging device.
- the computer estimates joint points of the person included in the image data.
- the computer sets characteristic elements of the person based on the joint points.
- the computer determines whether or not the person has entered a preset reference area based on the characteristic elements.
- the computer outputs information about the result of the determination.
- a program causes a computer to execute the following steps.
- the computer acquires image data of a predetermined space captured by the imaging device.
- the computer estimates joint points of the person included in the image data.
- the computer sets characteristic elements of the person based on the joint points.
- the computer determines whether or not the person has entered a preset reference area based on the characteristic elements.
- the computer outputs information about the result of the determination.
- FIG. 1 is a block diagram showing the configuration of a determination device according to a first embodiment.
- FIG. 2 is a flow chart showing a determination method according to the first embodiment.
- FIG. 3 is a block diagram showing the configuration of a determination system according to a second embodiment.
- FIG. 4 is a block diagram showing the configuration of a determination device according to the second embodiment.
- FIG. 5 is a flow chart showing a determination method according to the second embodiment.
- FIG. 6 is a diagram showing a first example of an image processed by the determination device.
- FIG. 7 is a diagram showing a second example of an image processed by the determination device.
- FIG. 8 is a diagram showing a first example of intrusion determination.
- FIG. 9 is a diagram showing a second example of intrusion determination.
- FIG. 10 is a diagram showing a third example of intrusion determination.
- FIG. 11 is a diagram showing a fourth example of intrusion determination.
- FIG. 12 is a diagram showing a fifth example of intrusion determination.
- FIG. 13 is a diagram showing an example of an image processed by the determination device according to a third embodiment.
- FIG. 14 is a diagram showing an example of an image processed by a determination device according to a fourth embodiment.
- FIG. 15 is a block diagram illustrating the hardware configuration of a computer.
- FIG. 1 is a block diagram of the determination device 10 according to the first embodiment.
- a determination device 10 shown in FIG. 1 is used by being communicably connected to a camera (photographing device) installed in a predetermined facility or outdoors.
- the determination device 10 has an intrusion detection function that determines whether or not a person has entered a predetermined reference area set in image data, and outputs the determination result.
- the determination device has an image data acquisition unit 111, a joint point estimation unit 112, a characteristic element setting unit 113, an intrusion determination unit 114, and an output unit 115 as main components.
- the image data acquisition unit 111 acquires image data of a predetermined space captured by the imaging device.
- the number of cameras connected to the image data acquisition unit 111 may be one or plural.
- the camera to which the image data acquisition unit 111 is connected may be fixed in order to photograph a predetermined angle of view, or may be of a movable type capable of panning, tilting, or zooming.
- the image data acquisition unit 111 appropriately supplies the image data acquired from the camera to each component.
- the joint point estimation unit 112 receives image data from the image data acquisition unit 111 and estimates the joint points of the person included in the received image data. Specifically, the joint point estimation unit 112 first identifies an image of a person (person image) in the received image data: for example, it performs convolution processing on the image data to search for an area whose feature amount matches that of a person image, and identifies the matching area as the person image.
- the joint point estimation unit 112 estimates the joint points of the person from the person image. Joint points are, for example, wrists, elbows, shoulders, necks, hip joints, knees, and the like.
- the joint point estimating unit 112 may estimate the joint points from one human image, or may estimate the joint points from image data relating to a plurality of images taken at different times.
- the joint point estimation unit 112 supplies information about the estimated joint points to the characteristic element setting unit 113 .
- the feature element setting unit 113 uses the estimated joint points to set the feature elements of the person.
- a feature element of a person is a specific joint point or an element related to a joint point set for intrusion detection.
- the specified human image includes a plurality of joint points. Which joint point among these multiple joint points is set as a feature element may be set in advance or may be set individually. In the case of individual setting, the characteristic elements may be set by the user using the determination device 10, or may be automatically set according to predetermined conditions set in advance. After setting the characteristic elements, the characteristic element setting unit 113 supplies the intrusion determination unit 114 with information about the set characteristic elements.
- the intrusion determination unit 114 receives information about the feature elements from the feature element setting unit 113, and determines whether or not the person corresponding to the feature elements has entered the reference area.
- a reference area is a specific area for performing intrusion detection, and is an area set in advance in image data.
- the intrusion determination unit 114 determines whether or not the person corresponding to the characteristic element has entered the reference area, and then supplies information regarding the determination result to the output unit 115 . It should be noted that in the following description, the information regarding the determination result may be simply referred to as the determination result.
- the output unit 115 receives from the intrusion determination unit 114 information about the determination result of the determination performed by the intrusion determination unit 114, and outputs the received information.
- the determination result output by the output unit 115 is output in a manner that allows the user using the determination device to recognize the determination result. More specifically, for example, the determination result may be output by sound, light, or image.
- the determination result may also be transmitted to any other device communicably connected to the determination device 10 .
- FIG. 2 is a flow chart showing a determination method according to the first embodiment.
- the flowchart shown in FIG. 2 is started, for example, by activating the determination device 10 .
- the flow chart shown in FIG. 2 may be initiated by receiving image data from an imager.
- the image data acquisition unit 111 acquires image data of a predetermined space captured by the imaging device (step S11).
- the image data acquisition unit 111 supplies the acquired image data to at least the joint point estimation unit 112 .
- the joint point estimation unit 112 estimates the joint points of the person included in the received image data (step S12).
- the joint point estimation unit 112 supplies information on the estimated joint points to the feature element setting unit 113 .
- when the image data contains a plurality of person images, the joint point estimation unit 112 estimates the joint points for each person image. That is, the joint point estimation unit 112 generates information corresponding to each of the plurality of person images and supplies the information to the feature element setting unit 113.
- the feature element setting unit 113 uses the information received from the joint point estimation unit 112 to set feature elements from the joint points of the human image (step S13). After setting the characteristic elements, the characteristic element setting unit 113 supplies information about the characteristic elements to the intrusion determination unit 114 . Note that when a plurality of person images are specified in the image data, the feature element setting unit 113 sets feature elements corresponding to each person image.
- the intrusion determination unit 114 uses the set feature elements to determine whether or not the person corresponding to the feature elements has entered a preset reference area (step S14).
- the intrusion determination unit 114 may determine whether or not any of the human images intrudes into the reference area when a plurality of human images are specified in the image data. Further, when a plurality of person images are specified in the image data, the intrusion determination unit 114 may perform intrusion detection determination corresponding to each person image.
- the output unit 115 outputs information about the determination result (step S15).
- the determination device 10 ends the series of processes.
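As an illustrative sketch only (not the patented implementation), steps S11 to S15 above can be expressed as a small Python pipeline. The estimator is stubbed out, the reference area is assumed to be an axis-aligned rectangle, and all names and coordinates are hypothetical:

```python
# Hedged sketch of steps S11-S15; the estimator stub and rectangular
# reference area are assumptions for illustration only.

REFERENCE_AREA = (4.0, 2.0, 8.0, 5.0)  # (x_min, y_min, x_max, y_max) in image coords

def estimate_joint_points(image_data):
    """Stand-in for step S12: return per-person joint points {name: (x, y)}."""
    return image_data["persons"]  # assume joints were precomputed upstream

def set_feature_elements(joints):
    """Step S13: here the hand joints are chosen as the feature elements."""
    return [joints[k] for k in ("left_hand", "right_hand") if k in joints]

def inside(point, area):
    x, y = point
    x0, y0, x1, y1 = area
    return x0 <= x <= x1 and y0 <= y <= y1

def determine(image_data, area=REFERENCE_AREA):
    """Steps S12-S15: per person, True if any feature element is in the area."""
    return [any(inside(p, area) for p in set_feature_elements(j))
            for j in estimate_joint_points(image_data)]

frame = {"persons": [{"left_hand": (5.0, 3.0), "right_hand": (1.0, 1.0)},
                     {"left_hand": (0.5, 0.5)}]}
print(determine(frame))  # → [True, False]
```

The per-person list of results mirrors the later embodiments, where intrusion detection may be performed separately for each person image.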
- the determination device 10 has a processor and a storage device (not shown).
- the storage device of the determination device 10 includes, for example, non-volatile memory such as flash memory or an SSD (Solid State Drive).
- the storage device of the determination device 10 stores a computer program (hereinafter simply referred to as a program) for executing the image processing method described above.
- the processor also loads a computer program from a storage device into a buffer memory such as a DRAM (Dynamic Random Access Memory) and executes the program.
- Each configuration of the determination device 10 may be realized by dedicated hardware. Also, part or all of each component may be implemented by a general-purpose or dedicated circuit, processor, etc., or a combination thereof. These may be composed of a single chip, or may be composed of multiple chips connected via a bus. A part or all of each component of each device may be implemented by a combination of the above-described circuits and the like and programs. Moreover, CPU (Central Processing Unit), GPU (Graphics Processing Unit), FPGA (field-programmable gate array), etc. can be used as a processor. It should be noted that the configuration descriptions described herein may also be applied to other devices or systems described below in the present disclosure.
- the plurality of information processing devices, circuits, and the like may be arranged centrally or in a distributed manner.
- the information processing device, circuits, and the like may be implemented as a form in which each is connected via a communication network, such as a client-server system, a cloud computing system, or the like.
- the functions of the determination device 10 may be provided in a SaaS (Software as a Service) format.
- the determination device 10 identifies a person image from image data, estimates joint points, sets feature elements, and determines whether the person has entered the reference area. Therefore, according to Embodiment 1, it is possible to provide a determination device, a determination method, and a program for suitably performing intrusion detection.
- FIG. 3 is a block diagram showing the configuration of a determination system according to a second embodiment;
- a determination system 1 shown in FIG. 3 includes a determination device 20 and a camera 300 .
- the determination device 20 and the camera 300 are communicably connected via a network N1.
- the camera 300 is installed in the space 900, captures the scenery of the space 900, generates image data, and supplies the generated image data to the determination device 20 via the network N1.
- a person P1 and a person P2 may exist in a space 900 photographed by the camera 300 .
- a predetermined reference area is set in advance in this space 900 .
- the determination device 20 connected to the camera 300 determines whether or not the person P1 or the person P2 has entered the reference area.
- FIG. 4 is a block diagram showing the configuration of the determination device 20 according to the second embodiment.
- a determination device 20 shown in FIG. 4 differs from the determination device 10 according to the first embodiment in that it has a display 116 and a storage unit 120 .
- the display 116 is a display device such as an organic electroluminescence (EL) panel or a liquid crystal panel.
- the display 116 receives the determination result as image data from the output unit 115 and displays the received determination result.
- the display regarding the determination result may include an image captured by the camera 300, for example. Further, the display regarding the determination result may be an image captured by the camera 300 on which the joint points, characteristic elements, and reference regions are superimposed.
- the storage unit 120 is a storage device including non-volatile memory such as flash memory, SSD (Solid State Drive) or EPROM (Erasable Programmable Read Only Memory).
- the storage unit 120 stores information about reference regions, for example.
- the storage unit 120 supplies the stored information on the reference area to the intrusion determination unit 114 .
- the output unit 115 in this embodiment outputs information including an alert as a determination result when it is determined that the characteristic element related to the specified person has entered the reference area. Output unit 115 provides such alerts to display 116 .
- FIG. 5 is a flow chart showing a determination method according to the second embodiment.
- the flowchart shown in FIG. 5 differs from the flowchart shown in FIG. 2 in the process after step S13.
- the intrusion determination unit 114 determines whether or not the set feature element intrudes into the reference area (step S21). If it is not determined that such a feature element intrudes into the reference area (step S21: NO), the determination device 20 proceeds to step S23. On the other hand, if it is determined that the characteristic element intrudes into the reference area (step S21: YES), the determination device 20 proceeds to step S22.
- step S22 the output unit 115 outputs an alert according to the determination result (step S22).
- the output unit 115 supplies a signal for displaying on the display 116 that an intrusion has been detected.
- the output unit 115 may output the alert continuously for a predetermined period.
- the determination device 20 proceeds to step S23.
- step S23 the determination device 20 determines whether or not to end the series of processes (step S23).
- the case of ending the series of processes is, for example, the case of stopping the determination device 20 by the user's operation, or the case of stopping the supply of image data from the camera 300 .
- step S23: YES the determination device 20 ends the process.
- step S23: NO the determination device 20 returns to step S11 and continues the process.
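The loop of FIG. 5 (acquire, estimate, determine, alert, then repeat until the frame supply stops) might be sketched as follows. Here `detect_feature_elements` stands in for the joint point estimation and feature element setting units, and the rectangular area is an assumption of this sketch:

```python
# Hedged sketch of the second-embodiment loop (steps S11-S13, S21-S23).

def inside(point, area):
    x, y = point
    x0, y0, x1, y1 = area
    return x0 <= x <= x1 and y0 <= y <= y1

def run(frames, area, detect_feature_elements, alert):
    """Process frames until exhausted; returns the number of alerts raised."""
    alerts = 0
    for frame in frames:                                 # S11: acquire image data
        for feats in detect_feature_elements(frame):     # S12-S13, per person
            if any(inside(p, area) for p in feats):      # S21: intrusion?
                alert("intrusion detected")              # S22: output alert
                alerts += 1
    return alerts                                        # S23: frame supply ended

msgs = []
n = run([{"t": 0}, {"t": 1}], (0, 0, 10, 10),
        lambda f: [[(5, 5)], [(20, 20)]], msgs.append)
print(n)  # → 2
```

In a deployment the alert callback could instead drive the display 116, matching the output unit's behavior in this embodiment.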
- FIG. 6 is a diagram showing a first example of an image processed by the determination device.
- An image 201 is shown in FIG.
- Image 201 is an example of an image displayed on display 116 .
- the image 201 is a landscape of the space 900 photographed by the camera 300, and includes a first person image 210 of the person P1, a second person image 220 of the person P2, and a predetermined object image 230.
- a reference area 240 is set in the object image 230 .
- the reference area 240 is indicated by a thick two-dot chain line trapezoid.
- the reference area 240 is drawn as a trapezoid because it is defined as a rectangle parallel to the horizontal plane of the space 900, which appears distorted by perspective in the image. In this way, the reference area 240 is set so as to correspond, in the image, to a predetermined height and predetermined shape in the space 900.
- the determination device 20 determines whether or not a person has entered the reference area 240 .
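Because the reference area 240 appears in the image as a trapezoid, the intrusion test reduces to a point-in-polygon check. Below is a minimal ray-casting sketch; the vertex coordinates are made up for illustration and are not taken from the patent:

```python
def point_in_polygon(pt, poly):
    """Ray-casting test; poly is a list of (x, y) vertices, e.g. a trapezoid."""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x0, y0 = poly[i]
        x1, y1 = poly[(i + 1) % n]
        if (y0 > y) != (y1 > y):                      # edge crosses the scan line
            x_cross = x0 + (y - y0) * (x1 - x0) / (y1 - y0)
            if x < x_cross:                           # crossing is to the right
                inside = not inside
    return inside

trapezoid = [(2, 1), (8, 1), (7, 4), (3, 4)]          # illustrative reference area
print(point_in_polygon((5, 2), trapezoid))  # → True
print(point_in_polygon((1, 2), trapezoid))  # → False
```

Ray casting handles any simple polygon, so the same test works whether the projected reference area is a trapezoid or a more general quadrilateral.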
- FIG. 7 is a diagram showing a second example of an image processed by the determination device.
- the joint point estimating unit 112 analyzes, for example, the image 201 by convolution processing to determine whether a human image of a predetermined size exists.
- the origin of the image 201 is, for example, the upper left corner of the image, the horizontal direction from left to right is the X axis, and the vertical direction from top to bottom is the Y axis.
- the joint point estimating unit 112 performs image analysis in the plus X direction from the origin; when processing reaches the right end, it advances in the plus Y direction and analyzes again in the plus X direction from the left end.
- the joint point estimation unit 112 can change the size of the rectangle used for analysis according to the depth of the image 201. That is, it identifies person images with a relatively small rectangle toward the upper side of the image 201 (the back of the space) and a relatively large rectangle toward the lower side (the front of the space). Through such processing, the joint point estimation unit 112 can efficiently identify person images.
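The depth-dependent raster scan described above can be sketched as a generator of (x, y, size) windows, where the window size grows linearly from the top of the image (back of the space) to the bottom (front). The size range and stride below are arbitrary assumptions, not values from the patent:

```python
def scan_windows(width, height, min_size=40, max_size=120, stride=20):
    """Yield (x, y, size): small windows near the top, large near the bottom."""
    y = 0
    while y < height:
        # linearly interpolate the analysis-rectangle size with vertical position
        size = int(min_size + (max_size - min_size) * y / max(height - 1, 1))
        x = 0
        while x + size <= width:                  # scan in the plus X direction
            yield (x, y, size)
            x += stride
        y += stride                               # then advance in the plus Y direction

wins = list(scan_windows(200, 100))
print(wins[0])   # smallest window at the top-left: (0, 0, 40)
```

Each window would then be scored against the person-image feature amount, as described for rectangles F10 and F20.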
- FIG. 7 shows a rectangle F10 and a rectangle F20.
- Rectangle F10 is used when identifying first person image 210 .
- the joint point estimation unit 112 identifies the first person image 210 by calculating the feature amount of the first person image 210 included in the rectangle F10. Similarly, the joint point estimation unit 112 identifies the second person image 220 by calculating the feature amount of the rectangle F20.
- the joint point estimation unit 112 estimates the joint points of the specified human image.
- the joint point estimating unit 112 estimates joint points related to the human image from the specified feature amount of the human image.
- the joint point estimation unit 112 may have a learning model learned by machine learning, for example, in order to estimate the posture of a person in a person image. That is, in this case, for example, the joint point estimation unit 112 estimates the posture of the person included in the rectangle F10 shown in FIG. 7 using the learning model, and estimates the joint points from the estimated posture of the person.
- a plurality of joint points 211 are superimposed on the first person image 210 .
- a plurality of connection lines 212 connecting joint points are also superimposed on the first person image 210 .
- a plurality of joint points 221 are superimposed on the second person image 220 .
- a plurality of connection lines 222 connecting joint points are also superimposed on the second person image 220 .
- FIG. 8 is a diagram showing a first example of intrusion determination.
- FIG. 8 shows the reference area 240, the first person image 210, the joint points 211 of the first person image 210, and the connecting lines 212 that connect the joint points 211 extracted from FIG.
- characteristic elements 213 are indicated by black rectangles at the positions of the joint points 211 corresponding to the left and right hand portions.
- the example of FIG. 8 shows a state in which the characteristic element setting unit 113 sets the point located at the end (that is, the end point) of the joint points 211 of the first person image 210 as the characteristic element 213 .
- the feature element setting unit 113 sets the joint points of the hands, which are the end points of the joint points, as the feature elements.
- the feature element setting unit 113 may set the person's head or the tip of the foot as the end point of the joint point instead of the joint point of the hand.
- one of the two set feature elements 213 intrudes inside the reference area 240 .
- the intrusion determination unit 114 determines that the feature element 213 is intruding inside the reference area 240 .
- in this way, one of the joint points is set as a feature element, and when the set feature element intrudes into the reference area 240, the intrusion determination unit 114 determines that the person P1 in the first person image 210 is intruding into the reference area 240.
- the determination device 20 according to the present embodiment can suitably detect that the tip of the body enters the reference region.
- the feature element setting unit 113 here extracts the connection lines 212, each connecting two joint points 211, from the image data corresponding to the person.
- a joint point 211 connected to only one connection line 212 is recognized as an end point and set as a feature element 213.
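One way to recognize such end points is by connectivity: a joint that appears in exactly one connection line is an end of the skeleton. The edge list below is a simplified, assumed skeleton for illustration, not the patent's joint model:

```python
from collections import Counter

# Connection lines 212 as pairs of joint names (assumed, simplified skeleton).
EDGES = [("neck", "l_shoulder"), ("l_shoulder", "l_elbow"), ("l_elbow", "l_hand"),
         ("neck", "r_shoulder"), ("r_shoulder", "r_elbow"), ("r_elbow", "r_hand")]

def end_points(edges):
    """Joints with degree 1, i.e. touched by exactly one connection line."""
    degree = Counter(j for e in edges for j in e)
    return sorted(j for j, d in degree.items() if d == 1)

print(end_points(EDGES))  # → ['l_hand', 'r_hand']
```

With a fuller skeleton the same rule would also surface the head and the toes as end points, matching the alternatives mentioned above.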
- FIG. 9 is a diagram showing a second example of intrusion determination.
- the characteristic element setting unit 113 according to the example of FIG. 9 sets a plurality of adjacent joint points including the end points described above as characteristic elements.
- the characteristic element setting unit 113 may set a plurality of adjacent joint points corresponding to arms or legs of a person as characteristic elements.
- the intrusion determination unit 114 determines whether or not all of the adjacent feature elements intrude into the reference region.
- the feature element setting unit 113 sets two adjacent joint points, the hand and the elbow, as feature elements 214.
- the feature element 214 corresponding to one of the two arms exists inside the reference area 240 .
- the intrusion determination unit 114 determines that the first person image 210 has entered the reference area 240 .
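The rule of this example, that every adjacent feature element of the limb must be inside the region, can be sketched as follows; the axis-aligned rectangle and the coordinates are assumptions for illustration:

```python
def limb_intrudes(limb_points, area):
    """True only if every joint of the limb (e.g. hand and elbow) is in the area."""
    x0, y0, x1, y1 = area
    return all(x0 <= x <= x1 and y0 <= y <= y1 for x, y in limb_points)

area = (4, 2, 8, 5)                                # illustrative reference area
print(limb_intrudes([(5, 3), (6, 4)], area))       # hand and elbow inside → True
print(limb_intrudes([(5, 3), (9, 4)], area))       # elbow outside → False
```

Requiring all joints of the limb, rather than any single one, makes the determination stricter than the end-point rule of FIG. 8.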
- FIG. 10 is a diagram showing a third example of intrusion determination.
- the feature element setting unit 113 according to the example of FIG. 10 identifies a circumscribing rectangle that contacts the outside of the person, and sets joint points that contact this circumscribing rectangle as feature elements.
- a circumscribing rectangle F11 is set around the joint point 211 on the first person image 210 .
- the characteristic element setting unit 113 sets, as characteristic elements 215, joint points that contact the circumference of the circumscribed rectangle F11.
- the intrusion determination unit 114 determines that the first person image 210 has entered the reference area 240 .
- the determination device 20 can set the joint points located on the outer edge of the human image as the feature elements.
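Selecting the joints on the outer edge can be done by computing the circumscribed rectangle of all joint points and keeping the joints that touch its sides. The joint names and coordinates below are illustrative assumptions:

```python
def boundary_joints(joints, eps=1e-9):
    """Joints lying on the circumscribed rectangle of all joint points."""
    xs = [x for x, _ in joints.values()]
    ys = [y for _, y in joints.values()]
    x0, x1, y0, y1 = min(xs), max(xs), min(ys), max(ys)
    return sorted(name for name, (x, y) in joints.items()
                  if abs(x - x0) < eps or abs(x - x1) < eps
                  or abs(y - y0) < eps or abs(y - y1) < eps)

joints = {"head": (3.0, 0.0), "l_hand": (0.0, 2.0), "r_hand": (6.0, 2.5),
          "l_foot": (2.0, 6.0), "r_foot": (4.0, 6.0), "hip": (3.0, 3.5)}
print(boundary_joints(joints))  # interior joints like the hip are excluded
```

As in FIG. 11, a bent elbow protruding sideways would touch the rectangle and therefore be selected, while interior joints never are.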
- FIG. 11 is a diagram showing a fourth example of intrusion determination.
- the feature element setting unit 113 according to the example of FIG. 11 also sets, as feature elements, the joint points that contact the circumscribing rectangle that contacts the outside of the person, as in the example of FIG. 10 .
- a circumscribing rectangle F12 is set around the joint point 211 on the first person image 210 .
- in the first person image 210 shown in FIG. 11, the arm on the left side of the drawing is bent at the elbow and protrudes outward, so the joint point of the elbow contacts the circumscribed rectangle F12. The feature element setting unit 113 therefore sets the joint point in contact with the left side of the circumscribing rectangle F12 as the feature element 215. In FIG. 11, this feature element 215 exists inside the reference area 240, so the intrusion determination unit 114 determines that the first person image 210 has entered the reference area 240.
- FIG. 12 is a diagram showing a fifth example of intrusion determination.
- the feature element setting unit 113 sets a connection line connecting two joint points as the feature element based on image data corresponding to a person.
- the intrusion determination unit 114 determines whether or not at least part of the connection line set as the feature element intrudes into the reference area.
- the characteristic element setting unit 113 sets the connection line connecting the hand and the elbow as the characteristic element 216.
- the feature element 216 is indicated by a bold white line.
- a portion of feature element 216 overlaps reference region 240 . Therefore, the intrusion determination unit 114 determines that the first person image 210 has entered the reference area 240 .
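Determining whether a connection line partially overlaps the reference area is a segment-rectangle intersection test. A Liang-Barsky clipping sketch is shown below; the axis-aligned rectangle is an assumption of this sketch (a projected trapezoid would need a polygon clip instead):

```python
def segment_intersects_rect(p, q, rect):
    """True if any point of segment p-q lies in rect = (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = rect
    px, py = p
    dx, dy = q[0] - px, q[1] - py
    t0, t1 = 0.0, 1.0
    # clip the parameter interval [t0, t1] against each of the four sides
    for pi, qi in ((-dx, px - x0), (dx, x1 - px), (-dy, py - y0), (dy, y1 - py)):
        if pi == 0:
            if qi < 0:              # parallel and fully outside this side
                return False
        else:
            t = qi / pi
            if pi < 0:
                t0 = max(t0, t)
            else:
                t1 = min(t1, t)
            if t0 > t1:
                return False
    return True

rect = (4, 2, 8, 5)                                   # illustrative reference area
print(segment_intersects_rect((3, 3), (6, 4), rect))  # → True
print(segment_intersects_rect((0, 0), (1, 1), rect))  # → False
```

This catches the case where both end joints are outside the area but the connecting line passes through it, which the point-based rules would miss.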
- the mode of intrusion detection in the second embodiment is not limited to the above example.
- the characteristic element setting unit 113 may set three or more adjacent joint points as characteristic elements; for example, it may set all the joint points 211 on the first person image 210 as feature elements.
- the characteristic element setting unit 113 can set various characteristic elements using the joint points of the person image. According to Embodiment 2, it is possible to provide a determination device, a determination method, a determination system, and a program for suitably determining intrusion detection.
- the determination device according to the third embodiment differs from the determination device described above in the method of intrusion detection. More specifically, the determination device according to the third embodiment differs from the determination device 20 according to the second embodiment in the processing performed by the characteristic element setting unit 113 and the intrusion determination unit 114 .
- the feature element setting unit 113 in the present embodiment sets a first feature element and a second feature element from the plurality of joint points of the person. The intrusion determination unit 114 determines whether the first feature element has intruded into a preset first reference area, and also determines whether the second feature element has intruded into a preset second reference area. The output unit 115 outputs the determination result when the first feature element has entered the first reference area and the second feature element has entered the second reference area.
- FIG. 13 is a diagram showing an example of an image processed by the determination device 20 according to the third embodiment.
- the feature element setting unit 113 sets the joint points of the hands of the first person image 210 as the first feature elements 217 .
- the characteristic element setting unit 113 also sets the central portion (white circle) of the line segment connecting the joint points of both feet of the first person image 210 as the second characteristic element 218 .
- the feature element setting unit 113 sets the joint points of the hands of the second person image 220 as the first feature elements 227 .
- the characteristic element setting unit 113 sets the central portion (white circle) of the line segment connecting the joint points of both legs of the second person image 220 as the second characteristic element 228 .
- a first reference area 241 and a second reference area 242 are set.
- the first reference area 241 is a predetermined area including the object image 230, like the reference area 240 shown in FIG.
- the second reference area 242 is set on the floor on the right side of the reference area 240 .
- a first characteristic element 217 of the first person image 210 and a first characteristic element 227 of the second person image 220 are present in the first reference area 241 . Also, the second feature element 218 of the first person image 210 exists in the second reference area 242 . A second feature element 228 of the second person image 220 exists outside the second reference area 242 .
- The intrusion determination unit 114 determines that the first person image 210, in which the first feature element 217 exists in the first reference area 241 and the second feature element 218 exists in the second reference area 242, has entered the first reference area 241.
- For the second person image 220, although the first feature element 227 exists in the first reference area 241, the second feature element 228 does not exist in the second reference area 242. The intrusion determination unit 114 therefore does not determine that the second person image 220 has entered the first reference area 241.
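The two-condition logic of this embodiment, which reports intrusion only when the hand-side element is in the first reference area AND the foot-side element is in the second reference area, could be sketched as follows (hypothetical names; rectangular areas assumed for simplicity):

```python
def inside(p, rect):
    """Containment test for an axis-aligned rectangle (x_min, y_min, x_max, y_max)."""
    x, y = p
    x_min, y_min, x_max, y_max = rect
    return x_min <= x <= x_max and y_min <= y <= y_max

def dual_region_intrusion(hand_point, foot_point, area1, area2):
    # Intrusion is reported only when BOTH conditions hold:
    # the hand-side element is in area 1 AND the foot-side element is in area 2.
    # A person with hands over area 1 but feet outside area 2 (like the
    # second person image in the figure) is not reported.
    return inside(hand_point, area1) and inside(foot_point, area2)
```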
- the determination device 20 is not limited to the functions or configurations described above.
- The determination device 20 may determine that the first person image 210 is intruding when the period during which the second feature element 218 exists in the second reference area 242 exceeds a preset period (e.g., 3 seconds, 10 seconds, or 15 seconds).
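A dwell-time condition of this kind can be sketched with a small stateful helper. The class name and interface are hypothetical; only the idea (report intrusion after the element stays inside for a threshold period) comes from the text above.

```python
class DwellTimer:
    """Reports intrusion only after the element has stayed continuously
    inside the area for at least `threshold_s` seconds (e.g. 3, 10, 15)."""

    def __init__(self, threshold_s: float):
        self.threshold_s = threshold_s
        self.entered_at = None  # timestamp when the element last entered

    def update(self, is_inside: bool, t: float) -> bool:
        if not is_inside:
            # Leaving the area resets the dwell measurement.
            self.entered_at = None
            return False
        if self.entered_at is None:
            self.entered_at = t
        return (t - self.entered_at) >= self.threshold_s
```

Called once per analyzed frame with the containment result and a monotonic timestamp, this returns True only after the configured dwell period has elapsed.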
- The second feature element 218 may be the joint points of both feet themselves, or may be set at either one of the two feet instead of the midpoint between the feet described above.
- the height of the second reference area 242 may correspond to the waist height of the person image instead of the floor surface, or may correspond to the head position. In this case, the second characteristic element can be set at a portion corresponding to the height of the set second reference area.
- The output unit 115 may output the determination result when, for example, the first feature element 217 enters the first reference area 241 after the second feature element 218 enters the second reference area 242.
- the determination device 20 can suitably determine intrusion detection while tracking the order of actions of a person.
- According to Embodiment 3, it is possible to provide a determination device, a determination method, a determination system, and a program for suitably determining intrusion detection.
- FIG. 14 is a diagram showing an example of an image processed by the determination device 20 according to the fourth embodiment.
- the determination device 20 sets a third reference area 243 in addition to the first reference area 241 and the second reference area 242 .
- The third reference area 243 is set at floor height, like the second reference area 242, at a location several meters away from the second reference area 242.
- the determination device 20 tracks the trajectory of the person in the image 204 .
- The intrusion determination unit 114 determines whether or not the second feature element has passed through a preset third reference area. In this case, the output unit 115 outputs the determination result when the second feature element 218 enters the second reference area 242 after passing through the third reference area 243 and the first feature element 217 enters the first reference area 241.
- FIG. 14 shows the trajectory 219 of the second feature element 218 set in the first person image 210 .
- The trajectory 219 is a superimposed representation of the positions at which the second feature element 218 existed during the period from a predetermined time before the capture of the image 204 up to the time of capture.
- The second feature element 218 of the first person image 210 enters the second reference area 242 after passing through the third reference area 243.
- When the first feature element 217 set at the hands enters the first reference area 241 while the second feature element 218 at the feet is present in the second reference area 242, the intrusion determination unit 114 of the determination device 20 determines that the first person image 210 has entered the first reference area 241.
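The ordering condition of this embodiment could be sketched as a small state machine. This is a simplification under stated assumptions: it tracks only whether the foot-side element has ever passed through the third area, and all names are illustrative.

```python
class OrderedIntrusionDetector:
    """Sketch of the fourth-embodiment condition: the foot-side element
    must pass through area 3 and then be present in area 2; while it is
    there, the hand-side element entering area 1 triggers the result."""

    def __init__(self):
        self.passed_area3 = False  # has the trajectory crossed area 3?

    def update(self, foot_in_3: bool, foot_in_2: bool, hand_in_1: bool) -> bool:
        if foot_in_3:
            self.passed_area3 = True
        # The result fires only with the full ordered combination.
        return self.passed_area3 and foot_in_2 and hand_in_1
```

A person who steps directly into area 2 without ever crossing area 3 never satisfies the condition, which is how the trajectory requirement is enforced in this sketch.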
- the determination device 20 described above may have a reference area setting unit that sets a reference area in a predetermined space. Thereby, a desired reference area can be set.
- the reference area may be set in association with the position, size, shape, etc. of the object included in the image data. As a result, the determination device 20 can hold the predetermined reference area even when the camera zooms or pans, for example.
- According to Embodiment 4, it is possible to provide a determination device, a determination method, a determination system, and a program that suitably determine intrusion detection while grasping the movement trajectory of a person.
- Non-transitory computer-readable media include various types of tangible storage media.
- Examples of non-transitory computer-readable media include magnetic recording media (e.g., flexible disks, magnetic tapes, hard disk drives), magneto-optical recording media (e.g., magneto-optical disks), CD-ROM (Read Only Memory), CD-R, CD-R/W, and semiconductor memory (e.g., mask ROM, PROM (Programmable ROM), EPROM (Erasable PROM), flash ROM, RAM (Random Access Memory)).
- the program may also be delivered to the computer by various types of transitory computer readable media. Examples of transitory computer-readable media include electrical signals, optical signals, and electromagnetic waves. Transitory computer-readable media can deliver the program to the computer via wired channels, such as wires and optical fibers, or wireless channels.
- FIG. 15 is a block diagram illustrating the hardware configuration of a computer.
- the determination device can implement the functions described above by a computer 500 including the hardware configuration shown in the figure.
- the computer 500 may be a portable computer such as a smart phone or a tablet terminal, or may be a stationary computer such as a PC.
- Computer 500 may be a dedicated computer designed to implement each device, or may be a general-purpose computer.
- the computer 500 can implement desired functions by installing predetermined applications.
- Computer 500 has bus 502 , processor 504 , memory 506 , storage device 508 , input/output interface (I/F) 510 and network interface (I/F) 512 .
- the bus 502 is a data transmission path through which the processor 504, memory 506, storage device 508, input/output interface 510, and network interface 512 exchange data with each other.
- The method of interconnecting the processor 504 and the other components is not limited to a bus connection.
- The processor 504 is any of various processors such as a CPU, GPU, or FPGA.
- the memory 506 is a main memory implemented using a RAM (Random Access Memory) or the like.
- the storage device 508 is an auxiliary storage device realized using a hard disk, SSD, memory card, ROM (Read Only Memory), or the like.
- the storage device 508 stores programs for realizing desired functions.
- the processor 504 reads this program into the memory 506 and executes it, thereby realizing each functional component of each device.
- the input/output interface 510 is an interface for connecting the computer 500 and input/output devices.
- the input/output interface 510 is connected to an input device such as a keyboard and an output device such as a display device.
- a network interface 512 is an interface for connecting the computer 500 to a network.
- (Appendix 2) The characteristic element setting means sets the endpoint of the joint point to the characteristic element.
- the determination device according to appendix 1.
- the characteristic element setting means sets the person's head, fingertips, or toes as the end points of the joint points.
- the determination device according to appendix 2. (Appendix 4)
- the feature element setting means sets a plurality of adjacent joint points including the end points to the feature elements,
- the intrusion determination means determines whether or not all of the plurality of adjacent feature elements intrude into the reference area.
- the determination device according to appendix 2 or 3. (Appendix 5)
- the feature element setting means sets a plurality of adjacent joint points corresponding to the arms or legs of the person as the feature elements.
- the feature element setting means specifies a circumscribing rectangle that contacts the outside of the person, and sets the joint points that contact the circumscribing rectangle as the feature element.
- the determination device according to appendix 1.
- The feature element setting means sets connection lines each connecting two of the joint points based on the image data corresponding to the person, and sets the joint points connected by one of the connection lines as the feature element. The determination device according to Appendix 1.
- The feature element setting means sets a connection line connecting two of the joint points as the feature element based on the image data corresponding to the person, and the intrusion determination means determines whether at least a portion of the connection line set as the feature element intrudes into the reference area.
- the determination device according to appendix 1.
- the output means outputs the determination result when it is determined that the characteristic element has entered the reference area.
- the determination device according to any one of Appendices 1 to 8.
- the characteristic element setting means sets a first characteristic element and a second characteristic element based on the plurality of joint points of the person,
- the intrusion determination means determines whether the first feature element intrudes into a preset first reference area and whether the second feature element intrudes into a preset second reference area,
- the determination device according to any one of Appendices 1 to 9.
- the output means outputs the determination result when the first characteristic element intrudes into the first reference area and the second characteristic element intrudes into the second reference area.
- the determination device according to appendix 10.
- The intrusion determination means further determines whether the second feature element has passed through a preset third reference area, and the output means outputs the determination result when the second feature element enters the second reference area after passing through the third reference area and the first feature element enters the first reference area,
- the determination device according to appendix 11.
- The output means outputs the determination result when the first feature element intrudes into the first reference area after the second feature element intrudes into the second reference area.
- the determination device according to any one of Appendices 10 to 12.
- (Appendix 14) A determination system comprising: the determination device according to any one of Appendices 1 to 13; and a camera that supplies, to the determination device, the image data generated by photographing a predetermined space.
- (Appendix 16) A non-transitory computer-readable medium storing an information control program that causes a computer to execute a determination method comprising: acquiring image data of a predetermined space captured by an imaging device; estimating joint points of a person included in the image data; setting a feature element of the person based on the joint points; determining, based on the feature element, whether the person has entered a preset reference area; and outputting information about the determination result of the determination.
Abstract
Description
Embodiments of the present invention are described below with reference to the drawings. FIG. 1 is a block diagram of the determination device 10 according to the first embodiment. The determination device 10 shown in FIG. 1 is used while communicably connected to a camera (imaging device) installed inside a predetermined facility or outdoors. The determination device 10 has an intrusion detection function that determines whether a person has entered a predetermined reference area set in the image data and outputs the determination result. As its main components, the determination device has an image data acquisition unit 111, a joint point estimation unit 112, a feature element setting unit 113, an intrusion determination unit 114, and an output unit 115.
Next, the second embodiment is described. FIG. 3 is a block diagram showing the configuration of the determination system according to the second embodiment. The determination system 1 shown in FIG. 3 includes the determination device 20 and a camera 300. The determination device 20 and the camera 300 are communicably connected via a network N1.
Next, examples of intrusion determination based on the relationship between the feature elements set by the determination device 20 and the reference area are described. FIG. 8 is a diagram showing a first example of intrusion determination. FIG. 8 extracts from FIG. 7 the reference area 240, the first person image 210, and the joint points 211 of the first person image 210 together with the connection lines 212 connecting the joint points 211.
Next, the example shown in FIG. 9 is described. FIG. 9 is a diagram showing a second example of intrusion determination. In the example of FIG. 9, the feature element setting unit 113 sets each of a plurality of adjacent joint points, including the end point described above, as a feature element. The feature element setting unit 113 may set a plurality of adjacent joint points corresponding to the person's arm or leg as feature elements. In this case, the intrusion determination unit 114 determines whether all of the plurality of adjacent feature elements have entered the reference area.
Next, the example shown in FIG. 10 is described. FIG. 10 is a diagram showing a third example of intrusion determination. In the example of FIG. 10, the feature element setting unit 113 identifies a circumscribed rectangle contacting the outside of the person and sets the joint points that touch the circumscribed rectangle as feature elements.
Next, the example shown in FIG. 11 is described. FIG. 11 is a diagram showing a fourth example of intrusion determination. As in the example of FIG. 10, the feature element setting unit 113 in the example of FIG. 11 also sets the joint points touching the circumscribed rectangle around the person as feature elements.
Next, the example shown in FIG. 12 is described. FIG. 12 is a diagram showing a fifth example of intrusion determination. In the example shown in FIG. 12, the feature element setting unit 113 sets a connection line connecting two joint points as the feature element, based on the image data corresponding to the person. In this case, the intrusion determination unit 114 determines whether at least part of the connection line set as the feature element has entered the reference area.
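Determining whether part of a line segment between two joint points lies inside a rectangular reference area is a standard segment-clipping problem. The following sketch uses Liang-Barsky clipping; it is illustrative only, assumes an axis-aligned rectangle in image coordinates, and is not the implementation disclosed in this application.

```python
def segment_intersects_rect(a, b, rect):
    """True if any part of segment a-b lies inside the axis-aligned
    rectangle rect = (x_min, y_min, x_max, y_max). Liang-Barsky clipping."""
    x_min, y_min, x_max, y_max = rect
    dx, dy = b[0] - a[0], b[1] - a[1]
    t0, t1 = 0.0, 1.0  # parametric range of the segment still inside
    checks = [(-dx, a[0] - x_min), (dx, x_max - a[0]),
              (-dy, a[1] - y_min), (dy, y_max - a[1])]
    for p, q in checks:
        if p == 0:
            if q < 0:
                return False  # parallel to this boundary and outside it
        else:
            t = q / p
            if p < 0:
                t0 = max(t0, t)  # entering the half-plane
            else:
                t1 = min(t1, t)  # leaving the half-plane
            if t0 > t1:
                return False  # the clipped range is empty
    return True
```

A connection line whose endpoints are both outside the area can still trigger the determination if the segment crosses the area, which matches the "at least part of the connection line" condition of this example.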
Next, the third embodiment is described. The determination device according to the third embodiment differs from the determination devices described above in the intrusion detection method. More specifically, it differs from the determination device 20 according to the second embodiment in the processing performed by the feature element setting unit 113 and the intrusion determination unit 114.
Next, the fourth embodiment is described. The fourth embodiment differs from the determination device 20 according to the third embodiment in the processing related to the reference areas. FIG. 14 is a diagram showing an example of an image processed by the determination device 20 according to the fourth embodiment. In the image 204 shown in FIG. 14, the determination device 20 sets a third reference area 243 in addition to the first reference area 241 and the second reference area 242. The third reference area 243 is set at floor height, like the second reference area 242, at a location several meters away from the second reference area 242.
The following describes the case where each functional component of the determination device in the present disclosure is realized by a combination of hardware and software.
(Appendix 1)
A determination device comprising:
image data acquisition means for acquiring image data of a predetermined space captured by an imaging device;
joint point estimation means for estimating joint points of a person included in the image data;
feature element setting means for setting a feature element of the person based on the joint points;
intrusion determination means for determining, based on the feature element, whether the person has entered a preset reference area; and
output means for outputting information about a determination result of the determination made by the intrusion determination means.
(Appendix 2)
The determination device according to Appendix 1, wherein the feature element setting means sets an end point of the joint points as the feature element.
(Appendix 3)
The determination device according to Appendix 2, wherein the feature element setting means sets the person's head, fingertips, or toes as the end points of the joint points.
(Appendix 4)
The determination device according to Appendix 2 or 3, wherein the feature element setting means sets each of a plurality of adjacent joint points including the end point as a feature element, and the intrusion determination means determines whether all of the plurality of adjacent feature elements have entered the reference area.
(Appendix 5)
The determination device according to Appendix 1, wherein the feature element setting means sets a plurality of adjacent joint points corresponding to an arm or a leg of the person as the feature elements.
(Appendix 6)
The determination device according to Appendix 1, wherein the feature element setting means identifies a circumscribed rectangle contacting the outside of the person and sets the joint points contacting the circumscribed rectangle as the feature elements.
(Appendix 7)
The determination device according to Appendix 1, wherein the feature element setting means sets connection lines each connecting two of the joint points based on the image data corresponding to the person, and sets the joint points connected by one of the connection lines as the feature element.
(Appendix 8)
The determination device according to Appendix 1, wherein the feature element setting means sets a connection line connecting two of the joint points as the feature element based on the image data corresponding to the person, and the intrusion determination means determines whether at least part of the connection line set as the feature element has entered the reference area.
(Appendix 9)
The determination device according to any one of Appendices 1 to 8, wherein the output means outputs the determination result when the feature element is determined to have entered the reference area.
(Appendix 10)
The determination device according to any one of Appendices 1 to 9, wherein the feature element setting means sets a first feature element and a second feature element based on a plurality of the joint points of the person, and the intrusion determination means determines whether the first feature element has entered a preset first reference area and whether the second feature element has entered a preset second reference area.
(Appendix 11)
The determination device according to Appendix 10, wherein the output means outputs the determination result when the first feature element has entered the first reference area and the second feature element has entered the second reference area.
(Appendix 12)
The determination device according to Appendix 11, wherein the intrusion determination means further determines whether the second feature element has passed through a preset third reference area, and the output means outputs the determination result when the second feature element enters the second reference area after passing through the third reference area and the first feature element enters the first reference area.
(Appendix 13)
The determination device according to any one of Appendices 10 to 12, wherein the output means outputs the determination result when the first feature element enters the first reference area after the second feature element has entered the second reference area.
(Appendix 14)
A determination system comprising:
the determination device according to any one of Appendices 1 to 13; and
a camera that supplies, to the determination device, the image data generated by photographing a predetermined space.
(Appendix 15)
A determination method in which a computer:
acquires image data of a predetermined space captured by an imaging device;
estimates joint points of a person included in the image data;
sets a feature element of the person based on the joint points;
determines, based on the feature element, whether the person has entered a preset reference area; and
outputs information about a determination result of the determination.
(Appendix 16)
A non-transitory computer-readable medium storing an information control program that causes a computer to execute a determination method comprising:
acquiring image data of a predetermined space captured by an imaging device;
estimating joint points of a person included in the image data;
setting a feature element of the person based on the joint points;
determining, based on the feature element, whether the person has entered a preset reference area; and
outputting information about a determination result of the determination.
10 determination device
20 determination device
111 image data acquisition unit
112 joint point estimation unit
113 feature element setting unit
114 intrusion determination unit
115 output unit
116 display
117 speaker
120 storage unit
201 image
210 first person image
220 second person image
230 object image
240 reference area
300 camera
900 space
N1 network
Claims (16)
- A determination device comprising:
image data acquisition means for acquiring image data of a predetermined space captured by an imaging device;
joint point estimation means for estimating joint points of a person included in the image data;
feature element setting means for setting a feature element of the person based on the joint points;
intrusion determination means for determining, based on the feature element, whether the person has entered a preset reference area; and
output means for outputting information about a determination result of the determination made by the intrusion determination means.
- The determination device according to claim 1, wherein the feature element setting means sets an end point of the joint points as the feature element.
- The determination device according to claim 2, wherein the feature element setting means sets the person's head, fingertips, or toes as the end points of the joint points.
- The determination device according to claim 2 or 3, wherein the feature element setting means sets each of a plurality of adjacent joint points including the end point as a feature element, and the intrusion determination means determines whether all of the plurality of adjacent feature elements have entered the reference area.
- The determination device according to claim 1, wherein the feature element setting means sets a plurality of adjacent joint points corresponding to an arm or a leg of the person as the feature elements.
- The determination device according to claim 1, wherein the feature element setting means identifies a circumscribed rectangle contacting the outside of the person and sets the joint points contacting the circumscribed rectangle as the feature elements.
- The determination device according to claim 1, wherein the feature element setting means sets connection lines each connecting two of the joint points based on the image data corresponding to the person, and sets the joint points connected by one of the connection lines as the feature element.
- The determination device according to claim 1, wherein the feature element setting means sets a connection line connecting two of the joint points as the feature element based on the image data corresponding to the person, and the intrusion determination means determines whether at least part of the connection line set as the feature element has entered the reference area.
- The determination device according to any one of claims 1 to 8, wherein the output means outputs the determination result when the feature element is determined to have entered the reference area.
- The determination device according to any one of claims 1 to 9, wherein the feature element setting means sets a first feature element and a second feature element based on a plurality of the joint points of the person, and the intrusion determination means determines whether the first feature element has entered a preset first reference area and whether the second feature element has entered a preset second reference area.
- The determination device according to claim 10, wherein the output means outputs the determination result when the first feature element has entered the first reference area and the second feature element has entered the second reference area.
- The determination device according to claim 11, wherein the intrusion determination means further determines whether the second feature element has passed through a preset third reference area, and the output means outputs the determination result when the second feature element enters the second reference area after passing through the third reference area and the first feature element enters the first reference area.
- The determination device according to any one of claims 10 to 12, wherein the output means outputs the determination result when the first feature element enters the first reference area after the second feature element has entered the second reference area.
- A determination system comprising: the determination device according to any one of claims 1 to 13; and a camera that supplies, to the determination device, the image data generated by photographing a predetermined space.
- A determination method in which a computer: acquires image data of a predetermined space captured by an imaging device; estimates joint points of a person included in the image data; sets a feature element of the person based on the joint points; determines, based on the feature element, whether the person has entered a preset reference area; and outputs information about a determination result of the determination.
- A non-transitory computer-readable medium storing an information control program that causes a computer to execute a determination method comprising: acquiring image data of a predetermined space captured by an imaging device; estimating joint points of a person included in the image data; setting a feature element of the person based on the joint points; determining, based on the feature element, whether the person has entered a preset reference area; and outputting information about a determination result of the determination.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2023508199A JPWO2022201282A5 (ja) | 2021-03-23 | 判定装置、判定方法、及びプログラム | |
PCT/JP2021/011852 WO2022201282A1 (ja) | 2021-03-23 | 2021-03-23 | 判定装置、判定方法、判定システム及びプログラムが格納された非一時的なコンピュータ可読媒体 |
US18/267,612 US20240046508A1 (en) | 2021-03-23 | 2021-03-23 | Determination apparatus, determination method, determination system, and non-transitory computer-readable medium storing program |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2021/011852 WO2022201282A1 (ja) | 2021-03-23 | 2021-03-23 | 判定装置、判定方法、判定システム及びプログラムが格納された非一時的なコンピュータ可読媒体 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022201282A1 true WO2022201282A1 (ja) | 2022-09-29 |
Family
ID=83396416
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/011852 WO2022201282A1 (ja) | 2021-03-23 | 2021-03-23 | 判定装置、判定方法、判定システム及びプログラムが格納された非一時的なコンピュータ可読媒体 |
Country Status (2)
Country | Link |
---|---|
US (1) | US20240046508A1 (ja) |
WO (1) | WO2022201282A1 (ja) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006094100A (ja) * | 2004-09-24 | 2006-04-06 | Matsushita Electric Ind Co Ltd | Video switching device, image processing device, video recording device, image processing system, and video recording system |
JP2017069748A (ja) * | 2015-09-30 | 2017-04-06 | グローリー株式会社 | Surveillance camera system and monitoring method |
-
2021
- 2021-03-23 US US18/267,612 patent/US20240046508A1/en active Pending
- 2021-03-23 WO PCT/JP2021/011852 patent/WO2022201282A1/ja active Application Filing
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006094100A (ja) * | 2004-09-24 | 2006-04-06 | Matsushita Electric Ind Co Ltd | Video switching device, image processing device, video recording device, image processing system, and video recording system |
JP2017069748A (ja) * | 2015-09-30 | 2017-04-06 | グローリー株式会社 | Surveillance camera system and monitoring method |
Also Published As
Publication number | Publication date |
---|---|
JPWO2022201282A1 (ja) | 2022-09-29 |
US20240046508A1 (en) | 2024-02-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110210302B (zh) | 多目标跟踪方法、装置、计算机设备及存储介质 | |
US10445887B2 (en) | Tracking processing device and tracking processing system provided with same, and tracking processing method | |
US20170104915A1 (en) | Display control apparatus, display control method, and storage medium | |
CN110853076A (zh) | 一种目标跟踪方法、装置、设备及存储介质 | |
JP6939111B2 (ja) | 画像認識装置および画像認識方法 | |
US10750127B2 (en) | Monitoring system, monitoring method, and monitoring program | |
JP2015142181A (ja) | 制御装置、制御方法 | |
JP2008009849A (ja) | 人物追跡装置 | |
US20230351757A1 (en) | Information processing apparatus, control method, and program | |
US11120838B2 (en) | Information processing apparatus, control method, and program | |
CN110199316B (zh) | 相机和相机的图像处理方法 | |
JP6803525B2 (ja) | 顔検出装置およびこれを備えた顔検出システムならびに顔検出方法 | |
JP6593922B2 (ja) | 画像監視システム | |
JP2011053005A (ja) | 監視システム | |
WO2022201282A1 (ja) | 判定装置、判定方法、判定システム及びプログラムが格納された非一時的なコンピュータ可読媒体 | |
JP7214437B2 (ja) | 情報処理装置、情報処理方法及びプログラム | |
JP6939065B2 (ja) | 画像認識用コンピュータプログラム、画像認識装置及び画像認識方法 | |
JP2021026599A (ja) | 画像処理システム | |
US10489921B2 (en) | Behavior analysis apparatus and behavior analysis method | |
WO2019013105A1 (ja) | 見守り支援システム及びその制御方法 | |
WO2022130849A1 (ja) | 画像処理装置、画像処理方法および非一時的なコンピュータ可読媒体 | |
JP2021056899A (ja) | 画像処理装置、画像処理方法およびプログラム | |
JP2014142696A (ja) | 解析処理制御システム | |
WO2022239291A1 (ja) | 物体検知装置及び方法 | |
JP6954416B2 (ja) | 情報処理装置、情報処理方法、及びプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21932892 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18267612 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 2023508199 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 21932892 Country of ref document: EP Kind code of ref document: A1 |