WO2017163283A1 - Monitoring Device and Monitoring System - Google Patents
- Publication number
- WO2017163283A1 (PCT/JP2016/004149)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- person
- camera
- information
- monitoring
- purchase history
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/292—Multi-camera tracking
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q20/00—Payment architectures, schemes or protocols
- G06Q20/38—Payment protocols; Details thereof
- G06Q20/40—Authorisation, e.g. identification of payer or payee, verification of customer or shop credentials; Review and approval of payers, e.g. check credit lines or negative lists
- G06Q20/401—Transaction verification
- G06Q20/4014—Identity check for transactions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0201—Market modelling; Market analysis; Collecting market data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07G—REGISTERING THE RECEIPT OF CASH, VALUABLES, OR TOKENS
- G07G1/00—Cash registers
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07G—REGISTERING THE RECEIPT OF CASH, VALUABLES, OR TOKENS
- G07G1/00—Cash registers
- G07G1/12—Cash registers electronically operated
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19602—Image analysis to detect motion of the intruder, e.g. by frame subtraction
- G08B13/19608—Tracking movement of a target, e.g. by detecting an object predefined as a target, using target direction and or velocity to predict its new position
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19639—Details of the system layout
- G08B13/19645—Multiple cameras, each having view on one of a plurality of scenes, e.g. multiple cameras for multi-room surveillance or for tracking an object by view hand-over
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/91—Television signal processing therefor
- H04N5/92—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N5/9201—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving the multiplexing of an additional signal and the video signal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30232—Surveillance
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
Definitions
- The present disclosure relates to a monitoring device and a monitoring system for identifying a person captured by a camera and tracking the identified person.
- Patent Document 1 discloses a monitoring system including a plurality of monitoring cameras.
- Each surveillance camera extracts feature information of an object shown in its video and transmits the feature information to the other surveillance cameras.
- In this way, the plurality of monitoring cameras can cooperate to track and monitor an object having the same feature information.
- This disclosure provides a monitoring device and a monitoring system effective for accurately tracking an object.
- A monitoring device according to the present disclosure identifies a person from videos captured by a plurality of cameras installed in a predetermined area in which a plurality of products are arranged. The monitoring device includes: a receiving unit that receives the videos from the plurality of cameras and purchase history information, representing the products purchased by a person, from a POS register installed in the area; a storage unit that stores feature information representing the features of the person and imaging area information representing the imaging areas of the cameras; and a control unit that identifies the person from the videos based on the feature information. The control unit identifies, based on the purchase history information and the imaging area information, the camera that captured the imaging area through which the person passed.
- the monitoring device and the monitoring system according to the present disclosure are effective for accurately tracking an object.
- FIG. 1 is a block diagram showing a configuration of a monitoring system according to a first embodiment.
- FIG. 2(a) shows an example of the purchase history information table in the first embodiment, FIG. 2(b) shows an example of the display information table in the first embodiment, and FIG. 2(c) shows an example of the imaging area information table in the first embodiment.
- FIG. 3 shows an arrangement example of the monitoring cameras.
- FIG. 4 is a flowchart explaining the person identification operation in the first embodiment.
- FIG. 5(a) illustrates the feature extraction of a person, and FIG. 5(b) shows an example of the feature information table in the first embodiment.
- Embodiment 1 will be described with reference to the drawings.
- The first embodiment provides a monitoring system that can continue tracking an object (in this embodiment, a person) even when a situation occurs in which the object's features cannot be extracted from the videos of some of the plurality of monitoring cameras.
- FIG. 1 shows the configuration of the monitoring system of the first embodiment.
- The monitoring system 100 includes a plurality of monitoring cameras 1 (monitoring cameras a, b, and c), a plurality of POS (Point of Sale) registers 2, and a monitoring device 3 that identifies a person from the videos captured by the plurality of monitoring cameras 1 and tracks the identified person using the purchase history information acquired by the plurality of POS registers 2.
- Each surveillance camera 1 has a photographing unit 11 for photographing a video and a transmission unit 12 for transmitting a video photographed by the photographing unit 11.
- the photographing unit 11 can be realized by a CCD image sensor, a CMOS image sensor, an NMOS image sensor, or the like.
- the transmission unit 12 includes an interface circuit for performing communication with an external device in compliance with a predetermined communication standard (for example, LAN, WiFi).
- Each POS register 2 includes a purchase history information acquisition unit 21 that acquires purchase history information representing a product purchased by a person in the store, and a transmission unit 22 that transmits the acquired purchase history information.
- the purchase history information acquisition unit 21 includes a barcode reader using a scanner, a CCD, a laser, or the like.
- the transmission unit 22 includes an interface circuit for performing communication with an external device in compliance with a predetermined communication standard (for example, LAN, WiFi).
- The monitoring device 3 includes a receiving unit 31 that receives the videos from each monitoring camera 1 and the purchase history information from each POS register 2, a video storage unit 32a that stores the received videos, a purchase history information storage unit 32b that stores the received purchase history information in a purchase history information table T1, and a control unit 33 that identifies a person shown in the videos stored in the video storage unit 32a and tracks the identified person.
- the receiving unit 31 includes an interface circuit for performing communication with an external device in compliance with a predetermined communication standard (for example, LAN, WiFi).
- the control unit 33 can be realized by a semiconductor element or the like.
- the function of the control unit 33 may be configured only by hardware, or may be realized by combining hardware and software.
- The control unit 33 can be configured by, for example, a microcomputer, a CPU, an MPU, a DSP, an FPGA, or an ASIC.
- the control unit 33 includes a recognition unit 33a for identifying a person shown in the video stored in the video storage unit 32a.
- The recognition unit 33a extracts the features of a person shown in the videos stored in the video storage unit 32a, generates feature information representing those features, and generates shooting time information representing the time period during which the person having the extracted features appeared on each monitoring camera 1.
- The feature information and the shooting time information are the recognition information obtained by recognizing a person.
- the monitoring device 3 further includes a product shelf information storage unit 32c that stores a display information table T2 and a shooting area information table T3, and a recognition information storage unit 32d that stores a feature information table T4 and a shooting time information table T5.
- the display information table T2 and the shooting area information table T3 are stored in advance in the product shelf information storage unit 32c.
- the display information table T2 includes display information indicating which products are displayed on which product shelves in the store.
- the shooting area information table T3 includes shooting area information indicating areas in which the monitoring camera 1 can capture images.
- the display information includes information related to the product shelves on which the products are displayed, and the shooting area information includes information related to the product shelves photographed by the monitoring camera 1.
- the feature information table T4 includes person feature information generated by the recognition unit 33a.
- the shooting time information table T5 includes shooting time information generated by the recognition unit 33a.
- The control unit 33 further includes a recognition information correction unit 33b that corrects the shooting time information table T5 based on the purchase history information table T1, the display information table T2, the shooting area information table T3, and the feature information table T4. Based on the purchase history information table T1, the display information table T2, and the shooting area information table T3, the recognition information correction unit 33b identifies the monitoring camera 1 on which an identified person should appear, and then determines, by referring to the shooting time information table T5, whether shooting time information for that monitoring camera 1 exists (that is, whether the identified person actually appears in the video of that monitoring camera 1).
- When the recognition information correction unit 33b determines that the shooting time information for the identified monitoring camera 1 is missing (that is, the identified person does not appear in its video), it calculates (estimates), based on the shooting time information table T5, the time period during which the person should have appeared on that monitoring camera 1, extracts another person who appears on that monitoring camera 1 during the calculated (estimated) time period, determines that this person is the same person, and corrects the shooting time information table T5.
- The video storage unit 32a, the purchase history information storage unit 32b, the product shelf information storage unit 32c, and the recognition information storage unit 32d are storage units that can be realized by a DRAM, a ferroelectric memory, a flash memory, a magnetic disk, or the like.
- the monitoring device 3 further includes a display unit 34.
- the display unit 34 can display the video stored in the video storage unit 32a, the feature information table T4, and the shooting time information table T5.
- the display unit 34 can be realized by a liquid crystal display or the like.
- FIG. 2A shows an example of the purchase history information table T1 in the first embodiment.
- the purchase history information table T1 includes purchase history information 41 acquired by the purchase history information acquisition unit 21 of the POS register 2.
- the purchase history information 41 includes person identification information (ID) and product identification information (ID) purchased by the person.
- FIG. 2B shows an example of the display information table T2 in the first embodiment.
- the display information table T2 includes display information 42 including product identification information (ID) and product shelf identification information (ID) on which the product is displayed.
- The display information table T2 is stored in advance in the product shelf information storage unit 32c when, for example, the display location of each product is determined.
- FIG. 2C shows an example of the shooting area information table T3 in the first embodiment.
- the shooting area information table T3 is configured by shooting area information 43 including identification information (ID) of the monitoring camera 1 and the shooting area of the monitoring camera 1.
- The shooting area information table T3 is stored in advance in the product shelf information storage unit 32c when, for example, the monitoring cameras 1 are installed.
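As an illustration only, the three tables described above can be modeled as simple records. The field names and sample values below are hypothetical, loosely following the examples of FIG. 2:

```python
# Hypothetical sketch of tables T1-T3; field names and values are
# illustrative, not taken from the patent drawings.

# T1: purchase history information 41 (person ID, purchased product ID)
purchase_history = [
    {"person_id": "A", "product_id": "productA"},
]

# T2: display information 42 (product ID, shelf the product is displayed on)
display_info = [
    {"product_id": "productA", "shelf_id": "A01"},
]

# T3: shooting area information 43 (camera ID, shelves the camera captures)
shooting_area = [
    {"camera_id": "b", "shelf_ids": ["A01", "A02"]},
]
```

With this shape, the correction described later reduces to simple joins across the three tables.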
- FIG. 3 shows an arrangement example of the monitoring camera 1.
- the surveillance camera 1 (surveillance cameras a, b, c) is installed in a store.
- The monitoring camera a images the entrance, the monitoring camera b images the product shelves A01 and A02, and the monitoring camera c is disposed at a position where it images the POS register 2.
- The number and installation locations of the monitoring cameras 1 are merely examples and can be changed arbitrarily, but at least one of the monitoring cameras 1 is installed so that the POS register 2 is within its shooting range.
- Each of the monitoring cameras 1 transmits the video imaged by the imaging unit 11 from the transmission unit 12 to the monitoring device 3.
- the transmitted video is stored in the video storage unit 32a of the monitoring device 3.
- FIG. 4 shows a process of identifying a person by the recognition unit 33a.
- the recognition unit 33a performs the person identification process shown in FIG. 4 at a predetermined timing.
- the predetermined timing may be when the user instructs the monitoring device 3 or every predetermined time (for example, 24 hours).
- the recognition unit 33a reads the video stored in the video storage unit 32a and extracts the characteristics of the person shown in the video (S401). For example, the recognizing unit 33a analyzes the video sequentially from the video of the monitoring camera a installed on the entrance side or from the video of the monitoring camera c installed on the POS register 2 side. The recognition unit 33a extracts, for example, the shape, color, size, or position of a part of the face as the feature of the person.
- FIG. 5A shows an example of person feature extraction
- FIG. 5B shows an example of the feature information table T4.
- As the features of a person, the recognition unit 33a extracts, for example, the distance between the eyes (the "I-II" distance) and the distance between one eye and the nose (the "II-III" distance) as shown in FIG. 5(a), and adds feature information 44 including the extracted features (distances) to the feature information table T4 as shown in FIG. 5(b).
- the recognizing unit 33a determines whether or not the feature information 44 indicating the feature that matches the extracted feature already exists in the feature information table T4 (S402).
- If no matching feature information exists, the recognition unit 33a generates identification information (ID) for identifying the person, and adds feature information 44 including the generated identification information and the person's features (the "I-II" and "II-III" distances) to the feature information table T4 (S403).
- The video shot by each monitoring camera 1 includes information on its shooting time, and the purchase history information 41 acquired by the POS register 2 includes information on its recording time. Therefore, by comparing the shooting time of the monitoring camera c installed at the POS register 2 with the recording time of the purchase history information 41 acquired by that POS register 2, the person shown in the video of the monitoring camera c can be linked with the person who purchased the products indicated in the purchase history information 41.
- Using this link, the recognition unit 33a matches the person identification information (ID) in the purchase history information table T1 with the person identification information (ID) in the feature information table T4.
- Based on the feature information 44, the recognition unit 33a generates shooting time information indicating when a person appeared on which monitoring camera 1, and adds the shooting time information to the shooting time information table T5 (S404).
- FIG. 6 shows an example of the shooting time information table T5.
- The shooting time information 45 includes the identification information (ID) of the person, the identification information of the monitoring camera 1 that captured the person, the time when the person started to appear on the monitoring camera 1 (IN time), and the time when the person stopped appearing (OUT time).
- the identification information (ID) of the person in the photographing time information table T5 is the same as the identification information (ID) of the person in the feature information table T4.
- The recognition unit 33a determines whether the videos from all the monitoring cameras 1 have been read (S405), and if not, repeats steps S401 to S404 for the videos of the remaining monitoring cameras 1.
- In this way, persons can be identified, and by referring to the shooting time information table T5, it is possible to recognize when each identified person appeared on which monitoring camera 1. Accordingly, a person can be tracked using the videos of the plurality of monitoring cameras 1.
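The identification loop of steps S401 to S405 can be sketched as follows. The two facial distances follow FIG. 5, but the table shapes, the ID scheme, and the matching tolerance are assumptions introduced for illustration:

```python
# Sketch of the person identification loop (S401-S404): match extracted
# facial distances against the feature table T4 and record IN/OUT times
# in the shooting time table T5. The tolerance value is an assumption.

TOLERANCE = 2.0  # max difference (e.g. in pixels) to treat features as equal

feature_table = []   # T4: [{"person_id", "eye_eye", "eye_nose"}]
time_table = []      # T5: [{"person_id", "camera_id", "in", "out"}]

def match_person(eye_eye, eye_nose):
    """S402: return an existing person ID whose distances match, else None."""
    for entry in feature_table:
        if (abs(entry["eye_eye"] - eye_eye) <= TOLERANCE
                and abs(entry["eye_nose"] - eye_nose) <= TOLERANCE):
            return entry["person_id"]
    return None

def register_sighting(camera_id, eye_eye, eye_nose, t_in, t_out):
    person_id = match_person(eye_eye, eye_nose)
    if person_id is None:  # S403: new person, assign a fresh ID
        person_id = f"P{len(feature_table) + 1}"
        feature_table.append(
            {"person_id": person_id, "eye_eye": eye_eye, "eye_nose": eye_nose})
    # S404: add shooting time information 45 to T5
    time_table.append({"person_id": person_id, "camera_id": camera_id,
                       "in": t_in, "out": t_out})
    return person_id

# The same face seen on camera a and camera c is assigned a single ID:
p1 = register_sighting("a", 60.0, 41.0, "10:05", "10:11")
p2 = register_sighting("c", 60.5, 40.8, "10:36", "10:40")
```

When the distances differ by more than the tolerance (for example because of the camera angle), a second ID is created instead, which is exactly the gap that the correction unit 33b later repairs.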
- However, the features of a person sometimes cannot be extracted from the videos of some of the monitoring cameras 1 (for example, monitoring cameras a, b, and c), and the tracking of the person may be interrupted.
- In such a case, the same person is identified using the purchase history information table T1, the display information table T2, and the shooting area information table T3, and the shooting time information table T5 is corrected.
- FIG. 7 shows a process of correcting the photographing time information table T5 by the recognition information correcting unit 33b.
- The recognition information correction unit 33b reads the shooting time information table T5 and sorts its rows in descending order of the number of entries per person (S701).
- FIG. 8 shows the photographing time information table T5 after rearrangement.
- The recognition information correction unit 33b extracts persons one by one in descending order of the number of entries (S702) and, referring to the purchase history information table T1, checks whether any shooting time information 45 is missing for the extracted person (S703). Whether shooting time information 45 is missing is checked by referring to the purchase history information table T1 and identifying the monitoring camera 1 on which the person should appear.
- Specifically, in the example of FIG. 8, the recognition information correction unit 33b first detects, by referring to the purchase history information table T1, that the person A purchased the product A. Next, referring to the display information table T2, it detects that the product A is displayed on the product shelf A01. Then, referring to the shooting area information table T3, it detects that the product shelf A01 is captured by the monitoring camera b. That is, the monitoring camera b is identified as the monitoring camera 1 on which the person A should appear. Finally, the recognition information correction unit 33b refers to the shooting time information table T5 and checks whether the shooting time information 45 of the monitoring camera b exists for the person A.
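The chain of lookups just described (purchase history, then display shelf, then camera) can be sketched as follows; the dictionary shapes are assumptions:

```python
# Sketch of the camera lookup in step S703: from a purchased product, find
# the shelf it is displayed on (T2), then the camera that captures that
# shelf (T3). Table shapes are illustrative.

display_info = {"productA": "A01"}          # T2: product -> shelf
shooting_area = {"b": ["A01", "A02"],       # T3: camera -> shelves captured
                 "a": [], "c": []}

def expected_cameras(product_id):
    """Return the cameras on which the buyer of product_id should appear."""
    shelf = display_info[product_id]
    return [cam for cam, shelves in shooting_area.items() if shelf in shelves]

# Person A bought product A, so A should appear on monitoring camera b:
cams = expected_cameras("productA")
```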
- When shooting time information 45 is missing, the recognition information correction unit 33b refers to the shooting time information table T5 and estimates the time period during which the person should have appeared on the monitoring camera 1 whose shooting time information 45 is missing (S705). For example, as shown in FIG. 8, the shooting time information 45 of the monitoring camera b is missing for the person A.
- In this case, the recognition information correction unit 33b estimates that the person A was captured by the other monitoring camera (here, the monitoring camera b) during the time period between the OUT time (10:11) of the monitoring camera a installed at the entrance and the IN time (10:36) of the monitoring camera c installed at the POS register 2. That is, the time period between 10:11 and 10:36 is estimated (calculated) as the time period during which the person A could appear on the monitoring camera b.
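The estimation of the missing time period (S705) can be sketched as follows, assuming the T5 rows carry IN/OUT times as HH:MM strings; the row layout and helper name are hypothetical:

```python
# Sketch of step S705: estimate when person A should have appeared on
# camera b as the gap between the OUT time at the entrance camera (a) and
# the IN time at the POS-register camera (c). Row layout is illustrative.

time_table = [  # T5 rows for person A (the camera b entry is missing)
    {"person_id": "A", "camera_id": "a", "in": "10:05", "out": "10:11"},
    {"person_id": "A", "camera_id": "c", "in": "10:36", "out": "10:40"},
]

def estimate_window(person_id, entrance_cam="a", register_cam="c"):
    """Return (start, end) during which the person could be on camera b."""
    rows = {r["camera_id"]: r for r in time_table
            if r["person_id"] == person_id}
    return rows[entrance_cam]["out"], rows[register_cam]["in"]

window = estimate_window("A")
```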
- Next, the recognition information correction unit 33b extracts, from the shooting time information table T5, the persons shown during the estimated time period (S706).
- In the example of FIG. 8, the person B, who is shown on the monitoring camera b during the estimated time period (10:11 to 10:36), is extracted.
- At this time, the recognition information correction unit 33b may limit the extraction to persons with a small number of entries, for example, persons whose own shooting time information 45 is missing. If only one person is extracted, it is determined that this person (person B) is the same person as the person A, and the shooting time information table T5 is corrected.
- Specifically, the identification information of the person A is recorded in the "corrected person identification information (ID)" field for the person B.
- If two or more persons are extracted, the recognition information correction unit 33b determines, based on the feature information table T4, that the person whose feature information 44 is closest is the same person, and corrects the shooting time information table T5 (S707).
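Steps S706 and S707, extracting the candidates shown during the estimated time period and, when there are several, choosing the one with the closest feature information, can be sketched as follows; all structures here are assumptions:

```python
# Sketch of steps S706-S707: find persons shown on the target camera during
# the estimated window and merge the best candidate with the missing person.
# Plain string comparison works because all times share the HH:MM format.

time_table = [
    {"person_id": "B", "camera_id": "b", "in": "10:15", "out": "10:30",
     "corrected_id": None},
]
features = {"A": (60.0, 41.0), "B": (60.4, 40.9)}  # T4: (eye-eye, eye-nose)

def correct(missing_id, camera_id, window):
    start, end = window
    # S706: candidates shown on camera_id within the estimated window
    cands = [r for r in time_table
             if r["camera_id"] == camera_id and r["in"] >= start
             and r["out"] <= end and r["person_id"] != missing_id]
    if not cands:
        return None
    # S707: if several candidates, pick the one with the closest features
    fa = features[missing_id]
    best = min(cands, key=lambda r: abs(features[r["person_id"]][0] - fa[0])
               + abs(features[r["person_id"]][1] - fa[1]))
    best["corrected_id"] = missing_id  # record "corrected person ID"
    return best["person_id"]

merged = correct("A", "b", ("10:11", "10:36"))
```

Here person B is the only candidate in the window, so B is re-identified as A and the corrected ID is written back into the T5 row.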
- For all the persons recorded in the shooting time information table T5, it is determined whether the check for missing shooting time information 45 has been completed (S708). If not (No in S708), the process returns to step S702, and the next person is extracted from the shooting time information table T5 to check for missing shooting time information 45. When the check has been completed for all the persons recorded in the shooting time information table T5, the shooting time information table T5 is displayed on the display unit 34 (S709). By referring to the corrected shooting time information table T5 displayed on the display unit 34, the user can confirm the tracking of a person captured by the plurality of monitoring cameras 1.
- In this way, the recognition information correction unit 33b compensates for missing shooting time information 45 by using the purchase history information table T1, the display information table T2, the shooting area information table T3, and the feature information table T4.
- For example, as shown in FIG. 8, the recognition information correction unit 33b supplements the shooting time information 45 of the monitoring camera b for the person A by re-identifying the person B as the person A. Thereby, the control unit 33 can track the person A with high accuracy.
- In the present embodiment, the shooting area information 43 of the shooting area information table T3 includes the product shelf identification information (ID), but it may instead include the product identification information (ID).
- In that case, by matching the product identification information in the purchase history information table T1 against the product identification information in the shooting area information table T3, the monitoring camera 1 that captures the area containing a purchased product can be identified directly, without using the display information table T2.
- As described above, the monitoring device 3 in the present embodiment is a monitoring device that identifies a person from the videos captured by the plurality of monitoring cameras 1 installed in a predetermined area (store) where a plurality of products are arranged. The monitoring device 3 includes: a receiving unit 31 that receives the videos from the plurality of monitoring cameras 1 and the purchase history information 41, representing the products purchased by a person, from the POS register 2 installed in the predetermined area (store); a recognition information storage unit 32d that stores feature information 44 representing the features of the person; a product shelf information storage unit 32c that stores shooting area information 43 representing the shooting areas of the monitoring cameras 1; and a control unit 33 that identifies a person from the videos based on the feature information 44.
- The control unit 33 identifies, based on the purchase history information 41 and the shooting area information 43, the monitoring camera 1 that captured the shooting area through which the person passed.
- Since the monitoring camera 1 that captured the person is identified using the purchase history information 41 and the shooting area information 43, tracking of the person can be realized with high accuracy.
- The control unit 33 identifies, based on the shooting area information 43, the monitoring camera 1 that captured the position where a product included in the purchase history information 41 is placed, as the monitoring camera 1 that captured the shooting area through which the person passed. Thereby, the monitoring camera 1 that captured the person can be identified from the products the person purchased.
- The plurality of monitoring cameras 1 include a first camera (monitoring camera c) arranged at a predetermined position, and a second camera (monitoring camera b) identified by the control unit 33 as the monitoring camera 1 that captured the shooting area through which the person passed. A person who could not be identified from the video captured by the second camera (monitoring camera b) is identified in that video based on the time when the person was captured by the first camera (monitoring camera c). Specifically, the control unit 33 generates shooting time information 45 representing the times when the person identified based on the feature information 44 appeared on each monitoring camera 1.
- the control unit 33 specifies, based on the purchase history information 41 and the shooting area information 43, the monitoring camera 1 (monitoring camera b) in which the person should appear, and determines that the person does not appear in that camera's video. The control unit 33 then estimates, from the shooting time information 45 of the monitoring camera 1 (monitoring camera c) arranged at the predetermined position, the time at which the missing person should have appeared in the specified monitoring camera 1 (monitoring camera b). Referring to the shooting time information 45, the control unit 33 identifies another person recorded by the specified monitoring camera 1 during the estimated time as the missing person, and rewrites the shooting time information 45 accordingly.
- even when the shooting time information 45 is missing because the feature information 44 did not match, the missing shooting time information 45 can be supplemented by referring to the purchase history information 41 and the shooting area information 43. Therefore, even if the person feature information 44 acquired from the video of some monitoring cameras 1 cannot be acquired from the video of other monitoring cameras 1, and the person is recognized as a different person in those videos, the person can be re-recognized as the same person by referring to the purchase history information 41 and the shooting area information 43. The person can thus be tracked accurately.
- the control unit 33 calculates, based on the time at which the person was shot by the first camera (monitoring camera c), the time zone in which the person passed through the shooting area of the second camera (monitoring camera b). If two or more person candidates appear in the video taken by the second camera during the calculated time zone, one of the candidates is identified as the person based on the feature information 44. Thereby, even when two or more person candidates appear in the video, the person determined to be missing can be accurately identified from among the candidates.
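The candidate-selection step above can be illustrated with a simple sketch. The patent does not specify the matching criterion; a Euclidean distance between feature vectors is assumed here purely for illustration, and all names and values are hypothetical.

```python
def pick_candidate(target_features, candidates):
    """candidates: list of (person_id, feature_vector) seen in the
    estimated time zone. Return the id whose features are closest
    to the target person's features (feature information 44)."""
    def distance(v, w):
        # Euclidean distance between two feature vectors (assumed metric)
        return sum((a - b) ** 2 for a, b in zip(v, w)) ** 0.5
    return min(candidates, key=lambda c: distance(target_features, c[1]))[0]

# Person A's features from camera c, two candidates seen on camera b:
target = [0.9, 0.1, 0.4]
candidates = [("person_X", [0.2, 0.8, 0.7]), ("person_Y", [0.85, 0.15, 0.45])]
print(pick_candidate(target, candidates))  # person_Y
```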
- the control unit 33 extracts the features of a person from the video received by the receiving unit 31, generates the feature information 44, and stores it in the recognition information accumulating unit 32d. As a result, even a newly extracted person can be identified and tracked.
- the first camera (surveillance camera c) is installed at a position from which a person making a purchase at the POS register 2 can be photographed.
- since the POS register 2 records the time of the accounting process, the time zone in which the person passed through the shooting area of the second camera (monitoring camera b) can be restricted to a time zone before the accounting time.
- the monitoring system 100 includes: a plurality of surveillance cameras 1 installed in a predetermined area in which a plurality of products are arranged; a POS register 2 installed in the predetermined area that acquires purchase history information 41 representing products purchased by a person; and a monitoring device 3 that holds feature information 44 representing the features of a person and shooting area information 43 representing the shooting areas of the surveillance cameras 1, identifies a person from the video based on the feature information 44, and identifies, based on the purchase history information 41 and the shooting area information 43, the surveillance camera 1 that captured the shooting area through which the person passed.
- the monitoring system 100 is also useful for simulating changes in flow lines and for analyzing the value of store areas.
- the first embodiment has been described as an example of the technique disclosed in the present application.
- however, the technique of the present disclosure is not limited thereto, and is also applicable to embodiments in which changes, replacements, additions, omissions, and the like have been made as appropriate.
- time zone estimation S705
- person extraction S706
- in FIG. 8, the case where person A can be identified in the videos of monitoring cameras a and c, installed at the entrance and at the POS register 2 respectively (i.e., the shooting time information 45 exists), is taken as an example.
- the time zone between the OUT time of the monitoring camera c on the POS register 2 side and the IN time of the POS register 2 is calculated (estimated); however, it suffices if person A can be identified at least in the video of the monitoring camera c installed at the POS register 2.
- the monitoring camera a can be specified as the monitoring camera 1 in which the person should appear, and the presence or absence of the shooting time information 45 of the monitoring camera a can be confirmed.
- in a predetermined time zone before the IN time of the monitoring camera c on the POS register 2 side (for example, one hour before the IN time), a person may be extracted from the video of the surveillance camera 1 (surveillance cameras a and b) in which the shooting time information 45 is missing.
- the estimated time zone may be obtained by excluding such a zone.
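The time-zone estimation (S705) described above can be sketched as follows, assuming the search window is bounded by the OUT time at the entrance-side camera and the IN time at the POS-register-side camera, with zones where the person is confirmed elsewhere excluded. This is an interpretation of the description, not the patented algorithm; times are plain minutes and all names are illustrative.

```python
def estimate_window(entrance_out, register_in, confirmed_elsewhere=()):
    """Return (start, end) windows in which to search for the person
    in the camera whose shooting time information 45 is missing."""
    windows = [(entrance_out, register_in)]
    for (s, e) in sorted(confirmed_elsewhere):
        # split any window that overlaps an excluded zone
        next_windows = []
        for (ws, we) in windows:
            if e <= ws or s >= we:          # no overlap: keep as-is
                next_windows.append((ws, we))
                continue
            if ws < s:                      # keep the part before the zone
                next_windows.append((ws, s))
            if e < we:                      # keep the part after the zone
                next_windows.append((e, we))
        windows = next_windows
    return windows

print(estimate_window(10, 40, confirmed_elsewhere=[(20, 25)]))
# [(10, 20), (25, 40)]
```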
- the monitoring system 100 of the present disclosure can be realized through the cooperation of hardware resources such as a processor and a memory with a program.
- the present disclosure is applicable to a monitoring apparatus that tracks a target object using a plurality of monitoring cameras and a monitoring system having the monitoring apparatus.
Abstract
Description
Embodiment 1 will be described with reference to the drawings. The present embodiment provides a monitoring system effective for tracking a target object (in the present embodiment, a person) even when a situation arises in which the features of the object cannot be extracted by some of the plurality of monitoring cameras.
FIG. 1 shows the configuration of the monitoring system of Embodiment 1. The monitoring system 100 of the present embodiment includes a plurality of monitoring cameras 1 (monitoring cameras a, b, c), a plurality of POS (Point of Sales) registers 2, and a monitoring device 3 that identifies a person appearing in the video captured by the plurality of monitoring cameras 1 and tracks the identified person using the purchase history information acquired by the plurality of POS registers 2.
FIG. 4 shows the person identification processing by the recognition unit 33a. The recognition unit 33a performs the person identification processing shown in FIG. 4 at a predetermined timing. For example, the predetermined timing may be when the user instructs the monitoring device 3, or every predetermined time (for example, every 24 hours).
Even for the same person, the way the person appears in the video may differ depending on the shooting angle and lighting conditions. Therefore, the features of the same person extracted from the videos shot by the plurality of monitoring cameras 1 may not match. For example, a monitoring camera 1 installed at a high position in a bright place and a monitoring camera 1 installed at a low position in a dark place capture greatly different videos, so the person features extracted from the two videos may differ. In this case, even the same person is recognized as another person because the extracted features differ. Therefore, even when the same person passes in front of a plurality of monitoring cameras 1 (for example, monitoring cameras a, b, c) in order, some monitoring cameras 1 (for example, monitoring camera b) may fail to extract the features of the person, and tracking of the person may be interrupted.
As described above, the monitoring device 3 of the present embodiment is a monitoring device that identifies a person from video captured by a plurality of monitoring cameras 1 installed in a predetermined area (store) in which a plurality of products are arranged, and includes: a receiving unit 31 that receives video from the plurality of monitoring cameras 1 and purchase history information 41 representing products purchased by a person from a POS register 2 installed in the predetermined area (store); a recognition information accumulating unit 32d that stores feature information 44 representing the features of a person; a product shelf information accumulating unit 32c that stores shooting area information 43 representing the shooting areas of the monitoring cameras 1; and a control unit 33 that identifies a person from the video based on the feature information 44. The control unit 33 specifies, based on the purchase history information 41 and the shooting area information 43, the monitoring camera 1 that captured the shooting area through which the person passed. Since the person is not only identified using the feature information 44 but the monitoring camera 1 that captured the person is also specified using the purchase history information 41 and the shooting area information 43, the person can be tracked with high accuracy.
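The overall idea summarized above can be condensed into a minimal sketch: feature matching establishes where the person was recognized, and the purchase history plus shooting area information implies where the person must have passed, exposing the cameras where feature matching failed. All data structures and names below are assumptions for illustration, not the patent's implementation.

```python
def track(person_id, sightings, purchases, product_camera):
    """sightings: {camera: set of person ids recognised by features};
    purchases: {person_id: [products]} (purchase history information 41);
    product_camera: product -> camera covering its shelf (shooting area info 43).
    Return (cameras on the person's path, cameras where recognition failed)."""
    seen = {cam for cam, ids in sightings.items() if person_id in ids}
    # cameras implied by the purchase history and shooting-area information:
    implied = {product_camera[p] for p in purchases.get(person_id, [])
               if p in product_camera}
    missing = implied - seen          # feature match failed there
    return seen | implied, missing

path, gaps = track(
    "A",
    sightings={"camera_a": {"A"}, "camera_b": set(), "camera_c": {"A"}},
    purchases={"A": ["milk"]},
    product_camera={"milk": "camera_b"},
)
print(sorted(path), sorted(gaps))  # ['camera_a', 'camera_b', 'camera_c'] ['camera_b']
```

In this example, person A's purchase of "milk" implies camera_b, so the missing recognition there can be repaired even though feature extraction failed on that camera's video.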
As described above, Embodiment 1 has been described as an example of the technique disclosed in the present application. However, the technique of the present disclosure is not limited thereto, and is also applicable to embodiments in which changes, replacements, additions, omissions, and the like have been made as appropriate. It is also possible to combine the components described in Embodiment 1 above to form a new embodiment. Other embodiments are therefore exemplified below.
2 POS register
3 Monitoring device
11 Imaging unit
12 Transmission unit
21 Purchase history information acquisition unit
22 Transmission unit
31 Receiving unit
32a Video accumulating unit
32b Purchase history information accumulating unit
32c Product shelf information accumulating unit
32d Recognition information accumulating unit
33 Control unit
33a Recognition unit
33b Recognition information correction unit
34 Display unit
100 Monitoring system
T1 Purchase history information table
T2 Display information table
T3 Shooting area information table
T4 Feature information table
T5 Shooting time information table
Claims (7)
- A monitoring device that identifies a person from video captured by a plurality of cameras installed in a predetermined area in which a plurality of products are arranged, the monitoring device comprising:
a receiving unit that receives video from the plurality of cameras and purchase history information, representing products purchased by a person, from a POS register installed in the predetermined area;
a storage unit that stores feature information representing features of a person and shooting area information representing shooting areas of the cameras; and
a control unit that identifies a person from the video based on the feature information,
wherein the control unit specifies, based on the purchase history information and the shooting area information, a camera that captured a shooting area through which the person passed.
- The monitoring device according to claim 1, wherein the control unit specifies, based on the shooting area information, a camera that captured a position where a product included in the purchase history information was placed, as the camera that captured the shooting area through which the person passed.
- The monitoring device according to claim 1, wherein the plurality of cameras include a first camera arranged at a predetermined position and a second camera specified by the control unit as the camera that captured the shooting area through which the person passed, and
the person who could not be identified from the video captured by the second camera is identified in the video captured by the second camera, based on the time at which the person was captured by the first camera.
- The monitoring device according to claim 3, wherein the control unit calculates, based on the time at which the person was captured by the first camera, a time zone in which the person passed through the shooting area of the second camera, and, when two or more person candidates appear in the video captured by the second camera in the calculated time zone, identifies one of the two or more person candidates as the person based on the feature information.
- The monitoring device according to claim 1, wherein the control unit extracts features of the person from the video received by the receiving unit, generates the feature information, and stores it in the storage unit.
- The monitoring device according to claim 3, wherein the first camera is installed at a position from which a person making a purchase at the POS register can be photographed.
- A monitoring system comprising:
a plurality of cameras installed in a predetermined area in which a plurality of products are arranged;
a POS register installed in the predetermined area that acquires purchase history information representing products purchased by a person; and
the monitoring device according to any one of claims 1 to 6, which has feature information representing features of a person and shooting area information representing shooting areas of the cameras, identifies a person from the video based on the feature information, and specifies, based on the purchase history information and the shooting area information, a camera that captured a shooting area through which the person passed.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP16895323.0A EP3435666B1 (en) | 2016-03-25 | 2016-09-12 | Monitoring device and monitoring system |
JP2018506504A JP6631962B2 (ja) | 2016-03-25 | 2016-09-12 | 監視装置及び監視システム |
US16/138,041 US20190026909A1 (en) | 2016-03-25 | 2018-09-21 | Monitoring device and monitoring system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016062033 | 2016-03-25 | ||
JP2016-062033 | 2016-03-25 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/138,041 Continuation US20190026909A1 (en) | 2016-03-25 | 2018-09-21 | Monitoring device and monitoring system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017163283A1 true WO2017163283A1 (ja) | 2017-09-28 |
Family
ID=59901284
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/004149 WO2017163283A1 (ja) | 2016-03-25 | 2016-09-12 | 監視装置及び監視システム |
Country Status (4)
Country | Link |
---|---|
US (1) | US20190026909A1 (ja) |
EP (1) | EP3435666B1 (ja) |
JP (1) | JP6631962B2 (ja) |
WO (1) | WO2017163283A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2022051683A (ja) * | 2020-09-22 | 2022-04-01 | グラスパー テクノロジーズ エーピーエス | 訓練データの生成と再識別に使用するための機械学習モデルの訓練とについての概念 |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10509949B1 (en) * | 2019-01-24 | 2019-12-17 | Capital One Services, Llc | Method and system for customizing user experience |
CN110414424A (zh) * | 2019-07-26 | 2019-11-05 | 广州云从信息科技有限公司 | 数据处理系统、方法、平台、机器可读介质及设备 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002041770A (ja) * | 2000-07-25 | 2002-02-08 | Oki Electric Ind Co Ltd | 顧客情報収集システム |
JP2005286619A (ja) * | 2004-03-29 | 2005-10-13 | Matsushita Electric Ind Co Ltd | 監視カメラシステム |
JP2006115435A (ja) * | 2004-10-12 | 2006-04-27 | Neo Planning:Kk | 遠隔監視システム |
JP2013196043A (ja) * | 2012-03-15 | 2013-09-30 | Glory Ltd | 特定人物監視システム |
JP2015069279A (ja) * | 2013-09-27 | 2015-04-13 | パナソニックIpマネジメント株式会社 | 滞留時間測定装置、滞留時間測定システムおよび滞留時間測定方法 |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2230629A3 (en) * | 2008-07-16 | 2012-11-21 | Verint Systems Inc. | A system and method for capturing, storing, analyzing and displaying data relating to the movements of objects |
JP6561830B2 (ja) * | 2013-04-16 | 2019-08-21 | 日本電気株式会社 | 情報処理システム、情報処理方法及びプログラム |
US9361775B2 (en) * | 2013-09-25 | 2016-06-07 | Oncam Global, Inc. | Mobile terminal security systems |
-
2016
- 2016-09-12 WO PCT/JP2016/004149 patent/WO2017163283A1/ja active Application Filing
- 2016-09-12 EP EP16895323.0A patent/EP3435666B1/en active Active
- 2016-09-12 JP JP2018506504A patent/JP6631962B2/ja active Active
-
2018
- 2018-09-21 US US16/138,041 patent/US20190026909A1/en not_active Abandoned
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002041770A (ja) * | 2000-07-25 | 2002-02-08 | Oki Electric Ind Co Ltd | 顧客情報収集システム |
JP2005286619A (ja) * | 2004-03-29 | 2005-10-13 | Matsushita Electric Ind Co Ltd | 監視カメラシステム |
JP2006115435A (ja) * | 2004-10-12 | 2006-04-27 | Neo Planning:Kk | 遠隔監視システム |
JP2013196043A (ja) * | 2012-03-15 | 2013-09-30 | Glory Ltd | 特定人物監視システム |
JP2015069279A (ja) * | 2013-09-27 | 2015-04-13 | パナソニックIpマネジメント株式会社 | 滞留時間測定装置、滞留時間測定システムおよび滞留時間測定方法 |
Non-Patent Citations (1)
Title |
---|
See also references of EP3435666A4 * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2022051683A (ja) * | 2020-09-22 | 2022-04-01 | グラスパー テクノロジーズ エーピーエス | 訓練データの生成と再識別に使用するための機械学習モデルの訓練とについての概念 |
JP7186269B2 (ja) | 2020-09-22 | 2022-12-08 | グラスパー テクノロジーズ エーピーエス | 訓練データの生成と同一物判定に使用するための機械学習モデルの訓練とについての概念 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2017163283A1 (ja) | 2019-02-28 |
EP3435666A1 (en) | 2019-01-30 |
EP3435666B1 (en) | 2020-07-15 |
EP3435666A4 (en) | 2019-03-27 |
US20190026909A1 (en) | 2019-01-24 |
JP6631962B2 (ja) | 2020-01-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10572736B2 (en) | Image processing apparatus, image processing system, method for image processing, and computer program | |
JP6688975B2 (ja) | 監視装置及び監視システム | |
US9124778B1 (en) | Apparatuses and methods for disparity-based tracking and analysis of objects in a region of interest | |
JP6036824B2 (ja) | 画角変動検知装置、画角変動検知方法および画角変動検知プログラム | |
CN108921098B (zh) | 人体运动分析方法、装置、设备及存储介质 | |
US20190384989A1 (en) | Image processing apparatus | |
US20160165129A1 (en) | Image Processing Method | |
WO2017163283A1 (ja) | 監視装置及び監視システム | |
US20180046842A1 (en) | Line-of-sight detection device and line-of-sight detection method | |
CN110675426B (zh) | 人体跟踪方法、装置、设备及存储介质 | |
US20160203454A1 (en) | Information processing apparatus and method for recognizing specific person by the same | |
JP2013207393A (ja) | 画像監視装置 | |
JP6113631B2 (ja) | 作業確認システム | |
US9734483B2 (en) | Product management device and product management method | |
JP6289308B2 (ja) | 情報処理装置およびプログラム | |
US10817727B2 (en) | Information processing apparatus and method of controlling an information processing apparatus that estimate a waiting time in a waiting line | |
JP2018201146A (ja) | 画像補正装置、画像補正方法、注目点認識装置、注目点認識方法及び異常検知システム | |
JP7164047B2 (ja) | 注視点検出装置及び注視点検出方法 | |
JP2016045743A (ja) | 情報処理装置およびプログラム | |
US20240087428A1 (en) | Display device and display method | |
CN107967268B (zh) | 路跑活动的拍照系统及其操作方法 | |
JP6815859B2 (ja) | 通行量計測装置 | |
CN117616468A (zh) | 物体检知方法及物体检知装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 2018506504 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2016895323 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2016895323 Country of ref document: EP Effective date: 20181025 |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16895323 Country of ref document: EP Kind code of ref document: A1 |