WO2023238614A1 - Information processing device, information processing system, information processing method, and program - Google Patents

Information processing device, information processing system, information processing method, and program

Info

Publication number
WO2023238614A1
WO2023238614A1 (PCT/JP2023/018219)
Authority
WO
WIPO (PCT)
Prior art keywords
person
group
unit
tracking
information processing
Prior art date
Application number
PCT/JP2023/018219
Other languages
English (en)
Japanese (ja)
Inventor
洸太 坂巻
将士 園山
和輝 中道
Original Assignee
コニカミノルタ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by コニカミノルタ株式会社
Publication of WO2023238614A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0201Market modelling; Market analysis; Collecting market data
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present invention relates to an information processing device, an information processing system, an information processing method, and a program.
  • the measures include distributing flyers before Christmas to the trade area of the commercial facility and to areas where many families live.
  • Management verifies the effectiveness of leaflet distribution by counting the number of visitors during the event period.
  • technologies have been actively developed to make it easier and more accurate to verify the effectiveness of measures taken by managers.
  • Patent Document 1 discloses an invention that creates and visualizes the flow information of beacon owners using information acquired from a plurality of beacons placed at various locations within a collection target area.
  • Patent Document 2 discloses an invention that visualizes, on a map, the movement of tourists visiting Japan, using user attributes such as gender, nationality, age, and purpose of visit in addition to the user's location information.
  • Patent Documents 1 and 2 merely focus on each target person and analyze the behavior of each person. For this reason, there is a problem in that it is not possible to sufficiently accurately verify the effectiveness of the measures implemented by the manager of the commercial facility for customers visiting the commercial facility.
  • An information processing device comprising:
  • the drawing unit may draw the movement of the group in the Sankey diagram in different colors depending on the number of people in the group, or may perform highlighted drawing of the movement of the group in the Sankey diagram when the number of people in the group is a predetermined number; the information processing device according to (3).
  • the identification result of the person identification unit, the tracking result of the person tracking unit, and the determination result of the group determination unit are multiple types of results for multiple types of periods, and the drawing unit draws the movement of the group during the multiple types of periods in the Sankey diagram based on the multiple types of results; the information processing device according to (3) or (4).
  • the movement of the group includes a first movement in a first period and a second movement in a second period as the multiple types of periods, and the drawing unit draws the difference between the first movement and the second movement in the Sankey diagram; the information processing device according to (5).
  • the person identifying unit assigns attribute information to the identified person based on the image information, and the person tracking unit determines the identity of the identified person based on the assigned attribute information and on the image information acquired from each of the plurality of photographing devices; the information processing device according to (8).
  • the attribute information includes at least one of the identified person's gender, age, clothing, clothing color, presence or absence of a hat, presence or absence of glasses, and presence or absence of a bag; the information processing device according to (9) or (10).
  • the information processing device wherein the person identifying unit identifies the person multiple times by changing one or both of the location and the time, the person tracking unit tracks the identified person by associating the results of identifying the person multiple times, and the group determination unit determines the group attribute of the group by tracking a plurality of people.
  • An information processing system comprising: an information processing device including a person identification unit that identifies a person, a person tracking unit that tracks the identified person, and a group determination unit that uses the tracking results to determine the number of people in the group to which the identified person belongs; and a plurality of photographing devices.
  • An information processing method comprising: identifying a person; tracking the identified person; and determining, using the results of the tracking, the number of people in a group to which the identified person belongs.
  • a person identification unit that identifies a person, a person tracking unit that tracks the identified person, and a group determination unit that uses the tracking results of the person tracking unit to determine the number of people in the group to which the identified person belongs.
  • FIG. 1 is an example of an overall configuration diagram of an information processing system according to this embodiment.
  • the information processing system 100 includes an information processing device 1 and a plurality of cameras 2.
  • the information processing device 1 is a computer that performs predetermined arithmetic processing.
  • the camera 2 is a photographing device installed at various locations in a commercial facility (for example, an entrance, an escalator entrance or exit, an elevator entrance, a store entrance).
  • the information processing device 1 and the camera 2 are each connected to be able to communicate by wire or wirelessly.
  • the information processing device 1 can acquire images taken by each of the cameras 2 as image information from each of the cameras 2.
  • the information processing device 1 includes a person identifying section 11, a person tracking section 12, a group determining section 13, and a drawing section 14.
  • the information processing device 1 also stores a tracking DB 15 and a group DB 16.
  • "DB” is an abbreviation for database (DataBase).
  • the person identifying unit 11 identifies a person within a commercial facility. Specifically, the person identification unit 11 identifies a person from an image taken by the camera 2 based on image information acquired from the camera 2 . For example, the person identifying unit 11 can be implemented as a learning device that is machine-trained to detect a person from an image.
  • the person tracking unit 12 tracks the person identified by the person identifying unit 11.
  • the group determination unit 13 uses the tracking results of the person tracking unit 12 to determine the group to which the person identified by the person identification unit 11 belongs.
  • a group is a collection of people (customers) who act together in a commercial facility. For example, groups include, but are not limited to, families (parents and children) and partners (pairs of men and women).
  • the drawing unit 14 draws the movement of the group based on the identification result of the person identification unit 11, the tracking result of the person tracking unit 12, and the determination result of the group determination unit 13. Drawing can be performed, for example, on the display unit of the information processing device 1.
  • Attribute information is information indicating one or more types of attributes assigned to a person.
  • the attribute information may include, but is not limited to, at least one of the identified person's gender, age, clothing, clothing color, presence or absence of a hat, presence or absence of glasses, and presence or absence of a bag. Attributes serve to distinguish people from one another once assigned (in other words, they have the function of distinguishing each person), and the value of each attribute can be determined from image information using machine learning.
  • Image information from the cameras 2 is generated at predetermined time intervals, and the person identification unit 11 can assign an ID and attribute information, at the predetermined time intervals, to the persons shown in the images from all the cameras 2. As a result, the person identification unit 11 can partially create the tracking DB 15.
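As a rough sketch, one record of the tracking DB 15 per identified person per frame interval might look like the following. The field names are hypothetical (the text only names the columns informally: camera number, time, ID, age, gender, color, and the unified ID between cameras):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TrackingRecord:
    """One hypothetical row of the tracking DB 15."""
    camera_no: int                  # which camera 2 produced the frame
    timestamp: str                  # frame time, e.g. "11:25:30"
    person_id: str                  # per-camera ID assigned by the person identification unit 11
    age: str                        # attribute values inferred from the image
    gender: str
    color: str
    unified_id: Optional[str] = None  # cross-camera ID, filled in later by the person tracking unit 12

# One record per identified person per frame interval, following the FIG. 2 example
tracking_db = [
    TrackingRecord(1, "11:25:30", "A", "20s", "male", "black"),
    TrackingRecord(1, "11:25:30", "B", "20s", "female", "red"),
]
```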
  • the values registered in the "Age” column, "Gender” column, and “Color” column of the tracking DB 15 are examples of the value of the attribute indicated by the attribute information.
  • the values of attributes registered in the tracking DB 15 are not limited to these. For example, values indicating the person's clothing, presence or absence of a hat, presence or absence of glasses, presence or absence of a bag, etc. can be prepared.
  • the "unified ID between cameras” column when the person shown in the images taken by a plurality of cameras 2 is the same person, the value of the ID given to the person is registered. The unified ID between cameras will be described later.
  • the person identification unit 11 creates each record in the tracking DB 15 for image information that is generated every moment.
  • As shown in FIG. 4, at 11:25:30, persons A and B are shown in the frame of the camera 2 with camera No. 1. Also, at 11:25:40, person A is shown in the frame of the camera 2 with camera No. 2, and at 11:25:41, person B is shown in the frame of the camera 2 with camera No. 2.
  • the group determination unit 13 can determine that persons A and B belong to a group. Further, as shown in FIG. 3, the group determination unit 13 assigns a group ID: "1" to the group of persons A and B.
  • At 11:25:31, person C is shown in the frame of the camera 2 with camera No. 1, and at 11:25:32, person D is shown in the frame of the camera 2 with camera No. 1.
  • FIG. 5 is an explanatory diagram example (part 2) of group determination.
  • the table shown in FIG. 5 has the same format as the table shown in FIG. 4, but the contents are different.
  • FIG. 5 shows another example of how the group determining unit 13 determines groups.
  • Next, a case will be considered in which the same set of two or more persons appears together in frames shot by the same camera 2 at close times over a predetermined period. In this case, the group determination unit 13 determines that those persons form a group.
  • a group ID is assigned to the determined group.
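The co-occurrence logic described above (the same persons appearing at the same cameras at nearly the same times) can be sketched as follows. The observation format, the time-gap threshold, and the minimum number of shared cameras are illustrative assumptions, not taken from the patent:

```python
from collections import defaultdict
from itertools import combinations

def find_groups(observations, max_gap_s=5, min_cameras=2):
    """Group people who pass the same cameras at nearly the same time.

    observations: list of (camera_no, time_in_seconds, person_id) tuples.
    Two people are treated as one group when they are seen at at least
    `min_cameras` cameras within `max_gap_s` seconds of each other.
    """
    seen = defaultdict(dict)  # person -> {camera: earliest time seen there}
    for cam, t, person in observations:
        seen[person].setdefault(cam, t)
    groups = []
    for a, b in combinations(sorted(seen), 2):
        shared = [cam for cam in seen[a]
                  if cam in seen[b] and abs(seen[a][cam] - seen[b][cam]) <= max_gap_s]
        if len(shared) >= min_cameras:
            # merge into an existing group if either member already belongs to one
            for g in groups:
                if a in g or b in g:
                    g.update({a, b})
                    break
            else:
                groups.append({a, b})
    return groups
```

With the FIG. 4 example (A and B together at two cameras; C and D sharing only one camera), only A and B are grouped.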
  • the group determination unit 13 can determine the number of people forming the determined group. According to the example of FIG. 4, for the group of persons A and B, the group determination unit 13 determines that the number of people in the group is two and, as shown in FIG. 3, registers "2" as the number of people for that group. Similarly, for the group of persons C, D, and E, the group determination unit 13 determines that the number of people in the group is three and, as shown in FIG. 3, registers "3" as the number of people for that group.
  • the group determination unit 13 can determine the group attribute of the determined group based on the attribute information of the persons belonging to the group. For example, according to FIG. 2, by referring to the attribute information, it can be inferred that persons A and B are a male and female pair in their 20s. Therefore, the group determination unit 13 determines that persons A and B are partners, and registers the value "partner" in the "group attribute" column of the records of persons A and B (group information in FIG. 3). Similarly, according to FIG. 2, it can be inferred from the attribute information that persons C, D, and E are a parent and children. Therefore, the group determination unit 13 determines that persons C, D, and E are a family, and registers the value "family" in the "group attribute" column of the records of persons C, D, and E (group information in FIG. 3).
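A minimal sketch of such attribute-based group classification, assuming only age and gender attributes and simple hand-written rules (the patent does not specify the inference method, so these thresholds are illustrative):

```python
def infer_group_attribute(members):
    """Classify a group from its members' attribute information.

    members: list of dicts with "age" and "gender" attribute values.
    Two same-age-band people of opposite gender -> "partner";
    mixed age bands (suggesting parents and children) -> "family".
    """
    ages = {m["age"] for m in members}
    genders = set(m["gender"] for m in members)
    if len(members) == 2 and len(ages) == 1 and genders == {"male", "female"}:
        return "partner"
    if len(members) >= 2 and len(ages) > 1:
        return "family"
    return "other"
```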
  • the map registered in the information processing device 1 may be a simple map that includes aisle network information, which is data about the connections of aisles, and store location information, which is data indicating the locations of stores.
  • the passage network information includes, for example, a node set as a reference of the passage (for example, the center in the width direction) and a link connecting the nodes.
  • a node is set, for example, at a branch point of a passage, a point where a store is located, a point where the camera 2 is installed, or the like.
  • the location information of the store and the location information of the camera 2 may be associated with nodes of the aisle network information, for example.
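The aisle network described above can be sketched as an adjacency mapping over nodes, with a breadth-first search recovering a route a tracked person may have taken between two associated nodes. The node names here are hypothetical:

```python
from collections import deque

# Hypothetical aisle network: nodes at branch points, store fronts, and
# camera positions; links record which nodes are directly connected.
aisle_links = {
    "entrance": ["junction1"],
    "junction1": ["entrance", "store0", "store1"],
    "store0": ["junction1"],
    "store1": ["junction1"],
}

def path_between(links, start, goal):
    """Breadth-first search returning one shortest node path, or None."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in links.get(path[-1], []):
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None
```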
  • Using this map, the information processing device 1 can determine which store the person photographed by the camera 2 is heading to; however, it is not essential for the information processing device 1 to store location information of stores within the commercial facility. The operator of the information processing device 1 can select a predetermined period (for example, one day) during which images were taken by the cameras 2.
  • the drawing unit 14 can extract tracking information included in the selected predetermined period from the tracking information in the tracking DB 15.
  • the drawing unit 14 can extract group information corresponding to the extracted tracking information from the group DB 16. Then, the drawing unit 14 can draw a Sankey diagram based on the extracted tracking information and group information.
  • FIG. 6 is an explanatory diagram example (part 1) of the Sankey diagram.
  • FIG. 6 shows the flow of people obtained by tracking customers who stopped by stores (store 0, store 1, store 2, store 3, store 4) in a commercial facility.
  • the bands shown in FIG. 6 represent the movement of customers.
  • the movement of the customers proceeds from left to right.
  • the movement of each customer can be identified by referring to the tracking DB 15.
  • the drawing unit 14 can express bands in different colors depending on group attributes.
  • a band whose group attribute is "family” and a band whose group attribute is "partner” are shown in different colors (for convenience of illustration, they are shown by different hatching rather than different colors).
  • the color of the band can be specified by referring to the "group attribute" column of the group DB 16.
  • the vertical width of the band represents the number of customers moving.
  • the vertical width of each color band can be specified by referring to the tracking DB 15 and the group DB 16.
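The band widths described above can be computed by accumulating, for each store-to-store movement, the number of customers in the groups that made it. The input format here is an assumption for illustration:

```python
from collections import Counter

def sankey_links(group_visits):
    """Compute Sankey band widths from group movements.

    group_visits: list of (store_sequence, people_in_group) pairs, i.e. the
    ordered stores each group stopped at and its head count.
    Returns {(src, dst): customers moving src -> dst}, the band widths.
    """
    links = Counter()
    for stores, n_people in group_visits:
        for src, dst in zip(stores, stores[1:]):
            links[(src, dst)] += n_people  # width counts customers, not groups
    return dict(links)
```

Counting `n_people` per movement reflects the text's rule that the vertical width of a band represents the number of customers moving; counting groups instead would be a one-line change.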
  • each store is the starting point or end point of a customer's movement, and as shown in FIG. 6, the drawing unit 14 represents each store with a vertical bar.
  • the drawing unit 14 can perform similar visualization for a larger number of group attributes. Further, the drawing unit 14 may draw the movement of the group in the Sankey diagram in different colors depending on the number of people in the group. Further, the drawing unit 14 can perform emphasized drawing such that the band of interest is colored in a dark color and the other bands are colored in a light color. Such highlighted drawing allows the group attributes that the manager of the commercial facility wants to see to be displayed in an easy-to-understand manner. Further, the drawing unit 14 may highlight in the Sankey diagram the movement of the group when the number of people in the group is a predetermined number.
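Such emphasized drawing can be sketched as a style lookup that darkens the bands of the attribute of interest and lightens the rest. The color names, opacity values, and default focus attribute are illustrative assumptions:

```python
def band_style(group_attr, n_people, focus_attr="family"):
    """Return a (color, opacity) pair for one Sankey band.

    The attribute of interest is drawn in a dark, fully opaque color;
    all other bands are drawn light and faded. `n_people` is available
    for variants that instead highlight groups of a predetermined size.
    """
    palette = {"family": "darkred", "partner": "darkblue"}
    if group_attr == focus_attr:
        return palette.get(group_attr, "gray"), 1.0
    return "lightgray", 0.3
```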
  • FIG. 7 is an explanatory diagram example (part 2) of the Sankey diagram.
  • FIG. 8 is an explanatory diagram example (part 3) of the Sankey diagram.
  • FIGS. 7 and 8 are Sankey diagrams drawn by extracting, from the tracking DB 15 and the group DB 16, data for two different periods for customers passing through stores (store 0, store 1, store 2, store 3, store 4) in a commercial facility.
  • FIG. 7 is a Sankey diagram drawn by the drawing unit 14 based on tracking information for the one-week period starting one month ago, as selected by the operator, and group information corresponding to that tracking information.
  • FIG. 8 is a Sankey diagram drawn by the drawing unit 14 based on tracking information for the one-week period up to the present, as selected by the operator, and group information corresponding to that tracking information.
  • the bands drawn in FIGS. 7 and 8 are narrowed down to those of the family segment (groups whose group attribute is "family").
  • FIG. 9 is an explanatory diagram example (part 4) of the Sankey diagram.
  • a Sankey diagram has the property that it can be drawn by adding or subtracting two or more Sankey diagrams.
  • the Sankey diagram in FIG. 9 corresponds to the Sankey diagram for the one-week period up to the present (FIG. 8) minus the Sankey diagram for the one-week period starting one month ago (FIG. 7).
  • the bands shown in FIG. 9 represent positive values (increases) and negative values (decreases) in the amount of customer movement after taking the difference.
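The subtraction of two Sankey diagrams reduces to subtracting link weights, with links missing from either period treated as zero; a sketch, assuming the link-weight mapping format used for band widths:

```python
def sankey_difference(current, past):
    """Subtract one Sankey diagram's link weights from another.

    current, past: {(src, dst): customer count} mappings for two periods.
    Positive results are increases over the past period, negative are decreases.
    """
    keys = set(current) | set(past)
    return {k: current.get(k, 0) - past.get(k, 0) for k in keys}
```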
  • Conventionally, to compare with past data, the manager of the commercial facility had to display two Sankey diagrams on the drawing unit 14 and compare their differences to confirm the increase or decrease in the amount of customer movement before devising measures.
  • By contrast, by referring to the single difference Sankey diagram, the manager of a commercial facility can check the increase or decrease in the amount of customer movement at a glance and can verify measures more intuitively.
  • FIG. 10 is a flowchart showing the information processing method of this embodiment.
  • the information processing device 1 starts processing using image information acquired from each of the plurality of cameras 2.
  • the person identification unit 11 identifies a person from an image taken by the camera 2 (step S1). An ID and attribute information are assigned to the identified person and registered in the tracking DB 15. The person identifying unit 11 can identify a person multiple times by changing one or both of the location and time.
  • the person tracking unit 12 tracks the specified person by determining the identity of the specified person based on the attribute information (step S2). The person tracking unit 12 can track the identified person by associating the results of identifying the person multiple times.
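A crude sketch of identity determination by attribute matching across cameras. Real systems would also use appearance features and timing, and the exact matching rule is not specified in the patent, so the compared keys are an assumption:

```python
def same_person(rec_a, rec_b, keys=("age", "gender", "color")):
    """Treat two detections from different cameras as the same person
    when all compared attribute values match exactly."""
    return all(rec_a[k] == rec_b[k] for k in keys)
```

When `same_person` returns True, the two records would receive the same "unified ID between cameras" in the tracking DB 15.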
  • the group determination unit 13 determines the number of people in the group to which the identified person belongs (step S3). At this time, the group determination unit 13 determines the group to which the identified person belongs by referring to the tracking DB 15.
  • the media I/F 207 reads the program or data stored in the recording medium 208 and provides it to the CPU 201 via the RAM 202.
  • the CPU 201 loads this program from the recording medium 208 onto the RAM 202 via the media I/F 207, and executes the loaded program.
  • the recording medium 208 is, for example, an optical recording medium such as a DVD (Digital Versatile Disc) or a PD (Phase change rewritable Disk), a magneto-optical recording medium such as an MO (Magneto-Optical disk), a tape medium, a magnetic recording medium, or a semiconductor memory.
  • the CPU 201 of the computer 200 realizes the functions of each part by executing a program loaded onto the RAM 202.
  • data stored in the HDD 204 is used.
  • the CPU 201 of the computer 200 reads these programs from the recording medium 208 and executes them, but as another example, these programs may be acquired from another device via the communication network 209.
  • the person identifying unit 11 identifies a person, determines the number of people in a group, etc. using image information acquired from a plurality of cameras 2.
  • instead of using image information from the cameras 2, the person identification unit 11 may identify a person or determine the number of people in a group by using identification information and location information that can be obtained from a terminal, such as a smartphone, owned by each person.
  • identification information and position information that can be obtained from beacons may be used to identify people, determine the number of people in a group, and the like.
  • the identification of a person by the person identification unit 11 may be performed based on image information obtained from a plurality of cameras 2, or may be performed based on image information obtained from a single camera 2.
  • the information processing device may be implemented using one computer, or may be implemented using two or more computers.


Abstract

The purpose of the present invention is to accurately verify the effectiveness of measures developed for multiple people. The information processing device (1) according to the present invention comprises: a person identification unit (11) that identifies a person; a person tracking unit (12) that tracks the identified person; and a group determination unit (13) that uses the tracking results of the person tracking unit (12) to determine a group to which the identified person belongs. The device also comprises a drawing unit (14) that draws the movement of the group on the basis of identification results from the person identification unit (11), tracking results from the person tracking unit (12), and determination results from the group determination unit (13). The drawing unit (14) draws the movement of the group in a Sankey diagram.
PCT/JP2023/018219 2022-06-09 2023-05-16 Information processing device, information processing system, information processing method, and program WO2023238614A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022093424 2022-06-09
JP2022-093424 2022-06-09

Publications (1)

Publication Number Publication Date
WO2023238614A1 (fr) 2023-12-14

Family

ID=89118158

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/018219 WO2023238614A1 (fr) 2022-06-09 2023-05-16 Information processing device, information processing system, information processing method, and program

Country Status (1)

Country Link
WO (1) WO2023238614A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003134235A (ja) * 2001-10-19 2003-05-09 Ntt Docomo Kansai Inc Information transmission system, information processing device, computer program, and recording medium
WO2018180588A1 (fr) * 2017-03-27 2018-10-04 株式会社日立国際電気 Facial image matching system and facial image search system
JP2019023851A (ja) * 2017-07-21 2019-02-14 株式会社エヌ・ティ・ティ・アド Data analysis system and analysis method
WO2020195376A1 (fr) * 2019-03-27 2020-10-01 日本電気株式会社 Monitoring device, suspicious object detection method, and recording medium
JP2021140636A (ja) * 2020-03-09 2021-09-16 日本電気株式会社 Coupon issuing device, method, and program



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23819598

Country of ref document: EP

Kind code of ref document: A1