WO2015129210A1 - 情報処理装置、データ分析方法、及び、記録媒体 - Google Patents
情報処理装置、データ分析方法、及び、記録媒体 Download PDFInfo
- Publication number
- WO2015129210A1 (application PCT/JP2015/000779, JP2015000779W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- person
- staying
- tracking
- flow line
- Prior art date
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0201—Market modelling; Market analysis; Collecting market data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
Definitions
- the present invention relates to data analysis, and more particularly, to an information processing apparatus, data analysis method, and recording medium for analyzing human behavior information.
- An information processing apparatus that analyzes behavior information uses, for example, image information from a monitoring camera, information from an RFID (Radio Frequency Identification) tag, or information (for example, a number) from a mobile phone SIM (Subscriber Identity Module) card. The information processing apparatus then calculates a person's action trajectory information based on such information (see, for example, Patent Document 1).
- the calculated human action trajectory information is used, for example, for analyzing the action of a person in a store or warehouse area.
- An analyst who analyzes behavior can grasp information effective for purchasing behavior or business efficiency based on person behavior trajectory information.
- customers in a store do not only look at goods while moving. For example, a customer may stop to examine a product. That is, a person may not only move but also stop (stay).
- the behavior analysis system described in Patent Document 1 displays only the flow line data of the analyzed persons when displaying the result of the behavior analysis in the target area. For this reason, the behavior analysis system described in Patent Document 1 cannot appropriately display the result of the behavior analysis of persons in the entire target area.
- An object of the present invention is to provide an information processing apparatus, a data analysis method, and a recording medium that solve the above problems.
- an information processing apparatus according to the present invention includes: person detection tracking means for receiving analysis information including information related to positions of persons in a target area and calculating, based on the analysis information, first person tracking information that is time-series information of the positions of the persons; same person detection means for identifying persons in the first person tracking information, classifying the first person tracking information for each person, and calculating second person tracking information that is time-series position information for each person; direction calculating means for calculating traveling direction information of each person based on the second person tracking information; staying time calculating means for calculating staying place information and staying time information of each person based on the second person tracking information; and processing means for calculating flow line information and staying information of the persons based on the traveling direction information, the staying place information, and the staying time information, and displaying the flow line information and the staying information superimposed on an image of the target area.
- a data analysis method according to the present invention includes: receiving analysis information including information related to positions of persons in a target area; calculating, based on the analysis information, first person tracking information that is time-series information of the positions of the persons; identifying persons in the first person tracking information, classifying the first person tracking information for each person, and calculating second person tracking information that is time-series position information for each person; calculating traveling direction information of each person based on the second person tracking information; calculating staying place information and staying time information of each person based on the second person tracking information; calculating flow line information and staying information of the persons based on the traveling direction information, the staying place information, and the staying time information; and displaying the flow line information and the staying information superimposed on an image of the target area.
- a computer-readable recording medium according to the present invention stores a program that causes a computer to execute: a process of receiving analysis information including information related to positions of persons in a target area and calculating, based on the analysis information, first person tracking information that is time-series information of the positions of the persons; a process of identifying persons in the first person tracking information, classifying the first person tracking information for each person, and calculating second person tracking information that is time-series position information for each person; a process of calculating traveling direction information of each person based on the second person tracking information; a process of calculating staying place information and staying time information of each person based on the second person tracking information; and a process of calculating flow line information and staying information of the persons based on the traveling direction information, the staying place information, and the staying time information, and displaying the flow line information and the staying information superimposed on an image of the target area.
- FIG. 1 is a block diagram showing an example of the configuration of the information processing apparatus according to the first embodiment of the present invention.
- FIG. 2 is a diagram illustrating an example of a display of the information processing apparatus according to the first embodiment.
- FIG. 3 is a diagram illustrating an example of another configuration of the information processing apparatus according to the first embodiment.
- FIG. 4 is a block diagram illustrating an example of the configuration of the information processing apparatus according to the second embodiment.
- FIG. 5 is a diagram illustrating an example of a display of the information processing apparatus according to the second embodiment.
- FIG. 1 is a block diagram showing an example of the configuration of the information processing apparatus 10 according to the first embodiment of the present invention.
- the information processing apparatus 10 includes a person detection tracking unit 100, a same person detection unit 101, a direction calculation unit 102, a residence time calculation unit 103, and a processing unit 104.
- the person detection tracking unit 100 receives analysis information used for behavior analysis in the target area.
- the analysis information may include information related to the position.
- the person detection tracking unit 100 may receive, as the analysis information, video captured by a camera of a predetermined floor serving as the target area (hereinafter referred to as a "floor video").
- the person detection tracking unit 100 may receive the RFID tag position information as the analysis information.
- a description will be given using a floor video.
- after receiving the floor video, the person detection tracking unit 100 detects the positions of one or more persons from the floor video. For example, the person detection tracking unit 100 may detect the positions of the persons using an image recognition (for example, person image recognition) technique.
- the person detection tracking unit 100 tracks the position of the person between the frames of the floor video, and calculates time series information of the position of the person.
- the time series information of the position of the person calculated by the person detection tracking unit 100 is referred to as “first person tracking information”.
- the person detection tracking unit 100 outputs the first person tracking information to the same person detection unit 101.
- each person is not specified in the first person tracking information.
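- the patent does not specify a tracking method for linking positions between frames; as one illustrative possibility, the association step can be sketched as a greedy nearest-neighbour match. The function name, record shapes, and distance threshold below are all assumptions, not part of the disclosure:

```python
import math

def track_between_frames(prev_positions, detections, max_dist=50.0):
    """Greedily match current-frame detections to previous-frame tracks.

    prev_positions: {track_id: (x, y)} from the previous frame.
    detections: [(x, y), ...] detected in the current frame.
    A detection farther than max_dist from every track is left unmatched.
    """
    assignments = {}
    used = set()
    for track_id, (px, py) in prev_positions.items():
        best, best_d = None, max_dist
        for i, (x, y) in enumerate(detections):
            if i in used:
                continue
            d = math.hypot(x - px, y - py)
            if d < best_d:
                best, best_d = i, d
        if best is not None:
            assignments[track_id] = detections[best]
            used.add(best)
    return assignments
```

Repeating this over all frames yields the time-series positions that make up the first person tracking information.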
- the same person detection unit 101 receives the first person tracking information from the person detection tracking unit 100. Then, the same person detection unit 101 identifies (detects) all persons included in the first person tracking information. For example, the same person detection unit 101 may use a face recognition technique. Further, the same person detection unit 101 classifies the first person tracking information by grouping the entries determined to belong to the same person. Hereinafter, the person tracking information for each person classified by the same person detection unit 101 is referred to as "second person tracking information". That is, the same person detection unit 101 detects time-series information of positions related to the same person as the second person tracking information. The same person detection unit 101 outputs the second person tracking information to the direction calculation unit 102 and the residence time calculation unit 103. The same person detection unit 101 may identify only persons in a predetermined range. For example, when analyzing customer behavior, the same person detection unit 101 may identify each customer while excluding employees.
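- the classification into second person tracking information can be sketched as follows, assuming each detection carries an appearance feature (e.g. a face crop) and a caller-supplied identifier function; both names and the tuple layout are illustrative assumptions:

```python
from collections import defaultdict

def classify_tracks(first_tracking_info, identify_person):
    """Split anonymous detections into per-person time-series tracks.

    first_tracking_info: list of (timestamp, x, y, appearance) tuples.
    identify_person: maps an appearance to a stable person ID
    (e.g. via face recognition).
    Returns {person_id: [(timestamp, x, y), ...]} sorted by time.
    """
    second_tracking_info = defaultdict(list)
    for t, x, y, appearance in first_tracking_info:
        second_tracking_info[identify_person(appearance)].append((t, x, y))
    for track in second_tracking_info.values():
        track.sort(key=lambda p: p[0])  # keep each track in time order
    return dict(second_tracking_info)
```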
- the direction calculation unit 102 receives the second person tracking information from the same person detection unit 101. Then, based on the time-series position information of each person in the second person tracking information, the direction calculation unit 102 calculates information indicating the traveling direction of each person (traveling direction information) as time-series data. Then, the direction calculation unit 102 outputs the traveling direction information calculated for each person to the processing unit 104.
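- the traveling direction calculation can be sketched as taking differences between consecutive positions in one person's track; representing the direction as an angle in radians is an assumption, since the patent does not fix a representation:

```python
import math

def traveling_directions(track):
    """Compute a time series of traveling directions for one person.

    track: [(t, x, y), ...] sorted by time (second person tracking info).
    Returns [(t, angle_radians), ...], one entry per movement step.
    """
    directions = []
    for (t0, x0, y0), (t1, x1, y1) in zip(track, track[1:]):
        directions.append((t1, math.atan2(y1 - y0, x1 - x0)))
    return directions
```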
- the residence time calculation unit 103 receives the second person tracking information from the same person detection unit 101. After reception, the residence time calculation unit 103 calculates residence location information related to the location where the same person stayed and residence time information related to the residence time based on the second person tracking information. Then, the residence time calculation unit 103 outputs the calculated residence location information and residence time information to the processing unit 104.
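- one plausible way to derive residence location and residence time from a per-person track is to find runs of positions that stay near their starting point; the radius and minimum-stay thresholds here are illustrative assumptions, not values from the patent:

```python
import math

def staying_segments(track, radius=1.0, min_stay=3.0):
    """Extract (stay place, stay time) pairs from one person's track.

    track: [(t, x, y), ...] sorted by time.
    A stay is a maximal run of positions within `radius` of the run's
    first point lasting at least `min_stay` seconds.
    """
    segments = []
    i = 0
    while i < len(track):
        t0, x0, y0 = track[i]
        j = i
        while j + 1 < len(track):
            t, x, y = track[j + 1]
            if math.hypot(x - x0, y - y0) > radius:
                break
            j += 1
        duration = track[j][0] - t0
        if duration >= min_stay:
            segments.append(((x0, y0), duration))
        i = j + 1
    return segments
```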
- the processing unit 104 receives the traveling direction information from the direction calculating unit 102 and receives the staying location information and the staying time information from the staying time calculating unit 103. Then, the processing unit 104 calculates flow line information and staying information for each person based on the traveling direction information, the staying place information, and the staying time information.
- the flow line information is information relating to a change in position for each person.
- the staying information is information regarding staying for each person.
- the flow line information and the staying information are information used by the processing unit 104 for display. For this reason, the flow line information and the staying information may include information necessary for display in addition to the above information.
- the processing unit 104 displays the calculated flow line information and staying information superimposed on an image of the target area.
- the number of persons whose information the processing unit 104 displays is not limited.
- the processing unit 104 may display flow line information and staying information regarding all persons.
- the information processing apparatus 10 displays information regarding all persons in the target area. Therefore, the user of the information processing apparatus 10 can comprehensively grasp the movement information and staying information of the person in the entire target area.
- the processing unit 104 may receive information on a person to be displayed from an input device of a user of the information processing apparatus 10.
- the user of the information processing apparatus 10 can grasp movement information and staying information of a predetermined person in the entire target area.
- the information processing apparatus 10 may display information on a person of an age included in a predetermined range.
- the display means on which the processing unit 104 displays an image is not particularly limited.
- the processing unit 104 may display an image on a display unit of the information processing apparatus 10 (not shown).
- the processing unit 104 may transmit image information to an external device (not shown).
- the display format of the processing unit 104 is not particularly limited.
- FIG. 2 is a diagram illustrating an example of the display of the processing unit 104.
- FIG. 2 assumes a store floor as an example of the target area. The gondolas (display stands) 500 shown in FIG. 2 display merchandise, and customers move between the gondolas 500 shown in FIG. 2.
- the information processing apparatus 10 receives a floor image from a camera (not shown) installed in the store. Each component of the information processing apparatus 10 operates as described above.
- the processing unit 104 calculates the flow line information based on the received traveling direction information of each person. Then, the processing unit 104 converts the calculated flow line information into a sequence of coordinate points on the image (floor map) of the target area. Then, as illustrated in FIG. 2, the processing unit 104 displays the flow lines of the persons as flow line information 301, flow line information 302, flow line information 303, and flow line information 304.
- the processing unit 104 calculates a stay location on the coordinates of the floor map based on the received stay location information of the person. Further, the processing unit 104 calculates a residence time from the received residence time information. Then, as illustrated in FIG. 2, the processing unit 104 displays stay information 305, stay information 306, stay information 307, and stay information 308 indicating the stay location and stay time. In FIG. 2, the values shown in the stay information 305 to 308 are stay times.
- the processing unit 104 of the present embodiment displays the stay information 305 to 308 as figures whose sizes are proportional to the stay times.
- the proportion does not have to be a mathematically exact proportion; it is sufficient that the processing unit 104 displays a figure whose size corresponds to the residence time. For example, the figure of the stay information 306, corresponding to a residence time of 1 s (1 second), is smaller than the figure of the stay information 305, corresponding to a residence time of 10 s (10 seconds). The ratio between the size of a figure and the residence time may deviate from the exact ratio in order to make the display easier to understand.
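- such a size-versus-time mapping might look like the following sketch; the clamped linear form and all constants are illustrative assumptions, consistent with the allowance above for deviating from an exact proportion:

```python
def marker_radius(stay_time_s, base=5.0, scale=2.0, max_radius=40.0):
    """Map a residence time in seconds to a marker radius in pixels.

    Roughly proportional, but clamped at max_radius so that very long
    stays do not overwhelm the floor map.
    """
    return min(base + scale * stay_time_s, max_radius)
```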
- the display of the processing unit 104 in the present embodiment is not limited to the display of FIG.
- the processing unit 104 may change the display color, character size, or line thickness based on the residence time.
- the processing unit 104 of the present embodiment displays the stay information 305 to 308, which indicates the stay places and stay times, in addition to the flow line information 301 to 304 based on the traveling direction information.
- This embodiment can produce an effect of appropriately displaying the result of behavior analysis in the entire target area.
- the reason is as follows. The person detection tracking unit 100 detects the positions of persons based on the analysis information and calculates the first person tracking information. Then, the same person detection unit 101 classifies the first person tracking information for each person. Then, the direction calculation unit 102 outputs the traveling direction information for each classified person, while the residence time calculation unit 103 outputs the residence location information and the residence time information. As a result, the processing unit 104 can display the stay information based on the residence location information and the residence time information, in addition to the flow line information based on the traveling direction information.
- the flow line information indicating the movement of the person and the staying information indicating the staying of the person are displayed. Therefore, an analyst using the information processing apparatus 10 according to the present embodiment can simultaneously grasp the stay position and the stay time in addition to the movement of the person. Therefore, the analyst can perform more appropriate analysis.
- the present embodiment can achieve an effect of appropriately displaying a detailed behavior analysis of a person in the target area.
- the reason is that the processing unit 104 displays the flow line information and the staying information of all persons or some persons.
- the information processing apparatus 10 described above is configured as follows.
- each component of the information processing apparatus 10 may be configured with a hardware circuit.
- the information processing apparatus 10 may be configured as a plurality of information processing apparatuses in which the respective constituent units are connected via a network or a bus.
- the information processing apparatus 10 may configure a plurality of components by a single piece of hardware.
- the information processing apparatus 10 may be realized as a computer device including a CPU (Central Processing Unit), a ROM (Read Only Memory), and a RAM (Random Access Memory).
- the information processing apparatus 10 may be realized as a computer apparatus that further includes an input/output connection circuit (IOC: Input/Output Circuit) and a network interface circuit (NIC: Network Interface Circuit).
- FIG. 3 is a block diagram illustrating an example of the configuration of the information processing apparatus 60 according to the modification.
- the information processing device 60 includes a CPU 610, a ROM 620, a RAM 630, an internal storage device 640, an IOC 650, and a NIC 680, and constitutes a computer.
- the CPU 610 reads a program from ROM 620.
- the CPU 610 controls the RAM 630, the internal storage device 640, the IOC 650, and the NIC 680 based on the read program.
- the computer including the CPU 610 controls these configurations, and implements the functions as the information processing apparatus 10 shown in FIG.
- each function refers to a function of the person detection tracking unit 100, the same person detection unit 101, the direction calculation unit 102, the residence time calculation unit 103, or the processing unit 104.
- the CPU 610 may use the RAM 630 or the internal storage device 640 as a temporary storage of a program when realizing each function.
- the CPU 610 may read the program included in the computer-readable storage medium 700 storing the program using a storage medium reading device (not shown). Alternatively, the CPU 610 may receive a program from an external device (not shown) via the NIC 680. Further, the CPU 610 may store the read program or the received program in the RAM 630 and operate based on the stored program.
- ROM 620 stores programs executed by CPU 610 and fixed data.
- the ROM 620 is, for example, a P-ROM (Programmable-ROM) or a flash ROM.
- the RAM 630 temporarily stores programs executed by the CPU 610 and data.
- the RAM 630 is, for example, a D-RAM (Dynamic-RAM).
- the internal storage device 640 stores data and programs that the information processing device 60 stores for a long time. Further, the internal storage device 640 may operate as a temporary storage device for the CPU 610.
- the internal storage device 640 is, for example, a hard disk device, a magneto-optical disk device, an SSD (Solid State Drive), or a disk array device.
- the ROM 620 and the internal storage device 640 are non-transitory storage media.
- the RAM 630 is a volatile storage medium.
- the CPU 610 can operate based on a program stored in the ROM 620, the internal storage device 640, or the RAM 630. That is, the CPU 610 can operate using a nonvolatile storage medium or a volatile storage medium.
- the IOC 650 mediates data between the CPU 610, the input device 660, and the display device 670.
- the IOC 650 is, for example, an IO interface card or a USB (Universal Serial Bus) card.
- the input device 660 is a device that receives an input instruction from an operator of the information processing apparatus 60.
- the input device 660 is, for example, a keyboard, a mouse, or a touch panel.
- the input device 660 may include a camera that outputs the floor video.
- the display device 670 is a device that displays information to the operator of the information processing apparatus 60.
- the display device 670 is a liquid crystal display, for example.
- the CPU 610 may display an image displayed by the processing unit 104 on the display device 670. In this case, the display device 670 may be included in the processing unit 104.
- the NIC 680 relays data exchange with an external device (not shown) via the network.
- the NIC 680 is, for example, a LAN (Local Area Network) card.
- the information processing apparatus 60 configured as described above can obtain the same effects as the information processing apparatus 10.
- the reason is that the CPU 610 of the information processing apparatus 60 can realize the same function as the information processing apparatus 10 based on the program.
- FIG. 4 is a block diagram illustrating an example of the configuration of the information processing apparatus 20 according to the second embodiment.
- the information processing apparatus 20 includes a person detection tracking unit 100, a same person detection unit 101, a direction calculation unit 102, a residence time calculation unit 103, a data accumulation unit 201, and a processing unit 202.
- since the person detection tracking unit 100, the same person detection unit 101, the direction calculation unit 102, and the residence time calculation unit 103 are the same as those in the first embodiment, detailed description thereof is omitted.
- the configuration and operation unique to the present embodiment will be mainly described.
- the data accumulation unit 201 receives the traveling direction information from the direction calculation unit 102, and receives the staying location information and the staying time information from the residence time calculation unit 103. Then, the data accumulation unit 201 accumulates the number of people who generated flow line data (traveling direction information) in the same traveling direction within the same section. Furthermore, the data accumulation unit 201 accumulates the number of people who stayed at the same staying place.
- the data accumulation unit 201 may hold in advance information on the sections in which the number of people in the target area is accumulated. Alternatively, the data accumulation unit 201 may set the sections for accumulation based on the staying location information.
- the data accumulation unit 201 outputs, to the processing unit 202, the cumulative number of pieces of flow line data in the same traveling direction in the same section and the cumulative number of people at the same staying place.
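- the accumulation can be sketched as counting distinct person IDs per (section, direction) pair and per staying place; the record shapes and the pre-quantized section/place keys (e.g. grid-cell IDs) are illustrative assumptions:

```python
from collections import defaultdict

def accumulate(flow_records, stay_records):
    """Count distinct people per (section, direction) and per stay place.

    flow_records: iterable of (person_id, section, direction).
    stay_records: iterable of (person_id, place).
    Returns two dicts of cumulative head counts, ready for display scaling.
    """
    flow_people = defaultdict(set)
    stay_people = defaultdict(set)
    for person, section, direction in flow_records:
        flow_people[(section, direction)].add(person)
    for person, place in stay_records:
        stay_people[place].add(person)
    return ({k: len(v) for k, v in flow_people.items()},
            {k: len(v) for k, v in stay_people.items()})
```

Using sets keeps each person counted once per key even if their track generates several records for the same section and direction.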
- the processing unit 202 receives the cumulative number of flow line data in the same traveling direction in the same section and the cumulative number of people in the same staying place. Then, the processing unit 202 displays the flow line information and the staying information on the image of the target area based on the accumulated number of the received flow line data and the accumulated number of people in the same staying place. That is, the processing unit 202 displays flow line information and staying information corresponding to the cumulative number of people.
- the data accumulation unit 201 may accumulate only either the cumulative number of pieces of flow line data in the same traveling direction in the same section or the cumulative number of people at the same staying place.
- the processing unit 202 may perform display based on the information accumulated by the data accumulation unit 201.
- the display format of the processing unit 202 is not particularly limited.
- FIG. 5 is a diagram illustrating an example of the display of the processing unit 202.
- FIG. 5 assumes a store floor, as in FIG. 2.
- the processing unit 202 displays an arrow having a thickness (width) proportional to the cumulative number of people as the flow line information.
- flow line information 401, flow line information 402, flow line information 403, and flow line information 404 are displayed.
- the processing unit 202 displays a figure having a size proportional to the cumulative number of people in the same staying place as the staying information display.
- stay information 405, stay information 406, stay information 407, and stay information 408 are displayed.
- the proportion need not be limited to a mathematically exact proportion.
- 10 persons move along the flow line indicated by the arrow of the flow line information 401 and stay at the position indicated by the stay information 405.
- one person moves along the flow line indicated by the arrow of the flow line information 402 and stays at the position indicated by the stay information 406.
- the flow line information 401 and the staying information 405 are information corresponding to 10 people.
- the flow line information 402 and the staying information 406 are information corresponding to one person.
- the flow line information 401 is represented by a thicker arrow than the flow line information 402.
- the stay information 405 is represented by a graphic larger than the stay information 406.
- the processing unit 202 may receive the traveling direction information, the staying location information, and the staying time information, and display the same information as the processing unit 104. That is, the processing unit 202 may include the function of the processing unit 104. For example, the processing unit 202 may display information corresponding to the flow line information 301 to 304 and the stay information 305 to 308, in addition to the flow line information 401 to 404 and the stay information 405 to 408 corresponding to the cumulative numbers of people.
- This embodiment can produce an effect of clarifying the number of persons related to the flow line information and the staying information in addition to the effect of the first embodiment.
- the reason is that the data accumulation unit 201 of the present embodiment calculates the cumulative number of pieces of flow line data in the same direction within the same section and the cumulative number of people at the same staying place, and the processing unit 202 displays the flow line information and the staying information corresponding to these cumulative numbers.
- 10 Information processing apparatus
- 20 Information processing apparatus
- 60 Information processing apparatus
- 100 Person detection tracking unit
- 101 Same person detection unit
- 102 Direction calculation unit
- 103 Residence time calculation unit
- 104 Processing unit
- 201 Data accumulation unit
- 202 Processing unit
- 301 to 304 Flow line information
- 305 to 308 Stay information
- 401 to 404 Flow line information
- 405 to 408 Stay information
- 500 Gondola
- 610 CPU
- 620 ROM
- 630 RAM
- 640 Internal storage device
- 650 IOC
- 660 Input device
- 670 Display device
- 680 NIC
- 700 Storage medium
Landscapes
- Business, Economics & Management (AREA)
- Engineering & Computer Science (AREA)
- Development Economics (AREA)
- Strategic Management (AREA)
- Finance (AREA)
- Accounting & Taxation (AREA)
- Entrepreneurship & Innovation (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Game Theory and Decision Science (AREA)
- Economics (AREA)
- Marketing (AREA)
- General Business, Economics & Management (AREA)
- Data Mining & Analysis (AREA)
- Multimedia (AREA)
- Image Analysis (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Abstract
Description
Claims (7)
- An information processing apparatus comprising: person detection tracking means for receiving analysis information that includes information related to positions of persons included in a target area, and calculating, based on the analysis information, first person tracking information that is time-series information of the positions of the persons; same person detection means for identifying the persons in the first person tracking information, classifying the first person tracking information by person, and calculating second person tracking information that is time-series position information for each person; direction calculation means for calculating traveling direction information of each person based on the second person tracking information; staying time calculation means for calculating staying place information and staying time information of each person based on the second person tracking information; and processing means for calculating flow line information and staying information of the persons based on the traveling direction information, the staying place information, and the staying time information, and displaying the flow line information and the staying information superimposed on an image of the target area.
- The information processing apparatus according to claim 1, wherein the processing means determines the size of a displayed figure representing the staying time based on the length of the staying time.
- The information processing apparatus according to claim 1 or 2, wherein the processing means displays the flow line information and the staying information of all persons or of some of the persons.
- The information processing apparatus according to any one of claims 1 to 3, further comprising data accumulation means for accumulating the number of persons who generate flow line data of the same direction in the same section in the traveling direction information, wherein the processing means determines the width of a displayed figure representing the flow line information based on the accumulated cumulative number of persons of the flow line data.
- The information processing apparatus according to claim 4, wherein the data accumulation means accumulates the number of persons occurring at the same staying place, and the processing means determines the size of a displayed figure representing the staying information based on the tracked cumulative number of persons at the same staying place.
- A data analysis method comprising: receiving analysis information that includes information related to positions of persons included in a target area, and calculating, based on the analysis information, first person tracking information that is time-series information of the positions of the persons; identifying the persons in the first person tracking information, classifying the first person tracking information by person, and calculating second person tracking information that is time-series position information for each person; calculating traveling direction information of each person based on the second person tracking information; calculating staying place information and staying time information of each person based on the second person tracking information; and calculating flow line information and staying information of the persons based on the traveling direction information, the staying place information, and the staying time information, and displaying the flow line information and the staying information superimposed on an image of the target area.
- A computer-readable recording medium containing a program that causes a computer to execute: a process of receiving analysis information that includes information related to positions of persons included in a target area, and calculating, based on the analysis information, first person tracking information that is time-series information of the positions of the persons; a process of identifying the persons in the first person tracking information, classifying the first person tracking information by person, and calculating second person tracking information that is time-series position information for each person; a process of calculating traveling direction information of each person based on the second person tracking information; a process of calculating staying place information and staying time information of each person based on the second person tracking information; and a process of calculating flow line information and staying information of the persons based on the traveling direction information, the staying place information, and the staying time information, and displaying the flow line information and the staying information superimposed on an image of the target area.
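Claims 2, 4, and 5 tie the rendered figure dimensions to measured quantities: marker size grows with staying time, and flow-line width grows with the cumulative number of persons per (section, direction). A minimal sketch of that scaling follows; the function names, scale factors, and caps are all assumptions for illustration.

```python
from collections import Counter

# Hedged sketch of the display scaling in claims 2, 4, and 5.
# The scale factors, cap, and helper names are illustrative assumptions.

def staying_marker_radius(staying_time_s, scale=2.0, max_radius=60.0):
    """Claim 2: figure size determined by the length of the staying time (capped)."""
    return min(staying_time_s * scale, max_radius)

def accumulate_flow_lines(flow_records):
    """Claim 4: accumulate the number of persons per (section, direction) key."""
    return Counter((section, direction) for section, direction in flow_records)

def flow_line_width(cumulative_count, base=1.0, per_person=0.5):
    """Claim 4: line width determined by the cumulative number of persons."""
    return base + per_person * cumulative_count
```

The same accumulation pattern extends to claim 5 by keying the counter on the staying place instead of the (section, direction) pair.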
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/119,460 US20170011410A1 (en) | 2014-02-25 | 2015-02-19 | Information-processing device, data analysis method, and recording medium |
JP2016505047A JP6319421B2 (ja) | 2014-02-25 | 2015-02-19 | 情報処理装置、データ分析方法、及び、プログラム |
US16/297,942 US20190205903A1 (en) | 2014-02-25 | 2019-03-11 | Information-processing device, data analysis method, and recording medium |
US16/297,969 US20190205904A1 (en) | 2014-02-25 | 2019-03-11 | Information-processing device, data analysis method, and recording medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014-034036 | 2014-02-25 | ||
JP2014034036 | 2014-02-25 |
Related Child Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/119,460 A-371-Of-International US20170011410A1 (en) | 2014-02-25 | 2015-02-19 | Information-processing device, data analysis method, and recording medium |
US16/297,942 Continuation US20190205903A1 (en) | 2014-02-25 | 2019-03-11 | Information-processing device, data analysis method, and recording medium |
US16/297,969 Continuation US20190205904A1 (en) | 2014-02-25 | 2019-03-11 | Information-processing device, data analysis method, and recording medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015129210A1 true WO2015129210A1 (ja) | 2015-09-03 |
Family
ID=54008547
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/000779 WO2015129210A1 (ja) | 2014-02-25 | 2015-02-19 | 情報処理装置、データ分析方法、及び、記録媒体 |
Country Status (3)
Country | Link |
---|---|
US (3) | US20170011410A1 (ja) |
JP (1) | JP6319421B2 (ja) |
WO (1) | WO2015129210A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017123024A (ja) * | 2016-01-06 | 2017-07-13 | パナソニックIpマネジメント株式会社 | 動線分析システム及び動線分析方法 |
WO2017170084A1 (ja) * | 2016-03-31 | 2017-10-05 | 日本電気株式会社 | 動線表示システム、動線表示方法およびプログラム記録媒体 |
WO2019162988A1 (ja) * | 2018-02-20 | 2019-08-29 | 株式会社ソシオネクスト | 表示制御装置、表示制御システム、表示制御方法、及びプログラム |
JP2020144746A (ja) * | 2019-03-08 | 2020-09-10 | 本田技研工業株式会社 | 情報分析装置及び情報分析方法 |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6399096B2 (ja) | 2014-09-11 | 2018-10-03 | 日本電気株式会社 | 情報処理装置、表示方法およびコンピュータプログラム |
US10510125B2 (en) | 2016-11-17 | 2019-12-17 | International Business Machines Corporation | Expense compliance checking based on trajectory detection |
KR102138553B1 (ko) | 2017-07-19 | 2020-07-28 | 미쓰비시덴키 가부시키가이샤 | 행동 가시화 장치 및 행동 가시화 방법 |
CN110766101B (zh) * | 2018-07-26 | 2023-10-20 | 杭州海康威视数字技术股份有限公司 | 确定移动轨迹的方法和装置 |
CN110969050A (zh) * | 2018-09-29 | 2020-04-07 | 上海小蚁科技有限公司 | 员工的工作状态检测方法、装置、存储介质、终端 |
CN111308463B (zh) * | 2020-01-20 | 2022-06-07 | 京东方科技集团股份有限公司 | 人体检测方法、装置、终端设备、存储介质及电子设备 |
CN116157833A (zh) * | 2020-09-23 | 2023-05-23 | Jvc建伍株式会社 | 图像处理装置及图像处理程序 |
CN112733814B (zh) * | 2021-03-30 | 2021-06-22 | 上海闪马智能科技有限公司 | 一种基于深度学习的行人徘徊滞留检测方法、系统及介质 |
CN114743345A (zh) * | 2022-03-22 | 2022-07-12 | 广东电力通信科技有限公司 | 一种基于电子地图的核心场所智能管控平台 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009146166A (ja) * | 2007-12-14 | 2009-07-02 | Hitachi Ltd | 作業標準化支援システムおよび作業標準化支援方法 |
JP2010055594A (ja) * | 2008-07-31 | 2010-03-11 | Nec Software Kyushu Ltd | 動線管理システムおよびプログラム |
JP2011103062A (ja) * | 2009-11-11 | 2011-05-26 | Kozo Keikaku Engineering Inc | 主要動線出力装置、主要動線出力方法およびプログラム |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007163391A (ja) * | 2005-12-16 | 2007-06-28 | Liti R & D:Kk | 動線表示装置 |
- 2015
  - 2015-02-19 WO PCT/JP2015/000779 patent/WO2015129210A1/ja active Application Filing
  - 2015-02-19 JP JP2016505047A patent/JP6319421B2/ja active Active
  - 2015-02-19 US US15/119,460 patent/US20170011410A1/en not_active Abandoned
- 2019
  - 2019-03-11 US US16/297,942 patent/US20190205903A1/en not_active Abandoned
  - 2019-03-11 US US16/297,969 patent/US20190205904A1/en not_active Abandoned
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009146166A (ja) * | 2007-12-14 | 2009-07-02 | Hitachi Ltd | 作業標準化支援システムおよび作業標準化支援方法 |
JP2010055594A (ja) * | 2008-07-31 | 2010-03-11 | Nec Software Kyushu Ltd | 動線管理システムおよびプログラム |
JP2011103062A (ja) * | 2009-11-11 | 2011-05-26 | Kozo Keikaku Engineering Inc | 主要動線出力装置、主要動線出力方法およびプログラム |
Non-Patent Citations (2)
Title |
---|
KOSUKE NAKANO: "Hito no Kodo' o Bunseki suru Ichi Joho Solution", GEKKAN JIDO NINSHIKI, vol. 23, no. 7, 10 June 2010 (2010-06-10), pages 19 - 23 * |
YASUHIRO KOTAKI: "RFID o Riyo shita, Ido Dosen no Mieruka Tool", GEKKAN JIDO NINSHIKI, vol. 26, no. 8, 10 July 2013 (2013-07-10), pages 36 - 41 * |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017123024A (ja) * | 2016-01-06 | 2017-07-13 | パナソニックIpマネジメント株式会社 | 動線分析システム及び動線分析方法 |
US11276210B2 (en) | 2016-03-31 | 2022-03-15 | Nec Corporation | Flow line display system, flow line display method, and program recording medium |
WO2017170084A1 (ja) * | 2016-03-31 | 2017-10-05 | 日本電気株式会社 | 動線表示システム、動線表示方法およびプログラム記録媒体 |
JPWO2017170084A1 (ja) * | 2016-03-31 | 2019-01-10 | 日本電気株式会社 | 動線表示システム、動線表示方法および動線表示プログラム |
US10740934B2 (en) | 2016-03-31 | 2020-08-11 | Nec Corporation | Flow line display system, flow line display method, and program recording medium |
JP7131587B2 (ja) | 2016-03-31 | 2022-09-06 | 日本電気株式会社 | 情報処理システム、情報処理装置、情報処理方法およびプログラム |
JP2021002840A (ja) * | 2016-03-31 | 2021-01-07 | 日本電気株式会社 | 情報処理システム、情報処理装置、情報処理方法およびプログラム |
WO2019162988A1 (ja) * | 2018-02-20 | 2019-08-29 | 株式会社ソシオネクスト | 表示制御装置、表示制御システム、表示制御方法、及びプログラム |
JPWO2019162988A1 (ja) * | 2018-02-20 | 2021-02-25 | 株式会社ソシオネクスト | 表示制御装置、表示制御システム、表示制御方法、及びプログラム |
US11321949B2 (en) | 2018-02-20 | 2022-05-03 | Socionext Inc. | Display control device, display control system, and display control method |
JP7147835B2 (ja) | 2018-02-20 | 2022-10-05 | 株式会社ソシオネクスト | 表示制御装置、表示制御システム、表示制御方法、及びプログラム |
JP2020144746A (ja) * | 2019-03-08 | 2020-09-10 | 本田技研工業株式会社 | 情報分析装置及び情報分析方法 |
JP7149206B2 (ja) | 2019-03-08 | 2022-10-06 | 本田技研工業株式会社 | 情報分析装置及び情報分析方法 |
Also Published As
Publication number | Publication date |
---|---|
US20170011410A1 (en) | 2017-01-12 |
JP6319421B2 (ja) | 2018-05-09 |
US20190205903A1 (en) | 2019-07-04 |
JPWO2015129210A1 (ja) | 2017-03-30 |
US20190205904A1 (en) | 2019-07-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6319421B2 (ja) | 情報処理装置、データ分析方法、及び、プログラム | |
US10402634B2 (en) | Information processing device, information processing method, and computer program product | |
JP6172380B2 (ja) | Pos端末装置、posシステム、商品認識方法及びプログラム | |
US10846537B2 (en) | Information processing device, determination device, notification system, information transmission method, and program | |
US10334965B2 (en) | Monitoring device, monitoring system, and monitoring method | |
US9767346B2 (en) | Detecting an attentive user for providing personalized content on a display | |
CA3160731A1 (en) | Interactive behavior recognizing method, device, computer equipment and storage medium | |
CA3014403A1 (en) | Tracking and/or analyzing facility-related activities | |
US9727791B2 (en) | Person detection system, method, and non-transitory computer readable medium | |
CA2958888A1 (en) | Reducing the search space for recognition of objects in an image based on wireless signals | |
JP6233624B2 (ja) | 情報処理システム、情報処理方法及びプログラム | |
US11657548B2 (en) | Information processing device, display method, and program storage medium for monitoring object movement | |
US20140140572A1 (en) | Parallel face detection and tracking system | |
US20190113349A1 (en) | Systems and methods for autonomous generation of maps | |
US11030540B2 (en) | User activity recognition through work surfaces using radio-frequency sensors | |
JP2021185533A (ja) | 動線判定装置、動線判定システム、動線判定方法及びプログラム | |
US10726378B2 (en) | Interaction analysis | |
CN108446693B (zh) | 待识别目标的标记方法、系统、设备及存储介质 | |
US20230230379A1 (en) | Safety compliance system and method | |
JP6741877B2 (ja) | タグ情報を用いた物品管理システム | |
JP2016224884A (ja) | 情報処理装置、及びプログラム | |
CN115457441A (zh) | 进行远程视频面签的风险识别方法 | |
CN103700163A (zh) | 确定对象的方法及三维传感器 | |
JP2009076001A (ja) | 行動把握方法、行動分析方法、情報処理装置、システムおよびプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15755381 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15119460 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 2016505047 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 15755381 Country of ref document: EP Kind code of ref document: A1 |