JP2011008571A - Passer-by fluidity data generating device, content distribution controlling device, passer-by fluidity data generating method, and content distribution controlling method - Google Patents

Passer-by fluidity data generating device, content distribution controlling device, passer-by fluidity data generating method, and content distribution controlling method

Info

Publication number
JP2011008571A
Authority
JP
Japan
Prior art keywords
flow
passer
passers
attribute
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2009151901A
Other languages
Japanese (ja)
Inventor
Toshihiro Mochizuki
Takehiro Sekine
敏弘 望月
剛宏 関根
Original Assignee
Shunkosha:Kk
株式会社春光社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shunkosha:Kk, 株式会社春光社 filed Critical Shunkosha:Kk
Priority to JP2009151901A priority Critical patent/JP2011008571A/en
Publication of JP2011008571A publication Critical patent/JP2011008571A/en
Pending legal-status Critical Current


Abstract

PROBLEM TO BE SOLVED: To bring out a sufficient advertising effect while coordinating a plurality of digital signage displays.
SOLUTION: A passer-by fluidity data generating device 100 provides an advertisement distribution controlling device 200 with "fluidity information", which is necessary for displaying advertisements appropriate to passers-by on a display device 300. The advertisement distribution controlling device 200 computes a "fluidity index" on the basis of the "fluidity information" and controls the distribution of advertisements to the display device 300 by using the "fluidity index". That is, the advertisement distribution controlling device 200 distributes more effective advertisements to the display device 300 on the basis of the "fluidity information", which is the result of the analysis performed by the passer-by fluidity data generating device 100, and consequently the advertising effect is enhanced.

Description

  The disclosed technology relates to a passer-by flow data generation device, a content distribution control device, a passer-by flow data generation method, and a content distribution control method.

  Conventionally, advertising media that display content using an information communication technology called digital signage are known. Digital signage has the following functions. Face recognition technology is applied to camera images to determine the sex and age group of passers-by who pass in front of the digital signage. The determination results are totaled for each time zone, and the display effect of the advertisement shown on the digital signage is measured in relation to the sex and age group of the passers-by targeted by the advertisement and the advertisement distribution schedule. Further, there is a known conventional technology that, based on the measurement results, enables advertisement display that can be expected to be more effective in accordance with the attributes of the target passers-by and the time zone at the digital signage installation location.

  The following conventional technique is also known. The digital signage, or an advertisement distribution apparatus communicably connected to the digital signage, detects the current position information or the current time of the digital signage. An advertisement is then selected by searching a database in which the display priority set for each advertisement changes according to the current position information or the current time. For the selected advertisement, content that produces a display effect according to the set display priority is generated and displayed on the display screen of the digital signage.

JP 2007-199382 A

NEC Corporation press release, "Japan's first all-in-one digital signage board, 'eye flavor', which uses face recognition technology to cover everything from advertisement distribution to effectiveness analysis", [online], December 16, 2008, NEC Corporation, [retrieved April 27, 2009], Internet <http://www.nec.co.jp/press/en/0812/1601.html>

  However, the digital signage of the above prior art switches the advertisements displayed on it based only on information about passers-by passing through its adjacent area, namely the passers-by's gender and age group, the advertisement distribution schedule, and the current position information or current time of the digital signage. For this reason, the advertising effect can be optimized only for the advertisement display on that single digital signage. In locations such as urban areas, where commercial facilities and transportation hubs are close together, the flows of passers-by between them are dense, and digital signage is installed at many locations; in such situations it was not possible to bring out a sufficient advertising effect while coordinating among the digital signage units.

  The disclosed technology has been made in view of the above, and an object thereof is to provide a passer-by flow data generation device, a content distribution control device, a passer-by flow data generation method, and a content distribution control method that can sufficiently bring out an advertising effect while coordinating a plurality of digital signage units.

  In one aspect of the disclosed passer-by flow data generation device, content distribution control device, passer-by flow data generation method, and content distribution control method, a passer-by flow data generation device is communicably connected to a content distribution control device that distributes content to be displayed on a display device. The passer-by flow data generation device detects, based on an image captured by an imaging device, a passer-by passing through a predetermined passage area and the face attributes of that passer-by, acquires the flow pattern of the detected passer-by, counts the number of passers-by whose flow patterns have been acquired for each face attribute and for each flow pattern, generates flow information including the counted number of passers-by and the directions of movement of the passers-by, and transmits the flow information to the content distribution control device.

  According to one aspect of the disclosed passer-by flow data generation device, content distribution control device, passer-by flow data generation method, and content distribution control method, even in a situation where the flows of passers-by are concentrated in a wide space and passer-by flow data generation devices and content display devices are installed at many locations, content can be displayed with a sufficient display effect while linking the passer-by flow data generation devices and the content display devices.

FIG. 1 is a block diagram illustrating the configuration of the passer-by flow data generating apparatus according to the first embodiment.
FIG. 2 is a diagram illustrating an example of the flow model storage table.
FIG. 3 is a diagram illustrating an example of flow data.
FIG. 4 is a diagram illustrating an example of a flow model.
FIG. 5 is a diagram showing an outline of the classification of a flow trajectory into a flow model.
FIG. 6 is a flowchart illustrating the passer-by detection processing procedure according to the first embodiment.
FIG. 7 is a flowchart showing the flow trajectory classification processing procedure.
FIG. 8 is a block diagram illustrating the configuration of the advertisement distribution control system according to the second embodiment.
FIG. 9 is a block diagram illustrating the configuration of the advertisement distribution control apparatus according to the second embodiment.
FIG. 10 is a diagram illustrating an example of the flow information table.
FIG. 11 is a diagram illustrating an example of the attribute time zone ratio table.
FIG. 12 is a diagram illustrating an example of the flow amount table.
FIG. 13 is a diagram illustrating an example of a flow index table.
FIG. 14 is a diagram illustrating an example of a physical distance table.
FIG. 15 is a diagram illustrating an example of the advertisement distribution contract information table.
FIG. 16 is a diagram illustrating an example of the advertisement data table.
FIG. 17 is a flowchart showing the flow information reception processing procedure.
FIG. 18 is a flowchart showing the procedure of the advertisement distribution schedule determination process (1).
FIG. 19 is a diagram showing the calculation result of the advertisement distribution score by the advertisement distribution schedule determination process (1).
FIG. 20 is a diagram showing the advertisement distribution schedule (1).
FIG. 21 is a flowchart showing the procedure of the advertisement distribution schedule determination process (2).
FIG. 22 is a diagram showing the calculation result of the advertisement distribution score by the advertisement distribution schedule determination process (2).
FIG. 23 is a diagram showing the advertisement distribution schedule (2).
FIG. 24 is a diagram (part 1) illustrating an installation example of the face recognition camera.
FIG. 25 is a diagram (part 2) illustrating an installation example of the face recognition camera.

  Hereinafter, embodiments of a passer-by flow data generation device, a content distribution control device, a passer-by flow data generation method, and a content distribution control method according to the disclosed technology will be described in detail with reference to the drawings. In the following embodiments, the content displayed on a display device such as a display or digital signage is described using an advertisement as an example, and the content is assumed to be image display only. However, the disclosed technology is not limited to advertisements and can be widely applied to content that conveys information. Further, the content is not limited to image display and may be audio output. The disclosed technology is not limited by the following embodiments.

[Summary of Example 1]
Hereinafter, Example 1 according to the disclosed technology will be described with reference to FIGS. 1 to 7. Example 1 relates to a passer-by flow data generation device that acquires attributes such as the gender and age group of passers-by who pass through an area that is open to the public and freely accessible, such as a station concourse, and generates "flow data", which is the result of totaling the number of passers-by for each attribute and for each time.

  The process of acquiring attributes such as the sex and age group of passers-by passing through the "person count setting area" is referred to as "sensing". That is, the passer-by flow data generating apparatus according to the first embodiment is an apparatus that performs sensing to generate "flow data".

  As will be described later, the passer-by flow data generating apparatus according to the first embodiment is a terminal device that performs "sensing" only at its own base, based on the images of a face recognition camera and a person count camera installed at that single base, and generates "flow data". However, it is not limited to this, and it may be a server device that performs "sensing" at a plurality of bases based on the images of face recognition cameras and person count cameras installed at those bases and generates "flow data". A "base" refers to a facility in which a person count setting area and/or a display device for displaying advertisements is installed.

[Configuration of Passer Flow Data Generation Device According to Embodiment 1]
FIG. 1 is a block diagram illustrating the configuration of the passer-by flow data generating apparatus according to the first embodiment. The passer-by flow data generation apparatus 100 according to the first embodiment is connected to the advertisement distribution control apparatus 200 through a secure information communication network. The passer-by flow data generating device 100 generates "flow data" of passers-by, described later, and transmits it to the advertisement distribution control device 200. The advertisement distribution control device 200 is connected, via a secure information communication network, to a display device 300 that displays advertisements. The display device 300 is, for example, a digital signage display such as a liquid crystal display capable of displaying images based on image data. The advertisement distribution control device 200 controls the display of advertisements on the display device 300, for example the display order and display time of the advertisements, based on the "flow data" received in real time from the passer-by flow data generation device 100.

  The "passer-by" here refers to a general person who passes through a predetermined passage area that can be passed through freely, such as a station concourse or a public area such as a plaza. In the disclosed technology including the embodiments, an example in which the predetermined passage area is a station concourse is shown. Further, "passer-by flow" refers to the number of persons moving, broken down by gender, age group, time, and flow direction. Here, the "flow direction" refers to the direction in which the "passer-by" moves, and specifically refers to the base toward which the "passer-by" heads from the base at which the passer-by is currently located. A "base" is a concept including a place for getting on and off public transportation such as a station, a commercial facility, a business facility, and the like. The passer-by flow data generating device 100 is a device that generates "flow data" based on the "passer-by flow" and transmits it to the advertisement distribution control device 200.

  As shown in FIG. 1, the passer-by flow data generating apparatus 100 according to the first embodiment includes a passer-by detection unit 101, a face attribute detection unit 102, a flow trajectory classification processing unit 103, a flow model storage unit 103a, a passer-by counting unit 104, a flow data generation unit 105, and a flow data transmission unit 106.

  A face recognition camera 150a is connected to the passer-by detection unit 101 via a predetermined interface (for example, IEEE (The Institute of Electrical and Electronics Engineers) 1394). The flow trajectory classification processing unit 103 is connected to a person count camera 150b via a predetermined interface (for example, USB (Universal Serial Bus) or the like).

  Although FIG. 1 shows one face recognition camera 150a and one person count camera 150b connected to one passer-by flow data generation device 100, the configuration is not limited to this, and a plurality of face recognition cameras 150a and/or a plurality of person count cameras 150b may be connected.

  The passer-by flow data generation device 100 is disposed at each of a plurality of bases. Each passer-by flow data generation device 100 is connected to the advertisement distribution control device 200 via a secure information communication network. The advertisement distribution control device 200 controls the distribution of advertisements to one or a plurality of display devices 300 using the flow data received from the passer-by flow data generation devices 100.

  The passer-by detection unit 101 detects whether or not a face image of a passer-by is included in the moving image data captured by the face recognition camera 150a, which is arranged at a position from which passers-by passing through the person count setting area of one or a plurality of bases can be imaged. For example, when a moving object detected from the frame difference of the moving image data is recognized as a person's face, the moving object is determined to be a passer-by's face and the person is detected as a passer-by.

  The face attribute detection unit 102 acquires face attributes from the passer-by's face detected by the passer-by detection unit 101. A well-known technique is used for acquiring the face attributes. The face attributes are the passer-by's gender, age group, and other characteristics that can be determined from facial features. For example, a facial feature is compared with feature patterns prepared in advance for each sex and age group, and the face attributes are acquired based on the comparison result. The age group is an index that distinguishes the passer-by's age, for example teens, twenties, and so on. The face attributes may be detected based on appearance and do not necessarily match the actual sex and age group of the passer-by.

  The flow trajectory classification processing unit 103 tracks the movement trajectory (hereinafter referred to as a "flow trajectory") of a moving object determined to be a passer-by in the moving image data captured by the person count camera 150b. The flow trajectory classification processing unit 103 then counts the number of people who walk along a predetermined straight line (hereinafter referred to as a flow model) and the number of passers-by who have stayed at a certain point for a certain period of time. In addition, the flow trajectory classification processing unit 103 counts the number of people who have passed through a rectangular area fixed in advance in the moving image data captured by the person count camera 150b (the number of people who entered and exited the rectangular area).

  The "flow trajectory" of each passer-by is the walking trajectory along which the corresponding passer-by moved from the start to the end of tracking. Since the walking trajectory is not necessarily a straight line and varies from passer-by to passer-by, it is impractical to store the walking trajectory of each passer-by as it is. Therefore, a plurality of vector patterns determined by a start point and an end point are prepared in advance, and the flow trajectory classification processing unit 103 regresses the walking trajectory to one of the vector patterns using a regression analysis technique such as the least squares method.

  Then, the flow trajectory classification processing unit 103 adds and averages the luminance of the moving image data periodically acquired from the person count camera 150b and uses the result as a background image. The background image is normally built from several hundred frames. Subsequent updates are performed by deleting the oldest frame and adding the latest frame in a first-in first-out manner. When an image of a given frame is input, the luminance difference from the updated background image is calculated, and object region candidates are extracted.

  Then, the flow trajectory classification processing unit 103 performs binarization and labeling processing on the difference image to extract object regions. In the labeling processing, an index n (n is a natural number) is assigned to each difference image. In the processing so far, when the area of a rectangular region is smaller than a certain value, the corresponding rectangular region is regarded as noise and excluded. The flow trajectory classification processing unit 103 then obtains the center of gravity of each extracted object region. When a plurality of object regions are obtained in each frame, and the change in the center of gravity of an object region between frames is equal to or less than a predetermined threshold and the change in feature quantities such as the color information of the object region is within a predetermined range, these regions are determined to be the same object across frames, and tracking continues.
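
  As a rough illustration of the background-difference, labeling, and centroid-extraction steps described above, the following Python sketch uses OpenCV. The number of background frames, the thresholds, and the function names are illustrative assumptions, not values taken from this document.

```python
# Sketch of background averaging, difference, labeling, and centroid extraction.
from collections import deque

import cv2
import numpy as np

BG_FRAMES = 300      # "several hundred frames", kept first-in first-out (assumed value)
DIFF_THRESH = 30     # luminance-difference threshold (assumed)
MIN_AREA = 400       # regions smaller than this are treated as noise (assumed)

bg_buffer = deque(maxlen=BG_FRAMES)

def update_background(gray_frame):
    """Add the newest frame, drop the oldest, and return the averaged background."""
    bg_buffer.append(gray_frame.astype(np.float32))
    return (sum(bg_buffer) / len(bg_buffer)).astype(np.uint8)

def extract_centroids(gray_frame, background):
    """Return centroids of object regions found by background difference."""
    diff = cv2.absdiff(gray_frame, background)
    _, binary = cv2.threshold(diff, DIFF_THRESH, 255, cv2.THRESH_BINARY)
    num_labels, _, stats, centroids = cv2.connectedComponentsWithStats(binary)
    # Label 0 is the background; keep only regions above the noise threshold.
    return [tuple(centroids[i]) for i in range(1, num_labels)
            if stats[i, cv2.CC_STAT_AREA] >= MIN_AREA]
```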

  The tracking information of the movement trajectory is totaled by attaching a time stamp to each tracked coordinate. If a large number of passers-by are tracked at short frame intervals, the total amount of data increases, resulting in an increased processing load and pressure on storage resources. To avoid this, instead of using all of the coordinates tracked in every frame, the movement vector between tracked coordinates is obtained for each frame, and only the tracked coordinates at which the angle formed by the movement vectors of adjacent frames changes by a predetermined angle or more are retained. In this way, the tracking processing load of the flow trajectory classification processing unit 103 can be reduced.
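
  The coordinate thinning described above can be sketched in Python as follows; the turning-angle threshold and the function name are assumptions for illustration.

```python
# Keep a tracked point only when the direction of motion turns by at least
# ANGLE_THRESH_DEG degrees between adjacent movement vectors.
import math

ANGLE_THRESH_DEG = 20.0  # assumed turning-angle threshold

def thin_trajectory(points):
    """points: list of (x, y, timestamp); returns a reduced list of key points."""
    if len(points) <= 2:
        return list(points)
    kept = [points[0]]
    for prev, cur, nxt in zip(points, points[1:], points[2:]):
        v1 = (cur[0] - prev[0], cur[1] - prev[1])
        v2 = (nxt[0] - cur[0], nxt[1] - cur[1])
        turn = abs(math.degrees(math.atan2(v2[1], v2[0]) - math.atan2(v1[1], v1[0])))
        turn = min(turn, 360.0 - turn)  # wrap to the range [0, 180]
        if turn >= ANGLE_THRESH_DEG:
            kept.append(cur)
    kept.append(points[-1])
    return kept
```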

  The flow trajectory classification processing unit 103 compares the tracked flow trajectory with the flow models, each represented as a vector, stored in the flow model storage unit 103a, and classifies the flow trajectory into the flow model to which it can be regressed by statistical processing such as the least squares method.

  The flow models stored in the flow model storage unit 103a are held in the flow model storage table shown in FIG. 2. A flow model simplifies the flow trajectories of passers-by, which show complicated movements, by replacing them with several patterns. As shown in FIG. 2, each flow model is a vector that can be identified by the base where the passer-by flow data generation apparatus 100 is installed and the base where the display device 300 is installed.

  Since a flow model is a vector, it indicates the flow direction of a passer-by. The flow direction of the passer-by indicated by the flow model is called the "flow direction". The "flow direction" indicates a base where a display device 300 is installed, and the corresponding base is associated with the flow model and stored in the flow model storage table. The flow model is thus a flow direction index indicating the base toward which a passer-by flows.

  For example, the flow models shown in FIG. 2 correspond to vectors set in the person count setting area of the concourse at station A shown in FIG. 4. The person count setting area corresponds to the camera view of the face recognition camera 150a. In the person count setting area, station A is set as the base that is the starting point of the flow models, and "B facility direction", "C facility direction", and "D facility direction" are set as the flow directions that are the end points.

  As shown in FIG. 2, for example, in the flow model "1", "A station" is the base that is the starting point of the flow model, and "B facility direction" is the "flow direction" of the flow model. That is, a passer-by who passes through the person count setting area while drawing a flow trajectory classified as flow model "1" is assumed to pass through the concourse of station A and head for "B facility". Therefore, when there are passers-by in the person count setting area whose flow trajectories are classified as flow model "1", the advertisement distribution control apparatus 200 delivers an advertisement corresponding to the face attributes of those passers-by to the display device 300 installed in "B facility".

  The process of classifying a flow trajectory by regressing it to a flow model is performed as follows. As shown in FIG. 5, flow models, each a straight line connecting two arbitrary points in the image (a flow model start point and a flow model end point), are prepared in advance. A threshold value Th1 for determining that a tracked flow trajectory is close to a flow model is also set in advance. The distance between each coordinate in the flow trajectory (the X marks in FIG. 5) and the flow model is obtained, and when the average distance is equal to or less than the threshold Th1, the passer-by is determined to have walked along that flow model. A plurality of types of flow models are prepared, and by connecting the end point of one flow model to the start point of another, a complicated flow model path can be set.
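
  A minimal Python sketch of this classification rule follows, assuming that trajectories and flow models are given as pixel coordinates and that a trajectory is assigned to the closest flow model whose average point-to-line distance is at most Th1; the Th1 value, data layout, and function names are illustrative assumptions.

```python
# Assign a tracked trajectory to the nearest flow model (a line segment).
import math

TH1 = 40.0  # assumed distance threshold in pixels

def point_to_segment_distance(p, a, b):
    """Distance from point p to the segment a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def classify_trajectory(trajectory, flow_models):
    """trajectory: list of (x, y); flow_models: dict {model_id: (start, end)}.
    Returns the best matching model_id, or None if no average distance <= TH1."""
    if not trajectory:
        return None
    best_id, best_dist = None, float("inf")
    for model_id, (start, end) in flow_models.items():
        avg = sum(point_to_segment_distance(p, start, end)
                  for p in trajectory) / len(trajectory)
        if avg <= TH1 and avg < best_dist:
            best_id, best_dist = model_id, avg
    return best_id
```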

  In this way, it is possible to predict the other base to which a passer-by "sensed" at one of a number of mutually distant bases will go, and to present that passer-by, at the other base, with an advertisement that exerts a greater advertising effect. That is, the passer-by flow data generation devices 100 and the display devices 300 installed at bases dispersed over a wide area can be made to cooperate, and advertisements that exert an advertising effect on passers-by from a macro viewpoint can be presented.

The flow trajectory classification processing unit 103 also detects the walking speed of a passer-by whose flow trajectory has been classified into a flow model. In the detection process, each tracked coordinate and its time are expressed as (x_n, y_n, t_n) (n is the index of the difference image), and the moving speed V_n between tracked coordinates is obtained from this data based on the following equation.
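
The equation itself is not reproduced in this text. A plausible reconstruction from the definitions above, treating V_n as the Euclidean distance between successive tracked coordinates divided by the elapsed time (the numbering follows the later reference to equations (1) and (2)), is:

V_n = \sqrt{(x_{n+1} - x_n)^2 + (y_{n+1} - y_n)^2} / (t_{n+1} - t_n)    (1)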

Then, when the tracking of the passer-by is completed, the average moving speed V_ave is obtained as follows. Note that Σ appearing in the following expression represents the sum of V_n over all indexes n.
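
The expression is likewise not reproduced here; assuming V_ave is the arithmetic mean of the per-segment speeds over the N tracked segments, a plausible reconstruction is:

V_ave = (1 / N) \sum_n V_n    (2)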

The flow trajectory classification processing unit 103 determines whether or not the moving speed V_ave of the passer-by is equal to or less than a predetermined threshold value Th2. The passer-by counting unit 104 then counts only passers-by whose moving speed V_ave has been determined by the flow trajectory classification processing unit 103 to be equal to or less than that threshold.

  In general, the person count camera 150b, which captures passers-by from a position looking down on them, can count passers-by over a wider range than the range in which the face recognition camera 150a can recognize face attributes. Within the passer count setting area shown in FIG. 4, almost all passers-by who pass through the area (the total number of passers-by) can be counted regardless of their walking direction or the direction of their faces. Therefore, by calculating (the number of passers-by whose faces are recognized) / (the total number of passers-by), a gaze rate indicating the ratio of passers-by who looked at the advertisement in the corresponding passer count setting area can be calculated. The gaze rate may be calculated by the flow index calculation unit 205 of the advertisement distribution control device 200.

  The passer-by counting unit 104 counts the number of passers-by for each attribute (gender and age group), based on the face attributes detected by the face attribute detection unit 102, for each flow model (flow direction) classified by the flow trajectory classification processing unit 103, and for each time. In addition, the passer-by counting unit 104 counts the total number of passers-by who have passed through the person count setting area (hereinafter referred to as the "total number of passers-by"), regardless of whether a face attribute was detected. The flow data generation unit 105 generates "flow data" based on these counts, and the flow data transmission unit 106 transmits the "flow data" to the advertisement distribution control apparatus 200 each time it is generated.

  As shown in the flow data table of FIG. 3, the "flow data" is the result of totaling the number of passers-by at the base where they were sensed, for each flow direction estimated from the flow models, for each gender, for each age group, and for each time. In addition, the total number of passers-by per time and the average moving speed of the passers-by are included. Each numerical value in the flow data table is a value counted by the passer-by counting unit 104.

  As another example, the passer-by flow data generation device 100 may include a flow data storage unit that accumulates and stores the flow data of passers-by and a passer-by flow information generation unit that aggregates the flow data for each flow model, and the advertisement distribution control device 200 may include a content selection output control unit that selects and outputs content to be output to the display device 300 according to the flow information generated by the passer-by flow information generation unit. In this case, the advertisement is distributed to the display device 300 based on the flow information accumulated and stored in the flow data storage unit.

[Passer detection processing]
FIG. 6 is a flowchart illustrating a passer-by detection process procedure according to the first embodiment. In step S101, the passer-by detection unit 101 determines whether or not a passer-by is detected based on the captured image of the face recognition camera 150a. When it is determined that a passerby has been detected (Yes at Step S101), the process proceeds to Step S102, and when it is not determined that a passerby has been detected (No at Step S101), Step S101 is repeated.

  Subsequently, in step S102, the face attribute detection unit 102 detects the face attributes of the detected passer-by based on the image captured by the face recognition camera 150a. Here, it is assumed that the face attributes are sex and age group: the sex indicates whether the passer-by is male or female, and the age group indicates the age bracket, such as teens or twenties.

  Subsequently, in step S103, the flow trajectory classification processing unit 103 and the passer-by counting unit 104 perform the flow trajectory classification process. The flow trajectory classification process tracks the flow trajectory of a passer-by based on the captured image of the person count camera 150b and fits it to one of the flow models. That is, the flow trajectory classification processing unit 103 executes the flow trajectory classification process shown in FIG. 7, the details of which will be described later.

  Subsequently, in step S104, the passer-by counter 104 calculates an average moving speed of all passers-by. That is, the average of the moving speeds of all passers-by calculated by the flow trajectory classification processing unit 103 is calculated.

  Subsequently, in step S105, the flow data generation unit 105 performs flow data generation processing. The flow data generation process is a process of generating “flow data” and transmitting it to the advertisement distribution control apparatus 200.

  Subsequently, in step S106, the passer-by detection unit 101 determines whether another passer-by is detected based on the captured image of the face recognition camera 150a. When it is determined that another passerby has been detected (Yes at Step S106), the process proceeds to Step S102, and when it is not determined that another passerby has been detected (No at Step S106), the passerby detection process ends.

[Flow trajectory classification process]
FIG. 7 is a flowchart showing a flow trajectory classification processing procedure related to the flow trajectory classification processing. In the flow trajectory classification process, the moving speed is equal to or less than a predetermined threshold among the passers-by whose face attributes are detected based on the captured image of the face recognition camera 150a and the total number of passers-by counted based on the captured image of the person count camera 150b. Only passers-by are counted. This is because a passerby who quickly passes the number of people setting area so that the moving speed exceeds a predetermined threshold is considered not to pay attention to the advertisement displayed on the display device 300, and thus a passerby whose face attribute is detected. This is because it may be possible to improve the accuracy of “flow data” by removing from the number of people.

  First, in step S111, the flow trajectory classification processing unit 103 performs tracking of a flow trajectory of a passerby detected by the face recognition camera 150a and / or the person count camera 150b. Subsequently, in step S112, the flow trajectory classification processing unit 103 acquires coordinate data of the flow trajectory of the tracked passerby.

  Subsequently, in step S113, the flow trajectory classification processing unit 103 acquires the time data corresponding to each coordinate of the tracked passer-by's flow trajectory. Subsequently, in step S114, the flow trajectory classification processing unit 103 calculates the distance between each flow model and each coordinate of the flow trajectory of the passer-by detected by the face recognition camera 150a and/or the person count camera 150b.

  Subsequently, in step S115, the flow trajectory classification processing unit 103 determines whether or not the distance calculated in step S114 is equal to or less than the threshold Th1, that is, whether or not the flow trajectory of the passer-by can be regressed to one of the flow models. When the distance calculated in step S114 is determined to be equal to or less than the threshold Th1 (Yes at step S115), the process proceeds to step S116; when it is not (No at step S115), the process proceeds to step S120. Th1 is a positive number, for example a threshold value satisfying Th1 < δ for some predetermined positive number δ.

  In step S116, the flow trajectory classification processing unit 103 calculates the moving speed of the tracked passer-by based on the above equations (1) and (2). Subsequently, in step S117, the flow trajectory classification processing unit 103 determines whether or not the moving speed calculated in step S116 is equal to or less than a threshold value Th2. When the moving speed is determined to be equal to or less than the threshold Th2 (Yes at step S117), the process proceeds to step S118; when it is not (No at step S117), the process proceeds to step S120. Th2 is a positive number; for example, 4 to 5 [km/h] is preferable.

  In step S118, the passer-by counter 104 counts up the number of passers-by classified into the corresponding flow model. Subsequently, in step S119, the flow trajectory classification processing unit 103 temporarily stores the movement speed of the corresponding passer-by counted up in step S118 in a predetermined storage area. Subsequently, in step S120, the flow trajectory classification processing unit 103 determines whether there is another passerby that can be tracked. When it is determined that there are other trackable passers-by (Yes at Step S120), the process proceeds to Step S111. When it is not determined that there are other trackable passers-by (No at Step S120), the flow trajectory classification process ends. Then, the process returns to step S104 of the passer-by detection process shown in FIG.

  According to the first embodiment described above, the passer-by flow data generation device 100 can provide the advertisement distribution control device 200 with flow information that can serve as a basis for control enabling effective advertisements to be distributed to the display devices 300 installed at each base. In particular, since passer-by flow data generation devices 100 and/or display devices 300 can be installed at a plurality of bases so that flow information is collected at each base and provided to the advertisement distribution control device 200, effective advertisement distribution can be performed by linking the passer-by flow data generation devices 100 and the display devices 300 arranged over a wide area.

  In the first embodiment, the passer-by flow data generating apparatus 100 itself has been described. In the second embodiment, a case is shown as an example in which a plurality of passer-by flow data generation devices 100 and display devices 300 are arranged at a plurality of bases to form a network system including the advertisement distribution control device 200. In the second embodiment, this network system is called a content output control system.

[Outline of advertising output control system]
FIG. 8 is a block diagram illustrating an outline of an example of the advertisement output control system. As shown in the figure, there are "A station", "B facility", "C facility", and "D facility" as bases. A base refers to a place where passers-by flow freely and frequently, such as a transportation facility, a commercial facility, or a business facility.

  In the example of FIG. 8, it is assumed that the passer-by flow data generation device 100a and the display device 300a are arranged at “A station”. Similarly, it is assumed that a passer-by flow data generating device 100b and a display device 300b are arranged in “B facility”. Similarly, it is assumed that the passer-by flow data generation device 100c is arranged in the “C facility”. Similarly, it is assumed that a passer-by flow data generation device 100d is arranged in “D facility”. Although not illustrated in FIG. 8, only the display device 300 may be installed at the base.

  The advertisement distribution control device 200, the passer-by flow data generation devices 100a to 100d, and the display devices 300a and 300b are each connected via a secure information communication network N. Specifically, the flow information generated by each passer-by flow data generation device 100x (x = a to d; the same applies hereinafter) is transmitted to the advertisement distribution control device 200.

  The advertisement distribution control device 200 processes the flow information received from each passer-by flow data generation device 100x to calculate various indexes, and distributes advertisements that increase the advertising effect to each display device 300y (y = a or b; the same applies hereinafter) based on the calculated indexes. The various indexes on which the distribution of advertisements to a display device 300y is based are calculated from the flow information transmitted from the passer-by flow data generation device 100x that accompanies that display device 300y or that is arranged at an adjacent base.

  In FIG. 8, the adjacent bases are “B facility”, “C facility”, and “D facility” for “A station”.

[Configuration of advertisement output control device]
FIG. 9 is a block diagram illustrating the configuration of the advertisement distribution control apparatus according to the second embodiment. As shown in the figure, the advertisement distribution control apparatus 200 according to the second embodiment includes a flow data receiving unit 201, a flow data temporary storage unit 201a, a flow information generation unit 202, a flow information storage unit 202a, an attribute time zone ratio calculation unit 203, a flow amount calculation unit 204, a flow index calculation unit 205, an advertisement distribution schedule determination unit 206, an advertisement distribution schedule storage unit 206b, an advertisement distribution control unit 207, and a distribution advertisement data storage unit 208.

  The flow data receiving unit 201 receives flow data from the passer-by flow data generation devices 100x and stores it in the flow data temporary storage unit 201a. The flow data here is the counted flow data shown in FIG. 3 of the first embodiment, or statistical flow data used as an alternative.

  If a passer-by's "flow data" cannot be obtained by the passer-by counting unit 104 of the passer-by flow data generation device 100, for example when the base lies between stations, "flow information" cannot be generated; in that case, statistical flow data (hereinafter referred to as "statistical flow information") may be used instead. In this case, the flow information generation unit 202 refers to the "statistical flow information". Note that "statistical flow information" is often not broken down by attribute but is a total value for each time zone. In that case, the ratios for each attribute in other "flow information" may be used to apportion the total value.

  The flow information generation unit 202 aggregates the flow data stored in the flow data temporary storage unit 201a in real time for each flow model (base and flow direction), for each attribute, and for each time zone (for example, time zones in units of one hour) to generate "flow information" (see FIG. 10). Alternatively, the "flow information" may be generated by batch processing at a cycle such as once a day or once a week. The flow information generation unit 202 stores the generated flow information in the flow information storage unit 202a.

  The attribute time zone ratio calculation unit 203 uses the flow information stored in the flow information storage unit 202a to obtain the number of passers-by in each time zone for each gender and age group attribute, and calculates the attribute time zone ratio by dividing it by the total number of face-recognized passers-by of that attribute over all time zones (see FIG. 11). That is, the attribute time zone ratio is the ratio of the number of people in each time zone to the total number of people over all time zones within the same attribute.

  The attribute time zone ratios calculated by the attribute time zone ratio calculation unit 203 are as shown in FIG. 11. Each numerical value stored in the attribute time zone ratio table shown in FIG. 11 is obtained by aggregating the flow data shown in FIG. 3. For example, in the time zone from 9:00 to 10:00 there are "38" women in their 20s heading from "A station" toward "B facility"; when this is divided by the total number of women in their 20s heading from "A station" toward "B facility" over all time zones, an attribute time zone ratio of "0.12 (12%)" is obtained.
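
  A minimal Python sketch of this per-attribute normalization follows; the table layout is a simplification, and the attribute total is an assumed value chosen so that the worked example above yields roughly 0.12.

```python
# Attribute time zone ratio: each time zone's count divided by the attribute's
# total over all time zones, for one (base, flow direction, attribute).
def attribute_time_zone_ratios(counts_by_time_zone):
    """counts_by_time_zone: dict {time_zone: passer-by count} for one attribute.
    Returns {time_zone: ratio}."""
    total = sum(counts_by_time_zone.values())
    if total == 0:
        return {tz: 0.0 for tz in counts_by_time_zone}
    return {tz: n / total for tz, n in counts_by_time_zone.items()}

# In the spirit of the worked example: 38 women in their 20s between 9:00 and
# 10:00 out of an assumed total of 317 over all time zones gives about 0.12.
example = {"09:00-10:00": 38, "all other time zones": 279}
print(round(attribute_time_zone_ratios(example)["09:00-10:00"], 2))  # 0.12
```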

  By looking at how the attribute time zone ratio changes from time zone to time zone, it is possible to know in which time zones there are many passers-by of a given attribute heading from one base toward another base. That is, it can be seen which attribute layer an advertisement distributed to the display device 300y arranged at the corresponding base or at the other base should target, and in which time zone. Hereinafter, the attribute time zone ratio is denoted "A". The attribute time zone ratio A takes a different value for each time zone and for each attribute.

  The flow amount calculation unit 204 calculates a "flow amount" for each time zone and for each attribute. The "flow amount B" is an index indicating the thickness of a flow, and is defined by the following equation.
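
  The equation is not reproduced in this text; from the explanation in the next paragraph (the flow information value R divided by the physical distance D between bases), a plausible reconstruction is the following, where the number (3) is assumed so as to fall between equations (1), (2) and the flow index formula (4) referenced later:

B = R / D    (3)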

  Here, "R" is the value of the "flow information" for each attribute and each time zone, and "D" is the physical distance between bases. By dividing "R" by "D", the "flow amount B", that is, the "thickness of the flow", becomes larger as the physical distance between the passer-by flow data generation device 100x and the display device 300y becomes shorter, and becomes smaller as the physical distance becomes larger. That is, the shorter the physical distance, the more weight is given to the flow of passers-by from, for example, the base where the passer-by flow data generating device 100x is arranged to the base where the display device 300y is arranged, and the greater the significance of displaying an appropriate advertisement on that display device 300y.

  The physical distance "D" between bases is as shown in the physical distance table of FIG. 14. The "base" in FIG. 14 corresponds to the "base" in the flow information table of FIG. 10, and the "flow direction" in FIG. 14 corresponds to the "flow direction" in FIG. 10. That is, based on the information on "base" and "flow direction" included in the flow information, "D" is acquired by referring to the physical distance table in FIG. 14.

  The flow index calculation unit 205 calculates the flow index S based on the following equation.
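
  The equation is not reproduced in this text; from the next paragraph, which describes the flow index as the product of the attribute time zone ratio A, the flow amount B, and the option coefficient C, the formula referred to below as calculation formula (4) can be reconstructed as:

S = A × B × C    (4)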

  That is, the "flow index S" is obtained for each time zone and each attribute as the product of the "attribute time zone ratio A", the "flow amount B", and the "option coefficient C". The flow information acquired at a certain base is counted for each time zone and for each attribute, and the ratio of each time zone and attribute to the total number of people is obtained. For example, as shown in FIG. 11, when the ratio of the number of women in their 20s in the time zone from 14:00 to 15:00 to the number in all time zones is 0.16, this ratio is substituted for A in the calculation formula (4) for the flow index S. When there is no flow information at the base, a predetermined constant is substituted for A. The predetermined constant may be, for example, 1/24 ≈ 0.04 ("24" being the number of time zones in a day) or 1/13 ≈ 0.077 ("13" being the assumed number of time zones, between 9:00 and 20:00, in which people are active outdoors, that is, pass through the person count setting area).

  The "average flow amount R" is the number of pedestrians matching the flow line model described above; for example, the number of pedestrians passing through the point per hour is adopted. When a passer-by flow data generating device 100x is installed, a logical network is formed with the bases where the other passer-by flow data generating devices 100x or display devices 300y are installed, and the "physical distance D" is set when this logical network is configured. The passer-by flow data generating devices 100x or display devices 300y may also be installed at different bases; for example, when they are installed at "A station" and "B facility", a predetermined value such as the average number of passengers at those stations (for example, station census data) is adopted as the "average flow amount R".

  Whether or not to apply an optional condition such as climate to the "flow index S" depends on the intention of the client who provides the advertisement. When it is applied, a coefficient is set according to the climatic conditions (for example, a weather forecast provided by a third party), such as "option coefficient C = 0.6" when clear, "option coefficient C = 0.3" when cloudy, and "option coefficient C = 0.1" when rainy. The option coefficients C may also be allocated to the respective clients so that their sum is 1.

  Based on the "flow index S" calculated for each advertisement distribution time zone, each attribute, and each advertisement, the advertisement distribution schedule determination unit 206 gives priority to advertisement distribution in descending order of "flow index S", and then determines the schedule for the corresponding time zone. When determining the schedule for the time zone, the advertisement distribution schedule determination unit 206 refers to the advertisement distribution contract information table (see FIG. 15). The advertisement distribution contract information table includes information on the target attributes, the distribution area (radius [km] from a center), the distribution period, and the presence or absence of optional conditions.

  The advertisement distribution schedule determination unit 206 selects advertisements whose target attributes and distribution periods match the time zone and attribute for which the "flow index S" was calculated, and determines the distribution schedule. In addition, the advertisement distribution schedule determination unit 206 sets the display devices 300y targeted for advertisement distribution, the distribution period, and the optional conditions, taking into account the distribution area (radius [km] from the center), the distribution period, and the presence of optional conditions.

  Two methods are available for determining the advertisement distribution schedule for each time zone from the calculated "flow index S". The first method fixes the distribution time per advertisement and changes the distribution frequency according to the "flow index S". The reciprocal of the "flow index S" calculated for each advertisement ID is obtained and used as the initial value "T" of the time index until the next display. That is, Tn is calculated for each advertisement ID by the expression "Tn = T × n", incrementing n from 1.

  For example, if the "flow index S" of advertisement ID 001 is 0.98, that of advertisement ID 002 is 0.87, and that of advertisement ID 003 is 0.74, the initial values "T" are 1.02, 1.15, and 1.35, respectively, and the values of Tn are as shown in FIG. 19. Since the distribution time of each advertisement is fixed at 30 seconds, the display order of the contents is determined in ascending order of Tn until the cumulative distribution time exceeds the distribution time upper limit Th3 (for example, a cumulative time of 3600 seconds if the distribution time zone is one hour) (see FIGS. 19 and 20). Since this scheduling simply determines the content to be displayed next, it can be applied to both still image advertisements and moving image advertisements.
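
  A minimal Python sketch of this first method follows, assuming a fixed 30-second slot and a one-hour distribution window as in the example above; the function name and data layout are illustrative.

```python
# First method: T = 1/S per ad, candidate values T*n, ads ordered by ascending
# T*n until the cumulative display time reaches the limit.
def schedule_by_frequency(scores, slot_seconds=30, time_limit=3600):
    """scores: dict {ad_id: flow index S}. Returns the display order as a list."""
    t_initial = {ad_id: 1.0 / s for ad_id, s in scores.items()}
    next_n = {ad_id: 1 for ad_id in scores}  # next multiplier n for each ad
    order = []
    while len(order) * slot_seconds < time_limit:
        # Pick the ad whose next T*n value is smallest.
        ad_id = min(next_n, key=lambda a: t_initial[a] * next_n[a])
        order.append(ad_id)
        next_n[ad_id] += 1
    return order

# Worked example from the text: S = 0.98, 0.87, 0.74 for ads 001, 002, 003.
print(schedule_by_frequency({"001": 0.98, "002": 0.87, "003": 0.74})[:6])
# -> ['001', '002', '003', '001', '002', '003']
```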

  The second method of determining the distribution schedule changes the distribution time per advertisement according to the "flow index S". A standard time for one display is determined in advance, for example 30 seconds, and the display repetition order is determined in descending order of "flow index S". Each actual display time is obtained by multiplying the standard time by the "flow index S" (see FIGS. 22 and 23). However, since a display that becomes too short because of a low "flow index S" is meaningless as an advertisement, a lower limit of, for example, 15 seconds is set for one display time. Since the distribution time varies with the "flow index S", this method is suitable for scheduling still image advertisements.
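
  A minimal Python sketch of the second method follows, assuming the 30-second standard time and 15-second lower limit mentioned above; the function name and data layout are illustrative.

```python
# Second method: ads in descending order of S, each shown for the standard time
# scaled by S and clamped to a lower limit.
def schedule_by_display_time(scores, standard_seconds=30, min_seconds=15):
    """scores: dict {ad_id: flow index S}.
    Returns [(ad_id, display_seconds)] in descending order of S."""
    ordered = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return [(ad_id, max(min_seconds, standard_seconds * s)) for ad_id, s in ordered]

# Worked example: S = 0.98, 0.87, 0.74 gives roughly 29.4 s, 26.1 s, and 22.2 s.
print(schedule_by_display_time({"001": 0.98, "002": 0.87, "003": 0.74}))
```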

  Having determined the advertisement distribution schedule as described above, the advertisement distribution schedule determination unit 206 stores the determined advertisement distribution schedule (see FIG. 20 or FIG. 23) in a predetermined storage area. Then, in accordance with the advertisement distribution schedule, the advertisement distribution control unit 207 reads the corresponding advertisement data (see FIG. 16) stored in the distribution advertisement data storage unit 208 and distributes the advertisements to the target display devices 300y.

[Flow information reception processing]
Next, the flow information reception process will be described with reference to FIG. 17. The flow information reception process is executed when the advertisement distribution control device 200 receives flow information from a passer-by flow data generation device 100x. First, in step S201, the flow data receiving unit 201 determines whether or not flow information has been received from a passer-by flow data generation device 100x. When it is determined that flow information has been received (Yes at step S201), the process proceeds to step S202; when it is not (No at step S201), step S201 is repeated.

  In step S202, the attribute time zone ratio calculation unit 203 calculates the "attribute time zone ratio A" based on the flow information received in step S201. Subsequently, in step S203, the flow amount calculation unit 204 calculates the "flow amount B" with reference to the flow information received in step S201 and the physical distance table in FIG. 14. The calculation result of the "flow amount B" is as shown in FIG. 12.

  Subsequently, in step S204, the flow index calculation unit 205 and the advertisement distribution schedule determination unit 206 execute the advertisement distribution schedule determination process. The advertisement distribution schedule determination process is either the "advertisement distribution schedule determination process (1)" of FIG. 18 or the "advertisement distribution schedule determination process (2)" of FIG. 21, which correspond to the two methods described above; either may be executed in step S204.

  Subsequently, in step S205, the advertisement distribution control unit 207 distributes the advertisement data to the target display devices 300y so that the selected advertisements are displayed on the appropriate display devices 300y in accordance with the advertisement distribution schedule determined in step S204.

[Advertisement delivery schedule decision processing (1)]
Next, the advertisement distribution schedule determination process (1) will be described with reference to FIG. 18. First, in step S211, the flow index calculation unit 205 calculates the "flow index (score) S" based on the above equation (4). The calculation result of the "flow index (score) S" is as shown in FIG. 19.

  Subsequently, in step S212, the advertisement distribution schedule determination unit 206 calculates “T = 1 / S”. Subsequently, in step S213, the advertisement distribution schedule determination unit 206 initializes the index n by assigning 1 to it. Subsequently, in step S214, the advertisement distribution schedule determination unit 206 calculates “T × n”.

  Subsequently, in step S215, the advertisement distribution schedule determination unit 206 determines whether the cumulative display time of the advertisements scheduled for distribution to the target display device 300y is equal to or less than the above "Th3", or whether the condition "n < Th4" is satisfied. Here, "Th4" is a threshold for the cumulative number of advertisement display cycles; the cumulative number of advertisement display cycles is, for example, the number of cycles when advertisement A, advertisement B, and advertisement C are periodically distributed in this order. When either of these conditions is satisfied (Yes at step S215), the process proceeds to step S216; when neither is satisfied (No at step S215), the process proceeds to step S217.

  In step S216, the advertisement distribution schedule determination unit 206 increments n, and when this process ends, the process returns to step S214. On the other hand, in step S217, the advertisement distribution schedule determination unit 206 generates a schedule for distributing the advertisements to the target display device 300y in ascending order of "T × n".

  The calculation results of steps S211 to S214 are as shown in FIG. 19. According to FIG. 20, when the three advertisements with advertisement IDs "001", "002", and "003" are distributed to the target display device 300y, "T × n" is a weighting factor that determines the priority order of distribution, and the advertisements are distributed in ascending order of the "T × n" value. Therefore, as n = 1, 2, ..., the advertisements with IDs "001", "002", and "003" are not necessarily delivered in a fixed rotation. In fact, referring to FIG. 20, the advertisement in the 10th display position is "001" and the advertisement in the 11th display position is "002", but the advertisement in the 13th display position is "001" rather than "003".

[Advertisement delivery schedule determination process (2)]
Next, the advertisement distribution schedule determination process (2) will be described with reference to FIG. 21. First, in step S211, the flow index calculation unit 205 calculates the "flow index (score) S" based on the above equation (4). The calculation result of the "flow index (score) S" is as shown in FIG. 22.

  Subsequently, in step S214a, the advertisement distribution schedule determination unit 206 calculates "flow index (score) S" × "standard display time". Here, the "standard display time" is the contractual advertisement display time corresponding to the "display time" in the advertisement distribution contract information table shown in FIG. 15. Subsequently, in step S217a, the advertisement distribution schedule determination unit 206 generates a schedule in which the advertisements to be distributed are periodically switched and each is displayed for a time of "flow index (score) S" × "standard display time".

  The calculation results of step S211 and step S214a are as shown in FIG. 22. According to FIG. 22, when three advertisements with advertisement IDs "001", "002", and "003" are distributed to the target display device 300y, the advertisements are delivered in the order of advertisement IDs "001", "002", and "003". The "flow index (score) S" is a weighting coefficient that determines the time for which each advertisement is distributed, and the display time of each advertisement is determined based on the "flow index (score) S" and the contracted display time. The determination result is as shown in FIG.
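  A corresponding sketch for process (2) follows. It assumes one flow score and one contracted "standard display time" per advertisement ID; the numbers in the usage example are hypothetical.

```python
from typing import Dict, List, Tuple


def build_schedule_by_duration(flow_scores: Dict[str, float],
                               standard_display_time: Dict[str, float],
                               cycles: int = 1) -> List[Tuple[str, float]]:
    """Process (2): show each advertisement for S x standard display time.

    flow_scores maps an advertisement ID to its "flow index (score) S"
    (step S211); standard_display_time maps the ID to the contracted display
    time in seconds.  Step S214a computes the per-advertisement duration, and
    step S217a repeats the resulting cycle periodically.
    """
    one_cycle = [(ad_id, flow_scores[ad_id] * standard_display_time[ad_id])
                 for ad_id in flow_scores]       # step S214a: S x standard display time
    return one_cycle * cycles                    # step S217a: periodic switching


if __name__ == "__main__":
    # Hypothetical values; the contracted "display time" would come from
    # the advertisement distribution contract information table.
    scores = {"001": 1.2, "002": 0.8, "003": 1.0}
    contract_time = {"001": 15.0, "002": 15.0, "003": 30.0}
    print(build_schedule_by_duration(scores, contract_time, cycles=2))
```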

  As described above, in the second embodiment, it becomes possible to control the advertisement display order and display time based on the various indices calculated from the "flow index (score) S". Further, by feeding back the effect of an advertisement into the advertisement display contract information, the advertiser can place advertisements more efficiently.

[About Face Recognition Camera]
FIG. 24 is a diagram illustrating an installation example of the face recognition camera 150a, viewed from above. A plurality of face recognition cameras 150a are housed in the housing f at an upper portion of the display device 300, at a height of about 2 to 2.5 m (face recognition cameras 150a1 to 150a4 in the figure). Each face recognition camera 150a is tilted downward from the horizontal so that faces over a wide range can be captured.

  The reason for using a plurality of cameras is to cover a wide imaging area and to recognize faces in different areas. Each face recognition camera 150a is directed at a different horizontal angle so as to capture the faces of people in adjacent areas (see FIG. 25). Each camera is attached to a pan head or turntable whose horizontal and vertical orientation can be changed, so that the orientation of the face recognition camera 150a can be finely adjusted at the installation site of the system.

  A gray transmissive film is affixed to the housing front glass g so that the face recognition camera 150a and its lens cannot be seen by passers-by from the outside. Further, a black acrylic plate a having round holes (hole diameters h1 to h4) larger than the lens diameter is installed inside the glass, and a small dark room space dr is formed between the glass and the lenses. This reduces reflection and incident light from the outside and prevents the cameras and lenses from being seen from the outside, so that the psychological pressure that conventional cameras exert on passers-by can be suppressed.

  Face recognition is performed by recognition software installed on a computer connected to the face recognition camera 150a; the software detects a face from an input image and estimates the gender and age of the person from the detected face rectangle.
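  For orientation, a minimal face-detection sketch is shown below. The document does not name the recognition software, so a generic OpenCV Haar cascade stands in for the face detection step, and the gender/age estimation model is left as an unspecified follow-on step.

```python
import cv2  # OpenCV, assumed to be available on the computer connected to camera 150a


def detect_face_rectangles(frame):
    """Detect face rectangles in one camera frame.

    A generic Haar cascade stands in for the unnamed recognition software;
    it returns (x, y, w, h) rectangles, which would then be passed to a
    gender/age estimator that the document does not specify.
    """
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
```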

[About the people count camera]
The number counting camera 150b for counting the number of passers-by is installed above the housing of the display device 300 at a height of about 5 m. Its imaging direction is straight down, toward the floor surface. The camera can also change its angle of view and zoom, and can be switched between a wide-angle mode and a fisheye mode. In the wide-angle mode, the angle of view is approximately 120°, and in the fisheye mode, the angle of view is approximately 180°.

[Effect of Example]
The system terminals of the embodiment are roughly divided into the following three types: (1) terminals that perform only sensing; (2) terminals that perform both sensing and advertisement display; and (3) terminals that perform only advertisement display. Sensing refers to face recognition and people counting. All terminals are connected to the network and communicate securely with the aggregation server. Of these, only type (2) constitutes a system that measures the advertisement gaze rate effect. The sensing-only terminal of type (1) is relatively small and is assumed to be installed not only at stations but also in stores or outdoors. In the case of a store, there are many entrances through which customers enter. The conventional advertisement attention rate evaluation system is premised on a type (2) terminal being installed at a certain place and sensing being performed there to determine what kind of advertisement to display at that place.

  In the system of the disclosed technology, a plurality of terminals of types (1) to (3) are installed over a wide area and across regions. For example, sensing-only terminals are installed in a plurality of stores belonging to a chain, and attribute information on the visitors passing through them is continuously acquired and accumulated in the server (advertisement distribution control device 200). This makes it possible to grasp statistical information such as the fact that a certain store receives many visitors with a certain attribute in a certain time zone.

  On the other hand, when a type (2) or (3) terminal is installed at another location, for example on station premises, the advertisement content of this store is distributed there based on the attribute statistical information previously acquired at the store. For example, taking into account the time zone in which women in their twenties are most likely to visit the store, a cosmetics advertisement aimed at this target is displayed at the station. The delivery timing may be determined from the acquired attribute statistical information, and may be, for example, a time zone one day later or a time zone one week later.
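  The selection logic described here can be illustrated with a small sketch. The data structures (a mapping from attribute and hour to visit counts, and a mapping from attribute to advertisement ID) and the example values are hypothetical; the document describes this behaviour only in prose.

```python
from typing import Dict, Tuple


def pick_ad_and_slot(attribute_counts: Dict[Tuple[str, int], int],
                     ads_by_attribute: Dict[str, str]) -> Tuple[str, int]:
    """Choose which advertisement to deliver to a remote display and when.

    attribute_counts maps (attribute, hour) to the number of visitors with
    that attribute counted by the sensing-only terminals; ads_by_attribute
    maps an attribute to an advertisement ID.  The busiest combination
    determines both the advertisement and the delivery time zone.
    """
    (attribute, hour), _ = max(attribute_counts.items(), key=lambda kv: kv[1])
    return ads_by_attribute[attribute], hour


if __name__ == "__main__":
    counts = {("female_20s", 18): 120,  # e.g. many visits by women in their 20s around 18:00
              ("male_30s", 12): 45}
    ad_id, hour = pick_ad_and_slot(counts, {"female_20s": "cosmetics_001",
                                            "male_30s": "gadget_002"})
    print(f"deliver {ad_id} at the station around {hour}:00")
```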

  Each table referred to by each function unit in the above embodiment is stored in a predetermined storage area. For example, it may be a predetermined storage area provided in the function unit of the reference source.

  As described above, the passer-by flow data generation device, the content distribution control device, the passer-by flow data generation method, and the content distribution control method disclosed in the present application are useful when one wants to bring out the full effect of advertising while coordinating a plurality of digital signage units, even in situations where the flow of passers-by in a wide space is high and the digital signage is installed at arbitrary locations.

100, 100a, 100b, 100c, 100x  Passer-by flow data generation device
101  Passer-by detection unit
102  Face attribute detection unit
103  Flow trajectory classification processing unit
103a  Flow model storage unit
104  Passer-by counting unit
105  Flow data generation unit
106  Flow data transmission unit
150a, 150a1, 150a2, 150a3, 150a4  Face recognition camera
150b  Number counting camera
200  Advertisement distribution control device
201  Flow data transmission unit
201a  Flow data temporary storage unit
202  Flow information generation unit
203  Attribute time zone ratio calculation unit
204  Flow amount calculation unit
205  Flow index calculation unit
206  Advertisement distribution schedule determination unit
207  Advertisement distribution control unit
206b  Advertisement distribution schedule storage unit
208  Distribution advertisement data storage unit
300, 300a, 300b, 300y  Display device
a  Acrylic plate
dr  Dark room space
f  Housing
g  Housing front glass
h1, h2, h3, h4  Hole diameter

Claims (12)

  1. A passer-by flow data generation device communicably connected to a content distribution control device that distributes content to be displayed on a display device, the device comprising:
    A flow model storage unit that stores flow models in which the flow of passers-by passing through a traffic area, from which content output to a display device installed at a position visible to the passers-by can be viewed, is patterned;
    A passer-by detection unit that detects the passers-by captured by an imaging device;
    A flow trajectory classification processing unit that acquires the flow trajectory of a passer-by detected by the passer-by detection unit and classifies the flow trajectory into one of the flow models stored in the flow model storage unit;
    A passer-by counting unit that counts the number of passers-by for each flow trajectory classified into one of the flow models by the flow trajectory classification processing unit; and
    A flow information generation unit that generates flow data including the number of passers-by counted by the passer-by counting unit, the moving direction of the passers-by, and time information, and transmits the flow data to the content distribution control device.
  2. The passer-by flow data generation device according to claim 1, wherein
    the display device and the traffic area are installed at different locations, and
    the flow information generation unit, when generating flow data, uses flow data calculated by a predetermined statistical process if the number of passers-by for each flow model cannot be counted by the passer-by counting unit.
  3. The passer-by flow data generation device according to claim 2, wherein the flow model is a flow direction index indicating the base toward which the flow of the passers-by is directed.
  4. The passer-by flow data generation device according to claim 1, wherein
    the flow trajectory classification processing unit detects the moving speed of a passer-by detected by the passer-by detection unit and determines whether the detected moving speed is equal to or less than a predetermined threshold value, and
    the passer-by counting unit counts the number of passers-by for which the flow trajectory classification processing unit has determined that the moving speed is equal to or less than the predetermined threshold value.
  5. The passer-by flow data generation device according to any one of claims 1 to 4, further comprising a face attribute detection unit that detects a face attribute of a passer-by detected by the passer-by detection unit, wherein
    the passer-by counting unit counts the number of passers-by for each flow trajectory classified into one of the flow models by the flow trajectory classification processing unit, for each face attribute of the passers-by detected by the face attribute detection unit, and for each time at which the face attribute was detected by the face attribute detection unit.
  6. The passer-by flow data generation device according to any one of claims 1 to 5, further comprising:
    A flow data storage unit that accumulates and stores the flow data;
    A flow information generation unit that generates flow information by aggregating the flow data stored in the flow data storage unit for each flow model, for each face attribute, and for each time zone based on the time information; and
    A content selection output control unit that selects and outputs the content to be output to the display device based on the flow information generated by the flow information generation unit.
  7. The passer-by flow data generation device according to any one of claims 1 to 6, wherein
    the imaging device includes a face recognition camera that captures images for acquiring face attributes,
    the housing in which the face recognition camera is stored has a gray transmissive film affixed to its front glass, and
    a black acrylic plate having round holes of a larger diameter than the diameter of the lens of the face recognition camera is installed inside the glass, so that a dark room space is formed between the front glass and the lens of the face recognition camera.
  8. A content distribution control device communicably connected to a plurality of passer-by flow data generation devices that collect information on the flow of passers-by passing through predetermined areas,
    A flow data storage unit that accumulates and stores flow data including the number of passers-by for each flow model in which the flow of passers-by passing through a traffic area, from which content output to a display device installed at a position visible to the passers-by can be viewed, is patterned, the moving direction of the passers-by, attributes of the passers-by, and time information at which the passers-by were detected;
    A flow information generation unit that generates flow information by aggregating the flow data stored in the flow data storage unit for each flow model, for each attribute of the passers-by, and for each piece of time information at which the passers-by were detected;
    A flow information storage unit that stores the flow information generated by the flow information generation unit;
    An attribute time zone ratio calculation unit that calculates an attribute time zone ratio, which is the ratio, for each time zone, of the number of passers-by for each attribute of the passers-by included in the flow information stored in the flow information storage unit;
    A flow amount calculation unit that calculates a flow amount by weighting the number of passers-by per time zone for each attribute of the passers-by included in the flow information stored in the flow information storage unit;
    A flow index calculation unit that calculates a flow index indicating a characteristic of the corresponding flow based on the attribute time zone ratio calculated by the attribute time zone ratio calculation unit and the flow amount calculated by the flow amount calculation unit;
    A content distribution schedule determination unit that selects content to be output to the display device based on the flow index calculated by the flow index calculation unit, and determines a distribution schedule of the selected content;
    A content distribution control device comprising: a content distribution control unit that controls distribution of the corresponding content to a display device in accordance with the distribution schedule determined by the content distribution schedule determination unit.
  9. The content distribution control device according to claim 8, further comprising a physical distance storage unit that stores the physical distance between adjacent bases, wherein
    the flow amount calculation unit calculates the flow amount by dividing the number of passers-by per time zone for each attribute of the passers-by included in the flow information stored in the flow information storage unit by the physical distance between the bases.
  10. The content distribution control device according to claim 8, wherein the content distribution schedule determination unit selects content that matches contract information related to content distribution and determines a distribution schedule for the selected content.
  11. A passer-by flow data generation method performed by a passer-by flow data generation device communicably connected to a content distribution control device that distributes content to be displayed on a display device, the method comprising:
    A passer-by detection step of detecting a passer-by captured by an imaging device;
    A flow trajectory classification processing step of acquiring the flow trajectory of the passer-by detected in the passer-by detection step and classifying the flow trajectory into one of the flow models stored in a flow model storage unit that stores flow models in which the flow of passers-by passing through a traffic area, from which content output to a display device installed at a position visible to the passers-by can be viewed, is patterned;
    A passer-by counting step of counting the number of passers-by for each flow trajectory classified into one of the flow models in the flow trajectory classification processing step; and
    A flow information generation step of generating flow data including the number of passers-by counted in the passer-by counting step, the moving direction of the passers-by, and time information, and transmitting the flow data to the content distribution control device.
  12. A content distribution control method performed by a content distribution control device communicably connected to a passer-by flow data generation device that collects information on the flow of passers-by passing through a predetermined area, the method comprising:
    A flow data storage step of accumulating and storing, in a flow data storage unit, flow data including the number of passers-by for each flow model in which the flow of passers-by passing through a traffic area, from which content output to a display device installed at a position visible to the passers-by can be viewed, is patterned, the moving direction of the passers-by, attributes of the passers-by, and time information at which the passers-by were detected;
    A flow information generation step of generating flow information by aggregating the flow data stored in the flow data storage unit for each flow model, for each attribute of the passers-by, and for each piece of time information at which the passers-by were detected;
    A flow information storage step of storing the flow information generated by the flow information generation step in a flow information storage unit;
    An attribute time zone ratio calculation step of calculating an attribute time zone ratio, which is the ratio, for each time zone, of the number of passers-by for each attribute of the passers-by included in the flow information stored in the flow information storage unit;
    A flow amount calculation step of calculating a flow amount by weighting the number of passers-by per time zone for each attribute of the passers-by included in the flow information stored in the flow information storage unit;
    A flow index calculation step for calculating a flow index indicating a characteristic of the flow based on the attribute time zone ratio calculated by the attribute time zone ratio calculation step and the flow amount calculated by the flow amount calculation step;
    A content distribution schedule determination step for selecting content to be output to the display device based on the flow index calculated by the flow index calculation step, and determining a distribution schedule for the selected content;
    A content distribution control method comprising: a content distribution control step of controlling distribution of the content to a display device in accordance with the distribution schedule determined by the content distribution schedule determination step.
JP2009151901A 2009-06-26 2009-06-26 Passer-by fluidity data generating device, content distribution controlling device, passer-by fluidity data generating method, and content distribution controlling method Pending JP2011008571A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2009151901A JP2011008571A (en) 2009-06-26 2009-06-26 Passer-by fluidity data generating device, content distribution controlling device, passer-by fluidity data generating method, and content distribution controlling method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2009151901A JP2011008571A (en) 2009-06-26 2009-06-26 Passer-by fluidity data generating device, content distribution controlling device, passer-by fluidity data generating method, and content distribution controlling method

Publications (1)

Publication Number Publication Date
JP2011008571A true JP2011008571A (en) 2011-01-13

Family

ID=43565140

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2009151901A Pending JP2011008571A (en) 2009-06-26 2009-06-26 Passer-by fluidity data generating device, content distribution controlling device, passer-by fluidity data generating method, and content distribution controlling method

Country Status (1)

Country Link
JP (1) JP2011008571A (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07160883A (en) * 1993-12-09 1995-06-23 Nippon Telegr & Teleph Corp <Ntt> Personal attribute detecting device
JPH08202847A (en) * 1995-01-31 1996-08-09 Matsushita Electric Ind Co Ltd Moving object counting device
JP2003125379A (en) * 2001-10-17 2003-04-25 Nippon Telegraph & Telephone West Corp Information providing system, method thereof, information distributing server apparatus, contents distribution server apparatus, street television set, mobile communication terminal and program
JP2003271084A (en) * 2002-03-15 2003-09-25 Omron Corp Apparatus and method for providing information
JP2005003966A (en) * 2003-06-12 2005-01-06 Kenji Morishita Advertisement distribution system and advertisement distribution method
JP2005230231A (en) * 2004-02-19 2005-09-02 Omron Corp Game method, game machine and program for game
JP2005346617A (en) * 2004-06-07 2005-12-15 East Japan Railway Co Passer-by behavior analysis system
JP2007193292A (en) * 2005-12-19 2007-08-02 Ricoh Co Ltd Content display system and information display device
JP2007164804A (en) * 2007-01-22 2007-06-28 Asia Air Survey Co Ltd Mobile object detecting system, mobile object detecting device, mobile object detection method and mobile object detecting program
JP2009139857A (en) * 2007-12-10 2009-06-25 Unicast Corp Contents display control device, contents display control method, and contents display control program

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
CSNJ200810067161; Hiroko Yabushita et al., "FRUITS Movie: A Method for Summarized Visualization of Route Information," Proceedings of the 70th (2008) National Convention (4), Interface, Computers and Human Society, 20080313, pp. 4-327 to 4-328 *
JPN6013033525; Hiroko Yabushita et al., "FRUITS Movie: A Method for Summarized Visualization of Route Information," Proceedings of the 70th (2008) National Convention (4), Interface, Computers and Human Society, 20080313, pp. 4-327 to 4-328 *

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011232876A (en) * 2010-04-26 2011-11-17 Nippon Telegr & Teleph Corp <Ntt> Content attention degree calculation device, content attention degree calculation method, and content attention degree calculation program
US8976109B2 (en) 2010-09-08 2015-03-10 Sharp Kabushiki Kaisha Content output system, output control device and output control method
JP2012078722A (en) * 2010-10-05 2012-04-19 Mitsubishi Electric Corp Information providing system
JP2012203650A (en) * 2011-03-25 2012-10-22 Toshiba Corp Information distribution device, system, and information display device
JP2014527668A (en) * 2011-08-09 2014-10-16 アルカテル−ルーセント System and method for identifying billboard audience group routes and providing advertising content based on routes
JP2016103292A (en) * 2012-06-26 2016-06-02 インテル・コーポレーション Method and apparatus for measuring audience size for digital sign
JP2015230567A (en) * 2014-06-04 2015-12-21 トヨタ自動車株式会社 Drive assist system
US9463796B2 (en) 2014-06-04 2016-10-11 Toyota Jidosha Kabushiki Kaisha Driving assistance apparatus
KR20160029330A (en) * 2014-09-05 2016-03-15 동국대학교 산학협력단 System and method for recommending travel destination based on user characteristics and tourist destination characteristics
KR101628108B1 (en) * 2014-09-05 2016-06-21 동국대학교 산학협력단 System and method for recommending travel destination based on user characteristics and tourist destination characteristics
WO2019193816A1 (en) * 2018-04-05 2019-10-10 矢崎エナジーシステム株式会社 Guidance system
WO2020145411A1 (en) * 2019-01-11 2020-07-16 Nec Display Solutions, Ltd. Graphical user interface for insights on viewing of media content

Similar Documents

Publication Publication Date Title
Zheng et al. Urban computing: concepts, methodologies, and applications
US10339544B2 (en) Techniques for automatic real-time calculation of user wait times
US9952427B2 (en) Measurement method and system
US10410048B2 (en) System and method for detecting, tracking and counting human objects of interest using a counting system and a data capture device
US9710924B2 (en) Field of view determiner
Seer et al. Kinects and human kinetics: A new approach for studying pedestrian behavior
US10614316B2 (en) Anomalous event retriever
JP2016027490A (en) Method, system, product and computer program for multi-cue object detection and analysis (multi-cue object detection and analysis)
US10037607B2 (en) Topology determination for non-overlapping camera network
TWI653591B (en) The object selected in the process of approaching the advertising display device of the digital signage
US9324096B2 (en) System and method for communicating information
JP5994397B2 (en) Information processing apparatus, information processing method, and program
Hoogendoorn et al. Extracting microscopic pedestrian characteristics from video data
JP2019053762A (en) Display method
US20180301031A1 (en) A method and system for automatically detecting and mapping points-of-interest and real-time navigation using the same
US8254625B2 (en) Automated service measurement, monitoring and management
JP3521637B2 (en) Passenger number measurement device and entrance / exit number management system using the same
KR101375583B1 (en) Object Density Estimation in Video
US9256955B2 (en) System and method for processing visual information for event detection
CN101303727B (en) Intelligent management method based on video human number Stat. and system thereof
US20130266190A1 (en) System and method for street-parking-vehicle identification through license plate capturing
US20130054377A1 (en) Person tracking and interactive advertising
TWI307051B (en) Methods and systems for gathering market research data inside and outside commercial establishments
WO2017080341A1 (en) Smart parking control method and system integrating toll collection and advertisement pushing
US9641763B2 (en) System and method for object tracking and timing across multiple camera views

Legal Events

Date Code Title Description
20120626 A621 Written request for application examination (Free format text: JAPANESE INTERMEDIATE CODE: A621)
20130627 A977 Report on retrieval (Free format text: JAPANESE INTERMEDIATE CODE: A971007)
20130709 A131 Notification of reasons for refusal (Free format text: JAPANESE INTERMEDIATE CODE: A131)
20131126 A02 Decision of refusal (Free format text: JAPANESE INTERMEDIATE CODE: A02)