WO2023209951A1 - Content effect calculation method and effect calculation system - Google Patents

Content effect calculation method and effect calculation system

Info

Publication number
WO2023209951A1
WO2023209951A1 (PCT/JP2022/019283)
Authority
WO
WIPO (PCT)
Prior art keywords
area
content
person
effect
movement
Prior art date
Application number
PCT/JP2022/019283
Other languages
French (fr)
Japanese (ja)
Inventor
奏美 知名
Original Assignee
シャープNecディスプレイソリューションズ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by シャープNecディスプレイソリューションズ株式会社 filed Critical シャープNecディスプレイソリューションズ株式会社
Priority to PCT/JP2022/019283 priority Critical patent/WO2023209951A1/en
Publication of WO2023209951A1 publication Critical patent/WO2023209951A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising

Definitions

  • The present invention relates to a content effect calculation method and an effect calculation system.
  • Signage that displays content is used in stores, public facilities, and the like.
  • Such content includes advertisements, notices, and various types of guidance.
  • There is a system that predicts the effect of displaying content and allocates advertisements to advertising spaces based on the predicted effect (for example, Patent Document 1).
  • An object of the present invention is to provide a content effect calculation method and an effect calculation system that can measure the effect of displaying content.
  • The present invention is a content effect calculation method that displays content related to a first area on the display surface of a display device installed outside the first area, detects the movement of people to the first area after the content is displayed on the display surface, and calculates the effect of the content based on the movement of people to the first area from a second area that is different from the first area and from which at least the display surface is visible.
  • The present invention is an effect calculation system that includes a display control unit that displays content related to a first area on the display surface of a display device installed outside the first area, a detection unit that detects the movement of people to the first area after the content is displayed on the display surface, and a calculation unit that calculates the effect of the content based on the movement of people to the first area from a second area that is different from the first area and from which at least the display surface is visible.
  • FIG. 1 is a schematic block diagram showing the configuration of an effect measurement system S according to an embodiment of the present invention.
  • FIG. 2 is a schematic functional block diagram illustrating the functions of an information processing device 50.
  • FIG. 3 is a diagram illustrating an example of area-related data stored in a storage unit 502.
  • FIG. 4 is a diagram illustrating the relationship between elapsed time and points.
  • FIG. 5 is a flowchart illustrating the operation of the information processing device 50.
  • FIG. 6 is a diagram showing an example of first data.
  • FIG. 7 is a diagram showing an example of second data.
  • FIG. 8 is a diagram showing an example of newly obtained second data.
  • FIG. 9 is a schematic block diagram showing the configuration of an effect measurement system Sa according to a second embodiment.
  • FIG. 10 is a schematic functional block diagram illustrating the functions of an information processing device 51.
  • FIG. 11 is a flowchart illustrating the operation of the information processing device 51.
  • FIG. 12 is a schematic block diagram showing the configuration of an effect measurement system Sb in which three cameras are provided in the second embodiment.
  • FIG. 13 is a schematic block diagram showing the configuration of an effect measurement system Sc according to a third embodiment.
  • Hereinafter, an effect measurement system according to an embodiment of the present invention will be described with reference to the drawings. FIG. 1 is a schematic block diagram showing the configuration of the effect measurement system S.
  • the effect measurement system S measures the effect of displaying the content by measuring the movement of a person after the content is displayed on a display device provided within the facility.
  • Facilities include public facilities, commercial facilities, station premises, event venues, shopping malls (commercial complexes), stores, etc., and are places where people can visit.
  • the facility is a store.
  • The effect measurement system S includes a camera 20A, a display device 30B, a content providing device 40B, and an information processing device 50, which are communicably connected via a network N.
  • Area A is a part of the facility, and area B is a different area from area A within the facility. Area A and area B may be adjacent to each other or may be separated from each other. Area A is set within a facility (for example, a store) for a target section where it is desired to attract people by displaying content on the display device 30B.
  • Product shelf 10A is installed in area A.
  • Product shelf 10B is installed in area B. Different products are displayed on the product shelf 10A and the product shelf 10B.
  • The camera 20A is installed on the ceiling or an upper wall surface inside the store and images an area including area A. If a person is present in area A, the camera 20A can image the area including that person.
  • Display device 30B is installed in area B.
  • the display screen of the display device 30B is visible to a person who has come into area B to view the products displayed on the product shelf 10B.
  • Content providing device 40B stores multiple types of content and transmits the content to display device 30B.
  • the content includes content related to the store, and is, for example, an advertisement that can appeal to consumers about the products displayed on the product shelf 10A.
  • the information processing device 50 causes the display device 30B to display the content, and calculates the effect of displaying the content.
  • the information processing device 50 is, for example, a computer.
  • the network N is, for example, a LAN (local area network).
  • FIG. 2 is a schematic functional block diagram illustrating the functions of the information processing device 50.
  • the communication unit 501 communicates with other devices (camera 20A, display device 30B, content providing device 40B) via network N.
  • the storage unit 502 stores area-related data and other various data.
  • FIG. 3 is a diagram illustrating an example of area-related data stored in the storage unit 502.
  • the storage unit 502 stores camera identification information for identifying the camera 20A, area identification information for identifying the area, and information regarding products displayed in the area in association with each other.
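  • As an illustration only (not part of the published application), the area-related data of FIG. 3 could be held as a small table associating camera identification information, area identification information, and the products displayed in the area; the Python field names and example values below are assumptions.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AreaRecord:
    """One row of the area-related data in the storage unit 502 (illustrative field names)."""
    camera_id: str        # camera identification information, e.g. "20A"
    area_id: str          # area identification information, e.g. "A"
    products: list[str]   # information on the products displayed in the area

# Illustrative contents: camera 20A is associated with area A and the
# products displayed on product shelf 10A.
AREA_RELATED_DATA = [
    AreaRecord(camera_id="20A", area_id="A", products=["products on product shelf 10A"]),
]

def area_for_camera(camera_id: str) -> Optional[AreaRecord]:
    """Look up the area record associated with a given camera."""
    return next((r for r in AREA_RELATED_DATA if r.camera_id == camera_id), None)
```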
  • the input unit 503 receives information input from an input device connected to the outside of the information processing device 50.
  • the input device for example, at least one of a keyboard, a mouse, a touch panel, etc. can be used.
  • The area selection setting unit 504 sets rules indicating the timing for selecting an area. Examples of such timings include the following (a) to (c).
  • (a) Instruction to move people: the area selection setting unit 504 sets a selection timing for selecting an area based on a request, input from the input unit 503, to move people to the first area. In this case, the selection timing may be such that an area is selected immediately when the request to move people is input from the input unit 503.
  • (b) Using an area selection time: the area selection setting unit 504 may receive an area selection time, which is a predetermined time, from the input unit 503, and the selection timing may be based on this area selection time. In this case, an area is selected when the area selection time arrives. When the area selection timing arrives, content for attracting people to the destination area is displayed. The area selection time may be a periodically recurring time or a fixed time of day.
  • (c) Referring to the relationship between the number of people in an area and a reference value: area selection may be performed based on the relationship between the number of people in an area and a reference value. For example, area selection may be performed when the number of people in the second area becomes equal to or greater than a reference value. In this case, when many people have gathered in a certain area, they can be attracted to the first area by having them view the content. Area selection may also be performed when the number of people in the first area falls below a reference value. In this case, in response to the number of people in the first area becoming smaller than the reference value, the content can be displayed on a display device installed outside the first area, so that people can be attracted from outside into the first area.
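  • The following is a minimal sketch, under assumed parameter names and thresholds, of how the selection timings (a) to (c) above could be combined into a single trigger check; it is not part of the disclosed embodiments.

```python
def should_select_area(manual_request: bool,
                       now_s: float,
                       area_selection_time_s: float,
                       people_in_first_area: int,
                       people_in_second_area: int,
                       lower_reference: int,
                       upper_reference: int) -> bool:
    """Return True when any of the selection timings (a) to (c) is met.

    (a) an operator has requested that people be moved to the first area,
    (b) the preset area selection time has arrived,
    (c) the number of people in the first area has fallen below a reference
        value, or the number of people in the second area has reached one.
    All parameter names and thresholds are illustrative assumptions.
    """
    if manual_request:                                  # (a) explicit request from the input unit
        return True
    if now_s >= area_selection_time_s:                  # (b) area selection time reached
        return True
    if people_in_first_area < lower_reference:          # (c) too few people in the first area
        return True
    if people_in_second_area >= upper_reference:        # (c) enough potential viewers near the display
        return True
    return False
```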
  • The measurement timing setting unit 505 sets the measurement timing (comparison data acquisition time) for acquiring person data from the selected area after content related to the selected area has been displayed on a display device installed in a different area. By comparing the person data measured according to this measurement timing with the person data obtained before the content was displayed, it is possible to grasp how the number of people in the selected area changed after the content was displayed. This timing can be determined according to the point in time at which people's movements after the content is displayed are to be understood. In this embodiment, a plurality of different timings after the content is displayed are set; for example, timings are selected such that measurement is performed at least once in each of periods T1, T2, T3, and T4, which will be described later.
  • a person data detection unit 506 calculates characteristic data of a person included in the imaging data.
  • the feature data may be a feature amount obtained by quantifying the features of a person, or may be attribute data estimated based on the features of the person.
  • the attribute data includes, for example, the person's gender, age group, clothing characteristics (color of clothing, etc.), and the like.
  • the feature data will also be referred to as person data.
  • The person data detection unit 506 may perform this estimation process every time imaging data is obtained while it continues to receive imaging data from the camera. Alternatively, the person data detection unit 506 may calculate the feature data from the imaging data obtained when the measurement timing set by the measurement timing setting unit 505 arrives.
  • The person data detection unit 506 determines whether the comparison data acquisition time has been reached, and calculates the feature data (person data) when it has been reached.
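  • As a non-authoritative sketch of the kind of person data described above, the attribute fields and the stand-in detector below are assumptions; a real implementation would run a person detector and attribute estimator on each camera frame. The hard-coded return value mirrors the two people of FIG. 6.

```python
from dataclasses import dataclass

@dataclass
class PersonData:
    """Attribute data estimated for one detected person (illustrative fields)."""
    gender: str        # e.g. "male" / "female"
    age_group: str     # e.g. "40s-50s"
    clothing_color: str

def detect_person_data(frame) -> list[PersonData]:
    """Stand-in for the person data detection unit 506.

    A real implementation would run a person detector and an attribute
    estimator on the camera frame; here the two people of FIG. 6 are
    returned as fixed data so the interface can be exercised.
    """
    return [
        PersonData(gender="male", age_group="40s-50s", clothing_color="unknown"),
        PersonData(gender="female", age_group="40s-50s", clothing_color="unknown"),
    ]
```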
  • the display control unit 507 displays content related to the first area on the display surface of a display device installed outside the first area.
  • the first area may be any area within the facility, and is area A in this embodiment.
  • the display device installed outside the first area is the display device 30B.
  • the movement detection unit 508 detects movement of the person to the first area after the content is displayed on the display screen.
  • the movement detection unit 508 detects movement of the person when the image captured by the camera 20A includes a person. Furthermore, the movement detection unit 508 may detect the movement of a person using a method different from the method using the image captured by the camera 20A. For example, the position of a terminal device such as a smartphone or a mobile phone carried by a person is measured.
  • the information processing device 50 may acquire position information that is the measurement result, and the movement detection unit 508 may detect the movement of the person based on the position information.
  • The position information may be measured using, for example, GNSS (Global Navigation Satellite System).
  • the information processing device 50 may acquire the location information via a wireless communication network included in the terminal device using application software installed in the terminal device.
  • the wireless communication network may be 4G, 5G, LTE, Wi-Fi (registered trademark), or the like.
  • In this case, after the content is displayed on the display device 30B, when the measurement results of terminal device positions are received, the movement detection unit 508 detects a terminal device whose position information based on the measurement results belongs to area B, where the display device 30B is provided, and tracks the person based on the position information of that terminal device.
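  • A minimal sketch of terminal-position-based movement detection, assuming each area can be approximated by rectangular bounds in store coordinates and that time-ordered position reports are available per terminal; the geometry and data format are assumptions, not part of the application.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    """Axis-aligned bounds of an area in store coordinates (assumed geometry)."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

AREA_A = Rect(0.0, 0.0, 5.0, 5.0)     # first area (illustrative coordinates)
AREA_B = Rect(10.0, 0.0, 15.0, 5.0)   # second area, where the display device 30B is installed

def moved_from_b_to_a(positions: list[tuple[float, float]]) -> bool:
    """Return True if a terminal first seen in area B later appears in area A.

    `positions` is the time-ordered position history reported for one terminal
    device after the content was displayed; the data format is an assumption.
    """
    seen_in_b = False
    for x, y in positions:
        if AREA_B.contains(x, y):
            seen_in_b = True
        elif seen_in_b and AREA_A.contains(x, y):
            return True
    return False
```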
  • After the content is displayed on the display surface, the movement detection unit 508 detects, as a first detection result, the movement of people from the second area to the first area until a time longer than the average time required for a person to move from the second area to the first area has elapsed.
  • The average time is obtained by averaging the time required for a person to move from the second area to the first area. The time actually required differs from person to person because walking speeds over the route between the first area and the second area differ, but here the case where the average time is used is described as an example.
  • The movement detection unit 508 detects the movement of people during a measurement target period. The measurement target period runs from a measurement start timing, which is a point in time after the content is displayed on the display surface, to a measurement end timing, which is a point in time at which a time longer than the average time has elapsed.
  • The calculation unit 509 calculates the effect of the content based on the movement of people to the first area from a second area that is different from the first area and from which at least the display surface is visible.
  • the second area may be an area outside the first area and within or around the facility. In this embodiment, a case will be described in which the second area is area B in the example shown in FIG.
  • the calculation unit 509 calculates the effect of the content based on the first detection result that is the detection result of the movement detection unit 508.
  • In this embodiment, a point (coefficient) that is lower toward the end of the period covered by the first detection result than at its beginning is set according to the elapsed time within the measurement target period, and the effect of the content is calculated by summing the points for each person whose movement is detected.
  • FIG. 4 is a diagram illustrating the relationship between elapsed time and points.
  • the horizontal axis is time and the vertical axis is points.
  • the measurement target period is from time t0, which is the measurement start timing, to time t4, which is the measurement end timing.
  • An example of the timing after the content is displayed on the display device 30B is time t0, and an example of the timing when a time longer than the average time required for movement has passed is time t4.
  • A point p1 is set for the period T1 from time t0 to time t1, a point p2 for the period T2 from after time t1 to time t2, a point p3 for the period T3 from after time t2 to time t3, and a point p4 for the period T4 from after time t3 to time t4. The points are set such that p1 is the largest and they become smaller as time passes, with p4 being the smallest.
  • For example, when two people are detected in the period T1 and three people in the period T3, the calculation unit 509 calculates the effect of the content as (p1 × 2 + p3 × 3).
  • Although this figure shows a case where the measurement period is divided into four sections, it may be divided into two or three sections, or into five or more sections. Furthermore, as long as points are assigned according to the passage of time, they may be assigned as a function of the elapsed time without dividing the period into sections.
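  • The point-based effect calculation described above could look like the following sketch; the boundary times t0 to t4 and the point values p1 to p4 are illustrative assumptions, and the final line reproduces the (p1 × 2 + p3 × 3) example with two arrivals in period T1 and three in period T3.

```python
import bisect

# Period boundaries t0..t4 (seconds after the content is displayed) and the
# points p1..p4 assigned to periods T1..T4; all numeric values are illustrative.
BOUNDARIES = [0.0, 60.0, 120.0, 180.0, 240.0]   # t0, t1, t2, t3, t4
POINTS = [4.0, 3.0, 2.0, 1.0]                    # p1 > p2 > p3 > p4

def point_for(elapsed: float) -> float:
    """Return the point for the period in which the arrival time falls."""
    if not BOUNDARIES[0] <= elapsed <= BOUNDARIES[-1]:
        return 0.0  # outside the measurement target period
    index = min(bisect.bisect_right(BOUNDARIES, elapsed) - 1, len(POINTS) - 1)
    return POINTS[index]

def content_effect(arrival_times: list[float]) -> float:
    """Sum the points over every person detected as having arrived in the first area."""
    return sum(point_for(t) for t in arrival_times)

# Two people arriving in T1 and three in T3 reproduce the (p1*2 + p3*3) example.
print(content_effect([30.0, 45.0, 130.0, 140.0, 150.0]))  # 4*2 + 2*3 = 14.0
```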
  • FIG. 5 is a flowchart illustrating the operation of the information processing device 50.
  • the camera 20A images the area A and transmits the imaged data to the information processing device 50 via the network N.
  • the display control unit 507 of the information processing device 50 selects arbitrary content from among the contents stored in the content providing device 40B and displays it on the display device 30B (step S101).
  • content set as default may be selected and displayed.
  • Content providing device 40B transmits the content selected by information processing device 50 to display device 30B via network N. Thereby, the display device 30B displays the content selected by the display control unit 507.
  • The person data detection unit 506 determines whether or not to display the target content based on whether the first cycle set by the area selection setting unit 504 has been reached (step S102). If the first cycle has not been reached (step S102-NO), the process returns to step S101; if it has been reached, it is determined that the target content is to be displayed (step S102-YES), and the process proceeds to step S103.
  • the display control unit 507 causes the display device 30B to display the target content.
  • the target content here is content related to the area to which people are to be attracted.
  • Content related to the products displayed on the product shelf installed in the area to which people are to be attracted is used as the target content.
  • the display control unit 507 receives the designation of the area to be attracted from the input device, and reads out the content ID corresponding to the designated area from the storage unit 502.
  • the designation of the area may be accepted when the attraction timing arrives, or may be specified in advance before the attraction timing arrives.
  • the display control unit 507 extracts content related to the products displayed on the product shelf 10A as the target content.
  • the display control unit 507 displays the extracted target content on the display device 30B (step S103).
  • the person data detection unit 506 acquires the characteristic data of the person included in the image data obtained from the camera 20A (step S104), and temporarily stores it in the storage unit 502 as first data.
  • FIG. 6 is a diagram showing an example of the first data. Two people have been detected at this point. More specifically, there is one person whose attribute is "male, from 40s to 50s" and one person whose attribute is "female, from 40s to 50s.”
  • The person data detection unit 506 determines whether the comparison data acquisition time has been reached (step S105). If it has not been reached (step S105-NO), the determination in step S105 is performed again after a certain wait time has elapsed. If it has been reached (step S105-YES), the characteristic data of the people included in the imaging data obtained from the camera 20A is acquired (step S106) and temporarily stored in the storage unit 502 as second data.
  • FIG. 7 is a diagram showing an example of the second data. The example in this figure shows an example of second data obtained based on image data captured immediately before time t1. For example, five people are detected from the second data obtained based on the imaging data taken immediately before time t1.
  • Next, it is determined whether or not the measurement period has ended (step S107). If the measurement period has not ended (step S107-NO), the process returns to step S105 and new second data is acquired (step S106); if the measurement period has ended (step S107-YES), the process proceeds to step S108.
  • FIG. 8 is a diagram showing an example of newly obtained second data. The example in this figure shows an example of second data obtained based on image data captured immediately before time t2. For example, six people are detected from the second data obtained based on the imaging data taken immediately before time t2.
  • The calculation unit 509 calculates the effect based on the first data and the second data (step S108). For example, from the first data shown in FIG. 6 and the second data shown in FIG. 7, a value is obtained by applying the point for the corresponding period to the increase in the number of people. Furthermore, the calculation unit 509 calculates the total value of the points for each piece of second data obtained each time the comparison data acquisition time is reached; for example, a value is likewise obtained from the first data shown in FIG. 6 and the second data shown in FIG. 8. In this way, the effect of displaying the content is calculated as the sum of the points obtained each time the comparison data acquisition time is reached.
  • The information processing device 50 determines whether a power OFF instruction has been input (step S109). If a power OFF instruction has been input (step S109-YES), the process ends; if not (step S109-NO), the process returns to step S101.
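  • The flow of FIG. 5 (steps S101 to S109) could be sketched as follows; the hook functions (display, camera.capture, detect_people, compute_effect, power_off_requested) and the interval values are assumptions introduced only for illustration.

```python
import time

def measurement_loop(display, camera, detect_people, compute_effect, power_off_requested,
                     first_cycle_s=3600, comparison_interval_s=60, measurement_period_s=240):
    """Sketch of the FIG. 5 flow (steps S101-S109); every callable is an assumed hook.

    display(content):       shows content on the display device 30B
    camera.capture():       returns one frame from the camera 20A
    detect_people(frame):   returns person data for the frame
    compute_effect(a, b):   point-based effect calculation from first/second data
    power_off_requested():  returns True when a power OFF instruction was input
    """
    while True:
        display("default content")                          # step S101
        time.sleep(first_cycle_s)                           # wait until the first cycle arrives (S102)
        display("target content related to area A")         # step S103
        first_data = detect_people(camera.capture())        # step S104: first data
        second_data_list = []
        elapsed = 0
        while elapsed < measurement_period_s:               # step S107: measurement period check
            time.sleep(comparison_interval_s)               # comparison data acquisition time (S105)
            elapsed += comparison_interval_s
            second_data_list.append(detect_people(camera.capture()))  # step S106: second data
        effect = compute_effect(first_data, second_data_list)         # step S108
        print("effect of displaying the content:", effect)
        if power_off_requested():                           # step S109
            break
```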
  • As described above, in this embodiment, content related to the first area (for example, area A) is displayed on the display surface of a display device (for example, the display device 30B) installed outside the first area, and the effect of the content is calculated based on the movement of people from the second area (for example, area B) to the first area. After the content is displayed on the display surface, the movement of people to the first area is detected as the first detection result until a time longer than the average time required for a person to move from the second area to the first area has elapsed.
  • A person who becomes interested in a product in the first area as a result of viewing the content will move to the first area shortly after viewing it. Therefore, by setting a measurement period that takes into account the travel time such a person needs to reach the first area, the effect of displaying the content can be measured with respect to the people who arrived within the measurement period, while a person who arrives in the first area after the measurement period can be regarded as having arrived regardless of the displayed content and can be excluded from the measurement.
  • Furthermore, by displaying on the display device 30B content related to the product shelf 10A in area A to a person who has passed through the first area and reached the second area, that person can be encouraged to return to area A.
  • FIG. 9 is a schematic block diagram showing the configuration of the effect measurement system Sa in the second embodiment.
  • The camera 20B is installed on the ceiling or an upper wall surface inside the store and images an area including area B. If a person is present in area B, the camera 20B can image the area including that person.
  • Display device 30A is installed in area A. Here, the display screen of the display device 30A is visible to a person who has come into the area A to view the products displayed on the product shelf 10A.
  • the content providing device 40A stores multiple types of content and transmits the content to the display device 30A.
  • the content includes content related to the store, and is, for example, an advertisement that can appeal to consumers about the products displayed on the product shelf 10B.
  • the information processing device 51 displays the content on at least one of the display device 30A and the display device 30B, and calculates the effect of displaying the content.
  • the information processing device 51 is, for example, a computer.
  • FIG. 10 is a schematic functional block diagram illustrating the functions of the information processing device 51.
  • the communication unit 511 communicates with other devices (camera 20A, camera 20B, display device 30A, display device 30B, content providing device 40A, content providing device 40B) via network N.
  • the storage unit 512 stores area-related data and other various data.
  • the storage unit 512 stores camera identification information, area identification information that identifies an area, and information regarding products displayed in the area in association with each other.
  • area-related data is stored for each of the cameras 20A and 20B.
  • The area selection setting unit 514 sets the selection timing for selecting an area based on the instruction input from the input unit 503.
  • the selection timing may be an area selection time that is a predetermined time.
  • the area selection time may be a timing that occurs at regular intervals. Further, the selection timing may be the timing at which it is detected that the number of people set for each area has fallen below a reference value.
  • The measurement timing setting unit 515 sets the measurement timing for acquiring person data from the selected area after content related to the selected area is displayed on a display device installed in an area different from the selected area.
  • The measurement may be performed once after the content is displayed, or multiple times at different timings.
  • The display control unit 517 selects an area to which people are to be attracted based on the estimation results for each area. Examples of selection methods include (1) to (4) below. (1) Selecting the area with the fewest people: among the multiple areas, the area in which the number of people detected by the person data detection unit 506 is smallest is selected. (2) Referring to the relationship between the number of people in an area and a reference value: the person data detection unit 506 measures the number of visitors while no content is displayed and the number of sales of the products displayed in the area, and the aggregated results obtained by averaging these values for each day of the week or each time period are stored as a reference. The number of people in the area at the time of monitoring is then compared with the reference value for the corresponding day of the week and time period; if the number of people in the area is below the reference value, that area is selected as the attraction target area, which makes it possible to attract people to an area with fewer people. Conversely, if the number of people in the area exceeds the reference value, one of the other areas is selected as the area to which people are to be attracted.
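  • A minimal sketch of selection method (2), falling back to method (1), is shown below; the reference table keyed by (area, day of week, hour) and the data shapes are assumptions.

```python
from datetime import datetime

def select_attraction_area(counts: dict[str, int],
                           reference: dict[tuple[str, str, int], float],
                           now: datetime) -> str:
    """Pick the attraction target area (illustrative form of method (2)).

    counts:    current number of people per area, e.g. {"A": 1, "B": 6}.
    reference: average visitor counts keyed by (area, weekday, hour), built
               from measurements taken while no content was displayed.
    """
    key_time = (now.strftime("%a"), now.hour)
    below_reference = [
        area for area, n in counts.items()
        if n < reference.get((area, *key_time), float("inf"))
    ]
    if below_reference:
        # attract people to the emptiest of the areas that are below reference
        return min(below_reference, key=lambda a: counts[a])
    # otherwise fall back to method (1): the area with the fewest people
    return min(counts, key=counts.get)
```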
  • the display control unit 507 displays content related to the first area on the display surface of a display device installed outside the first area.
  • the first area may be any area within the facility, and is area A in this embodiment.
  • the display device installed outside the first area is the display device 30B.
  • The movement detection unit 508 detects the movement of people to the first area after the content is displayed on the display surface. For example, if a person is included in an image captured by the camera that images the area selected as the attraction target, the movement detection unit 508 detects the movement of that person. The movement detection unit 508 detects, as the first detection result, the movement of people to the first area until a time longer than the average time required for a person to move from the second area to the first area has elapsed. Furthermore, after the content is displayed on the display surface, the movement detection unit 508 detects the movement of people out of the second area as a second detection result.
  • As a case in which it can be estimated that a person has moved from the second area to the first area, the movement detection unit 508 may detect, as the second detection result, the movement of the person from the second area to outside the second area together with the presence of a person in the image captured of the first area.
  • The movement detection unit 508 detects whether a person has moved from the area where the content is displayed to the attraction target area by tracking the movement of the person based on the characteristic data of people detected in the area outside the first area. For example, when feature data obtained from certain imaging data corresponds to feature data obtained from subsequent imaging, and the detected positions lie along the movement route within a distance over which the person can be considered to have moved, the movement detection unit 508 presumes that the detections are of the same person. This allows a person to be tracked from the area where the display device displaying the content is installed to the attraction target area, so that it can be estimated that a person who actually viewed the content has come to the attraction target area.
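  • The same-person matching described above could be sketched as follows; the feature-distance metric, thresholds, and walking-speed bound are assumptions and are not taken from the application.

```python
import math

def same_person(feature_a: list[float], feature_b: list[float],
                pos_a: tuple[float, float], pos_b: tuple[float, float],
                dt_s: float,
                feature_threshold: float = 0.5,
                max_walk_speed_mps: float = 1.5) -> bool:
    """Decide whether two detections are the same person (illustrative criteria).

    Two detections are merged when their feature vectors are close enough and
    the second position is within a distance the person could plausibly have
    walked along the movement route in the elapsed time dt_s.
    """
    feature_distance = math.dist(feature_a, feature_b)   # distance between feature vectors
    travelled = math.dist(pos_a, pos_b)                   # distance between detected positions
    return (feature_distance <= feature_threshold
            and travelled <= max_walk_speed_mps * dt_s)
```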
  • the calculation unit 519 calculates the effect of the content based on the movement of a person from a second area where the display surface is at least visible and which is different from the first area to the first area.
  • The second area may be an area outside the first area and within or around the facility. In this embodiment, a case where the second area is area B will be described. If it is estimated, based on the first detection result and the second detection result, that a person has moved from the second area to the first area, the calculation unit 519 can estimate that the person viewed the content, became interested in the product included in the content, and as a result moved to the first area. When this is estimated, it can be concluded that displaying the content had an effect.
  • FIG. 11 is a flowchart illustrating the operation of the information processing device 51.
  • the camera 20A images the area A and transmits the imaged data to the information processing device 51 via the network N.
  • Camera 20B images area B and transmits the imaged data to information processing device 51 via network N.
  • The display control unit 517 of the information processing device 51 selects arbitrary content from among the contents stored in the content providing device 40A and displays it on the display device 30A, and selects arbitrary content from among the contents stored in the content providing device 40B and displays it on the display device 30B (step S201).
  • content set as default may be selected and displayed.
  • The content providing device 40A transmits the content selected by the information processing device 51 to the display device 30A via the network N, and the content providing device 40B transmits the content selected by the information processing device 51 to the display device 30B via the network N.
  • the display device 30A and the display device 30B each display the content selected by the display control unit 517.
  • The person data detection unit 506 determines whether or not to select an area based on the setting conditions set by the area selection setting unit 514. For example, the person data detection unit 506 determines whether the area selection time has arrived based on whether the first cycle has been reached (step S202). If the first cycle has not been reached (step S202-NO), it is determined that the area selection time has not arrived, and the process returns to step S201. On the other hand, if the first cycle has been reached, the person data detection unit 506 determines that the area selection time has arrived (step S202-YES) and moves the process to step S203.
  • The person data detection unit 506 calculates the characteristic data of the people included in the imaging data for each area and temporarily stores it in the storage unit 512 as reference data for counting the number of people in each area (step S203).
  • the results of estimating the attributes of the people included in that area are acquired and stored as reference data.
  • the person data detection unit 506 estimates the number of people in area A and their attributes, estimates the number of people in area B and their attributes, and stores the estimated results in the storage unit 512. .
  • The display control unit 517 selects an area to which people are to be attracted based on the estimation results for each area (step S204). For example, assume that area A is selected as the area with the fewest people; that is, there are fewer people in area A and more people in area B, so it is decided to attract people from area B to area A.
  • The display control unit 517 selects a product displayed in the selected area based on the area-related data stored in the storage unit 512, and selects content corresponding to the selected product from among the contents stored in the content providing device 40B (step S205). Here, content related to the products displayed on the product shelf 10A in area A is selected.
  • the display control unit 517 displays the selected content on the display device 30B (step S206).
  • the selected content may be displayed on each display device 30 installed outside area A.
  • the display control unit 517 may repeatedly reproduce the content multiple times, or may reproduce arbitrary content different from the selected content (step S207).
  • The person data detection unit 506 determines whether the comparison data acquisition time has been reached (step S208). If it has not been reached (step S208-NO), the process returns to step S207; when it has been reached (step S208-YES), the characteristic data of the people included in the imaging data obtained from the camera that images the area selected in step S204 (here, the camera 20A of area A) is acquired (step S209) and temporarily stored in the storage unit 512 as comparison data. Here, attributes are estimated for each person in the selected area, and the estimation results are stored.
  • The calculation unit 519 calculates the effect based on the reference data and the comparison data (step S210). For example, the effect of displaying the content is calculated by obtaining the increase in the number of people for each attribute between the reference data and the comparison data and counting up these increases.
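  • The attribute-wise increase count of step S210 could be sketched as follows, assuming each person is represented by an attribute tuple; only increases are counted, and the example values are illustrative.

```python
from collections import Counter

def attribute_increase_effect(reference_data: list[tuple[str, str]],
                              comparison_data: list[tuple[str, str]]) -> int:
    """Count how many more people (per attribute) are present after the content.

    Each person is represented as an attribute tuple such as ("male", "40s-50s");
    the representation is an assumption. The increase in the number of people is
    computed per attribute and then summed.
    """
    before = Counter(reference_data)
    after = Counter(comparison_data)
    return sum(max(after[attr] - before[attr], 0) for attr in after)

# Example: two people before (FIG. 6-style data) and five people afterwards
# gives an increase of three.
before = [("male", "40s-50s"), ("female", "40s-50s")]
after = before + [("female", "20s-30s"), ("male", "20s-30s"), ("male", "60s-")]
print(attribute_increase_effect(before, after))  # 3
```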
  • The information processing device 51 determines whether a power OFF instruction has been input (step S211). If a power OFF instruction has been input (step S211-YES), the process ends; if not (step S211-NO), the process returns to step S201.
  • As described above, in this embodiment, content related to products displayed in a selected specific area is displayed on a display device in another area, and the effect is calculated by comparing the attributes of the people in the specific area before the content is displayed with the attributes of the people in the specific area after a certain amount of time has passed.
  • FIG. 12 is a schematic block diagram showing the configuration of the effect measurement system Sb in the case where three cameras are provided in the second embodiment.
  • In the effect measurement system Sb, the first area (area A) and the second area (area B) are separated from each other, and an area (area C) between the first area and the second area is imaged by a third camera (camera 20C) that is different from the first camera and the second camera.
  • The movement detection unit 508 detects the movement of a person from the second area (area B) to the third area (area C) based on the captured images obtained from the camera 20C, and also detects the movement of the person from the third area (area C) to the first area (area A). The calculation unit 519 calculates the effect of the content based on these detection results.
  • a plurality of third cameras may be provided.
  • the imaging areas of the plurality of third cameras may be set so that images can be taken along the movement route between the first area and the second area.
  • the imaging areas of the first camera, the second camera, and the third camera may be different from each other, and may be set so that at least a portion thereof overlaps along the movement route.
  • For example, an area AC, which is an area where the first area (area A) and the third area (area C) overlap, and an area BC, which is an area where the second area (area B) and the third area (area C) overlap, are set along the movement route between the first area (area A) and the second area (area B). In this case, the movement of a person can be tracked without interruption between the imaging areas.
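  • A sketch of checking that a single merged track passed through area B, then area C, then area A; the (area, timestamp) observation format produced after the per-camera detections are merged via the overlap areas BC and AC is an assumption.

```python
def crossed_via_route(observations: list[tuple[str, float]]) -> bool:
    """Check whether one tracked person was seen in area B, then area C, then area A.

    `observations` is a time-ordered list of (area, timestamp) entries for a
    single person track assembled from the three cameras; the format is an
    assumption made only for this illustration.
    """
    order = []
    for area, _ in sorted(observations, key=lambda o: o[1]):
        if not order or order[-1] != area:
            order.append(area)
    # the person moved along the route if B, C, A appear in this sequence
    try:
        b = order.index("B")
        c = order.index("C", b + 1)
        return "A" in order[c + 1:]
    except ValueError:
        return False
```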
  • FIG. 13 is a schematic block diagram showing the configuration of the effect measurement system Sc in the third embodiment.
  • The effect measurement system Sc includes a camera 20D, a display device 30E, a content providing device 40E, and an information processing device 52, which are communicably connected via a network N.
  • Area D is a part of the facility, and is set for a target section within the facility (for example, a store) where it is desired to attract people by displaying content on the display device 30E.
  • The camera 20D is installed on the ceiling or an upper wall surface inside the store and images an area including area D. If a person is present in area D, the camera 20D can image the area including that person.
  • The display device 30E is installed in an area different from area D. Here, the display screen of the display device 30E is visible to a person in an area different from area D.
  • Area E is a second area where the display surface of display device 30E is visible and different from the first area.
  • the content providing device 40E stores multiple types of content and transmits the content to the display device 30E.
  • the information processing device 52 displays the content on the display device 30E and calculates the effect of displaying the content.
  • the information processing device 52 is, for example, a computer.
  • the network N is, for example, a LAN (local area network).
  • the information processing device 52 displays content related to the first area (area D) on the display surface of the display device (30E) installed outside the first area (area D).
  • the information processing device 52 outputs an instruction to the content providing device 40E to display the content on the display device 30E.
  • The content providing device 40E causes the display device 30E to display the content by transmitting the content to the display device 30E.
  • the information processing device 52 displays the content on the display surface of the display device 30E, and then detects movement of the person to the first area.
  • the information processing device 52 calculates the effect of the content based on at least the movement of the person from the second area to the first area.
  • The storage unit 502 and the storage unit 512 are storage media such as an HDD (Hard Disk Drive), a flash memory, an EEPROM (Electrically Erasable Programmable Read-Only Memory), a RAM (Random Access read/write Memory), a ROM (Read-Only Memory), or any combination of these storage media.
  • nonvolatile memory can be used for the storage unit 502 and the storage unit 512.
  • the communication unit 501, input unit 503, area selection setting unit 504, measurement timing setting unit 505, person data detection unit 506, display control unit 507, movement detection unit 508, and calculation unit 509 of the information processing device 50 may be constituted by a processing device such as a CPU (central processing unit) or a dedicated electronic circuit, for example.
  • The communication unit 511, input unit 503, area selection setting unit 514, measurement timing setting unit 515, person data detection unit 506, display control unit 517, movement detection unit 518, and calculation unit 519 of the information processing device 51 may also be constituted by a processing device such as a CPU (central processing unit) or a dedicated electronic circuit, for example.
  • Processing may be performed by recording a program for realizing the functions of the processing units shown in FIG. 1 on a computer-readable recording medium, and causing a computer system to read and execute the program recorded on this recording medium.
  • the "computer system” herein includes hardware such as an OS and peripheral devices.
  • the term "computer system” includes the homepage providing environment (or display environment) if a WWW system is used.
  • the term “computer-readable recording medium” refers to portable media such as flexible disks, magneto-optical disks, ROMs, and CD-ROMs, and storage devices such as hard disks built into computer systems.
  • the term “computer-readable recording medium” includes a medium that retains a program for a certain period of time, such as a volatile memory inside a computer system that is a server or a client.
  • the above-mentioned program may be one for realizing a part of the above-mentioned functions, or may be one that can realize the above-mentioned functions in combination with a program already recorded in the computer system.
  • the above program may be stored in a predetermined server, and the program may be distributed (downloaded, etc.) via a communication line in response to a request from another device.
  • 10A, 10B ... Product shelf, 20A, 20B, 20C, 20D... Camera, 30, 30A, 30B, 30E... Display device, 40A, 40B, 40E... Content providing device, 50, 51, 52... Information processing device, 501, 511... Communication unit, 502, 512... Storage unit, 503... Input unit, 504, 514... Area selection setting unit, 505, 515... Measurement timing setting section, 506... Person data detection section, 507, 517... Display control section, 508, 518... Movement detection section, 509, 519... Calculation section, S, Sa, Sb, Sc ...Effect measurement system

Landscapes

  • Business, Economics & Management (AREA)
  • Strategic Management (AREA)
  • Engineering & Computer Science (AREA)
  • Accounting & Taxation (AREA)
  • Development Economics (AREA)
  • Finance (AREA)
  • Economics (AREA)
  • Game Theory and Decision Science (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Marketing (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

In the present invention, content relating to a first region (D) is displayed in a display screen of a display device (30E) installed outside the first region. After the content has been displayed in the display screen, movement of people into the first region is detected, and the effect of the content is calculated on the basis of the movement of people into the first region from a second region (E) which is different from the first region and in which at least the display screen is visible.

Description

Content effect calculation method and effect calculation system
 The present invention relates to a content effect calculation method and an effect calculation system.
 Signage that displays content is used in stores, public facilities, and the like. Such content includes advertisements, notices, and various types of guidance.
 Here, there is a system that predicts the effect of displaying content and allocates advertisements to advertising spaces based on the predicted effect (for example, Patent Document 1).
Japanese Patent Application Publication No. 2009-245364
 A client who requests that content be displayed has a desire to understand the effect of displaying the content.
 Note that the technique of Patent Document 1 described above predicts the effect of displaying content, but does not measure it.
 An object of the present invention is to provide a content effect calculation method and an effect calculation system that can measure the effect of displaying content.
 The present invention is a content effect calculation method that displays content related to a first area on the display surface of a display device installed outside the first area, detects the movement of people to the first area after the content is displayed on the display surface, and calculates the effect of the content based on the movement of people to the first area from a second area that is different from the first area and from which at least the display surface is visible.
 The present invention is an effect calculation system that includes a display control unit that displays content related to a first area on the display surface of a display device installed outside the first area, a detection unit that detects the movement of people to the first area after the content is displayed on the display surface, and a calculation unit that calculates the effect of the content based on the movement of people to the first area from a second area that is different from the first area and from which at least the display surface is visible.
 According to the present invention, it is possible to measure the effect of displaying content.
この発明の一実施形態による効果測定システムSの構成を示す概略ブロック図である。1 is a schematic block diagram showing the configuration of an effect measurement system S according to an embodiment of the present invention. 情報処理装置50の機能を説明する概略機能ブロック図である。2 is a schematic functional block diagram illustrating the functions of an information processing device 50. FIG. 記憶部502に記憶されるエリア関連データの一例を示す図である。5 is a diagram illustrating an example of area-related data stored in a storage unit 502. FIG. 経過時間とポイントとの関係を説明する図である。It is a figure explaining the relationship between elapsed time and points. 情報処理装置50の動作を説明するフローチャートである。5 is a flowchart illustrating the operation of the information processing device 50. FIG. 第1データの一例を示す図である。FIG. 3 is a diagram showing an example of first data. 第2データの一例を示す図である。It is a figure which shows an example of 2nd data. 新たに得られた第2データの一例を示す図である。It is a figure which shows an example of the 2nd data newly obtained. 第2の実施形態における効果測定システムSaの構成を示す概略ブロック図である。It is a schematic block diagram showing the composition of effect measurement system Sa in a 2nd embodiment. 情報処理装置51の機能を説明する概略機能ブロック図である。5 is a schematic functional block diagram illustrating the functions of an information processing device 51. FIG. 情報処理装置51の動作を説明するフローチャートである。5 is a flowchart illustrating the operation of the information processing device 51. FIG. 第2の実施形態においてカメラを3台設けた場合における効果測定システムSbの構成を示す概略ブロック図である。It is a schematic block diagram showing the composition of effect measurement system Sb when three cameras are provided in a 2nd embodiment. 第3の実施形態における効果測定システムScの構成を示す概略ブロック図である。It is a schematic block diagram showing the composition of effect measurement system Sc in a 3rd embodiment.
 以下、本発明の一実施形態による効果測定システムについて図面を参照して説明する。図1は、この発明の一実施形態による効果測定システムSの構成を示す概略ブロック図である。
 効果測定システムSは、施設内に設けられた表示装置にコンテンツを表示した後において、人物の移動を計測することでコンテンツを表示したことに対する効果を測定する。
 施設は、公共施設、商業施設、駅の構内、イベント会場、ショッピングモール(複合商業施設)、店舗等であり、人が来場しうる場所である。ここでは、施設は、店舗である場合について説明する。
Hereinafter, an effect measurement system according to an embodiment of the present invention will be described with reference to the drawings. FIG. 1 is a schematic block diagram showing the configuration of an effect measurement system S according to an embodiment of the present invention.
The effect measurement system S measures the effect of displaying the content by measuring the movement of a person after the content is displayed on a display device provided within the facility.
Facilities include public facilities, commercial facilities, station premises, event venues, shopping malls (commercial complexes), stores, etc., and are places where people can visit. Here, a case will be described in which the facility is a store.
 効果測定システムSは、カメラ20A、表示装置30B、コンテンツ提供装置40B、情報処理装置50、ネットワークNを介して通信可能に接続される。 The effect measurement system S is communicably connected via a camera 20A, a display device 30B, a content providing device 40B, an information processing device 50, and a network N.
 エリアAは、施設内の一部のエリアであり、エリアBは、施設内であってエリアAとは異なるエリアである。エリアAとエリアBは隣接していてもよいし、離間していてもよい。エリアAは、施設(例えば店舗)内において、表示装置30Bにコンテンツを表示させることで人を誘引したい対象の区画に対して設定される。
 商品棚10Aは、エリアAに設置される。商品棚10Bは、エリアBに設置される。商品棚10Aと商品棚10Bには異なる商品が陳列される。
 カメラ20Aは、店舗内の天井または壁面上方に設置され、エリアAを含む領域を撮像する。カメラ20Aは、エリアAに人が存在する場合には、人を含む領域を撮像することができる。
 表示装置30Bは、エリアBに設置される。ここでは、表示装置30Bの表示画面は、商品棚10Bに陳列された商品を見にエリアB内に来た人物が視認可能である。
 コンテンツ提供装置40Bは、複数種類のコンテンツを記憶しており、表示装置30Bにコンテンツを送信する。コンテンツは、店舗に関連する内容を含むものであり、例えば、商品棚10Aに陳列された商品を消費者に訴求可能な広告である。
Area A is a part of the facility, and area B is a different area from area A within the facility. Area A and area B may be adjacent to each other or may be separated from each other. Area A is set within a facility (for example, a store) for a target section where it is desired to attract people by displaying content on the display device 30B.
Product shelf 10A is installed in area A. Product shelf 10B is installed in area B. Different products are displayed on the product shelf 10A and the product shelf 10B.
Camera 20A is installed above the ceiling or wall inside the store, and images an area including area A. If a person exists in area A, camera 20A can image the area including the person.
Display device 30B is installed in area B. Here, the display screen of the display device 30B is visible to a person who has come into area B to view the products displayed on the product shelf 10B.
Content providing device 40B stores multiple types of content and transmits the content to display device 30B. The content includes content related to the store, and is, for example, an advertisement that can appeal to consumers about the products displayed on the product shelf 10A.
 情報処理装置50は、コンテンツを表示装置30Bに表示させ、コンテンツを表示したことに対する効果を算出する。情報処理装置50は例えばコンピュータである。
 ネットワークNは、例えばLAN(ローカルエリアネットワーク)である。
The information processing device 50 causes the display device 30B to display the content, and calculates the effect of displaying the content. The information processing device 50 is, for example, a computer.
The network N is, for example, a LAN (local area network).
 図2は、情報処理装置50の機能を説明する概略機能ブロック図である。
 通信部501は、ネットワークNを介して他の機器(カメラ20A、表示装置30B、コンテンツ提供装置40B)と通信をする。
 記憶部502は、エリア関連データと他の各種データとを記憶する。
 図3は、記憶部502に記憶されるエリア関連データの一例を示す図である。
 記憶部502は、カメラ20Aを識別するカメラ識別情報と、エリアを識別するエリア識別情報と、エリアに陳列された商品に関する情報とを対応付けて記憶している。
FIG. 2 is a schematic functional block diagram illustrating the functions of the information processing device 50.
The communication unit 501 communicates with other devices (camera 20A, display device 30B, content providing device 40B) via network N.
The storage unit 502 stores area-related data and other various data.
FIG. 3 is a diagram illustrating an example of area-related data stored in the storage unit 502.
The storage unit 502 stores camera identification information for identifying the camera 20A, area identification information for identifying the area, and information regarding products displayed in the area in association with each other.
 図2に戻り、入力部503は、情報処理装置50の外部に接続される入力装置から入力される情報を受け付ける。入力装置は、例えば、キーボード、マウス、タッチパネル等のうち少なくともいずれか1つを用いることができる。
 エリア選定設定部504は、エリアを選定するタイミングを示すルールを設定する。エリアを選定するタイミングは、例えば以下の(a)から(b)がある。
(a)人を移動させる指示
 エリア選定設定部504は、入力部503から入力される、第1の領域に対して人を移動させる要求に基づいて、エリアを選定する選定タイミングを設定する。この場合における選定タイミングは、人を移動させる要求が入力部503から入力された際に直ちにエリア選定をするようなタイミングであってもよい。
(b)エリア選定時間を用いる場合
また、エリア選定設定部504は、予め定められた時間であるエリア選定時間を入力部503から受け付け、このエリア選定時間に基づくタイミングであってもよい。この場合、エリア選定時間が到来すると、エリアを選定させることができる。エリア選定タイミングが到来したことに応じて、人を誘引先のエリアへ誘引するためのコンテンツが表示される。エリア選定時間は、周期的に到来する時間であってもよいし、決められた時刻であってもよい。
(c)エリアにいる人数と基準値との大小関係を参照する場合
 エリアにおける人数と基準値との大小関係に基づいてエリア選定を行うようにしてもよい。例えば、第2の領域にいる人数が基準値以上となった場合に、エリア選定を行うようにしてもよい。この場合、あるエリアに人数が多く集まったことに応じて、そのエリアにいる人にコンテンツを視聴してもらうことで、第1の領域に誘引することができる。
 また、例えば、第1の領域にいる人数が基準値未満となった場合に、エリア選定を行うようにしてもよい。この場合、第1の領域にいる人の人数が基準値よりも少なくなったことに応じて、コンテンツを第1の領域以外に設定された表示装置に表示させることができ、第1の領域の外から第1の領域へ人を誘引することができる。
Returning to FIG. 2, the input unit 503 receives information input from an input device connected to the outside of the information processing device 50. As the input device, for example, at least one of a keyboard, a mouse, a touch panel, etc. can be used.
The area selection setting unit 504 sets rules indicating timing for selecting areas. Timings for selecting an area include, for example, the following (a) to (b).
(a) Instruction to move a person The area selection setting unit 504 sets a selection timing for selecting an area based on a request input from the input unit 503 to move a person to the first area. The selection timing in this case may be such that the area is selected immediately when a request to move a person is input from the input unit 503.
(b) When using area selection time Alternatively, the area selection setting unit 504 may receive an area selection time, which is a predetermined time, from the input unit 503, and the timing may be based on this area selection time. In this case, when the area selection time arrives, an area can be selected. When the area selection timing has arrived, content for attracting people to the destination area is displayed. The area selection time may be a periodic time or may be a predetermined time.
(c) When referring to the magnitude relationship between the number of people in the area and the reference value Area selection may be performed based on the magnitude relationship between the number of people in the area and the reference value. For example, area selection may be performed when the number of people in the second area exceeds a reference value. In this case, in response to a large number of people gathered in a certain area, by having the people in that area watch the content, it is possible to attract them to the first area.
Further, for example, area selection may be performed when the number of people in the first area is less than a reference value. In this case, the content can be displayed on a display device set other than the first area in response to the number of people in the first area becoming smaller than the reference value, and the content can be displayed on a display device set other than the first area. It is possible to attract people from outside to the first area.
The measurement timing setting unit 505 sets the measurement timing (comparison data acquisition time) at which person data is acquired from the selected area after content related to that area has been displayed on a display device installed in a different area. By comparing the person data measured at this measurement timing with the person data obtained before the content was displayed, it is possible to grasp how the people in the selected area changed in response to the content being displayed. This timing can be determined according to the point in time at which one wants to grasp people's movements after the content is displayed.
In this embodiment, a plurality of different timings after the content is displayed are set. For example, timings are selected such that measurement is performed at least once in each of periods T1, T2, T3, and T4, which will be described later.
The person data detection unit 506 calculates characteristic data of a person included in the imaging data. The characteristic data may be a feature amount obtained by quantifying the features of the person, or may be attribute data estimated from those features. The attribute data is, for example, the person's gender, age group, and clothing characteristics (such as clothing color). Hereinafter, the characteristic data is also referred to as person data.
The person data detection unit 506 may perform this estimation every time imaging data is obtained while it continues to receive imaging data from the camera. Alternatively, the person data detection unit 506 may calculate the characteristic data from the imaging data obtained when the measurement timing set by the measurement timing setting unit 505 arrives.
The person data detection unit 506 determines whether the comparison data acquisition time has been reached, and calculates the characteristic data (person data) when it has been reached.
The display control unit 507 displays content related to the first area on the display surface of a display device installed outside the first area. The first area may be any area within the facility, and is area A in this embodiment. The display device installed outside the first area is the display device 30B.
The movement detection unit 508 detects movement of a person to the first area after the content is displayed on the display surface. The movement detection unit 508 detects the movement of a person when the image captured by the camera 20A includes that person.
The movement detection unit 508 may also detect the movement of a person by a method other than the one using images captured by the camera 20A. For example, a terminal device such as a smartphone or mobile phone carried by the person measures its own position. The information processing device 50 may acquire the position information resulting from that measurement, and the movement detection unit 508 may detect the movement of the person based on that position information. The position may be measured using, for example, GNSS (Global Navigation Satellite System).
When acquiring position information from a terminal device carried by a person, the information processing device 50 may acquire it via a wireless communication network available to the terminal device, using application software or the like installed on the terminal device. The wireless communication network may be 4G, 5G, LTE, Wi-Fi (registered trademark), or the like.
In this case, after the content is displayed on the display device 30B, upon receiving the measurement results of the terminal device positions, the movement detection unit 508 identifies, from the position information based on those results, terminal devices located in area B where the display device 30B is installed, and tracks each person based on the position information of that person's terminal device.
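Where terminal-device positioning is used instead of camera images, the check reduces to testing whether a reported position lies inside an area and then following that terminal over time. The following is a minimal sketch under assumed rectangular areas and a hypothetical position-report format (terminal ID plus map coordinates); none of these names or shapes come from the embodiment.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    """Axis-aligned bounding box of an area, in map coordinates (assumed shape)."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

def terminals_in_area(reports, area: Rect):
    """Return IDs of terminals whose latest reported position lies inside `area`.

    reports: iterable of (terminal_id, x, y) tuples, one latest report per terminal (assumed input).
    """
    return [tid for tid, x, y in reports if area.contains(x, y)]

# Example: one terminal is inside area B, one is outside
area_b = Rect(0.0, 0.0, 10.0, 5.0)
reports = [("t1", 3.0, 2.0), ("t2", 12.0, 1.0)]
print(terminals_in_area(reports, area_b))  # ['t1']
```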
After the content is displayed on the display surface, the movement detection unit 508 detects movement of people to the first area, as a first detection result, until a time longer than the average time required for a person to move from the second area to the first area has elapsed. The average time is obtained by averaging the time it takes people to move from the second area to the first area. When people travel the route between the first area and the second area, the time the move takes varies with walking speed; here, the case where the average time is used is described as an example. The movement detection unit 508 detects the movement of people during a measurement target period. The measurement target period runs from the measurement start timing, which is a point in time after the content is displayed on the display surface, to the measurement end timing, which is the point at which a time longer than the average time has elapsed.
The calculation unit 509 calculates the effect of the content based on the movement of people to the first area from a second area from which at least the display surface is visible and which is different from the first area. The second area may be any area outside the first area, within or around the facility. In this embodiment, the second area is area B in the example shown in FIG. 1.
The calculation unit 509 calculates the effect of the content based on the first detection result, that is, the detection result of the movement detection unit 508.
For the calculation unit 509, points are set according to the elapsed time within the measurement target period such that they are lower at the end of the period to be detected as the first detection result than at its beginning. The calculation unit 509 calculates the effect of the content by summing, over all detected people, the point value in force at the time each person's movement was detected.
Here, FIG. 4 is a diagram illustrating the relationship between elapsed time and points. The horizontal axis is time and the vertical axis is points. The measurement target period runs from time t0, the measurement start timing, to time t4, the measurement end timing. Time t0 is an example of a timing after the content is displayed on the display device 30B, and time t4 is an example of a timing at which a time longer than the average time required for the movement has elapsed.
Here, point p1 is set for period T1 from time t0 to time t1, point p2 for period T2 from after time t1 to time t2, point p3 for period T3 from after time t2 to time t3, and point p4 for period T4 from after time t3 to time t4. Of the points p1 to p4, p1 is the largest, and the values are set so that they decrease with the passage of time, p4 being the smallest.
For example, if two people are detected in period T1 and three people in period T3 in the first detection result, the calculation unit 509 obtains the effect of the content as (p1 × 2 + p3 × 3).
Although the figure shows the measurement target period divided into four sections, it may be divided into two or three sections, or into five or more. Furthermore, as long as points decrease with the passage of time, points may be assigned according to the elapsed time itself, without dividing the period into sections.
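The point scheme of FIG. 4 can be expressed directly as a table of period boundaries and point values. The sketch below is illustrative only; the boundary times, the point values p1 to p4, and the detection timestamps are assumptions, chosen so that the decreasing-point rule reproduces the example (p1 × 2 + p3 × 3) in the text.

```python
import bisect

def content_effect(detection_times, boundaries, points):
    """Sum the point value in force at each detected movement.

    detection_times: elapsed seconds (from t0) at which each person's movement was detected
    boundaries:      upper edges of the periods [t1, t2, t3, t4]
    points:          point value for each period [p1, p2, p3, p4], decreasing over time
    """
    total = 0.0
    for t in detection_times:
        idx = bisect.bisect_left(boundaries, t)   # which period T1..T4 the time falls into
        if idx < len(points):                     # detections after t4 fall outside the window
            total += points[idx]
    return total

# Example matching the text: 2 people in T1 and 3 people in T3 -> p1*2 + p3*3
boundaries = [60, 120, 180, 240]        # t1..t4 in seconds after t0 (assumed)
points = [4.0, 3.0, 2.0, 1.0]           # p1 > p2 > p3 > p4 (assumed values)
detections = [10, 50, 130, 150, 170]    # two detections in T1, three in T3
print(content_effect(detections, boundaries, points))  # 4.0*2 + 2.0*3 = 14.0
```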
FIG. 5 is a flowchart illustrating the operation of the information processing device 50.
The camera 20A images area A and transmits the imaging data to the information processing device 50 via the network N.
The display control unit 507 of the information processing device 50 selects arbitrary content from among the content stored in the content providing device 40B and displays it on the display device 30B (step S101). Here, content set as the default may be selected and displayed. The content providing device 40B transmits the content selected by the information processing device 50 to the display device 30B via the network N. The display device 30B thereby displays the content selected by the display control unit 507.
The person data detection unit 506 determines whether to display the target content based on whether the first cycle set by the area selection setting unit 504 has been reached (step S102). If the first cycle has not been reached (step S102-NO), the process returns to step S101; if it has been reached, it is determined that the target content is to be displayed (step S102-YES), and the process proceeds to step S103.
When it is determined that the target content is to be displayed, the display control unit 507 causes the display device 30B to display the target content. The target content here is content related to the area to which people are to be attracted. In this example, content related to the products held on the product shelf installed in the attraction-target area is used as the target content. The display control unit 507 receives the designation of the attraction-target area from the input device and reads the content ID corresponding to the designated area from the storage unit 502. The designation of the area may be accepted when the attraction timing arrives, or may be specified in advance before the attraction timing arrives. Here, when area A is designated as the attraction-target area, the display control unit 507 extracts content related to the products displayed on the product shelf 10A as the target content.
The display control unit 507 displays the extracted target content on the display device 30B (step S103).
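Selecting the target content amounts to looking up, for the attraction-target area, the content associated with the products on that area's shelf. The sketch below is a minimal illustration; the mapping tables and identifiers (area IDs, shelf products, content IDs) are hypothetical stand-ins for the area-related data held in the storage unit.

```python
# Hypothetical area-related data: area -> products on its shelf, product -> content ID
shelf_products = {"A": ["coffee", "tea"], "B": ["snacks"]}
content_ids = {"coffee": "C-001", "tea": "C-002", "snacks": "C-101"}

def target_contents(area: str):
    """Return the content IDs to play on displays outside `area` (the attraction target)."""
    return [content_ids[p] for p in shelf_products.get(area, []) if p in content_ids]

print(target_contents("A"))  # ['C-001', 'C-002'] -> played on display device 30B
```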
When the content is displayed, the person data detection unit 506 acquires the characteristic data of the people included in the imaging data obtained from the camera 20A (step S104) and temporarily stores it in the storage unit 502 as first data.
FIG. 6 is a diagram showing an example of the first data. Two people are detected at this point: one person whose attributes are "male, 40s to 50s" and one person whose attributes are "female, 40s to 50s".
The person data detection unit 506 determines whether the comparison data acquisition time has been reached (step S105). If it has not been reached (step S105-NO), the determination of step S105 is performed again after a fixed wait time has elapsed. If it has been reached (step S105-YES), the person data detection unit 506 acquires the characteristic data of the people included in the imaging data obtained from the camera 20A (step S106) and temporarily stores it in the storage unit 502 as second data.
FIG. 7 is a diagram showing an example of the second data, in this case second data obtained from imaging data captured immediately before time t1.
For example, five people are detected from the second data obtained from the imaging data captured immediately before time t1: one person whose attributes are "male, 20s to 30s", one person whose attributes are "male, 40s to 50s", two people whose attributes are "female, 20s to 30s", and one person whose attributes are "female, 40s to 50s".
Next, the person data detection unit 506 determines whether the measurement target period has ended (step S107). If it has not ended (step S107-NO), the process returns to step S105, and new second data is acquired once the next comparison data acquisition time is reached (step S106); if the measurement target period has ended (step S107-YES), the process proceeds to step S108.
FIG. 8 is a diagram showing an example of newly obtained second data, in this case second data obtained from imaging data captured immediately before time t2.
For example, six people are detected from the second data obtained from the imaging data captured immediately before time t2: two people whose attributes are "male, 20s to 30s", one person whose attributes are "male, 40s to 50s", two people whose attributes are "female, 20s to 30s", and one person whose attributes are "female, 40s to 50s".
Such second data is newly obtained each time the comparison data acquisition time is reached.
When the measurement target period ends, the calculation unit 509 calculates the effect based on the first data and the second data (step S108).
For example, from the first data shown in FIG. 6 and the second data shown in FIG. 7, the increase in the number of people is obtained for each attribute, and the point value at the time each person's movement was detected is summed over the detected people. The calculation unit 509 further obtains such a point total for each piece of second data acquired at each comparison data acquisition time. For example, from the first data shown in FIG. 6 and the second data shown in FIG. 8, the increase in the number of people is likewise obtained for each attribute and the corresponding points are summed over the detected people. The effect of displaying the content is then calculated by taking the sum of the point totals obtained at each comparison data acquisition time.
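Step S108 compares the first data against each piece of second data and credits points for the per-attribute increase. The following is a minimal sketch under assumed data shapes; the attribute labels follow the examples of FIGS. 6 to 8, and the point values reuse the decreasing-point rule of FIG. 4 as an assumption.

```python
from collections import Counter

def effect_from_snapshots(first_data, second_snapshots, points):
    """first_data:       Counter of attribute -> count before the content was shown
    second_snapshots: list of Counters, one per comparison-data acquisition time
                      (ordered T1, T2, ...), of attribute -> count
    points:           point value per period, same length as second_snapshots"""
    total = 0.0
    for snapshot, point in zip(second_snapshots, points):
        increase = snapshot - first_data          # people added per attribute (non-positive entries drop out)
        total += sum(increase.values()) * point
    return total

first = Counter({"male 40-50s": 1, "female 40-50s": 1})                       # FIG. 6
at_t1 = Counter({"male 20-30s": 1, "male 40-50s": 1,
                 "female 20-30s": 2, "female 40-50s": 1})                     # FIG. 7
at_t2 = Counter({"male 20-30s": 2, "male 40-50s": 1,
                 "female 20-30s": 2, "female 40-50s": 1})                     # FIG. 8
print(effect_from_snapshots(first, [at_t1, at_t2], [4.0, 3.0]))  # 3*4.0 + 4*3.0 = 24.0
```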
The information processing device 50 determines whether a power-off instruction has been input (step S109). If a power-off instruction has been input (step S109-YES), the process ends; if not (step S109-NO), the process returns to step S101.
According to the embodiment described above, content related to the first area (for example, content related to the products on the product shelf 10A) is displayed on the display surface of a display device (for example, the display device 30B) installed outside the first area (for example, area A); after the content is displayed on the display surface, movement of people to the first area is detected; and the effect of the content is calculated based on the movement of people to the first area from a second area (for example, area B) from which at least the display surface is visible and which is different from the first area.
As a result, when the content is displayed and movement of a person to the first area is subsequently detected, it can be estimated that a person who viewed the content moved to the first area as an action related to the content. By detecting the movement of people in this way, the effect of displaying the content can be measured.
Further, according to the embodiment described above, after the content is displayed on the display surface, movement of people to the first area is detected as the first detection result until a time longer than the average time required for a person to move from the second area to the first area has elapsed. A person who becomes interested in a product in the first area as a result of viewing the content can be expected to move to the first area shortly after viewing it. By setting a measurement target period that takes into account the travel time needed for such a person to reach the first area, the effect of displaying the content can be measured for the people who arrive within that period, while people who arrive in the first area after the measurement target period can be regarded as having arrived regardless of the content being displayed and excluded from the measurement.
In the embodiment described above, a route along which people walk around the facility may also be assumed. For example, in a store such as a supermarket, the store manager may assume an order in which the product shelves are viewed. In such a case, in a travel route along which people are assumed to move from upstream to downstream, the first area may be set on the upstream side and the second area on the downstream side. Then, by displaying content related to the product shelf 10A in area A on the display device 30B, a person who has passed through the first area and reached the second area can be prompted to return to area A. In this case, if no new people have entered the store and the number of people in area A increases, it can be presumed that people have returned from area B to area A, and hence that displaying the content on the display device 30B attracted people to area A.
Next, a second embodiment will be described.
FIG. 9 is a schematic block diagram showing the configuration of an effect measurement system Sa according to the second embodiment. In the figure, parts corresponding to those in FIG. 1 are given the same reference numerals, and their description is omitted.
The camera 20B is installed on the ceiling or high on a wall inside the store and images an area including area B. When a person is present in area B, the camera 20B can capture an image of the region including that person.
The display device 30A is installed in area A. Here, the display screen of the display device 30A is visible to a person who has come into area A to look at the products displayed on the product shelf 10A.
The content providing device 40A stores multiple types of content and transmits content to the display device 30A. The content includes material related to the store, for example an advertisement that can appeal to consumers about the products displayed on the product shelf 10B.
The information processing device 51 displays the content on at least one of the display device 30A and the display device 30B, and calculates the effect of displaying the content. The information processing device 51 is, for example, a computer.
FIG. 10 is a schematic functional block diagram illustrating the functions of the information processing device 51. In this figure, parts corresponding to those of the information processing device 50 in FIG. 2 are denoted by the same reference numerals, and their explanation is omitted.
The communication unit 511 communicates with other devices (camera 20A, camera 20B, display device 30A, display device 30B, content providing device 40A, content providing device 40B) via network N.
The storage unit 512 stores area-related data and other various data. For example, the storage unit 512 stores camera identification information, area identification information that identifies an area, and information regarding products displayed in the area in association with each other. Here, area-related data is stored for each of the cameras 20A and 20B.
The area selection setting unit 514 sets the timing for selecting an area based on an instruction input from the input unit 503. The selection timing may be an area selection time, which is a predetermined time, and the area selection time may recur at fixed intervals. The selection timing may also be the timing at which the number of people in an area is detected to have fallen below the reference value set for that area.
The measurement timing setting unit 515 sets the measurement timing at which person data is acquired from the selected area after content related to that area has been displayed on a display device installed in a different area. In this embodiment, the measurement may be performed once after the content is displayed, or multiple times at different timings.
The display control unit 517 selects the area to which people are to be attracted based on the estimation results for each area. Selection methods include, for example, the following (1) to (4); a minimal sketch of (1) and (2) is given after the list.
(1) Selecting the area with the fewest people: among the multiple areas, the area in which the number of people detected by the person data detection unit 506 is smallest is selected.
(2) Comparing the number of people in an area with a reference value: the number of visitors when no content is displayed and the number of sales of the products displayed in each area are measured based on the detection results of the person data detection unit 506, and aggregated results obtained by averaging them for each day of the week or each time period are stored as reference values. The number of people in an area at the time of monitoring is then compared with the reference value for the relevant day of the week or time period, and if the number of people in the area falls below the reference value, that area is selected as the attraction target. In this case, people can be attracted to an area with few people.
Alternatively, the number of people in an area at the time of monitoring may be compared with the reference value for the relevant day of the week or time period, and if the number of people in the area exceeds the reference value, one of the other areas is selected as the attraction target. In this case, people can be drawn from a crowded area to another area, so that visitors are dispersed rather than concentrated in a particular part of the store.
(3) Random selection from among the areas below the reference value: when several areas are below the reference value in (2) above, an area is selected at random from among them at fixed intervals.
(4) Random selection: unlike (2) and (3) above, an area is selected at random regardless of the number of people in each area.
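The following is a minimal sketch of selection methods (1) and (2); the per-area counts and reference values are illustrative assumptions, and the random variants (3) and (4) are omitted for brevity.

```python
def select_area_fewest(counts):
    """Method (1): pick the area with the fewest detected people."""
    return min(counts, key=counts.get)

def select_area_by_reference(counts, reference):
    """Method (2): pick an area whose head count is below its reference value,
    preferring the one furthest below; return None if every area meets its reference."""
    below = {a: counts[a] - reference[a] for a in counts if counts[a] < reference[a]}
    return min(below, key=below.get) if below else None

counts = {"A": 2, "B": 9}          # people currently detected per area (assumed)
reference = {"A": 5, "B": 6}       # averaged baseline per day of week / time slot (assumed)
print(select_area_fewest(counts))                   # 'A'
print(select_area_by_reference(counts, reference))  # 'A' (2 people is 3 below its reference)
```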
The display control unit 507 displays content related to the first area on the display surface of a display device installed outside the first area. The first area may be any area within the facility, and is area A in this embodiment. The display device installed outside the first area is the display device 30B.
The movement detection unit 508 detects movement of a person to the first area after the content is displayed on the display surface. For example, the movement detection unit 508 detects the movement of a person when that person is included in an image captured by the camera that images the area selected as the attraction target.
The movement detection unit 508 detects movement of people to the first area, as the first detection result, until a time longer than the average time required for a person to move from the second area to the first area has elapsed. In addition, after the content is displayed on the display surface, the movement detection unit 508 detects movement of people out of the second area as a second detection result.
Alternatively, as a case in which it can be estimated that a person has moved from the second area to the first area, the movement detection unit 508 may record the second detection result when the movement of a person from the second area to outside the second area is detected and a person is included in an image in which the first area is captured.
The movement detection unit 508 detects whether a person has moved from the area in which the content was displayed to the attraction-target area by tracking the person's movement based on the characteristic data of people detected in areas outside the first area. For example, when characteristic data obtained from one set of imaging data corresponds to characteristic data obtained from later imaging data, and the two positions are within a distance that can be regarded as movement along the movement route, the movement detection unit 508 presumes that they belong to the same person. By tracking people in this way between the area where the display device showing the content is installed and the attraction-target area, it can be estimated that a person who actually viewed the content has come to the attraction-target area.
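The tracking described above rests on two checks per pair of detections: the feature data must match closely enough, and the positions must be within a plausible step along the movement route. The following is a minimal sketch with assumed feature vectors and thresholds; a real system would use a proper re-identification model rather than a plain distance.

```python
import math

def same_person(feat_a, pos_a, feat_b, pos_b,
                feature_threshold=0.5, max_step=2.0):
    """Heuristically decide whether two detections belong to the same person.

    feat_a/feat_b: feature vectors of the detections (assumed, e.g. appearance embeddings)
    pos_a/pos_b:   (x, y) floor positions of the detections
    """
    feature_dist = math.dist(feat_a, feat_b)        # appearance similarity
    step = math.dist(pos_a, pos_b)                  # spatial plausibility of the move
    return feature_dist <= feature_threshold and step <= max_step

# Two detections a short distance apart with similar features -> presumed same person
print(same_person([0.1, 0.9], (1.0, 1.0), [0.15, 0.88], (2.2, 1.1)))  # True
```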
The calculation unit 519 calculates the effect of the content based on the movement of people to the first area from a second area from which at least the display surface is visible and which is different from the first area. The second area may be any area outside the first area, within or around the facility. In this embodiment, the case where the second area is area B is described.
When it is estimated from the first detection result and the second detection result that a person has moved from the second area to the first area, the calculation unit 519 can presume that the person viewed the content, became interested in the products included in the content, and as a result moved to the first area. When this is presumed, it can be estimated that displaying the content had an effect.
FIG. 11 is a flowchart illustrating the operation of the information processing device 51.
The camera 20A images area A and transmits the imaging data to the information processing device 51 via the network N. The camera 20B images area B and likewise transmits the imaging data to the information processing device 51 via the network N.
The display control unit 517 of the information processing device 51 selects arbitrary content from among the content stored in the content providing device 40A and displays it on the display device 30A, and selects arbitrary content from among the content stored in the content providing device 40B and displays it on the display device 30B (step S201). Here, content set as the default may be selected and displayed. The content providing device 40A transmits the content selected by the information processing device 51 to the display device 30A via the network N, and the content providing device 40B transmits the content selected by the information processing device 51 to the display device 30B via the network N. The display devices 30A and 30B thereby each display the content selected by the display control unit 517.
The person data detection unit 506 determines whether to select an area based on the setting conditions set by the area selection setting unit 504. For example, the person data detection unit 506 determines whether the area selection time has arrived based on whether the first cycle has been reached (step S202). If the first cycle has not been reached (step S202-NO), it determines that the area selection time has not arrived and the process returns to step S201. If the first cycle has been reached, it determines that the area selection time has arrived (step S202-YES), and the process proceeds to step S203.
When it is determined that the area selection time has arrived, the person data detection unit 506 calculates the characteristic data of the people included in the imaging data for each area and temporarily stores, in the storage unit 512, reference data in which the number of people is counted for each area (step S203). Here, for each area, the results of estimating the attributes of the people in that area are acquired and stored as the reference data. For example, the person data detection unit 506 estimates the number of people in area A and their attributes, estimates the number of people in area B and their attributes, and stores the estimation results in the storage unit 512.
The display control unit 517 selects the area to which people are to be attracted based on the estimation results for each area (step S204). Here, for example, assume that area A is found to have the fewest people; in other words, there are few people in area A and many in area B. It is therefore decided to attract people from area B to area A.
The display control unit 517 selects the products displayed in the selected area based on the area-related data stored in the storage unit 512, and selects, from the content stored in the content providing device 40B, the content corresponding to the selected products (step S205). Here, content related to the products displayed on the product shelf 10A in area A is selected.
The display control unit 517 then causes the display device 30B to display the selected content (step S206). If there are three or more display devices 30, the selected content may be displayed on each display device 30 installed outside area A.
When playback of the selected content ends, the display control unit 517 may play it back repeatedly, or may play back arbitrary content different from the selected content (step S207).
The person data detection unit 506 determines whether the comparison data acquisition time has been reached (step S208). If it has not been reached (step S208-NO), the process returns to step S207. If it has been reached (step S208-YES), the person data detection unit 506 acquires the characteristic data of the people included in the imaging data obtained from the camera that images the area selected in step S204 (here, the camera 20A of area A) (step S209) and temporarily stores it in the storage unit 502 as comparison data. Here, the attributes of each person in the selected area are estimated, and the estimation results are stored.
When the estimation results have been stored, the calculation unit 509 calculates the effect based on the reference data and the comparison data (step S210). For example, the increase in the number of people is obtained for each attribute between the reference data and the comparison data, and counting that increase gives a count result that expresses the effect of displaying the content.
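Step S210 reduces to counting, per attribute, how many more people are present in the comparison data than in the reference data. The following is a minimal sketch; the attribute labels and counts are illustrative assumptions only.

```python
from collections import Counter

reference = Counter({"male 20-30s": 1, "female 40-50s": 2})    # before the content is shown (assumed)
comparison = Counter({"male 20-30s": 3, "female 40-50s": 2,
                      "female 20-30s": 1})                      # after the content is shown (assumed)

increase = comparison - reference   # per-attribute increase; non-positive entries drop out
print(dict(increase))               # {'male 20-30s': 2, 'female 20-30s': 1}
print(sum(increase.values()))       # 3 people attributed to the displayed content
```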
The information processing device 51 determines whether a power-off instruction has been input (step S211). If a power-off instruction has been input (step S211-YES), the process ends; if not (step S211-NO), the process returns to step S201.
According to the embodiment described above, content related to the products displayed in a selected specific area is displayed on display devices in other areas, and the attributes of the people in the specific area before the content is displayed are compared with the attributes of the people in the specific area after a certain time has passed. Based on the difference in attributes before and after the content is displayed, it is thus possible to estimate the distribution of attributes of the people who became interested in the products related to the content as a result of viewing it.
In the second embodiment described above, the case of two cameras has been explained, but three or more cameras may be used.
FIG. 12 is a schematic block diagram showing the configuration of an effect measurement system Sb in which three cameras are provided in the second embodiment.
In the effect measurement system Sb, the first area (area A) and the second area (area B) are separated from each other, and the area between them (area C) is imaged by a third camera (camera 20C) different from the first and second cameras. The movement detection unit 508 detects, based on the captured images obtained from the camera 20C, the movement of a person from the second area (area B) to the third area (area C), and then detects, for the person whose movement was detected, the movement of that person from the third area (area C) to the first area (area A).
When movement of a person from the second area to the third area is detected, and movement of that person from the third area to the first area is then detected, the calculation unit 519 calculates the effect of the content based on this detection result.
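With the third camera covering the in-between area C, a person only counts toward the effect when the same track is seen leaving B, crossing C, and arriving in A, in that order. The following is a minimal sketch over assumed per-camera observation logs keyed by a track ID; how identity is carried between cameras (the track ID) is an assumption, not something specified by the embodiment.

```python
def moved_b_to_a_via_c(observations):
    """observations: list of (timestamp, area, track_id) tuples merged from cameras 20A/20B/20C.

    Returns the set of track IDs seen in area B, then area C, then area A, in time order."""
    seen_b, seen_c, arrived_a = {}, {}, set()
    for ts, area, track in sorted(observations):
        if area == "B":
            seen_b.setdefault(track, ts)
        elif area == "C" and track in seen_b and ts > seen_b[track]:
            seen_c.setdefault(track, ts)
        elif area == "A" and track in seen_c and ts > seen_c[track]:
            arrived_a.add(track)
    return arrived_a

obs = [(0, "B", "p1"), (30, "C", "p1"), (70, "A", "p1"),   # p1: B -> C -> A, counts
       (5, "B", "p2"), (40, "A", "p2")]                    # p2: never seen in C, does not count
print(moved_b_to_a_via_c(obs))  # {'p1'}
```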
In this way, by providing a camera that images the route along which people can move between the first area and the second area, a person can be tracked continuously between area A and area B even when the area in which the display device showing the content is installed and the attraction-target area are separated, because the third camera captures the area between them. This improves the accuracy with which the effect of displaying the content is measured.
Here, the case of a single third camera has been described, but when the movement route between the first area and the second area does not fit within the area that one camera can capture, a plurality of third cameras may be provided. In that case, the imaging areas of the plurality of third cameras are set so that images can be captured along the movement route between the first area and the second area.
The imaging areas of the first, second, and third cameras are different from one another and may be set so that they overlap at least partially along the movement route. In FIG. 12, a region AC, where the first area (area A) and the third area (area C) overlap, and a region BC, where the second area (area B) and the third area (area C) overlap, are set along the movement route between the first area (area A) and the second area (area B).
In this case, the movement of a person can be tracked without interruption across the imaging areas.
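Whether tracking can be handed over without gaps can be checked by sampling the movement route and confirming that every sample point lies inside at least one camera's imaging area. The following is a minimal sketch with assumed rectangular imaging areas and a straight-line route between the two areas.

```python
def covered_without_gaps(route, camera_areas, samples=50):
    """route:        ((x0, y0), (x1, y1)) straight path between area A and area B (assumed)
    camera_areas: list of (x_min, y_min, x_max, y_max) imaging rectangles"""
    (x0, y0), (x1, y1) = route
    for i in range(samples + 1):
        t = i / samples
        x, y = x0 + t * (x1 - x0), y0 + t * (y1 - y0)
        if not any(xa <= x <= xb and ya <= y <= yb for xa, ya, xb, yb in camera_areas):
            return False
    return True

# Imaging areas for A, C, B overlap pairwise (regions AC and BC), so the path is covered
areas = [(0, 0, 4, 2), (3, 0, 8, 2), (7, 0, 11, 2)]
print(covered_without_gaps(((1, 1), (10, 1)), areas))  # True
```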
FIG. 13 is a schematic block diagram showing the configuration of an effect measurement system Sc according to a third embodiment.
The effect measurement system Sc includes a camera 20D, a display device 30E, a content providing device 40E, and an information processing device 52, which are communicably connected via a network N.
Area D is a part of the facility and is set for the section of the facility (for example, a store) to which people are to be attracted by displaying content on the display device 30E.
The camera 20D is installed on the ceiling or high on a wall inside the store and images an area including area D. When a person is present in area D, the camera 20D can capture an image of the region including that person.
The display device 30E is installed in an area different from area D. Here, the display screen of the display device 30E is visible to people in an area different from area D.
Area E is a second area from which the display surface of the display device 30E is visible and which is different from the first area.
The content providing device 40E stores multiple types of content and transmits content to the display device 30E.
The information processing device 52 displays the content on the display device 30E and calculates the effect of displaying the content. The information processing device 52 is, for example, a computer.
The network N is, for example, a LAN (local area network).
The information processing device 52 displays content related to the first area (area D) on the display surface of the display device (30E) installed outside the first area (area D). Here, the information processing device 52 outputs to the content providing device 40E an instruction to display the content on the display device 30E. Upon receiving this instruction, the content providing device 40E transmits the content to the display device 30E, causing the display device 30E to display it.
After displaying the content on the display surface of the display device 30E, the information processing device 52 then detects movement of people to the first area, and calculates the effect of the content based on at least the movement of people from the second area to the first area.
In the embodiments described above, the storage units 502 and 512 are each configured by a storage medium, for example an HDD (Hard Disk Drive), a flash memory, an EEPROM (Electrically Erasable Programmable Read Only Memory), a RAM (Random Access read/write Memory), a ROM (Read Only Memory), or any combination of these storage media. A nonvolatile memory, for example, can be used for the storage units 502 and 512.
In the embodiments described above, the communication unit 501, the input unit 503, the area selection setting unit 504, the measurement timing setting unit 505, the person data detection unit 506, the display control unit 507, the movement detection unit 508, and the calculation unit 509 of the information processing device 50 may each be configured by a processing device such as a CPU (central processing unit) or by a dedicated electronic circuit.
Likewise, the communication unit 511, the input unit 503, the area selection setting unit 514, the measurement timing setting unit 515, the person data detection unit 506, the display control unit 517, the movement detection unit 518, and the calculation unit 519 of the information processing device 51 may each be configured by a processing device such as a CPU (central processing unit) or by a dedicated electronic circuit.
The processing described above may also be performed by recording a program for realizing the functions of the processing units in FIG. 1 on a computer-readable recording medium and causing a computer system to read and execute the program recorded on that medium. The term "computer system" here includes an OS and hardware such as peripheral devices. If a WWW system is used, the "computer system" also includes a homepage providing environment (or display environment).
The term "computer-readable recording medium" refers to a portable medium such as a flexible disk, a magneto-optical disk, a ROM, or a CD-ROM, or a storage device such as a hard disk built into a computer system. Furthermore, "computer-readable recording medium" includes a medium that holds a program for a certain period of time, such as a volatile memory inside a computer system serving as a server or client. The program may realize only some of the functions described above, or may realize them in combination with a program already recorded in the computer system. The program may also be stored on a predetermined server and distributed (downloaded or the like) via a communication line in response to a request from another device.
Although embodiments of the present invention have been described above in detail with reference to the drawings, the specific configuration is not limited to these embodiments, and design changes and the like within a range not departing from the gist of the present invention are also included.
10A, 10B: product shelf; 20A, 20B, 20C, 20D: camera; 30, 30A, 30B, 30E: display device; 40A, 40B, 40E: content providing device; 50, 51, 52: information processing device; 501, 511: communication unit; 502, 512: storage unit; 503: input unit; 504, 514: area selection setting unit; 505, 515: measurement timing setting unit; 506: person data detection unit; 507, 517: display control unit; 508, 518: movement detection unit; 509, 519: calculation unit; S, Sa, Sb, Sc: effect measurement system

Claims (11)

1.  A content effect calculation method comprising: displaying content related to a first area on a display surface of a display device installed outside the first area; after displaying the content on the display surface, detecting movement of a person to the first area; and calculating an effect of the content based on movement of a person to the first area from a second area from which at least the display surface is visible and which is different from the first area.
2.  The content effect calculation method according to claim 1, wherein, after the content is displayed on the display surface, movement of a person to the first area is detected as a first detection result until a time longer than an average time required for a person to move from the second area to the first area has elapsed, and the effect of the content is calculated based on the first detection result.
3.  The content effect calculation method according to claim 2, wherein points that are lower at the end of the period to be detected as the first detection result than at its beginning are set according to the elapsed time in the measurement target period, and the effect of the content is calculated by summing, over the detected persons, the point at the time when the movement of each person was detected.
4.  The content effect calculation method according to any one of claims 1 to 3, wherein, when a person is included in a captured image obtained from a camera that images the first area, the movement of that person is detected.
5.  The content effect calculation method according to any one of claims 2 to 4, wherein, after the content is displayed on the display surface, movement of a person from the second area to outside the second area is detected as a second detection result, and the effect of the content is further calculated based on the second detection result.
6.  The content effect calculation method according to claim 5, wherein the second detection result is detected when a person is included in an image in which the first area is captured.
7.  The content effect calculation method according to claim 5 or 6, wherein the content is displayed when the number of people included in a captured image obtained from a camera that images the second area becomes equal to or greater than a reference value.
8.  The content effect calculation method according to any one of claims 4 to 7, wherein the first area and the second area are separated from each other, and the effect of the content is calculated based on detecting movement of a person from the second area to a third area based on a captured image obtained from a camera that images an area between the first area and the second area, and detecting, for the person whose movement was detected, movement of that person from the third area to the first area.
  9.  The content effect calculation method according to any one of claims 1 to 8, wherein, on a movement route along which people are assumed to move from upstream to downstream, the first area is set on the upstream side and the second area is set on the downstream side.
  10.  The content effect calculation method according to any one of claims 1 to 8, wherein the content is displayed when a request to move people to the first area is received.
  11.  An effect calculation system comprising:
     a display control unit that displays content related to a first area on a display surface of a display device installed outside the first area;
     a detection unit that detects movement of a person to the first area after the content is displayed on the display surface; and
     a calculation unit that calculates the effect of the content based on movement of a person to the first area from a second area which is different from the first area and from which at least the display surface is visible.
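
To make the division of labour in claim 11 concrete, here is a structural sketch with one class per unit. The class names mirror the claim language, but every method name and the trivial effect measure are assumptions added for illustration, not an interface defined by this publication.

    # Rough structural sketch of the system in claim 11.
    class DisplayControlUnit:
        def __init__(self, display):
            self.display = display
            self.display_time = None

        def show(self, content, now):
            self.display.append(content)
            self.display_time = now          # remember when the content went up

    class DetectionUnit:
        """Records movements of people into the first area after display."""
        def __init__(self):
            self.detections = []             # (person_id, time) pairs

        def on_person_entered_first_area(self, person_id, now, display_time):
            if display_time is not None and now >= display_time:
                self.detections.append((person_id, now))

    class CalculationUnit:
        def effect(self, detections):
            return len(detections)           # simplest possible measure

    if __name__ == "__main__":
        screen = []
        display_ctrl = DisplayControlUnit(screen)
        detector = DetectionUnit()
        calculator = CalculationUnit()

        display_ctrl.show("Guide to the first area", now=0.0)
        detector.on_person_entered_first_area("p1", 45.0, display_ctrl.display_time)
        detector.on_person_entered_first_area("p2", 80.0, display_ctrl.display_time)
        print(calculator.effect(detector.detections))   # -> 2
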
PCT/JP2022/019283 2022-04-28 2022-04-28 Content effect calculation method and effect calculation system WO2023209951A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/019283 WO2023209951A1 (en) 2022-04-28 2022-04-28 Content effect calculation method and effect calculation system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/019283 WO2023209951A1 (en) 2022-04-28 2022-04-28 Content effect calculation method and effect calculation system

Publications (1)

Publication Number Publication Date
WO2023209951A1 true WO2023209951A1 (en) 2023-11-02

Family

ID=88518151

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/019283 WO2023209951A1 (en) 2022-04-28 2022-04-28 Content effect calculation method and effect calculation system

Country Status (1)

Country Link
WO (1) WO2023209951A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005251170A (en) * 2004-01-23 2005-09-15 Sony United Kingdom Ltd Display
JP2008102176A (en) * 2006-10-17 2008-05-01 Mitsubishi Electric Corp Action counting system
JP2012208854A (en) * 2011-03-30 2012-10-25 Nippon Telegraph & Telephone East Corp Action history management system and action history management method
JP2012234464A (en) * 2011-05-09 2012-11-29 Nec Software Kyushu Ltd Information processing apparatus, information processing method, information processing system and information processing program
JP2019046318A * 2017-09-05 2019-03-22 Toyota Motor Corporation Information processing apparatus, information processing system, information processing method, program, and recording medium
JP2019125309A * 2018-01-19 2019-07-25 Yahoo Japan Corporation Sales assist device, sales assist method and program
JP2019159468A * 2018-03-08 2019-09-19 Dai Nippon Printing Co., Ltd. Advertisement display system, display device, advertisement output device, program and advertisement display method

Similar Documents

Publication Publication Date Title
US10902441B2 (en) Techniques for automatic real-time calculation of user wait times
US10922632B2 (en) People flow prediction device
JP4794453B2 (en) Method and system for managing an interactive video display system
US10867295B2 (en) Queuing system
JP6120404B2 (en) Mobile body behavior analysis / prediction device
CN108734502B (en) Data statistics method and system based on user position
US20140189096A1 (en) Detecting relative crowd density via client devices
WO2017030177A1 (en) Exhibition device, display control device and exhibition system
US9589189B2 (en) Device for mapping physical world with virtual information
US20120310737A1 (en) Method for providing advertisement, computer-readable medium including program for performing the method and advertisement providing system
JP4676160B2 (en) Information notification method and information notification system
Hwang et al. Detecting stop episodes from GPS trajectories with gaps
US20120190382A1 (en) System And Method For Tracking A Mobile Node
WO2010053192A1 (en) Behavioral analysis device, behavioral analysis method, and recording medium
JP2023029568A (en) Generation device, generation method, and generation program
JP7272522B2 (en) Data analysis device, data analysis system, data analysis method and data analysis program
CN108734501A (en) A kind of mobile position platform
JP7149063B2 (en) Monitoring system
WO2023209951A1 (en) Content effect calculation method and effect calculation system
US20230351832A1 (en) Methods of estimating a throughput of a resource, a length of a queue associated with the resource and/or a wait time of the queue
CN105376714B (en) A kind of localization method and device
JP7179220B2 (en) Information provision system and information provision method
JP2020071605A (en) Data analysis device, data analysis system, data analysis method, and program
JP2010257156A (en) Device, method and program for planning of advertisement delivery
JP2023032032A (en) Recommendation control system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22940225

Country of ref document: EP

Kind code of ref document: A1