JP4932161B2 - Viewer information measuring device - Google Patents

Viewer information measuring device

Info

Publication number
JP4932161B2
JP4932161B2 (application JP2005008031A)
Authority
JP
Japan
Prior art keywords
viewer
display
face
means
data
Prior art date
Legal status
Active
Application number
JP2005008031A
Other languages
Japanese (ja)
Other versions
JP2006197373A (en)
Inventor
泰範 椿
満將 櫻井
田中  敦
徹 石井
能広 芦崎
仁志 藤本
靖子 鈴木
Original Assignee
三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority date
Filing date
Publication date
Application filed by 三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority to JP2005008031A
Publication of JP2006197373A
Application granted
Publication of JP4932161B2
Legal status: Active
Anticipated expiration


Description

  The present invention relates to audience-rating measurement, that is, measuring information about people who view content such as programs and advertisements, and in particular to a viewer information measuring apparatus that measures information about people who view a large-screen display device installed in a public space or the like.

  Traditionally, the audience rating has been measured as an indicator of how many people in each district watched each TV program. The audience rating is an important value for the sales activities of each broadcasting company because it serves as a criterion for estimating advertising effectiveness and determining advertising fees. It is measured by, for example, a diary-style questionnaire survey or a dedicated device installed in a surveyed household, and there are household audience ratings, surveyed per household, and individual audience ratings, surveyed per person. The individual audience rating is collected by gender and age, for example by having each viewer in the household press a button connected to a dedicated machine, and is used for organizing and producing programs. Furthermore, there is a method, as in Patent Document 1, in which the number of viewers present in front of the television is collected automatically by camera-based face recognition.

JP 2004-64368 A

  Similarly, for a large-screen display device installed in a public space or the like, viewer information is important because it can serve as an index for determining the advertising effect, the advertising fee, the installation value of the display device, its operating cost, and so on. However, such a large-screen display device usually shows content on a predetermined schedule to a wide-ranging, unspecified audience rather than letting each viewer select content, so the audience-rating measurement methods used for conventional home television broadcasting cannot be applied. In addition, data collection using questionnaires or buttons, which bothers viewers, cannot continuously collect data from such a wide, unspecified audience.

  The present invention has been made to solve the above problems, and its object is to obtain a viewer information measuring apparatus that measures information about the viewers of a display installed in a public place or the like and viewed by a wide-ranging, unspecified audience, without bothering those viewers.

A viewer information measuring apparatus according to the present invention includes: display control means for controlling the content displayed on a multi-display that displays a plurality of contents simultaneously; a plurality of imaging means, each arranged to have a different field of view, each of which images the scene facing the multi-display screen; viewer data generating means for extracting, from the scenes captured by the plurality of imaging means, the viewers viewing the multi-display and generating viewer data; and viewer measuring means for determining, based on the viewer data generated by the viewer data generating means, whether viewers are the same person, counting the number of viewers without duplication, and measuring the number of viewers viewing each content simultaneously displayed on the multi-display.

According to the present invention, the number of viewers of each content simultaneously displayed on a multi-display that is installed in a public place or the like and viewed by a wide-ranging, unspecified audience can be measured without bothering the viewers.
In addition, since the scene facing the multi-display screen is photographed with a plurality of imaging means, viewers can be captured over a wide angle and a wide range.
Furthermore, because viewers are then photographed from multiple directions, the accuracy of calculating face orientation improves, and because the same person is not extracted twice, the same person is prevented from being counted multiple times.

Embodiment 1.
FIG. 1 is a block diagram showing the configuration of a viewer information measuring apparatus according to Embodiment 1 of the present invention. As shown in the figure, the viewer information measuring device includes a display control device (display control means) 101, a display 102, a camera (imaging means) 103, a viewer data generation unit (viewer data generation means) 104, and a viewer measurement unit (viewer measuring means) 105.

  The display control apparatus 101 holds the content data to be displayed on the display 102 and controls the display. The camera 103 is installed so as to capture the scene facing the display 102, and generates imaging data from that scene. The viewer data generation unit 104 receives the imaging data from the camera 103, extracts the person images contained in it, and generates viewer data. For example, background difference or frame difference is used to separate person regions from the video: in background difference, a person is detected by comparison with a background image captured in advance, while in frame difference, a moving object is detected from the difference with another frame. The viewer measurement unit 105 calculates the number of viewers of the displayed content based on the viewer data generated by the viewer data generation unit 104 and on display content information, held by the display control apparatus 101, indicating which content is displayed at a given time.
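The two person-separation methods named above can be sketched as follows. This is an illustrative toy implementation, not the patent's actual processing; the threshold value and the 8x8 grayscale frames are assumptions for the example only.

```python
import numpy as np

def background_difference(frame, background, threshold=30):
    """Mask of pixels that differ from a background image captured in advance."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return diff > threshold

def frame_difference(frame, previous_frame, threshold=30):
    """Mask of pixels that changed since another frame (moving objects)."""
    diff = np.abs(frame.astype(np.int16) - previous_frame.astype(np.int16))
    return diff > threshold

# Toy 8x8 grayscale example: a bright patch standing in for a person
# appears in an otherwise empty scene.
background = np.zeros((8, 8), dtype=np.uint8)
frame = background.copy()
frame[2:5, 3:6] = 200

mask = background_difference(frame, background)
print(mask.sum())   # number of foreground pixels -> 9
```

A real system would additionally group the foreground pixels into connected regions before treating each region as a candidate viewer.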

Next, the operation will be described.
The display control apparatus 101 outputs and displays content on the display 102. The camera 103 captures a scene facing the display 102 and supplies the captured data to the viewer data generation unit 104. The viewer data generation unit 104 extracts a region of a person who seems to be a viewer from the received imaging data, generates viewer data for recognizing the viewer, and supplies the viewer data to the viewer measurement unit 105. The viewer measuring unit 105 counts the number of viewers based on the viewer data, receives display content information from the display control apparatus 101, and calculates the number of viewers of the current display content.

  As described above, according to the first embodiment, the camera 103 captures the scene facing the display 102, the viewer data generation unit 104 extracts person areas from the imaging data and generates viewer data, and the viewer measurement unit 105 calculates the number of viewers of the current display content from the display content information and the viewer data. The number of viewers of the displayed content can therefore be calculated over a wide, unspecified range without bothering the viewers.

Embodiment 2.
FIG. 2 is a block diagram showing the configuration of the viewer data generation unit 104 according to Embodiment 2 of the present invention. The other components are the same as those shown in FIG. 1. As shown in FIG. 2, the viewer data generation unit 104 includes a face recognition unit (face recognition means) 201. The face recognition unit 201 extracts, as a face area, any area that can be recognized as a human face from the imaging data transmitted from the camera 103. For example, a face template image may be provided in advance and compared sequentially with partial areas of the imaging data. Since the imaged face may have any size, faces of arbitrary size can be extracted by preparing and comparing image data at several different resolutions. Further, if the face recognition unit 201 holds a frontal face image as the face template and extracts only the face areas facing the front, it can recognize only the people facing the display 102, without recognizing people facing unrelated directions.
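The template comparison described above can be sketched as a sliding-window search, with a crude striding downsample standing in for the multi-resolution image data. All sizes, the SSD distance measure, and the matching threshold are illustrative assumptions, not values from the patent.

```python
import numpy as np

def match_template(image, template, max_ssd=10.0):
    """Return (row, col) positions where the template matches within max_ssd."""
    ih, iw = image.shape
    th, tw = template.shape
    hits = []
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            patch = image[r:r + th, c:c + tw].astype(float)
            ssd = np.sum((patch - template) ** 2)  # sum of squared differences
            if ssd <= max_ssd:
                hits.append((r, c))
    return hits

def downsample(image, factor):
    """Crude downsampling by striding, standing in for a resolution pyramid."""
    return image[::factor, ::factor]

# Toy example: a 2x2 "face" template found in a 6x6 image.
template = np.array([[9., 9.], [9., 9.]])
image = np.zeros((6, 6))
image[1:3, 4:6] = 9.

print(match_template(image, template))   # -> [(1, 4)]
```

Running the same search on `downsample(image, 2)` would find proportionally larger faces, which is the point of preparing image data at several resolutions.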

  As described above, according to the second embodiment, the face recognition unit 201 extracts face areas from the imaging data and can, for example, recognize only people facing the display 102, which further improves the accuracy of viewer recognition.

Embodiment 3.
FIG. 3 is a block diagram showing a configuration of the viewer data generation unit 104 according to Embodiment 3 of the present invention. Other configurations are the same as those shown in FIG. As shown in FIG. 3, the viewer data generation unit 104 includes a face recognition unit 201, a face position calculation unit 302, a face direction calculation unit 303, and a gaze coordinate calculation unit 304.

  The face recognition unit 201 extracts a face area from the imaging data as described in the second embodiment, and at the same time extracts the size of the face area. The face position calculation unit 302 calculates the position of the viewer facing the display 102 from the size of the face area extracted by the face recognition unit 201. Since the size of a person's face is almost constant, individual differences aside, an approximate position can be estimated: if the face area is small, the viewer is far away, and if it is large, the viewer is near. The face direction calculation unit 303 calculates the face direction from the extracted face area, and thereby estimates which area of the display 102 the face is looking at. For example, face images with various orientations may be provided in advance and matched against the face area, or the orientation may be calculated from the offset between the eye and nose positions. The gaze coordinate calculation unit 304 then calculates a gaze direction from the calculated face position and face direction, and estimates a gaze coordinate on the display 102 from that direction.
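The geometry of this pipeline can be sketched as follows: a pinhole-camera estimate of distance from apparent face size, then an intersection of the face direction with the display plane. The real face height and focal length constants are assumptions for illustration; the patent does not specify them.

```python
import math

def distance_from_face_size(face_height_px, real_height_m=0.24, focal_px=800.0):
    """Pinhole-camera estimate: a smaller face region means a farther viewer."""
    return real_height_m * focal_px / face_height_px

def gaze_point_on_display(viewer_x, viewer_dist, yaw_deg):
    """Intersect the face direction with the display plane.

    viewer_x    : lateral position of the viewer relative to the display centre (m)
    viewer_dist : distance from the display plane (m)
    yaw_deg     : horizontal face direction, 0 = facing the display squarely
    """
    return viewer_x + viewer_dist * math.tan(math.radians(yaw_deg))

# A face 160 px tall -> about 1.2 m away with the assumed constants.
print(round(distance_from_face_size(160), 2))             # -> 1.2

# A viewer 1 m left of centre, 2 m away, facing 45 degrees to the right,
# gazes at x = -1 + 2*tan(45 deg) = 1.0 m right of centre.
print(round(gaze_point_on_display(-1.0, 2.0, 45.0), 2))   # -> 1.0
```

The vertical gaze coordinate would be obtained the same way from the face's pitch angle.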

  The viewer data generation unit 104 sends the calculated face position, face direction, and gaze coordinate to the viewer measurement unit 105. The viewer measuring unit 105 calculates the number of viewers facing the display 102 based on the gaze coordinates.

  As described above, according to the third embodiment, the viewer data generation unit 104 calculates the gaze direction from the face position and face direction and estimates the gaze coordinate on the display 102, so a person facing a direction unrelated to the display 102 is not recognized as a viewer, which improves the accuracy of viewer recognition.

Embodiment 4.
FIG. 4 is a block diagram showing a configuration of the viewer data generation unit 104 according to Embodiment 4 of the present invention. Other configurations are the same as those shown in FIG. As shown in FIG. 4, the viewer data generation unit 104 further includes a line-of-sight calculation unit 402 and a gaze coordinate calculation unit 403 in the configuration shown in FIG.

  The line-of-sight calculation unit 402 calculates the line-of-sight direction; it can be obtained, for example, by a feature extraction operation that tracks the movement of the pupils during face recognition, combined with the face position and face direction calculated by the face position calculation unit 302 and the face direction calculation unit 303. The gaze coordinate calculation unit 403 obtains gaze coordinates on the display 102 from the calculated line-of-sight direction. The viewer data generation unit 104 sends the calculated face position, face direction, and gaze coordinates to the viewer measurement unit 105, and the viewer measurement unit 105 calculates the number of viewers facing the display 102 based on the gaze coordinates.

  As described above, according to the fourth embodiment, the viewer data generation unit 104 calculates the line-of-sight direction by, for example, extracting the movement of the pupils during face recognition, and estimates the gaze coordinates on the display 102. Since a person facing an unrelated direction is not recognized as a viewer, viewer recognition accuracy can be improved further.

Embodiment 5.
FIG. 5 is a block diagram showing a configuration of a viewer information measuring apparatus according to Embodiment 5 of the present invention. As shown in the figure, the viewer information measuring device includes a display control device 101, a multi display 501, cameras 103a to 103c, viewer data generating units 104a to 104c, and a viewer measuring unit 105. In the fifth embodiment, a case where viewers are measured with a plurality of cameras will be described.

  The display control apparatus 101 holds the content data to be displayed on the multi-display 501, in which a plurality of displays are arranged, together with display content information indicating which content is displayed at a given time, and displays content on the multi-display 501 in accordance with that information. The cameras 103a to 103c capture the area around the multi-display 501 and are installed in directions from which they can capture the scene facing it. Each camera is installed at a position and orientation giving it a different field of view, images the scene facing the multi-display 501, and generates imaging data.

  The viewer data generation units 104a to 104c receive the imaging data of the cameras 103a to 103c, respectively, extract the person images included therein, and generate viewer data. The viewer measurement unit 105 includes an identical person determination unit 502 and a viewer number calculation unit 503, and generates viewer measurement data related to content. The same person determination unit 502 receives viewer data from the viewer data generation units 104a to 104c, and determines whether or not the same person exists in the extracted person image. The viewer number calculation unit 503 calculates the number of viewers based on the determination result from the same person determination unit 502 and the display content information from the display control apparatus 101.

Next, the operation will be described.
The camera 103a captures the scene facing the multi-display 501, generates imaging data, and supplies it to the viewer data generation unit 104a. The viewer data generation unit 104a extracts viewer data for viewers captured within the field of view of the camera 103a. Similarly, the viewer data generation units 104b and 104c extract viewer data from the imaging data of the cameras 103b and 103c, respectively. The viewer data generation units 104a to 104c each transmit their viewer data to the same person determination unit 502.

  The same person determination unit 502 performs same-person determination on the viewer data transmitted from the viewer data generation units 104a to 104c, to prevent the duplication that would occur if the same person were counted multiple times. For example, when the viewer data generation units 104a to 104c include the face recognition unit 201 shown in FIG. 2, each of them extracts a face area from a person image to generate viewer data, and the same person determination unit 502 compares the extracted face areas and judges their similarity based on feature amounts. The viewer number calculation unit 503 counts viewer data judged to be similar as a single viewer, based on the determination result of the same person determination unit 502. In addition, the viewer number calculation unit 503 receives display content information from the display control apparatus 101 and obtains the audience rating for each time and each content along with the counted viewers.
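The duplicate-free counting can be sketched as a pairwise comparison of face feature vectors, merging any pair whose similarity exceeds a threshold. Cosine similarity, the 0.95 threshold, and the toy 3-dimensional features are illustrative assumptions; the patent only says similarity is judged from feature amounts.

```python
import numpy as np

def cosine_similarity(a, b):
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def count_unique_viewers(features, threshold=0.95):
    """Count feature vectors, merging any pair above the similarity threshold."""
    unique = []
    for f in features:
        if not any(cosine_similarity(f, u) >= threshold for u in unique):
            unique.append(f)
    return len(unique)

# Cameras 103a-103c produce four detections; the first and last are the
# same person seen from two overlapping fields of view.
detections = [
    [1.0, 0.1, 0.0],   # camera 103a
    [0.0, 1.0, 0.2],   # camera 103b
    [0.3, 0.2, 1.0],   # camera 103c
    [1.0, 0.11, 0.0],  # camera 103c, duplicate of the first viewer
]
print(count_unique_viewers(detections))   # -> 3
```

Without the merging step the same viewer would be counted once per camera whose field of view contains them.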

  As described above, according to the fifth embodiment, since the scene facing the multi-display 501 is captured using a plurality of cameras, viewers can be captured over a wide angle and a wide range. Moreover, since the same person determination unit 502 does not extract the same person twice when the imaging areas of the cameras overlap, the same person is prevented from being counted multiple times.

  There may also be a single display on which the audience rating is measured. The multi-display may be arranged in a shape other than a plane, such as a fan or a cylinder, and any number of cameras and viewer data generation units may be used. By increasing the number of cameras, viewers can be captured over a wider range, up to a full 360 degrees.

Embodiment 6.
FIG. 6 is a block diagram showing a configuration of a viewer measurement display device according to Embodiment 6 of the present invention. Constituent elements common to those in FIGS. 1, 3, and 5 are denoted by the same reference numerals, and description thereof is omitted. In the sixth embodiment, viewer measurement when displaying a plurality of types of content on a multi-display will be described.

  As shown in FIG. 6, it is assumed that the multi-display 501 displays two types of content, content 1 and content 2, in response to an instruction from the display control apparatus 101. The display control apparatus 101 holds the display area coordinates of each content as display content information. The multi-display 501 is provided with a camera 103 that captures the opposing scene, and its imaging data is sent to the viewer data generation unit 104. In the viewer data generation unit 104, the face recognition unit 201 extracts the face area, the face position calculation unit 302 calculates the face position, the face direction calculation unit 303 calculates the face direction, and the gaze coordinate calculation unit 304 calculates the gaze coordinates based on these results.

  The gaze coordinate calculation unit 304 sends the calculated gaze coordinate to the viewer measurement unit 105 as viewer data. The viewer measuring unit 105 receives the coordinate data of the currently displayed content from the display control apparatus 101 as display content information, and measures the number of viewers for each content based on this and the viewer data.

Next, the operation will be described.
Assume that the camera 103 captures person 1 and person 2, who are viewing the multi-display 501, and generates imaging data. The face recognition unit 201 extracts face 1 and face 2 from the imaging data using a face template image or the like; the extracted face areas are shown in the figure. The face position calculation unit 302 calculates the distance of face 1 and face 2 from the camera 103 based on the sizes of the extracted face areas. Since face 2 is smaller than face 1, person 2 is found to be farther from the camera 103 than person 1. In this way, the face position calculation unit 302 can calculate the distance and direction from the camera 103 from the face area size and thereby determine the face position.

  Subsequently, the face direction calculation unit 303 obtains the face direction by matching against the face template images. The gaze coordinate calculation unit 304 then identifies each viewer's gaze direction from the extracted face position and orientation and obtains the coordinates on the multi-display 501 intersected by that direction. The viewer measurement unit 105 receives the gaze coordinates as viewer data from the viewer data generation unit 104, and further receives the display area coordinates of the displayed content from the display control apparatus 101. Based on these, it determines which content's area contains each viewer's gaze coordinates, and thus which content each viewer is watching. In FIG. 6, for example, it is determined that person 1 is viewing content 1 and person 2 is viewing content 2.
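The final mapping from gaze coordinate to content can be sketched as a point-in-rectangle test against the display area coordinates held by the display control apparatus 101. The region layout, pixel coordinates, and sample gaze points are invented for illustration.

```python
def content_at(gaze_xy, regions):
    """Return the name of the content region containing the gaze point."""
    x, y = gaze_xy
    for name, (x0, y0, x1, y1) in regions.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None   # gaze falls outside the multi-display

# Two contents side by side on the multi-display, as in FIG. 6.
regions = {
    "content 1": (0, 0, 960, 1080),
    "content 2": (960, 0, 1920, 1080),
}

print(content_at((400, 500), regions))    # person 1 -> content 1
print(content_at((1500, 300), regions))   # person 2 -> content 2
print(content_at((2500, 300), regions))   # looking away -> None
```

Tallying the non-`None` results per region gives the per-content viewer counts the viewer measurement unit 105 reports.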

As described above, according to the sixth embodiment, the viewer data generation unit 104 calculates the gaze coordinates from the position and direction of each viewer's face, and the viewer measurement unit 105 determines from the display content information which content each person is viewing, so the number of viewers can be measured for each content. If a plurality of cameras are installed, a wide range of viewers can be captured, and since viewers are then photographed from multiple directions, the accuracy of calculating face orientation improves.
Note that the viewer data generation unit 104 in FIG. 6 may have the configuration shown in FIG.

Embodiment 7.
FIG. 7 is a block diagram showing the configuration of the viewer measurement unit 105 according to Embodiment 7 of the present invention. Other configurations are the same as those shown in FIG. As shown in FIG. 7, the viewer measurement unit 105 includes a viewer number calculation unit 503, a population calculation unit 701, and an audience rating calculation unit 702.

  The audience rating is a value indicating the ratio of viewers to a certain population. The viewer number calculation unit 503 receives the viewer data from the viewer data generation unit 104 and the coordinate data of the currently displayed content from the display control device 101, and measures the number of viewers for each content. The population calculation unit 701 calculates a population from the viewer data supplied by the viewer data generation unit 104. The audience rating calculation unit 702 receives the number of viewers from the viewer number calculation unit 503 and the population from the population calculation unit 701, and calculates the ratio of the number of viewers to the population, that is, the audience rating.

  Here, the population is, for example, the number of all persons for whom the viewer data generation unit 104 extracted a face recognition area, whereas the viewers are only those persons whose gaze coordinates, calculated by the gaze coordinate calculation unit 304, fall within the display area. Alternatively, the population may be the number of persons in front of the multi-display 501 captured by a person detection device that can count people over a wide range, for example with a camera installed near the ceiling of the space where the multi-display 501 is installed.
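The computation itself is a simple ratio, sketched below with invented numbers; the zero-population guard is an implementation assumption for when nobody is in front of the display.

```python
def audience_rating(num_viewers, population):
    """Ratio of viewers to the population; 0 when nobody is present."""
    if population == 0:
        return 0.0
    return num_viewers / population

# 30 faces detected in front of the display, 12 gazing at the content.
print(audience_rating(12, 30))   # -> 0.4
```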

  As described above, according to the seventh embodiment, the audience rating calculation unit 702 calculates the audience rating from the number of viewers and the population, so that not only the number of viewers of the displayed content but also its audience rating can be obtained.

Embodiment 8.
FIG. 8 is a block diagram showing the configuration of a viewer information measuring apparatus according to Embodiment 8 of the present invention. Components common to the preceding figures are denoted by the same reference numerals. The eighth embodiment describes changing the content based on information about the viewers.

  As illustrated in FIG. 8, the display control apparatus 101 includes a content generation unit 801, a distribution unit 802, and display instruction units 803a to 803h. This configuration also applies to FIGS. 5 and 6 but is omitted from those drawings. The content generation unit 801 reads out content data stored in the display control apparatus 101 or newly creates it. The distribution unit 802 receives content data from the content generation unit 801 and distributes it to one or more of the display instruction units 803a to 803h according to the coordinates of the multi-display area where it is to be displayed. The display instruction units 803a to 803h are each connected to a corresponding display in the multi-display 501 and display the corresponding content data.

Next, the operation will be described.
When the viewer measurement unit 105 receives viewer data from the viewer data generation units 104a to 104c, it also receives display content information, including display area coordinates on the multi-display 501, from the display control device 101, measures the number of viewers of the currently displayed content, and generates viewer measurement data, which it sends back to the display control apparatus 101. For example, the viewer measurement unit 105 may generate viewer measurement data concerning the audience rating and the number of viewers, the viewers' age groups and genders, or the gaze time. The content generation unit 801 receives the viewer measurement data and switches or changes the content to be displayed on the multi-display 501 accordingly.

  Here, switching or changing content means, for example, switching from one advertisement to another, switching from a still image to a moving image, moving a character object displayed in 3D graphics or generating an event, changing the display area, or increasing or decreasing the number of contents displayed simultaneously. When the display is a multi-display as shown in FIG. 8, the content may also be enlarged using an enlarger: an image previously shown on one display may, for example, be displayed across a plurality of displays. Alternatively, the content generation unit 801 may transmit a drawing command to the distribution unit 802; in this case, the distribution unit 802 distributes the drawing command to the display instruction units 803a to 803h connected to each display, and the display instruction units 803a to 803h display the corresponding images in their display areas.

  For example, when the viewer measurement unit 105 generates viewer measurement data concerning the audience rating and the number of viewers, the content generation unit 801 may switch or change the displayed content according to those values; for instance, when the audience rating included in the viewer measurement data is judged to be low, the content may be switched to another content.

  Further, when the viewer measurement unit 105 generates viewer measurement data concerning the viewers' age groups and genders, the content generation unit 801 may switch or change content such as advertisements according to the age group and gender. For example, when the ratio of female viewers is higher than that of male viewers, advertisements for products such as cosmetics may be displayed, targeting the audience from which a response can be expected.

  In addition, when the viewer measurement unit 105 generates viewer measurement data concerning gaze time, the content generation unit 801 may switch or change the content according to the gaze time. For example, since a viewer with a long gaze time is considered to be interested in the content, the content generation unit 801 may generate and display detailed information when the gaze time is long, or increase the display size by displaying on additional display instruction units 803a to 803h. Furthermore, when combining gaze time with age-group discrimination shows that elderly viewers are gazing at some content on the multi-display 501, the display area of that content can be enlarged to make it easier to see.
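The reactions described in the last three paragraphs can be sketched as a small rule table over the viewer measurement data. The rule order, thresholds, field names, and action strings are all invented examples of the behaviours the text describes, not the patent's actual logic.

```python
def choose_action(measurement):
    """Pick a display action from viewer measurement data (illustrative rules)."""
    if measurement["audience_rating"] < 0.1:
        return "switch to another content"      # low rating -> swap advertisements
    if measurement["female_ratio"] > 0.6:
        return "show cosmetics advertisement"   # demographic targeting
    if measurement["mean_gaze_time_s"] > 30 and measurement["mean_age"] >= 65:
        return "enlarge display area"           # elderly viewers gazing long
    return "keep current content"

sample = {"audience_rating": 0.4, "female_ratio": 0.7,
          "mean_gaze_time_s": 12, "mean_age": 35}
print(choose_action(sample))   # -> show cosmetics advertisement
```

In the apparatus, the chosen action would be carried out by the content generation unit 801 and distributed through the distribution unit 802.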

  As described above, the content to be displayed is switched or changed based on the viewer measurement data, so that content suited to the viewers' condition can be displayed and more viewers can be expected to be attracted. For a television program broadcast nationwide, it is difficult to feed back viewer information immediately and switch the content to match the viewers; with the viewer information measuring device according to the present invention, however, advertising content can be directed at the viewers the advertiser intends, so advertisements can be delivered efficiently to the advertiser's target audience.

FIG. 1 is a block diagram showing the configuration of the viewer information measuring apparatus according to Embodiment 1 of the present invention.
FIG. 2 is a block diagram showing the configuration of the viewer data generation unit according to Embodiment 2 of the present invention.
FIG. 3 is a block diagram showing the configuration of the viewer data generation unit according to Embodiment 3 of the present invention.
FIG. 4 is a block diagram showing the configuration of the viewer data generation unit according to Embodiment 4 of the present invention.
FIG. 5 is a block diagram showing the configuration of the viewer information measuring apparatus according to Embodiment 5 of the present invention.
FIG. 6 is a block diagram showing the configuration of the viewer measurement display device according to Embodiment 6 of the present invention.
FIG. 7 is a block diagram showing the configuration of the viewer measurement unit according to Embodiment 7 of the present invention.
FIG. 8 is a block diagram showing the configuration of the viewer information measuring apparatus according to Embodiment 8 of the present invention.

Explanation of symbols

  101 display control apparatus (display control means); 102 display; 103, 103a-103c camera (imaging means); 104, 104a-104c viewer data generation unit (viewer data generation means); 105 viewer measurement unit (viewer measuring means); 201 face recognition unit (face recognition means); 302 face position calculation unit (face position calculation means); 303 face direction calculation unit (face direction calculation means); 304, 403 gaze coordinate calculation unit (gaze coordinate calculation means); 402 line-of-sight calculation unit (line-of-sight calculation means); 501 multi-display; 502 same person determination unit (same person determination means); 503 viewer number calculation unit; 701 population calculation unit; 702 audience rating calculation unit; 801 content generation unit; 802 distribution unit; 803a-803h display instruction unit.

Claims (6)

  1. A viewer information measuring device comprising:
    display control means for controlling content display on a multi-display that displays a plurality of contents simultaneously;
    a plurality of imaging means, each configured with a different field of view, each imaging the scene facing the multi-display;
    viewer data generating means for extracting viewers of the multi-display from the imaging data generated by each of the plurality of imaging means and generating viewer data for recognizing each viewer; and
    viewer measuring means for determining, based on the viewer data generated by the viewer data generating means, whether viewers are the same person, counting the number of viewers without duplication, and measuring the number of viewers viewing each content simultaneously displayed on the multi-display.
  2. The viewer information measuring device according to claim 1, wherein the viewer data generating means includes face recognition means for extracting, from the imaging data generated by the imaging means, a face area recognized as the face of a viewer of the multi-display, and generates the viewer data based on the face area extracted by the face recognition means.
  3. The viewer information measuring device according to claim 2, wherein the viewer data generating means includes:
    face position calculating means for calculating the position of the viewer's face in the imaging data from the face area extracted by the face recognition means;
    face direction calculating means for calculating the orientation of the viewer's face from the face area; and
    gaze coordinate calculating means for calculating the coordinates gazed at by the viewer on the multi-display based on the calculated face position and orientation,
    and generates the viewer data based on the calculated gaze coordinates.
  4. The viewer information measuring device according to claim 3, further comprising gaze calculating means for calculating the viewer's gaze direction based on the position and orientation of the viewer's face,
    wherein the gaze coordinate calculating means calculates the gaze coordinates from the calculated gaze direction.
  5. The viewer information measuring device according to claim 1, wherein the viewer measuring means calculates the audience rating of the content displayed on the multi-display based on the measured number of viewers.
  6. The viewer information measuring device according to claim 1, wherein the display control means changes the content to be displayed on the multi-display according to the number of viewers measured by the viewer measuring means.
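Claims 3 and 4 describe computing the gazed coordinates on the display from the face position and orientation, and claim 1 describes counting viewers without duplication. The following Python sketch illustrates one way this geometry and deduplication could work; it is an assumption-laden example, not the patent's method. It assumes the display lies in the plane z = 0, the viewer stands at z > 0, and yaw = pitch = 0 means looking straight at the display; all function names are hypothetical.

```python
import math

def gaze_point_on_display(face_pos, yaw_deg, pitch_deg):
    """Intersect the viewer's facing ray with the display plane z = 0.

    face_pos: (x, y, z) of the face, with z > 0 in front of the display.
    yaw_deg / pitch_deg: face orientation; 0, 0 looks straight at the display.
    Returns (x, y) on the display plane, or None if the viewer faces away.
    """
    x, y, z = face_pos
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    dz = -math.cos(pitch) * math.cos(yaw)   # component toward the display
    if dz >= 0:
        return None                          # facing away: no gaze point
    dx = math.sin(yaw) * math.cos(pitch)
    dy = math.sin(pitch)
    t = -z / dz                              # ray parameter at z = 0
    return (x + t * dx, y + t * dy)

def count_viewers_per_panel(person_ids, gaze_points, panel_width):
    """Count unique viewers per panel of the multi-display.

    Deduplicates by person id, so the same viewer seen by several
    cameras is counted only once per panel (cf. claim 1).
    """
    seen = {}
    for pid, pt in zip(person_ids, gaze_points):
        if pt is None:
            continue
        panel = int(pt[0] // panel_width)    # which panel this x falls on
        seen.setdefault(panel, set()).add(pid)
    return {panel: len(ids) for panel, ids in seen.items()}
```

For example, a viewer at (1.0, 0.0, 2.0) looking straight ahead gazes at x = 1.0, while the same position with 45 degrees of yaw gazes at x = 3.0 (offset by z times tan 45 degrees); feeding both observations of one person into `count_viewers_per_panel` counts that person once per panel.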
JP2005008031A 2005-01-14 2005-01-14 Viewer information measuring device Active JP4932161B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2005008031A JP4932161B2 (en) 2005-01-14 2005-01-14 Viewer information measuring device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2005008031A JP4932161B2 (en) 2005-01-14 2005-01-14 Viewer information measuring device

Publications (2)

Publication Number Publication Date
JP2006197373A JP2006197373A (en) 2006-07-27
JP4932161B2 true JP4932161B2 (en) 2012-05-16

Family

ID=36803071

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2005008031A Active JP4932161B2 (en) 2005-01-14 2005-01-14 Viewer information measuring device

Country Status (1)

Country Link
JP (1) JP4932161B2 (en)

Families Citing this family (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008112401A (en) * 2006-10-31 2008-05-15 Mitsubishi Electric Corp Advertisement effect measurement apparatus
JP4853320B2 (en) 2007-02-15 2012-01-11 ソニー株式会社 Image processing apparatus and image processing method
JP2008225550A (en) 2007-03-08 2008-09-25 Sony Corp Image processing apparatus, image processing method and program
WO2008111164A1 (en) * 2007-03-13 2008-09-18 Mitsubishi Electric Corporation System for providing information in cage of elevator
JP4424364B2 (en) 2007-03-19 2010-03-03 ソニー株式会社 Image processing apparatus and image processing method
JP4396720B2 (en) 2007-03-26 2010-01-13 ソニー株式会社 Image processing apparatus, image processing method, and program
JP2009003334A (en) 2007-06-25 2009-01-08 Sony Corp Image pick-up device, and imaging control method
WO2009052574A1 (en) * 2007-10-25 2009-04-30 Andrew James Mathers Improvements in oudoor advertising metrics
US20100036720A1 (en) * 2008-04-11 2010-02-11 Microsoft Corporation Ubiquitous intent-based customer incentive scheme
WO2010021240A1 (en) * 2008-08-21 2010-02-25 コニカミノルタホールディングス株式会社 Image display device
JP5085598B2 (en) * 2009-03-31 2012-11-28 株式会社デンソーアイティーラボラトリ Advertisement display device, system, method and program
JP5337609B2 (en) * 2009-07-15 2013-11-06 日立コンシューマエレクトロニクス株式会社 Broadcast receiver
GB2473495A (en) * 2009-09-14 2011-03-16 Guy Edward John Margetson Display using data pulled or requested from remote computer and feedback, e.g. of viewer figures to remote computer.
JP5482206B2 (en) 2010-01-06 2014-05-07 ソニー株式会社 Information processing apparatus, information processing method, and program
CN101996536A (en) * 2010-08-24 2011-03-30 北京水晶石数字科技有限公司 Intelligent display method
JP2012123727A (en) * 2010-12-10 2012-06-28 Hitachi Solutions Ltd Advertising effect measurement server, advertising effect measurement device, program and advertising effect measurement system
JP5143262B1 (en) * 2011-08-30 2013-02-13 株式会社東芝 3D image processing apparatus and 3D image processing method
JP5834941B2 (en) * 2012-01-19 2015-12-24 富士通株式会社 Attention target identification device, attention target identification method, and program
JP2013059094A (en) * 2012-11-09 2013-03-28 Toshiba Corp Three-dimensional image processing apparatus and three-dimensional image processing method
JP2014238733A (en) * 2013-06-07 2014-12-18 シャープ株式会社 Content creation device, content creation method, and content creation program
JP2015015031A (en) * 2014-08-12 2015-01-22 株式会社日立ソリューションズ Advertising effect measurement server, advertising effect measurement system, and program
JP6598575B2 (en) * 2015-08-17 2019-10-30 株式会社コロプラ Method and program for controlling head mounted display system
JP5961736B1 (en) * 2015-08-17 2016-08-02 株式会社コロプラ Method and program for controlling head mounted display system
JP2017041229A (en) * 2016-06-08 2017-02-23 株式会社コロプラ Method and program for controlling head-mounted display system
WO2018235595A1 (en) * 2017-06-20 2018-12-27 日本電気株式会社 Information provision device, information provision method and program

Family Cites Families (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH077449B2 (en) * 1987-11-05 1995-01-30 三菱電機株式会社 Object number detecting device
JPH01204095A (en) * 1988-02-10 1989-08-16 Ryuichi Inami Electronic advertisement and public relation display device
JP3050329B2 (en) * 1991-01-31 2000-06-12 パイオニア株式会社 Display pattern control device
JPH05316500A (en) * 1992-05-08 1993-11-26 Nippon Telegr & Teleph Corp <Ntt> Spatial scan image pickup display device
JP3336102B2 (en) * 1994-02-07 2002-10-21 株式会社日立製作所 Image processing method for eliminating shadow and image monitoring device
JPH08185521A (en) * 1994-12-28 1996-07-16 Clarion Co Ltd Mobile object counter
JP3290334B2 (en) * 1995-06-28 2002-06-10 日本電信電話株式会社 Multi-screen control system using network protocol
JPH09244600A (en) * 1996-03-07 1997-09-19 Fujitsu General Ltd Advertisement display system
JPH09251342A (en) * 1996-03-15 1997-09-22 Toshiba Corp Device and method for estimating closely watched part and device, information display device/method using the same
JPH10187107A (en) * 1996-03-19 1998-07-14 Mac Res:Kk Information display system, display schedule forming device and guide information controller
JPH10312448A (en) * 1997-05-14 1998-11-24 Mitsubishi Electric Corp Number of person detector and elevator control system using the same
JPH11283010A (en) * 1998-03-31 1999-10-15 Mitsubishi Electric Corp Moving body number detector
JP2000347618A (en) * 1999-06-04 2000-12-15 Hitachi Ltd Multidisplay system
JP3489491B2 (en) * 1999-06-24 2004-01-19 日本電気株式会社 Personal analysis device and recording medium recording personality analysis program
JP2001166757A (en) * 1999-12-09 2001-06-22 Hitachi Device Eng Co Ltd Display system
JP2001256477A (en) * 2000-03-13 2001-09-21 Ecchandesu:Kk Information collecting device
TWI238319B (en) * 2000-03-24 2005-08-21 Norio Watanabe Commercial effect measuring system, commercial system, and appealing power sensor
JP2002183212A (en) * 2000-12-19 2002-06-28 Fuji Xerox Co Ltd System and method for processing electronic document and computer-readable recording medium
JP2002269290A (en) * 2001-03-09 2002-09-20 Sony Corp Advertisement delivery system
JP4607394B2 (en) * 2001-09-27 2011-01-05 株式会社シー・イー・デー・システム Person detection system and person detection program
JP2003216938A (en) * 2002-01-23 2003-07-31 Sanyo Electric Co Ltd Information collecting device
JP2003259337A (en) * 2002-02-26 2003-09-12 Toshiba Lighting & Technology Corp Monitor camera system
JP2004064368A (en) * 2002-07-29 2004-02-26 Toshiba Corp Electronic apparatus
JP4208614B2 (en) * 2003-03-17 2009-01-14 亀山 渉 Video content evaluation device
JP2004289581A (en) * 2003-03-24 2004-10-14 Minolta Co Ltd Monitoring system
JP4879456B2 (en) * 2003-12-22 2012-02-22 アビックス株式会社 Public display management system

Also Published As

Publication number Publication date
JP2006197373A (en) 2006-07-27

Similar Documents

Publication Publication Date Title
Sitzmann et al. Saliency in VR: How do people explore virtual environments?
CN105264572B (en) Information processing equipment, information processing method and program
US9357203B2 (en) Information processing system using captured image, information processing device, and information processing method
US10182270B2 (en) Methods and apparatus for content interaction
CA2903241C (en) Attention estimation to control the delivery of data and audio/video content
JP6074177B2 (en) Person tracking and interactive advertising
US10182720B2 (en) System and method for interacting with and analyzing media on a display using eye gaze tracking
EP2859719B1 (en) Apparatus and method for image content replacement
US8724845B2 (en) Content determination program and content determination device
US9202112B1 (en) Monitoring device, monitoring system, and monitoring method
EP2271073B1 (en) Thermography methods
US9292736B2 (en) Methods and apparatus to count people in images
US9342921B2 (en) Control apparatus, electronic device, control method, and program
JP5728009B2 (en) Instruction input device, instruction input method, program, recording medium, and integrated circuit
JP5597762B1 (en) Activity map analyzer, activity map analysis system, and activity map analysis method
EP2509326A2 (en) Analysis of 3D video
JP4725595B2 (en) Video processing apparatus, video processing method, program, and recording medium
US9465443B2 (en) Gesture operation input processing apparatus and gesture operation input processing method
JP5994397B2 (en) Information processing apparatus, information processing method, and program
JP5834254B2 (en) People counting device, people counting system, and people counting method
JP4195991B2 (en) Surveillance video monitoring system, surveillance video generation method, and surveillance video monitoring server
JP4281819B2 (en) Captured image data processing device, viewing information generation device, viewing information generation system, captured image data processing method, viewing information generation method
US20150145762A1 (en) Position-of-interest detection device, position-of-interest detection method, and position-of-interest detection program
EP2924613A1 (en) Stay condition analyzing apparatus, stay condition analyzing system, and stay condition analyzing method
TWI439120B (en) Display device

Legal Events

Date Code Title Description
RD04 Notification of resignation of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7424

Effective date: 20071011

A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20071128

RD04 Notification of resignation of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7424

Effective date: 20080722

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20100303

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20100622

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20100803

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20110405

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20110510

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20120207

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20120215

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20150224

Year of fee payment: 3

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250
