JP2011123657A - Information collection system and method - Google Patents

Information collection system and method

Info

Publication number
JP2011123657A
JP2011123657A (Application JP2009280568A)
Authority
JP
Japan
Prior art keywords
family
person
age
recognition
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2009280568A
Other languages
Japanese (ja)
Inventor
Yoshisato Furukawa (嘉識 古川)
Takahiro Konno (貴洋 今野)
Kosuke Moriwaki (康介 森脇)
Toru Usuda (亨 臼田)
Original Assignee
NTT Comware Corporation (エヌ・ティ・ティ・コムウェア株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NTT Comware Corporation
Priority to JP2009280568A
Publication of JP2011123657A
Legal status: Pending

Abstract

[PROBLEM] To accurately estimate the family structure of persons photographed by a camera, without requiring registration of face images, and to distribute content according to that family structure.
[SOLUTION] A face detection unit 22 detects human faces in images captured by a camera 13 and creates face image files, and a face recognition unit 23 performs recognition to identify which person has been photographed by the camera 13. An age determination unit 25 determines the person's age, and a gender determination unit 26 determines the person's gender. From the ages and genders of the persons photographed by the camera 13, a family structure estimation unit 33 hypothesizes parent-child relationships from age-group differences, fixes sibling relationships within the child generation from age-group differences, and determines marital relationships from the genders of male-female pairs, thereby estimating the family structure. The commercials distributed from the content server are then selected according to the family structure.
[Selected figure] Figure 2

Description

  The present invention relates to an information collection system and method suitable for collecting information on a viewer's family structure.
  In recent years, communication technologies such as digital broadcasting and broadband lines have advanced, and broadcasting businesses that distribute content such as video in real time or on demand over these technologies have entered a period of full-scale growth.
  Conventional television broadcasting using terrestrial waves is one-way, so when a commercial is distributed, the same commercial is delivered to all viewers. In the Internet broadcasting business, by contrast, real-time and on-demand content distribution is possible, and commercials with high advertising efficiency can be delivered according to the age, gender, hobbies, and preferences of the people watching. For example, Patent Document 1 describes a program selection device for broadcasting over terrestrial, satellite, or cable media that identifies a user by image recognition processing and selects programs based on the identified user's preference information.
  Some television broadcast content may also be a program or commercial that parents do not want their children to see. In the Internet broadcasting business, as shown in Patent Document 2 for example, a television viewer can be identified by face recognition, the necessity of viewing restriction determined from the identified person's viewing history information, and the necessary viewing restrictions applied.
Further, as a technique for detecting a face in an image captured by a camera and estimating age and gender, a technique such as that shown in Non-Patent Document 1 has been proposed.
Furthermore, Patent Document 3 shows a television receiver and viewing-information collection device equipped with iris/retina authentication and face authentication systems, which identify which user is watching the television by whether the user's eyes and face are directed at the screen, and can feed back to the broadcasting station information on whether the user is actually watching.
Patent Document 1: JP 2006-311121 A
Patent Document 2: JP 2008-182392 A
Patent Document 3: JP 2008-278302 A
Non-Patent Document 1: Internet <URL: http://www.oki.com/jp/fse/>
  As described above, the Internet broadcasting business makes it possible to distribute content such as commercials according to the age, gender, hobbies, and preferences of television viewers. Conventionally, however, viewers have been treated as individuals; content has not been distributed by treating viewers as families. If viewers could be treated as family units and content distributed according to family structure, content could be distributed more efficiently.
  For example, when distributing travel-genre content to a 25-year-old woman, a single woman may prefer high-end luxury hotels, while a 25-year-old woman with a small child may prefer a hotel with facilities for children. Thus, even for people of the same age and gender, the recommended advertisement differs depending on family structure.
  Even for the same person, the recommended commercial differs depending on whether the person is watching television alone or with the whole family. For example, if the father of the family is watching television alone, a commercial for golf, which can be enjoyed individually, is effective; if the whole family is watching, a commercial for a place the whole family can enjoy, such as an amusement park, is effective.
If viewers' information were registered in advance with the broadcaster, a viewer's family structure could easily be obtained from the registered information. However, it is difficult to collect and register information on all viewers.
The applicant therefore considers installing a camera in front of the television receiver, photographing the viewers with this camera, estimating the viewers' ages and genders from the captured images, and estimating the family structure. To estimate a viewer's age from a camera image, for example, the technique shown in Non-Patent Document 1 can be used.
However, estimating viewers' ages from camera images and then estimating the family structure raises the following problems that must be solved.
First, in conventional face recognition there is the problem that recognition accuracy drops markedly depending on the surrounding environment and the camera image. That is, for highly accurate face recognition, the viewer's face must be directed straight at the camera in front of the television receiver and must be photographed at a sufficient size. If the surroundings are too dark or too bright, face recognition cannot be performed with high accuracy. If each family member stood in front of the camera, the surrounding conditions were adjusted, and the face image was captured and registered, the face images could easily be analyzed; in that case, however, viewers would be forced to register their face images.
  Next, unless only the persons who actually make up the family are extracted before the family structure is estimated, the estimation result will be erroneous. That is, the person in front of the camera is not always a family member; a visitor may come and appear in front of the camera. Therefore, when estimating the family structure, visitors must be excluded from the persons photographed by the camera. However, whether or not someone is a visitor cannot be determined directly from the information obtained from the camera image.
  Furthermore, complicated family structures must be considered. The number of family members varies, and families take various forms, such as three-generation households or households that include an uncle or aunt. The family structure must therefore be estimated with these various sizes and forms in mind.
  In view of the above problems, an object of the present invention is to provide an information collection system and method that can accurately estimate the family structure of persons photographed by a camera without requiring registration of face images.
  In order to solve the above problems, the present invention provides an information collection system into which are input the determination results output from a face recognition system that determines a person's age and gender based on a photographed face image. The system comprises: a camera that captures face images of a plurality of persons in a specified home; a face database that stores the determination results output by the face recognition system based on the face images captured by the camera; and family structure estimation means that determines the family structure by tracing the relationships of the family structure in the home according to the ages and genders indicated by the determination results.
  Further, in the present invention, the family structure estimation means determines parent-child and sibling relationships between persons based on the age-group differences and genders of the plurality of persons indicated by the determination results stored in the face database, and determines the marital relationship of a male-female pair based on the age-group difference and gender.
  Further, in the present invention, the family structure estimation means determines each person's going-out pattern, determines whether a person is a visitor based on the going-out pattern and the time period in which the person was photographed by the camera, and excludes persons determined to be visitors when determining the family structure.
  In addition, the present invention includes recognition difficulty determination means that determines the difficulty of face recognition based on a face image captured by the camera, and the family structure estimation means, when determining the family structure, excludes face images for which the recognition difficulty determination means has determined that the difficulty is at or above a predetermined value and face recognition is therefore difficult.
  Further, in the present invention, the recognition difficulty determination means determines the recognition difficulty based on attributes of the face image captured by the camera.
  In addition, in the present invention, the information collection system is connected to a content server that distributes, in response to a content request, content selected from a plurality of prepared items, and to a television receiver that receives and displays the content. The system includes a recommendation processing unit that determines the content to be sent to the television receiver based on the family structure determined by the family structure estimation means and the face images captured by the camera, and transmits a content request for the determined content to the content server.
  The present invention also provides an information collection method for an information collection system including a face database that stores determination results output from a face recognition system that determines a person's age and gender based on a photographed face image. The method comprises: a step in which a camera captures face images of a plurality of persons in a specified home; a step in which the face recognition system outputs determination results based on the face images captured by the camera; and a step in which family structure estimation means determines the family structure by tracing the relationships of the family structure in the home according to the ages and genders indicated by the determination results.
  According to the present invention, the faces of persons photographed by a camera are recognized, and the family structure is estimated by tracing relationships of the family structure, such as sibling, marital, and parent-child relationships, from the ages and genders of the recognized persons. The family structure can thus be estimated with high accuracy, even for a complicated family, without requiring viewers to register in advance the face images to be referenced in face recognition.
  Further, according to the present invention, in a content distribution system, the faces of persons photographed by a camera are recognized and the family structure is estimated from the ages and genders of the recognized persons, so that content such as commercials suited to the people actually watching can be distributed according to their age, gender, family structure, and preferences.
FIG. 1 is a block diagram showing a network broadcasting system according to a first embodiment of the present invention.
FIG. 2 is a block diagram showing the functional configuration of the family structure estimation system in the first embodiment.
FIG. 3 is an explanatory diagram of recognition difficulty determination in the family structure estimation system in the first embodiment.
FIG. 4 is an explanatory diagram of the face database in the family structure estimation system in the first embodiment.
FIG. 5 is a flowchart showing the processing of the family structure estimation system in the first embodiment.
FIG. 6 is a flowchart showing an outline of the family structure estimation process in the first embodiment.
FIG. 7 is an explanatory diagram of at-home/outing time patterns in the visitor determination process in the first embodiment.
FIG. 8 is an explanatory diagram of at-home/outing time patterns in the visitor determination process in the first embodiment.
FIG. 9 is a graph for determining visitor probability in the visitor determination process in the first embodiment.
FIG. 10 is a flowchart showing the visitor determination process in the first embodiment.
FIG. 11 is an explanatory diagram of the face recognition log in the first embodiment.
FIG. 12 is an explanatory diagram of the family index table in the first embodiment.
FIG. 13 is a flowchart showing an outline of the family structure estimation process in the first embodiment.
FIG. 14 is a flowchart used to describe the process of hypothesizing parent-child relationships in the family structure estimation process in the first embodiment.
FIG. 15 is a flowchart showing the process of fixing sibling relationships in the child generation in the family structure estimation process in the first embodiment.
FIG. 16 is a flowchart showing the process of determining parent-child and marital relationships in the family structure estimation process in the first embodiment.
FIG. 17 is a flowchart showing the process of determining undecided persons in the family structure estimation process in the first embodiment.
FIG. 18 is an explanatory diagram of family relationships in the first embodiment.
FIG. 19 is an explanatory diagram of the family structure database in the first embodiment.
FIG. 20 is an explanatory diagram of the family relationship estimation process in the first embodiment.
FIG. 21 is an explanatory diagram of the recommendation basic database in the first embodiment.
FIG. 22 is an explanatory diagram of the viewer pattern list in the first embodiment.
FIG. 23 is an explanatory diagram of the recommendation list in the first embodiment.
FIG. 24 is a sequence diagram showing processing in the recommendation processing unit in the first embodiment.
FIG. 25 is a block diagram showing a network broadcasting system according to a second embodiment of the present invention.
FIG. 26 is a block diagram showing the configuration of the personal computer in the second embodiment.
FIG. 27 is an explanatory diagram of the PC operation history database in the second embodiment.
FIG. 28 is an explanatory diagram of preference data in the second embodiment.
FIG. 29 is a block diagram showing the configuration of the family structure estimation system in a third embodiment of the present invention.
FIG. 30 is a flowchart used to describe processing in the third embodiment.
FIG. 31 is a block diagram showing a network broadcasting system according to a fourth embodiment of the present invention.
FIG. 32 is an explanatory diagram of the camera arrangement in the fourth embodiment.
Hereinafter, embodiments of the present invention will be described with reference to the drawings.
<First Embodiment>
FIG. 1 shows a network broadcasting system according to a first embodiment of the present invention. This network broadcasting system performs television broadcasting by network communication from the content server 5 of the content provider 1 via the communication network 3. The communication network 3 is a bidirectional network such as the Internet.
A television receiver 11 is installed in each home 2. The television receiver 11 is a network-compatible television receiver and serves as a content receiving terminal in the home 2. The television receiver 11 is connected to the home network 4. The home network 4 is a network for connecting home devices. In addition to the television receiver 11, various devices such as a personal computer 12 can be connected to the home network 4.
A home gateway 6 is provided between the communication network 3 and the home network 4. The home gateway 6 provides functions such as a router, protocol conversion, and a firewall between the communication network 3 and the home network 4.
In the first embodiment of the present invention, a camera 13 is installed in the television receiver 11. The home gateway 6 is provided with a family structure estimation system 20.
A television broadcast is distributed from the content server 5 of the content provider 1 via the communication network 3. This television broadcast is sent to the television receiver 11 via the communication network 3, the home gateway 6, and the home network 4. The television receiver 11 displays content (video, video) distributed from the content server 5.
  Further, the camera 13 captures images of persons watching the television receiver 11. The captured images are sent to the home gateway 6 via the home network 4. The family structure estimation system 20 in the home gateway 6 identifies the persons watching the television receiver 11 from the video captured by the camera 13 and estimates the family structure from the viewers' ages and genders. When commercial content is distributed from the content server 5 to the television receiver 11, commercials matching the family structure can be delivered using the family structure information estimated by the family structure estimation system 20.
  Next, the family structure estimation system 20 in the first embodiment of the present invention will be described. The family structure estimation system 20 detects face images in the images captured by the camera 13, performs face recognition on the face images to identify persons, estimates their ages and genders, and based on these estimates the family structure of the home 2.
When estimating viewers' ages from the video captured by the camera 13 in order to estimate the family structure, highly accurate face recognition requires that the viewer's face be directed straight at the camera 13 provided in front of the television receiver 11 and be photographed at a sufficient size. If the surroundings are too dark or too bright, face recognition cannot be performed with high accuracy.
Therefore, the family structure estimation system 20 of the first embodiment of the present invention determines the recognition difficulty of face recognition from the attributes of the face image and excludes face data that is difficult to recognize, thereby improving the accuracy of face recognition. Here, the attributes of the face image are, for example, face orientation, brightness, and face size.
Also, the person in front of the camera 13 is not always a family member; a visitor may come and appear in front of the camera 13. When estimating the family structure, visitors must be excluded from the persons photographed by the camera 13.
Therefore, the family structure estimation system 20 of the first embodiment of the present invention uses a statistical method to determine whether the person in front of the camera 13 is a visitor, and estimates the family structure with visitors excluded.
In addition, the number of people making up the family varies. Therefore, it is necessary to estimate the family structure in consideration of various numbers of people and family forms.
Therefore, the family structure estimation system 20 of the first embodiment of the present invention traces the relationships of the family structure, such as sibling, marital, and parent-child relationships, by a logical method based on the ages and genders of the recognized persons. As a result, even a complicated family structure can be estimated with high accuracy.
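As a rough sketch, this kind of logical tracing might look as follows in code. The age thresholds (adults at 20 or older, a parent-child gap of 20-45 years), the pairing rules, and all names are illustrative assumptions, not values taken from the patent.

```python
def estimate_family(persons):
    """persons: list of (person_id, age, gender) tuples for one home,
    with gender "M" or "F". Returns a list of hypothesized relations.
    Thresholds are illustrative, not from the patent."""
    persons = sorted(persons, key=lambda p: p[1], reverse=True)
    adults = [p for p in persons if p[1] >= 20]
    relations = []
    # Marital relationship: pair the eldest adult man and woman (assumption).
    men = [p for p in adults if p[2] == "M"]
    women = [p for p in adults if p[2] == "F"]
    if men and women:
        relations.append(("spouse", men[0][0], women[0][0]))
    # Parent-child: an age gap of roughly 20-45 years suggests parent-child.
    children = {}
    for parent in adults:
        for child in persons:
            if 20 <= parent[1] - child[1] <= 45:
                relations.append(("parent", parent[0], child[0]))
                children.setdefault(parent[0], []).append(child[0])
    # Siblings: persons sharing a hypothesized parent.
    for kids in children.values():
        for i in range(len(kids)):
            for j in range(i + 1, len(kids)):
                relations.append(("sibling", kids[i], kids[j]))
    return relations
```

The actual patent procedure (FIGS. 13-17) additionally resolves undecided persons and fixes the child generation explicitly; this sketch only conveys the flavor of rule-based tracing.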
  FIG. 2 is a block diagram showing a functional configuration of the family configuration estimation system 20 according to the first embodiment of the present invention. As shown in FIG. 2, the family structure estimation system 20 includes a face recognition system 21. The face recognition system 21 detects a face from a captured image of a camera and estimates an expression, age, and gender. As the face recognition system 21, for example, the one shown in Non-Patent Document 1 can be used.
The face recognition system 21 includes a face detection unit 22, a face recognition unit 23, a facial expression determination unit 24, an age determination unit 25, a gender determination unit 26, a face orientation determination unit 27, a brightness determination unit 28, A face size determination unit 29, a face database 30, and a log 31 are included.
The face detection unit 22 detects a human face from the captured image of the camera 13 and creates a face image file.
The face recognition unit 23 performs face recognition based on the face image file from the face detection unit 22. Associated with the face recognition unit 23 are a facial expression determination unit 24, an age determination unit 25, a gender determination unit 26, a face orientation determination unit 27, a brightness determination unit 28, a face size determination unit 29, a face database 30, and a log 31.
When the face image file is sent from the face detection unit 22, the face recognition unit 23 performs recognition and determines which person is captured by the camera 13. The facial expression determination unit 24 determines the facial expression of a person photographed by the camera 13. The age determination unit 25 determines the age of the person photographed by the camera 13. The gender determination unit 26 determines the gender of the person photographed by the camera 13.
  Further, when the face recognition unit 23 performs face recognition, the face orientation determination unit 27 determines the face orientation, the brightness determination unit 28 determines the surrounding brightness, and the face size determination unit 29 determines the size of the face in the image captured by the camera 13. These determination results are sent to the recognition difficulty level determination unit 32, which determines the recognition difficulty of face recognition based on the determination outputs of the face orientation determination unit 27, the brightness determination unit 28, and the face size determination unit 29. This embodiment describes an example in which the recognition difficulty is determined from the face's orientation, brightness, and size, but the recognition difficulty may instead be determined from determination outputs for other face-image attributes that serve as criteria for recognition difficulty.
  The recognition difficulty level determination unit 32 determines that recognition is difficult, for example, when the face is turned sideways, when the surroundings are too dark or too bright, or when the face image is too small or too large. The determination result of the recognition difficulty level determination unit 32 is sent to the face database 30.
As a specific setting of the recognition difficulty level, for example, the recognition difficulty level determination unit 32 determines that the image is easy to recognize to a certain degree when all of the following conditions are satisfied:
- The face orientation is within ±10 degrees, where the frontal direction is 0 degrees, leftward angles are negative, and rightward angles are positive.
- The brightness, expressed on a scale of 0 to 100, is in the range of 60 to 90 inclusive.
- The face size, expressed as the pixel length of one side of the square bounding the face, is in the range of 60 to 200 pixels inclusive.
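The three threshold conditions above can be expressed as a simple predicate; the function and parameter names below are illustrative, not from the patent.

```python
def is_recognizable(orientation_deg, brightness, face_px):
    """Return True when a face image meets the ease-of-recognition
    thresholds described in the text: orientation within +/-10 degrees
    of frontal, brightness 60-90 on a 0-100 scale, and a bounding-square
    side length of 60-200 pixels."""
    return (
        -10 <= orientation_deg <= 10
        and 60 <= brightness <= 90
        and 60 <= face_px <= 200
    )
```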
  Alternatively, to determine the recognition difficulty level, scores may be assigned to face-image attributes such as face orientation, brightness, and face size, as shown in FIG. 3. That is, as shown in FIG. 3(a), the recognition difficulty level determination unit 32 assigns the face orientation 10 points for 0 degrees, 8 points for ±5 degrees, and 3 points for ±10 degrees. As shown in FIG. 3(b), brightness is assigned 6 points for 60 to 70, 10 points for 70 to 80, and 3 points for 80 to 90. As shown in FIG. 3(c), face size is assigned 4 points for 60 to 70 pixels, 8 points for 80 to 120 pixels, 10 points for 120 to 160 pixels, and 3 points for 160 to 200 pixels. The recognition difficulty level determination unit 32 then, for example, sums the scores for face orientation, brightness, and face size, and determines that the image is easy to recognize to a certain degree when the total is at or above a fixed score. Instead of a simple total, the face orientation may be weighted more heavily than brightness and face size; various evaluation methods are conceivable.
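The score tables of FIG. 3 and the total-score test can be sketched as follows. The boundary handling, the 0-point fallback for out-of-range values, and the cutoff value are assumptions; the patent does not specify them.

```python
def face_score(orientation_deg, brightness, face_px):
    """Total recognizability score following the FIG. 3 tables.
    Boundary handling and the 0-point fallback are assumptions."""
    # Face orientation: 10 pts at 0 deg, 8 pts at +/-5 deg, 3 pts at +/-10 deg.
    a = abs(orientation_deg)
    orient = 10 if a == 0 else (8 if a <= 5 else (3 if a <= 10 else 0))
    # Brightness: 6 pts for 60-70, 10 pts for 70-80, 3 pts for 80-90.
    if 60 <= brightness < 70:
        bright = 6
    elif 70 <= brightness < 80:
        bright = 10
    elif 80 <= brightness <= 90:
        bright = 3
    else:
        bright = 0
    # Face size: 4 pts 60-70 px, 8 pts 80-120 px, 10 pts 120-160 px,
    # 3 pts 160-200 px (the text leaves 70-80 px unassigned).
    if 60 <= face_px < 70:
        size = 4
    elif 80 <= face_px < 120:
        size = 8
    elif 120 <= face_px < 160:
        size = 10
    elif 160 <= face_px <= 200:
        size = 3
    else:
        size = 0
    return orient + bright + size

THRESHOLD = 20  # illustrative cutoff; the patent leaves the value open

def is_easy_to_recognize(orientation_deg, brightness, face_px):
    return face_score(orientation_deg, brightness, face_px) >= THRESHOLD
```

A weighted variant, as the text suggests, would simply multiply the orientation score before summing.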
In FIG. 2, the face database 30 stores face data including a personal number, face image file name, face orientation, brightness, and face size, as shown in FIG. 4. The log 31 stores each time at which face recognition was performed together with the corresponding personal number.
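The face data record and log entry just described could be modeled as follows; all field names and types are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class FaceData:
    """One row of the face database 30 (field names assumed)."""
    person_id: int
    image_file: str
    orientation_deg: int   # frontal = 0, left negative, right positive
    brightness: int        # on a 0-100 scale
    face_px: int           # side length of the face's bounding square

@dataclass
class LogEntry:
    """One row of the log 31: who was recognized and when."""
    timestamp: str
    person_id: int
```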
The family structure estimation unit 33 estimates the family structure using the recognition results from the face recognition unit 23, the data accumulated in the face database 30, and the data accumulated in the log 31. The family structure data estimated by the family structure estimation unit 33 is sent to and stored in the family structure database 34.
  The recommendation processing unit 35 performs processing for recommending commercials. It receives time data from the time data generation unit 36, weather data from the weather database 37, recommendation basic data from the recommendation basic database 38, preference data from the preference database 39, face recognition data from the face recognition system 21, and family structure data from the family structure database 34.
  The content server interface unit 40 has an interface function between the communication network 3 and the home network 4. The content server interface unit 40 also has an interface function with the content server 5.
  FIG. 5 is a flowchart showing the processing of the family structure estimation system 20 shown in FIG. 2. In FIG. 5, the face detection unit 22 detects a face in the video captured by the camera 13 installed on the television receiver 11 and creates a face image file (step S1). When a face image file has been created, the face recognition unit 23 performs face recognition using it and determines whether the recognized face is already registered in the face database 30 as face data (step S2).
  If it is determined in step S2 that the recognized face is not registered in the face database 30 as face data, the face orientation determination unit 27, the brightness determination unit 28, and the face size determination unit 29 check the face orientation, brightness, and face size of the face image and notify the recognition difficulty level determination unit 32 (step S3). The recognition difficulty level determination unit 32 examines the recognition difficulty of the face image based on the face orientation, brightness, and face size, and determines whether the image is easy to recognize to at least a certain degree (step S4).
  If the recognition difficulty level determination unit 32 determines in step S4 that the image is not easy enough to recognize, the process returns to step S1. If it determines that the image is easier to recognize than the fixed criterion, the recognition difficulty level determination unit 32 writes the face image file name, face orientation, brightness, and face size into the table of the face database 30 and registers them as face data (step S5).
  If it is determined in step S2 that the recognized face is already registered in the face database 30 as face data, the face orientation determination unit 27, the brightness determination unit 28, and the face size determination unit 29 check the face orientation, brightness, and face size and notify the recognition difficulty level determination unit 32 (step S6). The recognition difficulty level determination unit 32 compares the recognition difficulty of the detected face image with that of the already registered face image based on face orientation, brightness, and face size, and determines whether the new image is easier to recognize than the registered one (step S7). If it is determined to be easier to recognize, the recognition difficulty level determination unit 32 replaces the registered face data with the detected face data, updating the table of the face database 30 (step S8).
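Steps S6 to S8 amount to keeping, for each person, whichever face image is easiest to recognize. A minimal sketch, with illustrative names and a numeric recognizability score standing in for the FIG. 3 comparison:

```python
def update_face_db(face_db, person_id, new_record, new_score):
    """face_db maps person_id -> (face_record, recognizability_score).
    Register a new person (step S5) or replace the stored record when
    the newly detected image scores as easier to recognize (step S8)."""
    stored = face_db.get(person_id)
    if stored is None or new_score > stored[1]:
        face_db[person_id] = (new_record, new_score)
```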
  After the face data is registered in step S5 or updated in step S8, the face recognition unit 23 writes to the log 31 who was detected and at what time (step S9). It is then determined whether a certain period (for example, about one week) has elapsed (step S10); if not, the process returns to step S1.
By repeating the above processing, the face database 30 accumulates the face image file name, face orientation, brightness, and face size of each detected person, and the log 31 accumulates at what time and for how many minutes each person was detected.
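The registration and update flow of steps S2 through S8 can be sketched as follows. The scoring formula, its weights, and the threshold are illustrative assumptions; the description specifies only that face orientation, brightness, and face size determine the recognition difficulty.

```python
def recognition_ease(orientation_deg, brightness, face_size_px):
    """Higher score = easier to recognize. The weighting here is an assumption."""
    frontal = max(0.0, 1.0 - abs(orientation_deg) / 90.0)  # frontal faces score higher
    lit = min(brightness / 255.0, 1.0)                     # brighter faces score higher
    size = min(face_size_px / 200.0, 1.0)                  # larger faces score higher
    return frontal * lit * size

EASE_THRESHOLD = 0.3  # hypothetical "certain level" of step S4

face_db = {}  # person_id -> {"file": face image file name, "ease": score}

def register_or_update(person_id, file_name, orientation_deg, brightness, face_size_px):
    ease = recognition_ease(orientation_deg, brightness, face_size_px)
    entry = face_db.get(person_id)
    if entry is None:
        if ease >= EASE_THRESHOLD:           # steps S4/S5: register only easy-to-recognize faces
            face_db[person_id] = {"file": file_name, "ease": ease}
    elif ease > entry["ease"]:               # steps S7/S8: replace with the easier image
        face_db[person_id] = {"file": file_name, "ease": ease}
```

Keeping only the single easiest-to-recognize image per person is what makes the later age and gender determinations reliable.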
When it is determined in step S10 that the certain period (for example, about one week) has elapsed, the family structure estimation unit 33 analyzes the log 31 and determines, from the time zones in which each person registered as face data appeared on the camera and from the stay times, whether that person is a visitor (step S11). Next, the family structure estimation unit 33 requests the face recognition unit 23 to determine the age and sex of each family member and receives the determination results (step S12). The family structure estimation unit 33 then estimates the family structure from the ages and sexes of the family members and creates family structure data (step S13).
  Next, the family structure estimation process in the family structure estimation unit 33 will be described in further detail. FIG. 6 is a flowchart showing an outline of this process. As shown in FIG. 6, the process is divided into a process of estimating visitors by a statistical method (step S51, corresponding to step S11 in FIG. 5) and a process of estimating, from age and sex, the family structure of the remaining persons excluding visitors (step S52, corresponding to step S13 in FIG. 5).
  In the first embodiment of the present invention, as pre-processing before these processes, the recognition difficulty level determination unit 32 determines from the face orientation, brightness, and face size whether each face is easier to recognize than a certain level, and data that does not meet that level is excluded. The face recognition, age determination, and gender determination results are therefore highly reliable.
  First, the visitor estimation process in step S51 will be described. Family members follow patterns of staying at home and going out. For example, a working adult leaves the house in the morning, goes to the workplace, and returns home at night. A child leaves home in the morning, goes to school, and comes home in the evening. A full-time housewife stays at home all day. These are the family's patterns of staying home and going out. The time zones in which visitors are present are related to these patterns. For example, a child's friends will visit mainly in the evening, after the child comes home from school. Moreover, a visitor stays for a short time compared with a family member. Based on such patterns of the family's home/outing hours and of visitors' presence, whether a person recognized in a camera image is a visitor is estimated by a statistical method.
  For example, assume that there are three patterns P1 to P3, as shown in FIGS. 7 and 8. FIG. 7 shows the correspondence between the patterns P1 to P3 and family forms, and FIGS. 8A to 8C show, for each pattern, the time zones in which visitors are likely to be present.
  As shown in FIG. 7, pattern P1 is that of a family in which both parents work (or a single parent works) and there are no children attending school. In such a pattern P1, as shown in FIG. 8A, on weekdays all members of the family are at work from morning to night and the house is empty. The time zone T1 in which a visitor may be present is therefore after the family returns from work at night, and the visitor frequency is low.
  As shown in FIG. 7, pattern P2 is that of a family in which there are children attending school and both parents work (or a single parent works). In such a pattern P2, as shown in FIG. 8B, on weekdays the children are at school from morning to evening and the parents are at work from morning to night. The time zone T2 in which visitors are present is after the children return home on weekdays. As the children's age group rises, visitors may be present until later hours.
  As shown in FIG. 7, pattern P3 covers any combination not matching patterns P1 and P2, such as a household with a full-time housewife, an elderly person, or a preschooler. In such a pattern P3, as shown in FIG. 8C, there is no specific home/outing pattern, and the time zone T3 in which visitors are present spans a wide range of daytime hours.
  The visitor probability can be obtained by analyzing the family's home/outing time pattern. FIG. 9 is a three-dimensional graph for determining the probability of being a visitor; this example shows the graph for pattern P2 of FIGS. 7 and 8. In FIG. 9, the X axis indicates the stay time, the Y axis the detection start time, and the Z axis the probability of being a visitor.
  According to the graph of FIG. 9, the probability of being a visitor peaks when the detection start time is around 15:00. When the detection start time is from 6:00 to 10:00 or after 19:00, the probability of being a visitor is low. Even when the detection start time is around 15:00, the probability of being a visitor is low if the stay time exceeds 60 minutes. Similar three-dimensional graphs for determining the visitor probability can be created for the other patterns.
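The surface of FIG. 9 can be approximated by a simple lookup function. The shape below follows only the qualitative description for pattern P2 (a peak around a 15:00 detection start, low probability at commuting hours and for stays over 60 minutes); the exact numbers are assumptions, not values from the figure.

```python
def visitor_probability(start_hour, stay_minutes):
    """Approximate probability that a detected person is a visitor (pattern P2)."""
    if stay_minutes > 60:                    # long stays suggest a family member
        return 0.1
    if 6 <= start_hour < 10 or start_hour >= 19:
        return 0.1                           # family leaving/returning times
    # peak around 15:00, when a child's friends tend to visit
    distance = abs(start_hour - 15)
    return max(0.1, 0.9 - 0.2 * distance)
```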
Once a visitor probability graph such as that of FIG. 9 has been created, visitor estimation can be performed using it. FIG. 10 is a flowchart showing the visitor determination process.
In the visitor determination process of FIG. 10, the family structure estimation unit 33 first determines which of the patterns P1 to P3 matches the family's home/outing time pattern (step S101). The pattern can be determined, for example, as follows.
  Based on the extracted face recognition log (see FIG. 11), the number of face recognitions per hour from 0:00 to 24:00 is counted for each day of the week, and a table of the number of face recognitions in each time zone is created. As shown in FIG. 11, the face recognition log stores a personal number identifying each individual and the time at which the face was recognized.
  For example, if there are about five days on which the number of face recognitions is zero between 10:00 and 17:00, the working hours, the family can be determined to be pattern P1. Among those days, a day of the week on which the number of face recognitions between 10:00 and 17:00 is not zero can be determined to be a holiday, and the other days to be work days. If there are about five days on which the number of face recognitions is zero between 10:00 and 15:00, the school hours, and not zero between 15:00 and 17:00, the family can be determined to be pattern P2. Among those days, a day of the week on which the number of face recognitions between 10:00 and 15:00 is not zero can be determined to be a holiday, and the other days to be school days. If neither applies, the family can be determined to be pattern P3. In pattern P3 someone is at home on weekdays, so holidays need not be determined.
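The pattern determination just described can be sketched as follows, assuming the hourly counts are held as a per-day list of 24 values; the "about five days" thresholds follow the text, while the data layout is an assumption.

```python
def classify_pattern(counts):
    """counts: dict day -> list of 24 hourly face-recognition counts over one week."""
    # days with nobody home during working hours (10:00-17:00) -> pattern P1 evidence
    work_absent = [d for d, c in counts.items()
                   if all(c[h] == 0 for h in range(10, 17))]
    # days empty during school hours (10:00-15:00) but occupied 15:00-17:00 -> P2 evidence
    school_only = [d for d, c in counts.items()
                   if all(c[h] == 0 for h in range(10, 15))
                   and any(c[h] > 0 for h in range(15, 17))]
    if len(work_absent) >= 5:
        return "P1"
    if len(school_only) >= 5:
        return "P2"
    return "P3"   # someone is home on weekdays
```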
  Once the pattern is determined in step S101, the family structure estimation unit 33 refers to the face recognition log (step S102) and determines whether the visitor probability at the recognized time is lower than a predetermined threshold (step S103). The threshold for the visitor probability can be obtained from the graph shown in FIG. 9.
  If the time at which the face was recognized is found in step S103 to be a time of high visitor probability, the family structure estimation unit 33 determines whether the stay time is longer than a predetermined threshold (step S104). If the face was recognized at a time of high visitor probability in step S103 and the stay time is determined in step S104 to be short, the person is likely to be a visitor. In this case, the family structure estimation unit 33 determines whether the information in the log 31 has been exhausted (step S105); if not, the process returns to step S102.
  If the face was recognized at a time of low visitor probability in step S103, the person is likely to be a family member. Even if the face was recognized at a time of high visitor probability in step S103, the person is likely to be a family member if the stay time is determined in step S104 to be long. In these cases, the family structure estimation unit 33 increments that person's family index in the family index table (see FIG. 12) (step S106), determines whether the information in the log 31 has been exhausted (step S105), and if not, returns to step S102.
When it is determined in step S105 that the information in the log 31 has been exhausted, any person whose family index in the family index table (FIG. 12) is below a certain value is determined to be a visitor (step S107), and the process ends.
As described above, the family structure estimation unit 33 can determine by a statistical method whether a person in front of the camera 13 is a visitor.
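The loop of steps S102 to S107 can be sketched as follows; the thresholds and the log layout are illustrative assumptions.

```python
PROB_THRESHOLD = 0.5    # visitor-probability threshold of step S103 (assumed)
STAY_THRESHOLD = 60     # stay-time threshold in minutes of step S104 (assumed)
INDEX_THRESHOLD = 2     # "certain value" of step S107 (assumed)

def find_visitors(log, visitor_probability):
    """log: list of (person_id, detection start hour, stay minutes) entries."""
    family_index = {}
    for person, start, stay in log:
        family_index.setdefault(person, 0)
        # low visitor probability, or a long stay, counts as family evidence (step S106)
        if visitor_probability(start, stay) < PROB_THRESHOLD or stay > STAY_THRESHOLD:
            family_index[person] += 1
    # persons whose family index stays below the threshold are visitors (step S107)
    return {p for p, idx in family_index.items() if idx < INDEX_THRESHOLD}
```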
  Next, the process of estimating the family structure in step S52 of FIG. 6 will be described. In this process, the family structure is estimated by tracing, from age and sex, the relationships among the persons making up the family, such as sibling relationships, marital relationships, and parent-child relationships. A logical analysis of these relationships is as follows.
Logical analysis of sibling relationships
(a) The age group difference between siblings is 0 or 10 (here, an age group spans ten years: the 0s age group (0 to 9 years), the 10s age group (10 to 19 years), the 20s age group (20 to 29 years), and so on).
In many cases siblings are two to five years apart, but they may also be ten or more years apart; hence the age group difference is 0 or 10. On rare occasions siblings are 20 or more years apart, but those are exceptions.
(b) The number of sets of sibling relationships existing in one family is 0 or 1.
In most cases the children are children of the same parents, so there is one set of siblings. If the family consists of a couple only and has no children, the number of sibling sets is 0.
(c) A sibling relationship and a marital relationship do not hold at the same time.
Since siblings never become a married couple, a sibling relationship and a marital relationship cannot hold between the same pair of persons.
(d) A sibling of a person's sibling is also that person's sibling.
For example, with an eldest son, a second son, and a third son: the eldest and second sons are brothers, and the second and third sons are brothers, so the eldest and third sons are also brothers.
(e) When a family contains a sibling relationship, there is always a person in a parent-child relationship with the siblings.
In general, households consisting only of children are rare; where there are children, their parents are also present.
Logical analysis of marital relationships
(f) The age group difference between spouses is 0, 10, or 20 (age groups as defined in (a)).
Generally, the age difference between spouses is 20 years or less. On rare occasions there are couples more than 30 years apart, but those are exceptions.
(g) The age group of a married person is the 20s or above (age groups as defined in (a)).
Teenage couples exist but are rare exceptions.
(h) The number of marital relationships existing in one family is 0, 1, or 2.
A family centers on one couple, but may also live together with grandparents, or with an uncle or aunt.
(i) A marital relationship and a sibling relationship cannot hold at the same time.
Persons in a marital relationship are not in a sibling relationship.
(j) A marital relationship always consists of a pair of a man and a woman.
In general, one spouse is a man and the other is a woman.
Logical analysis of parent-child relationships
(k) The age group difference between persons in a parent-child relationship is 20, 30, or 40 (age groups as defined in (a)).
In general, the age difference between parent and child is 20 years or more. Parents and children less than 20 years apart exist but are rare exceptions.
(l) A parent-child relationship consists of a pair of a parent set and a child set.
The parent set is the pair of mother and father, and the child set is the group of siblings; the parent set and the child set are paired.
(m) The parent set is either a set whose elements are persons in a marital relationship, or a set whose only element is a single parent.
The parents are usually a married couple, but may also be a single parent.
(n) The child set is either a set whose elements are siblings, or a set whose only element is a single child.
The children are brothers only, sisters only, a mix of brothers and sisters, or a single boy or girl.
  The logical analysis described above covers typical cases and does not cover all families; in reality there are many more family forms. For this example it provides sufficient logic for delivering content according to family form, but logic covering more family forms could also be devised.
  Next, the process of estimating the family structure (step S52 in FIG. 6) by tracing, based on the above logical analysis, the sibling, marital, and parent-child relationships of the persons making up the family from their ages and sexes will be described.
  As shown in FIG. 13, the process of estimating the family structure in step S52 is divided into a process of assuming parent-child relationships (step S201), a process of determining the sibling relationships of the child generation (step S202), a process of determining parent-child and marital relationships (step S203), and a process of determining other relationships (step S204).
FIG. 14 shows the process of assuming parent-child relationships in step S201. In this process, parent-child relationships are assumed based on the logic of (k) above, "the age group difference between persons in a parent-child relationship is 20, 30, or 40."
In FIG. 14, the family structure estimation unit 33 extracts the data of the persons making up the family from the family structure database 34 (see FIG. 19) in order of personal number (step S301) and determines whether there are persons whose age group is at least the person's age group plus 20 years (step S302). If there are, the family structure estimation unit 33 assumes each of them to be a parent of the person (step S303) and, conversely, assumes the person to be their child (step S304). The family structure estimation unit 33 then determines whether all members of the household have been processed (step S305); if not, it advances to the person with the next personal number (step S306) and returns to step S301.
  If in step S302 there is no person whose age group is at least the person's age group plus 20 years, the family structure estimation unit 33 determines whether all members of the household have been processed (step S305); if not, it advances to the person with the next personal number (step S306) and returns to step S301.
  By repeating steps S301 to S306, persons whose age group is at least another member's age group plus 20 years are assumed to be parents. When step S305 determines that all members of the household have been processed, the family structure estimation unit 33 ends the process of assuming parent-child relationships in step S201 and enters the process of determining the sibling relationships of the child generation in step S202.
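A minimal sketch of steps S301 to S306, assuming each family member is represented as a personal number mapped to an age group (the representation is not specified in the description):

```python
def assume_parent_child(members):
    """members: dict personal number -> age group.
    Returns the set of assumed (parent, child) pairs."""
    pairs = set()
    for child, child_group in members.items():
        for parent, parent_group in members.items():
            if parent_group >= child_group + 20:  # step S302
                pairs.add((parent, child))        # steps S303/S304
    return pairs
```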
  FIG. 15 is a flowchart showing the process of determining the sibling relationships of the child generation in step S202 of FIG. 13. In this process, sibling relationships are determined based on the logic of (a) above, "the age group difference between siblings is 0 or 10." By restricting the process to persons under 20, the sibling relationships determined are those of the family's child generation.
  In FIG. 15, the family structure estimation unit 33 extracts the data of the persons making up the family from the family structure database 34 (see FIG. 19) in order of personal number (step S401) and determines whether the person's age group is the 0s or 10s (step S402). If it is determined in step S402 that the person's age group is not the 0s or 10s, the family structure estimation unit 33 determines whether all members of the household have been processed (step S403); if not, it advances to the person with the next personal number (step S404) and returns to step S401.
  If it is determined in step S402 that the person's age group is the 0s or 10s, the family structure estimation unit 33 determines whether there is a person whose age group differs from the person's by 0 or 10 (step S405). If there is, the two are determined to be siblings (step S406), and the sibling relationship is recorded (step S407). Among persons determined to be siblings, an older male is an older brother and an older female is an older sister; a younger male is a younger brother and a younger female is a younger sister. The family structure estimation unit 33 then determines whether all members of the household have been processed (step S403); if not, it advances to the person with the next personal number (step S404) and returns to step S401.
  If in step S405 there is no person whose age group differs by 0 or 10, the family structure estimation unit 33 determines whether all members of the household have been processed (step S403); if not, it advances to the person with the next personal number (step S404) and returns to step S401.
  By repeating steps S401 to S407, persons whose age group difference is 0 or 10 are determined to be siblings of the child generation. When step S403 determines that all members of the household have been processed, the family structure estimation unit 33 ends the process of determining the sibling relationships of the child generation in step S202 and enters the process of determining parent-child and marital relationships in step S203.
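A minimal sketch of steps S401 to S407, assuming each member is represented as a personal number mapped to an (age, sex) pair. Two siblings of exactly the same age would both be labeled "older" here, a case the description does not address.

```python
def determine_siblings(members):
    """members: dict personal number -> (age, sex 'M'/'F').
    Returns {(a, b): role of a as seen from b} for sibling pairs."""
    roles = {}
    for a, (age_a, sex_a) in members.items():
        for b, (age_b, _) in members.items():
            # child generation only: both in the 0s or 10s age groups (step S402)
            if a == b or age_a // 10 * 10 >= 20 or age_b // 10 * 10 >= 20:
                continue
            if abs(age_a // 10 * 10 - age_b // 10 * 10) in (0, 10):  # step S405
                if age_a >= age_b:                                   # step S407
                    roles[(a, b)] = "older brother" if sex_a == "M" else "older sister"
                else:
                    roles[(a, b)] = "younger brother" if sex_a == "M" else "younger sister"
    return roles
```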
  FIG. 16 is a flowchart showing the process of determining parent-child and marital relationships in step S203 of FIG. 13. In this process, parent-child relationships are determined based on the logic of (l) above, "a parent-child relationship consists of a pair of a parent set and a child set," and marital relationships are determined based on the logic of (j) above, "a marital relationship always consists of a pair of a man and a woman."
  In FIG. 16, the family structure estimation unit 33 extracts the data of the persons making up the family from the family structure database 34 (see FIG. 19) in order of personal number (step S501) and determines whether the person has been determined to be a sibling (step S502). If not, the family structure estimation unit 33 determines whether all members of the household have been processed (step S503); if not, it advances to the person with the next personal number (step S504) and returns to step S501.
  If the person was determined to be a sibling in step S502, the family structure estimation unit 33 determines as the person's parents the youngest man and woman among the assumed parents common to the person and his or her siblings (step S505). If the child-to-parent relationship can be determined, the reverse parent-to-child relationship is determined as well (step S506). The family structure estimation unit 33 then determines whether all members of the household have been processed (step S503); if not, it advances to the person with the next personal number (step S504) and returns to step S501.
  By repeating steps S501 to S506, the parent-child relationships between children and parents are determined. When step S503 determines that all members of the household have been processed, the family structure estimation unit 33 determines a marital relationship between the determined parents (step S507). When the process of determining parent-child and marital relationships (step S203) ends, the family structure estimation unit 33 enters the process of determining other relationships (step S204).
The process of determining other relationships in step S204 of FIG. 13 estimates the family position of persons that could not be determined by the processes described above. As an example, a process that determines such persons to be an uncle or aunt will be described.
FIG. 17 is a flowchart of the process that determines undetermined persons to be uncles or aunts. In FIG. 17, the family structure estimation unit 33 extracts the data of the persons making up the family from the family structure database 34 (see FIG. 19) in order of personal number (step S601) and determines whether the person is one whose marital relationship has been determined (step S602). If not, the family structure estimation unit 33 determines whether all members of the household have been processed (step S603); if not, it advances to the person with the next personal number (step S604) and returns to step S601.
If the person in step S602 is one whose marital relationship has been determined, the family structure estimation unit 33 determines each undetermined person to be a sibling of that person (an uncle or aunt to that person's children) (step S605). It then determines whether all members of the household have been processed (step S603); if not, it advances to the person with the next personal number (step S604) and returns to step S601.
Determining an undetermined person to be a sibling of a married person (an uncle or aunt to that person's children) is only an example; family relationships can be estimated further based on the logic of (a) to (n) above.
That family relationships can be estimated by the processes of FIGS. 13 to 17 will now be illustrated with an example. As shown in FIG. 18, consider a family of four: a father aged 39, a mother aged 36, an older sister (eldest daughter) aged 9, and a younger brother (eldest son) aged 7.
For this family structure, the results of the face recognition performed by the face recognition unit 23 on images captured by the camera 13 are registered in the family structure database 34 as shown in FIG. 19A. As shown in FIG. 19B, suppose that personal number "0001" in the family structure database 34 corresponds to the captured image of the father, "0002" to that of the mother, "0003" to that of the sister, and "0004" to that of the brother. Further, the age determination unit 25 and the gender determination unit 26 determine the father to be "30s" and "male," the mother "30s" and "female," the sister "0s" and "female," and the brother "0s" and "male." In the initial state, as shown in FIG. 19A, all relationship items are "undetermined."
Given the family structure data of FIG. 19A, the family relationships are estimated as follows by the processes shown in FIGS. 14 to 16.
First, in the parent-child assumption process of FIG. 14, repeating steps S301 to S306 assumes as parents the persons whose age group is at least another member's age group plus 20 years. FIG. 20A shows the parent-child relationships so assumed. FIGS. 20A to 20C show the relationship of the person in the top row as seen from the person in the left column (conversely, the relationship of the person in the left column as seen from the person in the top row). In the figures, the age groups of female persons are underlined. For the sister and brother in the 0s age group, the father and mother are found to be at least 20 years of age group above them, so each can be assumed to be a parent; conversely, as seen from the father and from the mother, the sister and brother can be assumed to be children.
Next, in the process of determining the sibling relationships of the child generation shown in FIG. 15, the sister and brother, whose age group difference is 0, are determined to be older sister and younger brother, as shown in FIG. 20B.
Next, in the process of determining parent-child and marital relationships shown in FIG. 16, repeating steps S501 to S506 determines the parent-child relationships, and step S507 determines the marital relationship. As shown in FIG. 20C, the marital relationship between the father and the mother is thereby determined.
Through the above processing, the relationships among the persons making up the family are determined. As shown in FIG. 19B, the determined relationships are entered in each person's relationship item.
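The estimation can be traced end to end on the FIG. 18 family. The data layout is an assumption, and the three set expressions condense the loops of FIGS. 14 to 16 into their net effect.

```python
# The four-person family of FIG. 18: personal number -> (age, sex).
family = {"0001": (39, "M"), "0002": (36, "F"),
          "0003": (9, "F"), "0004": (7, "M")}

def age_group(age):
    return age // 10 * 10

# Step S201: assume as parents those whose age group is at least
# another member's age group plus 20 years.
parents = {p for p, (age_p, _) in family.items()
           for c, (age_c, _) in family.items()
           if age_group(age_p) >= age_group(age_c) + 20}

# Step S202: the remaining members in the 0s/10s age groups form the
# child generation; their age group difference of 0 makes them siblings.
children = {p for p, (age, _) in family.items()
            if p not in parents and age_group(age) < 20}

# Steps S203/S507: the determined parents, one man and one woman,
# form the married couple (rule (j)).
couple = {family[p][1]: p for p in parents}
```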
Next, the recommendation processing unit 35 of FIG. 2 will be described. As described above, the recommendation processing unit 35 selects commercials to recommend based on the face recognition data and family structure data together with the time data, weather data, recommended basic data, and preference data.
The time data is the current date and time (hour, minute, second). The weather data is current weather information such as sunny, cloudy, or rainy.
  The recommended basic data associates various parameters with commercial numbers. FIG. 21 shows an example of the recommended basic database 38. In this example, the commercial number is determined by the number of family members, the number of children, age, sex, season, time zone, weather, genre, and detailed genre. The recommended basic data is created on the content server 5 by the system provider in cooperation with the content provider 1 and is automatically distributed to the family structure estimation system 20.
The recommendation processing unit 35 internally holds a viewer pattern list and a recommendation list created by the system provider. FIG. 22 shows an example of the viewer pattern list, and FIG. 23 an example of the recommendation list.
In the example of the viewer pattern list shown in FIG. 22, the case where the mother is watching the television receiver alone is viewer pattern a, the case where the whole family is watching is viewer pattern b, and all other cases are viewer pattern c. In the example of the recommendation list shown in FIG. 23, for viewer pattern a (the mother watching alone), clothes suitable for the children are recommended if the genre is shopping, and a dinner menu suited to a family with children is recommended if the genre is cooking. For viewer pattern b (the whole family watching), if the genre is travel, a travel plan suitable for a family with children is recommended and an estimated accommodation fee according to the family structure is presented; if the genre is vacation, an amusement park where the children can play is recommended; and if the genre is cooking, a restaurant serving delicious dishes the children can eat is recommended.
  FIG. 24 is a sequence diagram showing the processing of the recommendation processing unit 35. In FIG. 24, when the face recognition system 21 receives an image captured by the camera 13 and identifies the persons appearing in it, the recommendation processing unit 35 receives the personal numbers identified by the face recognition unit 23, refers to the family structure data in the family structure database 34 (see FIG. 19B), and identifies which family members are in front of the camera 13 (step S701).
Next, the recommendation processing unit 35 determines which pattern in the viewer pattern list matches the viewers' situation and extracts the recommended content, genre, and detailed genre of the matching pattern from the recommendation list (step S702).
Next, the recommendation processing unit 35 refers to the family structure data in the family structure database 34, and acquires the number of family members, the number of children, and the age / sex of each person (step S703).
Next, the recommendation processing unit 35 refers to the time data of the time data generation unit 36, acquires the current time, and determines the season and time zone (step S704).
Next, the recommendation processing unit 35 refers to the weather data in the weather database 37 and acquires today's weather information (step S705).
Next, the recommendation processing unit 35 sets values as needed for the parameters (number of family members, number of children, age, sex, season, time zone, weather, genre, and detailed genre) according to the content to be recommended, and extracts the commercial numbers from the recommendation basic database 38 (see FIG. 21) using these parameters as keys (step S706).
Next, the recommendation processing unit 35 transmits a content request including the extracted commercial number to the content server 5 via the content server interface unit 40 (steps S707 and S708).
The content server 5 acquires the commercial video corresponding to the received commercial number from the content database 7 and transmits it to the content server interface unit 40 of the family structure estimation system 20 (step S709).
The content server interface unit 40 transmits the received commercial video to the television receiver 11 (step S710).
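Step S706 above, extracting commercial numbers keyed by the collected parameters, can be sketched as a wildcard match against rows of the recommendation basic database 38 (a minimal illustration; the row layout, field names, and commercial numbers below are assumptions, not FIG. 21 itself):

```python
# Each row pairs a conditions dict with a commercial number.
# A condition value of None acts as a wildcard ("not set" in step S706).
RECOMMEND_DB = [
    ({"genre": "shopping", "children": 2, "season": None},     "CM-001"),
    ({"genre": "travel",   "children": 2, "season": "spring"}, "CM-002"),
    ({"genre": "cooking",  "children": None, "season": "winter"}, "CM-003"),
]

def extract_commercial_numbers(db, **params):
    """Return commercial numbers whose set conditions all match the given keys."""
    hits = []
    for conditions, number in db:
        if all(v is None or params.get(k) == v for k, v in conditions.items()):
            hits.append(number)
    return hits

print(extract_commercial_numbers(RECOMMEND_DB,
                                 genre="travel", children=2, season="spring"))
# ['CM-002']
```

The recommendation processing unit 35 would then send the extracted numbers to the content server 5 as in steps S707 and S708.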
According to the first embodiment of the present invention described above, the face recognition system 21 identifies who is currently shown on the camera 13, and the contents of the commercial can be changed depending on who is shown.
For example, in the case of the family of four shown in FIG. 18, with a 39-year-old father, a 36-year-old mother, a 9-year-old older sister (eldest daughter), and a 7-year-old younger brother (eldest son), the content of the commercial can be changed as follows.
  When the mother is watching television alone, the system delivers commercials for clothes that would suit a 9-year-old girl or a 7-year-old boy, and commercials recommending a dinner menu suited to a family of four with two small children. When the whole family is watching television, the system delivers a commercial recommending a travel plan suitable for a family of four with two adults and two children; in that case, it can also present an estimated accommodation fee. Alternatively, it delivers a commercial recommending an amusement park where the 9-year-old and 7-year-old children are likely to enjoy playing, or a restaurant serving dishes the two children are likely to enjoy.
In addition, according to the first embodiment of the present invention, the time data generation unit 36 and the weather database 37 are provided. Thereby, the contents of the commercial can be changed in consideration of the month and day (season), time zone, weather and the like together with the above-mentioned family structure.
That is, for example, because it is spring, the system distributes a commercial for summer clothes likely to suit a 9-year-old girl or a 7-year-old boy; because it is dinner time, it refrains from recommending dinner menus; or, because cold days continue, it recommends a hot-pot dish suitable for a family of four with two adults and two children.
  As described above, in the first embodiment of the present invention, the family structure is estimated from images captured by the camera without requiring anyone to register reference face images for face recognition in advance, and commercials can be distributed according to both the estimated family structure and the viewer situation recognized by the face recognition system.
<Second Embodiment>
Next, a second embodiment of the present invention will be described. FIG. 25 shows a second embodiment of the present invention. In FIG. 25, the same parts as those in the first embodiment shown in FIG. 1 are denoted by the same reference numerals, and the description thereof is omitted. As shown in FIG. 25, in the second embodiment of the present invention, a camera 14 is provided for the personal computer 12 connected to the home network 4. A face of a person using the personal computer 12 is photographed by the camera 14, and the face of the person is recognized from a photographed image of the camera 14. Then, a preference database of the person is formed from the usage history of the personal computer 12.
  FIG. 26 shows a functional block diagram of the personal computer 12 in the second embodiment of the present invention. As shown in FIG. 26, the personal computer 12 includes a face recognition system 51, a PC operation history creation unit 52, a PC operation detection unit 53, a PC operation history database 54, a preference data analysis unit 55, and a preference database 56. And have.
  In FIG. 26, an image captured by the camera 14 is sent to the face recognition system 51. The face recognition system 51 performs face recognition from a photographed image of a camera. As the face recognition system 51, for example, the one shown in Non-Patent Document 1 can be used. The face recognition system 51 determines the user of the personal computer 12 by performing face recognition from the captured image of the camera 14. Information about the user of the personal computer 12 specified by the face recognition system 51 is sent to the PC operation history creation unit 52.
Further, the operation of the personal computer 12 is detected by the PC operation detection unit 53. The operation detection information of the PC operation detection unit 53 is sent to the PC operation history creation unit 52.
The PC operation history creation unit 52 generates PC operation history data from the information of the user of the personal computer 12 specified by the face recognition system 51 and the operation information of the personal computer 12 detected by the PC operation detection unit 53. Create and register in the PC operation history database 54.
In the PC operation history database 54, as shown in FIG. 27, the operation contents of the personal computer 12, the operation time, and the information on the user of the personal computer 12 specified by the face recognition system 51 are stored in association with each other.
  The preference data analysis unit 55 analyzes preference data for each person who operates the personal computer 12 based on the PC operation history data in the PC operation history database 54, and accumulates the preference data in the preference database 56. Specifically, the preference data analysis unit 55 extracts the PC operation history for each individual (personal number) from the PC operation history database 54 in chronological order. It then extracts the log entries in which a search site was accessed, extracts the log entries in which a search keyword was input shortly after the access time, and extracts the search keywords from those entries. The preference data analysis unit 55 then sorts the extracted search keywords in order of frequency to produce preference data. Preference data as shown in FIG. 28 is thereby generated. This preference data is sent to the family composition estimation system 20 shown in FIG. 2 and stored in the preference database 39 of the family composition estimation system 20.
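The keyword extraction and frequency sort performed by the preference data analysis unit 55 can be sketched as follows (a simplified illustration; the log tuple format and sample entries are assumptions, not the actual PC operation history schema):

```python
from collections import Counter

# Hypothetical PC operation history: (person_number, operation, detail), time-ordered.
history = [
    (1, "access", "search-site"),
    (1, "input_keyword", "camping"),
    (1, "access", "search-site"),
    (1, "input_keyword", "camping"),
    (1, "input_keyword", "fishing"),
    (2, "input_keyword", "cooking"),
]

def build_preference_data(logs, person):
    """Collect search keywords entered after a search-site access for one
    person, then sort them by frequency, as the analysis unit 55 does."""
    keywords = []
    after_access = False
    for pid, op, detail in logs:
        if pid != person:
            continue
        if op == "access" and detail == "search-site":
            after_access = True
        elif op == "input_keyword" and after_access:
            keywords.append(detail)
    return [kw for kw, _ in Counter(keywords).most_common()]

print(build_preference_data(history, 1))  # ['camping', 'fishing']
```

A real implementation would also apply the "near the access time" window described above rather than accepting every later keyword.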
  As described above, in the second embodiment of the present invention, the user of the personal computer 12 is determined by the face recognition system 51, and the operation of the personal computer 12 is detected by the PC operation detection unit 53. From the usage history of the personal computer 12, preference data is analyzed for each person who operates it. Thereby, commercials can be distributed based on each person's preferences.
<Third Embodiment>
Next, a third embodiment of the present invention will be described. In the third embodiment, viewing of television broadcast content such as inappropriate programs and commercials that should not be shown to children can be restricted.
FIG. 29 shows the configuration of a family structure estimation system according to the third embodiment of the present invention. In FIG. 29, the same parts as those in the first embodiment shown in FIG. 2 are given the same reference numerals and explanations thereof are omitted.
  In the family structure estimation system 20 in the third embodiment of the present invention, the age of the viewer is determined by the age determination unit 25 from the captured image of the camera 13. An age restriction unit 61 is provided for the recommendation processing unit 35, and the age restriction unit 61 restricts viewing of inappropriate programs and commercials.
FIG. 30 is a flowchart of the third embodiment of the present invention. In FIG. 30, the face recognition system 21 receives a moving image taken by the camera 13 installed in the television receiver 11, and the age determination unit 25 determines the age of the person shown in the camera 13 (step S801).
The age restriction unit 61 of the recommendation processing unit 35 receives the ages determined by the face recognition system 21 and determines whether only children are shown in the camera 13, or whether the persons shown include a child (step S802). If no child is shown on the camera 13, the age restriction unit 61 ends the process.
  If it is determined in step S802 that only children are shown in the camera 13, or that the persons shown include a child, the recommendation processing unit 35 notifies the content server interface unit 40 that the viewers are only children or that the viewers include a child (step S803).
The content server interface unit 40 notifies the content server 5 that the viewers are only children or that the viewers include a child (step S804). The content server 5 restricts distribution so that inappropriate content not intended for children is not distributed (step S805).
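The child check in steps S802 and S803 can be sketched as a predicate over the ages returned by the age determination unit 25 (a minimal sketch; the adult-age threshold and function name are assumptions, since the specification only speaks of "children"):

```python
ADULT_AGE = 18  # assumed threshold; the specification does not fix a number

def viewing_restriction(ages):
    """Return the notification the age restriction unit 61 would raise, or None."""
    if not ages:
        return None                      # nobody in front of the camera
    children = [a for a in ages if a < ADULT_AGE]
    if not children:
        return None                      # no child shown: process ends (S802)
    if len(children) == len(ages):
        return "viewers are only children"
    return "viewers include a child"

print(viewing_restriction([9, 7]))       # only children
print(viewing_restriction([36, 9, 7]))   # includes a child
print(viewing_restriction([39, 36]))     # no restriction
```

The returned notification corresponds to what the content server interface unit 40 forwards to the content server 5 in step S804.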
As described above, in the third embodiment of the present invention, viewing of programs and commercials inappropriate for children can be restricted by determining the ages of the viewers from the captured image of the camera 13.
<Fourth Embodiment>
FIG. 31 shows a fourth embodiment of the present invention. In FIG. 31, the same parts as those of the first embodiment shown in FIG. 1 are denoted by the same reference numerals, and the description thereof is omitted. In the fourth embodiment, cameras 15 and 16 are installed at the entrance 18 as shown in FIG.
  As shown in FIG. 32, one of the cameras 15 and 16 is installed inward and the other is installed outward. The captured images of the camera 15 and the camera 16 are sent to the face recognition system 21 of the family structure estimation system 20. The face recognition system 21 performs face recognition from images taken by the cameras 15 and 16.
  In the fourth embodiment, a person passing through the entrance 18 is photographed by the cameras 15 and 16, and a person who has passed through the entrance 18 can be determined from the captured images of the cameras 15 and 16. One of the cameras 15 and 16 is installed inward, and the other is installed outward. For this reason, it can be determined from the images of the cameras 15 and 16 whether the person has left the house or entered the house.
That is, in FIG. 32, if a face is reflected on the camera 16, it can be determined that the person has gone out. Further, if the face is reflected on the camera 15, it can be determined that the person has returned home.
In this fourth embodiment, the face recognition system 21 of the family structure estimation system 20 detects whose face appeared at what time from the video taken by the two cameras 15 and 16 installed near the entrance 18, and leaves a log. The face recognition system 21 of the family composition estimation system 20 then analyzes the log, determines goings-out and returns home, and grasps which family members are at home and which are away.
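The log analysis described above can be sketched by replaying entrance events per person, where the most recent event decides whether that person is currently at home (the log format and event labels are assumptions for illustration):

```python
# Hypothetical entrance log: (time, person_number, event), where the event is
# "out" if the face appeared on the outward camera 16 and "in" if it appeared
# on the inward camera 15.
entrance_log = [
    ("08:10", 1, "out"),   # father leaves
    ("08:30", 3, "out"),   # daughter leaves
    ("12:00", 3, "in"),    # daughter returns
]

def at_home_status(log, family_members, initially_home=True):
    """Return {person: True/False} from the latest entrance event per person."""
    status = {p: initially_home for p in family_members}
    for _, person, event in sorted(log):
        status[person] = (event == "in")
    return status

print(at_home_status(entrance_log, [1, 2, 3, 4]))
# {1: False, 2: True, 3: True, 4: True}
```

The recommendation processing unit 35 could consult such a status map when selecting a pattern from the viewer pattern list.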
Also in this fourth embodiment, the recommendation processing unit 35 acquires, by log analysis, information on which family members are currently at home. The recommendation processing unit 35 then selects an appropriate pattern from the viewer pattern list, considering not only the situation of the viewers but also which family members are at home or absent.
In the fourth embodiment of the present invention, it is thus possible to grasp which family members are at home or away, and to distribute commercials accordingly.
For example, consider the family of four shown in FIG. 18: a 39-year-old father, a 36-year-old mother, a 9-year-old older sister (eldest daughter), and a 7-year-old younger brother (eldest son). In this case, even when the father is at home but not watching television while the mother and the two children are watching, a commercial recommending a travel plan suitable for a family of four with two adults and two children is delivered. When the father is not at home and the mother and the two children are watching television, a commercial for clothes that would suit a 9-year-old girl or a 7-year-old boy is delivered.
In the above description, an example in which two cameras are installed is shown. However, it is also possible to install only one camera and determine, by image recognition of the direction of movement, whether a person is going out or returning home.
  As described above, according to the embodiments of the present invention, the accuracy of face recognition can be improved by determining the recognition difficulty level from the face orientation, brightness, face size, and the like, and excluding face data that is difficult to recognize. In addition, a statistical technique is used to determine whether a visitor is in front of the camera 13, and the family composition is estimated after removing visitors from the persons photographed by the camera 13. Further, the family structure of the photographed persons is traced by a logical method based on the age and sex of each person whose face is recognized, estimating relationships such as sibling, marital, and parent-child relationships. Thus, even a complicated family structure can be estimated with high accuracy without having the viewers register reference face images for face recognition in advance.
  Further, according to the embodiments of the present invention, it is possible to grasp the age, sex, family structure, and preferences of the persons actually watching television, and to distribute commercials suited to them. In addition, if the television viewers are only children, or if the viewers include a child, inappropriate content not intended for children can be prevented from being distributed. It is also possible to collect detailed information such as whether a commercial was actually watched, how many people were watching it, whether they were male or female, how much they smiled, and what kind of family structure they had.
Note that the present invention is not limited to the above-described embodiment, and various modifications and applications are possible without departing from the gist of the present invention.
For example, by using the face recognition system, detailed information such as how many people watched a commercial, their estimated ages, sex, facial expressions while viewing, and their family structure can be collected and fed back to the content provider 1 to analyze advertising effectiveness.
  Further, when determining whether a person shown in the camera 13 is a viewer, it may be determined whether the person is actually viewing based on whether the face is directed toward the front of the camera. In that case, the number of people who were in front of the television but did not watch the commercial can also be grasped, so more detailed information can be collected. It is also possible to estimate how well a commercial is liked by grasping facial expressions, such as the degree of smiling, while the commercial is playing. By examining the change in the number of detected faces while the commercial is playing, it is possible to grasp, for example, that a viewer left partway through the commercial.
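The face-count analysis mentioned above can be sketched as comparing the number of detected faces at the start and end of a commercial (a simplified illustration; the per-second sampling format and function name are assumptions):

```python
def audience_dropoff(face_counts):
    """Given per-second detected-face counts during a commercial, report how
    many viewers were present at the start and how many left before the end."""
    start, end = face_counts[0], face_counts[-1]
    return {"start": start, "end": end, "left": max(0, start - end)}

# Three faces detected at the start of the commercial, one at the end:
print(audience_dropoff([3, 3, 2, 1, 1]))  # {'start': 3, 'end': 1, 'left': 2}
```

A fuller analysis would also track when each drop occurred, to report at which point in the commercial viewers left.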
  A program for realizing the functions of the processing units in the present invention may be recorded on a computer-readable recording medium, and processing such as family structure determination may be performed by reading the program recorded on the recording medium into a computer system and executing it. Here, the "computer system" includes an OS and hardware such as peripheral devices. The "computer system" also includes a WWW system having a homepage providing environment (or display environment). The "computer-readable recording medium" refers to a portable medium such as a flexible disk, a magneto-optical disk, a ROM, or a CD-ROM, or a storage device such as a hard disk built into the computer system. Further, the "computer-readable recording medium" also includes a medium that holds the program for a certain period of time, such as a volatile memory (RAM) in a computer system serving as a server or a client when the program is transmitted via a network such as the Internet or a communication line such as a telephone line.
  The program may be transmitted from a computer system storing the program in a storage device or the like to another computer system via a transmission medium, or by transmission waves in the transmission medium. Here, the "transmission medium" that transmits the program refers to a medium having a function of transmitting information, such as a network (communication network) like the Internet or a communication line (communication channel) like a telephone line. The program may realize only a part of the functions described above. Furthermore, the program may be a so-called difference file (difference program) that realizes the functions described above in combination with a program already recorded in the computer system.
1: Content provider
2: Home
3: Communication network
4: Home network
5: Content server
6: Home gateway
7: Content database
11: Television receiver
12: Personal computer
13-16: Camera
18: Entrance
20: Family structure estimation system
21: Face recognition system
22: Face detection unit
23: Face recognition unit
24: Expression determination unit
25: Age determination unit
26: Gender determination unit
27: Face orientation determination unit
28: Brightness determination unit
29: Face size determination unit
30: Face database
31: Log
32: Recognition difficulty determination unit
33: Family configuration estimation unit
34: Family configuration database
35: Recommendation processing unit
36: Time data generation unit
37: Weather database
38: Recommendation basic database
39: Preference database
40: Content server interface unit
51: Face recognition system
52: PC operation history creation unit
53: PC operation detection unit
54: PC operation history database
55: Preference data analysis unit
56: Preference database
61: Age restriction unit

Claims (7)

  1. An information collection system to which a determination result output from a face recognition system that determines the age and sex of a person based on a face image obtained by photographing a person's face is input,
    A camera that captures face images of a plurality of persons in a defined home;
    A face database that stores the determination result determined and output by the face recognition system based on the face image captured by the camera;
    An information collection system comprising: a family configuration estimation unit that determines a family configuration by tracing a relationship of the family configuration in the home according to the age and sex indicated by the determination result.
  2. The information collection system according to claim 1, wherein the family composition estimating means determines a parent-child relationship and a sibling relationship between the persons based on the age group differences and the gender of the plurality of persons indicated by the determination result stored in the face database, and determines a marital relationship of a pair of a man and a woman based on the age group difference and the gender.
  3. The information collection system according to claim 1 or 2, wherein the family composition estimation means determines the going-out patterns of the plurality of persons, determines whether a person is a visitor based on the going-out pattern and the time zone in which the person was photographed by the camera, and excludes the person determined to be a visitor when determining the family composition.
  4. The information collection system according to any one of claims 1 to 3, further comprising recognition difficulty determination means for determining the difficulty of face recognition based on the face image photographed by the camera,
    wherein the family composition estimation means, when determining the family composition, excludes a face image whose difficulty level determined by the recognition difficulty determination means is greater than or equal to a predetermined value.
  5. The information collection system according to claim 4, wherein the recognition difficulty level determination unit determines the recognition difficulty level based on an attribute of a face image captured by the camera.
  6. The information collection system according to claim 1, wherein the information collection system is connected to a content server that distributes, in response to a content request, content selected from a plurality of defined contents, and to a television receiver that receives and displays the content,
    the information collection system further comprising a recommendation processing unit that determines the content to be transmitted to the television receiver based on the family composition determined by the family composition estimation means and the face image captured by the camera, and transmits a content request for the determined content to the content server.
  7. An information collection method for an information collection system comprising a face database in which a determination result output from a face recognition system that determines the age and sex of a person based on a photographed face image is stored, the method comprising:
    a camera photographing face images of a plurality of persons in a defined home;
    the face recognition system outputting the determination result based on the face images photographed by the camera; and
    family structure estimation means determining a family structure by tracing the relationships of the family structure in the home according to the age and gender indicated by the determination result.
JP2009280568A 2009-12-10 2009-12-10 Information collection system and method Pending JP2011123657A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2009280568A JP2011123657A (en) 2009-12-10 2009-12-10 Information collection system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2009280568A JP2011123657A (en) 2009-12-10 2009-12-10 Information collection system and method

Publications (1)

Publication Number Publication Date
JP2011123657A true JP2011123657A (en) 2011-06-23

Family

ID=44287494

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2009280568A Pending JP2011123657A (en) 2009-12-10 2009-12-10 Information collection system and method

Country Status (1)

Country Link
JP (1) JP2011123657A (en)


Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9338311B2 (en) 2011-06-14 2016-05-10 Canon Kabushiki Kaisha Image-related handling support system, information processing apparatus, and image-related handling support method
JP2013140196A (en) * 2011-12-28 2013-07-18 Toshiba Tec Corp Information display device and program
JP2013150086A (en) * 2012-01-18 2013-08-01 Sony Corp Behavior information recognition system, information processing apparatus and behavior information recognition method
JP2013186791A (en) * 2012-03-09 2013-09-19 Kddi Corp Signage reproduction system, signage reproduction method and program
JP2014014014A (en) * 2012-07-04 2014-01-23 Toshiba Tec Corp Information distribution device, signage system and program
JP2014042235A (en) * 2012-07-24 2014-03-06 Takashi Hirabayashi Audience rating survey system, facial expression information generation device, and facial expression information generation program
JP2015532803A (en) * 2012-08-07 2015-11-12 ウエブチユーナー・コーポレイシヨン Targeting multimedia ads and recommending content with a viewer identifier detection system
JP2014089703A (en) * 2012-10-04 2014-05-15 Denso Wave Inc Supporting system
JP2016174330A (en) * 2015-03-18 2016-09-29 カシオ計算機株式会社 Information processing device, content determination method and program
US10163021B2 (en) 2015-03-18 2018-12-25 Casio Computer Co., Ltd. Information processor, content determining method, and computer-readable storing medium
