WO2016035632A1 - Data processing device, data processing system, data processing method, and program - Google Patents

Data processing device, data processing system, data processing method, and program

Info

Publication number
WO2016035632A1
Authority
WO
WIPO (PCT)
Prior art keywords
attribute information
estimated
person
data processing
user
Prior art date
Application number
PCT/JP2015/073955
Other languages
French (fr)
Japanese (ja)
Inventor
一秀 梅田
Original Assignee
NEC Solution Innovators, Ltd. (Necソリューションイノベータ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Solution Innovators, Ltd. (Necソリューションイノベータ株式会社)
Priority to JP2016546579A (patent JP6267350B2)
Publication of WO2016035632A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 1/00 General purpose image data processing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00 Commerce
    • G06Q 30/06 Buying, selling or leasing transactions

Definitions

  • the present invention relates to a data processing device, a data processing system, a data processing method, and a program.
  • Patent Document 1 discloses an information processing apparatus that performs face recognition on a face image and extracts attribute information representing a user's sex and age group.
  • By using the technique disclosed in Patent Document 1, it is possible, for example, to analyze users who have come to a certain spot (store, facility, etc.). However, since the accuracy of face recognition using face images is not sufficient, the analysis accuracy also becomes insufficient.
  • An object of the present invention is to provide a new technique for analyzing users who have come to a certain spot.
  • According to the present invention, there is provided a data processing apparatus comprising: attribute information acquisition means for acquiring user attribute information from each user portable terminal located in an area within a predetermined distance from an installation device; captured image acquisition means for acquiring image data of a captured image captured in the area; image processing means for analyzing the captured image to recognize persons appearing in the captured image and for estimating attribute information for each recognized person to generate estimated attribute information; and extraction means for extracting at least one of: the attribute information and the estimated attribute information that redundantly represent the same person, the estimated attribute information representing a person not represented by the attribute information, and the attribute information representing a person not represented by the estimated attribute information.
  • According to the present invention, there is also provided a data processing method in which a computer executes: an attribute information acquisition step of acquiring user attribute information from each user portable terminal located in an area within a predetermined distance from an installation device; a captured image acquisition step of acquiring image data of a captured image captured in the area; an image processing step of analyzing the captured image to recognize persons appearing in the captured image and estimating attribute information for each recognized person to generate estimated attribute information; and an extraction step of extracting at least one of: the attribute information and the estimated attribute information that redundantly represent the same person, the estimated attribute information representing a person not represented by the attribute information, and the attribute information representing a person not represented by the estimated attribute information.
  • According to the present invention, there is also provided a program for causing a computer to function as: attribute information acquisition means for acquiring user attribute information from each user portable terminal located in an area within a predetermined distance from an installation device; captured image acquisition means for acquiring image data of a captured image captured in the area; image processing means for analyzing the captured image to recognize persons appearing in the captured image and for estimating attribute information for each recognized person to generate estimated attribute information; and extraction means for extracting at least one of: the attribute information and the estimated attribute information that redundantly represent the same person, the estimated attribute information representing a person not represented by the attribute information, and the attribute information representing a person not represented by the estimated attribute information.
  • Each unit included in each embodiment is realized by an arbitrary combination of hardware and software, centered on a CPU (Central Processing Unit), a memory, a program loaded into the memory (including programs stored in the memory in advance from the shipping stage of the device, as well as programs provided on storage media such as CDs (Compact Discs) or downloaded from servers on the Internet), a storage unit such as a hard disk that stores the program, and a network connection interface. It will be understood by those skilled in the art that various modifications can be made to the realization method and apparatus.
  • In the following description, each device and each terminal is described as being realized by a single apparatus, but the means for realizing them is not limited thereto; they may be physically or logically separated configurations.
  • In all the drawings, the same reference symbols are attached to the same components, and description thereof is omitted as appropriate.
  • The data processing system of this embodiment includes at least one each of the data processing device 10, the installation device 20, the user portable terminal 30, and the photographing device 40.
  • The installation device 20 is installed at a spot (predetermined place) where user information is collected. Examples of such spots include stores, facilities, buildings, and the like; more specifically, the installation device 20 is installed at an entrance/exit or at an arbitrary position within the spot.
  • The installation device 20 is configured to communicate according to a predetermined wireless communication standard whose communication distance is within a predetermined distance, and performs wireless communication with a user portable terminal 30 that has entered the communication area 20A.
  • In FIG. 1, one installation device 20 is shown, but a plurality of installation devices 20 may be installed at a certain spot.
  • For example, installation devices 20 may be installed both at an entrance/exit of a spot and at arbitrary positions inside the spot.
  • the user portable terminal 30 is carried by each user.
  • the user portable terminal 30 is configured to be able to perform wireless communication according to the wireless communication standard.
  • The user portable terminal 30 stores user attribute information (name, age, sex, etc.) in advance. When the user portable terminal 30 enters the communication area 20A and performs wireless communication with the installation device 20, it transmits the user attribute information accordingly to the data processing device 10 via the network 1, such as a wireless LAN (Local Area Network) or the Internet.
  • The photographing device 40 photographs at least a part of the communication area 20A. The captured image data is then transmitted to the data processing device 10 by wired and/or wireless communication.
  • In FIG. 1, the communication between the data processing device 10 and the photographing device 40 and the communication between the data processing device 10 and the user portable terminal 30 are realized by different networks, but they may be realized by the same network.
  • When a plurality of installation devices 20 are installed, an imaging device 40 is installed corresponding to each of their communication areas 20A and images the inside of each communication area 20A.
  • the data processing apparatus 10 accumulates user attribute information collected from the user portable terminal 30.
  • The data processing apparatus 10 also performs image analysis on the captured images acquired from the imaging device 40 to recognize the persons shown in the images and to estimate the attributes (age, sex, etc.) of each recognized person.
  • Hereinafter, an attribute of a user estimated by image analysis is referred to as "estimated attribute information".
  • The data processing apparatus 10 analyzes the users who have come to the spot using the user attribute information (including estimated attribute information) collected by these two means.
  • Various analyses are possible; for example, the following can be considered.
  • User attribute information or estimated attribute information can be collected to some extent by either means alone, that is, by the means for acquiring estimated attribute information of users from face images or by the means for acquiring attribute information of users from the user portable terminals 30, and the attributes of users who come to the spot can be analyzed. In either case, however, there is a problem that the accuracy is not sufficient.
  • In contrast, by analyzing the user attribute information and estimated attribute information collected by both of these means, attribute information or estimated attribute information of a user that cannot be acquired by one means can be acquired by the other, so the accuracy can be improved.
  • When both means are used, attribute information and estimated attribute information that redundantly represent the same user may be generated. Such attribute information and estimated attribute information can be extracted and integrated. As a result, when analyzing the attributes of users who have come to the spot, inconveniences such as counting one user as two or more users can be eliminated.
  • It is also possible to extract a user whose attribute information was not acquired by the means for acquiring attribute information from the user portable terminal 30 but whose estimated attribute information was acquired by the means using a face image. Such a user can be presumed to be a user who does not carry a user portable terminal 30, or whose terminal is turned off.
  • Conversely, it is possible to extract a user whose attribute information was acquired by the means for acquiring attribute information from the user portable terminal 30 but whose estimated attribute information could not be obtained by the means using a face image.
  • Cases where the collection of estimated attribute information using a face image has failed can be accumulated by storing, in association with the attribute information, the captured image at the timing when the attribute information was acquired.
  • As described above, the installation device 20 is configured to communicate according to a predetermined wireless communication standard whose communication distance is within a predetermined distance, and performs wireless communication with a user portable terminal 30 that has entered the communication area 20A.
  • The detection information transmission unit 21 periodically or intermittently transmits detection information, using the predetermined wireless communication standard whose communication distance is within a predetermined distance, into the communication area within the predetermined distance from its own device (communication area 20A in FIG. 1).
  • the detection information may include identification information of the installation device 20.
  • A user portable terminal 30 located in the communication area (communication area 20A in FIG. 1) receives the detection information.
  • Examples of the wireless communication standard include, but are not limited to, a Bluetooth standard and a wireless LAN standard.
  • the installation device 20 may be, for example, a so-called beacon terminal or a wireless LAN access point, but is not limited thereto.
  • the user portable terminal 30 stores user attribute information in advance.
  • The attribute information includes various items such as the user's name, nickname, age, gender, occupation, and hobbies.
  • The user portable terminal 30 is configured to communicate according to the same wireless communication standard as the installation device 20. When the user portable terminal 30 enters the communication area 20A of the installation apparatus 20 and communicates with the installation apparatus 20 according to the wireless communication standard, it transmits the user attribute information stored in the terminal to the data processing apparatus 10 in response.
  • The user portable terminal 30 may transmit to the data processing device 10, in association with the user attribute information, date/time information such as the date and time when the attribute information was transmitted to the data processing device 10 or the date and time when the user portable terminal 30 communicated with the installation device 20.
  • When the user portable terminal 30 enters the communication area 20A of the installation apparatus 20, it receives the detection information transmitted from the installation apparatus 20 periodically or intermittently. The user portable terminal 30 then transmits the user attribute information to the data processing device 10 in response to reception of the detection information. For example, when receiving the detection information, the user portable terminal 30 may write the user attribute information at a predetermined position in the detection information and transmit it to the data processing apparatus 10.
  • While the user portable terminal 30 is in the communication area 20A, it continues to receive the detection information transmitted periodically or intermittently from the installation apparatus 20. The user portable terminal 30 may transmit the user attribute information to the data processing device 10 each time it receives the detection information. Alternatively, when the detection information is continuously received from the installation device 20 at time intervals shorter than a predetermined time, the user portable terminal 30 may be configured not to transmit the user attribute information for the second and subsequent receptions.
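  • As a rough illustration of this transmission and suppression behavior, the following is a minimal sketch (Python; the names `send_attributes` and `SUPPRESS_WINDOW`, and the window length, are assumptions — the patent does not specify an implementation):

```python
import time

SUPPRESS_WINDOW = 60.0  # seconds; assumed value ("predetermined time" in the text)

class UserPortableTerminalSketch:
    """Transmits stored attribute information when detection information arrives."""

    def __init__(self, attributes, send_attributes):
        self.attributes = attributes            # e.g. {"name": "...", "age": 42, "sex": "male"}
        self.send_attributes = send_attributes  # callback forwarding data to the data processing device
        self.last_seen = {}                     # installation-device ID -> last reception time

    def on_detection_info(self, device_id):
        now = time.time()
        last = self.last_seen.get(device_id)
        self.last_seen[device_id] = now
        # Suppress re-transmission while detection information keeps arriving
        # from the same installation device at intervals shorter than the window.
        if last is not None and (now - last) < SUPPRESS_WINDOW:
            return
        self.send_attributes(device_id, self.attributes, timestamp=now)
```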
  • The user portable terminal 30 may realize the above-described functions by installing a predetermined application on an existing portable terminal such as a smartphone or other mobile phone, a tablet terminal, or a portable game machine.
  • the user portable terminal 30 may be a terminal prepared exclusively for the data processing system of the present embodiment.
  • The imaging device 40 is installed corresponding to the installation device 20 and images the communication area 20A of the installation device 20. For example, the imaging device 40 may capture a moving image, or may capture still images periodically (at longer time intervals than a moving image).
  • the imaging device 40 transmits the captured image data to the data processing device 10 by wired and / or wireless communication.
  • The imaging device 40 may transmit information that identifies itself to the data processing device 10 so that the source of the transmitted image data can be specified.
  • The imaging device 40 may further transmit to the data processing device 10 date/time information specifying the imaging date/time of each piece of image data (each frame).
  • It is preferable that the photographing device 40 can photograph the users who have entered the communication area 20A without omission.
  • To that end, the imaging device 40 may be installed so that the entire communication area 20A can be imaged, for example.
  • Alternatively, when the communication area 20A contains a point through which users always pass, the imaging device 40 may be set to image that point. In this case, the entire communication area 20A need not be captured.
  • FIG. 2 is a diagram conceptually illustrating an example of the hardware configuration of the data processing apparatus 10 of the present embodiment.
  • The data processing apparatus 10 of this embodiment includes, for example, a CPU 1A, a RAM (Random Access Memory) 2A, a ROM (Read Only Memory) 3A, a display control unit 4A, a touch panel display 5A, an operation reception unit 6A, an operation unit 7A, a communication unit 8A, and an auxiliary storage device 9A, which are connected to one another via a bus 10A.
  • Other elements may also be provided, such as an input/output interface connected to external devices by wire, a microphone, and a speaker. Conversely, some of the illustrated elements may be omitted.
  • The CPU 1A controls the entire computer of the data processing apparatus 10 together with the other elements.
  • The ROM 3A includes an area for storing programs for operating the computer, various application programs, and various setting data used when these programs operate.
  • the RAM 2A includes an area for temporarily storing data, such as a work area for operating a program.
  • the auxiliary storage device 9A is, for example, an HDD (Hard Disc Drive), and can store a large amount of data.
  • the touch panel display 5A includes a display device (LED (Light Emitting Diode) display, liquid crystal display, organic EL (Electro Luminescence) display, etc.) and a touch pad.
  • the display control unit 4A reads data stored in a VRAM (Video RAM), performs predetermined processing on the read data, and then sends the data to the touch panel display 5A to display various screens.
  • the operation reception unit 6A receives various operations via the operation unit 7A.
  • the operation unit 7A includes operation keys, operation buttons, switches, a jog dial, a touch panel display, a keyboard, and the like.
  • the communication unit 8A is wired and / or wirelessly connected to a network such as the Internet or a LAN, and communicates with other electronic devices.
  • FIG. 3 shows an example of a functional block diagram of the data processing apparatus 10.
  • the data processing apparatus 10 includes an attribute information acquisition unit 11, a captured image acquisition unit 12, an image processing unit 13, and an extraction unit 14.
  • the attribute information acquisition unit 11 acquires user attribute information from each of the user portable terminals 30 located in an area within a predetermined distance from the installation device 20 (communication area 20A in FIG. 1). That is, the attribute information acquisition unit 11 receives the user attribute information transmitted from the user portable terminal 30 as described above.
  • acquisition date information may be associated with the user attribute information acquired by the attribute information acquisition unit 11.
  • The acquisition date/time information here may be, for example, the date/time information associated with the attribute information by the user portable terminal 30, or information indicating the date and time when the attribute information acquisition unit 11 acquired the attribute information from the user portable terminal 30.
  • FIG. 4 schematically shows an example of a plurality of pieces of attribute information acquired and accumulated by the attribute information acquisition unit 11.
  • In the example shown in FIG. 4, an attribute information ID ("attribute ID"), the attribute information, and the acquisition date/time information are associated with each other.
  • The plurality of pieces of attribute information shown in FIG. 4 are arranged in order of acquisition date/time.
  • For example, it can be seen that the attribute information of attribute ID "A0001" is "Mr. XX, 42 years old, male", and that its acquisition date/time is "July 17, 2014, 13:17:15".
  • the captured image acquisition unit 12 acquires from the imaging device 40 image data of a captured image captured in an area within a predetermined distance from the installation device 20 (communication area 20A in FIG. 1). That is, the captured image acquisition unit 12 acquires the image data transmitted from the imaging device 40 as described above.
  • The image processing unit 13 analyzes the captured image of the image data acquired by the captured image acquisition unit 12, thereby recognizing the persons shown in the captured image and estimating attribute information for each recognized person to generate estimated attribute information.
  • the estimated attribute information may include, for example, the age and gender of the recognized person.
  • the image processing unit 13 may extract a predetermined feature amount by analyzing a captured image for each recognized person.
  • the feature amount may be a feature amount extracted from each person's face, or may be a feature amount (color, shape, etc.) extracted from each person's clothes or belongings.
  • After the image processing unit 13 analyzes the captured image of a certain frame and recognizes a new person, it may track that person in the analysis of subsequent frames and collect the image data of the person captured continuously across a plurality of frames as one image-processing unit.
  • Image analysis may then be performed on that image-processing unit to generate one piece of estimated attribute information. Even for the same person, if the person frames in, then frames out, and then frames in again, the captured images before the frame-out can be treated as one image-processing unit and the captured images after the re-entry as another image-processing unit.
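  • The grouping of consecutive frames into image-processing units can be pictured with the following sketch (Python; `detect_people` and `same_person` are hypothetical helpers — the patent does not fix a tracking algorithm, and this simplified matcher ignores cases such as two detections claiming the same track):

```python
def group_into_units(frames, detect_people, same_person):
    """Group per-frame detections of one person into image-processing units."""
    units = []    # each unit: list of (frame_index, detection) for one person
    active = []   # units whose person appeared in the previous frame
    for i, frame in enumerate(frames):
        next_active = []
        for det in detect_people(frame):
            for unit in active:
                if same_person(unit[-1][1], det):   # person still in frame
                    unit.append((i, det))
                    next_active.append(unit)
                    break
            else:
                # New person, or a person who framed out and re-entered:
                # either way a new image-processing unit is started.
                new_unit = [(i, det)]
                units.append(new_unit)
                next_active.append(new_unit)
        active = next_active  # units not continued are closed (person framed out)
    return units
```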
  • FIG. 5 schematically shows an example of a plurality of estimated attribute information generated by the image processing unit 13.
  • In the example shown in FIG. 5, an estimated attribute information ID ("estimated ID"), the estimated attribute information, a feature amount, and acquisition date/time information are associated with each other.
  • The plurality of pieces of estimated attribute information shown in FIG. 5 are arranged in order of acquisition date/time.
  • The acquisition date/time information indicates the range of shooting dates/times of the one or more frames included in the image-processing unit from which each piece of estimated attribute information was generated.
  • For example, it can be seen that the estimated attribute information of estimated ID "B0001" is "40s, male", that the feature amount extracted from the person is "xxx", and that the acquisition date/time is "July 17, 2014, 13:17:14 to July 17, 2014, 13:17:17".
  • Each piece of estimated attribute information may further be associated with the image data from which it was generated.
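  • The two kinds of records described with FIG. 4 and FIG. 5 can be pictured roughly as follows (a Python sketch; the field names are assumptions based on the figure descriptions, not taken from the patent):

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional, Tuple

@dataclass
class AttributeRecord:
    """A FIG. 4-style record: attribute information reported by a user portable terminal."""
    attribute_id: str       # e.g. "A0001"
    name: str
    age: int
    sex: str
    acquired_at: datetime   # single acquisition timestamp

@dataclass
class EstimatedRecord:
    """A FIG. 5-style record: attribute information estimated by image analysis."""
    estimated_id: str       # e.g. "B0001"
    age_group: str          # e.g. "40s"
    sex: str
    feature: bytes          # feature amount extracted from the face, clothes, etc.
    acquired_span: Tuple[datetime, datetime]  # shooting range of the image-processing unit
    image_data: Optional[bytes] = None        # optionally, the source image data
```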
  • The extraction unit 14 extracts at least one of: attribute information and estimated attribute information that redundantly represent the same person, estimated attribute information representing a person not represented by the attribute information, and attribute information representing a person not represented by the estimated attribute information.
  • The extraction unit 14 may further extract estimated attribute information that redundantly represents the same person from among a plurality of pieces of estimated attribute information generated by analyzing each of a plurality of captured images captured at different timings.
  • the “plurality of captured images captured at different timings” may be captured images having different image processing units as described above.
  • the extraction unit 14 may extract estimated attribute information that represents the same person in duplicate from the plurality of estimated attribute information based on the feature amount of each user extracted by the image processing unit 13.
  • Processing example 1 (extracts attribute information and estimated attribute information that redundantly represent the same person): First, the extraction unit 14 extracts the estimated attribute information acquired at the same timing as one piece of attribute information determined as the processing target.
  • For example, estimated attribute information whose acquisition date/time range includes the acquisition date/time of the processing target may be extracted as estimated attribute information acquired at the same timing.
  • Alternatively, estimated attribute information for which the difference between a representative time of its acquisition date/time range (any time within the range) and the acquisition date/time of the processing target (attribute information) is within a predetermined time may be extracted as estimated attribute information acquired at the same timing.
  • Next, the extraction unit 14 calculates the degree of coincidence between each piece of extracted estimated attribute information and the processing target (attribute information), based on a predetermined algorithm that uses their values. Based on the calculated degree of coincidence, it then determines whether the attribute information to be processed and the extracted estimated attribute information represent the same person. For example, when the degree of coincidence is equal to or greater than a predetermined value, it may determine that they represent the same person.
  • If there are a plurality of pieces of estimated attribute information whose degree of coincidence with the attribute information to be processed is equal to or greater than the predetermined value, the extraction unit 14 may determine that the estimated attribute information with the highest degree of coincidence and the attribute information to be processed represent the same person.
  • For example, suppose the extraction unit 14 sets attribute ID "A0001" in FIG. 4 as the processing target.
  • Then the estimated attribute information of estimated IDs "B0001" and "B0002" is extracted from the estimated attribute information shown in FIG. 5 as estimated attribute information acquired at the same timing.
  • The extraction unit 14 calculates the degree of coincidence between the attribute information of attribute ID "A0001" and the estimated attribute information of estimated ID "B0001", and between the attribute information of attribute ID "A0001" and the estimated attribute information of estimated ID "B0002".
  • A higher degree of coincidence is calculated for the estimated attribute information of estimated ID "B0001", whose gender and age group match, than for estimated ID "B0002".
  • Suppose the degree of coincidence for the estimated attribute information of estimated ID "B0001" is equal to or greater than the predetermined value. The extraction unit 14 then determines that the attribute information of attribute ID "A0001" in FIG. 4 and the estimated attribute information of estimated ID "B0001" in FIG. 5 represent the same person.
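  • Processing Example 1 can be sketched roughly as follows, using the record sketch above (Python; `match_degree`, the threshold, and the timing tolerance are assumptions — the patent leaves the degree-of-coincidence algorithm unspecified):

```python
from datetime import timedelta

MATCH_THRESHOLD = 0.8                          # assumed "predetermined value"
SAME_TIMING_TOLERANCE = timedelta(seconds=5)   # assumed tolerance for "same timing"

def match_degree(attr, est):
    """Hypothetical degree of coincidence: compare sex, and whether the
    reported age falls in the estimated age group (e.g. 42 is in "40s")."""
    score = 0.0
    if attr.sex == est.sex:
        score += 0.5
    if est.age_group and (attr.age // 10) * 10 == int(est.age_group.rstrip("s")):
        score += 0.5
    return score

def find_same_person(target_attr, estimated_records):
    # Step 1: estimated records acquired at the same timing as the target
    # (the target's acquisition time falls inside the record's shooting span).
    candidates = [
        e for e in estimated_records
        if e.acquired_span[0] - SAME_TIMING_TOLERANCE
           <= target_attr.acquired_at
           <= e.acquired_span[1] + SAME_TIMING_TOLERANCE
    ]
    # Step 2: keep candidates at or above the threshold and pick the best one.
    scored = [(match_degree(target_attr, e), e) for e in candidates]
    scored = [(s, e) for s, e in scored if s >= MATCH_THRESHOLD]
    if not scored:
        return None  # no estimated record represents the same person
    return max(scored, key=lambda t: t[0])[1]
```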
  • Processing example 2 (extracts attribute information and estimated attribute information that redundantly represent the same person): First, the extraction unit 14 extracts the attribute information acquired at the same timing as one piece of estimated attribute information determined as the processing target.
  • For example, attribute information whose acquisition date/time is included in the acquisition date/time range of the processing target may be extracted as attribute information acquired at the same timing.
  • Alternatively, attribute information for which the difference between its acquisition date/time and a representative time of the processing target (any time within the acquisition date/time range of the estimated attribute information) is within a predetermined time may be extracted as attribute information acquired at the same timing.
  • Next, the extraction unit 14 calculates the degree of coincidence between the processing target (estimated attribute information) and each piece of extracted attribute information, based on a predetermined algorithm that uses their values. Based on the calculated degree of coincidence, it then determines whether the estimated attribute information to be processed and the extracted attribute information represent the same person. For example, when the degree of coincidence is equal to or greater than a predetermined value, it may determine that they represent the same person. If there are a plurality of pieces of attribute information whose degree of coincidence with the estimated attribute information to be processed is equal to or greater than the predetermined value, the extraction unit 14 may determine that the attribute information with the highest degree of coincidence and the estimated attribute information to be processed represent the same person.
  • For example, suppose the extraction unit 14 sets estimated ID "B0001" in FIG. 5 as the processing target.
  • Then the attribute information of attribute IDs "A0001" and "A0002" is extracted from the attribute information shown in FIG. 4 as attribute information acquired at the same timing.
  • The extraction unit 14 calculates the degree of coincidence between the estimated attribute information of estimated ID "B0001" and the attribute information of attribute ID "A0001", and between the estimated attribute information of estimated ID "B0001" and the attribute information of attribute ID "A0002".
  • A higher degree of coincidence is calculated for the attribute information of attribute ID "A0001", whose gender and age group match, than for attribute ID "A0002".
  • Suppose the degree of coincidence for the attribute information of attribute ID "A0001" is equal to or greater than the predetermined value. The extraction unit 14 then determines that the estimated attribute information of estimated ID "B0001" in FIG. 5 and the attribute information of attribute ID "A0001" in FIG. 4 represent the same person.
  • Processing example 3 (extracts estimated attribute information representing a person not represented by attribute information)
  • the extraction unit 14 extracts attribute information acquired at the same timing as one piece of estimated attribute information determined as a processing target, for example, by the same method as in Processing Example 2.
  • If no attribute information acquired at the same timing is extracted, the extraction unit 14 extracts the estimated attribute information to be processed as estimated attribute information representing a person not represented by the attribute information.
  • On the other hand, if attribute information acquired at the same timing is extracted, the extraction unit 14 calculates the degree of coincidence between the processing target (estimated attribute information) and the extracted attribute information, based on a predetermined algorithm that uses their values, and determines from the calculated degree of coincidence whether they represent the same person. When there is no attribute information whose degree of coincidence with the processing target is equal to or greater than the predetermined value, the extraction unit 14 extracts the processing target (estimated attribute information) as estimated attribute information representing a person not represented by the attribute information.
  • For example, suppose the extraction unit 14 sets estimated ID "B0003" in FIG. 5 as the processing target.
  • In this case, no attribute information acquired at the same timing is extracted from the attribute information shown in FIG. 4. Therefore, the extraction unit 14 extracts the estimated attribute information of estimated ID "B0003" as estimated attribute information representing a person not represented by the attribute information.
  • Processing example 4 (extracts attribute information representing a person not represented by estimated attribute information)
  • the extraction unit 14 extracts estimated attribute information acquired at the same timing as one piece of attribute information determined as a processing target, for example, using the same method as in Processing Example 1.
  • If no estimated attribute information acquired at the same timing is extracted, the extraction unit 14 extracts the attribute information to be processed as attribute information representing a person not represented by the estimated attribute information.
  • On the other hand, if estimated attribute information acquired at the same timing is extracted, the extraction unit 14 calculates the degree of coincidence between the processing target (attribute information) and the extracted estimated attribute information, based on a predetermined algorithm that uses their values, and determines from the calculated degree of coincidence whether they represent the same person. When there is no estimated attribute information whose degree of coincidence with the processing target is equal to or greater than the predetermined value, the extraction unit 14 extracts the processing target (attribute information) as attribute information representing a person not represented by the estimated attribute information.
  • For example, suppose the extraction unit 14 sets attribute ID "A0003" in FIG. 4 as the processing target.
  • In this case, no estimated attribute information acquired at the same timing is extracted from the estimated attribute information shown in FIG. 5. Therefore, the extraction unit 14 extracts the attribute information of attribute ID "A0003" as attribute information representing a person not represented by the estimated attribute information.
  • Processing example 5 (extracts estimated attribute information that redundantly represents the same person from a plurality of pieces of estimated attribute information)
  • First, the extraction unit 14 determines one piece of estimated attribute information as the processing target. The extraction unit 14 then takes the other pieces of estimated attribute information as search targets and, based on a predetermined algorithm that uses the estimated attribute information and the feature amounts, extracts estimated attribute information whose degree of coincidence with the processing target is equal to or greater than a predetermined value. The extraction unit 14 then determines that the processing target (estimated attribute information) and the extracted estimated attribute information represent the same person.
  • For example, suppose the extraction unit 14 sets estimated ID "B0026" in FIG. 5 as the processing target. The extraction unit 14 then searches the other pieces of estimated attribute information for those whose degree of coincidence with the estimated attribute information of estimated ID "B0026", calculated using the estimated attribute information and the feature amounts, is equal to or greater than the predetermined value.
  • As a result, the extraction unit 14 extracts the estimated attribute information of estimated ID "B0001", whose estimated attribute information and feature amount match, as estimated attribute information whose degree of coincidence with the processing target is equal to or greater than the predetermined value, that is, as estimated attribute information that redundantly represents the same person.
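  • This feature-based duplicate detection can be sketched as follows (Python; `feature_similarity` and its threshold are assumptions, since the patent does not fix a feature-matching algorithm):

```python
FEATURE_THRESHOLD = 0.9  # assumed "predetermined value"

def feature_similarity(f1, f2):
    """Hypothetical similarity between two feature amounts (0.0 .. 1.0)."""
    return 1.0 if f1 == f2 else 0.0  # stand-in for a real face/clothes matcher

def find_duplicate_estimates(target, others):
    """Processing Example 5: other estimated records that redundantly
    represent the same person as `target`."""
    duplicates = []
    for est in others:
        same_attrs = (est.sex == target.sex and est.age_group == target.age_group)
        if same_attrs and feature_similarity(est.feature, target.feature) >= FEATURE_THRESHOLD:
            duplicates.append(est)
    return duplicates
```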
  • Processing example 6 (extracts attribute information that redundantly represents the same person from a plurality of pieces of attribute information)
  • First, the extraction unit 14 determines one piece of attribute information as the processing target. The extraction unit 14 then takes the other pieces of attribute information as search targets and extracts attribute information whose values match those of the processing target. The extraction unit 14 then determines that the processing target (attribute information) and the extracted attribute information represent the same person.
  • For example, suppose the extraction unit 14 sets attribute ID "A0027" in FIG. 4 as the processing target. The extraction unit 14 then searches the other attribute information for entries whose values match the attribute information of attribute ID "A0027".
  • As a result, the extraction unit 14 extracts the attribute information of attribute ID "A0001", which has the same attribute information values, as attribute information that redundantly represents the same person.
  • Processing example 7 (organizing the estimated attribute information and attribute information extracted in Processing Examples 1 to 6)
  • Using the extraction results of Processing Examples 1 to 6, the attribute information acquired by the attribute information acquisition unit 11 and the estimated attribute information generated by the image processing unit 13 can be organized, for example, as shown in FIG. 6.
  • Specifically, attribute information and estimated attribute information that redundantly represent the same person, a plurality of pieces of estimated attribute information that redundantly represent the same person, and a plurality of pieces of attribute information that redundantly represent the same person are integrated and collected under one user ID. As a result, the inconvenience of two or more user IDs existing for the same person is reduced.
  • In the example shown in FIG. 6, a user ID, attribute information IDs (attribute IDs), estimated attribute information IDs (estimated IDs), attribute information or estimated attribute information, and a feature amount are associated with each other.
  • In addition, the captured images and the acquisition dates/times of the attribute information and/or the estimated attribute information may also be associated.
  • For example, the user ID "00001" integrates the attribute information of attribute IDs "A0001" and "A0027" and the estimated attribute information of estimated IDs "B0001" and "B0026".
  • In the attribute information / estimated attribute information column, the information considered to be more accurate is written preferentially. That is, when both attribute information and estimated attribute information exist, the attribute information is described; when only attribute information exists, the attribute information is described; and when only estimated attribute information exists, the estimated attribute information is described. Alternatively, when both exist, the estimated attribute information may be described, or both may be described.
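  • The integration into FIG. 6-style user records can be sketched as follows (Python; `UserRecord` and the encoding of the precedence rule are assumptions based on the description above):

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class UserRecord:
    """One row of the FIG. 6-style integrated table (hypothetical layout)."""
    user_id: str
    attribute_ids: List[str] = field(default_factory=list)  # "A..." IDs
    estimated_ids: List[str] = field(default_factory=list)  # "B..." IDs
    attributes: Optional[dict] = None   # terminal-reported attribute information
    estimated: Optional[dict] = None    # image-estimated attribute information
    feature: Optional[bytes] = None

    def displayed_attributes(self):
        # Terminal-reported attribute information is considered more accurate,
        # so it takes precedence over image-estimated information when both exist.
        return self.attributes if self.attributes is not None else self.estimated
```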
  • Processing example 8 (extracts attribute information and estimated attribute information that redundantly represent the same person)
  • Consider a case where the estimated attribute information of two persons recognized in a certain captured image substantially matches, and two pieces of attribute information having the same content as the estimated attribute information are acquired at this timing.
  • For example, as shown in FIG. 7, suppose that two pieces of estimated attribute information whose contents match and whose acquisition dates/times overlap are acquired.
  • Then, as shown in FIG. 8, suppose that two pieces of attribute information having similar contents are acquired at this timing. From this information alone, it cannot be determined which estimated attribute information and which attribute information represent the same person.
  • Here, suppose that "Mr. XX", identified by the attribute information of attribute ID "A0061" in FIG. 8, has already been integrated, with reference to FIG. 6, with estimated attribute information acquired at other timings, and is therefore associated with a feature amount. Based on this information, the feature amount of "Mr. XX" is known. Using it, the extraction unit 14 can specify that estimated ID "B0058" and attribute ID "A0061" are estimated attribute information and attribute information that redundantly represent the same person. It can then determine that the remaining estimated ID "B0059" and attribute ID "A0062" are estimated attribute information and attribute information that redundantly represent the same person.
  • In this way, based on the results of associating (integrating) attribute information and estimated attribute information acquired at other timings (which may be before or after the estimated attribute information and attribute information to be processed), it is possible to extract attribute information and estimated attribute information that redundantly represent the same person.
  • Processing example 9 (extracts attribute information and estimated attribute information that redundantly represent the same person)
  • Consider a case where the estimated attribute information of two persons recognized in a certain captured image substantially matches, and one piece of attribute information having the same content as the estimated attribute information is acquired at this timing.
  • For example, as shown in FIG. 9, suppose that two pieces of estimated attribute information whose contents match and whose acquisition dates/times overlap are acquired.
  • Then, as shown in FIG. 10, suppose that one piece of attribute information having similar content is acquired at this timing. From this information alone, it cannot be determined which estimated attribute information and the attribute information represent the same person.
  • In such a case, using the feature amounts and the data already integrated at other timings as in Processing Example 8, the extraction unit 14 can specify that the estimated attribute information of estimated ID "B0072" is estimated attribute information representing a person not represented by the attribute information. It can then determine that the remaining estimated ID "B0073" and attribute ID "A0076" are estimated attribute information and attribute information that redundantly represent the same person.
  • Note that the attribute information acquired at other timings may be acquired before or after the estimated attribute information and attribute information to be processed.
  • Processing example 10 (extracts attribute information and estimated attribute information that redundantly represent the same person)
  • Consider a case where three or more persons are recognized in a certain captured image, three pieces of estimated attribute information are generated, and one piece of attribute information is acquired at this timing. For example, as shown in FIG. 11, suppose that three pieces of estimated attribute information whose acquisition dates/times overlap are acquired. Then, as shown in FIG. 12, suppose that one piece of attribute information is acquired at this timing.
  • In such a case, the extraction unit 14 can first determine, based on the degree of coincidence between the attribute information and the estimated attribute information, that estimated ID "B0093" and attribute ID "A0101" do not represent the same person.
  • Then, as in Processing Examples 8 and 9, the extraction unit 14 can specify that the estimated attribute information of estimated ID "B0092" is estimated attribute information representing a person not represented by the attribute information. It can then determine that the remaining estimated ID "B0094" and attribute ID "A0101" are estimated attribute information and attribute information that redundantly represent the same person.
  • Note that the attribute information acquired at other timings may be acquired before or after the estimated attribute information and attribute information to be processed.
  • Next, an example of the flow of processing will be described with reference to FIG. 13. First, the extraction unit 14 acquires one piece of attribute information or estimated attribute information and sets it as the processing target (S10).
  • the attribute information acquisition unit 11 may sequentially input the attribute information acquired from the user portable terminal 30 to the extraction unit 14 by real-time processing.
  • the captured image acquisition unit 12 may acquire image data from the imaging device 40 by real-time processing.
  • the image processing unit 13 may generate the estimated attribute information by analyzing the image data by real-time processing and input the estimated attribute information to the extraction unit 14.
  • The extraction unit 14 may process the attribute information and estimated attribute information input from the attribute information acquisition unit 11 and the image processing unit 13 in the order of input, in real time.
  • Alternatively, the extraction unit 14 may batch-process a plurality of pieces of attribute information acquired by the attribute information acquisition unit 11 from the user portable terminals 30 and a plurality of pieces of estimated attribute information generated by the image processing unit 13.
  • the extraction unit 14 may determine a processing target in order of acquisition date and time from the group of attribute information and estimated attribute information.
  • Next, the extraction unit 14 performs integration processing based on Processing Examples 1 to 10, using the processing target and the already-integrated data (S11).
  • the processing flow of the integration process is not particularly limited, and various aspects can be adopted. Thereafter, the same processing is repeated (S12).
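  • The S10-S12 loop might look roughly like this (Python; the queue plumbing and the `integrate` callback, which stands in for Processing Examples 1 to 10, are assumptions):

```python
import queue

def processing_loop(input_queue, integrated_data, integrate):
    """S10-S12: take one attribute/estimated record at a time and integrate it.

    input_queue     - records arriving from the attribute information acquisition
                      unit and the image processing unit (real-time processing),
                      or pre-filled in acquisition date/time order (batch processing)
    integrated_data - the FIG. 6-style accumulated table
    integrate       - callback applying Processing Examples 1 to 10
    """
    while True:
        try:
            target = input_queue.get(timeout=1.0)  # S10: pick one processing target
        except queue.Empty:
            continue
        if target is None:                         # sentinel: stop the loop
            break
        integrate(target, integrated_data)         # S11: integration processing
        # S12: loop back and repeat for the next record
```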
  • In a modification of the data processing system, the user portable terminal 30 transmits detection information including the user attribute information into its communication area 30A, and an installation device 20 located in the communication area 30A receives the detection information. The installation device 20 that has received the detection information then transmits the user's attribute information to the data processing device 10 in response to the reception.
  • the data processing system of the modification is different from the above example in this respect. Other configurations are the same as in the above example.
  • That is, in the modification, the user portable terminal 30 transmits the user attribute information to the data processing device 10 via the installation device 20.
  • the user portable terminal 30 of the modification may be a terminal prepared exclusively for the data processing system of the present embodiment, for example, a beacon terminal.
  • In the modification, the installation device 20 and the data processing device 10 communicate with each other by wire and/or wirelessly.
  • As described above, according to the present embodiment, user attribute information (including estimated attribute information) can be collected by both the means for acquiring estimated attribute information of users from face images and the means for acquiring attribute information of users from the user portable terminals 30, and a predetermined analysis can be performed using the information collected in this way. For example, it is possible to analyze the attributes of users who have come to a predetermined spot; to extract users who have come to a predetermined spot without carrying a user portable terminal 30 and analyze their attributes; and to extract users for whom estimated attribute information could not be obtained from face images, and analyze their attributes or accumulate the data.
  • That is, the accuracy of analysis can be improved compared with the case where user attribute information (including estimated attribute information) is acquired and analyzed by only one of the means.
  • the range of contents that can be analyzed can be expanded.
  • Part or all of the above embodiment can also be described as the following supplementary notes, but is not limited thereto.
  • 1. A data processing apparatus comprising: attribute information acquisition means for acquiring user attribute information from each user portable terminal located in an area within a predetermined distance from an installation device; captured image acquisition means for acquiring image data of a captured image captured in the area; image processing means for analyzing the captured image to recognize persons appearing in the captured image and for estimating attribute information for each recognized person to generate estimated attribute information; and extraction means for extracting at least one of: the attribute information and the estimated attribute information that redundantly represent the same person, the estimated attribute information representing a person not represented by the attribute information, and the attribute information representing a person not represented by the estimated attribute information.
  • 2. The data processing device according to 1, wherein the extraction means further extracts the estimated attribute information that redundantly represents the same person from among a plurality of pieces of the estimated attribute information generated by analyzing each of a plurality of captured images captured at different timings.
  • 3. The data processing device according to 2, wherein the image processing means extracts a feature amount by analyzing the captured image for each person recognized by the analysis of the captured image, and the extraction means extracts the estimated attribute information that redundantly represents the same person from the plurality of pieces of estimated attribute information based on the feature amounts.
  • 4. A data processing system comprising: an installation device installed at a predetermined location and capable of communicating according to a predetermined wireless communication standard having a communication distance within a predetermined distance; a user portable terminal that is carried by a user, is capable of communicating according to the wireless communication standard, and, upon entering the area within the predetermined distance from the installation device, communicates with the installation device according to the wireless communication standard; a photographing device for photographing the inside of the area; and the data processing device according to any one of 1 to 3.
  • 5. The data processing system according to 4, wherein, when the user portable terminal communicates with the installation device according to the wireless communication standard, the user portable terminal transmits user attribute information stored in advance in the terminal to the data processing device in response to the communication.
  • 6. A data processing method in which a computer executes: an attribute information acquisition step of acquiring user attribute information from each user portable terminal located in an area within a predetermined distance from an installation device; a captured image acquisition step of acquiring image data of a captured image captured in the area; an image processing step of analyzing the captured image to recognize persons appearing in the captured image and estimating attribute information for each recognized person to generate estimated attribute information; and an extraction step of extracting at least one of: the attribute information and the estimated attribute information that redundantly represent the same person, the estimated attribute information representing a person not represented by the attribute information, and the attribute information representing a person not represented by the estimated attribute information.
  • 7-2. The data processing method according to 6, wherein, in the extraction step, the estimated attribute information that redundantly represents the same person is further extracted from among a plurality of pieces of the estimated attribute information generated by analyzing each of a plurality of captured images captured at different timings.
  • 7-3. The data processing method according to 7-2, wherein, in the image processing step, a feature amount is extracted by analyzing the captured image for each person recognized by the analysis of the captured image, and, in the extraction step, the estimated attribute information that redundantly represents the same person is extracted from the plurality of pieces of estimated attribute information based on the feature amounts.
  • 8. A program for causing a computer to function as: attribute information acquisition means for acquiring user attribute information from each user portable terminal located in an area within a predetermined distance from an installation device; captured image acquisition means for acquiring image data of a captured image captured in the area; image processing means for analyzing the captured image to recognize persons appearing in the captured image and for estimating attribute information for each recognized person to generate estimated attribute information; and extraction means for extracting at least one of: the attribute information and the estimated attribute information that redundantly represent the same person, the estimated attribute information representing a person not represented by the attribute information, and the attribute information representing a person not represented by the estimated attribute information.
  • 8-2. The program according to 8, further causing the extraction means to extract the estimated attribute information that redundantly represents the same person from among a plurality of pieces of the estimated attribute information generated by analyzing each of a plurality of captured images captured at different timings.
  • 8-3. The program according to 8-2, causing the image processing means to extract a feature amount by analyzing the captured image for each person recognized by the analysis of the captured image, and causing the extraction means to extract the estimated attribute information that redundantly represents the same person from the plurality of pieces of estimated attribute information based on the feature amounts.

Abstract

Provided is a data processing device (10) comprising: an attribute information acquisition unit (11) that acquires user attribute information from each user portable terminal located in an area within a prescribed distance from an installed device; a photographic image acquisition unit (12) that acquires image data of a photographic image captured within the area; an image processing unit (13) that analyzes the photographic image to recognize persons appearing in it and estimates attribute information for each recognized person to generate estimated attribute information; and an extraction unit (14) that extracts at least one of: the attribute information and the estimated attribute information that redundantly represent the same person, the estimated attribute information representing a person not represented by the attribute information, and the attribute information representing a person not represented by the estimated attribute information.

Description

Data processing apparatus, data processing system, data processing method, and program
The present invention relates to a data processing device, a data processing system, a data processing method, and a program.
Patent Document 1 discloses an information processing apparatus that performs face recognition on a face image and extracts attribute information representing a user's sex and age group.
JP 2010-282590 A
By using the technique disclosed in Patent Document 1, it is possible, for example, to analyze users who have come to a certain spot (store, facility, etc.). However, since the accuracy of face recognition using face images is not sufficient, the analysis accuracy also becomes insufficient.
An object of the present invention is to provide a new technique for analyzing users who have come to a certain spot.
According to the present invention, there is provided a data processing apparatus comprising:
attribute information acquisition means for acquiring user attribute information from each user portable terminal located in an area within a predetermined distance from an installation device;
captured image acquisition means for acquiring image data of a captured image captured in the area;
image processing means for analyzing the captured image to recognize persons appearing in the captured image and for estimating attribute information for each recognized person to generate estimated attribute information; and
extraction means for extracting at least one of: the attribute information and the estimated attribute information that redundantly represent the same person, the estimated attribute information representing a person not represented by the attribute information, and the attribute information representing a person not represented by the estimated attribute information.
According to the present invention, there is also provided a data processing method in which a computer executes:
an attribute information acquisition step of acquiring user attribute information from each user portable terminal located in an area within a predetermined distance from an installation device;
a captured image acquisition step of acquiring image data of a captured image captured in the area;
an image processing step of analyzing the captured image to recognize persons appearing in the captured image and estimating attribute information for each recognized person to generate estimated attribute information; and
an extraction step of extracting at least one of: the attribute information and the estimated attribute information that redundantly represent the same person, the estimated attribute information representing a person not represented by the attribute information, and the attribute information representing a person not represented by the estimated attribute information.
According to the present invention, there is further provided a program for causing a computer to function as:
attribute information acquisition means for acquiring user attribute information from each user portable terminal located in an area within a predetermined distance from an installed device;
captured image acquisition means for acquiring image data of a captured image taken within the area;
image processing means for analyzing the captured image to recognize the persons appearing in the captured image and generating estimated attribute information by estimating attribute information for each recognized person; and
extraction means for extracting at least one of: attribute information and estimated attribute information that redundantly represent the same person, estimated attribute information representing a person not represented by any attribute information, and attribute information representing a person not represented by any estimated attribute information.
According to the present invention, a new technique for analyzing the users who have visited a certain spot is realized.
The above-described object as well as other objects, features, and advantages will become more apparent from the preferred embodiments described below and the accompanying drawings.
FIG. 1 is a diagram for explaining an example of the overall picture of the data processing system of the present embodiment.
FIG. 2 is a diagram schematically showing an example of the hardware configuration of the data processing device of the present embodiment.
FIG. 3 is a diagram showing an example of a functional block diagram of the data processing device of the present embodiment.
FIG. 4 is a diagram schematically showing an example of the attribute information acquired by the attribute information acquisition unit of the present embodiment.
FIG. 5 is a diagram schematically showing an example of the estimated attribute information generated by the image processing unit of the present embodiment.
FIG. 6 is a diagram schematically showing an example of the attribute information and estimated attribute information integrated by the extraction unit of the present embodiment.
FIG. 7 is a diagram schematically showing an example of the estimated attribute information generated by the image processing unit of the present embodiment.
FIG. 8 is a diagram schematically showing an example of the attribute information acquired by the attribute information acquisition unit of the present embodiment.
FIG. 9 is a diagram schematically showing an example of the estimated attribute information generated by the image processing unit of the present embodiment.
FIG. 10 is a diagram schematically showing an example of the attribute information acquired by the attribute information acquisition unit of the present embodiment.
FIG. 11 is a flowchart showing an example of the processing flow of the data processing method of the present embodiment.
FIG. 12 is a diagram for explaining an example of the overall picture of the data processing system of the present embodiment.
Embodiments of the present invention will be described below. Each unit of each device and each terminal of the present embodiment is realized by any combination of hardware and software, centered on the CPU (Central Processing Unit) of any computer, a memory, a program loaded into the memory (including a program stored in the memory from before the device is shipped, as well as a program downloaded from a storage medium such as a CD (Compact Disc) or from a server on the Internet), a storage unit such as a hard disk storing the program, and a network connection interface. Those skilled in the art will understand that there are various modifications to the method and devices for realizing them.
The functional block diagrams used in the following description of the embodiments show blocks in functional units, not configurations in hardware units. In these diagrams, each device and each terminal is depicted as being realized by a single piece of equipment, but the means of realization is not limited to this; the configuration may be physically divided or logically divided. The same constituent elements are denoted by the same reference signs, and their description is omitted as appropriate.
First, the overall picture of the data processing system of the present embodiment will be described with reference to FIG. 1.
The data processing system of the present embodiment includes at least one of a data processing device 10, an installation device 20, a user portable terminal 30, and a photographing device 40.
The installation device 20 is installed at a spot (a predetermined place) where user information is collected. Examples of such spots include stores, facilities, and buildings; more specifically, their entrances and exits, arbitrary positions within a spot, and the like. The installation device 20 is configured to communicate in accordance with a predetermined wireless communication standard whose communication distance is within a predetermined distance, and performs wireless communication with user portable terminals 30 that enter its communication area 20A. Although FIG. 1 shows a single installation device 20, a plurality of installation devices 20 may be installed at one spot: for example, one at the entrance of the spot and others at arbitrary positions inside it.
The user portable terminal 30 is carried by each user and is configured to communicate wirelessly in accordance with the above wireless communication standard. The user portable terminal 30 stores the user's attribute information (name, age, sex, and the like) in advance. When the user portable terminal 30 enters the communication area 20A and performs wireless communication with the installation device 20, it transmits the user's attribute information accordingly to the data processing device 10 via a network 1 such as a wireless LAN (Local Area Network) or the Internet.
The photographing device 40 photographs at least a part of the communication area 20A and transmits the captured image data to the data processing device 10 by wired and/or wireless communication. In FIG. 1, the communication between the data processing device 10 and the photographing device 40 and the communication between the data processing device 10 and the user portable terminal 30 are realized over separate networks, but they may be realized over the same network. When a plurality of installation devices 20 are installed, a photographing device 40 is installed for each of their communication areas 20A and photographs the inside of the corresponding communication area 20A.
The data processing device 10 accumulates the user attribute information collected from the user portable terminals 30. It also performs image analysis on the captured images acquired from the photographing device 40, recognizes the persons appearing in the images, and estimates the attributes (age, sex, and the like) of each recognized person. Hereinafter, user attributes estimated by image analysis are referred to as "estimated attribute information".
Using the user attribute information (including the estimated attribute information) collected by these two means, the data processing device 10 analyzes the users who have visited the spot. Various analyses are possible; the following are examples.
(1) Analyzing the attributes of the users who have visited the spot
For example, an analysis using the attribute information and the estimated attribute information can determine the proportion of users who visited the spot, their gender ratio, their age distribution, and the like.
Note that either one of the two means alone, namely the means for acquiring a user's estimated attribute information from face images or the means for acquiring a user's attribute information from the user portable terminal 30, can collect user attribute information or estimated attribute information to some extent and analyze the attributes of the users who visited the spot. In either case, however, there is the problem that the accuracy is insufficient.
With the means that acquires a user's estimated attribute information from face images, inconveniences can arise, for example when a plurality of users are photographed overlapping one another; in that case, a user positioned behind another cannot be recognized as a person, and that user's estimated attribute information cannot be collected. With the means that acquires a user's attribute information from the user portable terminal 30, the attribute information cannot be acquired when, for example, the user is not carrying the user portable terminal 30 or its power is turned off. Because the user portable terminal 30 is under the user's control, whether to carry it and whether to turn its power on or off are up to the user, so it is difficult to remedy such inconveniences and improve the accuracy.
In contrast, the present embodiment analyzes the attributes of the users who visited the spot using the attribute information and estimated attribute information collected by both means: the means that acquires estimated attribute information from face images and the means that acquires attribute information from the user portable terminals 30. In this case, accuracy can be improved because a user's attribute information or estimated attribute information that could not be acquired by one means can be acquired by the other.
With this approach, attribute information and estimated attribute information that redundantly represent the same user can occur. As described in detail later, the present embodiment can extract such attribute information and estimated attribute information and integrate them. As a result, when analyzing the attributes of the users who visited the spot, inconveniences such as counting one user as two or more users can be eliminated.
(2) Extracting users who are not carrying the user portable terminal 30
As described in detail later, the present embodiment can extract users whose attribute information was not acquired by the means that acquires attribute information from the user portable terminal 30 but whose estimated attribute information was acquired by the means that acquires estimated attribute information from face images. Such a user can be presumed to be one who is not carrying the user portable terminal 30 or whose terminal is powered off.
This analysis makes it possible, for example, to determine the proportion of users not carrying the user portable terminal 30 among the users who visited the spot, as well as their gender ratio, age distribution, and the like. Furthermore, when there is a rule that users must carry the user portable terminal 30 when visiting the spot, or when only users to whom a user portable terminal 30 has been distributed are allowed to visit the spot, the users who break such a rule can be identified. For example, by associating each piece of estimated attribute information with the face image from which it was generated, not only the estimated age and estimated sex of a rule-breaking user but also that user's face image can be obtained.
(3) Extracting users whose estimated attribute information could not be acquired from face images
As described in detail later, the present embodiment can extract users whose estimated attribute information was not acquired by the means that acquires estimated attribute information from face images but whose attribute information was acquired by the means that acquires attribute information from the user portable terminal 30. Extracting such users makes it possible to analyze the proportion, gender ratio, age distribution, and the like of users for whom acquiring estimated attribute information by face-image analysis is difficult. In addition, by storing the extracted attribute information in association with the captured image taken at the timing when that attribute information was acquired, cases in which the collection of estimated attribute information from face images failed can be accumulated.
The configuration of each device is described below.
<Installation device 20>
The installation device 20 is configured to communicate in accordance with a predetermined wireless communication standard whose communication distance is within a predetermined distance, and performs wireless communication with user portable terminals 30 that enter the communication area 20A.
For example, a detection information transmission unit 21 uses the predetermined wireless communication standard, whose communication distance is within the predetermined distance, to transmit detection information periodically or intermittently into the communication area within the predetermined distance from the device itself (the communication area 20A in FIG. 1). The detection information may include identification information of the installation device 20. User portable terminals 30 located in that communication area then receive the detection information. Examples of the wireless communication standard include, but are not limited to, the Bluetooth standard and wireless LAN standards. The installation device 20 may be, for example, a so-called beacon terminal or a wireless LAN access point, but is not limited to these.
<User portable terminal 30>
The user portable terminal 30 stores the user's attribute information in advance. The attribute information can be of various kinds, such as the user's name, nickname, age, sex, occupation, and hobbies.
The user portable terminal 30 is also configured to communicate in accordance with the same wireless communication standard as the installation device 20. When the user portable terminal 30 enters the communication area 20A of the installation device 20 and communicates with the installation device 20 in accordance with that standard, it transmits the stored user attribute information to the data processing device 10 accordingly. The user portable terminal 30 may also transmit to the data processing device 10, in association with the user attribute information, date and time information such as the date and time at which the attribute information was transmitted to the data processing device 10 or the date and time at which the terminal communicated with the installation device 20.
For example, upon entering the communication area 20A of the installation device 20, the user portable terminal 30 receives the detection information that the installation device 20 transmits periodically or intermittently, and transmits the user's attribute information to the data processing device 10 in response. For instance, on receiving the detection information, the user portable terminal 30 may write the user's attribute information at a predetermined position within the detection information and transmit it to the data processing device 10.
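To make this exchange concrete, the following is a minimal sketch, not taken from the publication, of how a terminal-side handler might react to received detection information; the payload layout, the field names, and the send_to_server callback are all assumptions.

```python
# Hypothetical terminal-side handler: on receiving detection information
# (a beacon) from an installation device, write the stored user attribute
# information into the message and forward it to the data processing device.
import json
import time

USER_ATTRIBUTES = {"name": "Mr. XX", "age": 42, "sex": "male"}  # stored in advance

def on_detection_info(detection_info: dict, send_to_server) -> None:
    payload = dict(detection_info)            # keeps the installation device's ID
    payload["attributes"] = USER_ATTRIBUTES   # the user's attribute information
    payload["sent_at"] = time.time()          # optional date/time information
    send_to_server(json.dumps(payload))       # e.g., an HTTP POST in practice

# Example: a beacon carrying the installation device's identification information
on_detection_info({"device_id": "spot-entrance-01"}, print)
```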
While the user portable terminal 30 remains in the communication area 20A, it continues to receive the detection information transmitted periodically or intermittently by the installation device 20. The user portable terminal 30 may transmit the user's attribute information to the data processing device 10 each time it receives the detection information. Alternatively, the user portable terminal 30 may be configured so that, when it receives detection information continuously from the installation device 20 at time intervals shorter than a predetermined time, it does not transmit the user's attribute information on the second and subsequent receptions.
The user portable terminal 30 may realize the above functions by installing a predetermined application on an existing portable terminal such as a mobile phone (e.g., a smartphone), a tablet terminal, or a portable game machine, or it may be a terminal prepared exclusively for the data processing system of the present embodiment.
<Photographing device 40>
The photographing device 40 is installed in correspondence with the installation device 20 and photographs the inside of the communication area 20A of the installation device 20. For example, the photographing device 40 may capture moving images, or may capture still images periodically (at longer time intervals than a moving image).
The photographing device 40 transmits the captured image data to the data processing device 10 by wired and/or wireless communication. The photographing device 40 may transmit information identifying itself to the data processing device 10 so that the source of the transmitted image data can be specified, and may further transmit to the data processing device 10 date and time information specifying when each piece of image data (each frame) was captured.
It is preferable that the photographing device 40 can photograph, without omission, every user who enters the communication area 20A. To this end, the photographing device 40 may be installed so as to capture, for example, the entire communication area 20A. Alternatively, when there is a point that every user entering the communication area 20A necessarily passes, for example when the communication area 20A contains the entrance of the spot through which every user must pass, the photographing device 40 may be set up to photograph that point; in this case it need not capture the entire communication area 20A.
<Data processing device 10>
FIG. 2 is a diagram conceptually showing an example of the hardware configuration of the data processing device 10 of the present embodiment. As illustrated, the data processing device 10 of the present embodiment includes, for example, a CPU 1A, a RAM (Random Access Memory) 2A, a ROM (Read Only Memory) 3A, a display control unit 4A, a display 5A, an operation reception unit 6A, an operation unit 7A, a communication unit 8A, and an auxiliary storage device 9A, which are interconnected by a bus 10A. Although not illustrated, other elements may also be provided, such as an input/output interface connected to external equipment by wire, a microphone, and a speaker, and some of the illustrated elements may be omitted.
The CPU 1A, together with the other elements, controls the entire computer of the data processing device 10. The ROM 3A includes areas storing programs for operating the computer, various application programs, and various setting data used when those programs run. The RAM 2A includes areas for temporarily storing data, such as a work area for running programs. The auxiliary storage device 9A is, for example, an HDD (Hard Disc Drive) and can store a large amount of data.
The touch panel display 5A integrates a display device (an LED (Light Emitting Diode) display, a liquid crystal display, an organic EL (Electro Luminescence) display, or the like) with a touch pad. The display control unit 4A reads data stored in a VRAM (Video RAM), performs predetermined processing on the read data, and sends it to the touch panel display 5A to display various screens. The operation reception unit 6A receives various operations via the operation unit 7A. The operation unit 7A includes operation keys, operation buttons, switches, a jog dial, a touch panel display, a keyboard, and the like. The communication unit 8A connects, by wire and/or wirelessly, to networks such as the Internet and a LAN and communicates with other electronic equipment.
FIG. 3 shows an example of a functional block diagram of the data processing device 10. As illustrated, the data processing device 10 includes an attribute information acquisition unit 11, a captured image acquisition unit 12, an image processing unit 13, and an extraction unit 14.
The attribute information acquisition unit 11 acquires user attribute information from each user portable terminal 30 located in the area within the predetermined distance from the installation device 20 (the communication area 20A in FIG. 1); that is, it receives the user attribute information transmitted from the user portable terminals 30 as described above. The user attribute information acquired by the attribute information acquisition unit 11 may be associated with acquisition date and time information. The acquisition date and time information here may be, for example, the date and time information that the user portable terminal 30 associated with the attribute information, or information indicating the date and time at which the attribute information acquisition unit 11 acquired the attribute information from the user portable terminal 30.
FIG. 4 schematically shows an example of a plurality of pieces of attribute information acquired and accumulated by the attribute information acquisition unit 11. In the illustrated example, an attribute information ID ("attribute ID"), attribute information, and acquisition date and time information are associated with one another, and the entries are arranged in order of acquisition date and time. In this example, the attribute information with attribute ID "A0001" is "Mr. XX, 42 years old, male", and its acquisition date and time is "13:17:15 on July 17, 2014".
The captured image acquisition unit 12 acquires, from the photographing device 40, the image data of captured images taken in the area within the predetermined distance from the installation device 20 (the communication area 20A in FIG. 1); that is, it acquires the image data transmitted from the photographing device 40 as described above.
The image processing unit 13 analyzes the captured images in the image data acquired by the captured image acquisition unit 12, thereby recognizing the persons appearing in the captured images, and generates estimated attribute information by estimating attribute information for each recognized person. The estimated attribute information may include, for example, the age and sex of the recognized person. The image processing unit 13 may also extract a predetermined feature value for each recognized person by analyzing the captured image. The feature value may be one extracted from each person's face, or one extracted from each person's clothing or belongings (color, shape, and the like).
After analyzing the captured image of a certain frame and recognizing a new person, the image processing unit 13 may track that person when analyzing subsequent frames and group the image data of the same person appearing continuously across a plurality of frames into one image processing unit, then perform image analysis on that image processing unit to generate one piece of estimated attribute information. Even for the same person, if the person enters the frame, leaves it, and then enters it again, the captured images before leaving the frame can be treated as one image processing unit and the captured images after re-entering the frame as another.
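As a rough illustration of this grouping, the sketch below is an assumption rather than part of the publication: it presumes a tracker that already assigns a track ID to each person in each frame, and it closes an image processing unit whenever the tracked person skips a frame, so a frame-out followed by a frame-in opens a new unit.

```python
# Hypothetical grouping of per-frame detections into image processing units.
def group_into_units(observations):
    """observations: list of (frame_no, track_id) pairs, frame_no ascending."""
    open_units = {}   # track_id -> currently open unit (list of frame numbers)
    last_seen = {}    # track_id -> last frame in which the person was visible
    units = []
    for frame_no, track_id in observations:
        if track_id in open_units and frame_no - last_seen[track_id] == 1:
            open_units[track_id].append(frame_no)   # still continuously visible
        else:
            unit = [frame_no]                       # frame-out then frame-in: new unit
            open_units[track_id] = unit
            units.append((track_id, unit))
        last_seen[track_id] = frame_no
    return units

# Person 7 is visible in frames 1-3, leaves, re-enters at frame 10: two units.
print(group_into_units([(1, 7), (2, 7), (3, 7), (10, 7), (11, 7)]))
```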
FIG. 5 schematically shows an example of a plurality of pieces of estimated attribute information generated by the image processing unit 13. In the illustrated example, an estimated attribute information ID ("estimated ID"), estimated attribute information, a feature value, and acquisition date and time information are associated with one another, and the entries are arranged in order of acquisition date and time. The acquisition date and time information indicates the range of capture dates and times of the one or more frames included in the image processing unit from which each piece of estimated attribute information was generated. In this example, the estimated attribute information with estimated ID "B0001" is "in his 40s, male", the feature value extracted from that person is "xxx", and the acquisition date and time information is "from 13:17:14 to 13:17:17 on July 17, 2014".
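For illustration, the two record shapes of FIG. 4 and FIG. 5 could be held as follows; the field names are assumptions, and the values mirror the figures' examples.

```python
# Hypothetical record shapes mirroring FIG. 4 and FIG. 5.
attribute_records = [
    {"attr_id": "A0001", "name": "Mr. XX", "age": 42, "sex": "male",
     "acquired_at": "2014-07-17T13:17:15"},           # one FIG. 4 row
]
estimated_records = [
    {"est_id": "B0001", "age_band": "40s", "sex": "male", "feature": "xxx",
     "acquired_from": "2014-07-17T13:17:14",           # one FIG. 5 row:
     "acquired_to": "2014-07-17T13:17:17"},            # a capture date/time range
]
```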
The estimated attribute information may further be associated with the image data from which it was generated.
The extraction unit 14 extracts at least one of: attribute information and estimated attribute information that redundantly represent the same person, estimated attribute information representing a person not represented by any attribute information, and attribute information representing a person not represented by any estimated attribute information.
Furthermore, the extraction unit 14 may extract, from among a plurality of pieces of estimated attribute information generated by analyzing captured images taken at mutually different timings, pieces of estimated attribute information that redundantly represent the same person. "Captured images taken at different timings" may be captured images belonging to different image processing units as described above. For example, the extraction unit 14 may extract such redundant estimated attribute information based on the feature value of each user extracted by the image processing unit 13.
Examples of processing performed by the extraction unit 14 are described below.
○ Processing example 1 (extracting attribute information and estimated attribute information that redundantly represent the same person)
First, the extraction unit 14 extracts the estimated attribute information acquired at the same timing as one piece of attribute information determined as the processing target. For example, estimated attribute information whose acquisition date and time range includes the acquisition date and time of the processing target (attribute information) may be extracted as having been acquired at the same timing. Alternatively, estimated attribute information for which the difference between a representative date and time of its acquisition period (any date and time within the acquisition period of the estimated attribute information) and the acquisition date and time of the processing target (attribute information) is within a predetermined time may be extracted as having been acquired at the same timing.
Thereafter, based on a predetermined algorithm using the values of the extracted estimated attribute information and the processing target (attribute information), the extraction unit 14 calculates the degree of coincidence between the extracted estimated attribute information and the processing target, and determines from the calculated degree of coincidence whether the processing-target attribute information and the extracted estimated attribute information represent the same person. For example, when the degree of coincidence is a predetermined value or more, it may determine that they represent the same person. When there are a plurality of pieces of estimated attribute information whose degree of coincidence with the processing-target attribute information is the predetermined value or more, the extraction unit 14 may determine that the estimated attribute information with the highest degree of coincidence and the processing-target attribute information represent the same person.
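A hedged sketch of processing example 1 follows. The publication leaves the "predetermined algorithm" unspecified; the coincidence score below (half a point for a matching sex, half a point when the age falls within the estimated age band), the 0.8 threshold, and all field names are assumptions.

```python
from datetime import datetime, timedelta

def same_timing(attr, est, slack=timedelta(0)):
    # the attribute's acquisition time falls within the estimated record's range
    t = datetime.fromisoformat(attr["acquired_at"])
    return (datetime.fromisoformat(est["acquired_from"]) - slack <= t
            <= datetime.fromisoformat(est["acquired_to"]) + slack)

def coincidence(attr, est):
    # assumed scoring: 0.5 for a sex match, 0.5 for age within the age band
    score = 0.5 if attr["sex"] == est["sex"] else 0.0
    low = int(est["age_band"].rstrip("s"))           # "40s" -> 40
    if low <= attr["age"] < low + 10:
        score += 0.5
    return score

def match_for(attr, estimated_records, threshold=0.8):
    candidates = [e for e in estimated_records if same_timing(attr, e)]
    if not candidates:
        return None
    best = max(candidates, key=lambda e: coincidence(attr, e))
    return best if coincidence(attr, best) >= threshold else None

attr = {"attr_id": "A0001", "name": "Mr. XX", "age": 42, "sex": "male",
        "acquired_at": "2014-07-17T13:17:15"}
ests = [
    {"est_id": "B0001", "age_band": "40s", "sex": "male", "feature": "xxx",
     "acquired_from": "2014-07-17T13:17:14", "acquired_to": "2014-07-17T13:17:17"},
    {"est_id": "B0002", "age_band": "20s", "sex": "female", "feature": "yyy",
     "acquired_from": "2014-07-17T13:17:13", "acquired_to": "2014-07-17T13:17:16"},
]
print(match_for(attr, ests)["est_id"])  # -> B0001
```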
A specific example follows. Suppose the extraction unit 14 takes attribute ID "A0001" in FIG. 4 as the processing target. In this case, only the estimated attribute information with estimated IDs "B0001" and "B0002" is extracted from the estimated attribute information shown in FIG. 5 as having been acquired at the same timing.
The extraction unit 14 then calculates the degree of coincidence between the attribute information of attribute ID "A0001" and the estimated attribute information of estimated ID "B0001", and between the attribute information of attribute ID "A0001" and the estimated attribute information of estimated ID "B0002". Here, a higher degree of coincidence is calculated for the estimated attribute information of estimated ID "B0001", whose sex and age group match, than for estimated ID "B0002", and the degree of coincidence for estimated ID "B0001" is the predetermined value or more. The extraction unit 14 therefore determines that the attribute information of attribute ID "A0001" in FIG. 4 and the estimated attribute information of estimated ID "B0001" in FIG. 5 represent the same person.
○ Processing example 2 (extracting attribute information and estimated attribute information that redundantly represent the same person)
First, the extraction unit 14 extracts the attribute information acquired at the same timing as one piece of estimated attribute information determined as the processing target. For example, attribute information whose acquisition date and time falls within the acquisition period of the processing target (estimated attribute information) may be extracted as having been acquired at the same timing. Alternatively, attribute information for which the difference from a representative date and time of the acquisition period of the processing target (estimated attribute information) is within a predetermined time may be extracted as having been acquired at the same timing.
Thereafter, based on a predetermined algorithm using the values of the processing target (estimated attribute information) and the extracted attribute information, the extraction unit 14 calculates the degree of coincidence between the processing target and the extracted attribute information, and determines from the calculated degree of coincidence whether they represent the same person. For example, when the degree of coincidence is a predetermined value or more, it may determine that they represent the same person. When there are a plurality of pieces of attribute information whose degree of coincidence with the processing-target estimated attribute information is the predetermined value or more, the extraction unit 14 may determine that the attribute information with the highest degree of coincidence and the processing-target estimated attribute information represent the same person.
A specific example follows. Suppose the extraction unit 14 takes estimated ID "B0001" in FIG. 5 as the processing target. In this case, only the attribute information with attribute IDs "A0001" and "A0002" is extracted from the attribute information shown in FIG. 4 as having been acquired at the same timing.
The extraction unit 14 then calculates the degree of coincidence between the estimated attribute information of estimated ID "B0001" and the attribute information of attribute ID "A0001", and between the estimated attribute information of estimated ID "B0001" and the attribute information of attribute ID "A0002". Here, a higher degree of coincidence is calculated for the attribute information of attribute ID "A0001", whose sex and age group match, than for attribute ID "A0002", and the degree of coincidence for attribute ID "A0001" is the predetermined value or more. The extraction unit 14 therefore determines that the estimated attribute information of estimated ID "B0001" in FIG. 5 and the attribute information of attribute ID "A0001" in FIG. 4 represent the same person.
○ Processing example 3 (extracting estimated attribute information representing a person not represented by any attribute information)
First, the extraction unit 14 extracts the attribute information acquired at the same timing as one piece of estimated attribute information determined as the processing target, for example by the same method as in processing example 2. If no attribute information is extracted, the extraction unit 14 extracts the processing-target estimated attribute information as estimated attribute information representing a person not represented by any attribute information.
If attribute information is extracted, the extraction unit 14 calculates, based on a predetermined algorithm using the values of the processing target (estimated attribute information) and the extracted attribute information, the degree of coincidence between the processing target and the extracted attribute information, and determines from the calculated degree of coincidence whether they represent the same person. If no attribute information has a degree of coincidence with the processing target that is the predetermined value or more, the extraction unit 14 extracts the processing target (estimated attribute information) as estimated attribute information representing a person not represented by any attribute information.
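Continuing the same hedged sketch, and reusing the assumed same_timing and coincidence helpers from the earlier block, processing example 3 reduces to the following test.

```python
def is_unrepresented(est, attribute_records, same_timing, coincidence,
                     threshold=0.8):
    # True when no attribute record was acquired at the same timing, or when
    # none of those records reaches the threshold (note: all([]) is True,
    # so an empty candidate list also flags the person as unrepresented).
    candidates = [a for a in attribute_records if same_timing(a, est)]
    return all(coincidence(a, est) < threshold for a in candidates)
```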
A specific example follows. Suppose the extraction unit 14 takes estimated ID "B0003" in FIG. 5 as the processing target. In this case, no attribute information acquired at the same timing is extracted from the attribute information shown in FIG. 4, so the extraction unit 14 extracts the estimated attribute information of estimated ID "B0003" as estimated attribute information representing a person not represented by any attribute information.
○ Processing example 4 (extracting attribute information representing a person not represented by any estimated attribute information)
First, the extraction unit 14 extracts the estimated attribute information acquired at the same timing as one piece of attribute information determined as the processing target, for example by the same method as in processing example 1. If no estimated attribute information is extracted, the extraction unit 14 extracts the processing-target attribute information as attribute information representing a person not represented by any estimated attribute information.
If estimated attribute information is extracted, the extraction unit 14 calculates, based on a predetermined algorithm using the values of the extracted estimated attribute information and the processing target (attribute information), the degree of coincidence between the processing target and the extracted estimated attribute information, and determines from the calculated degree of coincidence whether they represent the same person. If no estimated attribute information has a degree of coincidence with the processing target that is the predetermined value or more, the extraction unit 14 extracts the processing target (attribute information) as attribute information representing a person not represented by any estimated attribute information.
A specific example follows. Suppose the extraction unit 14 takes attribute ID "A0003" in FIG. 4 as the processing target. In this case, no estimated attribute information acquired at the same timing is extracted from the estimated attribute information shown in FIG. 5, so the extraction unit 14 extracts the attribute information of attribute ID "A0003" as attribute information representing a person not represented by any estimated attribute information.
○ Processing example 5 (extracting, from among a plurality of pieces of estimated attribute information, those that redundantly represent the same person)
The extraction unit 14 determines one piece of estimated attribute information as the processing target. It then searches the other pieces of estimated attribute information and, based on a predetermined algorithm using the values of the estimated attribute information and the feature values, extracts those whose degree of coincidence with the processing-target estimated attribute information is a predetermined value or more. The extraction unit 14 then determines that the processing target (estimated attribute information) and the extracted estimated attribute information represent the same person.
A specific example follows. Suppose the extraction unit 14 takes estimated ID "B0026" in FIG. 5 as the processing target. The extraction unit 14 then searches the other pieces of estimated attribute information for those whose degree of coincidence with the estimated attribute information of estimated ID "B0026", calculated using the values of the estimated attribute information and the feature values, is the predetermined value or more.
In this example, the extraction unit 14 extracts the estimated attribute information of estimated ID "B0001", whose estimated attribute information and feature value match, as estimated attribute information whose degree of coincidence with the processing target is the predetermined value or more, that is, as estimated attribute information that redundantly represents the same person.
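A sketch of this feature-based deduplication, under the same assumed record shapes as above and with the degree of coincidence reduced to exact equality of feature values:

```python
def duplicates_of(target, estimated_records):
    # other estimated records whose extracted feature value matches the target's
    return [e for e in estimated_records
            if e["est_id"] != target["est_id"]
            and e["feature"] == target["feature"]]

b0026 = {"est_id": "B0026", "age_band": "40s", "sex": "male", "feature": "xxx"}
pool = [b0026,
        {"est_id": "B0001", "age_band": "40s", "sex": "male", "feature": "xxx"},
        {"est_id": "B0003", "age_band": "30s", "sex": "female", "feature": "zzz"}]
print([e["est_id"] for e in duplicates_of(b0026, pool)])  # -> ['B0001']
```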
○ Processing example 6 (extracting, from among a plurality of pieces of attribute information, those that redundantly represent the same person)
The extraction unit 14 determines one piece of attribute information as the processing target. It then searches the other pieces of attribute information and extracts those whose attribute information values match. The extraction unit 14 then determines that the processing target (attribute information) and the extracted attribute information represent the same person.
A specific example follows. Suppose the extraction unit 14 takes attribute ID "A0027" in FIG. 4 as the processing target. The extraction unit 14 then searches the other pieces of attribute information for those whose values match the attribute information of attribute ID "A0027".
In this example, the extraction unit 14 extracts the attribute information of attribute ID "A0001", whose attribute information values match, as attribute information that redundantly represents the same person.
○ Processing example 7 (organizing the estimated attribute information and attribute information extracted in processing examples 1 to 6)
By using some or all of processing examples 1 to 6, the attribute information acquired by the attribute information acquisition unit 11 and the estimated attribute information generated by the image processing unit 13 can be organized, for example, as shown in FIG. 6. In the example of FIG. 6, attribute information and estimated attribute information that redundantly represent the same person, a plurality of pieces of estimated attribute information that redundantly represent the same person, and a plurality of pieces of attribute information that redundantly represent the same person are integrated and consolidated under a single user ID. As a result, the inconvenience of two or more user IDs existing for the same person is reduced.
In the illustrated example, a user ID, attribute information IDs (attribute IDs), estimated attribute information IDs (estimated IDs), attribute information or estimated attribute information, and a feature value are associated with one another. In addition, captured images and the acquisition dates and times of the attribute information and/or the estimated attribute information may also be associated.
In the illustrated example, it can be seen that user ID "00001" integrates the attribute information of attribute IDs "A0001" and "A0027" with the estimated attribute information of estimated IDs "B0001" and "B0026". In this example, the information considered to be more accurate is entered preferentially in the attribute information / estimated attribute information column: when both attribute information and estimated attribute information exist, the attribute information is entered; when only attribute information exists, the attribute information is entered; and when only estimated attribute information exists, the estimated attribute information is entered. When both exist, the estimated attribute information may be entered instead, or both may be entered.
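The consolidation into FIG. 6-style rows might look like the following sketch. The row layout mirrors FIG. 6, and the preference rule (attribute information over estimated attribute information) follows the description above; the grouping input, user-ID format, and field names are assumptions.

```python
def integrate(groups):
    """groups: one (attribute_records, estimated_records) pair per person,
    as produced by processing examples 1 to 6."""
    table = []
    for i, (attrs, ests) in enumerate(groups, start=1):
        table.append({
            "user_id": f"{i:05d}",
            "attr_ids": [a["attr_id"] for a in attrs],
            "est_ids": [e["est_id"] for e in ests],
            # prefer the (more accurate) attribute information when both exist
            "profile": attrs[0] if attrs else ests[0],
            "feature": ests[0]["feature"] if ests else None,
        })
    return table

person = ([{"attr_id": "A0001", "name": "Mr. XX", "age": 42, "sex": "male"},
           {"attr_id": "A0027", "name": "Mr. XX", "age": 42, "sex": "male"}],
          [{"est_id": "B0001", "age_band": "40s", "sex": "male", "feature": "xxx"},
           {"est_id": "B0026", "age_band": "40s", "sex": "male", "feature": "xxx"}])
print(integrate([person])[0])  # one consolidated row for user "00001"
```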
○ Processing example 8 (extracting attribute information and estimated attribute information that redundantly represent the same person)
There are cases where the estimated attribute information of two persons recognized in a certain captured image is nearly identical and two pieces of attribute information with content similar to that estimated attribute information are acquired at the same timing. For example, as shown in FIG. 7, suppose that two pieces of estimated attribute information with identical content and overlapping acquisition periods are acquired, and, as shown in FIG. 8, two pieces of attribute information with similar content are acquired at that timing. From this information alone, it cannot be determined which piece of estimated attribute information and which piece of attribute information represent the same person.
However, referring to FIG. 6, for "Mr. ××" identified by the attribute information with attribute ID "B0061" in FIG. 8, attribute information acquired at another timing has already been integrated with estimated attribute information acquired at another timing, and the feature amount "○×○" is associated with that record. From this information, it can be seen that the feature amount of "Mr. ××" is "○×○". On the basis of this information, the extraction unit 14 can identify that the estimated attribute ID "B0058" and the attribute ID "B0061" are estimated attribute information and attribute information that redundantly represent the same person. It can then determine that the remaining estimated attribute ID "B0059" and attribute ID "B0062" are attribute information and estimated attribute information that redundantly represent the same person.
In this way, on the basis of the result of associating (integrating) attribute information and estimated attribute information acquired at other timings (which may be before or after the estimated attribute information and attribute information being processed), attribute information and estimated attribute information that redundantly represent the same person can be extracted.
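As an illustration of processing example 8, the following Python sketch pairs ambiguous records by first looking up feature amounts that earlier integration rounds have already tied to a named person (as in FIG. 6), then matching whatever single pair remains by elimination. The Candidate dataclass, the known_feature_by_name mapping, and all field names are assumptions made for the sketch, not the disclosed implementation.

from dataclasses import dataclass
from typing import Dict, List, Optional, Tuple

@dataclass
class Candidate:
    record_id: str                 # e.g. "B0058" (estimated) or "B0061" (attribute)
    name: Optional[str] = None     # present for terminal-reported attribute information
    feature: Optional[str] = None  # present for image-derived estimated information

def pair_by_known_feature(
    estimated: List[Candidate],
    attributes: List[Candidate],
    known_feature_by_name: Dict[str, str],  # built from earlier integrations (FIG. 6)
) -> List[Tuple[Candidate, Candidate]]:
    pairs = []
    for attr in list(attributes):
        known = known_feature_by_name.get(attr.name)  # e.g. "Mr. XX" -> "○×○"
        if known is None:
            continue
        for est in list(estimated):
            if est.feature == known:
                pairs.append((attr, est))    # the feature match resolves this pair
                attributes.remove(attr)
                estimated.remove(est)
                break
    # Whatever single pair is left over can be matched by elimination.
    if len(attributes) == 1 and len(estimated) == 1:
        pairs.append((attributes[0], estimated[0]))
    return pairs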
○Processing example 9 (extracting attribute information and estimated attribute information that redundantly represent the same person)
There are also cases where the estimated attribute information of two persons recognized in a certain captured image substantially matches, and one piece of attribute information with contents similar to the estimated attribute information is acquired at that timing. For example, as shown in FIG. 9, suppose that two pieces of estimated attribute information whose contents match and whose acquisition dates and times overlap are acquired. Then, as shown in FIG. 10, suppose that one piece of attribute information with similar contents is acquired at that timing. From this information alone, it cannot be determined which of the estimated attribute information represents the same person as the attribute information.
However, referring to FIG. 6, a user ID already exists that is associated with a feature amount matching the feature amount corresponding to the estimated attribute ID "B0072" in FIG. 9. Since the attribute ID column for this user ID is empty, the estimated attribute information corresponding to this user ID represents a person not represented by any attribute information. On the basis of this information, the extraction unit 14 can identify that the estimated attribute information with estimated attribute ID "B0072" is estimated attribute information representing a person not represented by the attribute information. It can then determine that the remaining estimated attribute ID "B0073" and attribute ID "A0076" are attribute information and estimated attribute information that redundantly represent the same person.
In this way, on the basis of the result of associating attribute information and estimated attribute information acquired at other timings (which may be before or after the estimated attribute information and attribute information being processed), attribute information and estimated attribute information that redundantly represent the same person can be extracted.
○Processing example 10 (extracting attribute information and estimated attribute information that redundantly represent the same person)
There are also cases where three or more persons are recognized in a certain captured image and, for example, three pieces of estimated attribute information are generated, while one piece of attribute information is acquired at that timing. For example, as shown in FIG. 11, suppose that three pieces of estimated attribute information whose acquisition dates and times overlap are acquired. Then, as shown in FIG. 12, suppose that one piece of attribute information is acquired at that timing.
In this case, the extraction unit 14 can first determine, from the degree of coincidence between the attribute information and the estimated attribute information, that the estimated attribute ID "B0093" and the attribute ID "A0101" do not represent the same person.
Next, referring to FIG. 6, a user ID already exists that is associated with a feature amount matching the feature amount corresponding to the estimated attribute ID "B0092" in FIG. 11. Since the attribute ID column for this user ID is empty, the estimated attribute information corresponding to this user ID represents a person not represented by any attribute information. On the basis of this information, the extraction unit 14 can identify that the estimated attribute information with estimated attribute ID "B0092" is estimated attribute information representing a person not represented by the attribute information. It can then determine that the remaining estimated attribute ID "B0094" and attribute ID "A0101" are attribute information and estimated attribute information that redundantly represent the same person.
In this way, on the basis of the result of associating attribute information and estimated attribute information acquired at other timings (which may be before or after the estimated attribute information and attribute information being processed), attribute information and estimated attribute information that redundantly represent the same person can be extracted.
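The elimination logic shared by processing examples 9 and 10 can be sketched as follows. The Record dataclass, the contents dictionaries, and the unmatched_features set (built from FIG. 6 rows whose attribute ID column is empty) are illustrative assumptions rather than the disclosed implementation.

from dataclasses import dataclass, field
from typing import Dict, List, Optional, Set, Tuple

@dataclass
class Record:
    record_id: str                                           # e.g. "B0092" or "A0101"
    contents: Dict[str, str] = field(default_factory=dict)   # e.g. {"sex": "F", "age": "20s"}
    feature: Optional[str] = None                            # image-derived feature amount, if any

def contents_match(est: Record, attr: Record) -> bool:
    # Degree-of-coincidence test over the fields both records carry.
    shared = est.contents.keys() & attr.contents.keys()
    return all(est.contents[k] == attr.contents[k] for k in shared)

def resolve_by_elimination(
    estimated: List[Record],
    attribute: Record,
    unmatched_features: Set[str],  # features of FIG. 6 users with no attribute information
) -> Optional[Tuple[Record, Record]]:
    # Step 1: drop estimated records whose contents disagree with the attribute
    # information (e.g. "B0093" vs "A0101" in processing example 10).
    remaining = [e for e in estimated if contents_match(e, attribute)]
    # Step 2: drop estimated records whose feature amount already belongs to a
    # user known to carry no attribute information ("B0072", "B0092").
    remaining = [e for e in remaining if e.feature not in unmatched_features]
    # If exactly one candidate survives, it and the attribute information can be
    # judged to redundantly represent the same person.
    if len(remaining) == 1:
        return (attribute, remaining[0])
    return None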
Next, an example of the processing flow of the data processing method of the present embodiment will be described with reference to the flowchart of FIG. 13. Note that this processing flow is merely an example, and the method is not limited to it. Here, the processing of integrating the attribute information acquired by the attribute information acquisition unit 11 (see FIG. 4) and the estimated attribute information generated by the image processing unit 13 (see FIG. 5) to generate integrated data (see FIG. 6) will be described.
First, the extraction unit 14 acquires one piece of attribute information or estimated attribute information and sets it as the processing target (S10).
For example, the attribute information acquisition unit 11 may sequentially input the attribute information acquired from the user portable terminals 30 to the extraction unit 14 in real time. Likewise, the captured image acquisition unit 12 may acquire image data from the imaging device 40 in real time, and the image processing unit 13 may analyze that image data in real time to generate estimated attribute information and input it to the extraction unit 14. The extraction unit 14 may then take the attribute information and estimated attribute information input in this way from the attribute information acquisition unit 11 and the image processing unit 13 as processing targets in the order of input and process them in real time.
As another example, the extraction unit 14 may batch-process a plurality of pieces of attribute information acquired by the attribute information acquisition unit 11 from the user portable terminals 30 and a plurality of pieces of estimated attribute information generated by the image processing unit 13. In this case, the extraction unit 14 may determine the processing targets from the group of attribute information and estimated attribute information in order of acquisition date and time.
Thereafter, the extraction unit 14 performs integration processing on the processing target and the integrated data on the basis of processing examples 1 to 10 above (S11). The processing flow of the integration processing is not particularly limited, and various forms can be adopted. The same processing is then repeated for subsequent inputs (S12).
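As a rough illustration of the S10 to S12 loop, the sketch below processes records one at a time in real time and shows the batch variant that orders records by acquisition date and time. The extractor.integrate interface and the acquired_at field are assumptions made for the sketch; the flowchart of FIG. 13 does not prescribe any particular API.

from typing import Iterable, List

def run_integration(incoming: Iterable, integrated: List, extractor) -> List:
    # S10 to S12 as a simple loop: take one piece of attribute information or
    # estimated attribute information at a time (S10), integrate it against the
    # already-integrated data (S11), and repeat (S12).
    for record in incoming:                      # real-time: process in input order
        extractor.integrate(record, integrated)  # applies processing examples 1 to 10
    return integrated

def batch_order(records: List) -> List:
    # Batch variant: decide the processing order by acquisition date and time.
    return sorted(records, key=lambda r: r.acquired_at)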
Here, a modified example of the present embodiment will be described with reference to FIG. 14. In the data processing system of this modified example, the user portable terminal 30 transmits detection information including the user's attribute information within its communication area 30A, and the installation device 20 located within the communication area 30A receives the detection information. In response to that reception, the installation device 20 transmits the user's attribute information to the data processing device 10. The data processing system of this modified example differs from the example above in this respect; the other configurations are the same. In this example, the user portable terminal 30 therefore transmits the user's attribute information to the data processing device 10 via the installation device 20.
The user portable terminal 30 of this modified example may be a terminal prepared exclusively for the data processing system of the present embodiment, for example a beacon terminal. The installation device 20 and the data processing device 10 communicate with each other by wire and/or wirelessly.
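A minimal sketch of the relay in this modified example might look like the following, assuming the installation device 20 exposes a callback for received detection information and forwards the attribute information to the data processing device 10 over HTTP. The payload shape, the URL, and the use of HTTP are all assumptions made for illustration; the embodiment only requires a wired and/or wireless link.

import json
import urllib.request

def on_detection_received(detection: dict, data_processing_url: str) -> None:
    # The installation device receives detection information broadcast by a user
    # terminal (e.g. a beacon payload carrying attribute information) and forwards
    # the attribute information to the data processing device over its link.
    payload = json.dumps({"attribute_info": detection["attribute_info"]}).encode()
    request = urllib.request.Request(
        data_processing_url,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request)  # fire-and-forget, sufficient for the sketch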
Next, the functions and effects of the present embodiment will be described.
According to the present embodiment, the user's attribute information (including estimated attribute information) can be collected both by the means that acquires the user's estimated attribute information using face images and by the means that acquires the user's attribute information from the user portable terminal 30. A predetermined analysis can then be performed using the information collected in this way. For example, it is possible to analyze the attributes of users who came to a predetermined spot; to extract, among the users who came to a predetermined spot, those not carrying a user portable terminal 30 and analyze their attributes; or to extract users whose estimated attribute information could not be acquired from face images, analyze their attributes, and accumulate that data.
As a result, the accuracy of the analysis can be improved compared with acquiring and analyzing the user's attribute information (including estimated attribute information) by only one of these means, and the range of contents that can be analyzed can be broadened.
Hereinafter, examples of reference forms are appended.
1. A data processing device comprising:
   attribute information acquisition means for acquiring a user's attribute information from each user portable terminal located in an area within a predetermined distance from an installation device;
   captured image acquisition means for acquiring image data of a captured image capturing the inside of the area;
   image processing means for analyzing the captured image to recognize persons appearing in the captured image and estimating attribute information for each recognized person to generate estimated attribute information; and
   extraction means for extracting at least one of: the attribute information and the estimated attribute information that redundantly represent the same person; the estimated attribute information representing a person not represented by the attribute information; and the attribute information representing a person not represented by the estimated attribute information.
2. The data processing device according to 1, wherein the extraction means further extracts, from among a plurality of pieces of estimated attribute information generated by analyzing each of a plurality of captured images captured at mutually different timings, the estimated attribute information that redundantly represents the same person.
3. The data processing device according to 2, wherein the image processing means extracts a feature amount for each person recognized through analysis of the captured image by analyzing the captured image, and the extraction means extracts, on the basis of the feature amount, the estimated attribute information that redundantly represents the same person from among the plurality of pieces of estimated attribute information.
4. A data processing system comprising:
   an installation device installed at a predetermined location and capable of communicating under a predetermined wireless communication standard whose communication distance is within a predetermined distance;
   a user portable terminal that is carried by a user, can communicate under the wireless communication standard, and, upon entering the area within the predetermined distance from the installation device, communicates with the installation device under the wireless communication standard;
   an imaging device that captures the inside of the area; and
   the data processing device according to any one of 1 to 3.
5. The data processing system according to 4, wherein, when the user portable terminal communicates with the installation device under the wireless communication standard, the user portable terminal transmits, in response to the communication, the user's attribute information stored in advance in the terminal to the data processing device.
6. The data processing system according to 5, wherein the user portable terminal transmits the attribute information to the data processing device via the installation device.
7. A data processing method executed by a computer, the method comprising:
   an attribute information acquisition step of acquiring a user's attribute information from each user portable terminal located in an area within a predetermined distance from an installation device;
   a captured image acquisition step of acquiring image data of a captured image capturing the inside of the area;
   an image processing step of analyzing the captured image to recognize persons appearing in the captured image and estimating attribute information for each recognized person to generate estimated attribute information; and
   an extraction step of extracting at least one of: the attribute information and the estimated attribute information that redundantly represent the same person; the estimated attribute information representing a person not represented by the attribute information; and the attribute information representing a person not represented by the estimated attribute information.
7-2. The data processing method according to 7, wherein, in the extraction step, the estimated attribute information that redundantly represents the same person is further extracted from among a plurality of pieces of estimated attribute information generated by analyzing each of a plurality of captured images captured at mutually different timings.
7-3. The data processing method according to 7-2, wherein, in the image processing step, a feature amount is extracted for each person recognized through analysis of the captured image by analyzing the captured image, and, in the extraction step, the estimated attribute information that redundantly represents the same person is extracted from among the plurality of pieces of estimated attribute information on the basis of the feature amount.
8. A program for causing a computer to function as:
   attribute information acquisition means for acquiring a user's attribute information from each user portable terminal located in an area within a predetermined distance from an installation device;
   captured image acquisition means for acquiring image data of a captured image capturing the inside of the area;
   image processing means for analyzing the captured image to recognize persons appearing in the captured image and estimating attribute information for each recognized person to generate estimated attribute information; and
   extraction means for extracting at least one of: the attribute information and the estimated attribute information that redundantly represent the same person; the estimated attribute information representing a person not represented by the attribute information; and the attribute information representing a person not represented by the estimated attribute information.
8-2. The program according to 8, which causes the extraction means to further extract, from among a plurality of pieces of estimated attribute information generated by analyzing each of a plurality of captured images captured at mutually different timings, the estimated attribute information that redundantly represents the same person.
8-3. The program according to 8-2, which causes the image processing means to extract a feature amount for each person recognized through analysis of the captured image by analyzing the captured image, and causes the extraction means to extract, on the basis of the feature amount, the estimated attribute information that redundantly represents the same person from among the plurality of pieces of estimated attribute information.
This application claims priority based on Japanese Patent Application No. 2014-177858 filed on September 2, 2014, the entire disclosure of which is incorporated herein.

Claims (8)

1. A data processing device comprising:
   attribute information acquisition means for acquiring a user's attribute information from each user portable terminal located in an area within a predetermined distance from an installation device;
   captured image acquisition means for acquiring image data of a captured image capturing the inside of the area;
   image processing means for analyzing the captured image to recognize persons appearing in the captured image and estimating attribute information for each recognized person to generate estimated attribute information; and
   extraction means for extracting at least one of: the attribute information and the estimated attribute information that redundantly represent the same person; the estimated attribute information representing a person not represented by the attribute information; and the attribute information representing a person not represented by the estimated attribute information.
2. The data processing device according to claim 1, wherein the extraction means further extracts, from among a plurality of pieces of estimated attribute information generated by analyzing each of a plurality of captured images captured at mutually different timings, the estimated attribute information that redundantly represents the same person.
3. The data processing device according to claim 2, wherein the image processing means extracts a feature amount for each person recognized through analysis of the captured image by analyzing the captured image, and the extraction means extracts, on the basis of the feature amount, the estimated attribute information that redundantly represents the same person from among the plurality of pieces of estimated attribute information.
4. A data processing system comprising:
   an installation device installed at a predetermined location and capable of communicating under a predetermined wireless communication standard whose communication distance is within a predetermined distance;
   a user portable terminal that is carried by a user, can communicate under the wireless communication standard, and, upon entering the area within the predetermined distance from the installation device, communicates with the installation device under the wireless communication standard;
   an imaging device that captures the inside of the area; and
   the data processing device according to any one of claims 1 to 3.
5. The data processing system according to claim 4, wherein, when the user portable terminal communicates with the installation device under the wireless communication standard, the user portable terminal transmits, in response to the communication, the user's attribute information stored in advance in the terminal to the data processing device.
6. The data processing system according to claim 5, wherein the user portable terminal transmits the attribute information to the data processing device via the installation device.
7. A data processing method executed by a computer, the method comprising:
   an attribute information acquisition step of acquiring a user's attribute information from each user portable terminal located in an area within a predetermined distance from an installation device;
   a captured image acquisition step of acquiring image data of a captured image capturing the inside of the area;
   an image processing step of analyzing the captured image to recognize persons appearing in the captured image and estimating attribute information for each recognized person to generate estimated attribute information; and
   an extraction step of extracting at least one of: the attribute information and the estimated attribute information that redundantly represent the same person; the estimated attribute information representing a person not represented by the attribute information; and the attribute information representing a person not represented by the estimated attribute information.
8. A program for causing a computer to function as:
   attribute information acquisition means for acquiring a user's attribute information from each user portable terminal located in an area within a predetermined distance from an installation device;
   captured image acquisition means for acquiring image data of a captured image capturing the inside of the area;
   image processing means for analyzing the captured image to recognize persons appearing in the captured image and estimating attribute information for each recognized person to generate estimated attribute information; and
   extraction means for extracting at least one of: the attribute information and the estimated attribute information that redundantly represent the same person; the estimated attribute information representing a person not represented by the attribute information; and the attribute information representing a person not represented by the estimated attribute information.
PCT/JP2015/073955 2014-09-02 2015-08-26 Data processing device, data processing system, data processing method, and program WO2016035632A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2016546579A JP6267350B2 (en) 2014-09-02 2015-08-26 Data processing apparatus, data processing system, data processing method and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014177858 2014-09-02
JP2014-177858 2014-09-02

Publications (1)

Publication Number Publication Date
WO2016035632A1 true WO2016035632A1 (en) 2016-03-10

Family

ID=55439697

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/073955 WO2016035632A1 (en) 2014-09-02 2015-08-26 Data processing device, data processing system, data processing method, and program

Country Status (2)

Country Link
JP (1) JP6267350B2 (en)
WO (1) WO2016035632A1 (en)



Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012113557A (en) * 2010-11-25 2012-06-14 Nomura Research Institute Ltd Automatic ticket gate machine and information processing system
JP5652273B2 (en) * 2011-03-15 2015-01-14 オムロン株式会社 Gate device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007102342A (en) * 2005-09-30 2007-04-19 Fujifilm Corp Automatic counter
WO2009041242A1 (en) * 2007-09-28 2009-04-02 Nec Soft, Ltd. Gathering system, gathering device, and gathering method
JP2010140287A (en) * 2008-12-12 2010-06-24 Nomura Research Institute Ltd Purchase action analysis device, method and computer program
JP2011018300A (en) * 2009-06-08 2011-01-27 Jr East Mechatronics Co Ltd Gate system, server and association method in the gate system

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021235355A1 (en) * 2020-05-22 2021-11-25 富士フイルム株式会社 Image data processing device and image data processing system
JP7377971B2 (en) 2020-05-22 2023-11-10 富士フイルム株式会社 Image data processing device and image data processing system

Also Published As

Publication number Publication date
JP6267350B2 (en) 2018-01-24
JPWO2016035632A1 (en) 2017-04-27

Similar Documents

Publication Publication Date Title
JP7405200B2 (en) person detection system
JP6814673B2 (en) Movement route prediction device and movement route prediction method
JPWO2018198373A1 (en) Video surveillance system
JP6730236B2 (en) Person identification system and person identification method
TWI586160B (en) Real time object scanning using a mobile phone and cloud-based visual search engine
JP6573311B2 (en) Face recognition system, face recognition server, and face recognition method
JP7238902B2 (en) Information processing device, information processing method, and program
JP5699802B2 (en) Information processing apparatus, information processing method, program, and information processing system
WO2019033567A1 (en) Method for capturing eyeball movement, device and storage medium
US20190147251A1 (en) Information processing apparatus, monitoring system, method, and non-transitory computer-readable storage medium
US10984249B2 (en) Information processing apparatus, system, control method for information processing apparatus, and non-transitory computer readable storage medium
US10719543B2 (en) Information processing apparatus, information processing method, and program
JP2024045460A (en) Information processing system, information processing device, information processing method, and program
JP6267350B2 (en) Data processing apparatus, data processing system, data processing method and program
US10219127B2 (en) Information processing apparatus and information processing method
JP6702402B2 (en) Image processing system, image processing method, and image processing program
JP2014215747A (en) Tracking device, tracking system, and tracking method
US9076031B2 (en) Image capture apparatus, control method of image capture apparatus, and recording medium
JP2008134868A (en) Image recognition device, electronic device, image recognition method and control program
JP2010170212A (en) Action estimation device and method
JP2022058833A (en) Information processing system, information processing apparatus, information processing method, and program
JP7102859B2 (en) Video Conference Systems, Video Conference Methods, and Programs
JP6112346B2 (en) Information collection system, program, and information collection method
US20160088261A1 (en) System and a method for specifying an image capturing unit, and a non-transitory computer readable medium thereof
JP2019003363A (en) Information providing system and information providing method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15837634

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2016546579

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15837634

Country of ref document: EP

Kind code of ref document: A1