WO2019193817A1 - Analysis System


Publication number
WO2019193817A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
analysis
image
person
attribute
Application number
PCT/JP2019/002171
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
綾乃 河江
一仁 佐野
博 丹下
拓己 宇津木
鋭 水野
Original Assignee
矢崎エナジーシステム株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by 矢崎エナジーシステム株式会社 filed Critical 矢崎エナジーシステム株式会社
Priority to SG11202009797UA priority Critical patent/SG11202009797UA/en
Priority to CN201980024084.3A priority patent/CN111937026A/zh
Priority to TW108107249A priority patent/TW201944325A/zh
Publication of WO2019193817A1 publication Critical patent/WO2019193817A1/ja

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 - Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10 - Services

Definitions

  • the present invention relates to an analysis system.
  • Patent Document 1 discloses a counting device.
  • the counting device detects the number of passers-by near the store, the number of people entering the store, and the number of people who purchase products in the store, based on the image data captured by the first to third imaging devices. The detected number of persons is totaled for each predetermined time unit.
  • the analysis system as described above has room for further improvement, for example, in terms of improvement of analysis accuracy.
  • the present invention has been made in view of the above circumstances, and an object thereof is to provide an analysis system capable of improving the analysis accuracy.
  • The analysis system includes: a moving body analysis data collecting device that is mounted on a moving body and collects first analysis data including first image data representing an image outside the moving body captured along with the movement of the moving body and first position data representing a position where the image outside the moving body is captured; and a data analysis device that, based on the first analysis data collected by the moving body analysis data collecting device, analyzes an attribute of a person included in the image represented by the first image data and a human flow of the person whose attribute is specified.
  • The data analysis device extracts, from a plurality of images represented by the first image data, an image including a feature point of the person from which the attribute of the person can be specified, and can analyze the attribute of the person included in the image based on the extracted image.
  • The moving body analysis data collection device further collects second analysis data including second image data representing an image inside the moving body and second position data representing a position where the image inside the moving body is captured, and the data analysis device can further analyze, based on the second analysis data collected by the moving body analysis data collection device, the attributes of a person included in the image represented by the second image data and the human flow of the person whose attribute is specified.
  • The analysis system may further include a fixed body analysis data collecting device that is mounted on a fixed body installed on a road and collects third analysis data including third image data representing an image outside the fixed body captured in a state of being mounted on the fixed body and third position data representing a position where the image outside the fixed body is captured; the data analysis device can further analyze, based on the third analysis data collected by the fixed body analysis data collecting device, the attribute of the person included in the image represented by the third image data and the human flow of the person whose attribute is specified.
  • the moving body may be one that repeatedly travels on a predetermined route.
  • the data analysis device may include a storage unit that accumulates analysis result data representing the analysis result analyzed by the data analysis device.
  • The attributes of the person analyzed by the data analysis device may include gender, age, physique, social status, preference, or behavior orientation.
  • With this configuration, the analysis system can analyze, based on the first analysis data, the attributes of a person included in the image represented by the first image data and the human flow of the person whose attribute is specified.
  • In particular, the analysis system can collect the first image data and the first position data constituting the first analysis data as the moving body moves, by means of the moving body analysis data collecting device mounted on the moving body.
  • As a result, this analysis system can secure a relatively large amount of the first analysis data that can be used for the analysis by the data analysis device, and therefore achieves the effect of being able to improve the analysis accuracy.
  • FIG. 1 is a block diagram illustrating a schematic configuration of the analysis system according to the first embodiment.
  • FIG. 2 is a schematic diagram illustrating an example of analysis result data analyzed and processed in the analysis system according to the first embodiment.
  • FIG. 3 is a flowchart illustrating an example of processing in the analysis system according to the first embodiment.
  • FIG. 4 is a block diagram illustrating a schematic configuration of the analysis system according to the second embodiment.
  • FIG. 5 is a block diagram illustrating a schematic configuration of the analysis system according to the third embodiment.
  • The analysis system 1 includes a recording device 10 as a moving body analysis data collection device and an analysis device 20 as a data analysis device, and is a system that provides the analysis result data analyzed by the analysis device 20 to a client terminal CL.
  • the analysis device 20 analyzes the attributes of a person and the human flow of the person whose attribute is specified based on the images collected by the recording device 10.
  • The analysis system 1 improves the analysis accuracy by using the recording device 10 mounted on a moving body V as the device for collecting the analysis data used for analysis by the analysis device 20.
  • the configuration of the analysis system 1 will be described in detail with reference to the drawings.
  • the recording device 10 collects first analysis data used for analysis by the analysis device 20.
  • the first analysis data is data including first image data representing an image and first position data representing a position where the image is captured.
  • the recording apparatus 10 of the present embodiment is mounted on the moving body V.
  • The moving body V on which the recording device 10 is mounted is typically a vehicle that travels on a road surface, and may be any type of such vehicle.
  • the recording apparatus 10 is typically mounted on each of the plurality of moving bodies V. That is, the analysis system 1 of the present embodiment includes a plurality of recording devices 10 respectively mounted on a plurality of moving bodies V, and can collect analysis data from the plurality of recording devices 10.
  • At least some of the plurality of recording devices 10 are preferably mounted on a vehicle that repeatedly travels a predetermined route, such as a route bus or a long-distance transport truck. The recording device 10 then collects, as the first analysis data, first image data representing an image outside the moving body V captured along with the movement of the moving body V and first position data representing the position where the image outside the moving body V was captured.
  • As the recording device 10, a vehicle-mounted device such as a so-called drive recorder mounted on the moving body V can be used, but the recording device is not limited to this.
  • the recording apparatus 10 includes an external camera 11, a position information measuring device 12, a data input / output unit 13, and a control unit 14.
  • the external camera 11 is an external imaging device that captures an image outside the moving object V.
  • the external camera 11 captures an image outside the moving body V as the moving body V moves, and collects first image data representing an image outside the moving body V.
  • the external camera 11 typically captures a moving image outside the moving object V.
  • the external camera 11 is installed on the moving body V so as to have an angle of view capable of capturing an image of a person to be analyzed by the analysis system 1, here a person located on the road outside the moving body V.
  • a plurality of external cameras 11 may be provided on the front, side, rear, roof, etc. of the moving body V so that a person on the road can be more suitably imaged from the moving body V.
  • the external camera 11 may be a monocular camera or a stereo camera.
  • the image captured by the external camera 11 may be monochrome or color.
  • the external camera 11 is communicably connected to the control unit 14 and outputs the collected first image data to the control unit 14.
  • the position information measuring device 12 is a positioning device that measures the current position of the moving object V.
  • As the position information measuring device 12, for example, a GPS receiver that receives radio waves transmitted from GPS (Global Positioning System) satellites can be used.
  • The position information measuring device 12 receives radio waves transmitted from GPS satellites and acquires GPS information (latitude and longitude coordinates) as information representing the current position of the moving body V, thereby collecting first position data representing the position where an image outside the moving body V was captured.
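As a concrete illustration, the first analysis data described above can be thought of as records pairing a captured image with a GPS fix (latitude/longitude) and a collection time. The following Python sketch uses invented field names; the specification does not prescribe any particular data layout.

```python
from dataclasses import dataclass

# Hypothetical shape of one "first analysis data" record: the specification
# pairs first image data with first position data and time data, but the
# concrete field names below are assumptions for illustration.
@dataclass
class FirstAnalysisRecord:
    frame_id: str        # reference to the captured image (first image data)
    latitude: float      # GPS latitude where the image was captured
    longitude: float     # GPS longitude where the image was captured
    timestamp: float     # Unix time when the data was collected (time data)

def make_record(frame_id: str, gps_fix: tuple, timestamp: float) -> FirstAnalysisRecord:
    """Combine a frame reference with the current GPS fix into one record."""
    lat, lon = gps_fix
    return FirstAnalysisRecord(frame_id, lat, lon, timestamp)

record = make_record("frame_000123", (35.6812, 139.7671), 1_700_000_000.0)
```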
  • the position information measuring device 12 is communicably connected to the control unit 14 and outputs the collected first position data to the control unit 14.
  • the data input / output unit 13 inputs / outputs various data between a device different from the recording device 10 and the recording device 10.
  • the data input / output unit 13 of the present embodiment can output the first analysis data to the analysis device 20 which is a device different from the recording device 10.
  • the data input / output unit 13 may be configured to input / output data to / from a device different from the recording apparatus 10 through communication (whether wired or wireless) via a network.
  • Alternatively, the data input / output unit 13 may be configured to have a slot unit and to input / output data to / from a device different from the recording apparatus 10 via a recording medium inserted into the slot unit.
  • the recording medium is, for example, a memory (removable medium) that can be attached to and detached from the recording apparatus 10 via a slot portion.
  • various types of memory cards such as an SD card can be used as the recording medium, but the recording medium is not limited thereto.
  • the control unit 14 controls each unit of the recording apparatus 10 in an integrated manner.
  • the control unit 14 executes various arithmetic processes for collecting the first analysis data.
  • The control unit 14 is composed of an electronic circuit mainly including a well-known microcomputer that includes a central processing unit such as a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), and an interface.
  • the control unit 14 is communicably connected to each unit such as the external camera 11, the position information measuring device 12, and the data input / output unit 13, and can exchange various signals and data with each unit.
  • control unit 14 includes a storage unit 14A and a processing unit 14B.
  • the storage unit 14A and the processing unit 14B can exchange various signals and data with each other.
  • the storage unit 14A stores conditions and information necessary for various processes in the processing unit 14B, various programs and applications executed by the control unit 14, control data, and the like.
  • the storage unit 14A can store the first analysis data together with the collected time and the like.
  • the first analysis data includes time data indicating the time when the data is collected and other data.
  • the storage unit 14A can temporarily store various data generated in the course of processing by the processing unit 14B. In the storage unit 14A, these data are read as necessary by the processing unit 14B, the data input / output unit 13, and the like.
  • The storage unit 14A may be a relatively large-capacity storage device such as a hard disk, a solid state drive (SSD), or an optical disk, or a semiconductor memory capable of rewriting data, such as a RAM, a flash memory, or an NVSRAM (Non Volatile Static Random Access Memory).
  • The processing unit 14B executes various programs stored in the storage unit 14A based on various input signals and the like, and by running these programs, outputs various output signals to the respective units and executes processes for realizing various functions.
  • the processing unit 14B controls the operations of the external camera 11 and the position information measuring device 12, and executes processing for collecting first analysis data including first image data and first position data.
  • The processing unit 14B executes processing related to data input / output via the data input / output unit 13.
  • The processing unit 14B executes a process of outputting the first analysis data to the analysis device 20 via the data input / output unit 13.
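The collection and output performed by the processing unit 14B can be sketched as a simple loop. In this illustrative sketch (not from the specification), `camera`, `gps`, and `output` are hypothetical stand-ins for the external camera 11, the position information measuring device 12, and the data input / output unit 13.

```python
import time

def collect_first_analysis_data(camera, gps, output, n_frames):
    """Pair each captured frame with the current GPS fix and a timestamp,
    buffer the records, then flush them through the output interface."""
    buffer = []  # plays the role of the storage unit 14A
    for _ in range(n_frames):
        frame = camera()                 # first image data
        lat, lon = gps()                 # first position data
        buffer.append({"frame": frame, "lat": lat, "lon": lon,
                       "time": time.time()})
    output(buffer)                       # output via the data input/output unit
    return len(buffer)

# Usage with dummy stand-in devices:
frames = iter(["f0", "f1", "f2"])
sent = []
n = collect_first_analysis_data(lambda: next(frames),
                                lambda: (35.68, 139.77),
                                sent.extend, 3)
```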
  • the analysis device 20 analyzes the first analysis data collected by the recording device 10 and provides analysis result data representing the analysis result to the client terminal CL.
  • The analysis device 20 and the client terminal CL may constitute a so-called cloud service type device (cloud server) implemented on a network, or may constitute so-called stand-alone type devices separated from the network.
  • Based on the first analysis data, the analysis device 20 analyzes the attribute of the person included in the image represented by the first image data constituting the first analysis data and the human flow of the person whose attribute is specified.
  • the analysis apparatus 20 according to the present embodiment generates person attribute data representing the attribute of the person and attribute-specific person flow data representing the person flow of the person whose attribute is specified. Then, the analysis device 20 provides the client terminal CL with analysis result data including person attribute data and attribute-specific person flow data.
  • the analysis device 20 executes various arithmetic processes for analyzing the attributes of the person and the human flow of the person based on the first analysis data.
  • The analysis device 20 is composed of an electronic circuit mainly including a well-known microcomputer that includes a central processing unit such as a CPU, a ROM, a RAM, and an interface.
  • the analysis device 20 can also be configured by installing an application that realizes various processes described below in a computer system such as a known PC or workstation.
  • the analysis device 20 includes a data input / output unit 21, a storage unit 22, and a processing unit 23.
  • the data input / output unit 21, the storage unit 22, and the processing unit 23 can exchange various signals and data with each other.
  • the data input / output unit 21 inputs / outputs various data between a device different from the analysis device 20 and the analysis device 20.
  • the data input / output unit 21 of the present embodiment can input the first analysis data from the recording device 10, which is a device different from the analysis device 20.
  • the data input / output unit 21 of the present embodiment can output analysis result data to the client terminal CL which is a device different from the analysis device 20.
  • The data input / output unit 21 may be configured to input / output data to / from a device different from the analysis device 20 by, for example, communication via a network (whether wired or wireless).
  • Alternatively, the data input / output unit 21 may be configured to have a slot unit and to input / output data to / from a device different from the analysis device 20 via, for example, a recording medium inserted into the slot unit.
  • the storage unit 22 stores conditions and information necessary for various processes in the processing unit 23, various programs and applications executed by the processing unit 23, control data, and the like.
  • the storage unit 22 can store the first analysis data input by the data input / output unit 21.
  • the storage unit 22 can also temporarily store various data generated in the course of processing by the processing unit 23, for example. In the storage unit 22, these data are read out as necessary by the data input / output unit 21, the processing unit 23, and the like.
  • the storage unit 22 may be a relatively large-capacity storage device such as a hard disk, SSD, or optical disk, or a semiconductor memory that can rewrite data such as a RAM, a flash memory, or an NVSRAM.
  • The storage unit 22 functionally and conceptually includes an analysis target database (hereinafter abbreviated as "analysis target DB") 22A, an analysis reference database (hereinafter abbreviated as "analysis reference DB") 22B, and an analysis result database (hereinafter abbreviated as "analysis result DB") 22C.
  • The analysis target DB 22A is a part that accumulates the first analysis data (first image data, first position data, time data, etc.) serving as the analysis target data for the processing unit 23 and stores it as a database.
  • the first analysis data input from the recording device 10 to the data input / output unit 21 is stored in the analysis target DB 22A.
  • the analysis reference DB 22B is a part that accumulates analysis reference data to be referred to when the first analysis data is analyzed by the processing unit 23, and stores it as a database.
  • the analysis reference data includes, for example, map reference data, attribute prediction reference data, and the like.
  • the map reference data is data representing a map to be referred to when specifying the position of the moving object V based on the first position data, in other words, the position where the image outside the moving object V is captured.
  • the attribute prediction reference data is data that is referred to for estimating the attribute of a person included in the image represented by the first image data. The attribute prediction reference data will be described in detail later.
  • the analysis reference data is referred to by the processing unit 23 when analyzing the first analysis data.
  • The analysis result DB 22C is a part that accumulates analysis result data representing the analysis results of the first analysis data produced by the processing unit 23 and stores it as a database.
  • the analysis result data includes, for example, person attribute data representing the attribute of a person included in the image represented by the first image data, attribute-specific person flow data representing the person flow of the person whose attribute is specified, and the like.
  • the analysis result data is processed into a desired format by the processing unit 23, and is output and provided from the data input / output unit 21 to the client terminal CL.
  • the various data stored in the analysis target DB 22A, the analysis reference DB 22B, and the analysis result DB 22C can be used as so-called big data.
  • The processing unit 23 executes various programs stored in the storage unit 22 based on various input signals and the like, and by running these programs, executes various processes for analyzing the first analysis data.
  • the processing unit 23 executes processing for processing the analysis result data into a desired format.
  • the processing unit 23 executes processing related to data input / output via the data input / output unit 21. For example, the processing unit 23 executes a process of outputting the analysis result data processed into a desired format to the client terminal CL via the data input / output unit 21.
  • the processing unit 23 includes a data preprocessing unit 23A, a data analysis processing unit 23B, and a data processing unit 23C in terms of functional concept.
  • the data pre-processing unit 23A is a part that performs various pre-processing on the first analysis data that is analysis target data.
  • The data preprocessing unit 23A reads, for example, the first analysis data serving as analysis target data from the analysis target DB 22A, and executes a process of cutting out still images from the moving image represented by the first image data included in the first analysis data. Moreover, the data preprocessing unit 23A executes, for example, a process of associating each cut-out still image with the position represented by the first position data included in the first analysis data serving as analysis target data.
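The association of cut-out still images with positions can be illustrated as a nearest-timestamp join between frames and GPS fixes. The data shapes below are assumptions for illustration; the specification leaves the concrete pairing method open.

```python
import bisect

def associate_frames_with_positions(frames, fixes):
    """Attach to each still image the GPS fix closest to it in time.

    frames: list of (frame_time, frame_id)
    fixes:  time-sorted list of (fix_time, lat, lon)
    """
    fix_times = [t for t, _, _ in fixes]
    associated = []
    for frame_time, frame_id in frames:
        i = bisect.bisect_left(fix_times, frame_time)
        # consider the neighbouring fixes and pick the closer one in time
        candidates = [j for j in (i - 1, i) if 0 <= j < len(fixes)]
        j = min(candidates, key=lambda j: abs(fix_times[j] - frame_time))
        _, lat, lon = fixes[j]
        associated.append((frame_id, lat, lon))
    return associated

pairs = associate_frames_with_positions(
    [(0.4, "f0"), (1.6, "f1")],
    [(0.0, 35.0, 139.0), (1.0, 35.1, 139.1), (2.0, 35.2, 139.2)])
```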
  • the data preprocessing unit 23A detects and extracts a person from the cut out still image using various known image processing techniques as preprocessing.
  • the data preprocessing unit 23A executes a process of extracting an image including the feature point of the person from a plurality of images represented by the first image data.
  • The feature point of the person is a part of the person included in the image from which the attribute of the person can be specified.
  • The feature point of the person is, for example, a part such as the face where the person's facial expression appears, the limbs where gestures appear, or a position where an accessory or the like tends to be worn. Since the first image data of the present embodiment is collected along with the movement of the moving body V by the recording device 10 mounted on the moving body V, it is likely to include images taken from different angles of the same person.
  • The data preprocessing unit 23A extracts images capturing the person's feature points that can be used for specifying the person's attributes from the many images captured from different angles as the moving body V moves, thereby securing as much data as possible for specifying the attributes of a person.
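This screening step can be sketched as a simple filter: from the many still images, keep only those in which a detector finds a usable feature point (face, limbs, worn accessories). The detector is passed in as a callable because the specification leaves the concrete image-processing technique open ("various known image processing techniques"); the image dicts below are invented for the example.

```python
def extract_feature_point_images(images, detect_feature_points):
    """Return only the images for which the detector reports a feature point."""
    return [img for img in images if detect_feature_points(img)]

# Usage with a dummy detector that "finds" a face when the image dict says so:
images = [{"id": "a", "has_face": True},
          {"id": "b", "has_face": False},
          {"id": "c", "has_face": True}]
kept = extract_feature_point_images(images, lambda img: img["has_face"])
```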
  • The data analysis processing unit 23B is the part that analyzes, based on the first analysis data preprocessed by the data preprocessing unit 23A, the attribute of the person included in the image represented by the first image data and the human flow of the person whose attribute is specified.
  • The data analysis processing unit 23B is configured to be able to execute, using various known artificial intelligence (AI) technologies and deep learning technologies, a process of analyzing the attributes of the person included in the image represented by the first image data and the human flow of the person whose attribute is specified.
  • the data analysis processing unit 23B analyzes the attribute of the person included in the image based on the image including the feature point of the person extracted from the first image data by the data preprocessing unit 23A.
  • For example, the data analysis processing unit 23B matches the feature points of the person included in the image extracted from the first image data by the data preprocessing unit 23A against the attribute prediction reference data (analysis reference data) stored in the analysis reference DB 22B.
  • The attribute prediction reference data is information reflecting the result of learning, by various methods using artificial intelligence and deep learning technologies, the attributes of a person that can be estimated according to the feature points of the person included in an image.
  • In other words, the attribute prediction reference data is data in which the relationships used to estimate the attributes of a person based on the feature points of the person included in an image have been compiled into a database using various techniques based on artificial intelligence and deep learning technologies.
  • This attribute prediction reference data can be updated sequentially.
  • For this learning, the person attribute data (analysis result data) representing analysis results produced by the data analysis processing unit 23B can also be used as learning data.
  • The attributes of the person include items that can be analyzed from feature points of the person's appearance, for example, gender, age, physique, social status, preference, or behavior orientation.
  • gender is an attribute representing male and female.
  • Age is an attribute that represents the number of years from birth to the present (at that time).
  • the physique is an attribute representing height, weight, various dimensions, and the like.
  • the social status is an attribute representing occupation (self-employed, businessman, police officer, student, unemployed, part-time job), annual income, status, accompanying person, and the like.
  • The preference is an attribute that represents trends in clothing, personal belongings, and fashion (casual orientation, elegant orientation, brand orientation, luxury orientation, fast fashion orientation), and hobbies (sports, subculture, outdoor, beauty, etc.).
  • Behavior orientation is an attribute that represents the person's mood, interests (what they want to do, where they want to go), and the like at that time. That is, here, the data analysis processing unit 23B estimates gender, age, physique, social status, preference, behavior orientation, and the like as the attributes of the person.
  • The data analysis processing unit 23B refers to the attribute prediction reference data, extracts the attributes (gender, age, physique, social status, preference, or behavior orientation) corresponding to the feature points of the person included in the image, and estimates that the extracted attributes are the attributes of the person appearing in the image.
  • For example, the data analysis processing unit 23B refers to the attribute prediction reference data, matches attributes against the feature points of the person included in the image, such as the facial expression, the gestures of the limbs, and any worn accessories or clothes, and thereby estimates attributes such as the person's gender, age, physique, social status, preference, and behavior orientation.
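As a hedged illustration of this matching, the attribute prediction reference data can be modelled as labelled feature vectors, with a person's extracted feature vector matched to the closest reference entry. A learned model would normally replace this nearest-neighbour stand-in; the vectors and attribute labels below are invented for the example.

```python
import math

# Toy "attribute prediction reference data": feature vectors with attribute labels.
REFERENCE_DATA = [
    ((0.9, 0.1), {"gender": "female", "age_band": "20s"}),
    ((0.1, 0.9), {"gender": "male", "age_band": "50s"}),
]

def estimate_attributes(feature_vector, reference=REFERENCE_DATA):
    """Return the attribute labels of the closest reference feature vector."""
    _, attrs = min(reference,
                   key=lambda entry: math.dist(feature_vector, entry[0]))
    return attrs

attrs = estimate_attributes((0.8, 0.2))
```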
  • the data analysis processing unit 23B executes a process of analyzing the human flow of the person whose attribute is specified as described above, based on the position data associated with the image data in which the attribute of the person is specified. For example, the data analysis processing unit 23B reads the first position data associated with the first image data in which the attribute of the person is specified from the analysis target DB 22A. Then, the data analysis processing unit 23B analyzes the human flow of the person whose attribute is specified based on the map reference data (analysis reference data) stored in the analysis reference DB 22B and the read first position data. . For example, the data analysis processing unit 23B refers to the map reference data and specifies the position where the image is captured based on the first position data.
  • The data analysis processing unit 23B specifies the position of the person whose attribute is specified based on the position represented by the first position data, and specifies the human flow by arranging these positions in time series.
  • a human flow of a person analyzed by the data analysis processing unit 23B is typically represented by a distribution, a moving direction, a moving speed, a movement start point / end point, and the like of the person in each time zone.
  • the data analysis processing unit 23B estimates the distribution, movement direction, movement speed, start point / end point of the movement, etc. of each person in each time zone as the flow of the person whose attribute is specified.
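The human-flow quantities named above (moving direction, moving speed, start/end points) can be computed from a time-ordered list of position fixes for one attribute-specified person. The sketch below uses a flat-earth metres-per-degree approximation, an assumption adequate only for short urban tracks.

```python
import math

M_PER_DEG = 111_320.0  # approximate metres per degree of latitude

def summarize_flow(track):
    """track: list of (time_s, lat, lon) fixes for one person."""
    track = sorted(track)                       # arrange fixes in time series
    (t0, lat0, lon0), (t1, lat1, lon1) = track[0], track[-1]
    dy = (lat1 - lat0) * M_PER_DEG
    dx = (lon1 - lon0) * M_PER_DEG * math.cos(math.radians(lat0))
    distance = math.hypot(dx, dy)
    return {
        "start": (lat0, lon0),                           # movement start point
        "end": (lat1, lon1),                             # movement end point
        "speed_mps": distance / (t1 - t0),               # moving speed
        "bearing_deg": math.degrees(math.atan2(dx, dy)) % 360,  # 0 = north
    }

flow = summarize_flow([(0, 35.0000, 139.0000), (100, 35.0010, 139.0000)])
```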
  • The data analysis processing unit 23B generates, as the analysis result data obtained by analyzing the first analysis data, person attribute data representing the attributes of the person analyzed as described above and attribute-specific person flow data representing the human flow of the person whose attribute is specified. Then, the data analysis processing unit 23B accumulates the analysis result data including the generated person attribute data and attribute-specific person flow data in the analysis result DB 22C and stores it as a database.
  • the data processing unit 23C is a part that executes processing for processing the analysis result data analyzed by the data analysis processing unit 23B into a desired format.
  • The data processing unit 23C processes the person attribute data and attribute-specific person flow data included in the analysis result data into a desired format. For example, as illustrated in FIG. 2, the data processing unit 23C processes the analysis result data including the person attribute data and attribute-specific person flow data into maps indicating when, where, and how many people with which attributes were present, and into various graphs, diagrams, and the like. Then, the processing unit 23 executes a process of outputting the analysis result data processed into the desired format by the data processing unit 23C to the client terminal CL via the data input / output unit 21.
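The "when, where, and how many people with which attribute" tabulation can be sketched as a simple tally over analysis result records, which could then be rendered as the maps and graphs of FIG. 2. The record fields below are illustrative assumptions, not from the specification.

```python
from collections import Counter

def tally_by_attribute(records):
    """records: iterable of dicts with 'attribute', 'hour', 'area' keys.
    Returns counts keyed by (attribute, hour, area)."""
    return Counter((r["attribute"], r["hour"], r["area"]) for r in records)

table = tally_by_attribute([
    {"attribute": "female-20s", "hour": 9, "area": "station"},
    {"attribute": "female-20s", "hour": 9, "area": "station"},
    {"attribute": "male-50s", "hour": 9, "area": "station"},
])
```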
  • the client terminal CL is a terminal that can use the analysis result data provided from the analysis device 20 for various purposes such as, for example, trade area survey, marketing, advertisement, disaster prevention / city planning.
  • the client terminal CL is configured by, for example, a notebook PC, a desktop PC, a tablet PC, a smartphone, a mobile terminal, or the like.
  • the recording apparatus 10 collects first analysis data including first image data and first position data as the moving body V moves (step S1).
  • The recording device 10 outputs the collected first analysis data via the data input / output unit 13, and the data is input to the analysis device 20 via the data input / output unit 21 of the analysis device 20 (step S2).
  • the first analysis data input to the analysis device 20 is stored in the analysis target DB 22A.
  • the data preprocessing unit 23A of the analysis device 20 performs various preprocessing as described above on the first analysis data stored in the analysis target DB 22A (step S3).
  • The data analysis processing unit 23B of the analysis device 20 analyzes, based on the first analysis data preprocessed by the data preprocessing unit 23A, the attribute of the person included in the image represented by the first image data and the human flow of the person whose attribute is specified (step S4).
  • The data analysis processing unit 23B generates person attribute data and attribute-specific person flow data as analysis result data, accumulates them in the analysis result DB 22C, and stores them as a database (step S5).
  • the data processing unit 23C of the analysis apparatus 20 processes the analysis result data (person attribute data, attribute-specific person flow data) stored in the analysis result DB 22C into a desired format, as exemplified in FIG. 2 (step S6).
  • the processing unit 23 of the analysis device 20 outputs and provides the analysis result data processed into the desired format by the data processing unit 23C to the client terminal CL via the data input / output unit 21 (step S7), and the series of processing ends.
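The flow of steps S1 to S7 can be condensed into a minimal pipeline sketch. Every function below is a hypothetical stand-in: the actual preprocessing, attribute analysis, and formatting algorithms are not disclosed in the specification, and the record layout is assumed for illustration.

```python
def preprocess(raw_records):
    # S3 (data preprocessing unit 23A): e.g. keep only records with a detected person.
    return [r for r in raw_records if r.get("person_detected")]

def analyze(preprocessed):
    # S4 (data analysis processing unit 23B): specify attributes and human flow.
    # Stand-in: copy a precomputed attribute hint into the result record.
    return [{"attribute": r["attribute_hint"], "position": r["position"],
             "time": r["time"]} for r in preprocessed]

def format_results(result_records):
    # S6 (data processing unit 23C): process into a desired format,
    # here simply a count of persons per attribute.
    counts = {}
    for r in result_records:
        counts[r["attribute"]] = counts.get(r["attribute"], 0) + 1
    return counts

def run_pipeline(raw_records):
    pre = preprocess(raw_records)      # S3
    results = analyze(pre)             # S4 (S5 would persist results in DB 22C)
    return format_results(results)     # S6; S7 outputs this to the client terminal

raw = [
    {"person_detected": True, "attribute_hint": "male/30s",
     "position": (35.68, 139.76), "time": 1.0},
    {"person_detected": False},
    {"person_detected": True, "attribute_hint": "male/30s",
     "position": (35.68, 139.76), "time": 2.0},
]
formatted = run_pipeline(raw)
```

The point of the sketch is the sequencing of the named units (23A → 23B → 22C → 23C → client), not any particular algorithm.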
  • the analysis system 1 described above can, using the analysis device 20, analyze the attributes of persons included in the image represented by the first image data and the human flow of the persons whose attributes have been specified, based on the first analysis data.
  • the analysis system 1 can collect the first image data and the first position data constituting the first analysis data as the moving body V moves, by means of the recording device 10 mounted on the moving body V.
  • since the analysis system 1 allows the recording device 10 to collect the first analysis data while moving on the road with the moving body V, it is possible to secure a significantly larger collection amount of first analysis data than in a case where, for example, the first analysis data is collected by a data collection device fixed on the road.
  • in the analysis system 1, since the first image data is collected by the recording apparatus 10 as the moving body V moves, a large amount of first image data representing images taken of the same person from different angles can be collected. As a result, the analysis system 1 can secure a relatively large collection amount of first analysis data that can be used for the analysis by the analysis device 20, so that the analysis accuracy can be improved.
  • compared with a case where data for analysis is collected using, for example, a specific application, a fill-in questionnaire, or materials gathered by specific persons, the analysis system 1 is not restricted in the regions from which data can be collected, and can therefore improve the analysis accuracy.
  • the analysis system 1 can collect first analysis data in which relatively high anonymity is ensured, compared with, for example, a case where analysis data is collected directly from portable terminals owned by individuals. Thereby, the analysis system 1 can obtain analysis result data with relatively high anonymity from the viewpoint of personal information protection.
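One conceivable way such anonymity could be maintained, sketched below, is to keep only derived, non-identifying fields (attribute, position, time) and discard the raw image once attributes have been estimated. This illustrates the idea only; the patent does not specify an anonymization mechanism, and `estimate_attributes` is a hypothetical stand-in.

```python
def anonymize(record, estimate_attributes):
    """Return a record containing only derived, non-identifying fields.

    The raw image is dropped after attribute estimation, so downstream
    analysis result data carries no directly identifying content.
    Record layout and the estimator are illustrative assumptions.
    """
    attrs = estimate_attributes(record["image"])
    return {"attribute": attrs, "position": record["position"],
            "time": record["time"]}

out = anonymize(
    {"image": b"raw-frame-bytes", "position": (35.68, 139.76), "time": 5.0},
    estimate_attributes=lambda img: "male/30s",  # stand-in estimator
)
```

Only the derived record would be accumulated; the captured frame never leaves the preprocessing stage in this sketch.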
  • the analysis system 1 described above can secure as much data usable for specifying a person's attributes as possible, by extracting, from the many images captured from different angles by the recording apparatus 10 as the moving body V moves, images including feature points of the person that can be used for specifying the person's attributes. As a result, the analysis system 1 can further improve the analysis accuracy.
  • the analysis system 1 described above preferably includes a vehicle that repeatedly travels on a predetermined route as the moving body V on which the recording device 10 is mounted.
  • the analysis system 1 can efficiently collect large amounts of first analysis data for different time zones in the same area by means of the recording device 10 mounted on the moving body V that repeatedly travels along a predetermined route, and can improve the analysis accuracy in this respect as well.
  • since the analysis system 1 described above includes the storage unit 22 (analysis result DB 22C) that accumulates the analysis result data, the analysis result data can easily be used, for example, as big data.
  • the analysis system 1 described above estimates gender, age, physique, social status, preferences, behavioral orientation, and the like as the attributes of persons included in the image represented by the first image data, so the analysis result data can be used for various purposes such as trade area surveys, marketing, advertising, and disaster prevention / city planning.
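The person attribute data and attribute-specific person flow data could plausibly be modeled as records like the following. All field names are assumptions for illustration; the specification names the attribute categories but does not define a data schema.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class PersonAttributes:
    """One plausible shape for the person attribute data; fields mirror the
    attribute categories named in the text (gender, age, physique, social
    status, preference, behavioral orientation)."""
    gender: str
    age_group: str
    physique: str = "unknown"
    social_status: str = "unknown"
    preference: str = "unknown"
    behavior_orientation: str = "unknown"

@dataclass
class AttributeFlow:
    """Attribute-specific person flow data: where persons with a given set of
    attributes were observed over time."""
    attributes: PersonAttributes
    # Each observation: (time in seconds, latitude, longitude)
    track: List[Tuple[float, float, float]] = field(default_factory=list)

    def add_observation(self, t: float, lat: float, lon: float) -> None:
        self.track.append((t, lat, lon))

flow = AttributeFlow(PersonAttributes(gender="female", age_group="20s"))
flow.add_observation(0.0, 35.68, 139.76)
flow.add_observation(60.0, 35.69, 139.77)
```

A trade-area or marketing client would then query such records by attribute and by the positions along the track.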
  • the analysis system according to the second embodiment is different from the first embodiment in that the analysis is performed based on the second analysis data.
  • the same components as those in the above-described embodiment are denoted by common reference numerals, and redundant description of common configurations, operations, and effects is omitted as much as possible (the same applies hereinafter).
  • the analysis system 201 according to the present embodiment shown in FIG. 4 is different from the above-described analysis system 1 in that it includes a recording device 210 as a moving body analysis data collection device instead of the recording device 10.
  • Other configurations of the analysis system 201 are substantially the same as those of the analysis system 1.
  • the recording apparatus 210 is different from the above-described recording apparatus 10 in that it includes an internal camera 215 in addition to the external camera 11, the position information measuring device 12, the data input / output unit 13, and the control unit 14.
  • Other configurations of the recording apparatus 210 are substantially the same as those of the recording apparatus 10.
  • the recording device 210 of the present embodiment collects second analysis data in addition to the first analysis data as analysis data used for analysis by the analysis device 20.
  • the second analysis data is data including second image data representing an image inside the moving body V and second position data representing a position where the image inside the moving body V is captured.
  • the recording device 210 collects the second image data and the second position data as the second analysis data.
  • the second image data is collected by the internal camera 215 described above.
  • the internal camera 215 is an internal imaging device that captures an image inside the moving body V, that is, in a vehicle.
  • the internal camera 215 captures an image inside the moving body V and collects second image data representing an image inside the moving body V.
  • the internal camera 215 typically captures a moving image inside the moving object V.
  • the internal camera 215 is installed in the moving body V so as to have an angle of view capable of capturing an image of a person to be analyzed by the analysis system 201, in this case, a passenger in the moving body V.
  • a plurality of internal cameras 215 may be provided on the ceiling or the like inside the moving body V so that a person inside the moving body V can be captured more suitably.
  • the internal camera 215 may be a monocular camera or a stereo camera.
  • the image captured by the internal camera 215 may be monochrome or color.
  • the control unit 14 is also communicably connected to the internal camera 215 and can exchange various signals and data with each other.
  • the internal camera 215 outputs the collected second image data to the control unit 14.
  • the position information measuring device 12 collects second position data representing the position where the image inside the moving object V is captured in addition to the first position data.
  • the position information measuring device 12 outputs the collected second position data to the control unit 14 in addition to the first position data.
  • the processing unit 14B controls the operation of the internal camera 215 and the position information measuring device 12, and executes processing for collecting the second analysis data including the second image data and the second position data. The storage unit 14A can store the second analysis data together with the time at which it was collected and the like. Like the first analysis data, the second analysis data includes time data indicating the time when the data was collected, and other data. In addition to the first analysis data, the data input / output unit 13 can also output the second analysis data to the analysis device 20, and the processing unit 14B executes processing for outputting the second analysis data to the analysis device 20 via the data input / output unit 13.
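The collection step performed here can be sketched as pairing each in-vehicle frame with the position and time at which it was captured, mirroring the second analysis data (second image data, second position data, and time data). The sensor interfaces below are hypothetical stand-ins, not the device's actual API.

```python
import time

def collect_second_analysis_data(internal_camera, position_sensor,
                                 clock=time.time):
    """Bundle one in-vehicle frame with the position and time of capture.

    `internal_camera` and `position_sensor` are assumed to be callables
    returning the latest frame and GNSS fix; the real interfaces of the
    internal camera 215 and position information measuring device 12 are
    not specified in the patent.
    """
    frame = internal_camera()      # second image data
    position = position_sensor()   # second position data
    return {"image": frame, "position": position, "time": clock()}

sample = collect_second_analysis_data(
    internal_camera=lambda: b"\x00\x01",       # stand-in frame bytes
    position_sensor=lambda: (35.68, 139.76),   # stand-in GNSS fix
    clock=lambda: 1234.0,                      # fixed clock for illustration
)
```

Bundling the three values at capture time is what later allows the analysis device to tie an observed passenger to a boarding or alighting position.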
  • the analysis device 20 of the present embodiment analyzes the second analysis data in addition to the first analysis data collected by the recording device 210, and provides analysis result data representing the analysis results to the client terminal CL.
  • the data input / output unit 21 can input the second analysis data from the recording device 210 in addition to the first analysis data.
  • the analysis target DB 22A also accumulates second analysis data (second image data, second position data, time data, etc.) that is analysis target data by the processing unit 23 and creates a database.
  • the second analysis data input from the recording device 210 to the data input / output unit 21 is stored in the analysis target DB 22A.
  • the analysis reference data stored in the analysis reference DB 22B is also referred to when the processing unit 23 analyzes the second analysis data.
  • the data pre-processing unit 23A performs various preprocessing on the second analysis data, which is analysis target data, in the same manner as on the first analysis data. The data analysis processing unit 23B then specifies, based on the second analysis data preprocessed by the data preprocessing unit 23A, the attributes of the persons included in the image represented by the second image data, and analyzes the human flow of the persons whose attributes have been specified. The data analysis processing unit 23B analyzes the second analysis data in the same manner as the first analysis data described above, and generates, as analysis result data, person attribute data representing the attributes of the persons included in the image represented by the second image data and attribute-specific person flow data representing the human flow of the persons whose attributes have been specified.
  • the analysis result DB 22C also accumulates and stores the analysis result data representing the analysis result of the second analysis data by the data analysis processing unit 23B.
  • the analysis result data includes person attribute data representing the attribute of a person included in the image represented by the second image data, attribute-specific person flow data representing the person flow of the person whose attribute is specified, and the like.
  • the data processing unit 23C processes the person attribute data and attribute-specific person flow data included in the analysis result data into a desired format. Then, the processing unit 23 outputs and provides the analysis result data processed into a desired format by the data processing unit 23C to the client terminal CL via the data input / output unit 21.
  • like the analysis system 1, the analysis system 201 described above can secure a relatively large collection amount of first analysis data that can be used for analysis by the analysis device 20, and can thereby improve the analysis accuracy.
  • the analysis system 201 described above can also collect, by the recording device 210, second analysis data including second image data representing an image inside the moving body V and second position data corresponding thereto, and can use it for analysis by the analysis device 20.
  • the analysis system 201 can obtain analysis result data based not only on the persons on the road on which the moving body V travels but also on the attributes and human flow of the persons (passengers) inside the moving body V, and can thereby further improve the analysis accuracy.
  • the analysis system 201 can use the analysis result data, for example, for purposes such as analyzing bus and taxi passenger attributes, boarding / alighting positions, and human flow before and after boarding / alighting, and improving the efficiency of bus / taxi operation and allocation.
  • the analysis system according to the third embodiment is different from the second embodiment in that the analysis is performed based on the third analysis data.
  • the analysis system 301 differs from the analysis system 201 described above in that it includes, in addition to the recording device 210 mounted on the moving body V, a recording device 310 as a fixed-body analysis data collection device mounted on a fixed body F. Other configurations of the analysis system 301 are substantially the same as those of the analysis system 201.
  • the recording apparatus 310 of the present embodiment is mounted on a fixed body F installed on the road, and thus differs from the above-described recording apparatus 10 (see FIG. 1) and recording apparatus 210 in its mounting target. Since the configuration of the recording device 310 is substantially the same as that of the recording device 10 described above, its description is omitted; in the following, the configuration of the recording apparatus 10 in FIG. 1 is referred to as appropriate for the configuration of the recording apparatus 310.
  • the fixed body F on which the recording apparatus 310 of the present embodiment is mounted is installed and fixed on the road.
  • the fixed body F may be, for example, a bus stop, a digital signage, a vending machine, a sign, a utility pole, or the like, or may be a dedicated structure provided for installing the recording device 310.
  • the recording device 310 of this embodiment further collects third analysis data as analysis data used for analysis by the analysis device 20.
  • the third analysis data is data including third image data representing an image outside the fixed body F, captured while the recording device is mounted on the fixed body F, and third position data representing the position where the image outside the fixed body F was captured.
  • the recording device 310 collects the third image data and the third position data as third analysis data.
  • the third image data is collected by the external camera 11 (see FIG. 1) described above.
  • the external camera 11 is installed on the fixed body F so as to have an angle of view capable of capturing an image of a person to be analyzed by the analysis system 301, here a person on the road outside the fixed body F.
  • a plurality of external cameras 11 may be provided on the fixed body F so that a person on the road outside the fixed body F can be imaged more suitably.
  • the external camera 11 outputs the collected third image data to the control unit 14.
  • the position information measuring device 12 collects third position data representing the position where an image outside the fixed body F is captured.
  • the position information measuring device 12 outputs the collected third position data to the control unit 14.
  • the processing unit 14B controls the operations of the external camera 11 and the position information measuring device 12, and executes a process of collecting third analysis data including third image data and third position data.
  • the storage unit 14A can store the third analysis data together with the collected time and the like.
  • the third analysis data includes time data representing the time at which the data is collected and other data, like the first analysis data.
  • the data input / output unit 13 can output the third analysis data to the analysis device 20.
  • the processing unit 14B executes processing for outputting the third analysis data to the analysis device 20 via the data input / output unit 13.
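Since the first, second, and third analysis data share the same shape (image data plus the position and time of capture), a single record type tagged with its source is one natural way to hold all three side by side in the analysis target DB. The field and tag names below are illustrative assumptions, not part of the specification.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AnalysisRecord:
    """Common shape of the first/second/third analysis data: an image plus
    the position and time of capture, tagged with its source.

    Source tags are hypothetical: 'mobile_external' (first data, external
    camera 11 on moving body V), 'mobile_internal' (second data, internal
    camera 215), 'fixed' (third data, recording device 310 on fixed body F).
    """
    source: str
    image: bytes
    position: tuple
    time: float

records = [
    AnalysisRecord("mobile_external", b"frame-a", (35.68, 139.76), 10.0),
    AnalysisRecord("mobile_internal", b"frame-b", (35.68, 139.76), 10.0),
    AnalysisRecord("fixed", b"frame-c", (35.70, 139.70), 10.0),
]

# The analysis device can then select, say, only fixed-body observations.
fixed_only = [r for r in records if r.source == "fixed"]
```

Keeping one shape for all three sources is what lets the third analysis data simply be added to the pool used for analyzing persons on the road, as described below.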
  • in addition to the first analysis data and the second analysis data, the analysis device 20 of the present embodiment also analyzes the third analysis data collected by the recording device 310, and provides analysis result data representing the analysis result to the client terminal CL.
  • the data input / output unit 21 can input the third analysis data from the recording device 310 in addition to the first analysis data and the second analysis data.
  • the analysis target DB 22A also accumulates and stores in a database the third analysis data (third image data, third position data, time data, etc.), which is analysis target data for the processing unit 23.
  • the third analysis data input from the recording device 310 to the data input / output unit 21 is stored in the analysis target DB 22A.
  • the analysis reference data stored in the analysis reference DB 22B is also referred to by the processing unit 23 when analyzing the third analysis data.
  • the data pre-processing unit 23A performs various preprocessing on the third analysis data, which is analysis target data, in the same manner as on the first analysis data and the second analysis data. The data analysis processing unit 23B then specifies, based on the third analysis data preprocessed by the data preprocessing unit 23A, the attributes of the persons included in the image represented by the third image data, and analyzes the human flow of the persons whose attributes have been specified. The data analysis processing unit 23B analyzes the third analysis data in the same manner as the first analysis data and the second analysis data described above, and generates, as analysis result data, person attribute data representing the attributes of the persons included in the image represented by the third image data and attribute-specific person flow data representing the human flow of the persons whose attributes have been specified.
  • the analysis result DB 22C also stores analysis result data representing the analysis result of the third analysis data by the data analysis processing unit 23B.
  • the analysis result data includes person attribute data representing the attribute of a person included in the image represented by the third image data, attribute-specific person flow data representing the person flow of the person whose attribute is specified, and the like.
  • the data processing unit 23C processes the person attribute data and attribute-specific person flow data included in the analysis result data into a desired format. Then, the processing unit 23 outputs and provides the analysis result data processed into the desired format by the data processing unit 23C to the client terminal CL via the data input / output unit 21.
  • like the analysis systems 1 and 201, the analysis system 301 described above can secure a relatively large collection amount of first analysis data that can be used for analysis by the analysis apparatus 20, and the analysis accuracy can thereby be improved.
  • like the analysis system 201, the analysis system 301 described above can obtain analysis result data based not only on the persons on the road on which the moving body V travels but also on the attributes and human flow of the persons (passengers) inside the moving body V, and can thereby further improve the analysis accuracy.
  • the analysis system 301 described above can also collect, by the recording device 310, third analysis data including third image data representing an image outside the fixed body F and third position data corresponding thereto, and can use it for analysis by the analysis device 20. As a result, the analysis system 301 can add the third analysis data collected by the recording device 310 mounted on the fixed body F to the analysis data used for analyzing persons on the road, and can therefore further improve the analysis accuracy.
  • the analysis system according to the present invention is not limited to the above-described embodiments, and various modifications are possible within the scope described in the claims.
  • the analysis system according to the present embodiment may be configured by appropriately combining the components of the embodiments and modified examples described above.
  • the analysis systems 1, 201, and 301 described above may also use, as analysis data for the analysis, data collected using a specific application, fill-in questionnaires, materials gathered by specific persons, and the like.
  • the control unit 14 and the analysis device 20 described above may each be configured by connecting separately configured units so that they can exchange various electric signals with one another, and each function may be realized by a separate control device.
  • the programs, applications, various data, and the like described above may be updated as appropriate, and may be stored in a server connected to the analysis systems 1, 201, and 301 via an arbitrary network.
  • the programs, applications, various data, and the like described above can be downloaded, for example, in whole or in part as necessary.
  • the processing functions provided in the control unit 14 and the analysis device 20 may be realized, in whole or in any part, by, for example, a CPU or the like and a program interpreted and executed by the CPU or the like, or may be realized as hardware such as wired logic.

PCT/JP2019/002171 2018-04-05 2019-01-24 Analysis system WO2019193817A1 (ja)

Priority Applications (3)

Application Number Priority Date Filing Date Title
SG11202009797UA SG11202009797UA (en) 2018-04-05 2019-01-24 Analysis system
CN201980024084.3A CN111937026A (zh) 2018-04-05 2019-01-24 Analysis system
TW108107249A TW201944325A (zh) 2018-04-05 2019-03-05 Analysis system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-072889 2018-04-05
JP2018072889A JP2019185237A (ja) 2018-04-05 2018-04-05 Analysis system

Publications (1)

Publication Number Publication Date
WO2019193817A1 true WO2019193817A1 (ja) 2019-10-10

Family

ID=68100429

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/002171 WO2019193817A1 (ja) 2018-04-05 2019-01-24 Analysis system

Country Status (5)

Country Link
JP (1) JP2019185237A (zh)
CN (1) CN111937026A (zh)
SG (1) SG11202009797UA (zh)
TW (1) TW201944325A (zh)
WO (1) WO2019193817A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7272244B2 (ja) * 2019-11-22 2023-05-12 Toyota Motor Corporation Image data distribution system
JP7264028B2 (ja) * 2019-12-05 2023-04-25 Toyota Motor Corporation Information providing system, information providing method, information terminal, and information display method

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013045238A * 2011-08-23 2013-03-04 Denso Corp Map display system, server, and terminal
JP2016062414A * 2014-09-19 2016-04-25 Clarion Co., Ltd. In-vehicle monitoring device and in-vehicle monitoring system
JP2017138861A * 2016-02-04 2017-08-10 SoftBank Corp. Road traffic survey system
WO2017159060A1 * 2016-03-18 2017-09-21 NEC Corporation Information processing device, control method, and program
JP2018022343A * 2016-08-03 2018-02-08 Toshiba Corporation Image processing device and image processing method

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101464882B1 * 2013-06-13 2014-12-04 Ewha University-Industry Collaboration Foundation Method and system for real-time image transmission using a robocar
JP6139364B2 * 2013-10-02 2017-05-31 Toshiba Corporation Person identification device, person identification method, and program
JP5856702B1 * 2015-01-29 2016-02-10 Panasonic Intellectual Property Management Co., Ltd. Wearable camera system and attribute information assignment method

Also Published As

Publication number Publication date
CN111937026A (zh) 2020-11-13
JP2019185237A (ja) 2019-10-24
SG11202009797UA (en) 2020-11-27
TW201944325A (zh) 2019-11-16


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19781127

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19781127

Country of ref document: EP

Kind code of ref document: A1