WO2020195508A1 - Analysis system

Analysis system

Info

Publication number
WO2020195508A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
analysis
specific point
image
processing unit
Prior art date
Application number
PCT/JP2020/007954
Other languages
French (fr)
Japanese (ja)
Inventor
綾乃 河江
Original Assignee
矢崎エナジーシステム株式会社
Priority date
Filing date
Publication date
Application filed by 矢崎エナジーシステム株式会社
Publication of WO2020195508A1

Classifications

    • G01B 11/00 — Measuring arrangements characterised by the use of optical techniques
    • G06Q 30/02 — Marketing; Price estimation or determination; Fundraising
    • G06T 7/00 — Image analysis
    • G06T 7/60 — Analysis of geometric attributes

Definitions

  • the present invention relates to an analysis system.
  • Patent Document 1 discloses an information display system including a traffic information acquisition unit, an information output unit, and a display unit.
  • the traffic information acquisition unit acquires traffic information related to the passage of a person.
  • the information output unit selectively outputs information based on the traffic information acquired by the traffic information acquisition unit.
  • the display unit displays the information selectively output by the information output unit at a place where a person corresponding to the traffic information is passing.
  • according to this information display system, for example, by displaying information according to the traffic volume of people based on the traffic information acquired by the traffic information acquisition unit, it is possible to charge according to the display effect of the information.
  • indexes showing the tendency of the flow of people at an arbitrary point or region include, for example, the traffic volume of people and various indexes calculated based on that traffic volume.
  • the analysis system may analyze the tendency of the flow of people along a movement path on which a moving body such as a bus moves, and in such a case it is desired that the tendency of the flow of people can be analyzed appropriately.
  • the present invention has been made in view of the above circumstances, and an object of the present invention is to provide an analysis system capable of appropriately analyzing the tendency of a person's flow.
  • the analysis system is characterized by including a data collection device that is mounted on a moving body and collects analysis data including image data representing images outside the moving body captured as the moving body moves and position data representing the positions at which the images outside the moving body were captured, and a data analysis device that counts the number of people passing a specific point in the movement path of the moving body based on the analysis data collected by the data collection device.
  • the data analysis device can extract image data including an image of the specific point from the analysis data based on the position data, and can count and aggregate the number of people included in the images represented by the extracted image data to count the number of people passing the specific point.
  • the data analysis device can regard the number of people included in one still image constituting the image data as a virtual-period estimated number of passersby counted in a preset virtual counting period, and can count the number of people passing the specific point based on this virtual-period estimated number of passersby.
  • the data analysis device can count the number of people passing each of a plurality of specific points based on the analysis data, and can aggregate the numbers of people passing the plurality of specific points to count the number of people passing along the movement route.
  • the data collection device can be mounted on each of a plurality of moving bodies, and the data analysis device can count the number of people passing the specific point based on the analysis data collected by the plurality of data collection devices.
  • the data analysis device can analyze the attributes of a person included in the image represented by the image data at each specific point based on the analysis data.
  • the data analysis device can calculate an index representing the number of passers-by who have passed the content acceptable range at the specific point based on the number of passersby at the specific point.
  • the analysis system can collect image data and analysis data including position data by a data collecting device mounted on a moving body.
  • the image data collected as the analysis data is data representing an image outside the moving body, which is captured as the moving body moves.
  • the data analysis device can count the number of people passing by at a specific point in the movement path of the moving body based on the analysis data collected by the data collection device.
  • this analysis system has the effect of being able to properly analyze the tendency of the flow of people.
  • FIG. 1 is a block diagram showing a schematic configuration of an analysis system according to an embodiment.
  • FIG. 2 is a schematic diagram showing a movement path of a moving body of the analysis system according to the embodiment and an example of a specific point.
  • FIG. 3 is a schematic diagram showing an example of collecting image data related to a specific point by the recording device of the analysis system according to the embodiment.
  • FIG. 4 is a schematic diagram showing an example of a still image for explaining a virtual counting period in the analysis system according to the embodiment.
  • FIG. 5 is a schematic diagram showing an example of a still image for explaining a virtual counting period in the analysis system according to the embodiment.
  • FIG. 6 is a schematic diagram showing an example for explaining the estimated number of people passing through the virtual period in the analysis system according to the embodiment.
  • FIG. 7 is a schematic diagram showing an example for explaining the number of people passing by in the target time zone in the analysis system according to the embodiment.
  • FIG. 8 is a schematic diagram showing an example in which the number of passersby counted in the analysis system according to the embodiment is aggregated on a daily basis.
  • FIG. 9 is a schematic diagram showing an example for explaining the counting of the number of people passing by the movement route in the analysis system according to the embodiment.
  • FIG. 10 is a schematic diagram showing an example of analysis result data analyzed and processed in the analysis system according to the embodiment.
  • FIG. 11 is a flowchart showing an example of processing in the analysis system according to the embodiment.
  • the analysis system 1 of the present embodiment shown in FIG. 1 includes a recording device 10 as a data collection device and an analysis device 20 as a data analysis device, and is a system that provides the analysis result data analyzed by the analysis device 20 to the client terminal CL.
  • the analysis system 1 of the present embodiment utilizes the recording device 10 mounted on the moving body V and analyzes the tendency of the flow of people based on the image data and the like collected by the recording device 10. More specifically, as illustrated in FIG. 2, the analysis system 1 of the present embodiment counts the number of people passing a specific point P in the movement path R of the moving body V based on the analysis data collected by the recording device 10, thereby realizing a configuration in which the tendency of the flow of people at a specific point P or along a movement path R can be appropriately analyzed.
  • the configuration of the analysis system 1 will be described in detail with reference to each figure.
  • the movement route R is a route on which the moving body V moves, and corresponds to, for example, a route such as a bus described later.
  • the specific point P is a specific point on the movement route R, and is arbitrarily set as a desired point for counting the number of people passing by.
  • the specific point P may include, for example, a characteristic point on the movement route R such as a bus stop or an intersection, a point where an output device D or the like capable of outputting content is installed, a point where content from the output device D can be received, and the like.
  • the output device D is a device capable of outputting contents.
  • the output device D may constitute a so-called cloud-service type device that is mounted on a network and provides various contents via the network, or may constitute a so-called stand-alone type device that is separated from the network.
  • the output device D includes a display capable of displaying an image according to the content, a speaker capable of outputting sound/voice according to the content, and the like.
  • the content output by the output device D may include, for example, content such as advertisements and coupons, as well as content constituting various guidance information such as area information, route information to a predetermined facility, and evacuation route/safety support information in the event of a disaster.
  • the content data output by the output device D can be sequentially updated via a network, a recording medium, or the like.
  • the recording device 10 is mounted on the moving body V and collects analysis data used for analysis by the analysis device 20.
  • the analysis data collected by the recording device 10 is data including image data and position data.
  • the image data is data representing an image outside the moving body V, which is captured as the moving body V moves.
  • the position data is data representing the position where the image outside the moving body V is captured.
  • the recording device 10 collects image data and position data as analysis data.
  • the analysis data is used for analysis of the tendency of the flow of a person by the analysis device 20.
  • the moving body V on which the recording device 10 is mounted is, for example, a vehicle traveling on a road surface, such as a private car, a rental car, a sharing car, a ride-sharing car, a bus, a taxi, a truck, a transport vehicle, or a work vehicle.
  • the moving body V is not limited to a vehicle, and may be a flying body such as a flying car or a drone that flies in the air.
  • the moving body V typically moves on a predetermined movement route R, and is, for example, capable of moving on a plurality of predetermined routes as a predetermined movement route R.
  • the moving body V of the present embodiment will be described as a fixed-route bus that repeatedly travels on a plurality of predetermined routes (moving route R) during a day.
  • a moving body V such as a fixed-route bus may be operated such that one moving body V travels on a plurality of routes in one day, or such that a plurality of moving bodies V are operated on a plurality of routes, depending on the usage.
  • the recording device 10 of the present embodiment is mounted on each of the plurality of moving bodies V that move on the plurality of routes in this way. That is, the analysis system 1 of the present embodiment includes a plurality of recording devices 10 mounted on each of the plurality of moving bodies V, and can collect analysis data from the plurality of recording devices 10.
  • the recording device 10 includes an external camera 11, a position information measuring device 12, a data input / output unit 13, and a control unit 14.
  • as the recording device 10, for example, an in-vehicle device such as a so-called drive recorder mounted on the moving body V can be used, but the recording device 10 is not limited to this.
  • the external camera 11 is an external imaging device that captures an image of the outside of the moving body V.
  • the external camera 11 captures an image outside the moving body V as the moving body V moves, and collects image data representing an image outside the moving body V.
  • the external camera 11 typically captures a moving image outside the moving body V and collects image data representing the moving image.
  • a moving image is a time-series arrangement of still images of a plurality of frames.
  • the external camera 11 is installed on the moving body V so as to have an angle of view capable of capturing a person to be analyzed by the analysis system 1, here, a person located on a road outside the moving body V.
  • a plurality of external cameras 11 may be provided on the front portion, side portion, rear portion, roof portion, etc. of the moving body V.
  • the external camera 11 may be a monocular camera or a stereo camera. Further, the image captured by the external camera 11 may be monochrome or color.
  • the external camera 11 is communicatively connected to the control unit 14, and outputs the collected image data to the control unit 14.
  • the position information measuring device 12 is a positioning device that measures the current position of the moving body V.
  • as the position information measuring device 12, a GPS receiver or the like that receives radio waves transmitted from GPS (Global Positioning System) satellites can be used.
  • the position information measuring device 12 receives radio waves transmitted from GPS satellites and acquires GPS information (latitude/longitude coordinates) as information indicating the current position of the moving body V, thereby collecting position data representing the position where the image outside the moving body V was captured.
  • the position information measuring device 12 is communicably connected to the control unit 14, and outputs the collected position data to the control unit 14.
  • the data input / output unit 13 inputs / outputs various data between a device different from the recording device 10 and the recording device 10.
  • the data input / output unit 13 of the present embodiment can output analysis data to the analysis device 20 which is a device different from the recording device 10.
  • the data input/output unit 13 may be configured to input/output data to/from a device different from the recording device 10 by, for example, communication via a network (whether wired or wireless). Further, the data input/output unit 13 may have a slot unit and input/output data to/from a device different from the recording device 10 via a recording medium inserted into the slot unit, for example.
  • the recording medium is, for example, a memory (removable media) that can be attached to and detached from the recording device 10 via the slot portion.
  • as such a memory (removable media), various types of memory cards such as SD cards can be used, but the recording medium is not limited to these.
  • the control unit 14 comprehensively controls each unit of the recording device 10.
  • the control unit 14 executes various arithmetic processes for collecting analysis data.
  • the control unit 14 is configured to include an electronic circuit mainly composed of a well-known microcomputer including a central arithmetic processing unit such as a CPU (Central Processing Unit) or GPU (Graphics Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), and an interface.
  • the control unit 14 is communicably connected to each unit such as the external camera 11, the position information measuring device 12, and the data input / output unit 13, and can exchange various signals and data with each other.
  • control unit 14 includes a storage unit 14A and a processing unit 14B.
  • the storage unit 14A and the processing unit 14B can exchange various signals and data with each other.
  • the storage unit 14A stores conditions and information necessary for various processes in the processing unit 14B, various programs and applications executed by the control unit 14, control data, and the like.
  • the storage unit 14A can store the analysis data together with the collected time and the like.
  • the analysis data also includes time data and other data representing the time when the data was collected.
  • the storage unit 14A can also temporarily store various data generated in the process of processing by the processing unit 14B, for example. In the storage unit 14A, these data are read out as needed by the processing unit 14B, the data input / output unit 13, and the like.
  • the storage unit 14A may be, for example, a relatively large-capacity storage device such as a hard disk, SSD (Solid State Drive), or optical disk, or a rewritable semiconductor memory such as RAM, flash memory, or NVSRAM (Non Volatile Static Random Access Memory).
  • the processing unit 14B executes various programs stored in the storage unit 14A based on various input signals and the like, and, when the programs run, outputs output signals to each unit and executes processes for realizing various functions.
  • the processing unit 14B controls the operation of the external camera 11 and the position information measuring device 12, and executes a process of collecting analysis data including image data and position data. Further, the processing unit 14B executes a process related to data input / output via the data input / output unit 13.
  • the processing unit 14B executes, for example, a process of outputting analysis data to the analysis device 20 via the data input / output unit 13.
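  • as a rough illustration of the collection step described above (not part of the patent), the following Python sketch shows how a recording device might bundle each captured frame with the position and time at which it was captured; the record type and the capture/GPS arguments are hypothetical names introduced only for this example.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class AnalysisRecord:
    """One unit of analysis data: image data plus the associated position and time data."""
    frame_id: int
    image_path: str        # where the captured frame (or video chunk) is stored
    latitude: float        # GPS latitude where the image outside the moving body was captured
    longitude: float       # GPS longitude of the same capture
    captured_at: datetime  # time data associated with the capture

def collect_record(frame_id: int, image_path: str, gps_fix: tuple) -> AnalysisRecord:
    """Bundle one frame with the current GPS fix and a timestamp."""
    lat, lon = gps_fix
    return AnalysisRecord(frame_id, image_path, lat, lon, datetime.now(timezone.utc))

# Example: one record collected while the moving body drives past a point
record = collect_record(42, "frames/000042.jpg", (35.6581, 139.7414))
print(record)
```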
  • the analysis device 20 analyzes the analysis data collected by the recording device 10 and provides the analysis result data representing the analysis result to the client terminal CL.
  • the analysis device 20 and the client terminal CL may constitute so-called cloud-service type devices (cloud servers) mounted on a network, or may constitute so-called stand-alone type devices separated from the network.
  • the analysis device 20 of the present embodiment counts the number of people passing by the specific point P in the movement path R of the moving body V based on the analysis data collected by the recording device 10.
  • the analysis device 20 counts the number of people passing by the specific point P based on the analysis data collected by the plurality of recording devices 10.
  • the analysis device 20 of the present embodiment counts the number of people passing each of a plurality of specific points P based on the analysis data, and aggregates the numbers of people passing the plurality of specific points P to count the number of people passing along the movement route R of the moving body V, for example, a particular route.
  • the analysis device 20 of the present embodiment analyzes the attributes of the person included in the image represented by the image data for each specific point P or for each movement path R based on the analysis data. Then, the analysis device 20 of the present embodiment generates analysis result data based on the counting result of the number of passing people, the analysis result of the attribute of the person, and the like, and provides the analysis result data to the client terminal CL.
  • the analysis device 20 executes various arithmetic processes for counting various passers-by based on the analysis data. Further, the analysis device 20 executes various arithmetic processes for analyzing the attributes of a person based on the analysis data.
  • the analysis device 20 includes an electronic circuit mainly composed of a well-known microcomputer including a central arithmetic processing unit such as a CPU or GPU, a ROM, a RAM, and an interface.
  • the analysis device 20 can also be configured by installing an application that realizes various processes described below on a known computer system such as a PC or workstation. Further, the analysis device 20 may be configured by combining a plurality of PCs so as to be able to communicate with each other.
  • the analysis device 20 includes a data input / output unit 21, a storage unit 22, and a processing unit 23.
  • the data input / output unit 21, the storage unit 22, and the processing unit 23 can exchange various signals and data with each other.
  • the data input / output unit 21 inputs / outputs various data between a device different from the analysis device 20 and the analysis device 20.
  • the data input/output unit 21 of the present embodiment can input analysis data from the recording device 10, which is a device different from the analysis device 20. Further, the data input/output unit 21 of the present embodiment can output the analysis result data to the client terminal CL, which is a device different from the analysis device 20. Similar to the data input/output unit 13, the data input/output unit 21 may be configured to input/output data to/from a device different from the analysis device 20 by, for example, communication via a network (whether wired or wireless). Similarly, the data input/output unit 21 may have a slot unit and input/output data to/from a device different from the analysis device 20 via a recording medium inserted into the slot unit, for example.
  • the storage unit 22 stores conditions and information necessary for various processes in the processing unit 23, various programs and applications executed by the processing unit 23, control data, and the like.
  • the storage unit 22 can store the analysis data input by the data input / output unit 21.
  • the storage unit 22 can also temporarily store various data generated in the process of processing by the processing unit 23, for example. In the storage unit 22, these data are read out as needed by the data input / output unit 21, the processing unit 23, and the like.
  • the storage unit 22 may be, for example, a relatively large-capacity storage device such as a hard disk, SSD, or optical disk, or a semiconductor memory such as RAM, flash memory, or NVSRAM that can rewrite data.
  • the storage unit 22 functionally and conceptually includes an analysis target database (hereinafter abbreviated as “analysis target DB”) 22A, an analysis reference database (hereinafter abbreviated as “analysis reference DB”) 22B, and an analysis result database (hereinafter abbreviated as “analysis result DB”) 22C.
  • the analysis target DB 22A is a part that accumulates analysis data (image data, position data, time data, etc.) that is analysis target data by the processing unit 23, creates a database, and stores it.
  • the analysis data input from the recording device 10 to the data input / output unit 21 is stored in the analysis target DB 22A.
  • the analysis reference DB 22B is a part that accumulates the analysis reference data to be referred to when the processing unit 23 analyzes the analysis data, creates a database, and stores the data.
  • the analysis reference data includes, for example, map reference data, attribute prediction reference data, and the like.
  • the map reference data is data representing a map to be referred to when specifying the position of the moving body V based on the position data or the like, in other words, the position where the image outside the moving body V is captured.
  • the attribute prediction reference data is data to be referred to when estimating the attributes of a person included in the image represented by the image data. The attribute prediction reference data will be described in detail later.
  • the analysis reference data is referred to by the processing unit 23 when analyzing the analysis data.
  • the analysis result DB 22C is a part that accumulates the analysis result data representing the analysis result of the analysis data by the processing unit 23, creates a database, and stores it.
  • the analysis result data is data based on, for example, the counting result of the number of people passing a specific point P (specific point-specific counting data), the counting result of the number of people passing along a movement route R including a plurality of specific points P, for example, a specific route (specific route-specific counting data), the analysis result of the attributes of the counted persons (person attribute data), and the like.
  • the analysis result data is processed into a desired format by the processing unit 23, output from the data input / output unit 21 to the client terminal CL, and provided.
  • the various data stored in the analysis target DB 22A, the analysis reference DB 22B, and the analysis result DB 22C can be utilized as so-called big data.
  • the processing unit 23 executes various programs stored in the storage unit 22 based on various input signals and the like, and executes various processes for analyzing analysis data when the programs operate. In addition, the processing unit 23 executes a process of processing the analysis result data into a desired format. Further, the processing unit 23 executes a process related to data input / output via the data input / output unit 21. The processing unit 23 executes, for example, a process of outputting the analysis result data processed into a desired format to the client terminal CL via the data input / output unit 21.
  • processing unit 23 is functionally conceptually configured to include a data preprocessing unit 23A, a data analysis processing unit 23B, and a data processing processing unit 23C.
  • the data preprocessing unit 23A is a portion that performs various preprocessing on the analysis data that is the analysis target data.
  • the data preprocessing unit 23A reads, for example, the analysis data that is the analysis target data from the analysis target DB 22A, and cuts out a still image for each frame from the moving image represented by the image data of that analysis data.
  • the data preprocessing unit 23A also executes, for example, a process of associating each cut-out still image with the position represented by the position data of the analysis data and the time represented by the time data of the analysis data.
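  • as a minimal sketch of this preprocessing (an assumption about how it could be done, using OpenCV for video decoding), frames can be cut from the recorded moving image at a fixed stride and kept alongside their frame index for later association with the position and time data:

```python
import cv2  # OpenCV, assumed available for decoding the recorded moving image

def cut_still_images(video_path: str, every_n_frames: int = 30):
    """Cut still images out of a moving image, keeping one frame every
    `every_n_frames` frames together with its frame index."""
    capture = cv2.VideoCapture(video_path)
    stills = []
    index = 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        if index % every_n_frames == 0:
            stills.append((index, frame))
        index += 1
    capture.release()
    return stills

# Each (index, frame) pair can then be associated with the position/time data
# recorded at approximately the same moment, as described above.
```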
  • the data analysis processing unit 23B is a part that counts the number of passing people based on the analysis data preprocessed by the data preprocessing unit 23A.
  • the data analysis processing unit 23B counts the number of people passing by at a specific point P arbitrarily set based on the image data included in the analysis data.
  • the data analysis processing unit 23B counts the number of people passing by based on the number of people included in the image represented by the image data.
  • the data analysis processing unit 23B executes a process of detecting and extracting a person from a still image cut out by the data preprocessing unit 23A based on the image data by using various known image processing techniques. Then, the data analysis processing unit 23B counts the number of detected and extracted persons, and counts the number of people passing by at the specific point P based on the counted number of persons.
  • the data analysis processing unit 23B extracts image data including an image of the specific point P, for example, as shown in FIG. 3, based on the position data of the analysis data preprocessed by the data preprocessing unit 23A and the map reference data (analysis reference data) stored in the analysis reference DB 22B. More specifically, the data analysis processing unit 23B extracts image data of moving images capturing a region within a predetermined range including the specific point P from the plurality of analysis data stored in the analysis target DB 22A. At this time, the data analysis processing unit 23B also reads the time data associated with the extracted image data from the analysis target DB 22A and specifies the times at which the extracted image data were collected.
  • the image data extracted here is data representing a moving image obtained by capturing a region in a predetermined range including a specific point P in a specific period (for example, a period from time A to time B).
  • the moving image represented by the image data is composed of a plurality of frames of still images obtained by capturing a region in a predetermined range including a specific point P.
  • the data analysis processing unit 23B of the present embodiment is an image of a moving image including a specific point P as described above from a plurality of analysis data collected by a plurality of recording devices 10 mounted on each of the plurality of moving bodies V. Extract all data. Further, the data analysis processing unit 23B extracts all the image data as described above, including the analysis data collected at different dates and times. Then, the data analysis processing unit 23B uses various known image processing techniques as described above to count and aggregate the number of persons included in the still image cut out from the moving image represented by the extracted image data. The number of people passing by the specific point P is counted.
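  • a minimal sketch of this extraction step (my own illustration, not the patent's implementation) filters analysis records by the great-circle distance between their capture position and the specific point P; the 50 m radius is an arbitrary assumption standing in for the "predetermined range":

```python
import math

def distance_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Approximate great-circle (haversine) distance in metres between two GPS fixes."""
    r = 6_371_000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def extract_records_near_point(records, point_lat: float, point_lon: float, radius_m: float = 50.0):
    """Keep only records (e.g. the AnalysisRecord sketch above) whose capture
    position lies within `radius_m` of the specific point P."""
    return [rec for rec in records
            if distance_m(rec.latitude, rec.longitude, point_lat, point_lon) <= radius_m]
```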
  • the person included in the image represented by the image data corresponds to a passerby at the specific point P, and includes a pedestrian, a person riding a bicycle, and the like.
  • the data analysis processing unit 23B of the present embodiment executes the following processing on the plurality of still images constituting the moving image of the extracted image data, for example, in order to avoid counting the same person twice. That is, the data analysis processing unit 23B counts the number of people included in one still image constituting the image data as the virtual-period estimated number of passersby n counted in the preset virtual counting period t. Then, the data analysis processing unit 23B counts the number of people passing the specific point P based on this virtual-period estimated number of passersby n.
  • the preset virtual counting period t is typically a value that defines, if the number of people were actually measured at the specific point P by a staff member using a counter or the like, over how many seconds the number of people included in one still image would be counted as passing. Further, as illustrated in FIG. 4, the virtual counting period t corresponds to the required period needed to count, by fixed-point measurement at the specific point P, the same number of people as are included in one still image, assuming that those people are moving at a passerby speed v assumed in advance.
  • this required period can be regarded as approximately equal to the period required for, for example, the person appearing at the far end of the still image to reach the specific point P when moving toward it at the passerby speed v assumed in advance. That is, the virtual counting period t can be approximated by the period required for the person appearing at the far end of the still image, among the persons included in one still image, to reach the specific point P when moving toward the specific point P at the passerby speed v assumed in advance.
  • the virtual counting period t is typically determined according to the installation position of the external camera 11 for collecting image data, the camera angle of view, the assumed passerby speed v, etc., regardless of the moving speed of the moving body V.
  • the range covered by one still image constituting the image data is geometrically determined in real space according to the installation position of the external camera 11 on the moving body V, the camera angle of view, and the like, so the distance L from the position in real space corresponding to the far end of the still image to the specific point P can also be calculated geometrically. For example, in the still image illustrated in FIG. 4, the distance L is assumed to be 10 [m], and the passerby speed v is assumed to be about 5 [km/h], which is the average walking speed of an adult.
  • in this case, the person who appears at the far end of the still image reaches the specific point P in about 7.2 [sec] when moving toward the specific point P at the passerby speed v assumed in advance, so the virtual counting period t corresponds to 7.2 [sec], as checked in the sketch below.
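  • the relation t = L / v described above can be verified with a short sketch (illustrative only; the function name is not from the patent):

```python
def virtual_counting_period_s(distance_to_point_m: float, passerby_speed_kmh: float) -> float:
    """Virtual counting period t: the time a person at the far end of the still image
    needs to reach the specific point P at the assumed passerby speed (t = L / v)."""
    speed_m_per_s = passerby_speed_kmh * 1000.0 / 3600.0
    return distance_to_point_m / speed_m_per_s

# Values from the description: L = 10 m, v = 5 km/h (average adult walking speed)
t = virtual_counting_period_s(10.0, 5.0)
print(f"virtual counting period t = {t:.1f} s")  # 7.2 s
```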
  • the data analysis processing unit 23B of the present embodiment counts the number of people included in one still image constituting the image data, treating that number as the virtual-period estimated number of passersby n.
  • the virtual period estimated number of people n is the number of people who are virtually counted in the virtual counting period t preset as described above.
  • the data analysis processing unit 23B can obtain one virtual period estimated number of passersby n from a moving image constituting one image data including a specific point P.
  • the moving image represented by the image data is composed of a plurality of frames of still images obtained by capturing a region in a predetermined range including a specific point P. Therefore, the data analysis processing unit 23B processes as follows in order to obtain one virtual period estimated number of passersby n from the moving image constituting one image data. That is, in this case, the data analysis processing unit 23B extracts an arbitrary one from the still images of a plurality of frames constituting one moving image, for example, as illustrated in FIG. The data analysis processing unit 23B extracts, for example, a still image at a time substantially in the center from a still image of a plurality of frames.
  • the data analysis processing unit 23B obtains one virtual period estimated number of passersby n by counting the virtual period estimated number of passersby n from the extracted still image.
  • alternatively, the data analysis processing unit 23B can also obtain one virtual-period estimated number of passersby n by, for example, counting the number of people in every still image of the plurality of frames constituting the moving image and then calculating the average value of those counts.
  • the data analysis processing unit 23B processes one virtual-period estimated number of passersby n obtained from one piece of image data including the specific point P, as described above, as the number of passersby counted at the specific point P at, for example, a time substantially in the center of the period in which the image data was captured. For example, if the virtual-period estimated number of passersby n is obtained from image data captured during [9:00 to 9:02], the data analysis processing unit 23B processes that virtual-period estimated number of passersby n as the number of passersby counted at the specific point P at [9:01].
  • for example, as illustrated in FIG. 6, the data analysis processing unit 23B counts the number of people passing the specific point P by assuming that, at [9:01], 15 [persons] passed the specific point P during 7.2 [sec].
  • although it has been explained that the data analysis processing unit 23B processes the virtual-period estimated number of passersby n as the number of passersby counted at the specific point P at a time substantially in the center of the period in which the image data was captured, the processing is not limited to this. As long as the time is within the period, the data analysis processing unit 23B may, for example, process the virtual-period estimated number of passersby n as the number of passersby counted at the specific point P at the start time of the period, or as the number counted at the end time of the period.
  • the virtual period estimated number of people n as described above is typically obtained once each time the moving body V equipped with the recording device 10 passes the specific point P once.
  • the data analysis processing unit 23B calculates the virtual-period estimated number of passersby n for each specific point P based on the plurality of analysis data collected by the plurality of recording devices 10 mounted on the respective moving bodies V, and thereby acquires a plurality of virtual-period estimated numbers of passersby n at each specific point P.
  • the data analysis processing unit 23B calculates the number of passersby N at the specific point P in the target time zone for which the count is desired, based on the virtual-period estimated numbers of passersby n at the respective times counted as described above.
  • the data analysis processing unit 23B sets the unit counting period T in advance, and calculates the number of people N passing by the specific point P in the unit counting period T of the target time zone.
  • the unit counting period T is the minimum counting period used as a reference for counting the number of people passing the specific point P, and is arbitrarily set as a delimiting unit period for counting the number of passersby.
  • the data analysis processing unit 23B calculates the average value n_ave of all the virtual-period estimated numbers of passersby n obtained in the unit counting period T of the target time zone. Then, the data analysis processing unit 23B can calculate the number of passersby N at the specific point P in the unit counting period T of the target time zone by multiplying the average value n_ave by the value obtained by dividing the unit counting period T by the virtual counting period t. That is, with the unit counting period "T", the virtual counting period "t", and the average value "n_ave" of all the virtual-period estimated numbers of passersby n obtained in the unit counting period T of the target time zone, the number of passersby N is given by:
  • N = n_ave × (T / t) ... (1)
  • the case where n_ave = 10 [persons] is illustrated.
  • the plurality of symbols / bar graphs 100 in FIG. 7 represent the virtual period estimated number of passers-by n obtained in the unit counting period T of the target time zone, respectively.
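  • equation (1) can be sketched as follows; t = 7.2 s and n_ave = 10 follow the worked figures above, while the one-hour unit counting period T and the individual sample values of n are illustrative assumptions:

```python
def passersby_in_unit_period(n_values, unit_period_s: float, virtual_period_s: float) -> float:
    """Number of passersby N at the specific point P in the unit counting period T:
        N = n_ave * (T / t)   ... equation (1)
    where n_values are the virtual-period estimated numbers of passersby n
    obtained in that unit counting period."""
    n_ave = sum(n_values) / len(n_values)
    return n_ave * (unit_period_s / virtual_period_s)

# Example: three estimates n averaging 10 persons per t = 7.2 s, with T = 1 hour
N = passersby_in_unit_period([9, 10, 11], unit_period_s=3600.0, virtual_period_s=7.2)
print(round(N))  # 10 * (3600 / 7.2) = 5000 passersby in that hour
```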
  • the data analysis processing unit 23B calculates the number of people N passing by the specific point P in the unit counting period T of each target time zone, and aggregates them as a predetermined unit, for example, a daily unit.
  • FIG. 8 shows an example in which the number of people N passing by the specific point P in the unit counting period T of each target time zone is aggregated on a daily basis.
  • the plurality of symbols/bar graphs 200 in FIG. 8 represent the number of passersby N at the specific point P in the unit counting period T of each target time zone, and the symbols 300 represent the virtual-period estimated numbers of passersby n obtained in the unit counting period T of each target time zone.
  • the accuracy of the number of passersby N at a specific point P in the unit counting period T of each target time zone tends to improve as the number of virtual-period estimated numbers of passersby n obtained in that unit counting period T increases. Conversely, if a large number of virtual-period estimated numbers of passersby n can be obtained evenly in each time zone, the number of passersby N can maintain relatively high accuracy even if the unit counting period T is relatively short, which enables a detailed analysis of the number of passersby over shorter periods.
  • when aggregating the number of passersby N, the data analysis processing unit 23B is not limited to the daily unit described above, and can aggregate in any desired unit such as a weekly unit, a monthly unit, or a day-of-the-week unit.
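  • a minimal sketch of this aggregation (plain Python; the grouping keys and sample figures are illustrative choices) over (timestamp, N) pairs of unit-period counts:

```python
from collections import defaultdict
from datetime import datetime

def aggregate_passersby(unit_counts, unit: str = "day"):
    """Aggregate (timestamp, N) pairs of unit-period passerby counts into
    daily / weekly / monthly / day-of-week totals."""
    keyers = {
        "day": lambda ts: ts.date().isoformat(),
        "week": lambda ts: f"{ts.isocalendar()[0]}-W{ts.isocalendar()[1]:02d}",
        "month": lambda ts: ts.strftime("%Y-%m"),
        "weekday": lambda ts: ts.strftime("%A"),
    }
    key = keyers[unit]
    totals = defaultdict(float)
    for ts, n in unit_counts:
        totals[key(ts)] += n
    return dict(totals)

# Example: two unit periods on one day and one on the next
counts = [(datetime(2020, 3, 2, 9), 5000), (datetime(2020, 3, 2, 10), 4200),
          (datetime(2020, 3, 3, 9), 4800)]
print(aggregate_passersby(counts, "day"))  # {'2020-03-02': 9200.0, '2020-03-03': 4800.0}
```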
  • the data analysis processing unit 23B of the present embodiment performs the process of counting the number of people passing by the specific point P as described above for each of the plurality of set specific points P. Then, the data analysis processing unit 23B generates specific point-specific counting data representing the counting result of the number of passing persons at each specific point P counted as described above as the analysis result data obtained by analyzing the analysis data.
  • the specific point-specific counting data is data that includes the number of passersby N at each specific point P in the unit counting period T of each target time zone calculated as described above, together with various information calculated in the process of calculating it. The data analysis processing unit 23B then accumulates the analysis result data including the generated specific point-specific counting data in the analysis result DB 22C and stores it as a database.
  • the data analysis processing unit 23B of the present embodiment further counts the number of people passing each of the plurality of specific points P, and aggregates the numbers of people passing the plurality of specific points P to count the number of people passing along the movement route R.
  • the data analysis processing unit 23B aggregates the numbers of passersby at the specific points P included in the movement route R (for example, a specific route) to be counted, thereby counting the number of people passing along that movement route R.
  • the data analysis processing unit 23B generates, as the analysis result data obtained by analyzing the analysis data, the counting data for each specific route representing the counting result of the number of people passing by each moving route R counted as described above.
  • the specific route-specific counting data is data including various information calculated in the process of calculating the number of people passing by and the number of people N passing by each specific point P in the unit counting period T of each target time zone. Then, the data analysis processing unit 23B stores the analysis result data including the generated count data for each specific route in the analysis result DB 22C, creates a database, and stores the data.
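  • a minimal sketch of the route-level aggregation (the point and route names are placeholders): the count for a movement route R is simply the sum of the counts of the specific points P it includes:

```python
def passersby_per_route(point_counts: dict, route_points: dict) -> dict:
    """Aggregate specific point-specific counts into specific route-specific counts
    by summing over the specific points P included in each movement route R."""
    return {route: sum(point_counts.get(p, 0.0) for p in points)
            for route, points in route_points.items()}

# Example: route R1 passes specific points P1 and P2, route R2 passes P2 and P3
point_counts = {"P1": 5000.0, "P2": 3200.0, "P3": 1500.0}
route_points = {"R1": ["P1", "P2"], "R2": ["P2", "P3"]}
print(passersby_per_route(point_counts, route_points))  # {'R1': 8200.0, 'R2': 4700.0}
```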
  • the data analysis processing unit 23B of the present embodiment also analyzes the attributes of persons included in the images represented by the image data, based on the analysis data preprocessed by the data preprocessing unit 23A. Typically, the data analysis processing unit 23B analyzes the attributes of the persons counted as described above.
  • the data analysis processing unit 23B analyzes the attributes of the person detected and extracted from the still image cut out by the data preprocessing unit 23A based on the image data.
  • the data analysis processing unit 23B typically analyzes the attributes of the person included in the image represented by the image data at each of the above-mentioned specific points P.
  • the data analysis processing unit 23B can also analyze the attributes of a person included in the image represented by the image data for each movement path R in which a plurality of specific points P are assembled.
  • the data analysis processing unit 23B extracts, for example, the analysis data for each specific point P from the analysis data collected by the plurality of recording devices 10, based on the position data and the like. Then, by analyzing the attributes of the persons included in the images represented by the image data based on the extracted analysis data, the data analysis processing unit 23B analyzes, for each of the plurality of specific points P, the attributes of the persons who are located in the vicinity of that specific point P and counted as described above. The attribute analysis of persons for each movement route R is performed in almost the same way as for a specific point P. The data analysis processing unit 23B analyzes the attributes of persons for each of the plurality of specific points P and each movement route R based on the analysis data collected along with the movement of the moving body V.
  • the data analysis processing unit 23B is configured to be able to execute, using, for example, various known artificial intelligence techniques and deep learning techniques, a process of analyzing the attributes of the persons included in the images represented by the image data and the flow of the persons whose attributes have been identified.
  • the data analysis processing unit 23B executes a process of detecting and extracting a person from the still image cut out by the data preprocessing unit 23A as described above. Then, the data analysis processing unit 23B of the present embodiment executes a process of extracting an image including the feature points of the detected and extracted person from the image represented by the image data.
  • the feature point of the person is a part where the attribute of the person can be specified in the person included in the image.
  • the characteristic points of the person are, for example, parts such as a face on which the person's facial expression appears, limbs on which gestures / gestures appear, and positions where accessories and the like tend to be easily attached.
  • since the image data of the present embodiment is collected by the recording device 10 mounted on the moving body V as the moving body V moves, it is likely to include many images of the same person taken from different angles. Taking advantage of this, the data preprocessing unit 23A extracts the feature points of a person that can be used to identify that person's attributes from the large number of images taken from different angles as the moving body V moves, thereby securing as much data as possible for identifying the person's attributes.
  • the data analysis processing unit 23B executes a process of analyzing the attributes of the person included in the image based on the image including the feature points of the person extracted from the image data.
  • the data analysis processing unit 23B is based on, for example, the attribute prediction reference data (analysis reference data) stored in the analysis reference DB 22B and the feature points of the person included in the image extracted from the image data. Analyze the attributes.
  • the attribute prediction reference data is information reflecting the result of learning, by various methods using artificial intelligence technology and deep learning technology, the attributes of a person that can be estimated from the feature points of the person included in an image.
  • in other words, the attribute prediction reference data is data that has been compiled into a database, using various methods based on artificial intelligence technology and deep learning technology, in order to estimate the attributes of a person from the feature points of the person included in an image.
  • This attribute prediction reference data can be updated sequentially.
  • for example, the person attribute data itself, which is analysis result data representing the analysis results obtained by the data analysis processing unit 23B, can be used as data for this learning.
  • the attributes of a person analyzed by the data analysis processing unit 23B typically include matters that can be analyzed from the person's outward appearance, such as the person's gender, age, physique, social status, preferences, or behavioral orientation.
  • the gender is an attribute representing the distinction between male and female.
  • Age is an attribute that represents the length of years from birth to the present (at that time).
  • the physique is an attribute representing height, weight, various dimensions, and the like.
  • Social status is an attribute that represents occupation (self-employed, businessman, police officer, student, unemployed, part-time job), annual income, status, companion, etc.
  • Preference is an attribute that represents the tendency of clothes / belongings / fashion (casual orientation, elegant orientation, brand orientation, luxury orientation, fast fashion orientation), hobbies (sports / subculture / outdoor / beauty, etc.).
  • Behavioral orientation is an attribute that expresses the mood, interests (what you want to do, where you want to go), etc. at that time. That is, here, the data analysis processing unit 23B estimates gender, age, physique, social status, preference, behavioral orientation, etc. as the attributes of the person.
  • the data analysis processing unit 23B refers to the attribute prediction reference data, extracts the attributes (gender, age, physique, social status, preference, or behavioral orientation) corresponding to the feature points of the person included in the image, and presumes that the extracted attributes are the attributes of the person appearing in the image.
  • for example, the data analysis processing unit 23B matches the feature points of the person included in the image, such as facial expressions, gestures of the limbs, attached accessories, and clothes, against the attribute prediction reference data, and thereby estimates attributes such as the person's gender, age, physique, social status, preference, and behavioral orientation.
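  • the attribute estimation itself depends on a trained model and the attribute prediction reference data, so only the surrounding plumbing can be sketched here; `model` is a stand-in for any per-image attribute classifier, and the majority vote over views is an illustrative way to merge the predictions obtained from images taken at different angles:

```python
from collections import Counter
from dataclasses import dataclass, fields

@dataclass
class PersonAttributes:
    """Attributes estimated for one person (categories follow the description)."""
    gender: str
    age_group: str
    physique: str
    social_status: str
    preference: str
    behavioral_orientation: str

def estimate_attributes(feature_images, model) -> PersonAttributes:
    """Run a per-image attribute classifier on every image containing the person's
    feature points (face, limbs, accessories, clothes) and merge the results by
    majority vote per attribute."""
    predictions = [model.predict(img) for img in feature_images]  # each returns PersonAttributes
    def vote(name: str) -> str:
        return Counter(getattr(p, name) for p in predictions).most_common(1)[0][0]
    return PersonAttributes(**{f.name: vote(f.name) for f in fields(PersonAttributes)})
```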
  • the data analysis processing unit 23B also executes a process of analyzing the position and the like of the person whose attributes have been specified as described above, based on the position data associated with the image data from which those attributes were specified.
  • the data analysis processing unit 23B reads, for example, the position data associated with the image data in which the attribute of the person is specified from the analysis target DB 22A. Then, the data analysis processing unit 23B analyzes the position and the like of the person whose attribute is specified based on the map reference data (analysis reference data) stored in the analysis reference DB 22B and the read position data. For example, the data analysis processing unit 23B refers to the map reference data and identifies the position where the image is captured based on the position data. Then, the data analysis processing unit 23B identifies the position of the person whose attribute is specified based on the position represented by the position data.
  • the data analysis processing unit 23B generates person attribute data representing the attributes of the persons analyzed for each specific point P, and attribute-specific position data representing the positions of the persons whose attributes have been specified as described above.
  • the data analysis processing unit 23B includes personal attribute data representing the attributes of the person analyzed for each movement path R in which a plurality of specific points P are assembled, and attribute-specific position data representing the position of the person whose attributes are specified. Also generate. Then, the data analysis processing unit 23B accumulates the generated person attribute data and the analysis result data including the attribute-specific position data in the analysis result DB 22C and stores them in a database.
  • the data analysis processing unit 23B of the present embodiment may further be configured to be able to generate, as analysis result data, commercial use data based on the specific point-specific counting data, the specific route-specific counting data, the person attribute data, the attribute-specific position data, and the like.
  • the data analysis processing unit 23B of the present embodiment can calculate, for example, an index representing the number of passersby who have passed the content acceptable range at the specific point P.
  • the content acceptable range at the specific point P is a spatial range in which a person can receive the content output by the output device D at the specific point P, and is determined according to the visible range in which a person can visually recognize the image displayed by the output device D and the audible range in which a person can hear the sound/voice output by the output device D.
  • the data analysis processing unit 23B calculates, for example, for each specific point P in whose vicinity an output device D exists, an index representing the number of passersby who have passed through the content acceptable range, based on the specific point-specific counting data representing the number of passersby at the specific point P.
  • the data analysis processing unit 23B may calculate the index for each of a plurality of movement routes R (for each of a plurality of routes) based on the counting data for each specific route. Then, the data analysis processing unit 23B generates commercial use data representing the index, stores the analysis result data including the generated commercial use data in the analysis result DB 22C, and stores it in a database.
  • the number of passers-by who have passed the above-mentioned content acceptable range can be typically regarded as the number of people who have received the content output at the specific point P.
  • at a specific point P in whose vicinity an output device D exists, the content acceptable range can typically be regarded as a region near that specific point P. Therefore, the number of passersby who have passed through the content acceptable range at the specific point P can be considered substantially the same as the number of passersby at the specific point P.
  • the data analysis processing unit 23B of the present embodiment sets the number of people passing by the specific point P as the number of people passing through the content acceptable range. That is, here, the data analysis processing unit 23B sets the number of people passing by the specific point P represented by the counting data for each specific point as the number of passersby in the content acceptable range for each specific point P. Similarly, the data analysis processing unit 23B sets the number of passersby for each movement route R represented by the counting data for each specific route as the number of passersby in the content acceptable range for each movement route R.
  • the data analysis processing unit 23B may use the number of passersby who have passed through the content acceptable range itself as the index representing the number of passersby, or may calculate an index representing the number of passersby based on that number. Examples of indexes representing the number of passersby calculated by the data analysis processing unit 23B include "DEC: Daily Effective Circulation" and "GRP: Gross Rating Point". Both "DEC" and "GRP" are indicators of advertising effectiveness. "DEC" is typically the number of passersby per day that pass through the content acceptable range (visible range) of the target advertisement.
  • "DEC" may be the number of passersby who meet a predetermined age restriction, such as 18 years of age or older, or may be the number of passersby for all people without an age restriction.
  • "GRP" is typically the ratio of the number of passersby per day through the content acceptable range of the target advertisement to the target population within the area that the advertisement can reach in one day.
  • GRP can be represented by [DEC / target population in the target area].
  • the "target population in the target area” is the population that satisfies the age limit in the target area when the age limit is set for the target of "DEC".
  • The data analysis processing unit 23B can calculate "DEC" and "GRP" for each specific point P as indices representing the number of passersby, based on the number of passersby through the content acceptable range for each specific point P. Similarly, it can calculate "DEC" and "GRP" for each movement route R based on the number of passersby through the content acceptable range for each movement route R.
  • The data analysis processing unit 23B then generates commercial use data representing "DEC" and "GRP" for each specific point P and each movement route R as indices of the number of passersby who have passed through the content acceptable range, and can store the analysis result data including the generated commercial use data in the analysis result DB 22C as a database. A minimal illustration of these indices follows below.
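As a rough illustration of how such indices could be derived from the counting data, the following is a minimal sketch in Python; the data layout, the example numbers, and the helper names are assumptions for illustration and are not taken from the embodiment.

```python
# Minimal sketch: compute DEC and GRP per specific point P from daily
# passersby counts. Field names and numbers are hypothetical examples.

def dec(passersby_per_day: int) -> int:
    """DEC (Daily Effective Circulation): daily passersby through the
    content acceptable range of the target advertisement. Any age-limit
    filtering is assumed to have been applied before this point."""
    return passersby_per_day

def grp(dec_value: int, target_population: int) -> float:
    """GRP (Gross Rating Point): DEC divided by the target population of
    the target area (often reported scaled to a percentage)."""
    return dec_value / target_population if target_population > 0 else 0.0

# Hypothetical per-point counting data and target populations.
daily_counts = {"P1": 12_400, "P2": 3_150}
target_population = {"P1": 250_000, "P2": 80_000}

commercial_use_data = {
    p: {"DEC": dec(n), "GRP": grp(dec(n), target_population[p])}
    for p, n in daily_counts.items()
}
print(commercial_use_data)
```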
  • the data processing unit 23C is a part that executes a process of processing the analysis result data analyzed by the data analysis processing unit 23B into a desired format as described above.
  • The data processing unit 23C processes the counting data for each specific point, the counting data for each specific route, the person attribute data, the human flow data for each attribute, the commercial use data, and the like included in the analysis result data into a desired format. For example, as illustrated in FIG. 10, the data processing unit 23C shapes the data so that it shows when, on which movement route (route) R, and at which specific point P how many people with which attributes passed, and plots "DEC" and "GRP" for each specific point P and for each movement route R on a map or renders them as various graphs, diagrams, and the like.
  • the processing unit 23 executes a process of outputting the analysis result data processed in a desired format by the data processing unit 23C to the client terminal CL via the data input / output unit 21.
  • The client terminal CL is a terminal that makes it possible to use the analysis result data provided by the analysis device 20 for various purposes such as trade area surveys, marketing, advertising, judgment material when determining advertising fees, disaster prevention, and city planning.
  • the client terminal CL is composed of, for example, a notebook PC, a desktop PC, a tablet PC, a smartphone, a mobile terminal, and the like.
  • the plurality of recording devices 10 mounted on the plurality of moving bodies V each collect analysis data including image data and position data as the moving body V moves (step S1).
  • the recording device 10 outputs the collected analysis data via the data input / output unit 13 and inputs the collected analysis data to the analysis device 20 via the data input / output unit 21 of the analysis device 20 (step S2).
  • the analysis data input to the analysis device 20 is stored in the analysis target DB 22A.
  • the data preprocessing unit 23A of the analysis device 20 performs various preprocessing as described above on the analysis data stored in the analysis target DB 22A (step S3).
  • The data analysis processing unit 23B of the analysis device 20 performs analysis based on the analysis data preprocessed by the data preprocessing unit 23A and generates, as analysis result data, the counting data for each specific point, the counting data for each specific route, the person attribute data, the human flow data for each attribute, the commercial use data, and the like (step S4).
  • The data analysis processing unit 23B accumulates the generated analysis result data, such as the counting data for each specific point, the counting data for each specific route, the person attribute data, the human flow data for each attribute, and the commercial use data, in the analysis result DB 22C and stores it as a database (step S5).
  • In response to a request from the client terminal CL or the like, the data processing unit 23C of the analysis device 20 processes the analysis result data stored in the analysis result DB 22C (the counting data for each specific point, the counting data for each specific route, the person attribute data, the human flow data for each attribute, the commercial use data, and the like) into a desired format as illustrated in FIG. 10 (step S6).
  • The processing unit 23 of the analysis device 20 outputs and provides the analysis result data processed into the desired format by the data processing unit 23C to the client terminal CL via the data input / output unit 21 (step S7), and ends the series of processes. A sketch of this overall flow follows below.
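The flow of steps S1 to S7 can be summarized as a small runnable outline; the function names and stub data below are placeholders chosen for illustration and do not correspond to identifiers in the embodiment.

```python
# Illustrative outline of steps S1-S7 with stubbed data and placeholder names.

def collect_analysis_data(vehicle_id):                         # S1
    # A recording device 10 would return frames with position/time attached.
    return [{"vehicle": vehicle_id, "frame": i, "lat": 35.0, "lon": 139.0,
             "time": f"08:00:{i:02d}"} for i in range(3)]

def preprocess(records):                                       # S3
    # Cut out still images and keep the linked position/time (stubbed here).
    return records

def analyze(records):                                          # S4
    # Count detected persons per specific point (detection itself omitted).
    return {"counting_data_per_point": {"P1": len(records)}}

def shape(results):                                            # S6
    return f"Passersby per specific point: {results['counting_data_per_point']}"

analysis_target_db = []
for vehicle in ("bus-1", "bus-2"):
    analysis_target_db.extend(collect_analysis_data(vehicle))  # S2: input to DB 22A
analysis_result_db = analyze(preprocess(analysis_target_db))   # S5: stored in DB 22C
print(shape(analysis_result_db))                               # S7: provided to CL
```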
  • the analysis system 1 described above can collect image data and analysis data including position data by the recording device 10 mounted on the moving body V.
  • the image data collected as the analysis data is data representing an image outside the moving body V, which is captured as the moving body V moves.
  • the analysis device 20 can count the number of people passing by the specific point P in the movement path R of the moving body V based on the analysis data collected by the plurality of recording devices 10. As a result, this analysis system 1 can appropriately analyze the tendency of the flow of a person at each specific point P in the movement path R in which the moving body V moves.
  • The analysis system 1 can use the tendency of the flow of people at each specific point P on the movement route R, analyzed as described above, for various purposes such as trade area surveys, marketing, advertising, judgment material when determining advertising fees, and disaster prevention / city planning. Further, the analysis system 1 can reduce the workload of counting and significantly improve the frequency of counting itself, compared with counting the number of passersby manually, for example by visual inspection. As a result, the analysis system 1 can analyze the tendency of the flow of people more accurately over a long period of time.
  • In the analysis system 1 described above, the analysis device 20 extracts, based on the position data, the image data including an image of the specific point P from the analysis data, and counts and aggregates the number of persons included in the images represented by the extracted image data to obtain the number of people passing the specific point P.
  • The analysis system 1 can therefore reduce the workload of counting and greatly improve the frequency of counting itself, compared with counting the number of people passing the specific point P manually, for example by visual inspection.
  • the analysis system 1 can analyze the tendency of the flow of a person more accurately.
  • In the analysis system 1 described above, the analysis device 20 counts the number of persons included in one still image constituting the image data as the virtual-period estimated number of passersby n, that is, the number assumed to be counted over a preset virtual counting period t, and the analysis system 1 counts the number of people passing the specific point P based on this virtual-period estimated number of passersby n. As a result, the analysis system 1 can suppress duplicate counting of the same person across a plurality of still images without using, for example, a conventional tracking algorithm, and can therefore analyze the tendency of the flow of people more accurately. One way to picture this conversion is sketched below.
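One plausible way to turn per-frame person counts into a passersby figure under this scheme is to treat each still image as covering t seconds of fixed-point counting and scale up to the time window of interest; the embodiment does not spell out the exact aggregation formula here, so the frame counts, the value of t, and the hourly scaling below are assumptions.

```python
# Sketch: each still image contributes its person count n as the passersby
# for one virtual counting period t, so sampled frames can be scaled up to
# an estimate for a longer time window. All values are hypothetical.

VIRTUAL_COUNTING_PERIOD_T = 10.0          # seconds of counting per frame (assumed)
frame_person_counts = [3, 5, 2, 4]        # n for each sampled still image

covered_seconds = len(frame_person_counts) * VIRTUAL_COUNTING_PERIOD_T
flow_per_second = sum(frame_person_counts) / covered_seconds
passersby_per_hour = flow_per_second * 3600.0

print(round(passersby_per_hour))          # 14 people over 40 s -> 1260 per hour
```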
  • the analysis system 1 described above aggregates the number of people passing by the plurality of specific points P counted by the analysis device 20 and counts the number of people passing by the movement route R.
  • this analysis system 1 can appropriately analyze the tendency of the flow of a person for each movement path R in which the moving body V moves, and can be used for various purposes.
  • the analysis system 1 can also utilize the tendency of the flow of people for each movement route R, for example, to analyze the advertising effect of a moving body such as a wrapping bus.
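Building on the per-point counts, a route-level total can be obtained by summing the counts of the specific points P belonging to each movement route R; the route-to-point mapping and the numbers below are hypothetical.

```python
# Sketch: aggregate per-point passersby counts into per-route counts.
point_counts = {"P1": 1200, "P2": 800, "P3": 450}
route_points = {"R1": ["P1", "P2"], "R2": ["P2", "P3"]}   # assumed route layout

route_counts = {
    route: sum(point_counts.get(p, 0) for p in points)
    for route, points in route_points.items()
}
print(route_counts)   # {'R1': 2000, 'R2': 1250}
```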
  • The analysis system 1 described above counts the number of people passing the specific point P and along the movement route R based on the analysis data collected by the plurality of recording devices 10 respectively mounted on the plurality of moving bodies V.
  • As a result, even when, for example, one moving body V travels a plurality of routes (movement routes R) in a single day to improve the efficiency of vehicle allocation, and a plurality of moving bodies V are used interchangeably across those routes, this analysis system 1 can efficiently collect analysis data related to the specific point P from all the moving bodies V that have passed the specific point P. Since the analysis system 1 can thus collect more analysis data and use it for the analysis, it can analyze the tendency of the flow of people for each specific point P and each movement route R more accurately.
  • the analysis system 1 described above further analyzes the attributes of the person included in the image represented by the image data for each specific point P and each movement path R based on the analysis data by the analysis device 20.
  • this analysis system 1 can analyze not only the number of people passing by but also the attributes of people counted as the number of people passing by as the tendency of the flow of people for each specific point P and each movement route R.
  • As a result, the analysis system 1 can grasp not only the number of people passing each specific point P and each movement route R but also the attribute tendency of the people counted; conversely, it can easily identify a specific point P or movement route R that has many passersby with a desired attribute tendency.
  • the analysis system 1 can more preferably utilize the tendency of the flow of people for each specific point P and each movement route R for various purposes as described above.
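Where an attribute has been estimated for each counted person, the per-point (and, by extension, per-route) breakdown reduces to a simple grouping step; the attribute labels and detection records below are purely illustrative.

```python
# Sketch: break passersby counts down by person attribute for each specific point.
from collections import Counter, defaultdict

detections = [
    {"point": "P1", "attribute": "adult_female"},
    {"point": "P1", "attribute": "adult_male"},
    {"point": "P1", "attribute": "adult_female"},
    {"point": "P2", "attribute": "child"},
]

attribute_counts = defaultdict(Counter)
for d in detections:
    attribute_counts[d["point"]][d["attribute"]] += 1

print({p: dict(c) for p, c in attribute_counts.items()})
# {'P1': {'adult_female': 2, 'adult_male': 1}, 'P2': {'child': 1}}
```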
  • The analysis system 1 described above calculates, based on the number of people passing each specific point P and each movement route R, an index representing the number of passersby who have passed through the content acceptable range for each specific point P and each movement route R.
  • As a result, the analysis system 1 can suitably use the index for each specific point P and each movement route R as judgment material when, for example, determining a usage fee (advertising fee or the like) for the content output from the output device D.
  • the analysis system 1 can suitably utilize the attribute tendency of the passerby for each specific point P and each movement route R as, for example, as a judgment material when determining the content to be output from the output device D.
  • In the above embodiment, the moving body V on which the recording device 10 is mounted has been described as moving along a predetermined movement route R and, for example, as being capable of moving along a plurality of predetermined routes, but the present invention is not limited to this. That is, the moving body V has been described as a fixed-route bus that repeatedly travels a plurality of predetermined routes during the day, but it is not limited to this. Further, the recording device 10 has been described as being mounted on each of a plurality of moving bodies V moving on a plurality of routes, but the present invention is not limited to this; for example, the recording device 10 may be mounted on only one passenger car. Likewise, the analysis device 20 has been described as counting the number of people passing the specific point P based on the analysis data collected by the plurality of recording devices 10, but the present invention is not limited to this.
  • the analysis device 20 has been described as counting the number of people passing by the movement route R as well as the number of people passing by the specific point P, but the present invention is not limited to this. Further, the analysis device 20 has been described as analyzing the attributes of a person included in the image represented by the image data for each specific point P and each movement path R based on the analysis data, but the present invention is not limited to this. The analysis device 20 may at least count the number of people passing by at the specific point P.
  • In the above embodiment, the analysis device 20 has been described as counting the number of people included in one still image constituting the image data as the virtual-period estimated number of passersby n assumed to be counted over a preset virtual counting period t, and as counting the number of people passing the specific point P based on this virtual-period estimated number of passersby n, but the present invention is not limited to this.
  • The control unit 14 and the analysis device 20 described above may each be configured as separate units connected to each other so that various electric signals can be exchanged between them, or their functions may be realized by other control devices. Further, the programs, applications, various data, and the like described above may be updated as appropriate, or may be stored in a server connected to the analysis system 1 via an arbitrary network, and may be downloaded in whole or in part as needed. Further, all or any part of the processing functions provided in the control unit 14 and the analysis device 20 may be realized by, for example, a CPU and a program interpreted and executed by the CPU or the like, or may be realized as hardware such as wired logic.
  • The analysis system 1 may also be configured so that each recording device 10 performs primary image analysis, such as cutting out images containing a person, and the analysis device 20 performs secondary image analysis, such as counting the number of passersby and analyzing person attributes, based on the analysis data transmitted from each recording device 10 to the analysis device 20.
  • Alternatively, the analysis system 1 may be configured so that each recording device 10 individually counts the number of people passing the specific point P for its own moving body V and generates per-vehicle counting data for each specific point, and the analysis device 20 aggregates the per-vehicle counting data for each specific point transmitted from each recording device 10 to generate the counting data for each specific point described above; a minimal sketch of this aggregation follows below.
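In this variant the merge on the analysis device 20 side can be a simple sum over vehicles; the vehicle identifiers and counts below are hypothetical.

```python
# Sketch: merge per-vehicle, per-point counting data produced on each
# recording device 10 into the counting data for each specific point.
from collections import Counter

per_vehicle_counts = {
    "bus-1": {"P1": 40, "P2": 12},
    "bus-2": {"P1": 25, "P3": 7},
}

per_point_counts = Counter()
for counts in per_vehicle_counts.values():
    per_point_counts.update(counts)

print(dict(per_point_counts))   # {'P1': 65, 'P2': 12, 'P3': 7}
```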
  • the output device D described above may be an outdoor board, a wall sheet, a self-standing signboard, or the like, in addition to a digital display.
  • the analysis system according to the present embodiment may be configured by appropriately combining the components of the embodiments and modifications described above.
  • 1 Analysis system; 10 Recording device (data collection device); 11 External camera; 12 Position information measuring device; 13, 21 Data input / output unit; 14 Control unit; 14A Storage unit; 14B Processing unit; 20 Analysis device (data analysis device); 22 Storage unit; 22A Analysis target DB; 22B Analysis reference DB; 22C Analysis result DB; 23 Processing unit; 23A Data preprocessing unit; 23B Data analysis processing unit; 23C Data processing unit; CL Client terminal; D Output device; n Virtual-period estimated number of passersby; P Specific point; R Movement route; t Virtual counting period; V Moving body

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Development Economics (AREA)
  • Finance (AREA)
  • Accounting & Taxation (AREA)
  • Strategic Management (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Game Theory and Decision Science (AREA)
  • Geometry (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • General Business, Economics & Management (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)

Abstract

An analysis system (1) that is characterized by comprising: a data collection device (10) that is mounted on a moving body (V) and collects analysis data that includes image data and location data, the image data representing images of the outside of the moving body (V) that are captured in association with the movement of the moving body (V), and the location data representing the locations at which the images of the outside of the moving body (V) were captured; and a data analysis device (20) that, on the basis of the analysis data collected by the data collection device (10), counts the number of people that pass specific points on the movement path of the moving body (V). The analysis system (1) can thereby properly analyze trends in flows of people.

Description

Analysis system
The present invention relates to an analysis system.
 従来の解析システムとして、例えば、特許文献1には、通行情報取得ユニットと、情報出力ユニットと、表示ユニットとを備えた情報表示システムが開示されている。通行情報取得ユニットは、人の通行に関する通行情報を取得する。情報出力ユニットは、通行情報取得ユニットによって取得された通行情報に基づいて情報を選択的に出力する。表示ユニットは、情報出力ユニットが選択的に出力した情報を、通行情報に対応した人が通行している場所に表示する。この情報表示システムは、例えば、通行情報取得ユニットで取得された通行情報に基づいて人の通行量に応じて情報を表示させることで、当該情報の表示効果に応じた課金を可能としている。 As a conventional analysis system, for example, Patent Document 1 discloses an information display system including a traffic information acquisition unit, an information output unit, and a display unit. The traffic information acquisition unit acquires traffic information related to the passage of a person. The information output unit selectively outputs information based on the traffic information acquired by the traffic information acquisition unit. The display unit displays the information selectively output by the information output unit at a place where a person corresponding to the traffic information is passing. In this information display system, for example, by displaying information according to the traffic volume of a person based on the traffic information acquired by the traffic information acquisition unit, it is possible to charge according to the display effect of the information.
Japanese Unexamined Patent Publication No. 2003-302923
 ところで、上述のようなシステムは、例えば、人物の通行量や当該通行量を基に算出した各種指標等、任意の地点や地域における人の流動の傾向を表す指標を、商圏調査、マーケティング、広告、広告料を決める際の判断材料、防災・都市計画等の様々な用途で活用する場合がある。そして、解析システムは、例えば、バス等の移動体が移動する移動経路における人物の流動の傾向を解析する場合があるが、このような場合に適正に人物の流動の傾向が解析できることが望まれている。 By the way, in the above-mentioned system, for example, an index showing a tendency of the flow of people at an arbitrary point or region, such as a traffic volume of a person and various indexes calculated based on the traffic volume, is used for trade area survey, marketing, and advertising. , It may be used for various purposes such as judgment material when deciding advertising fees, disaster prevention and city planning. Then, the analysis system may analyze the tendency of the flow of a person in a movement path in which a moving body such as a bus moves. In such a case, it is desired that the tendency of the flow of a person can be analyzed appropriately. ing.
 本発明は、上記の事情に鑑みてなされたものであって、人物の流動の傾向を適正に解析することができる解析システムを提供することを目的とする。 The present invention has been made in view of the above circumstances, and an object of the present invention is to provide an analysis system capable of appropriately analyzing the tendency of a person's flow.
 上記目的を達成するために、本発明に係る解析システムは、移動体に搭載され、当該移動体の移動に伴って撮像された当該移動体の外部の画像を表す画像データ、及び、前記移動体の外部の画像が撮像された位置を表す位置データを含む解析用データを収集するデータ収集装置と、前記データ収集装置によって収集された前記解析用データに基づいて、前記移動体の移動経路における特定地点の通行人数を計数するデータ解析装置とを備えることを特徴とする。 In order to achieve the above object, the analysis system according to the present invention is mounted on a moving body, and image data representing an image outside the moving body, which is captured with the movement of the moving body, and the moving body. Identification in the movement path of the moving body based on the data collecting device that collects the analysis data including the position data representing the position where the external image of the moving body is captured and the analysis data collected by the data collecting device. It is characterized by being provided with a data analysis device that counts the number of people passing by at a point.
 また、上記解析システムでは、前記データ解析装置は、前記位置データに基づいて、前記解析用データから前記特定地点の画像を含む前記画像データを抽出し、当該抽出した前記画像データが表す画像に含まれる人物の数を計数し集約して前記特定地点の通行人数を計数するものとすることができる。 Further, in the analysis system, the data analysis device extracts the image data including the image of the specific point from the analysis data based on the position data, and includes the image data represented by the extracted image data. The number of people passing by can be counted and aggregated to count the number of people passing by at the specific point.
 また、上記解析システムでは、前記データ解析装置は、前記画像データを構成する1つの静止画像に含まれる人物の数を、予め設定される仮想計数期間において計数される仮想期間推定通行人数であるものとして計数し、当該仮想期間推定通行人数に基づいて、前記特定地点の通行人数を計数するものとすることができる。 Further, in the analysis system, the data analysis device is an estimated number of people passing through a virtual period in which the number of people included in one still image constituting the image data is counted in a preset virtual counting period. The number of people passing by at the specific point can be counted based on the estimated number of people passing through the virtual period.
 また、上記解析システムでは、前記データ解析装置は、前記解析用データに基づいて、複数の前記特定地点の通行人数を計数し、当該複数の前記特定地点の通行人数を集約して前記移動経路における通行人数を計数するものとすることができる。 Further, in the analysis system, the data analysis device counts the number of people passing through the plurality of specific points based on the analysis data, aggregates the number of people passing through the plurality of specific points, and sets the number of people passing through the specific points in the movement route. The number of passersby can be counted.
 また、上記解析システムでは、前記データ収集装置は、複数の前記移動体にそれぞれ搭載され、前記データ解析装置は、複数の前記データ収集装置によって収集された前記解析用データに基づいて、前記特定地点の通行人数を計数するものとすることができる。 Further, in the analysis system, the data collection device is mounted on each of the plurality of moving objects, and the data analysis device is based on the analysis data collected by the plurality of data collection devices, and the specific point is specified. It is possible to count the number of people passing by.
 また、上記解析システムでは、前記データ解析装置は、前記解析用データに基づいて、前記特定地点ごとに、前記画像データが表す画像に含まれる人物の属性を解析するものとすることができる。 Further, in the analysis system, the data analysis device can analyze the attributes of a person included in the image represented by the image data at each specific point based on the analysis data.
 また、上記解析システムでは、前記データ解析装置は、前記特定地点の通行人数に基づいて、当該特定地点におけるコンテンツ受容可能範囲を通過した通過者人数を表す指標を算出するものとすることができる。 Further, in the analysis system, the data analysis device can calculate an index representing the number of passers-by who have passed the content acceptable range at the specific point based on the number of passersby at the specific point.
 本発明に係る解析システムは、移動体に搭載されたデータ収集装置によって、画像データ、及び、位置データを含む解析用データを収集することができる。解析用データとして収集される画像データは、移動体の移動に伴って撮像された当該移動体の外部の画像を表すデータである。そして、データ解析装置は、データ収集装置によって収集された解析用データに基づいて、移動体の移動経路における特定地点の通行人数を計数することができる。この結果、この解析システムは、人物の流動の傾向を適正に解析することができる、という効果を奏する。 The analysis system according to the present invention can collect image data and analysis data including position data by a data collecting device mounted on a moving body. The image data collected as the analysis data is data representing an image outside the moving body, which is captured as the moving body moves. Then, the data analysis device can count the number of people passing by at a specific point in the movement path of the moving body based on the analysis data collected by the data collection device. As a result, this analysis system has the effect of being able to properly analyze the tendency of the flow of people.
FIG. 1 is a block diagram showing a schematic configuration of an analysis system according to an embodiment.
FIG. 2 is a schematic diagram showing an example of the movement route of a moving body and specific points in the analysis system according to the embodiment.
FIG. 3 is a schematic diagram showing an example of the collection of image data related to a specific point by the recording device of the analysis system according to the embodiment.
FIG. 4 is a schematic diagram showing an example of a still image for explaining the virtual counting period in the analysis system according to the embodiment.
FIG. 5 is a schematic diagram showing another example of a still image for explaining the virtual counting period in the analysis system according to the embodiment.
FIG. 6 is a schematic diagram showing an example for explaining the virtual-period estimated number of passersby in the analysis system according to the embodiment.
FIG. 7 is a schematic diagram showing an example for explaining the number of passersby in a target time zone in the analysis system according to the embodiment.
FIG. 8 is a schematic diagram showing an example in which the numbers of passersby counted in the analysis system according to the embodiment are aggregated on a daily basis.
FIG. 9 is a schematic diagram showing an example for explaining the counting of the number of passersby on a movement route in the analysis system according to the embodiment.
FIG. 10 is a schematic diagram showing an example of analysis result data analyzed and processed in the analysis system according to the embodiment.
FIG. 11 is a flowchart showing an example of processing in the analysis system according to the embodiment.
Hereinafter, an embodiment according to the present invention will be described in detail with reference to the drawings. The present invention is not limited to this embodiment. In addition, the components in the following embodiment include those that can be easily replaced by those skilled in the art, or those that are substantially the same.
[Embodiment]
The analysis system 1 of the present embodiment shown in FIG. 1 includes a recording device 10 as a data collection device and an analysis device 20 as a data analysis device, and is a system that provides the analysis result data produced by the analysis device 20 to the client terminal CL. The analysis system 1 of the present embodiment utilizes the recording device 10 mounted on the moving body V and analyzes the tendency of the flow of people based on the image data and other data collected by the recording device 10. More specifically, as illustrated in FIG. 2, the analysis system 1 of the present embodiment counts the number of people passing the specific point P on the movement route R of the moving body V based on the analysis data collected by the recording device 10, thereby realizing a configuration that can appropriately analyze the tendency of the flow of people at each specific point P and along each movement route R. Hereinafter, the configuration of the analysis system 1 will be described in detail with reference to each figure.
 ここでまず、図2を参照して、通行人数の計数対象となる特定地点Pや移動経路Rについて説明しておく。移動経路Rは、移動体Vが移動する経路であり、例えば、後述するバス等の路線に相当する。特定地点Pは、移動経路Rにおける特定の地点であり、通行人数の計数を希望する地点として任意に設定される。特定地点Pは、例えば、バス停や交差点等、移動経路Rにおける特徴的な地点の他、コンテンツを出力可能な出力装置D等が設置された地点、出力装置Dからのコンテンツを受容可能な地点等が含まれていてもよい。出力装置Dは、コンテンツを出力可能な装置である。出力装置Dは、ネットワーク上に実装され、様々なコンテンツがネットワークを介して提供されるいわゆるクラウドサービス型の装置を構成してもよいし、ネットワークから切り離されたいわゆるスタンドアローン型の装置を構成してもよい。出力装置Dは、コンテンツに応じた画像を表示可能であるディスプレイ、コンテンツに応じた音・音声を出力可能であるスピーカ等を含んで構成される。出力装置Dが出力するコンテンツとしては、例えば、広告やクーポン等のコンテンツの他、地域情報や所定の施設への道順情報、災害時の避難経路/安全サポート情報等、様々な案内情報を構成するコンテンツを含んでいてもよい。出力装置Dが出力するコンテンツのデータは、ネットワークや記録媒体等を介して逐次更新可能である。 Here, first, with reference to FIG. 2, the specific point P and the movement route R to be counted by the number of passing people will be described. The movement route R is a route on which the moving body V moves, and corresponds to, for example, a route such as a bus described later. The specific point P is a specific point on the movement route R, and is arbitrarily set as a desired point for counting the number of people passing by. The specific point P is, for example, a characteristic point in the movement route R such as a bus stop or an intersection, a point where an output device D or the like capable of outputting content is installed, a point where content from the output device D can be received, or the like. May be included. The output device D is a device capable of outputting contents. The output device D may constitute a so-called cloud service type device that is mounted on the network and various contents are provided via the network, or constitutes a so-called stand-alone type device that is separated from the network. You may. The output device D includes a display capable of displaying an image according to the content, a speaker capable of outputting sound / sound according to the content, and the like. As the content output by the output device D, for example, in addition to content such as advertisements and coupons, various guidance information such as area information, route information to a predetermined facility, evacuation route / safety support information in the event of a disaster, etc. are configured. It may contain content. The content data output by the output device D can be sequentially updated via a network, a recording medium, or the like.
 図1に戻って記録装置10を説明する。記録装置10は、移動体Vに搭載され、解析装置20による解析に用いる解析用データを収集するものである。記録装置10によって収集される解析用データは、画像データ、及び、位置データを含むデータである。画像データは、移動体Vの移動に伴って撮像された当該移動体Vの外部の画像を表すデータである。位置データは、当該移動体Vの外部の画像が撮像された位置を表すデータである。記録装置10は、解析用データとして、画像データ、及び、位置データを収集する。解析用データは、解析装置20による人物の流動の傾向の解析に用いられる。 Returning to FIG. 1, the recording device 10 will be described. The recording device 10 is mounted on the moving body V and collects analysis data used for analysis by the analysis device 20. The analysis data collected by the recording device 10 is data including image data and position data. The image data is data representing an image outside the moving body V, which is captured as the moving body V moves. The position data is data representing the position where the image outside the moving body V is captured. The recording device 10 collects image data and position data as analysis data. The analysis data is used for analysis of the tendency of the flow of a person by the analysis device 20.
 ここで、記録装置10が搭載される移動体Vは、例えば、路面を走行する車両であり、例えば、自家用車、レンタカー、シェアリングカー、ライドシェアカー、バス、タクシー、トラック、輸送車、作業車等である。また、移動体Vは、車両に限らず、例えば、フライングカーやドローン等、空中を飛行する飛行体であってもよい。移動体Vは、典型的には、所定の移動経路Rを移動するものであり、例えば、所定の移動経路Rとして、予め定められた複数の路線を移動可能なものである。本実施形態の移動体Vは、一例として、一日の間に予め定められた複数の路線(移動経路R)を繰り返し走行する路線バスであるものとして説明する。路線バス等の移動体Vは、例えば、配車の効率化等のために、1台の移動体Vが一日の間に複数の路線を走行しつつ、複数の移動体Vが複数の路線に渡って使い分けられて運行する場合がある。本実施形態の記録装置10は、このように複数の路線を移動する当該複数の移動体Vにそれぞれ搭載される。つまり、本実施形態の解析システム1は、複数の移動体Vにそれぞれ搭載された複数の記録装置10を備え、当該複数の記録装置10から解析用データを収集することが可能である。 Here, the moving body V on which the recording device 10 is mounted is, for example, a vehicle traveling on a road surface, for example, a private car, a rental car, a sharing car, a ride sharing car, a bus, a taxi, a truck, a transport vehicle, or a work. It is a car etc. Further, the moving body V is not limited to a vehicle, and may be a flying body such as a flying car or a drone that flies in the air. The moving body V typically moves on a predetermined movement route R, and is, for example, capable of moving on a plurality of predetermined routes as a predetermined movement route R. As an example, the moving body V of the present embodiment will be described as a fixed-route bus that repeatedly travels on a plurality of predetermined routes (moving route R) during a day. As for the moving body V such as a fixed-route bus, for example, in order to improve the efficiency of vehicle allocation, one moving body V travels on a plurality of routes in one day, and a plurality of moving bodies V are on a plurality of routes. It may be operated depending on the usage. The recording device 10 of the present embodiment is mounted on each of the plurality of moving bodies V that move on the plurality of routes in this way. That is, the analysis system 1 of the present embodiment includes a plurality of recording devices 10 mounted on each of the plurality of moving bodies V, and can collect analysis data from the plurality of recording devices 10.
 具体的には、記録装置10は、外部カメラ11と、位置情報測定器12と、データ入出力部13と、制御部14とを備える。記録装置10は、例えば、移動体Vに搭載されるいわゆるドライブレコーダ等の車載機器を用いることができるがこれに限らない。 Specifically, the recording device 10 includes an external camera 11, a position information measuring device 12, a data input / output unit 13, and a control unit 14. As the recording device 10, for example, an in-vehicle device such as a so-called drive recorder mounted on the moving body V can be used, but the recording device 10 is not limited to this.
 外部カメラ11は、移動体Vの外部の画像を撮像する外部撮像装置である。外部カメラ11は、移動体Vの移動に伴って当該移動体Vの外部の画像を撮像し、当該移動体Vの外部の画像を表す画像データを収集する。外部カメラ11は、典型的には、移動体Vの外部の動画像を撮像し、当該動画像を表す画像データを収集する。動画像は、複数のフレームの静止画像を時系列に並べたものである。外部カメラ11は、解析システム1による解析対象である人物、ここでは、移動体Vの外部の路上に位置する人物を撮像可能な画角となるように移動体Vに設置される。外部カメラ11は、移動体Vから路上の人物をより好適に撮像可能なように、移動体Vの前部、側部、後部、屋根部等に複数設けられてもよい。外部カメラ11は、単眼カメラであってもよいし、ステレオカメラであってもよい。また、外部カメラ11が撮像する画像は、モノクロであってもよいしカラーであってもよい。外部カメラ11は、制御部14と通信可能に接続されており、収集した画像データを制御部14に出力する。 The external camera 11 is an external imaging device that captures an image of the outside of the moving body V. The external camera 11 captures an image outside the moving body V as the moving body V moves, and collects image data representing an image outside the moving body V. The external camera 11 typically captures a moving image outside the moving body V and collects image data representing the moving image. A moving image is a time-series arrangement of still images of a plurality of frames. The external camera 11 is installed on the moving body V so as to have an angle of view capable of capturing a person to be analyzed by the analysis system 1, here, a person located on a road outside the moving body V. A plurality of external cameras 11 may be provided on the front portion, side portion, rear portion, roof portion, etc. of the moving body V so that a person on the road can be more preferably imaged from the moving body V. The external camera 11 may be a monocular camera or a stereo camera. Further, the image captured by the external camera 11 may be monochrome or color. The external camera 11 is communicatively connected to the control unit 14, and outputs the collected image data to the control unit 14.
 位置情報測定器12は、移動体Vの現在位置を測定する測位器である。位置情報測定器12は、例えば、GPS(Global Positioning System))衛星から送信される電波を受信するGPS受信器等を用いることができる。位置情報測定器12は、GPS衛星から送信される電波を受信し移動体Vの現在位置を表す情報としてGPS情報(緯度経度座標)を取得することで、移動体Vの外部の画像が撮像された位置を表す位置データを収集する。位置情報測定器12は、制御部14と通信可能に接続されており、収集した位置データを制御部14に出力する。 The position information measuring device 12 is a positioning device that measures the current position of the moving body V. As the position information measuring device 12, for example, a GPS receiver or the like that receives radio waves transmitted from a GPS (Global Positioning System) satellite can be used. The position information measuring device 12 receives radio waves transmitted from GPS satellites and acquires GPS information (latitude / longitude coordinates) as information indicating the current position of the moving body V, thereby capturing an image of the outside of the moving body V. Collect position data that represents the position. The position information measuring device 12 is communicably connected to the control unit 14, and outputs the collected position data to the control unit 14.
 データ入出力部13は、記録装置10とは異なる機器と当該記録装置10との間で各種データを入出力するものである。本実施形態のデータ入出力部13は、記録装置10とは異なる機器である解析装置20に対して、解析用データを出力可能である。データ入出力部13は、例えば、ネットワークを介した通信(有線、無線を問わない)によって、記録装置10とは異なる機器との間でデータを入出力する構成であってもよい。また、データ入出力部13は、例えば、スロット部を有し当該スロット部に差し込まれた記録媒体を介して、記録装置10とは異なる機器との間でデータを入出力する構成であってもよい。ここで、記録媒体は、例えば、スロット部を介して記録装置10に脱着可能なメモリ(リムーバブルメディア)である。記録媒体は、例えば、様々な形式のメモリカード、例えばSDカードなどを用いることができるがこれに限らない。 The data input / output unit 13 inputs / outputs various data between a device different from the recording device 10 and the recording device 10. The data input / output unit 13 of the present embodiment can output analysis data to the analysis device 20 which is a device different from the recording device 10. The data input / output unit 13 may be configured to input / output data to / from a device different from the recording device 10 by, for example, communication via a network (whether wired or wireless). Further, even if the data input / output unit 13 has a slot unit and inputs / outputs data to / from a device different from the recording device 10 via a recording medium inserted into the slot unit, for example. Good. Here, the recording medium is, for example, a memory (removable media) that can be attached to and detached from the recording device 10 via the slot portion. As the recording medium, for example, various types of memory cards such as SD cards can be used, but the recording medium is not limited to this.
 制御部14は、記録装置10の各部を統括的に制御するものである。制御部14は、解析用データを収集するための種々の演算処理を実行する。制御部14は、CPU(Central Processing Unit)、GPU(Graphics Processing Unit)等の中央演算処理装置、ROM(Read Only Memory)、RAM(Random Access Memory)、及び、インターフェースを含む周知のマイクロコンピュータを主体とする電子回路を含んで構成される。制御部14は、外部カメラ11、位置情報測定器12、データ入出力部13等の各部と通信可能に接続され、各部との間で相互に各種信号、データを授受可能である。 The control unit 14 comprehensively controls each unit of the recording device 10. The control unit 14 executes various arithmetic processes for collecting analysis data. The control unit 14 is mainly a well-known microcomputer including a central processing unit (CPU) (Central Processing Unit), a GPU (Graphics Processing Unit) and other central arithmetic processing units, a ROM (Read Only Memory), a RAM (Random Access Memory), and an interface. It is configured to include an electronic circuit. The control unit 14 is communicably connected to each unit such as the external camera 11, the position information measuring device 12, and the data input / output unit 13, and can exchange various signals and data with each other.
 より具体的には、制御部14は、記憶部14A、及び、処理部14Bを含んで構成される。記憶部14A、及び、処理部14Bは、各部との間で相互に各種信号、データを授受可能である。 More specifically, the control unit 14 includes a storage unit 14A and a processing unit 14B. The storage unit 14A and the processing unit 14B can exchange various signals and data with each other.
 記憶部14Aは、処理部14Bでの各種処理に必要な条件や情報、制御部14で実行する各種プログラムやアプリケーション、制御データ等が格納されている。記憶部14Aは、解析用データを、収集した時刻等と共に記憶することができる。言い換えれば、解析用データは、当該データを収集した時刻を表す時刻データやその他のデータも含む。記憶部14Aは、例えば、処理部14Bによる処理の過程で生成される各種データを一時的に記憶することもできる。記憶部14Aは、処理部14B、データ入出力部13等によってこれらのデータが必要に応じて読み出される。記憶部14Aは、例えば、ハードディスク、SSD(Solid State Drive)、光ディスクなどの比較的に大容量の記憶装置、あるいは、RAM、フラッシュメモリ、NVSRAM(Non Volatile Static Random Access Memory)などのデータを書き換え可能な半導体メモリであってもよい。 The storage unit 14A stores conditions and information necessary for various processes in the processing unit 14B, various programs and applications executed by the control unit 14, control data, and the like. The storage unit 14A can store the analysis data together with the collected time and the like. In other words, the analysis data also includes time data and other data representing the time when the data was collected. The storage unit 14A can also temporarily store various data generated in the process of processing by the processing unit 14B, for example. In the storage unit 14A, these data are read out as needed by the processing unit 14B, the data input / output unit 13, and the like. The storage unit 14A can rewrite data such as a relatively large-capacity storage device such as a hard disk, SSD (Solid State Drive), or an optical disk, or data such as RAM, flash memory, NVSRAM (Non Volatile Static Random Access Memory), or the like. It may be a semiconductor memory.
 処理部14Bは、各種入力信号等に基づいて、記憶部14Aに記憶されている各種プログラムを実行し、当該プログラムが動作することにより各部に出力信号を出力し各種機能を実現するための種々の処理を実行する。処理部14Bは、外部カメラ11、位置情報測定器12の動作を制御し、画像データ、位置データを含む解析用データを収集する処理を実行する。また、処理部14Bは、データ入出力部13を介したデータの入出力に関わる処理を実行する。処理部14Bは、例えば、解析用データを、データ入出力部13を介して解析装置20に出力する処理を実行する。 The processing unit 14B executes various programs stored in the storage unit 14A based on various input signals and the like, and when the program operates, outputs an output signal to each unit and realizes various functions. Execute the process. The processing unit 14B controls the operation of the external camera 11 and the position information measuring device 12, and executes a process of collecting analysis data including image data and position data. Further, the processing unit 14B executes a process related to data input / output via the data input / output unit 13. The processing unit 14B executes, for example, a process of outputting analysis data to the analysis device 20 via the data input / output unit 13.
 解析装置20は、記録装置10によって収集された解析用データを解析し、解析結果を表す解析結果データをクライアント端末CLに提供するものである。解析装置20、及び、クライアント端末CLは、ネットワーク上に実装されるいわゆるクラウドサービス型の装置(クラウドサーバ)を構成してもよいし、ネットワークから切り離されたいわゆるスタンドアローン型の装置を構成してもよい。本実施形態の解析装置20は、記録装置10によって収集された解析用データに基づいて、移動体Vの移動経路Rにおける特定地点Pの通行人数を計数する。ここでは、解析装置20は、複数の記録装置10によって収集された解析用データに基づいて、当該特定地点Pの通行人数を計数する。また、本実施形態の解析装置20は、解析用データに基づいて、複数の特定地点Pの通行人数を計数し、当該複数の特定地点Pの通行人数を集約して、移動体Vの移動経路R、例えば、特定の路線における通行人数を計数する。さらに、本実施形態の解析装置20は、解析用データに基づいて、特定地点Pごとに、あるいは、移動経路Rごとに、画像データが表す画像に含まれる人物の属性を解析する。そして、本実施形態の解析装置20は、通行人数の計数結果や人物の属性の解析結果等に基づく解析結果データを生成し、当該解析結果データをクライアント端末CLに提供する。 The analysis device 20 analyzes the analysis data collected by the recording device 10 and provides the analysis result data representing the analysis result to the client terminal CL. The analysis device 20 and the client terminal CL may constitute a so-called cloud service type device (cloud server) mounted on the network, or a so-called stand-alone type device separated from the network. May be good. The analysis device 20 of the present embodiment counts the number of people passing by the specific point P in the movement path R of the moving body V based on the analysis data collected by the recording device 10. Here, the analysis device 20 counts the number of people passing by the specific point P based on the analysis data collected by the plurality of recording devices 10. Further, the analysis device 20 of the present embodiment counts the number of people passing by the plurality of specific points P based on the analysis data, aggregates the number of people passing by the plurality of specific points P, and aggregates the number of people passing by the plurality of specific points P to move the moving body V. R, for example, counts the number of people passing on a particular line. Further, the analysis device 20 of the present embodiment analyzes the attributes of the person included in the image represented by the image data for each specific point P or for each movement path R based on the analysis data. Then, the analysis device 20 of the present embodiment generates analysis result data based on the counting result of the number of passing people, the analysis result of the attribute of the person, and the like, and provides the analysis result data to the client terminal CL.
 解析装置20は、解析用データに基づいて、各種の通行人数を計数するための種々の演算処理を実行する。また、解析装置20は、解析用データに基づいて、人物の属性を解析するための種々の演算処理を実行する。解析装置20は、CPU、GPU等の中央演算処理装置、ROM、RAM、及び、インターフェースを含む周知のマイクロコンピュータを主体とする電子回路を含んで構成される。解析装置20は、既知のPCやワークステーションなどのコンピュータシステムに下記で説明する種々の処理を実現させるアプリケーションをインストールすることで構成することもできる。また、解析装置20は、複数のPCを相互通信可能に組み合わせることで構成されてもよい。 The analysis device 20 executes various arithmetic processes for counting various passers-by based on the analysis data. Further, the analysis device 20 executes various arithmetic processes for analyzing the attributes of a person based on the analysis data. The analysis device 20 includes a central processing unit such as a CPU and a GPU, a ROM, a RAM, and an electronic circuit mainly composed of a well-known microcomputer including an interface. The analysis device 20 can also be configured by installing an application that realizes various processes described below on a known computer system such as a PC or workstation. Further, the analysis device 20 may be configured by combining a plurality of PCs so as to be able to communicate with each other.
 具体的には、解析装置20は、データ入出力部21と、記憶部22と、処理部23とを備える。データ入出力部21、記憶部22、及び、処理部23は、各部との間で相互に各種信号、データを授受可能である。 Specifically, the analysis device 20 includes a data input / output unit 21, a storage unit 22, and a processing unit 23. The data input / output unit 21, the storage unit 22, and the processing unit 23 can exchange various signals and data with each other.
 データ入出力部21は、解析装置20とは異なる機器と当該解析装置20との間で各種データを入出力するものである。本実施形態のデータ入出力部21は、解析装置20とは異なる機器である記録装置10から解析用データを入力可能である。さらに、本実施形態のデータ入出力部21は、解析装置20とは異なる機器であるクライアント端末CLに対して解析結果データを出力可能である。データ入出力部21は、データ入出力部13と同様に、例えば、ネットワークを介した通信(有線、無線を問わない)によって、解析装置20とは異なる機器との間でデータを入出力する構成であってもよい。同様に、データ入出力部21は、例えば、スロット部を有し当該スロット部に差し込まれた記録媒体を介して、解析装置20とは異なる機器との間でデータを入出力する構成であってもよい。 The data input / output unit 21 inputs / outputs various data between a device different from the analysis device 20 and the analysis device 20. The data input / output unit 21 of the present embodiment can input analysis data from a recording device 10 which is a device different from the analysis device 20. Further, the data input / output unit 21 of the present embodiment can output the analysis result data to the client terminal CL, which is a device different from the analysis device 20. Similar to the data input / output unit 13, the data input / output unit 21 has a configuration for inputting / outputting data to / from a device different from the analysis device 20 by, for example, communication via a network (whether wired or wireless). It may be. Similarly, the data input / output unit 21 has a configuration in which, for example, data is input / output to / from a device different from the analysis device 20 via a recording medium having a slot unit and inserted into the slot unit. May be good.
 記憶部22は、処理部23での各種処理に必要な条件や情報、処理部23で実行する各種プログラムやアプリケーション、制御データ等が格納されている。記憶部22は、データ入出力部21によって入力された解析用データを記憶することができる。記憶部22は、例えば、処理部23による処理の過程で生成される各種データを一時的に記憶することもできる。記憶部22は、データ入出力部21、処理部23等によってこれらのデータが必要に応じて読み出される。記憶部22は、例えば、ハードディスク、SSD、光ディスクなどの比較的に大容量の記憶装置、あるいは、RAM、フラッシュメモリ、NVSRAMなどのデータを書き換え可能な半導体メモリであってもよい。 The storage unit 22 stores conditions and information necessary for various processes in the processing unit 23, various programs and applications executed by the processing unit 23, control data, and the like. The storage unit 22 can store the analysis data input by the data input / output unit 21. The storage unit 22 can also temporarily store various data generated in the process of processing by the processing unit 23, for example. In the storage unit 22, these data are read out as needed by the data input / output unit 21, the processing unit 23, and the like. The storage unit 22 may be, for example, a relatively large-capacity storage device such as a hard disk, SSD, or optical disk, or a semiconductor memory such as RAM, flash memory, or NVSRAM that can rewrite data.
 より具体的には、記憶部22は、機能概念的に、解析対象データベース(以下、「解析対象DB」と略記する。)22A、解析参照データベース(以下、「解析参照DB」と略記する。)22B、及び、解析結果データベース(以下、「解析結果DB」と略記する。)22Cを含んで構成される。 More specifically, the storage unit 22 functionally conceptually includes an analysis target database (hereinafter abbreviated as "analysis target DB") 22A and an analysis reference database (hereinafter abbreviated as "analysis reference DB"). It includes 22B and an analysis result database (hereinafter, abbreviated as "analysis result DB") 22C.
 解析対象DB22Aは、処理部23による解析対象データである解析用データ(画像データ、位置データ、時刻データ等)を蓄積しデータベース化して記憶する部分である。記録装置10からデータ入出力部21に入力された解析用データは、この解析対象DB22Aに記憶される。 The analysis target DB 22A is a part that accumulates analysis data (image data, position data, time data, etc.) that is analysis target data by the processing unit 23, creates a database, and stores it. The analysis data input from the recording device 10 to the data input / output unit 21 is stored in the analysis target DB 22A.
 解析参照DB22Bは、処理部23による解析用データの解析の際に参照する解析参照データを蓄積しデータベース化して記憶する部分である。解析参照データは、例えば、地図参照データ、属性予測参照データ等を含む。地図参照データは、位置データ等に基づいて移動体Vの位置、言い換えれば、移動体Vの外部の画像が撮像された位置を特定する際に参照する地図を表すデータである。属性予測参照データは、画像データが表す画像に含まれる人物の属性の推定の際等に参照するデータである。属性予測参照データについては、後で詳細に説明する。解析参照データは、処理部23によって解析用データの解析の際に参照される。 The analysis reference DB 22B is a part that accumulates the analysis reference data to be referred to when the processing unit 23 analyzes the analysis data, creates a database, and stores the data. The analysis reference data includes, for example, map reference data, attribute prediction reference data, and the like. The map reference data is data representing a map to be referred to when specifying the position of the moving body V based on the position data or the like, in other words, the position where the image outside the moving body V is captured. The attribute prediction reference data is data to be referred to when estimating the attributes of a person included in the image represented by the image data. The attribute prediction reference data will be described in detail later. The analysis reference data is referred to by the processing unit 23 when analyzing the analysis data.
 解析結果DB22Cは、処理部23による解析用データの解析結果を表す解析結果データを蓄積しデータベース化して記憶する部分である。解析結果データは、例えば、特定地点Pの通行人数の計数結果(特定地点別計数データ)、複数の特定地点Pを含む移動経路R、例えば、特定の路線における通行人数の計数結果(特定経路別計数データ)、計数された人物の属性の解析結果(人物属性データ)等に基づくデータである。解析結果データは、処理部23によって所望の形式に加工されて、データ入出力部21からクライアント端末CLに出力、提供される。 The analysis result DB 22C is a part that accumulates the analysis result data representing the analysis result of the analysis data by the processing unit 23, creates a database, and stores it. The analysis result data includes, for example, a counting result of the number of people passing by a specific point P (counting data by specific point), a movement route R including a plurality of specific points P, for example, a counting result of the number of people passing by a specific route (by a specific route). It is data based on the count data), the analysis result of the attribute of the counted person (person attribute data), and the like. The analysis result data is processed into a desired format by the processing unit 23, output from the data input / output unit 21 to the client terminal CL, and provided.
 なお、解析対象DB22A、解析参照DB22B、解析結果DB22Cに記憶される各種データは、いわゆるビッグデータ(big data)として活用することができる。 The various data stored in the analysis target DB 22A, the analysis reference DB 22B, and the analysis result DB 22C can be utilized as so-called big data (big data).
 処理部23は、各種入力信号等に基づいて、記憶部22に記憶されている各種プログラムを実行し、当該プログラムが動作することにより解析用データを解析するための種々の処理を実行する。また、処理部23は、解析結果データを所望の形式に加工する処理を実行する。また、処理部23は、データ入出力部21を介したデータの入出力に関わる処理を実行する。処理部23は、例えば、所望の形式に加工された解析結果データを、データ入出力部21を介してクライアント端末CLに出力する処理を実行する。 The processing unit 23 executes various programs stored in the storage unit 22 based on various input signals and the like, and executes various processes for analyzing analysis data when the programs operate. In addition, the processing unit 23 executes a process of processing the analysis result data into a desired format. Further, the processing unit 23 executes a process related to data input / output via the data input / output unit 21. The processing unit 23 executes, for example, a process of outputting the analysis result data processed into a desired format to the client terminal CL via the data input / output unit 21.
 より具体的には、処理部23は、機能概念的に、データ前処理部23A、データ解析処理部23B、及び、データ加工処理部23Cを含んで構成される。 More specifically, the processing unit 23 is functionally conceptually configured to include a data preprocessing unit 23A, a data analysis processing unit 23B, and a data processing processing unit 23C.
 データ前処理部23Aは、解析対象データである解析用データに対して種々の前処理を施す部分である。データ前処理部23Aは、前処理として、例えば、解析対象DB22Aから解析対象データとなる解析用データを読み出し、当該解析用データの画像データが表す動画像から各フレーム毎の静止画像を切り出す処理を実行する。また、データ前処理部23Aは、前処理として、例えば、切り出した当該静止画像と、当該解析用データの位置データが表す位置と、当該解析用データの時刻データが表す時刻とを紐付する処理を実行する。 The data preprocessing unit 23A is a portion that performs various preprocessing on the analysis data that is the analysis target data. As preprocessing, the data preprocessing unit 23A reads, for example, analysis data to be analysis target data from analysis target DB 22A, and cuts out a still image for each frame from the moving image represented by the image data of the analysis target data. Execute. Further, as preprocessing, the data preprocessing unit 23A performs, for example, a process of associating the cut out still image with the position represented by the position data of the analysis data and the time represented by the time data of the analysis data. Execute.
 データ解析処理部23Bは、データ前処理部23Aによって前処理が施された解析用データに基づいて、通行人数の計数を行う部分である。データ解析処理部23Bは、当該解析用データに含まれる画像データに基づいて、任意に設定された特定地点Pにおける通行人数を計数する。ここでは、データ解析処理部23Bは、画像データが表す画像に含まれる人物の数に基づいて通行人数を計数する。データ解析処理部23Bは、種々の公知の画像処理技術を用いて、画像データに基づいてデータ前処理部23Aによって切り出された静止画像から人物を検出し抽出する処理を実行する。そして、データ解析処理部23Bは、検出、抽出した人物の数を計数し、当該計数した人物の数に基づいて、特定地点Pにおける通行人数を計数する。 The data analysis processing unit 23B is a part that counts the number of passing people based on the analysis data preprocessed by the data preprocessing unit 23A. The data analysis processing unit 23B counts the number of people passing by at a specific point P arbitrarily set based on the image data included in the analysis data. Here, the data analysis processing unit 23B counts the number of people passing by based on the number of people included in the image represented by the image data. The data analysis processing unit 23B executes a process of detecting and extracting a person from a still image cut out by the data preprocessing unit 23A based on the image data by using various known image processing techniques. Then, the data analysis processing unit 23B counts the number of detected and extracted persons, and counts the number of people passing by at the specific point P based on the counted number of persons.
 以下、図3~図8を参照して、データ解析処理部23Bによる特定地点Pにおける通行人数の計数について詳細に説明する。 Hereinafter, with reference to FIGS. 3 to 8, the counting of the number of people passing by at the specific point P by the data analysis processing unit 23B will be described in detail.
 具体的には、データ解析処理部23Bは、データ前処理部23Aによって前処理が施された解析用データの位置データと、解析参照DB22Bに記憶されている地図参照データ(解析参照データ)とに基づいて、例えば、図3に示すように、特定地点Pの画像を含む画像データを抽出する。より詳細には、データ解析処理部23Bは、解析対象DB22Aに記憶されている複数の解析用データから、特定地点Pを含む所定の範囲の領域を撮像した動画像の画像データを抽出する。このとき、データ解析処理部23Bは、解析対象DB22Aから、当該抽出した画像データに紐付された時刻データも読み出し、当該抽出した画像データが収集された時刻を特定する。つまり、ここで抽出される画像データは、特定の期間(例えば、時刻Aから時刻Bまでの期間)において、特定地点Pを含む所定の範囲の領域を撮像した動画像を表すデータである。当該画像データが表す動画像は、特定地点Pを含む所定の範囲の領域を撮像した複数フレームの静止画像によって構成されている。 Specifically, the data analysis processing unit 23B divides the position data of the analysis data preprocessed by the data preprocessing unit 23A and the map reference data (analysis reference data) stored in the analysis reference DB 22B. Based on this, for example, as shown in FIG. 3, image data including an image of a specific point P is extracted. More specifically, the data analysis processing unit 23B extracts image data of a moving image obtained by capturing a region in a predetermined range including a specific point P from a plurality of analysis data stored in the analysis target DB 22A. At this time, the data analysis processing unit 23B also reads the time data associated with the extracted image data from the analysis target DB 22A, and specifies the time when the extracted image data is collected. That is, the image data extracted here is data representing a moving image obtained by capturing a region in a predetermined range including a specific point P in a specific period (for example, a period from time A to time B). The moving image represented by the image data is composed of a plurality of frames of still images obtained by capturing a region in a predetermined range including a specific point P.
 The data analysis processing unit 23B of the present embodiment extracts all image data of moving images including the specific point P as described above from the plurality of analysis data sets collected by the plurality of recording devices 10 mounted on the plurality of moving bodies V. The data analysis processing unit 23B also extracts all such image data, including analysis data collected at different dates and times. Then, using various known image processing techniques as described above, the data analysis processing unit 23B counts and aggregates the number of persons included in the still images cut out from the moving images represented by the extracted image data, and thereby counts the number of passersby at the specific point P. Here, a person included in the image represented by the image data corresponds to a passerby at the specific point P, and includes pedestrians, persons riding bicycles, and the like.
 At this time, the data analysis processing unit 23B of the present embodiment executes the following processing on the plurality of still images constituting the moving image of the extracted image data, in order to avoid, for example, counting the same person more than once. That is, the data analysis processing unit 23B here counts the number of persons included in one still image constituting the image data as the virtual-period estimated number of passersby n, that is, the number counted during a preset virtual counting period t. The data analysis processing unit 23B then counts the number of passersby at the specific point P based on the virtual-period estimated number of passersby n.
 Here, the preset virtual counting period t is typically a value that defines how many seconds' worth of passersby the number of persons included in one still image would correspond to if an attendant actually performed a fixed-point measurement at the specific point P using a counter or the like. Put differently, as illustrated in FIG. 4, for example, the virtual counting period t corresponds to the period required to measure and count, by fixed-point measurement at the specific point P, the same number of persons as are included in one still image, assuming that each person moves at a presumed passerby speed v. Here, this required period can be regarded as approximately equal to the period required for the person appearing at the far edge of the still image to reach the specific point P when moving toward it at the presumed passerby speed v. That is, as one example, the virtual counting period t can be approximated by the period required for the person appearing at the far edge of the still image, among the persons included in that image, to reach the specific point P when moving toward it at the presumed passerby speed v.
 The virtual counting period t is typically determined by the installation position of the external camera 11 that collects the image data, the camera angle of view, the presumed passerby speed v, and the like, regardless of the moving speed of the moving body V. For one still image constituting the image data, the range captured in real space is geometrically determined by the installation position of the external camera 11 on the moving body V, the camera angle of view, and so on, and the distance L from the position in real space corresponding to the far (depth-direction) edge of the still image to the specific point P can also be calculated geometrically. For example, in the still image illustrated in FIG. 4, assume that the distance L is 10 [m] and that the passerby speed v is about 5 [km/h], the average walking speed of an adult. In this case, the period required for the person appearing at the far edge of the still image illustrated in FIG. 4 to reach the specific point P when moving toward it at the presumed passerby speed v is 10 × 3600 / 5 / 1000 = 7.2 [sec]. Therefore, in this case, the period required for fixed-point measurement and counting at the specific point P of the same number of persons as are included in the still image illustrated in FIG. 4 is 7.2 [sec], and the virtual counting period t = 7.2 [sec].
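 The virtual counting period can be reproduced with a few lines of arithmetic; the sketch below simply encodes t = L / v with the worked values from FIG. 4 (the function name is an assumption for illustration).

def virtual_counting_period(depth_distance_m, walking_speed_kmh):
    # Virtual counting period t [sec]: time for a person at the far edge of the
    # still image (distance L from the specific point P) to reach P when walking
    # at the presumed passerby speed v.
    speed_m_per_s = walking_speed_kmh * 1000.0 / 3600.0
    return depth_distance_m / speed_m_per_s

# Worked example from the text: L = 10 m, v = 5 km/h  ->  t = 7.2 sec
print(virtual_counting_period(10.0, 5.0))  # 7.2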
 Then, the data analysis processing unit 23B of the present embodiment counts the number of persons included in one still image constituting the image data by treating it as the virtual-period estimated number of passersby n. The virtual-period estimated number of passersby n is the number of persons virtually counted during the virtual counting period t set in advance as described above. The data analysis processing unit 23B can obtain one such virtual-period estimated number of passersby n from the moving image constituting one set of image data including the specific point P.
 Here, as described above, the moving image represented by the image data is composed of a plurality of frames of still images capturing a predetermined range of area including the specific point P. For this reason, the data analysis processing unit 23B processes the moving image constituting one set of image data as follows in order to obtain one virtual-period estimated number of passersby n. That is, in this case, the data analysis processing unit 23B extracts an arbitrary single frame from the plurality of still-image frames constituting one moving image, as illustrated in FIG. 5, for example. The data analysis processing unit 23B extracts, for example, the still image at approximately the central time among the plurality of frames. The data analysis processing unit 23B then obtains one virtual-period estimated number of passersby n by counting n from the extracted still image. Alternatively, the data analysis processing unit 23B can obtain one virtual-period estimated number of passersby n by, for example, first counting n for every still-image frame constituting the moving image and then calculating the average of those values.
 The data analysis processing unit 23B treats the one virtual-period estimated number of passersby n obtained from one set of image data including the specific point P as described above as, for example, the number of passersby counted at the specific point P at approximately the central time of the period during which that image data was captured. For example, if the virtual-period estimated number of passersby n was obtained from image data captured during [9:00 to 9:02], the data analysis processing unit 23B treats that n as the number of passersby counted at the specific point P at [9:01]. For example, when the virtual counting period t = 7.2 [sec] and the virtual-period estimated number of passersby n = 15 [persons], the data analysis processing unit 23B counts the number of passersby at the specific point P by treating 15 [persons] as having passed the specific point P during 7.2 [sec] at [9:01], as illustrated in FIG. 6.
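 A sketch of how one virtual-period estimated number of passersby n might be derived from the frames of a single pass and assigned to a representative time, covering both the "central frame" and "average over all frames" variants described above. The timestamps, the per-frame counts, and the function name are illustrative assumptions.

def estimate_n_for_pass(timestamps, per_frame_counts, strategy="center"):
    # Return (representative_time, n) for one pass of the moving body V past the
    # specific point P.  "center" uses the count of the middle frame; "average"
    # averages the counts of all frames.
    if strategy == "center":
        n = per_frame_counts[len(per_frame_counts) // 2]
    else:
        n = sum(per_frame_counts) / len(per_frame_counts)
    # Treat n as the count at approximately the central time of the capture period.
    representative_time = timestamps[0] + (timestamps[-1] - timestamps[0]) / 2
    return representative_time, n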
 Note that although the data analysis processing unit 23B is described here as treating the virtual-period estimated number of passersby n as the number of passersby counted at the specific point P at approximately the central time of the period during which the image data was captured, this is not limiting. As long as the time falls within that period, the data analysis processing unit 23B may, for example, treat the virtual-period estimated number of passersby n as the number of passersby counted at the specific point P at the start time of the period, or as the number counted at the end time of the period.
 One virtual-period estimated number of passersby n as described above is typically obtained each time a moving body V equipped with the recording device 10 passes the specific point P once. The data analysis processing unit 23B calculates the virtual-period estimated number of passersby n for each specific point P based on the plurality of analysis data sets collected by the plurality of recording devices 10 mounted on the plurality of moving bodies V, and thereby acquires a plurality of values of n for each specific point P.
 Then, based on the virtual-period estimated numbers of passersby n counted at their respective times as described above, the data analysis processing unit 23B calculates the number of passersby N at the specific point P for the target time band for which a count is desired. In this case, the data analysis processing unit 23B sets a unit counting period T in advance and calculates the number of passersby N at the specific point P during that unit counting period T in the target time band. Here, the unit counting period T is the minimum counting period used as the reference for counting the number of passersby at the specific point P, and is set arbitrarily as desired as the unit period into which the count of passersby is divided.
 In this case, the data analysis processing unit 23B calculates the average value n_ave of all the virtual-period estimated numbers of passersby n obtained during the unit counting period T of the target time band. The data analysis processing unit 23B can then calculate the number of passersby N at the specific point P during the unit counting period T of the target time band by multiplying this average value n_ave by the value obtained by dividing the unit counting period T by the virtual counting period t. That is, where T is the unit counting period, t is the virtual counting period, n_ave is the average of all the virtual-period estimated numbers of passersby n obtained during the unit counting period T of the target time band, and N is the number of passersby at the specific point P during the unit counting period T of the target time band, the data analysis processing unit 23B can calculate the number of passersby N using the following formula (1).

 N = n_ave × [T / t] ... (1)
 A concrete case is described with reference to FIG. 7. FIG. 7 illustrates a case where the target time band is [5:00 to 6:00], the unit counting period T = 1 [h], the virtual counting period t = 7.2 [sec], and the average of the virtual-period estimated numbers of passersby n_ave = 10 [persons]. The plurality of symbols/bars 100 in FIG. 7 each represent a virtual-period estimated number of passersby n obtained during the unit counting period T of the target time band. In this case, the number of passersby N at the specific point P during the unit counting period T = 1 [h] of the target time band [5:00 to 6:00] is 10 × [1 × 3600 / 7.2] = 5000 [persons].
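 Formula (1) and the FIG. 7 example reduce to the following sketch (the function name is assumed for illustration; both periods are expressed in seconds).

def passersby_in_unit_period(n_values, unit_period_s, virtual_period_s):
    # Formula (1): N = n_ave * (T / t), where n_values are all the virtual-period
    # estimated counts n obtained in the unit counting period T.
    n_ave = sum(n_values) / len(n_values)
    return n_ave * (unit_period_s / virtual_period_s)

# Worked example from FIG. 7: T = 1 h, t = 7.2 s, n_ave = 10 persons
print(passersby_in_unit_period([10, 10, 10], 3600.0, 7.2))  # 5000.0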
 Then, the data analysis processing unit 23B calculates the number of passersby N at the specific point P during the unit counting period T for each target time band and aggregates the results in a predetermined unit, for example, per day; a sketch of this step follows below. FIG. 8 shows an example in which the numbers of passersby N at the specific point P during the unit counting period T of each target time band are aggregated per day. The plurality of symbols/bars 200 in FIG. 8 represent the number of passersby N at the specific point P during the unit counting period T of each target time band, and the symbols 300 each represent a virtual-period estimated number of passersby n obtained during the unit counting period T of that time band. The accuracy of the number of passersby N at the specific point P during the unit counting period T of each target time band tends to improve as the number of virtual-period estimated values n obtained during that unit counting period T increases. Conversely, if a large number of values of n can be obtained evenly in every time band, the number of passersby N can maintain relatively high accuracy even when the unit counting period T is made relatively short, enabling more detailed passerby analysis over shorter periods.
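 One possible way to perform the per-time-band calculation and the daily aggregation described above, assuming each sample is a (day, hour, n) tuple; the data layout and function name are assumptions made for illustration.

from collections import defaultdict

def aggregate_passersby(samples, unit_period_s, virtual_period_s):
    # samples: list of (day, hour, n) tuples, where n is one virtual-period
    # estimated count.  Returns {(day, hour): N} using formula (1) per time band,
    # plus {day: total N} as the per-day aggregate mentioned in the text.
    by_band = defaultdict(list)
    for day, hour, n in samples:
        by_band[(day, hour)].append(n)
    n_by_band = {band: (sum(v) / len(v)) * (unit_period_s / virtual_period_s)
                 for band, v in by_band.items()}
    n_by_day = defaultdict(float)
    for (day, _hour), big_n in n_by_band.items():
        n_by_day[day] += big_n
    return n_by_band, dict(n_by_day)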
 Note that when aggregating the number of passersby N, the data analysis processing unit 23B is not limited to the per-day aggregation described above, and can aggregate in any desired unit, such as per week, per month, or per day of the week.
 The data analysis processing unit 23B of the present embodiment then performs the processing of counting the number of passersby at the specific point P as described above for each of the plurality of set specific points P. As analysis result data obtained by analyzing the analysis data, the data analysis processing unit 23B generates point-specific count data representing the counting results of the numbers of passersby at the respective specific points P counted as described above. The point-specific count data includes the number of passersby N at each specific point P during the unit counting period T of each target time band calculated as described above, as well as various information calculated in the process of calculating N. The data analysis processing unit 23B then accumulates the analysis result data including the generated point-specific count data in the analysis result DB 22C and stores it as a database.
 The data analysis processing unit 23B of the present embodiment further counts the numbers of passersby at a plurality of specific points P and aggregates them to count the number of passersby on a movement route R. In this case, the data analysis processing unit 23B aggregates the numbers of passersby at the specific points P included in the movement route R (for example, a particular bus route) for which a passerby count is desired, and thereby counts the number of passersby on that movement route R. As analysis result data obtained by analyzing the analysis data, the data analysis processing unit 23B then generates route-specific count data representing the counting results of the numbers of passersby on the respective movement routes R counted as described above. The route-specific count data includes the aggregated numbers of passersby obtained from the numbers of passersby N at the respective specific points P during the unit counting period T of each target time band, as well as various information calculated in the process of calculating N. The data analysis processing unit 23B then accumulates the analysis result data including the generated route-specific count data in the analysis result DB 22C and stores it as a database.
 Furthermore, the data analysis processing unit 23B of the present embodiment is also a portion that analyzes the attributes of persons included in the images represented by the image data, based on the analysis data preprocessed by the data preprocessing unit 23A. Typically, the data analysis processing unit 23B analyzes the attributes of the persons counted as described above.
 The data analysis processing unit 23B analyzes the attributes of the persons detected and extracted from the still images cut out from the image data by the data preprocessing unit 23A. The data analysis processing unit 23B typically analyzes, for each of the specific points P described above, the attributes of the persons included in the images represented by the image data. Furthermore, the data analysis processing unit 23B can also analyze the attributes of persons included in the images represented by the image data for each movement route R obtained by grouping a plurality of specific points P.
 In this case, the data analysis processing unit 23B extracts, for example, the analysis data for each specific point P from the analysis data collected by the plurality of recording devices 10, based on the position data and the like. By analyzing the attributes of the persons included in the images represented by the image data based on the extracted analysis data, the data analysis processing unit 23B analyzes, for each of the plurality of specific points P, the attributes of the persons who were located in the vicinity of that specific point P and counted as described above. The attribute analysis of persons for each movement route R is substantially the same as that for the specific points P. The data analysis processing unit 23B analyzes the attributes of persons for each of the plurality of specific points P and for each movement route R, based on the analysis data collected as the moving bodies V travel.
 Here, the data analysis processing unit 23B is configured to be able to execute processing for analyzing the attributes of the persons included in the images represented by the image data, and the flow of the persons whose attributes have been identified, using, for example, various known artificial intelligence (AI) techniques and deep learning techniques.
 Specifically, the data analysis processing unit 23B executes the process of detecting and extracting persons from the still images cut out by the data preprocessing unit 23A, as described above. The data analysis processing unit 23B of the present embodiment then executes a process of extracting, from the images represented by the image data, images including the feature points of the detected and extracted persons. Here, a feature point of a person is a part of the person included in the image from which the attributes of that person can be identified. Feature points of a person are, for example, parts such as the face, in which the person's expression appears, the hands and feet, in which gestures appear, and positions where accessories and the like tend to be worn. Since the image data of the present embodiment is collected by the recording devices 10 mounted on the moving bodies V as those moving bodies V travel, it is highly likely to include many images of the same person captured from different angles. Taking advantage of this, the data preprocessing unit 23A secures as much data as possible for identifying the attributes of a person by extracting, from the many images captured from different angles as the moving body V travels, images in which the feature points of that person usable for attribute identification are captured.
 Then, based on an image including the feature points of a person extracted from the image data, the data analysis processing unit 23B executes processing for analyzing the attributes of the person included in that image. The data analysis processing unit 23B analyzes the attributes of the person based on, for example, the attribute prediction reference data (analysis reference data) stored in the analysis reference DB 22B and the feature points of the person included in the image extracted from the image data. Here, the attribute prediction reference data is information reflecting the results of learning, by various methods using artificial intelligence techniques and deep learning techniques, the attributes of a person that can be estimated from the feature points and the like of the person included in an image. In other words, the attribute prediction reference data is data compiled into a database by various methods using artificial intelligence techniques and deep learning techniques in order to estimate the attributes of a person based on the feature points and the like of the person included in an image. This attribute prediction reference data can be updated successively. For the attribute prediction reference data, for example, the analysis result data (person attribute data) itself representing the analysis results by the data analysis processing unit 23B can also be used as data for learning.
 The attributes of a person analyzed by the data analysis processing unit 23B typically include matters that can be analyzed from the feature points of the person's appearance, for example, the person's gender, age, build, social status, tastes, or behavioral orientation. Here, gender is an attribute representing whether the person is male or female. Age is an attribute representing the length of time from birth to the present (that moment). Build is an attribute representing height, weight, various dimensions, and the like. Social status is an attribute representing occupation (self-employed, businessperson, police officer, student, unemployed, part-time worker), annual income, standing, companions, and the like. Tastes are attributes representing tendencies in clothing, belongings, and fashion (casual oriented, elegant oriented, brand oriented, luxury oriented, fast-fashion oriented), hobbies (sports, subculture, outdoor, beauty, etc.), and the like. Behavioral orientation is an attribute representing the person's mood and interests (what they want to do, where they want to go) at that time. That is, here the data analysis processing unit 23B estimates gender, age, build, social status, tastes, behavioral orientation, and the like as the attributes of a person.
 Referring to the attribute prediction reference data, the data analysis processing unit 23B extracts the attributes (gender, age, build, social status, tastes, or behavioral orientation) corresponding to the feature points of the person included in the image, and estimates the extracted attributes to be the attributes of the person captured in that image. The data analysis processing unit 23B refers to the attribute prediction reference data according to, for example, the facial expression, the gestures of the hands and feet, and the worn accessories and clothing, which are feature points of the person included in the image, matches attributes that fit those feature points, and thereby estimates attributes such as the person's gender, age, build, social status, tastes, and behavioral orientation.
 Furthermore, the data analysis processing unit 23B executes processing for analyzing the position and the like of the person whose attributes have been identified as described above, based on the position data associated with the image data in which the attributes of the person were identified. The data analysis processing unit 23B reads, for example, from the analysis target DB 22A, the position data associated with the image data in which the attributes of the person were identified. Then, based on the map reference data (analysis reference data) stored in the analysis reference DB 22B and the read position data, the data analysis processing unit 23B analyzes the position and the like of the person whose attributes have been identified. For example, the data analysis processing unit 23B refers to the map reference data and identifies, based on the position data, the position at which the image was captured. The data analysis processing unit 23B then identifies the position of the person whose attributes have been identified, based on the position represented by that position data.
 As analysis result data obtained by analyzing the analysis data, the data analysis processing unit 23B generates, as described above, person attribute data representing the attributes of persons analyzed for each specific point P and attribute-specific position data representing the positions of the persons whose attributes have been identified. The data analysis processing unit 23B also generates person attribute data representing the attributes of persons analyzed for each movement route R obtained by grouping a plurality of specific points P, and attribute-specific position data representing the positions of the persons whose attributes have been identified. The data analysis processing unit 23B then accumulates the analysis result data including the generated person attribute data and attribute-specific position data in the analysis result DB 22C and stores it as a database.
 The data analysis processing unit 23B of the present embodiment may further be configured to be able to execute processing for generating, as analysis result data, commercial use data based on the point-specific count data, the route-specific count data, the person attribute data, the attribute-specific position data, and the like. Specifically, the data analysis processing unit 23B of the present embodiment can calculate, for example, an index representing the number of passers through the content-receivable range at the specific point P. Here, the content-receivable range at the specific point P is the spatial range within which a person at the specific point P can receive the content output by the output device D, and is determined by the visible range within which a person can view the image displayed by the output device D, the audible range within which a person can hear the sound and voice output by the output device D, and the like.
 Based on, for example, the point-specific count data representing the number of passersby at the specific point P, the data analysis processing unit 23B calculates, for each specific point P near which an output device D exists, an index representing the number of passers through the content-receivable range. In addition, the data analysis processing unit 23B may calculate the index for each of a plurality of movement routes R (for each of a plurality of bus routes) based on the route-specific count data. The data analysis processing unit 23B then generates commercial use data representing the index, accumulates the analysis result data including the generated commercial use data in the analysis result DB 22C, and stores it as a database.
 Here, the number of passers through the above-described content-receivable range can typically be regarded as the number of people who received the content output at the specific point P. At a specific point P near which an output device D exists, the content-receivable range can typically be regarded as the region in the vicinity of that specific point P. For this reason, the number of passers through the content-receivable range at the specific point P can be regarded as substantially the same as the number of passersby at that specific point P.
 Based on the above, the data analysis processing unit 23B of the present embodiment takes the number of passersby at the specific point P as the number of passers through the content-receivable range. That is, here the data analysis processing unit 23B takes the number of passersby at each specific point P represented by the point-specific count data as the number of passers through the content-receivable range at that specific point P. Similarly, the data analysis processing unit 23B takes the number of passersby on each movement route R represented by the route-specific count data as the number of passers through the content-receivable range for that movement route R.
 The data analysis processing unit 23B may use the number of passers through the content-receivable range itself as the index representing the number of passers, or may calculate an index representing the number of passers based on that number. Examples of indices representing the number of passers calculated by the data analysis processing unit 23B include "DEC: Daily Effective Circulation" and "GRP: Gross Rating Point". Both "DEC" and "GRP" are indices representing the effectiveness of advertising. "DEC" is typically the number of passers per day who pass through the content-receivable range (visible range) of the target advertisement. "DEC" may be the number of passers limited to people who satisfy a predetermined age restriction, for example 18 years of age or older, or may be the number of passers for all people without any age restriction. "GRP" is typically the ratio of the number of passers per day through the above content-receivable range to the target population within the area that the target advertisement can reach in a day. "GRP" can be expressed as [DEC / target population within the target area]. When an age restriction is applied to the target of "DEC", the "target population within the target area" is the population within the target area that satisfies that age restriction.
 Based on the number of passers through the content-receivable range at each specific point P, the data analysis processing unit 23B can calculate "DEC" and "GRP" for each specific point P as indices representing the number of passers. Similarly, based on the number of passers through the content-receivable range for each movement route R, the data analysis processing unit 23B can calculate "DEC" and "GRP" for each movement route R as indices representing the number of passers.
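 Under the definitions above, DEC and GRP reduce to simple arithmetic. The sketch below uses the daily passersby count as DEC, following the approximation in the text, and divides by a hypothetical target population; the function names and all figures are illustrative assumptions.

def daily_effective_circulation(daily_passersby):
    # DEC: passers per day through the content-receivable range; here taken as the
    # daily number of passersby at the specific point P, per the approximation above.
    return daily_passersby

def gross_rating_point(dec, target_population):
    # GRP = DEC / target population within the reachable area, expressed as a ratio
    # (multiply by 100 if a percentage is preferred).
    return dec / target_population

# Hypothetical figures for illustration only.
dec = daily_effective_circulation(5000)
print(gross_rating_point(dec, 200000))  # 0.025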
 Then, as indices representing the number of passers through the content-receivable range, the data analysis processing unit 23B can generate commercial use data representing "DEC" and "GRP" for each specific point P and for each movement route R, accumulate the analysis result data including the generated commercial use data in the analysis result DB 22C, and store it as a database.
 The data processing unit 23C is a portion that executes processing for converting the analysis result data produced by the data analysis processing unit 23B as described above into a desired format. The data processing unit 23C converts the point-specific count data, route-specific count data, person attribute data, attribute-specific people-flow data, commercial use data, and the like included in the analysis result data into a desired format. As illustrated in FIG. 10, for example, the data processing unit 23C converts the analysis result data into maps plotting when, at which specific point P on which movement route (bus route) R, how many people of what attributes were present, together with "DEC" and "GRP" for each specific point P and "DEC" and "GRP" for each movement route R, as well as into various graphs, diagrams, and the like. The processing unit 23 then executes processing for outputting the analysis result data converted into the desired format by the data processing unit 23C to the client terminal CL via the data input/output unit 21. The client terminal CL is a terminal that makes the analysis result data provided by the analysis device 20 available for various uses, for example, trade area surveys, marketing, advertising, material for determining advertising fees, and disaster prevention and city planning. The client terminal CL is constituted by, for example, a notebook PC, a desktop PC, a tablet PC, a smartphone, a mobile terminal, or the like.
 Next, an example of processing in the analysis system 1 will be described with reference to the flowchart in FIG. 11.
 First, the plurality of recording devices 10 mounted on the plurality of moving bodies V collect analysis data including image data and position data as the moving bodies V travel (step S1).
 Next, each recording device 10 outputs the collected analysis data via its data input/output unit 13, and the data is input to the analysis device 20 via the data input/output unit 21 of the analysis device 20 (step S2). The analysis data input to the analysis device 20 is stored in the analysis target DB 22A.
 Next, the data preprocessing unit 23A of the analysis device 20 performs the various kinds of preprocessing described above on the analysis data stored in the analysis target DB 22A (step S3).
 Next, the data analysis processing unit 23B of the analysis device 20 performs analysis based on the analysis data preprocessed by the data preprocessing unit 23A and generates, as analysis result data, point-specific count data, route-specific count data, person attribute data, attribute-specific people-flow data, commercial use data, and the like (step S4).
 The data analysis processing unit 23B then accumulates the generated analysis result data, such as the point-specific count data, route-specific count data, person attribute data, attribute-specific people-flow data, and commercial use data, in the analysis result DB 22C and stores it as a database (step S5).
 Next, in response to a request from the client terminal CL or the like, the data processing unit 23C of the analysis device 20 converts the analysis result data stored in the analysis result DB 22C (point-specific count data, route-specific count data, person attribute data, attribute-specific people-flow data, commercial use data, and the like) into a desired format as illustrated in FIG. 10 (step S6).
 The processing unit 23 of the analysis device 20 then outputs and provides the analysis result data converted into the desired format by the data processing unit 23C to the client terminal CL via the data input/output unit 21 (step S7), and the series of processes ends.
 The analysis system 1 described above can collect analysis data including image data and position data by means of the recording devices 10 mounted on the moving bodies V. The image data collected as analysis data represents images outside the moving bodies V captured as those moving bodies V travel. The analysis device 20 can then count the number of passersby at a specific point P on the movement route R of the moving bodies V based on the analysis data collected by the plurality of recording devices 10. As a result, this analysis system 1 can appropriately analyze the tendency of people flow for each specific point P on the movement route R along which the moving bodies V travel. The analysis system 1 can then make the tendency of people flow for each specific point P on the movement route R, analyzed as described above, usable for various purposes such as trade area surveys, marketing, advertising, material for determining advertising fees, and disaster prevention and city planning. Moreover, compared with manually counting passersby, for example by visual observation, the analysis system 1 can reduce the counting workload and greatly increase the frequency of counting itself. As a result, the analysis system 1 can analyze the tendency of people flow more accurately over a long period of time.
 Here, in the analysis system 1 described above, the analysis device 20 extracts, based on the position data, image data including images of the specific point P from the analysis data, and counts and aggregates the number of persons included in the images represented by the extracted image data to count the number of passersby at the specific point P. Compared with manually counting passersby at the specific point P, for example by visual observation, the analysis system 1 can thereby reduce the counting workload and greatly increase the frequency of counting itself. Consequently, the analysis system 1 can analyze the tendency of people flow more accurately.
 More specifically, in the analysis system 1 described above, the analysis device 20 counts the number of persons included in one still image constituting the image data as the virtual-period estimated number of passersby n counted during the preset virtual counting period t. The analysis system 1 then counts the number of passersby at the specific point P based on the virtual-period estimated number of passersby n. This allows the analysis system 1 to suppress duplicate counting of the same person across a plurality of still images without using, for example, conventional tracking algorithms, so that the tendency of people flow can be analyzed more accurately.
 The analysis system 1 described above also aggregates the numbers of passersby at a plurality of specific points P counted by the analysis device 20 to count the number of passersby on the movement route R. As a result, this analysis system 1 can appropriately analyze the tendency of people flow for each movement route R along which the moving bodies V travel, and can make it usable for various purposes. The analysis system 1 can also make the tendency of people flow for each movement route R usable, for example, for analyzing the advertising effectiveness of moving bodies such as wrapped buses.
 The analysis system 1 described above also counts the numbers of passersby at the specific points P and on the movement routes R based on the analysis data collected by the plurality of recording devices 10 mounted on the plurality of moving bodies V. Even in a case where, for example, to improve vehicle allocation efficiency, a single moving body V travels multiple routes (movement routes R) in one day while multiple moving bodies V are used interchangeably across multiple routes, this analysis system 1 can efficiently collect analysis data relating to a specific point P from all the moving bodies V that have passed that specific point P. As a result, the analysis system 1 can collect more analysis data and use it for analysis, and can therefore analyze the tendency of people flow for each specific point P and each movement route R more accurately.
 The analysis system 1 described above further analyzes, by means of the analysis device 20 and based on the analysis data, the attributes of persons included in the images represented by the image data for each specific point P and for each movement route R. As a result, as the tendency of people flow for each specific point P and each movement route R, this analysis system 1 can analyze not only the number of passersby but also the attributes of the persons counted as passersby. This allows the analysis system 1 to reveal, for each specific point P and each movement route R, not only the number of passersby but also the attribute tendencies of the persons counted as those passersby, and conversely makes it easy to identify specific points P, movement routes R, and so on where many passersby have a desired attribute tendency. As a result, the analysis system 1 can make the tendency of people flow for each specific point P and each movement route R even more suitable for use in the various purposes described above.
 As one example, the analysis system 1 described above calculates, based on the number of passersby for each specific point P and each movement route R, an index representing the number of passers through the content-receivable range for each specific point P and each movement route R. As a result, the analysis system 1 can make the index for each specific point P and each movement route R suitable for use as, for example, material for determining the usage fee (advertising fee, etc.) for content output from the output device D. The analysis system 1 can also make the attribute tendencies of passersby for each specific point P and each movement route R suitable for use as, for example, material for determining the content to be output from the output device D.
 なお、上述した本発明の実施形態に係る解析システムは、上述した実施形態に限定されず、特許請求の範囲に記載された範囲で種々の変更が可能である。 The analysis system according to the embodiment of the present invention described above is not limited to the embodiment described above, and various modifications can be made within the scope described in the claims.
 以上の説明では、記録装置10が搭載される移動体Vは、所定の移動経路Rを移動するものであり、例えば、予め定められた複数の路線を移動可能なものであるものとして説明したがこれに限らない。すなわち、当該移動体Vは、一日の間に予め定められた複数の路線を繰り返し走行する路線バスであるものとして説明したがこれに限らない。また、記録装置10は、複数の路線を移動する複数の移動体Vにそれぞれ搭載されるものとして説明したがこれに限らない。例えば、記録装置10は、1台の乗用車に搭載されるだけであってもよい。つまり、解析装置20は、複数の記録装置10によって収集された解析用データに基づいて、特定地点Pの通行人数を計数するものとして説明したがこれに限らない。 In the above description, the moving body V on which the recording device 10 is mounted moves on a predetermined movement path R, and is described as being capable of moving on a plurality of predetermined routes, for example. Not limited to this. That is, the moving body V has been described as being a fixed-route bus that repeatedly travels on a plurality of predetermined routes during the day, but the present invention is not limited to this. Further, the recording device 10 has been described as being mounted on each of a plurality of moving bodies V moving on a plurality of routes, but the present invention is not limited to this. For example, the recording device 10 may only be mounted on one passenger car. That is, the analysis device 20 has been described as counting the number of people passing by the specific point P based on the analysis data collected by the plurality of recording devices 10, but the present invention is not limited to this.
 以上の説明では、解析装置20は、特定地点Pの通行人数と共に、移動経路Rの通行人数も計数するものとして説明したがこれに限らない。また、解析装置20は、解析用データに基づいて、特定地点Pごと、移動経路Rごとに、画像データが表す画像に含まれる人物の属性を解析するものとして説明したがこれに限らない。解析装置20は、少なくとも特定地点Pの通行人数を計数するものであればよい。 In the above description, the analysis device 20 has been described as counting the number of people passing by the movement route R as well as the number of people passing by the specific point P, but the present invention is not limited to this. Further, the analysis device 20 has been described as analyzing the attributes of a person included in the image represented by the image data for each specific point P and each movement path R based on the analysis data, but the present invention is not limited to this. The analysis device 20 may at least count the number of people passing by at the specific point P.
 以上の説明では、解析装置20は、画像データを構成する1つの静止画像に含まれる人物の数を、予め設定される仮想計数期間tにおいて計数される仮想期間推定通行人数nであるものとして計数し、当該仮想期間推定通行人数nに基づいて、特定地点Pの通行人数を計数するものとして説明したがこれに限らない。 In the above description, the analysis device 20 counts the number of people included in one still image constituting the image data as assuming that it is a virtual period estimated number of passersby n counted in a preset virtual counting period t. However, the description is made assuming that the number of passersby at the specific point P is counted based on the estimated number of passersby n in the virtual period, but the present invention is not limited to this.
 以上で説明した制御部14、解析装置20は、各部が別体に構成され、当該各部が各種の電気信号を相互に授受可能に接続されることで構成されてもよく、一部の機能が他の制御装置によって実現されてもよい。また、以上で説明したプログラム、アプリケーション、各種データ等は、適宜、更新されてもよいし、解析システム1に対して任意のネットワークを介して接続されたサーバに記憶されていてもよい。以上で説明したプログラム、アプリケーション、各種データ等は、例えば、必要に応じてその全部又は一部をダウンロードすることも可能である。また、例えば、制御部14、解析装置20が備える処理機能については、その全部又は任意の一部を、例えば、CPU等及び当該CPU等にて解釈実行されるプログラムにて実現してもよく、また、ワイヤードロジック等によるハードウェアとして実現してもよい。 The control unit 14 and the analysis device 20 described above may be configured such that each unit is separately configured and the respective units are connected to each other so that various electric signals can be exchanged with each other. It may be realized by other control devices. Further, the programs, applications, various data and the like described above may be updated as appropriate, or may be stored in a server connected to the analysis system 1 via an arbitrary network. The programs, applications, various data, and the like described above can be downloaded in whole or in part as needed. Further, for example, with respect to the processing functions provided in the control unit 14 and the analysis device 20, all or any part thereof may be realized by, for example, a CPU or a program interpreted and executed by the CPU or the like. Further, it may be realized as hardware by wired logic or the like.
 For example, in the analysis system 1, some of the functions of the data preprocessing unit 23A and the data analysis processing unit 23B of the analysis device 20 may be provided on the side of each recording device 10. For example, the analysis system 1 may perform primary image analysis, such as cropping images containing persons, on the side of each recording device 10, and may perform secondary image analysis, such as counting passersby and analyzing person attributes, on the side of the analysis device 20 in accordance with data based on the analysis data transmitted from each recording device 10 to the analysis device 20. Alternatively, for example, each recording device 10 may individually count, for its own moving body V, the number of passersby at the specific point P and generate per-moving-body, per-specific-point counting data, and the analysis device 20 may then aggregate the per-moving-body, per-specific-point counting data contained in the data based on the analysis data transmitted from each recording device 10 to generate the above-described per-specific-point counting data.
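 A minimal sketch of the aggregation performed on the analysis device 20 side in this variant, assuming a simple mapping from (moving body, specific point) pairs to counts; the data layout and names are assumptions for illustration only:

```python
from collections import defaultdict

def aggregate_per_point(per_vehicle_counts: dict[tuple[str, str], int]) -> dict[str, int]:
    """Aggregate per-moving-body, per-specific-point counting data
    (keyed by (vehicle_id, point_id)) into per-specific-point counting
    data, as would be done on the analysis device 20 side."""
    per_point: dict[str, int] = defaultdict(int)
    for (_vehicle_id, point_id), count in per_vehicle_counts.items():
        per_point[point_id] += count
    return dict(per_point)

# Example: two vehicles V1, V2 each report counts for points P1 and P2.
counts = {("V1", "P1"): 12, ("V1", "P2"): 7, ("V2", "P1"): 9, ("V2", "P2"): 4}
print(aggregate_per_point(counts))  # -> {'P1': 21, 'P2': 11}
```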
 The output device D described above may be, besides a digital display, an outdoor board, a wall-mounted sheet, a free-standing signboard, or the like.
 The analysis system according to the present embodiment may be configured by appropriately combining the constituent elements of the embodiment and the modifications described above.
1 Analysis system
10 Recording device (data collection device)
11 External camera
12 Position information measuring device
13, 21 Data input/output unit
14 Control unit
14A Storage unit
14B Processing unit
20 Analysis device (data analysis device)
22 Storage unit
22A Analysis target DB
22B Analysis reference DB
22C Analysis result DB
23 Processing unit
23A Data preprocessing unit
23B Data analysis processing unit
23C Data processing unit
CL Client terminal
D Output device
n Virtual-period estimated number of passersby
P Specific point
R Movement route
t Virtual counting period
V Moving body

Claims (7)

  1.  An analysis system comprising:
     a data collection device that is mounted on a moving body and collects analysis data including image data representing images of the outside of the moving body captured as the moving body moves, and position data representing the positions at which the images of the outside of the moving body were captured; and
     a data analysis device that counts, on the basis of the analysis data collected by the data collection device, the number of passersby at a specific point on the movement route of the moving body.
  2.  The analysis system according to claim 1, wherein the data analysis device extracts, on the basis of the position data, the image data including images of the specific point from the analysis data, and counts and aggregates the number of persons included in the images represented by the extracted image data to obtain the number of passersby at the specific point.
  3.  The analysis system according to claim 1 or 2, wherein the data analysis device counts the number of persons included in one still image constituting the image data as a virtual-period estimated number of passersby, that is, the number counted during a preset virtual counting period, and counts the number of passersby at the specific point on the basis of the virtual-period estimated number of passersby.
  4.  The analysis system according to any one of claims 1 to 3, wherein the data analysis device counts, on the basis of the analysis data, the number of passersby at a plurality of the specific points, and aggregates the numbers of passersby at the plurality of specific points to count the number of passersby along the movement route.
  5.  The analysis system according to any one of claims 1 to 4, wherein the data collection device is mounted on each of a plurality of the moving bodies, and the data analysis device counts the number of passersby at the specific point on the basis of the analysis data collected by the plurality of data collection devices.
  6.  The analysis system according to any one of claims 1 to 5, wherein the data analysis device analyzes, on the basis of the analysis data, the attributes of persons included in the images represented by the image data for each specific point.
  7.  The analysis system according to any one of claims 1 to 6, wherein the data analysis device calculates, on the basis of the number of passersby at the specific point, an index representing the number of persons who have passed through a content-receivable range at the specific point.
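As a purely illustrative sketch of the extraction and counting described in claim 2 (not an authoritative implementation of the claimed system): analysis data records can be filtered by comparing their position data with the coordinates of the specific point P, and the persons detected in the surviving images aggregated. The record layout, the 30 m radius, and the assumption that person detection has already been performed are all introduced here for illustration only.

```python
import math
from dataclasses import dataclass

@dataclass
class AnalysisRecord:
    lat: float          # latitude at which the image was captured
    lon: float          # longitude at which the image was captured
    person_count: int   # persons detected in the image (detector not shown)

def distance_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in metres (haversine formula)."""
    r = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def count_passersby_at_point(records: list[AnalysisRecord],
                             point_lat: float, point_lon: float,
                             radius_m: float = 30.0) -> int:
    """Extract the records captured near the specific point P and
    aggregate the persons detected in those images."""
    return sum(rec.person_count for rec in records
               if distance_m(rec.lat, rec.lon, point_lat, point_lon) <= radius_m)

# Example: two records near point P (35.6895, 139.6917), one far away.
records = [
    AnalysisRecord(35.68951, 139.69171, 3),
    AnalysisRecord(35.68949, 139.69169, 2),
    AnalysisRecord(35.70000, 139.70000, 5),
]
print(count_passersby_at_point(records, 35.6895, 139.6917))  # -> 5
```

In practice the radius and the person-detection step would depend on the camera's field of view and on the detector used; both are left open by the claims.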
PCT/JP2020/007954 2019-03-27 2020-02-27 Analysis system WO2020195508A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-059611 2019-03-27
JP2019059611A JP6914983B2 (en) 2019-03-27 2019-03-27 Analysis system

Publications (1)

Publication Number Publication Date
WO2020195508A1 true WO2020195508A1 (en) 2020-10-01

Family

ID=72609277

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/007954 WO2020195508A1 (en) 2019-03-27 2020-02-27 Analysis system

Country Status (3)

Country Link
JP (1) JP6914983B2 (en)
TW (1) TW202036457A (en)
WO (1) WO2020195508A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20090011364A (en) * 2007-07-26 2009-02-02 한국미디어렙(주) Method for measuring the effect of out of house advertisement using people counter module server
JP2018022343A (en) * 2016-08-03 2018-02-08 株式会社東芝 Image processing system and image processing method
JP2018116692A (en) * 2017-01-13 2018-07-26 キヤノン株式会社 Human flow analysis apparatus and system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
HARA, YUSUKE: "Proposal of pedestrian flow estimation by deep learning using an in-vehicle camera", IPSJ SIG Technical Report: Intelligent Transport Systems and Smart Community (ITS) 2018-ITS-072, vol. 3, 1 March 2018 (2018-03-01), pages 1-8 *

Also Published As

Publication number Publication date
TW202036457A (en) 2020-10-01
JP2020160811A (en) 2020-10-01
JP6914983B2 (en) 2021-08-04

Similar Documents

Publication Publication Date Title
JP6999237B2 (en) Guidance system
US9489581B2 (en) Vehicle counting and emission estimation
US20150227965A1 (en) Method and system for evaluting signage
WO2020100922A1 (en) Data distribution system, sensor device, and server
US20200074507A1 (en) Information processing apparatus and information processing method
US20200043058A1 (en) Advertising system and information processing method
WO2020048116A1 (en) Economic geographic factor post-processing analysis and mining method and system
US11825383B2 (en) Method, apparatus, and computer program product for quantifying human mobility
JP2015210713A (en) Driving recorder and cloud road-information operation system using the same
US20210117694A1 (en) Methods and systems for determining emergency data for a vehicle
WO2019193817A1 (en) Analysis system
JP7264028B2 (en) Information providing system, information providing method, information terminal and information display method
WO2020090310A1 (en) Analysis system
WO2020195508A1 (en) Analysis system
US20180234802A1 (en) Action analysis method, recording medium having recorded therein action analysis program, and action analysis system
US20200134673A1 (en) Information processing apparatus and information processing method
KR20220122832A (en) Apparatus and method for riding notification of mobility on demand
JP2021124633A (en) Map generation system and map generation program
WO2015170385A1 (en) Transportation means identification system, transportation means identification method, and computer-readable non-transient storage medium
US11252379B2 (en) Information processing system, information processing method, and non-transitory storage medium
Kutsch et al. TUMDOT–MUC: Data Collection and Processing of Multimodal Trajectories Collected by Aerial Drones
KR20160014189A (en) Apparatus for providing advertisement using advertising media in mass transportation and method thereof
JP7417686B2 (en) Vehicle occupant gaze detection system and usage method
US20210124955A1 (en) Information processing system, information processing method, and non-transitory storage medium
CN116958915B (en) Target detection method, target detection device, electronic equipment and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20779131

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20779131

Country of ref document: EP

Kind code of ref document: A1