WO2020195508A1 - Analysis System (Système d'analyse) - Google Patents

Analysis System (Système d'analyse)

Info

Publication number
WO2020195508A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
analysis
specific point
image
processing unit
Prior art date
Application number
PCT/JP2020/007954
Other languages
English (en)
Japanese (ja)
Inventor
綾乃 河江
Original Assignee
矢崎エナジーシステム株式会社 (Yazaki Energy System Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 矢崎エナジーシステム株式会社 (Yazaki Energy System Corporation)
Publication of WO2020195508A1

Links

Images

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00 Commerce
    • G06Q 30/02 Marketing; Price estimation or determination; Fundraising
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/60 Analysis of geometric attributes

Definitions

  • The present invention relates to an analysis system.
  • Patent Document 1 discloses an information display system including a traffic information acquisition unit, an information output unit, and a display unit.
  • The traffic information acquisition unit acquires traffic information related to the passage of people.
  • The information output unit selectively outputs information based on the traffic information acquired by the traffic information acquisition unit.
  • The display unit displays the information selectively output by the information output unit at a place where a person corresponding to the traffic information is passing.
  • In this information display system, for example, by displaying information according to the volume of passing people indicated by the acquired traffic information, it becomes possible to charge fees according to the display effect of the information.
  • Indexes showing the tendency of the flow of people at an arbitrary point or region include, for example, the traffic volume of people and various indexes calculated from that traffic volume.
  • An analysis system may analyze the tendency of the flow of people along a movement path on which a moving body such as a bus travels. In such a case, it is desirable that the tendency of the flow of people can be analyzed appropriately.
  • The present invention has been made in view of the above circumstances, and an object of the present invention is to provide an analysis system capable of appropriately analyzing the tendency of the flow of people.
  • The analysis system includes a data collection device that is mounted on a moving body and collects analysis data including image data, representing images outside the moving body captured as the moving body moves, and position data, representing the positions where those images were captured, and a data analysis device that counts the number of people passing a specific point in the movement path of the moving body based on the analysis data collected by the data collection device.
  • The data analysis device can extract, based on the position data, image data including an image of the specific point from the analysis data, and can count and aggregate the number of people included in the images represented by the extracted image data to count the number of people passing the specific point.
  • The data analysis device can regard the number of people included in one still image constituting the image data as a virtual-period estimated number of passersby, counted over a preset virtual counting period, and can count the number of people passing the specific point based on this virtual-period estimated number of passersby.
  • The data analysis device can count the number of people passing each of a plurality of specific points based on the analysis data, and can aggregate the counts for the plurality of specific points to count the number of people passing along the movement route.
  • The data collection device may be mounted on each of a plurality of moving bodies, and the data analysis device can count the number of people passing the specific point based on the analysis data collected by the plurality of data collection devices.
  • The data analysis device can analyze, for each specific point, the attributes of the people included in the images represented by the image data, based on the analysis data.
  • The data analysis device can calculate, based on the number of passersby at the specific point, an index representing the number of passersby who have passed through the range within which content can be received at the specific point.
  • The analysis system can collect analysis data, including image data and position data, with a data collection device mounted on a moving body.
  • The image data collected as analysis data represents images outside the moving body captured as the moving body moves.
  • The data analysis device can count the number of people passing a specific point in the movement path of the moving body based on the analysis data collected by the data collection device.
  • As a result, this analysis system has the effect of being able to appropriately analyze the tendency of the flow of people.
  • FIG. 1 is a block diagram showing a schematic configuration of an analysis system according to an embodiment.
  • FIG. 2 is a schematic diagram showing a movement path of a moving body of the analysis system according to the embodiment and an example of a specific point.
  • FIG. 3 is a schematic diagram showing an example of collecting image data related to a specific point by the recording device of the analysis system according to the embodiment.
  • FIG. 4 is a schematic diagram showing an example of a still image for explaining a virtual counting period in the analysis system according to the embodiment.
  • FIG. 5 is a schematic diagram showing an example of a still image for explaining a virtual counting period in the analysis system according to the embodiment.
  • FIG. 6 is a schematic diagram showing an example for explaining the estimated number of people passing through the virtual period in the analysis system according to the embodiment.
  • FIG. 7 is a schematic diagram showing an example for explaining the number of people passing by in the target time zone in the analysis system according to the embodiment.
  • FIG. 8 is a schematic diagram showing an example in which the number of passersby counted in the analysis system according to the embodiment is aggregated on a daily basis.
  • FIG. 9 is a schematic diagram showing an example for explaining the counting of the number of people passing by the movement route in the analysis system according to the embodiment.
  • FIG. 10 is a schematic diagram showing an example of analysis result data analyzed and processed in the analysis system according to the embodiment.
  • FIG. 11 is a flowchart showing an example of processing in the analysis system according to the embodiment.
  • The analysis system 1 of the present embodiment shown in FIG. 1 includes a recording device 10 as a data collection device and an analysis device 20 as a data analysis device, and is a system that provides the analysis result data produced by the analysis device 20 to a client terminal CL.
  • The analysis system 1 of the present embodiment uses the recording device 10 mounted on a moving body V and analyzes the tendency of the flow of people based on the image data and other data collected by the recording device 10. More specifically, as illustrated in FIG. 2, the analysis system 1 counts the number of people passing a specific point P in the movement path R of the moving body V based on the analysis data collected by the recording device 10, thereby realizing a configuration that can appropriately analyze the tendency of the flow of people at a specific point P or along a movement path R.
  • Hereinafter, the configuration of the analysis system 1 will be described in detail with reference to the figures.
  • The movement route R is a route along which the moving body V moves, and corresponds to, for example, a route of a bus or the like described later.
  • The specific point P is a particular point on the movement route R, arbitrarily set as a point at which it is desired to count the number of passersby.
  • The specific point P may include, for example, a characteristic point on the movement route R such as a bus stop or an intersection, a point where an output device D or the like capable of outputting content is installed, or a point where content from the output device D can be received.
  • The output device D is a device capable of outputting content.
  • The output device D may be configured as a so-called cloud-service type device that is connected to a network and provides various content via the network, or as a so-called stand-alone type device that is separated from the network.
  • The output device D includes, for example, a display capable of showing images according to the content and a speaker capable of outputting voice and sound according to the content.
  • The content output by the output device D may include, in addition to content such as advertisements and coupons, content made up of various guidance information such as area information, route information to a predetermined facility, and evacuation-route and safety-support information in the event of a disaster.
  • The content data output by the output device D can be updated sequentially via a network, a recording medium, or the like.
  • The recording device 10 is mounted on the moving body V and collects the analysis data used for analysis by the analysis device 20.
  • The analysis data collected by the recording device 10 includes image data and position data.
  • The image data represents images outside the moving body V captured as the moving body V moves.
  • The position data represents the positions where the images outside the moving body V were captured.
  • The recording device 10 collects the image data and the position data as analysis data.
  • The analysis data is used by the analysis device 20 to analyze the tendency of the flow of people.
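As a rough sketch, one unit of analysis data can be pictured as an image frame bundled with the position and time of its capture. The record layout below is purely illustrative; all field names are hypothetical and not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class AnalysisRecord:
    """One unit of analysis data: a still image plus the position
    (position data) and time (time data) at which it was captured."""
    frame_id: int      # index of the still image within the moving image
    latitude: float    # where the image was captured (position data)
    longitude: float
    timestamp: float   # Unix time of capture (time data)

# a record collected while the bus passes near a point of interest
record = AnalysisRecord(frame_id=0, latitude=35.6812, longitude=139.7671,
                        timestamp=1_700_000_000.0)
```

The recording device would accumulate many such records per trip; the analysis device later filters them by position and time.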
  • The moving body V on which the recording device 10 is mounted is, for example, a vehicle traveling on a road surface, such as a private car, a rental car, a car-sharing car, a ride-sharing car, a bus, a taxi, a truck, a transport vehicle, or a work vehicle.
  • The moving body V is not limited to a vehicle, and may be a flying body such as a flying car or a drone that moves through the air.
  • The moving body V typically moves along a predetermined movement route R, and is capable of moving along, for example, a plurality of predetermined routes as predetermined movement routes R.
  • The moving body V of the present embodiment will be described as a fixed-route bus that repeatedly travels a plurality of predetermined routes (movement routes R) during a day.
  • For a moving body V such as a fixed-route bus, one moving body V may travel a plurality of routes in one day, and a plurality of moving bodies V may be operated on a plurality of routes, depending on usage.
  • The recording device 10 of the present embodiment is mounted on each of the plurality of moving bodies V that move along the plurality of routes in this way. That is, the analysis system 1 of the present embodiment includes a plurality of recording devices 10 mounted on the respective moving bodies V, and can collect analysis data from the plurality of recording devices 10.
  • The recording device 10 includes an external camera 11, a position information measuring device 12, a data input/output unit 13, and a control unit 14.
  • As the recording device 10, for example, an in-vehicle device such as a so-called drive recorder mounted on the moving body V can be used, but the recording device 10 is not limited to this.
  • The external camera 11 is an external imaging device that captures images of the outside of the moving body V.
  • The external camera 11 captures images outside the moving body V as the moving body V moves, and collects image data representing those images.
  • The external camera 11 typically captures a moving image of the outside of the moving body V and collects image data representing the moving image.
  • A moving image is a time-series arrangement of a plurality of frames of still images.
  • The external camera 11 is installed on the moving body V with an angle of view capable of capturing the persons to be analyzed by the analysis system 1, here, persons located on the road outside the moving body V.
  • A plurality of external cameras 11 may be provided, for example on the front portion, side portions, rear portion, and roof portion of the moving body V.
  • The external camera 11 may be a monocular camera or a stereo camera, and the images it captures may be monochrome or color.
  • The external camera 11 is communicably connected to the control unit 14 and outputs the collected image data to the control unit 14.
  • The position information measuring device 12 is a positioning device that measures the current position of the moving body V.
  • As the position information measuring device 12, for example, a GPS receiver that receives radio waves transmitted from GPS (Global Positioning System) satellites can be used.
  • The position information measuring device 12 receives radio waves transmitted from GPS satellites and acquires GPS information (latitude/longitude coordinates) as information indicating the current position of the moving body V, thereby collecting position data representing the positions where the images outside the moving body V are captured.
  • The position information measuring device 12 is communicably connected to the control unit 14 and outputs the collected position data to the control unit 14.
  • The data input/output unit 13 inputs and outputs various data between the recording device 10 and devices other than the recording device 10.
  • The data input/output unit 13 of the present embodiment can output the analysis data to the analysis device 20, which is a device separate from the recording device 10.
  • The data input/output unit 13 may be configured to exchange data with a device other than the recording device 10, for example by communication via a network (whether wired or wireless). Alternatively, the data input/output unit 13 may have a slot unit and exchange data with a device other than the recording device 10 via a recording medium inserted into the slot unit.
  • The recording medium is, for example, a memory (removable medium) that can be attached to and detached from the recording device 10 via the slot unit; various memory cards such as SD cards can be used, but the recording medium is not limited to these.
  • The control unit 14 comprehensively controls each unit of the recording device 10.
  • The control unit 14 executes various arithmetic processes for collecting the analysis data.
  • The control unit 14 is configured as an electronic circuit mainly composed of a well-known microcomputer including a central arithmetic processing unit such as a CPU (Central Processing Unit) or GPU (Graphics Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), and an interface.
  • The control unit 14 is communicably connected to each unit, such as the external camera 11, the position information measuring device 12, and the data input/output unit 13, and can exchange various signals and data with them.
  • The control unit 14 includes a storage unit 14A and a processing unit 14B.
  • The storage unit 14A and the processing unit 14B can exchange various signals and data with each other.
  • The storage unit 14A stores the conditions and information necessary for various processes in the processing unit 14B, the various programs and applications executed by the control unit 14, control data, and the like.
  • The storage unit 14A can store the analysis data together with the time of collection and the like.
  • In other words, the analysis data also includes time data representing the times when the data were collected.
  • The storage unit 14A can also temporarily store, for example, various data generated in the course of processing by the processing unit 14B; these data are read out as needed by the processing unit 14B, the data input/output unit 13, and the like.
  • The storage unit 14A may be, for example, a relatively large-capacity storage device such as a hard disk, SSD (Solid State Drive), or optical disk, or a rewritable semiconductor memory such as RAM, flash memory, or NVSRAM (Non Volatile Static Random Access Memory).
  • The processing unit 14B executes the various programs stored in the storage unit 14A based on various input signals and the like, and, by running these programs, outputs output signals to each unit and executes the various processes that realize the device's functions.
  • The processing unit 14B controls the operation of the external camera 11 and the position information measuring device 12 and executes the process of collecting analysis data including image data and position data. The processing unit 14B also executes processes related to data input and output via the data input/output unit 13.
  • The processing unit 14B executes, for example, a process of outputting the analysis data to the analysis device 20 via the data input/output unit 13.
  • The analysis device 20 analyzes the analysis data collected by the recording device 10 and provides analysis result data representing the analysis results to the client terminal CL.
  • The analysis device 20 and the client terminal CL may each be configured as a so-called cloud-service type device (cloud server) connected to a network, or as a so-called stand-alone type device separated from the network.
  • The analysis device 20 of the present embodiment counts the number of people passing the specific point P in the movement path R of the moving body V based on the analysis data collected by the recording device 10.
  • Here, the analysis device 20 counts the number of people passing the specific point P based on the analysis data collected by the plurality of recording devices 10.
  • The analysis device 20 of the present embodiment counts the number of people passing each of a plurality of specific points P based on the analysis data, and aggregates the counts for the plurality of specific points P to count the number of people passing along the movement route R of the moving body V, for example along a particular bus route.
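A route-level count obtained from per-point counts could be as simple as a sum over the specific points P on the route. The patent does not spell out the aggregation rule, so the sketch below assumes simple summation, with hypothetical point labels:

```python
def route_passersby(counts_by_point: dict[str, int]) -> int:
    """Aggregate the passerby counts of the specific points P on a
    movement route R into one route-level count (simple sum assumed)."""
    return sum(counts_by_point.values())

# hypothetical counts for three specific points on one bus route
total = route_passersby({"P1": 120, "P2": 95, "P3": 40})  # 255
```

Other aggregations (averages per point, deduplication across nearby points) are equally plausible; the sum merely illustrates the idea of rolling point counts up to the route.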
  • The analysis device 20 of the present embodiment also analyzes, for each specific point P or for each movement route R, the attributes of the people included in the images represented by the image data, based on the analysis data. The analysis device 20 then generates analysis result data based on the passerby counting results, the attribute analysis results, and the like, and provides the analysis result data to the client terminal CL.
  • The analysis device 20 executes various arithmetic processes for counting passersby based on the analysis data, as well as various arithmetic processes for analyzing the attributes of people based on the analysis data.
  • The analysis device 20 is configured as an electronic circuit mainly composed of a well-known microcomputer including a central arithmetic processing unit such as a CPU or GPU, a ROM, a RAM, and an interface.
  • The analysis device 20 can also be configured by installing an application that realizes the various processes described below on a known computer system such as a PC or workstation, and may be configured by combining a plurality of PCs that can communicate with each other.
  • The analysis device 20 includes a data input/output unit 21, a storage unit 22, and a processing unit 23.
  • The data input/output unit 21, the storage unit 22, and the processing unit 23 can exchange various signals and data with one another.
  • The data input/output unit 21 inputs and outputs various data between the analysis device 20 and devices other than the analysis device 20.
  • The data input/output unit 21 of the present embodiment can receive analysis data from the recording device 10, which is a device separate from the analysis device 20, and can output the analysis result data to the client terminal CL, which is likewise a separate device. Like the data input/output unit 13, the data input/output unit 21 may be configured to exchange data with other devices by communication via a network (whether wired or wireless), or may have a slot unit and exchange data via a recording medium inserted into the slot unit.
  • The storage unit 22 stores the conditions and information necessary for various processes in the processing unit 23, the various programs and applications executed by the processing unit 23, control data, and the like.
  • The storage unit 22 can store the analysis data input through the data input/output unit 21.
  • The storage unit 22 can also temporarily store, for example, various data generated in the course of processing by the processing unit 23; these data are read out as needed by the data input/output unit 21, the processing unit 23, and the like.
  • The storage unit 22 may be, for example, a relatively large-capacity storage device such as a hard disk, SSD, or optical disk, or a rewritable semiconductor memory such as RAM, flash memory, or NVSRAM.
  • The storage unit 22 functionally and conceptually includes an analysis target database (hereinafter abbreviated as "analysis target DB") 22A, an analysis reference database (hereinafter abbreviated as "analysis reference DB") 22B, and an analysis result database (hereinafter abbreviated as "analysis result DB") 22C.
  • The analysis target DB 22A is the part that accumulates, databases, and stores the analysis data (image data, position data, time data, and the like) to be analyzed by the processing unit 23.
  • The analysis data input from the recording device 10 to the data input/output unit 21 is stored in the analysis target DB 22A.
  • The analysis reference DB 22B is the part that accumulates, databases, and stores the analysis reference data referred to when the processing unit 23 analyzes the analysis data.
  • The analysis reference data includes, for example, map reference data and attribute prediction reference data.
  • The map reference data represents the map referred to when identifying the position of the moving body V from the position data and the like, in other words, the positions where the images outside the moving body V were captured.
  • The attribute prediction reference data is referred to when estimating the attributes of the people included in the images represented by the image data; it will be described in detail later.
  • The analysis reference data is referred to by the processing unit 23 when analyzing the analysis data.
  • The analysis result DB 22C is the part that accumulates, databases, and stores the analysis result data representing the results of analyzing the analysis data by the processing unit 23.
  • The analysis result data is based on, for example, the counting results for the number of people passing each specific point P (point-specific count data), the counting results for the number of people passing along a movement route R including a plurality of specific points P, for example a particular bus route (route-specific count data), and the analysis results for the attributes of the counted people (person attribute data).
  • The analysis result data is processed into a desired format by the processing unit 23, output from the data input/output unit 21 to the client terminal CL, and thus provided.
  • The various data stored in the analysis target DB 22A, the analysis reference DB 22B, and the analysis result DB 22C can be utilized as so-called big data.
  • The processing unit 23 executes the various programs stored in the storage unit 22 based on various input signals and the like, and, by running these programs, executes the various processes for analyzing the analysis data. The processing unit 23 also processes the analysis result data into a desired format and executes processes related to data input and output via the data input/output unit 21, for example a process of outputting the formatted analysis result data to the client terminal CL.
  • The processing unit 23 functionally and conceptually includes a data preprocessing unit 23A, a data analysis processing unit 23B, and a data processing unit 23C.
  • The data preprocessing unit 23A is the part that performs various preprocessing on the analysis data to be analyzed.
  • The data preprocessing unit 23A reads, for example, the analysis data to be analyzed from the analysis target DB 22A and cuts out a still image for each frame from the moving image represented by the image data of that analysis data.
  • The data preprocessing unit 23A then executes, for example, a process of associating each cut-out still image with the position represented by the position data and the time represented by the time data of the analysis data.
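The preprocessing step can be pictured as pairing each extracted frame with its capture position and time. A minimal sketch, where the function and field names are illustrative only and frame extraction from the video itself is assumed to have happened already:

```python
def preprocess(frames, positions, timestamps):
    """Associate each still image cut out of the moving image with the
    position (position data) and time (time data) of its capture,
    mirroring the role of the data preprocessing unit 23A."""
    return [{"image": f, "position": p, "time": t}
            for f, p, t in zip(frames, positions, timestamps)]

# two frames with their (latitude, longitude) and capture times
stills = preprocess(["frame0", "frame1"],
                    [(35.681, 139.767), (35.681, 139.768)],
                    [0.0, 0.5])
```

Each resulting record is then self-describing, so later stages can filter by location (which frames show a specific point P) or by time (which time zone a count belongs to).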
  • The data analysis processing unit 23B is the part that counts passersby based on the analysis data preprocessed by the data preprocessing unit 23A.
  • The data analysis processing unit 23B counts the number of people passing an arbitrarily set specific point P based on the image data included in the analysis data.
  • The data analysis processing unit 23B counts the number of passersby based on the number of people included in the images represented by the image data.
  • The data analysis processing unit 23B uses various known image processing techniques to detect and extract people from the still images cut out by the data preprocessing unit 23A based on the image data, counts the detected and extracted people, and counts the number of people passing the specific point P based on the counted number.
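Counting people per still image presupposes some person detector; the patent only refers to "known image processing techniques" without naming one, so the sketch below treats the detector as a pluggable function. The stand-in detector is purely illustrative:

```python
def count_people(still_image, detect_people) -> int:
    """Count the persons in one still image. detect_people is any detector
    returning one bounding box per detected person (for example a
    pedestrian detector from an image-processing library); it is injected
    here rather than fixed, since the patent does not specify one."""
    return len(detect_people(still_image))

# stand-in detector that pretends to find two pedestrians
def fake_detector(image):
    return [(10, 20, 50, 120), (200, 25, 240, 130)]  # (x1, y1, x2, y2) boxes

n = count_people("frame0.jpg", fake_detector)  # 2
```

In a real pipeline the same interface would accept a trained pedestrian or cyclist detector, since the patent counts both as passersby.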
  • The data analysis processing unit 23B extracts image data including an image of the specific point P, for example as shown in FIG. 3, based on the position data of the analysis data preprocessed by the data preprocessing unit 23A and the map reference data (analysis reference data) stored in the analysis reference DB 22B. More specifically, the data analysis processing unit 23B extracts, from the plural analysis data stored in the analysis target DB 22A, the image data of moving images that capture a predetermined range including the specific point P. At this time, the data analysis processing unit 23B also reads the time data associated with the extracted image data from the analysis target DB 22A and identifies the times at which the extracted image data were collected.
  • The image data extracted here represent moving images capturing a predetermined range including the specific point P during specific periods (for example, the period from time A to time B).
  • Each moving image represented by the image data is composed of a plurality of frames of still images capturing the predetermined range including the specific point P.
  • The data analysis processing unit 23B of the present embodiment extracts all such image data of moving images including the specific point P from the plural analysis data collected by the plural recording devices 10 mounted on the respective moving bodies V, including analysis data collected on different dates and at different times. The data analysis processing unit 23B then uses various known image processing techniques, as described above, to count and aggregate the number of people included in the still images cut out from the moving images represented by the extracted image data, thereby counting the number of people passing the specific point P.
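Selecting the frames that show the specific point P amounts to a geo-filter on the position data. A sketch using the great-circle (haversine) distance; the 50 m radius is an assumed threshold standing in for the patent's "predetermined range", not a value from the source:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two latitude/longitude points."""
    r = 6_371_000.0  # mean Earth radius [m]
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def frames_near_point(stills, point, radius_m=50.0):
    """Keep only the stills whose capture position lies within radius_m of
    the specific point P. stills are records pairing each image with its
    (latitude, longitude) position, as produced by the preprocessing step."""
    return [s for s in stills
            if haversine_m(s["position"][0], s["position"][1],
                           point[0], point[1]) <= radius_m]
```

A map-matched variant (using the map reference data, as the patent describes) would be more precise, but a radius filter conveys the mechanism.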
  • The people included in the images represented by the image data correspond to passersby at the specific point P, and include pedestrians, people riding bicycles, and the like.
  • the data analysis processing unit 23B of the present embodiment executes the following processing in a plurality of still images constituting the moving image of the extracted image data, for example, in order to avoid duplicate counting of the same person. That is, here, the data analysis processing unit 23B assumes that the number of people included in one still image constituting the image data is the virtual period estimated number of passersby n counted in the preset virtual counting period t. Count. Then, the data analysis processing unit 23B counts the number of people passing by the specific point P based on the estimated number of people passing by in the virtual period n.
  • the preset virtual counting period t is typically a case where the number of people included in one still image is actually measured at a specific point P by a staff member using a counter or the like. , It is a value that defines how many seconds the number of people pass by. Further, the virtual counting period t is, for example, as illustrated in FIG. 4, a person included in one still image, assuming that the person is moving at a passerby speed v assumed in advance. It corresponds to the required period required to measure and count the same number of people at a specific point P at a fixed point.
• the required period can be regarded as approximately equivalent to the period required for, for example, the person appearing at the edge of the still image to reach the specific point P when moving toward it at the passerby speed v assumed in advance. That is, the virtual counting period t can be approximated by the period required for the person who appears at the edge of the still image, among the persons included in one still image, to reach the specific point P when moving toward it at the passerby speed v assumed in advance.
  • the virtual counting period t is typically determined according to the installation position of the external camera 11 for collecting image data, the camera angle of view, the assumed passerby speed v, etc., regardless of the moving speed of the moving body V.
• the range of one still image constituting the image data is geometrically determined in real space according to the installation position of the external camera 11 in the moving body V, the camera angle of view, and the like, and the distance L from the position in real space corresponding to the depth-direction edge of the still image to the specific point P can likewise be calculated geometrically. For example, in the still image illustrated in FIG. 4, the distance L is assumed to be 10 [m], and the passerby speed v is assumed to be about 5 [km/h], the average walking speed of an adult.
• in this case, the virtual counting period t corresponds to the period required for the person appearing at the edge of the still image to reach the specific point P when moving toward it at the passerby speed v assumed in advance, that is, approximately 7.2 [sec].
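The relationship above reduces to t = L / v. The following is a minimal illustrative sketch of that calculation (the function name is hypothetical and not part of the embodiment):

```python
# Illustrative sketch: the virtual counting period t is approximated by the
# time a person at the depth-direction edge of the still image needs to reach
# the specific point P at the assumed passerby speed v.

def virtual_counting_period(distance_m: float, speed_kmh: float) -> float:
    """Return t [s] = distance L [m] / passerby speed v [m/s]."""
    speed_ms = speed_kmh * 1000.0 / 3600.0  # convert km/h to m/s
    return distance_m / speed_ms

# With the values from FIG. 4: L = 10 m, v = 5 km/h
t = virtual_counting_period(10.0, 5.0)
print(round(t, 1))  # 7.2 seconds
```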
• the data analysis processing unit 23B of the present embodiment counts the number of people included in one still image constituting the image data as the virtual period estimated number of passersby n.
• the virtual period estimated number of passersby n is the number of people virtually counted during the virtual counting period t preset as described above.
• the data analysis processing unit 23B obtains one virtual period estimated number of passersby n from the moving image constituting one piece of image data including the specific point P.
• the moving image represented by the image data is composed of still images of a plurality of frames capturing a region in a predetermined range including the specific point P. Therefore, in order to obtain one virtual period estimated number of passersby n from the moving image constituting one piece of image data, the data analysis processing unit 23B processes as follows. That is, in this case, as illustrated in FIG., the data analysis processing unit 23B extracts an arbitrary one of the still images of the plurality of frames constituting one moving image, for example, the still image at a time substantially in the center.
• the data analysis processing unit 23B obtains one virtual period estimated number of passersby n by counting the number of people in the extracted still image.
• alternatively, the data analysis processing unit 23B can also obtain one virtual period estimated number of passersby n by, for example, counting the virtual period estimated number of passersby n for every still image of the plurality of frames constituting the moving image and calculating the average value.
• the data analysis processing unit 23B treats the one virtual period estimated number of passersby n obtained from one piece of image data including the specific point P as the number of passersby counted at the specific point P at, for example, a time substantially in the center of the period in which the image data was captured. For example, if the virtual period estimated number of passersby n is obtained from image data captured from [9:00 to 9:02], the data analysis processing unit 23B processes that number n as the number of passersby counted at the specific point P at [9:01].
• as illustrated in FIG., the data analysis processing unit 23B counts the number of people passing by the specific point P by treating 15 [persons] as having passed the specific point P during the 7.2 [sec] virtual counting period at [9:01].
• the description above treats the virtual period estimated number of passersby n as the number of passersby counted at the specific point P at a time substantially in the center of the period in which the image data was captured, but the present embodiment is not limited to this. As long as the time falls within that period, the data analysis processing unit 23B may, for example, process the virtual period estimated number of passersby n as the number of passersby counted at the specific point P at the start time of the period, or as the number counted at the end time of the period.
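The time-stamping rule above (midpoint by default, start or end as alternatives) can be sketched as follows; the function and parameter names are hypothetical, introduced only for illustration:

```python
# Illustrative sketch: assigning one virtual-period estimate n to a single
# timestamp within the capture period of the image data, e.g. 9:00-9:02 -> 9:01.

from datetime import datetime

def count_timestamp(start: datetime, end: datetime, mode: str = "center") -> datetime:
    """Pick the time at which the estimate n is treated as counted."""
    if mode == "center":
        return start + (end - start) / 2
    if mode == "start":
        return start
    if mode == "end":
        return end
    raise ValueError(f"unknown mode: {mode}")

ts = count_timestamp(datetime(2020, 3, 1, 9, 0), datetime(2020, 3, 1, 9, 2))
print(ts.strftime("%H:%M"))  # 09:01
```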
• one virtual period estimated number of passersby n as described above is typically obtained each time a moving body V equipped with the recording device 10 passes the specific point P.
• the data analysis processing unit 23B calculates the virtual period estimated number of passersby n for each specific point P based on the plurality of analysis data collected by the plurality of recording devices 10 mounted on the respective moving bodies V, thereby acquiring a plurality of virtual period estimated numbers of passersby n at each specific point P.
• the data analysis processing unit 23B calculates the number of passersby N at the specific point P in the target time zone for which a count is desired, based on the virtual period estimated numbers of passersby n counted at the respective times as described above.
  • the data analysis processing unit 23B sets the unit counting period T in advance, and calculates the number of people N passing by the specific point P in the unit counting period T of the target time zone.
• the unit counting period T is the minimum counting period used as a reference for counting the number of passersby at the specific point P, and is arbitrarily set as the delimiting unit period for the count.
• the data analysis processing unit 23B calculates the average value n_ave of all the virtual period estimated numbers of passersby n obtained in the unit counting period T of the target time zone. Then, the data analysis processing unit 23B can calculate the number of passersby N at the specific point P in the unit counting period T of the target time zone by multiplying the average value n_ave by the value obtained by dividing the unit counting period T by the virtual counting period t. That is, with the unit counting period denoted "T", the virtual counting period "t", and the average value of all the virtual period estimated numbers of passersby n obtained in the unit counting period T of the target time zone denoted "n_ave", the number of passersby N is given by the following equation (1).
• N = n_ave × [T / t] … (1)
• the case where n_ave = 10 [persons] is illustrated.
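Equation (1) can be sketched numerically as follows. The one-hour unit counting period T used in the example is an assumed value chosen for illustration, since the text leaves T arbitrary:

```python
# Illustrative sketch of equation (1): N = n_ave * (T / t), where n_ave is the
# average of the virtual-period estimates n obtained in the unit counting
# period T, and t is the virtual counting period.

def passersby_count(n_values, unit_period_s: float, virtual_period_s: float) -> float:
    """Average the virtual-period estimates n, then scale by T / t."""
    n_ave = sum(n_values) / len(n_values)
    return n_ave * (unit_period_s / virtual_period_s)

# n_ave = 10 persons per 7.2 s virtual period, T = 1 hour (assumed)
N = passersby_count([9, 10, 11], 3600.0, 7.2)
print(round(N))  # 5000
```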
  • the plurality of symbols / bar graphs 100 in FIG. 7 represent the virtual period estimated number of passers-by n obtained in the unit counting period T of the target time zone, respectively.
• the data analysis processing unit 23B calculates the number of passersby N at the specific point P in the unit counting period T of each target time zone, and aggregates the results in a predetermined unit, for example, on a daily basis.
  • FIG. 8 shows an example in which the number of people N passing by the specific point P in the unit counting period T of each target time zone is aggregated on a daily basis.
• the plurality of symbols / bar graphs 200 in FIG. 8 represent the number of passersby N at the specific point P in the unit counting period T of each target time zone, and the symbols 300 represent the virtual period estimated numbers of passersby n obtained in the unit counting period T of each target time zone.
• the accuracy of the number of passersby N at the specific point P in the unit counting period T of each target time zone tends to improve as more virtual period estimated numbers of passersby n are obtained in that unit counting period T. Conversely, if a large number of virtual period estimated numbers of passersby n can be obtained evenly in each time zone, the number of passersby N can maintain relatively high accuracy even when the unit counting period T is relatively short, which enables detailed analysis of the number of passersby over shorter periods.
• when the data analysis processing unit 23B aggregates the number of passersby N, it is not limited to the daily unit described above, and can aggregate in any desired unit, such as weekly, monthly, or by day of the week.
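The aggregation step above can be sketched with a grouping key derived from each date (day, weekday, and so on); the key-function approach and all names are hypothetical stand-ins for whatever grouping the implementation actually uses:

```python
# Illustrative sketch: summing per-period counts N into daily, weekly, or
# day-of-week totals via a caller-supplied grouping key.

from collections import defaultdict
from datetime import date

def aggregate(counts, key=lambda d: d) -> dict:
    """Sum counts N grouped by a key derived from each date."""
    totals = defaultdict(float)
    for d, n in counts.items():
        totals[key(d)] += n
    return dict(totals)

counts = {date(2020, 3, 2): 120.0, date(2020, 3, 9): 80.0, date(2020, 3, 3): 50.0}
by_weekday = aggregate(counts, key=lambda d: d.strftime("%A"))
print(by_weekday["Monday"])  # 200.0
```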
  • the data analysis processing unit 23B of the present embodiment performs the process of counting the number of people passing by the specific point P as described above for each of the plurality of set specific points P. Then, the data analysis processing unit 23B generates specific point-specific counting data representing the counting result of the number of passing persons at each specific point P counted as described above as the analysis result data obtained by analyzing the analysis data.
• the specific point-specific counting data is data including the number of passersby N at each specific point P in the unit counting period T of each target time zone, calculated as described above, together with various information computed in the course of that calculation. Then, the data analysis processing unit 23B stores the analysis result data including the generated specific point-specific counting data in the analysis result DB 22C as a database.
• the data analysis processing unit 23B of the present embodiment further aggregates the counted numbers of passersby at the plurality of specific points P, thereby counting the number of passersby for the movement route R.
• the data analysis processing unit 23B aggregates the numbers of passersby at the specific points P included in the movement route R (for example, a specific route) to be counted, and thereby counts the number of passersby for that movement route R.
  • the data analysis processing unit 23B generates, as the analysis result data obtained by analyzing the analysis data, the counting data for each specific route representing the counting result of the number of people passing by each moving route R counted as described above.
• the specific route-specific counting data is data including the number of passersby for each movement route R together with various information computed in the course of calculating the number of passersby N at each specific point P in the unit counting period T of each target time zone. Then, the data analysis processing unit 23B stores the analysis result data including the generated specific route-specific counting data in the analysis result DB 22C as a database.
• the data analysis processing unit 23B of the present embodiment also analyzes the attributes of persons included in the image represented by the image data, based on the analysis data preprocessed by the data preprocessing unit 23A. Typically, the data analysis processing unit 23B analyzes the attributes of the persons counted as described above.
  • the data analysis processing unit 23B analyzes the attributes of the person detected and extracted from the still image cut out by the data preprocessing unit 23A based on the image data.
  • the data analysis processing unit 23B typically analyzes the attributes of the person included in the image represented by the image data at each of the above-mentioned specific points P.
  • the data analysis processing unit 23B can also analyze the attributes of a person included in the image represented by the image data for each movement path R in which a plurality of specific points P are assembled.
• the data analysis processing unit 23B extracts, for example, the analysis data for each specific point P from the analysis data collected by the plurality of recording devices 10, based on the position data and the like. Then, by analyzing the attributes of persons included in the images represented by the extracted image data, the data analysis processing unit 23B analyzes, for each of the plurality of specific points P, the attributes of the persons located in the vicinity of that specific point P and counted as described above. The attribute analysis of persons for each movement route R is substantially the same as for the specific points P. In this way, the data analysis processing unit 23B analyzes the attributes of persons for each of the plurality of specific points P and for each movement route R, based on the analysis data collected as the moving body V moves.
• the data analysis processing unit 23B is configured to be able to execute, using, for example, various known artificial intelligence and deep learning techniques, processing that identifies the attributes of persons included in the image represented by the image data and analyzes the flow of the persons whose attributes have been identified.
  • the data analysis processing unit 23B executes a process of detecting and extracting a person from the still image cut out by the data preprocessing unit 23A as described above. Then, the data analysis processing unit 23B of the present embodiment executes a process of extracting an image including the feature points of the detected and extracted person from the image represented by the image data.
  • the feature point of the person is a part where the attribute of the person can be specified in the person included in the image.
  • the characteristic points of the person are, for example, parts such as a face on which the person's facial expression appears, limbs on which gestures / gestures appear, and positions where accessories and the like tend to be easily attached.
• since the image data of the present embodiment is collected by the recording device 10 mounted on the moving body V as the moving body V moves, it is likely to include many images of the same person captured from different angles. Taking advantage of this, the data preprocessing unit 23A extracts the feature points of persons, usable for identifying their attributes, from the large number of images captured from different angles as the moving body V moves, thereby securing as much data as possible for identifying the attributes of persons.
  • the data analysis processing unit 23B executes a process of analyzing the attributes of the person included in the image based on the image including the feature points of the person extracted from the image data.
• the data analysis processing unit 23B analyzes the attributes based on, for example, the attribute prediction reference data (analysis reference data) stored in the analysis reference DB 22B and the feature points of the persons included in the images extracted from the image data.
• the attribute prediction reference data is information reflecting the result of learning, by various methods using artificial intelligence and deep learning techniques, the attributes of a person that can be estimated from the feature points of the person included in an image.
• the attribute prediction reference data is data compiled into a database, by various methods using artificial intelligence and deep learning techniques, in order to estimate the attributes of a person based on the feature points of the person included in an image.
  • This attribute prediction reference data can be updated sequentially.
• the person attribute data itself, that is, the analysis result data representing the analysis result by the data analysis processing unit 23B, can also be used as learning data.
• the attributes of a person analyzed by the data analysis processing unit 23B typically include matters that can be inferred from the person's outward appearance, such as the person's gender, age, physique, social status, preferences, or behavioral orientation.
  • the gender is an attribute representing the distinction between male and female.
  • Age is an attribute that represents the length of years from birth to the present (at that time).
  • the physique is an attribute representing height, weight, various dimensions, and the like.
  • Social status is an attribute that represents occupation (self-employed, businessman, police officer, student, unemployed, part-time job), annual income, status, companion, etc.
  • Preference is an attribute that represents the tendency of clothes / belongings / fashion (casual orientation, elegant orientation, brand orientation, luxury orientation, fast fashion orientation), hobbies (sports / subculture / outdoor / beauty, etc.).
  • Behavioral orientation is an attribute that expresses the mood, interests (what you want to do, where you want to go), etc. at that time. That is, here, the data analysis processing unit 23B estimates gender, age, physique, social status, preference, behavioral orientation, etc. as the attributes of the person.
• the data analysis processing unit 23B refers to the attribute prediction reference data, extracts the attributes (gender, age, physique, social status, preference, or behavioral orientation) corresponding to the feature points of the person included in the image, and presumes that the extracted attributes are the attributes of the person appearing in the image.
• the data analysis processing unit 23B refers to the attribute prediction reference data in accordance with, for example, the facial expression, the gestures of the limbs, the worn accessories, the clothes, and other feature points of the person included in the image, matches the attributes that fit those feature points, and estimates attributes of the person such as gender, age, physique, social status, preference, and behavioral orientation.
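The matching of feature points against the attribute prediction reference data can be caricatured as follows. This is a purely illustrative sketch: the actual embodiment uses learned AI / deep learning models, not a lookup table, and every name below is hypothetical:

```python
# Purely illustrative: a toy "reference data" table mapping detected feature
# points to candidate attributes, standing in for the learned model.

REFERENCE = {
    "school_uniform": {"social_status": "student", "age": "10s"},
    "business_suit": {"social_status": "businessman", "age": "30s-50s"},
}

def estimate_attributes(feature_points) -> dict:
    """Merge the attributes associated with each detected feature point."""
    attributes = {}
    for fp in feature_points:
        attributes.update(REFERENCE.get(fp, {}))
    return attributes

print(estimate_attributes(["school_uniform"]))  # {'social_status': 'student', 'age': '10s'}
```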
• the data analysis processing unit 23B executes processing for analyzing the position of the person whose attributes have been identified as described above, based on the position data associated with the image data from which the attributes were identified.
  • the data analysis processing unit 23B reads, for example, the position data associated with the image data in which the attribute of the person is specified from the analysis target DB 22A. Then, the data analysis processing unit 23B analyzes the position and the like of the person whose attribute is specified based on the map reference data (analysis reference data) stored in the analysis reference DB 22B and the read position data. For example, the data analysis processing unit 23B refers to the map reference data and identifies the position where the image is captured based on the position data. Then, the data analysis processing unit 23B identifies the position of the person whose attribute is specified based on the position represented by the position data.
• the data analysis processing unit 23B generates person attribute data representing the attributes of persons analyzed for each specific point P, and attribute-specific position data representing the positions of the persons whose attributes have been identified as described above.
• the data analysis processing unit 23B also generates person attribute data representing the attributes of persons analyzed for each movement route R in which a plurality of specific points P are grouped, and attribute-specific position data representing the positions of the persons whose attributes have been identified. Then, the data analysis processing unit 23B stores the analysis result data including the generated person attribute data and attribute-specific position data in the analysis result DB 22C as a database.
• the data analysis processing unit 23B of the present embodiment may be further configured to be able to generate, as analysis result data, commercial use data based on the specific point-specific counting data, the specific route-specific counting data, the person attribute data, the attribute-specific position data, and the like.
  • the data analysis processing unit 23B of the present embodiment can calculate, for example, an index representing the number of passersby who have passed the content acceptable range at the specific point P.
• the content acceptable range at the specific point P is the spatial range in which a person can receive the content output by the output device D at the specific point P, and is determined according to the visible range in which a person can visually recognize the image displayed by the output device D and the audible range in which a person can hear the sound or voice output by the output device D.
• the data analysis processing unit 23B calculates, for example, for each specific point P having an output device D in its vicinity, an index representing the number of passersby who have passed through the content acceptable range, based on the specific point-specific counting data representing the number of passersby at that specific point P.
  • the data analysis processing unit 23B may calculate the index for each of a plurality of movement routes R (for each of a plurality of routes) based on the counting data for each specific route. Then, the data analysis processing unit 23B generates commercial use data representing the index, stores the analysis result data including the generated commercial use data in the analysis result DB 22C, and stores it in a database.
  • the number of passers-by who have passed the above-mentioned content acceptable range can be typically regarded as the number of people who have received the content output at the specific point P.
  • the content acceptable range can be typically regarded as a region near the specific point P at the specific point P where the output device D exists in the vicinity. Therefore, it can be considered that the number of passers-by who have passed the content acceptable range at the specific point P is substantially the same as the number of passersby at the specific point P.
  • the data analysis processing unit 23B of the present embodiment sets the number of people passing by the specific point P as the number of people passing through the content acceptable range. That is, here, the data analysis processing unit 23B sets the number of people passing by the specific point P represented by the counting data for each specific point as the number of passersby in the content acceptable range for each specific point P. Similarly, the data analysis processing unit 23B sets the number of passersby for each movement route R represented by the counting data for each specific route as the number of passersby in the content acceptable range for each movement route R.
• the data analysis processing unit 23B may use the number of passersby who have passed through the content acceptable range itself as the index representing the number of passersby, or may calculate an index representing the number of passersby based on that number. Examples of such indices calculated by the data analysis processing unit 23B include "DEC: Daily Effective Circulation" and "GRP: Gross Rating Point", both of which are indicators of advertising effectiveness. "DEC" is typically the number of passersby per day who pass through the content acceptable range (visible range) of the target advertisement.
• "DEC" may be the number of passersby who satisfy a predetermined age restriction, such as 18 years of age or older, or may be the number of all passersby without an age restriction.
• "GRP" is typically the ratio of the number of passersby per day through the content acceptable range to the target population within the area reachable per day for the target advertisement.
  • GRP can be represented by [DEC / target population in the target area].
  • the "target population in the target area” is the population that satisfies the age limit in the target area when the age limit is set for the target of "DEC".
• the data analysis processing unit 23B can calculate "DEC" and "GRP" for each specific point P as indices representing the number of passersby, based on the number of passersby in the content acceptable range for each specific point P. Similarly, the data analysis processing unit 23B can calculate "DEC" and "GRP" for each movement route R based on the number of passersby in the content acceptable range for each movement route R.
  • the data analysis processing unit 23B generates commercial use data representing "DEC” and "GRP" for each specific point P and each movement route R as an index showing the number of passersby who have passed the content acceptable range. Then, the analysis result data including the generated commercial use data can be stored in the analysis result DB 22C and stored in a database.
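The DEC-to-GRP relationship above, GRP = [DEC / target population in the target area], can be sketched as follows; the numeric values and names are assumed for illustration only:

```python
# Illustrative sketch: GRP as the ratio of the daily number of passersby
# through the content acceptable range (DEC) to the target population.

def grp(dec: float, target_population: float) -> float:
    """GRP = DEC / target population in the target area."""
    return dec / target_population

dec = 5000.0          # assumed daily passersby through the visible range
population = 20000.0  # assumed target population satisfying the age limit, if any
print(grp(dec, population))  # 0.25
```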
  • the data processing unit 23C is a part that executes a process of processing the analysis result data analyzed by the data analysis processing unit 23B into a desired format as described above.
• the data processing unit 23C processes the specific point-specific counting data, the specific route-specific counting data, the person attribute data, the attribute-specific human flow data, the commercial use data, and the like included in the analysis result data into a desired format. For example, as illustrated in FIG. 10, the data processing unit 23C plots on a map, and processes into various graphs and diagrams, information such as when, on which movement route (route) R, and at which specific point P how many persons with which attributes passed, as well as "DEC" and "GRP" for each specific point P and "DEC" and "GRP" for each movement route R.
  • the processing unit 23 executes a process of outputting the analysis result data processed in a desired format by the data processing unit 23C to the client terminal CL via the data input / output unit 21.
• the client terminal CL is a terminal that makes it possible to use the analysis result data provided by the analysis device 20 for various purposes, such as trade area surveys, marketing, advertising, decision material when determining advertising fees, disaster prevention, and city planning.
  • the client terminal CL is composed of, for example, a notebook PC, a desktop PC, a tablet PC, a smartphone, a mobile terminal, and the like.
  • the plurality of recording devices 10 mounted on the plurality of moving bodies V each collect analysis data including image data and position data as the moving body V moves (step S1).
  • the recording device 10 outputs the collected analysis data via the data input / output unit 13 and inputs the collected analysis data to the analysis device 20 via the data input / output unit 21 of the analysis device 20 (step S2).
  • the analysis data input to the analysis device 20 is stored in the analysis target DB 22A.
  • the data preprocessing unit 23A of the analysis device 20 performs various preprocessing as described above on the analysis data stored in the analysis target DB 22A (step S3).
• the data analysis processing unit 23B of the analysis device 20 performs analysis based on the analysis data preprocessed by the data preprocessing unit 23A, and generates, as analysis result data, the specific point-specific counting data, the specific route-specific counting data, the person attribute data, the attribute-specific human flow data, the commercial use data, and the like (step S4).
• the data analysis processing unit 23B stores the generated analysis result data, such as the specific point-specific counting data, specific route-specific counting data, person attribute data, attribute-specific human flow data, and commercial use data, in the analysis result DB 22C as a database (step S5).
  • the data processing unit 23C of the analysis device 20 receives the analysis result data (counting data for each specific point, counting data for each specific route, person) stored in the analysis result DB 22C in response to a request from the client terminal CL or the like. Attribute data, personal flow data by attribute, commercial use data, etc.) are processed into a desired format as illustrated in FIG. 10 (step S6).
• the processing unit 23 of the analysis device 20 outputs and provides the analysis result data processed into the desired format by the data processing unit 23C to the client terminal CL via the data input / output unit 21 (step S7), ending the series of processes.
• the analysis system 1 described above can collect, with the recording device 10 mounted on the moving body V, analysis data including image data and position data.
  • the image data collected as the analysis data is data representing an image outside the moving body V, which is captured as the moving body V moves.
  • the analysis device 20 can count the number of people passing by the specific point P in the movement path R of the moving body V based on the analysis data collected by the plurality of recording devices 10. As a result, this analysis system 1 can appropriately analyze the tendency of the flow of a person at each specific point P in the movement path R in which the moving body V moves.
  • the analysis system 1 determines the tendency of the flow of people at each specific point P in the movement route R analyzed as described above, such as trade area survey, marketing, advertising, judgment material when determining advertising fees, disaster prevention / city planning, etc. It can be used for various purposes. Further, the analysis system 1 can reduce the workload of counting and significantly improve the frequency of counting itself, as compared with the case where the number of people passing by is manually counted, for example, by visual inspection. As a result, the analysis system 1 can analyze the tendency of the flow of a person more accurately over a long period of time.
• the analysis system 1 described above, by means of the analysis device 20, extracts the image data including an image of the specific point P from the analysis data based on the position data, and counts and aggregates the number of persons included in the images represented by the extracted image data as the number of passersby at the specific point P.
• as a result, compared with manually counting the number of passersby at the specific point P, for example by visual inspection, the analysis system 1 can reduce the counting workload and greatly increase the frequency of counting itself.
  • the analysis system 1 can analyze the tendency of the flow of a person more accurately.
• the analysis device 20 counts the number of persons included in one still image constituting the image data as the virtual period estimated number of passersby n counted in the preset virtual counting period t. Then, the analysis system 1 counts the number of passersby at the specific point P based on the virtual period estimated number of passersby n. As a result, the analysis system 1 can suppress duplicate counting of the same person across a plurality of still images without using, for example, a conventional tracking algorithm, and can therefore analyze the tendency of the flow of people more accurately.
  • The analysis system 1 described above aggregates the passerby counts at the plurality of specific points P counted by the analysis device 20, and thereby counts the number of people passing along the movement route R.
  • This analysis system 1 can therefore appropriately analyze the tendency of the flow of people for each movement route R travelled by the moving body V, and can use it for various purposes.
  • For example, the analysis system 1 can also use the per-route tendency of the flow of people to analyze the advertising effect of a moving body such as a wrapped advertising bus.
  • The analysis system 1 described above counts the number of people passing the specific points P and the movement routes R based on the analysis data collected by the plurality of recording devices 10 mounted on the respective moving bodies V.
  • Consequently, even when, for example, one moving body V runs on a plurality of routes (movement routes R) in a single day to improve vehicle-allocation efficiency, and a plurality of moving bodies V are operated across a plurality of routes, this analysis system 1 can efficiently collect analysis data relating to a specific point P from every moving body V that has passed that point. As a result, the analysis system 1 can collect more analysis data for use in the analysis, and can therefore analyze the tendency of the flow of people more accurately for each specific point P and each movement route R.
  • In the analysis system 1 described above, the analysis device 20 further analyzes, based on the analysis data, the attributes of the persons included in the images represented by the image data, for each specific point P and each movement route R.
  • This analysis system 1 can therefore analyze, as the tendency of the flow of people for each specific point P and each movement route R, not only the number of passersby but also the attributes of the people counted as passersby.
  • As a result, the analysis system 1 can grasp the attribute tendency of the counted passersby for each specific point P and each movement route R, and, conversely, can easily identify the specific points P and movement routes R that have many passersby with a desired attribute tendency.
  • The analysis system 1 can thus make even better use of the per-point and per-route tendency of the flow of people for the various purposes described above.
  • The analysis system 1 described above calculates, based on the number of people passing each specific point P and each movement route R, an index representing the number of passersby who passed within the range in which the content output from the output device D can be perceived, for each specific point P and each movement route R.
  • The analysis system 1 can suitably use this per-point and per-route index as material for decisions such as setting the usage fee (advertising fee, etc.) for the content output from the output device D.
  • Likewise, the analysis system 1 can suitably use the per-point and per-route attribute tendency of the passersby as material for deciding, for example, which content to output from the output device D.
  • In the embodiment described above, the moving body V on which the recording device 10 is mounted moves along a predetermined movement route R and can run on a plurality of predetermined routes, but the present invention is not limited to this. That is, the moving body V has been described as a fixed-route bus that repeatedly runs on a plurality of predetermined routes during the day, but it is not limited to this. Further, the recording device 10 has been described as mounted on each of a plurality of moving bodies V moving on a plurality of routes, but the invention is not limited to this; for example, the recording device 10 may be mounted on only one passenger car. Likewise, the analysis device 20 has been described as counting the number of people passing the specific point P based on the analysis data collected by a plurality of recording devices 10, but the invention is not limited to this.
  • The analysis device 20 has been described as counting the number of people passing along the movement route R in addition to the number passing the specific point P, but the present invention is not limited to this. Further, the analysis device 20 has been described as analyzing, based on the analysis data, the attributes of the persons included in the images represented by the image data for each specific point P and each movement route R, but the invention is not limited to this. It suffices for the analysis device 20 to at least count the number of people passing the specific point P.
  • The analysis device 20 has been described as taking the number of people included in one still image constituting the image data as the estimated number of passersby n for a preset virtual counting period t, and as counting the number of passersby at the specific point P based on these virtual-period estimates n, but the present invention is not limited to this.
  • The control unit 14 and the analysis device 20 described above may each be built from separately configured units connected to each other so that various electric signals can be exchanged between them, or may be realized by other control devices. The programs, applications, and various data described above may be updated as appropriate, and may be stored on a server connected to the analysis system 1 via an arbitrary network; they can be downloaded in whole or in part as needed. Further, all or any part of the processing functions of the control unit 14 and the analysis device 20 may be realized by, for example, a CPU and a program interpreted and executed by that CPU, or may be realized as hardware using wired logic or the like.
  • In the analysis system 1, each recording device 10 may perform primary image analysis, such as cutting out images that include a person, and the analysis device 20 may then perform secondary image analysis, such as counting the passersby and analyzing their attributes, on the analysis data transmitted from each recording device 10.
  • Alternatively, in the analysis system 1, each recording device 10 may individually count the number of people passing the specific points P for its own moving body V and generate per-vehicle, per-point counting data, and the analysis device 20 may aggregate the counting data received from each recording device 10 to generate the per-point counting data described above.
  • the output device D described above may be an outdoor board, a wall sheet, a self-standing signboard, or the like, in addition to a digital display.
  • the analysis system according to the present embodiment may be configured by appropriately combining the components of the embodiments and modifications described above.
1 Analysis system
10 Recording device (data collection device)
11 External camera
12 Position information measuring device
13, 21 Data input/output unit
14 Control unit
14A Storage unit
14B Processing unit
20 Analysis device (data analysis device)
22 Storage unit
22A Analysis target DB
22B Analysis reference DB
22C Analysis result DB
23 Processing unit
23A Data preprocessing unit
23B Data analysis processing unit
23C Data processing unit
CL Client terminal
D Output device
n Virtual-period estimated number of passersby
P Specific point
R Movement route
t Virtual counting period
V Moving body

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Development Economics (AREA)
  • Finance (AREA)
  • Accounting & Taxation (AREA)
  • Strategic Management (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Game Theory and Decision Science (AREA)
  • Geometry (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • General Business, Economics & Management (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)

Abstract

The invention concerns an analysis system (1) comprising: a data collection device (10) that is mounted on a moving body (V) and collects analysis data including image data and position data, the image data representing images of the exterior of the moving body (V) captured as the moving body (V) moves, and the position data representing the positions at which those exterior images were captured; and a data analysis device (20) that, based on the analysis data collected by the data collection device (10), counts the number of people passing specific points on the movement route of the moving body (V). The analysis system (1) can thereby properly analyze tendencies in flows of people.
PCT/JP2020/007954 2019-03-27 2020-02-27 Analysis system WO2020195508A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-059611 2019-03-27
JP2019059611A JP6914983B2 (ja) 2019-03-27 2019-03-27 解析システム

Publications (1)

Publication Number Publication Date
WO2020195508A1 true WO2020195508A1 (fr) 2020-10-01

Family

ID=72609277

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/007954 WO2020195508A1 (fr) 2019-03-27 2020-02-27 Système d'analyse

Country Status (3)

Country Link
JP (1) JP6914983B2 (fr)
TW (1) TW202036457A (fr)
WO (1) WO2020195508A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20090011364A (ko) * 2007-07-26 2009-02-02 한국미디어렙(주) 피플카운터 모듈서버를 이용한 옥외광고 효과 조사 방법
JP2018022343A (ja) * 2016-08-03 2018-02-08 株式会社東芝 画像処理装置、および画像処理方法
JP2018116692A (ja) * 2017-01-13 2018-07-26 キヤノン株式会社 人流解析装置およびシステム


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
HARA, YUSUKE: "Proposal of pedestrian flow estimation by deep learning using an in-vehicle camera", IPSJ SIG Technical Report: Intelligent Transport Systems and Smart Community (ITS), 2018-ITS-072, vol. 3, 1 March 2018, pages 1-8 *

Also Published As

Publication number Publication date
TW202036457A (zh) 2020-10-01
JP2020160811A (ja) 2020-10-01
JP6914983B2 (ja) 2021-08-04

Similar Documents

Publication Publication Date Title
JP6999237B2 (ja) 案内システム
US9489581B2 (en) Vehicle counting and emission estimation
US20150227965A1 (en) Method and system for evaluting signage
WO2020100922A1 (fr) Procédé de distribution de données, dispositif capteur et serveur
US20200074507A1 (en) Information processing apparatus and information processing method
US20200043058A1 (en) Advertising system and information processing method
WO2020048116A1 (fr) Procédé et système d'exploration et d'analyse post-traitement de facteur géographique économique
US11825383B2 (en) Method, apparatus, and computer program product for quantifying human mobility
JP2015210713A (ja) ドライブレコーダおよびこれを用いたクラウド型道路情報等運用システム
US20210117694A1 (en) Methods and systems for determining emergency data for a vehicle
WO2019193817A1 (fr) Système d'analyse
JP7264028B2 (ja) 情報提供システム、情報提供方法、情報端末及び情報表示方法
WO2020090310A1 (fr) Système d'analyse
WO2020195508A1 (fr) Système d'analyse
US20180234802A1 (en) Action analysis method, recording medium having recorded therein action analysis program, and action analysis system
US20200134673A1 (en) Information processing apparatus and information processing method
KR20220122832A (ko) 온-디맨드 모빌리티의 승차 안내 장치 및 그 방법
JP2021124633A (ja) 地図生成システム及び地図生成プログラム
WO2015170385A1 (fr) Système d'identification de moyens de transport, procédé d'identification de moyens de transport, et support de stockage non transitoire lisible par ordinateur
US11252379B2 (en) Information processing system, information processing method, and non-transitory storage medium
Kutsch et al. TUMDOT–MUC: Data Collection and Processing of Multimodal Trajectories Collected by Aerial Drones
KR20160014189A (ko) 대중 교통 수단 내 광고 매체를 이용한 광고 제공 장치 및 방법
JP7417686B2 (ja) 車両乗員注視検出システムおよび使用方法
US20210124955A1 (en) Information processing system, information processing method, and non-transitory storage medium
CN116958915B (zh) 目标检测方法、装置、电子设备及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20779131

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20779131

Country of ref document: EP

Kind code of ref document: A1